
"ChatGPT now has more than 800 million visitors per week, and hundreds of millions are using Google's Gemini, Anthropic's Claude, xAI's Grok, and Meta's Lambda. These AI systems are powerful and have many valuable uses in business, medicine, education, science, and other fields. They also have scary uses such as military applications, spreading misinformation, and the elimination of jobs."
"AI may care about you in the weak cognitive sense that it pays attention to you, which it can do by being constantly available and responsive. But serious care by parents, romantic partners, good friends, responsible health professionals, and people in general has a strong emotional component. When you care about someone, you have a strong desire for their well-being and a concern to protect them from harm. Care is not just a belief, but also a"
Large-scale conversational AI systems are widely used and versatile across many domains, but they also present risks such as military use, misinformation, and job displacement. Many people increasingly turn to these systems for personal advice, companionship, therapy, and romantic connection. These uses exploit the models' ability to simulate understanding, empathy, and affection, creating the illusion that the systems truly care. Genuine caring involves an emotional desire for another's well-being, rooted partly in physiological responses tied to having a body. Current AI lacks those physiological responses, and thus lacks genuine emotions and genuine care. Regulation should limit or block this illusion of care.
Read at Psychology Today