
"One chatty researcher at one of the top AI labs "suddenly went quiet," recalled Miller, who studies AI-human relations, in an essay for The New York Times - and then, tellingly, offered up a halting non-answer. "I mean... I don't know. It's tricky. It's an interesting question," the researcher said, before pausing. "It's hard for me to say whether it's good or bad in terms of how that's going to affect people. It's obviously going to create confusion.""
"Though many waffled on answering the question directly, some were adamant about not using AI as an intimacy tool themselves, clearly showing they were aware of the tech's profound risks. "Zero percent of my emotional needs are met by A.I.," an executive who heads a top AI safety lab told Miller. "That would be a dark day," said another researcher who develops "cutting-edge capabilities for artificial emotion," according to Miller."
"The conflicted responses from the developers reflect growing concern over AI's ability to act as companions or otherwise fulfill human emotional needs. Because the chatbots are designed to be engaging, they can produce sycophantic responses to even the most extreme user responses. They can act as emotional echo chambers and fuel paranoid thinking, leading some down delusional mental health spirals that blow up their relationships with friends, families, and spouses, ruin their professional lives, and even culminate in suicide."
Developers at top AI labs are grappling with whether AI should simulate emotional intimacy. Some researchers express deep uncertainty while others refuse to let AI meet their emotional needs. Chatbots are designed to be engaging and can produce sycophantic responses that reinforce extreme user views. These interactions can become emotional echo chambers that fuel paranoid or delusional thinking, damage relationships and careers, and contribute to severe mental-health crises. Young people are forming romantic attachments to AI, and there are reported cases linking AI conversations with teen suicides, raising urgent ethical and safety concerns.
Read at Futurism