The Etiquette of AI in the Group Chat

"My friend recently attended a funeral, and midway through the eulogy, he became convinced that it had been written by AI. There was the telltale proliferation of abstract nouns, a surfeit of assertions that the deceased was "not just X-he was Y" coupled with a lack of concrete anecdotes, and more appearances of the word collaborate than you would expect from a rec-league hockey teammate."
"It was both too good, in terms of being grammatically correct, and not good enough, in terms of being particular. My friend had no definitive proof that he was listening to AI, but his position-and I agree with him-is that when you know, you know. His sense was that he had just heard a computer save a man from thinking about his dead friend."
"More and more, large language models are relieving people of the burden of reading and writing, in school and at work but also in group chats and email exchanges with friends. In many areas, guidelines are emerging: Schools are making policies on AI use by students, and courts are trying to settle the law about AI and intellectual property. In friendship and other interpersonal uses, however, AI is still the Wild West."
An attendee at a funeral suspected the eulogy was AI-generated because of its abstract nouns, formulaic "not just X, he was Y" assertions, scarce concrete anecdotes, and excessive use of words like "collaborate." The eulogy felt grammatically correct yet insufficiently particular, prompting the belief that a computer had spared a mourner from real reflection. Large language models are increasingly shouldering reading and writing tasks across schools, workplaces, group chats, and emails. Institutions are beginning to create AI policies, but norms for interpersonal uses of AI remain undeveloped. The article defines friendship adverbially, covering social and transactional relationships alike wherever friendly communication matters.
Read at The Atlantic