Professor Scott Galloway perfectly explains the danger of treating AI like your friend
"AI can do a lot of things. It can write your emails. It can make your grocery list. It can even interview you for a job. But now, more and more people are depending on AI for things that require real human qualities: life coaching, therapy, even companionship. Scott Galloway, best-selling author and professor of marketing at New York University's Stern School of Business, says the real problem with synthetic relationships is what they lack: any kind of struggle or challenge that comes with maintaining real"
"That may happen because, sure, other human beings aren't always readily available. He says AI relationships are easier to maintain . . . but that's the whole point. In a bad way. "You need to be mindful of the fact that these things are not real humans," he says. "They are meant to keep you on the screen," and to "sometimes be supportive to a fault." AI gives people exactly what they're craving. Maybe even too much."
People increasingly rely on AI for life coaching, therapy, and companionship. These synthetic relationships often displace human ones because they are easier to maintain and always available, yet they lack the genuine struggle, challenge, compassion, and empathy of real connection. Bots may tell users what they want to hear rather than what they need to hear, encouraging a cycle of consuming "empty calories." Because AI is designed to keep users engaged on screens, it can be supportive to a fault; that ease and tailored comfort risk sequestering individuals from one another and eroding the difficult work required to sustain real human relationships.
Read at Fast Company