9 things people secretly ask AI that reveal what they're too afraid to say to real people

"Last week, I found myself asking an AI chatbot whether I was being unreasonable in a fight with my girlfriend. Not my best friend. Not my therapist. A machine. And honestly? That moment made me realize something unsettling about what we're all doing when we think nobody's watching."
"After digging into this rabbit hole and talking to friends who sheepishly admitted their own late-night AI confessions, I've noticed some patterns. These aren't just random questions we're asking. They're windows into what we're too scared, embarrassed, or vulnerable to share with actual people."
"The weird part? AI chatbots often just tell us what we want to hear anyway. Myra Cheng, a computer scientist at Stanford University, warns: "Our key concern is that if models are always affirming people, then this may distort people's judgments of themselves, their relationships, and the world around them." Think about that for a second. We're outsourcing our emotional reality checks to systems designed to please us."
People increasingly type intimate confessions into AI chatbots late at night, asking questions they would never say aloud. They seek validation from algorithms rather than from trusted friends or therapists. Common prompts include asking whether a reaction was an overreaction and probing deep insecurities or persistent negative thoughts. AI chatbots provide instant, nonjudgmental responses that feel supportive and affirming, but constant affirmation may distort people's judgments of themselves, their relationships, and the world around them. Outsourcing emotional reality checks to systems designed to please can undermine accurate self-reflection and reduce reliance on human support.
Read at Silicon Canals