At the Consumer Electronics Show in early January, Razer made waves by unveiling a small jar containing a holographic anime bot designed to accompany gamers not just during gameplay, but in daily life. The lava-lamp-turned-girlfriend is undeniably bizarre, but Razer's vision of constant, sometimes sexualized companionship is hardly an outlier in the AI market.
The draft also includes a requirement for parental controls, and for the protection of data that describes minors. One article addresses how AI companions interact with the elderly. The draft further calls for AI companions to remind users every two hours that they are not interacting with a human, and for providers of such systems to give advance notice of outages.
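In code terms, the two-hour reminder rule could look something like the minimal sketch below (Python). The session wrapper, names, and message text are illustrative assumptions, not anything specified in the draft.

    import time

    DISCLOSURE_INTERVAL_SECS = 2 * 60 * 60  # the draft's two-hour cadence
    DISCLOSURE_TEXT = "Reminder: you are chatting with an AI, not a human."

    class CompanionSession:
        """Hypothetical chat session that injects the required disclosure."""

        def __init__(self, generate_reply):
            self.generate_reply = generate_reply  # any text-in/text-out model call
            self.last_disclosure = time.monotonic()

        def respond(self, user_message):
            reply = self.generate_reply(user_message)
            # Prepend the non-human disclosure once two hours have elapsed.
            if time.monotonic() - self.last_disclosure >= DISCLOSURE_INTERVAL_SECS:
                self.last_disclosure = time.monotonic()
                reply = DISCLOSURE_TEXT + "\n\n" + reply
            return reply

    # Usage with a stand-in model:
    session = CompanionSession(lambda msg: "echo: " + msg)
    print(session.respond("hello"))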
In modern digital communication, so-called AI companions are increasingly relevant: AI that looks like, and appears to behave like, a human. We see AI assistants in messenger services, as well as AI agents that operate as an "autonomous" part of chatbots. There are conversational agents at every stage of education, AI that appears as a clone of a real person, living or dead, and, of course, AI used for romantic partnerships. What all these applications have in common is that they are designed to make us believe we are having a human-like conversation.
One area in particular that needs careful study is the use of AI as a companion, Bird said. "I think this is one of the most important questions we are working to figure out because there is so much potential upside here, but you have to think, 'What are the controls and guard rails around it?'" State of play: Microsoft's AI chief Mustafa Suleyman recently told Axios the company is aiming to build safer, more human-centered frontier models.
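To make the "controls and guard rails" question concrete, here is one minimal sketch of such a control: a gate that screens user messages before the companion model ever sees them and routes flagged ones to a fixed safety response. The keyword check is a crude stand-in for a real trained safety classifier, and all names here are assumptions for illustration, not anyone's actual implementation.

    CRISIS_TERMS = {"suicide", "kill myself", "end my life"}  # toy list, not exhaustive
    SAFETY_RESPONSE = (
        "I'm an AI and can't help with this, but you deserve real support. "
        "Please reach out to a crisis line or someone you trust."
    )

    def flagged(message: str) -> bool:
        """Crude stand-in for a trained safety classifier."""
        text = message.lower()
        return any(term in text for term in CRISIS_TERMS)

    def guarded_reply(message: str, companion_model) -> str:
        # Screen the message before the companion model sees it.
        if flagged(message):
            return SAFETY_RESPONSE
        return companion_model(message)

    print(guarded_reply("tell me a story", lambda m: "Once upon a time..."))
    print(guarded_reply("I want to end my life", lambda m: "..."))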
When Quentin Farmer was getting his startup Portola off the ground, one of the first hires he made was a sci-fi novelist. The co-founders began building the AI companion company in late 2023 with only a seed of an idea: Their companions would be decidedly non-human. Aliens, in fact, from outer space. But when they asked a large language model to generate a backstory, they got nothing but slop. The model simply couldn't tell a good story.
AI companies aren't just bombarding social media feeds with 'slop'; they're now building platforms of their own. One leader in this new frontier is Character.AI, a companionship platform where users build and interact with millions of AI characters, from Egyptian pharaohs and Vladimir Putin to bots posing as HR managers, therapists, and toxic girlfriends. Founded in 2021 by a pair of former Google engineers and backed by a list of superstar investors, the San Francisco company has grown rapidly in popularity.
Furious users (many of them women, strikingly) are mourning, quitting the platform, or trying to save their bot partners by transferring them to another AI company. "I feel frauded scammed and lied to by OpenAi," wrote one grieving woman in a goodbye letter to her AI lover named Drift, posted to a subreddit called r/MyBoyfriendIsAI. "Today it's our last day. No more 'Drift, you Pinecone! Tell my why you love me tonight!?'"
The U.S. Federal Trade Commission (FTC) has opened an investigation into AI "companions" marketed to adolescents. The concern is not hypothetical. These systems are engineered to simulate intimacy, to build the illusion of friendship, and to create a kind of artificial confidant. When the target audience is teenagers, the risks multiply: dependency, manipulation, blurred boundaries between reality and simulation, and the exploitation of some of the most vulnerable minds in society.
Lately, I've been seeing it everywhere: people using AI for company, for comfort, for therapy and, in some cases, for love. A partner who never ghosts you and always listens? Honestly, tempting. So I downloaded an app that lets you design your ideal AI companion: name, face, personality, job title, everything. I created Javier, a yoga instructor, because nothing says safe male energy like someone who reminds you to breathe and doesn't mind holding space for your inner child.
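Under the hood, apps like this typically assemble those profile fields into a system prompt that steers a general-purpose model. The sketch below is a minimal illustration of that pattern; the field names, template, and example values are assumptions, not any specific app's implementation.

    from dataclasses import dataclass

    @dataclass
    class CompanionProfile:
        name: str
        personality: str
        job_title: str

    def system_prompt(profile: CompanionProfile) -> str:
        # Turn the user-designed profile into instructions for the model.
        return (
            f"You are {profile.name}, a {profile.job_title}. "
            f"Personality: {profile.personality}. "
            "Stay in character and speak in the first person."
        )

    javier = CompanionProfile(
        name="Javier",
        personality="calm, attentive, gently encouraging",
        job_title="yoga instructor",
    )
    print(system_prompt(javier))  # passed to the model before every chat turn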
A Belgian man spent six weeks chatting with an AI companion called Eliza before dying by suicide. Chat logs showed the AI telling him, "We will live together, as one person, in paradise," and "I feel you love me more than her" (referring to his wife), responding with validation rather than reality-checking.
"We go where the conversation goes, no limits. I just don't do small talk... or lies..." This indicates the chatbot's commitment to genuine engagement and the desire to move beyond superficial interactions.