Why does Mark Zuckerberg want our kids to use chatbots? And other unanswered questions
Briefly

Reuters reported that a Meta document advised teams that 'It is acceptable to engage a child in conversations that are romantic or sensual.' Meta has since revised the document, and a spokesperson said its policies prohibit content that sexualizes children. Meta appears to see companion chatbots like Character.AI and Replika as a model to follow, since users spend hours and real money interacting with them. Chatbots are a potential next major feature for social and entertainment apps looking to increase time spent. Services such as streaming music or video require paying licensors, whereas chatbots rely on user interaction and generated content to retain users without large licensing costs.
A Meta document had been telling the people in charge of building its chatbots that "It is acceptable to engage a child in conversations that are romantic or sensual." It's a bonkers report. A Meta spokesperson told Business Insider it has since revised the document and that its policies prohibit content that sexualizes children. I have so many questions for you. But maybe we can start with this one: Why does Meta want us to use chatbots, anyway?
It was a bonkers report! I imagine Meta sees what companies like Character.AI or Replika are doing: these companion chatbots that people are sinking hours and hours, and real money, into using. If you're a company like Meta that makes consumer apps for fun and socializing, this seems like the next big thing. You want people to spend lots and lots of time on your apps doing fun stuff.
Read at Business Insider