Roblox, Discord, OpenAI, and Google launch new child safety group
Briefly

ROOST, a nonprofit launched by Roblox, Discord, OpenAI, and Google, aims to build robust safety tools for public and private organizations, with an initial focus on child safety. The initiative will develop open-source tools to help detect and report child sexual abuse material (CSAM). With generative AI rapidly reshaping how people interact online, ROOST aims to fill urgent gaps in safety infrastructure, letting organizations adopt reliable safety measures without the burden of building them in-house. The launch comes amid growing concerns about child safety on platforms like Roblox, which is known for its large base of younger users.
Roblox, Discord, OpenAI, and Google announce ROOST, a nonprofit to create open-source safety tools, focusing on child safety in the evolving AI landscape.
Read at Engadget