Roblox joins $27 million industry nonprofit to support online safety
Briefly

Roblox, alongside major tech players like Google and OpenAI, has established ROOST, a nonprofit focused on creating free, open-source tools to enhance online safety. The initiative is particularly significant for platforms popular with children, and it responds to past criticism that safety measures on such platforms were inadequate. Advances in AI have enabled tools that detect harmful content rapidly; Roblox's existing AI model for flagging inappropriate audio has proven effective and will be expanded with multilingual support. By making these technologies freely available, ROOST aims to put child-safety tooling within reach of developers of all sizes.
If you're a small game developer who wants to moderate content effectively, access to these tools is crucial.
Read at Fast Company