Roblox, alongside major tech players like Google and OpenAI, has established ROOST, a nonprofit focused on building free, open-source tools for online safety. The initiative is particularly significant for platforms popular with children, and it responds to past criticism that safety measures on such platforms were inadequate. Advances in AI have made it possible to detect harmful content in near real time: Roblox's existing AI model for flagging inappropriate audio has proven effective, and it will be expanded with multilingual support, making such technology accessible to developers of all sizes and helping keep children safe online.
These moderation decisions need to happen within milliseconds. For a small game developer who wants to moderate content effectively, having access to tools like these is crucial.
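To make the millisecond constraint concrete, here is a minimal Python sketch of a latency-budgeted moderation check. Everything in it is an assumption for illustration: `classify_audio_chunk`, the 50 ms budget, and the harm threshold are hypothetical, not part of Roblox's or ROOST's announced tooling.

```python
import time

# Hypothetical stand-in for an open-source moderation classifier of the
# kind ROOST aims to publish; the real tools and APIs are not specified
# in the announcement, so this function is purely illustrative.
def classify_audio_chunk(chunk: bytes) -> float:
    """Return a harm score in [0, 1] for a chunk of voice audio (stub)."""
    return 0.02  # placeholder score; a real model would run inference here

LATENCY_BUDGET_S = 0.05   # "within milliseconds": 50 ms budget (illustrative)
HARM_THRESHOLD = 0.8      # block content scoring above this (illustrative)

def moderate(chunk: bytes) -> bool:
    """Return True if the chunk may be delivered, False if it is held.

    Fails closed on budget overrun: if the classifier cannot answer in
    time, the chunk is held rather than passed through unmoderated.
    """
    start = time.monotonic()
    score = classify_audio_chunk(chunk)
    elapsed = time.monotonic() - start
    if elapsed > LATENCY_BUDGET_S:
        return False  # too slow to decide safely; hold the content
    return score < HARM_THRESHOLD

if __name__ == "__main__":
    print(moderate(b"\x00" * 320))  # True for the stub's benign score
```

The fail-closed design choice, holding content when the classifier misses its deadline, is one common way real-time moderation systems trade a little friction for safety; whether ROOST's tools take that approach is not stated in the announcement.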