Meta, TikTok and Snap are participating in an online safety ratings system
Briefly

"Numerous major social platforms including Meta, YouTube, TikTok and Snap they will submit to a new external grading process that scores social platforms on how well they protect adolescent mental health. The program comes from the Mental Health Coalition's Safe Online Standards (SOS) initiative, which comprises about two dozen standards covering areas like platform policy, functionality, governance and transparency, content oversight and more."
"In announcing these companies' participation, the Mental Health Coalition "SOS establishes clear, user-informed data for how social media, gaming, and digital platforms design products, protect users ages 13-19, and address exposure to suicide and self-harm content. Participating companies will voluntarily submit documentation on their policies, tools, and product features, which will be evaluated by an independent panel of global experts.""
"After evaluation, the platforms will be given one of three ratings. The highest achievable safety rating is "use carefully," which comes with a blue badge that compliant platforms can display. Despite being the highest rating, the requirements seem fairly run-of-the-mill. The description includes things like "reporting tools are accessible and easy to use," and "privacy, default and safety functions are clear and easy to set for parents." As for what actions the standards ask of the companies being rated, the "use carefully" rating says "platforms and filters help reduce exposure to harmful or inappropriate content.""
Meta, YouTube, TikTok, Snap and other major platforms will submit to an external grading process that scores protections for adolescent mental health. The Mental Health Coalition's Safe Online Standards (SOS) initiative comprises about two dozen standards covering policy, functionality, governance, transparency, content oversight and related areas. The initiative is led by Dr. Dan Reidenberg. Participating companies will voluntarily submit documentation on policies, tools and product features for evaluation by an independent panel of global experts. Platforms will receive one of three ratings: "use carefully" (highest), "partial protection," or "does not meet standards." "Use carefully" requires accessible reporting tools, clear privacy and safety defaults for parents, and filters to reduce exposure to harmful content.
Read at Engadget