Comply with child age checks or face consequences, Ofcom tells tech firms
Briefly

New online safety regulations require risky sites and apps to use effective age checks to protect children from harmful content such as pornography and material promoting self-harm. Critics allege that the regulator, Ofcom, has favored the interests of big tech companies over children's safety, and campaigners argue the reforms lack the necessary ambition and accountability. The Molly Rose Foundation, established after 14-year-old Molly Russell took her own life having viewed harmful content on social media, voiced significant concerns about the efficacy of the changes. Ofcom maintains that the reforms aim to enhance safety while taking account of the operational concerns of technology firms.
From Friday, so-called risky sites and apps will be expected to use highly effective age checks to identify which users are children and subsequently prevent them from accessing pornography, as well as other harmful content including self-harm, suicide, eating disorders and extreme violence.
Some online safety campaigners said that while the new measures should have been a watershed moment for young people, the regulator Ofcom has instead let parents down, accusing it of prioritizing the business needs of big tech over children's safety.
The Molly Rose Foundation, founded by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media, said the changes lack ambition and accountability and warned that big tech will have taken note.
Ofcom chief executive Dame Melanie Dawes has previously defended the reforms, insisting that their intent is to enhance safety for young users while balancing that goal against the operational realities of technology platforms.
Read at www.independent.co.uk