Social media firms asked to toughen up age checks for under-13s
Briefly

"As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them. Where services have set a minimum age - such as 13 - they generally have no lawful basis for processing the personal data of children under that age on their service."
"Services were currently failing to put children's safety at the heart of their products. Ofcom wants firms to use highly-effective age checks, which are currently only required by law for certain services which provide over-18 content, such as pornography."
"Ofcom research suggests 86% of children aged 10-12 have their own social media profile, despite most social media platforms having a minimum age limit of 13. The platforms contacted by media regulator Ofcom and data watchdog the Information Commissioner's Office are Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X."
UK regulators Ofcom and the Information Commissioner's Office have instructed major technology platforms, including Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X, to strengthen age verification for children under 13. Most platforms currently rely on self-reported ages, which are easily circumvented, allowing underage children to access services not designed for them. Ofcom research suggests 86% of children aged 10-12 have social media profiles despite minimum age requirements of 13. Regulators want platforms to implement highly effective age checks, similar to those already required by law for services providing over-18 content such as pornography. Tech companies have defended their existing safeguards, with Google arguing regulators should focus on higher-risk services instead. The ICO emphasizes that where platforms set a minimum age, they generally have no lawful basis for processing the personal data of children below it.
Read at www.bbc.com