
"The government is committed to following the evidence, and these powers will mean we can act fast on [the consultation] findings within months, rather than waiting years for new primary legislation every time technology evolves. This approach enables rapid response to emerging digital safety challenges without prolonged legislative delays."
"In January 2026, X's Grok AI tool came under fire for mass-producing nudified content of women and girls without their consent. According to researchers, Grok AI generated about three million sexualised images in less than two weeks, including 23,000 that appear to depict children, demonstrating urgent need for AI regulation."
"While the UK's OSA is one of the world's strictest online safety regimes, campaigners have long warned the laws are insufficient, enabling tech firms to continue operating with impunity, highlighting gaps between regulatory intent and practical enforcement effectiveness."
The UK government is consulting on digital well-being measures for young people, including a potential ban on social media access for under-16s. Proposed restrictions target addictive design features such as infinite scrolling and VPN access, and would require tech companies to prevent the transmission of child sexual abuse material. Prime Minister Keir Starmer announced amendments to the Online Safety Act to mandate that AI chatbot developers protect users, following incidents in which Elon Musk's Grok platform generated millions of non-consensual intimate images. The government plans to introduce legal powers enabling it to implement consultation findings within months rather than years. Although the UK's Online Safety Act, in force since March 2025, is one of the world's strictest regimes, campaigners argue existing laws remain insufficient, allowing tech firms to operate with limited accountability.
#social-media-age-restrictions #ai-regulation-and-chatbots #child-online-safety #non-consensual-intimate-images #online-safety-act-amendments
Read at ComputerWeekly.com