Under the UK's new Online Safety Act, social media platforms must implement comprehensive measures to combat illegal content, with severe penalties for non-compliance. Tech companies are required to act against harmful material including fraud, terrorism, and child sexual abuse. Fines can reach £18 million or 10% of qualifying global revenue, whichever is greater, sharply raising the stakes for accountability. The Act specifies 130 priority offences and is accompanied by Ofcom codes of practice to help platforms strengthen their safety measures, marking a significant shift in the industry's approach to user protection.
Social media platforms face fines if they fail to combat illegal content under the new UK digital safety rules, which require them to protect users from fraud and exploitation.
From Monday, every site and app in scope of the Online Safety Act must take action against illegal content, including material that encourages suicide, extreme pornography, and drug sales, marking a crucial shift in accountability.
Tech companies are under growing pressure to treat safety as a priority rather than an afterthought, facing significant penalties for non-compliance that, for the largest platforms, could run to billions of pounds.
The Act lists 130 priority offences requiring immediate attention, with Ofcom codes of practice setting out how platforms should design their moderation systems to tackle problems such as online fraud and threats to child safety.