The UK's Online Safety Act, enacted in October 2023, requires more than 100,000 companies to prevent users from accessing illegal or harmful content. The regulations apply to social media platforms, forums, messaging services, and other online services. Companies must enforce strict age limits for certain content and remove harmful material promptly. Non-compliance can lead to substantial fines and, in some cases, criminal charges for senior managers. Ofcom's Illegal Harms Codes, enforceable from March 2025, set out mandatory age checks and content moderation processes for platforms likely to be used by children.
The UK's Online Safety Act mandates that over 100,000 companies prevent users from accessing illegal or harmful content, requiring them to impose stringent age restrictions and remove offending material swiftly.
To enforce child safety online, Ofcom has the authority to fine companies that fail to comply with the Online Safety Act up to 10% of their global annual revenue or £18 million, whichever is greater.
Senior managers of online platforms face criminal liability for failing to comply with the regulations or to ensure that their companies meet their child safety duties under the Online Safety Act.
Ofcom's Illegal Harms Codes require internet services likely to be accessed by children to conduct robust age checks and implement content moderation that filters harmful content effectively.