
"Tech platforms would have to remove intimate images which have been shared without consent within 48 hours, under a proposed UK law. The government said tackling intimate image abuse should be treated with the same severity as child sexual abuse material (CSAM) and terrorist content. Failure to abide by the rules could result in companies being fined up to 10% of their global sales or have their services blocked in the UK."
"Janaya Walker, interim director of the End Violence Against Women Coalition, said the "welcome and powerful move... rightly places the responsibility on tech companies to act." The proposals are being made through an amendment to the Crime and Policing Bill, which is making its way through the House of Lords. Under the plans, victims would only have to flag an image once, rather than contact different platforms separately."
"Tech companies would have to block the images from being re-uploaded once they have been taken down. The proposal would also provide guidance for internet service providers to be able to block access to sites hosting illegal content, the idea being that this would target rogue websites that currently fall outside of the reach of the Online Safety Act. Women, girls and LGBT people are disproportionately affected by Intimate Image Abuse (IIA)."
An amendment to the Crime and Policing Bill would require tech platforms to remove intimate images shared without consent within 48 hours and treat intimate image abuse with the same severity as CSAM and terrorist content. Failure to comply could lead to fines of up to 10% of global sales or blocking of services in the UK. Victims would need to flag an image only once, and platforms would be required to block re-uploads. Guidance for internet service providers would enable blocking of rogue websites currently outside the Online Safety Act. Women, girls and LGBT people are disproportionately affected; reports note a rise in sextortion and a 20.9% increase in reported cases in 2024.
Read at www.bbc.com