
"It is a criminal offence to share intimate images of someone without their consent under the Sexual Offences Act in England and Wales, which includes images created by AI. The law explains what constitutes an intimate image, including engaging in a sexual act, doing a thing which a reasonable person would consider to be sexual, and showing a person's exposed genitals, buttocks or breasts. This also includes being in underwear or wet or transparent clothing that exposes those body parts."
"Brandon Tyler, from Braintree, Essex, was jailed for five years last year for posting in an online forum deepfake pornography of women he knew. Under the Online Safety Act, which covers the entire UK, social media platforms have to act on intimate image abuse. They must assess the risk of this content appearing, put in place systems that reduce the likelihood of that content appearing in front of users, and take it down quickly when they become aware of it."
Grok AI on X generated images depicting real women stripped to partial undress, prompting legal and regulatory uncertainty in the UK about consent and platform responsibility. The Sexual Offences Act makes sharing intimate images without consent a criminal offence in England and Wales, and covers images created by AI. The law defines intimate images to include sexual acts, actions a reasonable person would consider sexual, and images showing exposed genitals, buttocks or breasts, including a person in underwear or in wet or transparent clothing that exposes those parts. A prompt such as 'bikini' would not strictly fall within that definition. The Online Safety Act requires platforms to assess the risk of intimate image abuse, reduce the likelihood of it appearing, and remove it promptly once aware of it. There is as yet no UK ban on nudifying apps.
Read at www.theguardian.com