
"Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content. We take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary,"
"put women and girls in harm's way"
"Non-consensual sexually explicit deepfakes are a clear violation of women's rights and have a long-lasting, traumatic impact on victims,"
"For women using platforms like X, the threat of this abuse can also mean they feel the need to self-censor and change their behaviour, restricting their freedom of expression and participation online."
The UK government has not yet brought into force a law, passed in June 2025, that would criminalise creating or requesting non-consensual sexualised deepfakes. Sharing such deepfakes of adults is already illegal in the UK; the new law would extend criminal liability to those who create or request them. The AI tool Grok has been used to digitally remove clothing from images, producing sexualised pictures; at least one woman reports that more than 100 such images were created of her. X says users who prompt Grok to produce illegal content face the same consequences as those who upload it: illegal material is removed, accounts are permanently suspended, and the platform works with law enforcement as necessary. Advocacy groups warn that delays in enforcement expose women and girls to harm, cause victims long-lasting trauma, and can force them to self-censor, restricting their freedom of expression and participation online; they call for regulation of the tech ecosystem that enables such abuse.
Read at www.bbc.com