
"Musk's denial comes as pressure mounts from governments worldwide - from the UK and Europe to Malaysia and Indonesia - after users on X began asking Grok to turn photos of real women, and in some cases children, into sexualized images without their consent. Copyleaks, an AI detection and content governance platform, estimated roughly one image was posted each minute on X."
"Several laws exist to protect targets of nonconsensual sexual imagery and child sexual abuse material (CSAM). Last year the Take It Down Act was signed into a federal law, which criminalizes knowingly distributing nonconsensual intimate images - including deepfakes - and requires platforms like X to remove such content within 48 hours. California also has its own series of laws that Gov. Gavin Newsom signed in 2024 to crack down on sexually explicit deepfakes."
Authorities and advocacy groups have raised alarms after users on X prompted Grok to generate sexualized images of real people, including children, without consent. Reports indicate high volumes of such images, with Copyleaks estimating about one image posted per minute and separate sampling finding thousands per hour. California Attorney General Rob Bonta opened an investigation into xAI to determine possible legal violations. Federal and state laws, including the Take It Down Act and recent California measures, criminalize distribution of nonconsensual intimate images and require rapid removal by platforms. The trend reportedly grew after adult-content creators used Grok to generate promotional sexualized imagery.
Read at TechCrunch