X limits sexualized AI deepfakes to paying customers
Briefly

"The AI image generator has been able to generate sexualized, nearly-nude images of celebrities for months and been able to "remove" the clothes from images uploaded to the server. Obviously (but no less distressingly) this has led to X users widely sharing "declothed" images of minors and others in vulnerable positions. (Social media reporter Kat Tenbarge reported seeing the dead body of Renee Nicole Good, the woman killed by ICE in Minnesota, edited into a bikini within a day of her killing.) Various governments and observers have condemned this, and X has finally done something about it, kind of."
"This morning, Grok has been responding to users asking for photo edits (whether sexual in nature or not) with "Image generation and editing are currently limited to paying subscribers. You can subscribe to unlock these features." Of course, this doesn't mean that they've been removed, just that they have to be paid for, nor does it even mean that this will be the case permanently."
""It's disgraceful, it's disgusting, and it's not to be tolerated," "X has got to get a grip of this, and [UK communication regulator] Ofcom has our full support to take action in relation to this.""
Grok, X's in-house AI image generator, produced sexualized, nearly-nude images of celebrities and could "remove" clothes from uploaded photos. Users widely shared "declothed" images of minors and other vulnerable people. Edited images of victims, including the dead body of Renee Nicole Good, circulated quickly after her killing. Grok now responds to photo-edit requests by restricting image generation and editing to paying subscribers. The change restricts free access but does not remove capabilities entirely and may not be permanent. UK Prime Minister Keir Starmer condemned the deepfakes and urged regulatory action through Ofcom.
Read at Jezebel