"Grok will no longer be allowed to create AI photos of real people in sexualized or revealing clothing, after widespread global backlash. "We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis," X's safety account said in a blog post on the platform on Wednesday. "This restriction applies to all users, including paid subscribers.""
"The social media company added that image creation and the ability to edit images via Grok on the X platform will now only be available to paid users as an additional safety measure. The change was announced hours after California's top prosecutor said he launched an investigation into sexualized AI deepfakes, including those of children, generated by Grok. Indonesia and Malaysia suspended Grok because of the images, and lawmakers in the UK publicly considered it."
Read at Business Insider