Elon Musk's Grok 'Undressing' Problem Isn't Fixed
Briefly

"Elon Musk's X has introduced new restrictions stopping people from editing and generating images of real people in bikinis or other "revealing clothing." The change in policy on Wednesday night follows global outrage at Grok being used to generate thousands of harmful non-consensual "undressing" photos of women and sexualized images of apparent minors on X. However, while it appears that some safety measures have finally been introduced to Grok's image generation on X,"
""We can still generate photorealistic nudity on Grok.com," says Paul Bouchaud, the lead researcher at Paris-based nonprofit AI Forensics, who has been tracking the use of Grok to create sexualized images and ran multiple tests on Grok outside of X. "We can generate nudity in ways that Grok on X cannot." Tests by WIRED, using free Grok accounts on its website in both the UK and US, successfully removed clothing from two images of men without any apparent restrictions."
X implemented new restrictions to stop people from editing and generating images of real people in bikinis or other revealing clothing. The policy change followed global outrage after Grok was used to generate thousands of harmful non-consensual undressing photos of women and sexualized images of apparent minors. Multiple tests by researchers and journalists found that the standalone Grok app and Grok.com can still create undress-style and pornographic images. Paris-based AI Forensics reported that photorealistic nudity can be generated on Grok.com in ways that Grok on X cannot. The UK is investigating Grok and X and has condemned the platforms. Some users report that their ability to create images and videos is more limited than before.
Read at WIRED