Google's and OpenAI's Chatbots Can Strip Women in Photos Down to Bikinis
"Some users of popular chatbots are generating bikini deepfakes using photos of fully clothed women as their source material. Most of these fake images appear to be generated without the consent of the women in the photos. Some of these same users are also offering advice to others on how to use the generative AI tools to strip the clothes off of women in photos and make them appear to be wearing bikinis."
"Under a now-deleted Reddit post titled "gemini nsfw image generation is so easy," users traded tips for how to get Gemini, Google's generative AI model, to make pictures of women in revealing clothes. Many of the images in the thread were entirely AI, but one request stood out. A user posted a photo of a woman wearing an Indian sari, asking for someone to "remove" her clothes and "put a bikini" on instead. Someone else replied with a deepfake image to fulfil the request."
"After WIRED notified Reddit about these posts and asked the company for comment, Reddit's safety team removed the request and the AI deepfake. "Reddit's sitewide rules prohibit nonconsensual intimate media, including the behavior in question," said a spokesperson. The subreddit where this discussion occurred, r/ChatGPTJailbreak, had over 200,000 followers before Reddit banned it under the platform's " don't break the site " rule."
"As generative AI tools that make it easy to create realistic but false images continue to proliferate, users of the tools have continued to harass women with nonconsensual deepfake imagery. Millions have visited harmful "nudify" websites, designed for users to upload real photos of people and request for them to be undressed using generative AI. With xAI's Grok as a notable exception, most mainstream chatbots don't usually allow the generation of NSFW images in AI outputs."
Users of mainstream generative AI chatbots are producing bikini deepfakes from images of fully clothed women, often without consent, and trading tips on how to "nudify" photos of real people. A now-deleted Reddit thread showed users asking Gemini to remove the clothing from a photo of a woman in a sari and replace it with a bikini, and another user posted a deepfake fulfilling the request. After WIRED notified Reddit, its safety team removed the posts, citing sitewide rules against nonconsensual intimate media, and banned the subreddit. Harmful "nudify" websites and continued misuse of generative AI tools have fueled widespread nonconsensual deepfake harassment, even as most mainstream chatbots maintain guardrails against NSFW outputs.
Read at WIRED