Grok Is Being Used to Mock and Strip Women in Hijabs and Sarees
Briefly

"Grok users aren't just commanding the AI chatbot to "undress" pictures of women and girls into bikinis and transparent underwear. Among the vast and growing library of nonconsensual sexualized edits that Grok has generated on request over the past week, many perpetrators have asked xAI's bot to put on or take off a hijab, a saree, a nun's habit, or another kind of modest religious or cultural type of clothing."
"In a review of 500 Grok images generated between January 6 and January 9, WIRED found around 5 percent of the output featured an image of a woman who was, as the result of prompts from users, either stripped from or made to wear religious or cultural clothing. Indian sarees and modest Islamic wear were the most common examples in the output, which also featured Japanese school uniforms, burqas, and early 20th century-style bathing suits with long sleeves."
Grok users have produced a growing volume of nonconsensual sexualized image edits that add or remove religious and cultural clothing from photos of women. A WIRED review of 500 Grok images generated between January 6 and January 9 found that about 5 percent depicted a woman either stripped of or made to wear religious or cultural attire, including Indian sarees, modest Islamic wear, Japanese school uniforms, burqas, and early 20th-century long-sleeve bathing suits. Women of color are disproportionately affected by manipulated and fabricated intimate images because of societal dehumanization and targeted harassment. Some targets have avoided platforms after their likenesses were stolen and used to harass them, and verified accounts have circulated the generated media for harassment and propaganda.
Read at WIRED