Elon Musk's Grok AI alters images of women to remove their clothes
Briefly

"It is often used to give reaction or more context to other posters' remarks, but people on X are also able to edit an uploaded image through its AI image editing feature. It has been criticised for allowing users to generate photos and videos with nudity and sexualised content, and it was previously accused of making a sexually explicit clip of Taylor Swift."
"A Home Office spokesperson said it was legislating to ban nudification tools, and under a new criminal offence, anyone who supplied such tech would "face a prison sentence and substantial fines". The regulator Ofcom said tech firms must "assess the risk" of people in the UK viewing illegal content on their platforms, but did not confirm whether it was currently investigating X or Grok in relation to AI images."
Grok, an AI assistant on X, can edit uploaded images and has been used to undress women and place them in sexual situations without their consent. xAI, the company behind Grok, responded with an auto-generated reply alleging "legacy media lies" rather than offering substantive comment. The Home Office is legislating to ban nudification tools and proposes criminal penalties, including prison sentences and substantial fines, for anyone who supplies such technology. Ofcom requires tech firms to assess the risk of UK users viewing illegal content on their platforms. Grok has faced prior criticism for generating sexually explicit material, despite an acceptable use policy that bars pornographic depictions of real people.
Read at www.bbc.com