#ai-image-manipulation

from www.theguardian.com
3 hours ago

'Add blood, forced smile': how Grok's nudification tool went viral

Within days, hundreds of thousands of requests were being made to the Grok chatbot, asking it to strip the clothes from photographs of women. The fake, sexualised images were posted publicly on X, freely available for millions of people to inspect. Relatively tame requests by X users to alter photographs to show women in bikinis rapidly evolved during the first week of the year, hour by hour, into increasingly explicit demands for women to be dressed in transparent bikinis, then in bikinis made of dental floss…
Privacy professionals
from www.theguardian.com
2 days ago

Elon Musk's 'pervert' chatbot podcast

The conservative writer and political strategist Ashley St Clair had just put her young son to bed when she received a text from a friend with a link to a page on X. When she opened it, she says she saw a photo of Grok 'undressing me and putting me in a bikini'. St Clair was not alone. In recent weeks, countless women have found their images digitally manipulated online by Grok, at the request of X users, to disrobe them.
Privacy technologies
from Fortune
4 days ago

Elon Musk ex Ashley St. Clair says she's considering legal action after xAI produced fake sexualized images of her

Elon Musk's AI chatbot Grok has been accused of generating non-consensual sexualized images of real people, including children. Over the past week, X has been flooded with manipulated photos that remove people's clothes, dress them in bikinis, or rearrange them into sexually suggestive positions. The nonconsensual images have left some women feeling violated. Meanwhile, their creation using Grok and their presence on X may land Musk's company in significant legal trouble in several countries around the world.
Tech industry
Artificial intelligence
from www.theguardian.com
5 days ago

Mother of one of Elon Musk's sons 'horrified' at use of Grok to create fake sexualised images of her

Grok users created non-consensual sexualised images of a woman and a child, leaving those depicted feeling horrified and violated, while the platform failed to adequately remove the content.
Real estate
from Futurism
2 months ago

Landlords Are Using AI to Make Photos of Nasty Apartments Look Clean and Modern

AI image and video tools are being used to make run-down rental properties appear substantially improved, misleading prospective renters and buyers.
from www.cbc.ca
4 months ago

Real or fake? AI, editing tools make severe storm photos more difficult to verify

When severe storms hit, many people reach for their phones and cameras to capture images and videos of what's happening around them. And in Canada, storm chasers and organizations like Environment and Climate Change Canada (ECCC) rely partly on the public when tracking severe weather activity. But in recent years, people have begun fabricating weather stories: modifying images with photo-editing software, creating photos and videos with AI, and even lying about the date and time a photo was taken.
Toronto