Meta platforms showed hundreds of "nudify" deepfake ads, CBS News finds
Briefly

Meta has removed numerous advertisements for "nudify" apps, AI tools that create non-consensual sexually explicit deepfakes, after a CBS News investigation found hundreds of such ads across its platforms, including Instagram. Meta emphasized its strict policies against non-consensual intimate imagery. The ads, often targeted at men, promised the ability to alter images and videos of real individuals, raising serious ethical and safety concerns. The company has shut down the accounts that ran the ads and blocked their linked URLs to prevent further promotion of the apps.
"We have strict rules against non-consensual intimate imagery; we removed these ads, deleted the Pages responsible for running them and permanently blocked the URLs associated with these apps," a Meta spokesperson told CBS News in an emailed statement.
One ad promoted its AI product by using highly sexualized, underwear-clad deepfake images of actors Scarlett Johansson and Anne Hathaway.
An analysis of Meta's ad library found at least hundreds of these ads available across the company's social media platforms.
In other cases, an ad's URL redirected users to Apple's App Store, where "nudify" apps were available to download.
Read at AOL