'Nudify' bots to create naked AI images in seconds rampant on Telegram: 'Nightmarish scenario'
Briefly

A Wired investigation unearthed dozens of AI-powered chatbots on the messaging app Telegram that allegedly 'create explicit photos or videos of people with only a couple clicks.' The finding points to a significant rise in the use of such tools and has deeply concerned experts, who warn they could ruin lives. Because these deepfake capabilities sit on widely used platforms, the harm is spreading, with young girls and women affected most. Lawmakers are scrutinizing the technology, but the situation remains dire.
Deepfake expert Henry Ajder stated, 'We're talking about a significant, orders-of-magnitude increase in the number of people who are clearly actively using and creating this kind of content.' Though technologically advanced, these tools are creating a nightmarish scenario, primarily for young girls and women, and their easy accessibility amplifies the potential for harm.
The report highlighted that some chatbots can 'remove clothes' from user-supplied images and generate explicit scenarios, and that around 4 million users per month leverage these capabilities. The potential for misuse is colossal: real people's images are transformed into pornographic content without their consent, raising serious ethical and legal questions about privacy and consent.
The rise of pornographic deepfakes is not limited to celebrities; reports indicate that teen girls are among those depicted in non-consensual nude images, leading to cases of 'sextortion.' Moreover, a survey found that 40% of US students had seen such content circulating in their schools, underscoring the alarming normalization of deepfake technology among youth and the urgent need for regulatory measures.
Read at New York Post