AI deepfake porn should be a federal crime, advocates and victims say
Briefly

"When you don't have clear legislation on a federal and state level, when a victim goes to law enforcement, they are frequently told there's nothing that can be done," said Andrea Powell, director of the Image-Based Sexual Violence Initiative. Without comprehensive legislation, victims of deepfake pornography are left with no legal recourse.
"Those individuals then went on to have offline threats of sexual violence, harassment, and what we've also found, unfortunately, is that some [victims] don't survive," Powell added. When the law fails to protect victims, online abuse can escalate into offline violence, and in some cases it proves fatal.
According to Powell, Google Search lists roughly 9,000 websites showing explicit deepfake abuse, and the volume of deepfake sexual content online grew by more than 400% between 2022 and 2023.
"It's getting to the point that you're seeing 11- and 12-year-old girls who are scared to be online," she said, a sign of how far the fear created by AI-generated pornography now reaches.
Read at Quartz