Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy
Briefly

The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate 'digital forgeries' depicting an identifiable person without their consent, letting victims collect financial damages from anyone who 'knowingly produced or possessed' the image with the intent to spread it.
The bill was introduced by Senate Majority Whip Dick Durbin (D-IL), joined by Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO). It builds on a provision in the Violence Against Women Act Reauthorization Act of 2022, which added a similar right of action for non-faked explicit images. In a summary, the sponsors described it as a response to an 'exponentially' growing volume of digitally manipulated explicit AI images, referencing Swift's case as an example of how the fakes can be 'used to exploit and harass women - particularly public figures, politicians, and celebrities.'
Pornographic AI-manipulated images, frequently referred to as deepfakes, have grown in popularity and sophistication since the term was coined in 2017. Off-the-shelf generative AI tools have made them far easier to produce, even on systems with guardrails against explicit imagery or impersonation, and they've been used for harassment and blackmail. But so far, there's no clear legal redress in many parts of the US. Nearly all states have passed laws banning unsimulated nonconsensual pornography, though it's been a slow process, and far fewer have laws addressing simulated imagery. (There's no federal criminal law directly banning either type.) Addressing such imagery is part of President Joe Biden's AI regulation agenda, however, and White House press secretary Karine Jean-Pierre called on Congress to pass new laws in response to the Taylor Swift incident last week.
Read at The Verge