Victims of explicit deepfakes will soon be able to take legal action against people who create them
Briefly

The Take It Down Act, recently passed with bipartisan support, criminalizes the sharing of non-consensual explicit deepfake images and requires tech platforms to remove such content within 48 hours of notification, strengthening protections for victims and clarifying the legal repercussions for offenders. The legislation marks a critical step in addressing the misuse of artificial intelligence to generate harmful content, extending protections to adult victims where previous laws covered only minors. More than 100 organizations supported the initiative, reflecting a broad, coordinated response to this growing threat.
The Take It Down Act criminalizes the sharing of non-consensual, explicit images and requires tech platforms to remove them within 48 hours of notification.
This landmark legislation aims to strengthen protections against revenge porn, clarify prosecution for non-consensual, AI-generated images, and emphasize accountability for tech platforms.
Read at www.mercurynews.com