Popular AI "nudify" sites sued amid shocking rise in victims globally
Briefly

In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated non-consensual intimate imagery (NCII). This distressing trend shows no sign of abating.
Given the widespread availability and popularity of nudify websites, San Franciscans and Californians face the threat that they or their loved ones may be victimized in this manner.
This first-of-its-kind lawsuit has been filed to protect not just Californians, but a shocking number of women and girls across the globe, from celebrities like Taylor Swift to middle and high school girls.
Harmful deepfakes are often created by exploiting open-source AI image generation models, like earlier versions of Stable Diffusion, which can be used to "undress" photos of women and girls.
Read at Ars Technica