Boys are taking images of female classmates and using AI to deepfake nude photos. A landmark lawsuit could stop it.
Briefly

The proliferation of these images has exploited a shocking number of women and girls across the globe. The images are used to bully, humiliate and threaten them. The impact on victims has been devastating, damaging their reputations and mental health, stripping them of autonomy and, in some instances, driving them to suicidal thoughts.
The lawsuit alleges that the services violated numerous state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. But it can be hard to determine who runs the apps, which are unavailable in phone app stores yet still easily found on the internet.
Read at Fortune