Teen sues to destroy the nudify app that left her in constant fear
Briefly

"For the teen suing, the prime target remains ClothOff itself. Her lawyers think it's possible that she can get the app and its affiliated sites blocked in the US, the WSJ reported, if ClothOff fails to respond and the court awards her default judgment. But no matter the outcome of the litigation, the teen expects to be forever "haunted" by the fake nudes that a high school boy generated without facing any charges."
"The teen has felt "mortified and emotionally distraught, and she has experienced lasting consequences ever since," her complaint said. She has no idea if ClothOff can continue to distribute the harmful images, and she has no clue how many teens may have posted them online. Because of these unknowns, she's certain she'll spend "the remainder of her life" monitoring "for the resurfacing of these images.""
""Knowing that the CSAM images of her will almost inevitably make their way onto the Internet and be retransmitted to others, such as pedophiles and traffickers, has produced a sense of hopelessness" and "a perpetual fear that her images can reappear at any time and be viewed by countless others, possibly even friends, family members, future partners, colleges, and employers, or the public at large," her complaint said."
A teen sued ClothOff and a boy after AI-generated fake nudes caused her humiliation and lasting emotional harm. Her complaint states that the responsible individuals and witnesses failed to cooperate with law enforcement, and that she fears ongoing distribution and unknown reposting of the images. It describes hopelessness and a perpetual fear that the images could reappear to friends, family, future partners, colleges, employers, or the public. Telegram's terms of service explicitly forbid nonconsensual pornography and the tools to create it, and such content is removed when discovered. The lawsuit joins broader efforts to crack down on AI-generated CSAM and NCII.
Read at Ars Technica