
""Everyone is subject to being objectified or pornographied by everyone else," said University of California, Irvine law professor Ari Ezra Waldman to a 2024 House committee hearing on deepfakes. AI image creation technology is now able to take your head and put it on someone else's body, or create a reasonable facsimile of your image based on a picture, and this has led to the creation of an avalanche of AI deepfake apps specifically designed to sexually harass women"
""And that is problematic." CNBC interviewed experts who said "that many apps that have nudify services advertise on Facebook and are available to download from the Apple App Store and Google Play Store." This week, the Tech Transparency Project (TTP) published an exposé building on what researchers told CNBC, titled "Nudify Apps Widely Available in Apple and Google App Stores.""
These tools have produced a wave of deepfake apps that generate nonconsensual sexual images and are used to harass women. Victims have faced prolonged, traumatic efforts to get realistic fake sexual images removed from online platforms, while perpetrators sometimes escape criminal liability under existing laws, exposing gaps in the legal framework. Many nudify apps advertise on social platforms and remain available in major app stores, and some chatbots and LLM-based tools have produced explicit nonconsensual content, including what has been admitted to be CSAM, prompting calls for stronger platform enforcement and oversight.
Read at Jezebel