
"It worries me that it's so normalised. He obviously wasn't hiding it. He didn't feel this was something he shouldn't be doing. It was in the open and people saw it. That's what was quite shocking. A headteacher is describing how a teenage boy, sitting on a bus on his way home from school, casually pulled out his phone, selected a picture from social media of a girl at a neighbouring school and used a nudifying app to doctor her image."
"Ten years ago it was sexting and nudes causing havoc in classrooms. Today, advances in artificial intelligence (AI) have made it child's play to generate deepfake nude images or videos, featuring what appear to be your friends, your classmates, even your teachers. This may involve removing clothes, getting an image to move suggestively or pasting someone's head on to a pornographic image."
Teenagers are using AI-powered nudifying apps and deepfake tools to create sexualised images and videos of peers, classmates and teachers. In one incident, a teenage boy on a bus selected a social-media photo of a girl at a neighbouring school and doctored it with a nudifying app. Schools, parents and police can become involved once incidents are reported, but stigma and shame often mean victims are never told. Convictions have occurred internationally, including boys sentenced for producing and sharing fake naked images of female schoolmates.
Read at www.theguardian.com