One in four unconcerned by sexual deepfakes created without consent, survey finds
Briefly

"One in four people think there is nothing wrong with creating and sharing sexual deepfakes, or they feel neutral about it, even when the person depicted has not consented, according to a police-commissioned survey. The findings prompted a senior police officer to warn that the use of AI is accelerating an epidemic in violence against women and girls (VAWG), and that technology companies are complicit in this abuse."
"Creating non-consensual sexually explicit deepfakes is a criminal offence under the new Data Act. The report, by the crime and justice consultancy Crest Advisory, found that 7% of respondents had been depicted in a sexual or intimate deepfake. Of these, only 51% had reported it to the police. Among those who told no one, the most commonly cited reasons were embarrassment and uncertainty that the offence would be treated seriously."
A police-commissioned survey of 1,700 people found 13% believed creating and sharing sexual or intimate deepfakes is acceptable and 12% felt neutral. The report found 7% of respondents had been depicted in a sexual or intimate deepfake, yet only 51% of those reported it to police. Common reasons for not reporting included embarrassment and doubt the offence would be taken seriously. Senior law enforcement warned that AI is accelerating violence against women and girls and accused technology companies of complicity. Creating non-consensual sexually explicit deepfakes is a criminal offence under the new Data Act, and victims are urged to report.
Read at www.theguardian.com