UK regulator wants to ban apps that can make deepfake nude images of children
Briefly

The UK's Children's Commissioner, Dame Rachel de Souza, has called for a ban on AI deepfake apps that generate nude or sexual images of children. Her report highlights the alarming prevalence of such 'nudification' apps, whose availability has deterred many girls from sharing photos of themselves online for fear of manipulation. While CSAM itself is illegal, the apps used to create it remain legal, raising serious concerns about their impact on young people. De Souza urges the government to impose legal responsibilities on app developers and to treat deepfake sexual abuse as violence against women and girls, amid growing calls for action from young people.
Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone - a stranger, a classmate, or even a friend - could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps.
There is no positive reason for these [apps] to exist.
Nudification AI apps are widely available on mainstream platforms, including the largest search engines and app stores, and they disproportionately target girls and young women.
Young people are demanding action against the misuse of such tools.
Read at Engadget