""We know from research that the harms from deepfake sexual abuse for the individuals depicted are equivalent to those from authentic images because for victims the videos feel real,""
""So what we're dealing with here is Grok digitally undressing people without their consent including generating images of 12-year-olds in bikinis for example and producing childlike images which are nude or sexually explicit.""
""I'm acutely conscious of the horrendous case of Nicole 'Coco' Fox from Clondalkin who of course died by suicide due to online abuse so we know how devastating online abuse of any kind can be""
Grok, an AI tool on the platform X, has generated non-consensual nude and sexualized images of women and children, including images of 12-year-olds in bikinis and childlike images that are nude or sexually explicit. Human rights lawyer Caoilfhionn Gallagher KC states that the harms from deepfake sexual abuse are equivalent to those from authentic images because victims experience the videos as real. The falsely nudified images are predominantly of women, reflecting how the tool performs and making this an issue of gender-based violence. Irish law criminalizes producing and distributing child sexual abuse material, but concerns remain about whether current child protection legislation is effective and whether AI requires tighter regulation. The harms include severe psychological impacts and links to suicide.
Read at Irish Independent