There are hundreds of open investigations into content shared on X, a senior Irish police officer has said, amid concerns over potential child sexual abuse material (CSAM) generated on the platform using the artificial intelligence tool Grok. Irish politicians convened on Wednesday for an Oireachtas Media Committee hearing with Irish police and other experts, which largely dealt with growing concerns over the proliferation of CSAM and other AI-generated sexualised material on the social network.
The Internet Watch Foundation (IWF) says its analysts have discovered "criminal imagery" of girls aged between 11 and 13 which "appears to have been created" using Grok. The AI tool is owned by Elon Musk's firm xAI. It can be accessed either through its website and app, or through the social media platform X. The IWF said it found "sexualised and topless imagery of girls" on a "dark web forum" in which users claimed they used Grok to create the imagery.
For the teen plaintiff, the prime target remains ClothOff itself. Her lawyers believe she may be able to get the app and its affiliated sites blocked in the US, the WSJ reported, if ClothOff fails to respond and the court awards her a default judgment. But whatever the outcome of the litigation, the teen expects to be forever "haunted" by the fake nudes that a high school boy generated without facing any charges.