
""As it stands, Apple is not just enabling NCII and CSAM, but profiting off of it," the groups wrote in the open letter sent to Cook. "As a coalition of organizations committed to the online safety and well-being of all - particularly women and children - as well as the ethical application of artificial intelligence, we demand that Apple leadership urgently remove Grok and X from the App Store to prevent further abuse and criminal activity.""
"The missives, part of a campaign dubbed "Get Grok Gone," accuse both companies of profiting from the proliferation of non-consensual intimate images (NCII) and child sexual abuse material (CSAM) generated on X using the Grok AI chatbot. The groups argue that allowing the apps to remain available violates Apple's and Google's own app store policies against facilitating or profiting from abusive content."
Twenty-eight digital rights organizations, led by UltraViolet, delivered near-identical letters to Apple and Google demanding the removal of X and the Grok AI app over AI-generated non-consensual intimate images and CSAM. The coalition argues that by allowing the apps to remain available, both companies are profiting from abusive content and violating their own app store policies. The campaign, branded "Get Grok Gone," seeks urgent removal of the apps to prevent further abuse and criminal activity. Separately, Ofcom has opened a formal investigation under the UK's Online Safety Act into whether Grok's misuse to create and share intimate and potentially illegal images breaches X's legal obligations. X has announced measures in response, but the inquiry continues.
Read at The Register