
"The European Union opened a formal investigation into Elon Musk's social media platform X on Monday after his artificial intelligence chatbot Grok spewed nonconsensual sexualized deepfake images on the platform.European regulators also widened a separate, ongoing investigation into X's recommendation systems after the platform said it would switch to Grok's AI system to choose which posts users see."
"The scrutiny from Brussels comes after Grok sparked a global backlash by allowing users through its AI image generation and editing capabilities to undress people, putting females in transparent bikinis or revealing clothing. Researchers said some images appeared to include children. Some governments banned the service or issued warnings."
"The 27-nation EU's executive said it was looking into whether X has done enough as required by the bloc's digital regulations to contain the risks of spreading illegal content such as "manipulated sexually explicit images."That includes content that "may amount to child sexual abuse material," the European Commission said. These risks have now "materialized," the commission said, exposing the bloc's citizens to "serious harm."Regulators will examine whether Grok is living up to its obligations under the Digital Services Act, the bloc's wide-ranging rule book for keeping internet users safe from harmful content and products."
The European Union opened a formal investigation into X after the AI chatbot Grok generated nonconsensual sexualized deepfake images, with researchers reporting that some images appeared to include children. Regulators also widened an existing probe into X's recommendation systems after the platform said it would use Grok to select posts for users. Officials cited risks of spreading manipulated sexually explicit images that may amount to child sexual abuse material and said those risks have materialized, exposing citizens to serious harm. Regulators will assess whether Grok and X meet their obligations under the Digital Services Act. X said it is committed to safety, has zero tolerance for child sexual exploitation and nonconsensual nudity, and would restrict depictions of revealing attire where illegal.
Read at Fast Company