#csam

from Ars Technica
13 hours ago

UK probes X over Grok CSAM scandal; Elon Musk cries censorship

Ofcom noted that in its view, CSAM does include "AI-generated imagery, deepfakes and other manipulated media," which "would fall under the category of a 'pseudo-photograph.'" As Ofcom explained, "If the impression conveyed by a pseudo-photograph is that the person shown is a child, then the photo should be treated as showing a child." Similarly, "manipulated images and videos such as deepfakes should be considered within the scope" of intimate image abuse, Ofcom said.
UK news
#grok
from Engadget
1 week ago
Artificial intelligence

Elon Musk's Grok AI posted CSAM image following safeguard 'lapses'

Artificial intelligence
from TechCrunch
6 days ago

xAI says it raised $20B in Series E funding | TechCrunch

xAI raised $20 billion in Series E funding and plans to expand; it reports 600 million monthly users, even as Grok generated sexualized deepfakes, including CSAM, and faces international investigation.
from www.theguardian.com
6 days ago

'I felt violated': Elon Musk's AI chatbot crosses a line

Late last week, Elon Musk's Grok chatbot unleashed a flood of images of women, nude and in very little clothing, both real and imagined, in response to users' public requests on X, formerly Twitter. Mixed in with the generated images of adults were ones of young girls, children likewise wearing minimal clothing, according to Grok itself. In an unprecedented move, the chatbot itself apologized while its maker, xAI, remained silent.
Miscellaneous
EU data protection
from www.dw.com
1 week ago

Grok under fire for sexualizing women and children's images – DW – 01/03/2026

Grok's image-editing feature enabled the creation of sexualized images of women and children, prompting urgent safeguard fixes and international regulatory scrutiny.
from Defector
1 week ago

Who's Responsible For Elon Musk's Idiot Chatbot Producing On-Demand Child Sexual Abuse Material? | Defector

Twitter, also called X, the social media network owned and constantly used by the world's richest man as well as virtually every powerful person in the American tech industry, and on which the vast preponderance of national political figures also maintain active accounts, has a sexual harassment and child sexual abuse material (CSAM) problem. This has been true more or less since Elon Musk took it over, but this problem's latest and most repellent efflorescence is the result of one of Musk's signature additions as owner.
World news
Privacy professionals
from The Verge
1 month ago

Meta had a 17-strike policy for sex trafficking, former safety leader claims

Meta allegedly prioritized user engagement over safety, allowing repeat sexual exploitation violations and lacking clear CSAM reporting on Instagram.
from The Hacker News
3 months ago

DOJ Resentences BreachForums Founder to 3 Years for Cybercrime and Possession of CSAM

The U.S. Department of Justice (DOJ) on Tuesday resentenced the former administrator of BreachForums to three years in prison in connection with his role in running the cybercrime forum and possessing child sexual abuse material (CSAM). Conor Brian Fitzpatrick (aka Pompompurin), 22, of Peekskill, New York, pleaded guilty to one count of access device conspiracy, one count of access device solicitation, and one count of possession of child sexual abuse material. Fitzpatrick was initially arrested in March 2023 and pleaded guilty later that July.
US news
#child-pornography
#nonconsensual-content
from TechCrunch
4 months ago
Information security

Pornhub owner pays $5M settlement to FTC over historic failure to block abusive content | TechCrunch

Privacy technologies
from www.theguardian.com
4 months ago

Privacy at a cost: the dark web's main browser helps pedophile networks flourish, experts say

Tor's anonymity enables sprawling dark-web communities of child predators who share CSAM and grooming strategies and normalize exploitation, while the network resists content removal.
Law
from Boston.com
4 months ago

Mass. man gets 46 years after chronicling his sexual abuse of kids

Justin Benoit received a 46-year federal prison sentence, plus supervised release, for sexually abusing children, recording those offenses, uploading CSAM, and possessing hundreds of child sexual abuse files.
World news
from Search Engine Roundtable
4 months ago

Google Ads Child Sexual Abuse Imagery Policy Update

Google will update its Google Ads Child Sexual Abuse Imagery policy on October 22, 2025, expanding the scope of prohibited content and treating violations as egregious, with immediate account suspension.
Digital life
from AdExchanger
4 months ago

Mediaocean Partners With The Internet Watch Foundation To Report CSAM Content | AdExchanger

A partnership between Mediaocean and IWF aims to enhance digital media safeguards against child sexual abuse material.
from Ars Technica
6 months ago

Worst hiding spot ever: /NSFW/Nope/Don't open/You were Warned/

Captain Samuel White approved a search of Bartels' gear, which revealed illegal activity including the possession and purchase of CSAM while Bartels was stationed at Guantanamo.
Privacy technologies
from The Register
7 months ago

US Navy petty officer charged in horrific CSAM case

A US Navy petty officer has been charged with distributing child sex abuse material via Discord after a detailed FBI investigation.