#csam

#child-exploitation

Feds test whether existing laws can combat surge in fake AI child sex images

Federal efforts to prosecute AI-generated child exploitation cases face legal challenges that may slow progress against emerging risks.

The DOJ makes its first known arrest for AI-generated CSAM

CSAM generated by AI is illegal and punishable under the law.

AI Is Triggering a Child-Sex-Abuse Crisis

Generative AI is increasingly being used to create sexually explicit images of children, highlighting an emerging crisis in safety.

Texan man gets 30 years in prison for running CSAM exchange

Robert Shouse was sentenced to 30 years in prison for running a dark web forum for child sexual abuse material and personally abusing children.

US: Alaska man busted with 10,000+ child sex abuse images despite his many encrypted apps

Child sexual abuse material (CSAM) is increasingly being hidden using advanced tech by some offenders, raising concerns about evolving exploitation methods.

#apple

Apple sued over abandoning CSAM detection for iCloud | TechCrunch

Apple is being sued for not implementing a system to detect child sexual abuse material in iCloud, allegedly impacting victims' trauma.

Apple sued for failing to implement tools that would detect CSAM in iCloud

Apple is being sued for failing to implement iCloud scanning for child sexual abuse material, leading to harm for victims.

Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool

Apple faces lawsuits from survivors of sexual abuse due to its failure to effectively address child sexual abuse material on its platforms.

#content-moderation

Telegram U-turns and joins child safety scheme

Telegram's recent partnership with the Internet Watch Foundation marks a significant pivot toward addressing child sexual abuse material.

Checkmarked X Users Caught Promoting Sites That Sell Child Sex Abuse Videos

The verification system on X has proven ineffective in preventing the promotion of child sexual abuse material.

Bluesky ramps up content moderation as millions join the platform

Bluesky is increasing its content moderation team to address a rise in concerning user content amid rapid growth.

#child-safety

Child predators are using AI to create sexual images of their 'favorite stars': 'My body will never be mine again'

Predators on the dark web are increasingly using AI to create sexually explicit images of children, particularly fixating on 'star victims.'

Snap calls New Mexico's child safety complaint a 'sensationalist lawsuit'

Snap argues New Mexico's AG manipulated evidence to portray its app negatively.
The company emphasizes its compliance with CSAM reporting laws.
State officials highlight serious concerns about children's safety in their allegations.

The world's leading AI companies pledge to protect the safety of children online

Leading AI companies pledge to prevent AI exploitation of children for CSAM, aiming to protect children from abuse using generative AI.

#child-sexual-abuse

AI-generated child sexual abuse imagery 'reaching tipping point', says watchdog

AI-generated child sexual abuse imagery is increasingly prevalent online, with reports significantly rising in the past six months.

EU proposes criminalizing AI-generated child sexual abuse and deepfakes | TechCrunch

AI-generated child sexual abuse imagery could be criminalized in the EU as part of updated legislation.
Livestreaming child sexual abuse and possession of pedophile manuals could also become criminal offenses.


X fails to avoid Australia child safety fine by arguing Twitter doesn't exist

X Corp's lack of transparency on CSAM led to civil penalties and possible costly repercussions under Australian law.

Telegram's Durov must remain in France and post a €5M bail

Pavel Durov, founder of Telegram, faces serious criminal charges in France including money laundering and CSAM distribution.
#ai-ethics

Was an AI Image Generator Taken Down for Making Child Porn?

AI companies face scrutiny for enabling tools that facilitate the creation of child sexual abuse material, raising ethical and legal concerns.

Nonprofit scrubs illegal content from controversial AI training dataset

The LAION-5B dataset has been re-released as Re-LAION-5B, now cleaned of links to child sexual abuse materials (CSAM).


As Tech CEOs Are Grilled Over Child Safety Online, AI Is Complicating the Issue

CEOs of social media companies were grilled by senators on preventing online child sexual exploitation.
Reports of child sexual abuse material reached a record high last year.

We're unprepared for the threat GenAI on Instagram, Facebook, and WhatsApp poses to kids

Social media platforms are inundated with AI-generated Child Sexual Abuse Material (AIG-CSAM), creating challenges for law enforcement and anti-CSAM institutions.