AI Is Triggering a Child-Sex-Abuse Crisis
Briefly

"Perhaps millions of kids nationwide have been affected in some way by the emergence of this technology, either directly victimized themselves or made aware of other students who have been."
"Before the technology became widely available, most CSAM consisted of recirculating content, meaning anything that matched a database of known, abusive images could be flagged and removed."
"Although the problem is exceptionally challenging and upsetting, the experts I spoke with were hopeful that there may yet be solutions."
Read at The Atlantic