Under new law, cops bust famous cartoonist for AI-generated child sex abuse images
Briefly

"The creation of CSAM using AI is inherently harmful to children because the machine-learning models utilized by AI have been trained on datasets containing thousands of depictions of known CSAM victims, revictimizing these real children by using their likeness to generate AI CSAM images into perpetuity."
"In part, says the law, this is because all kinds of CSAM can be used to groom children into thinking sexual activity with adults is normal. But the law singles out AI-generated CSAM for special criticism due to the way that generative AI systems work."
Read at Ars Technica