Apple sued for failing to implement tools that would detect CSAM in iCloud
Briefly

The lawsuit alleges that, after promising child safety measures, Apple "failed to implement...or take any measures" to detect and limit CSAM, harming the victims.
Apple stated, "Child sexual abuse material is abhorrent, and we are committed to...combatting these crimes without compromising the security and privacy of all our users."
Read at Engadget