Apple is facing a lawsuit over its decision not to implement a CSAM detection system for iCloud Photos; the suit alleges that this failure forces victims to relive their trauma repeatedly.
The lawsuit represents a potential group of 2,680 victims and claims that Apple's inaction exacerbates the suffering of those affected by child sexual abuse.