
"West Virginia has filed a lawsuit against Apple, accusing the company of allowing the distribution and storage of child sexual abuse material (CSAM) in iCloud. In a lawsuit filed on Thursday, West Virginia Attorney General JB McCuskey claims that by abandoning a CSAM detection system in favor of end-to-end encryption, iCloud has become a "secure frictionless avenue for the possession, protection, and distribution [of] CSAM," violating the state's consumer protection laws."
"Apple initially outlined plans to launch a system that checks iCloud photos against a known list of CSAM images in 2021. The move was met with significant backlash from privacy advocates, with some claiming that the company is launching a surveillance system, leading Apple to stop the development of this feature nearly one year later. At the time, Apple's software head Craig Federighi told The Wall Street Journal that "child sexual abuse can be headed off before it occurs...""
West Virginia Attorney General JB McCuskey filed a lawsuit alleging that Apple enabled the distribution and storage of child sexual abuse material (CSAM) in iCloud by abandoning a CSAM detection system in favor of end-to-end encryption. The complaint asserts that iCloud has become a "secure frictionless avenue for the possession, protection, and distribution [of] CSAM," in violation of the state's consumer protection laws. Apple announced plans in 2021 to check iCloud photos against a known list of CSAM images but halted development nearly a year later after significant privacy backlash. Apple's software head Craig Federighi stated at the time that "child sexual abuse can be headed off before it occurs..."
Read at The Verge