Meta Goes to Trial in a New Mexico Child Safety Case. Here's What's at Stake
Briefly

"Today, Meta went to trial in the state of New Mexico for allegedly failing to protect minors from sexual exploitation on its apps, including Facebook and Instagram. The state claims that Meta violated New Mexico's Unfair Practices Act by implementing design features and algorithms that created dangerous conditions for users. Now, more than two years after the case was filed, opening arguments have begun in Santa Fe."
"The plaintiffs in that case allege that social media companies designed their products in a negligent manner and caused various harms to minors using their apps. Snap, TikTok, and Google were named as defendants alongside Meta; Snap and TikTok have already settled. The fact that Meta has not means that some of the company's top executives may be called to the witness stand in the coming weeks."
"It's the first standalone, state-led case against Meta that has actually gone to trial in the US. It's also a highly charged case alleging child sexual exploitation that will ultimately lean on very technical arguments, including what it means to "mislead" the public, how algorithmic amplification works on social media, and what protections Meta and other social media platforms have through Section 230."
Meta is on trial in New Mexico over allegations that design features and algorithms on its apps created dangerous conditions that failed to protect minors from sexual exploitation, in violation of the state's Unfair Practices Act; opening arguments began in Santa Fe more than two years after the case was filed. Separately, a California JCCP proceeding consolidates civil suits alleging that negligently designed, addictive social media products harmed minors; Snap and TikTok settled there while Meta did not, raising the likelihood that Meta executives will be called to testify in that trial. In the New Mexico case, by contrast, Meta executives are unlikely to testify live, though depositions may reveal company policies and responses to complaints about underage users. The case turns on technical legal questions: what it means to mislead users, how algorithmic amplification works on social media, and the scope of the platforms' Section 230 protections.
Read at WIRED