Instagram to alert parents if teens repeatedly search self-harm terms
"Our goal is to empower parents to step in if their teen's searches suggest they may need support. We also want to avoid sending these notifications unnecessarily, which, if done too much, could make the notifications less useful overall."
"Instagram said it already blocks such content from showing up in teen accounts' search results and directs people to helplines instead. The alerts will be sent via email, text or WhatsApp, depending on the parent's contact information available, as well as a notification through the parent's Instagram account."
"Thousands of families along with school districts and government entities have sued Meta and other social media companies claiming they deliberately design their platforms to be addictive and fail to protect kids from content that can lead to depression, eating disorders and suicide."
Instagram announced a new safety feature that alerts parents via email, text, WhatsApp, or in-app notification when their teens repeatedly search for terms associated with suicide or self-harm. The alerts will go only to parents enrolled in Instagram's parental supervision program. Instagram already blocks such content from appearing in teen accounts' search results and redirects users to helplines. The announcement comes amid two ongoing trials against Meta over child safety and platform design. Meta faces allegations that its platforms are deliberately addictive and fail to protect minors from harmful content linked to depression, eating disorders, and suicide. Meta executives, including CEO Mark Zuckerberg, have disputed claims that social media causes mental health harms. Meta is also developing similar notifications for parents regarding their teens' interactions with artificial intelligence.
Read at www.theguardian.com