Stephanie Mistre's life changed forever in 2021, when her 15-year-old daughter, Marie, died by suicide. Mistre later discovered that TikTok's algorithm had pushed harmful content promoting suicidal thoughts to her daughter, and she believes the platform normalized depression and self-harm, turning them into a twisted sense of belonging among vulnerable users. Mistre and six other families affected by similar tragedies are suing TikTok France, accusing the app of negligence in moderating harmful content. TikTok counters that it prohibits such content, employs professionals to remove dangerous posts, and offers resources for mental health support.
"They normalized depression and self-harm, turning it into a twisted sense of belonging," Mistre said. "I went from light to darkness in a fraction of a second."
TikTok's guidelines forbid any promotion of suicide, and the company says it employs 40,000 trust and safety professionals to remove dangerous posts.
Before her death, Marie recorded several videos explaining her decision, citing the difficulties she faced and quoting a song by Suicideboys.