TikTok Faces Lawsuits Over Alleged Algorithm-Induced Suicides, Depression

The app is infamous for promoting dangerous and damaging behaviors to teens.

What’s happening: TikTok’s algorithm has been promoting suicide-related content to vulnerable teens, and the app is now facing multiple lawsuits after a slew of deaths linked to that content. One of the cases centers on Chase Nasca, a 16-year-old boy who died by suicide over a year ago. His family is filing a wrongful-death lawsuit against TikTok and its parent company, ByteDance.

One of the videos promoted to Nasca’s account in February, days before the first anniversary of his death, says, “Take the pain away. Death is a gift.” In another, a male voice says, “I’m going to put a shotgun in my mouth and blow the brains out the back of my head.” “Cool,” a female voice responds.

Further problems with the algorithm: In addition to the depression- and suicide-related content being pushed on teens, the app has drawn criticism for promoting videos that advocate other destructive behaviors, such as self-harm and unhealthy dieting. There has also been an uptick in users popularizing mental illnesses like dissociative identity disorder, with the sheer volume of posts suggesting that many users are faking the conditions for clicks.

Amazon too: TikTok is not the only tech platform to face controversy over a questionable algorithm. A lawsuit filed against Amazon in 2022 alleged that the site was recommending suicide kits containing the deadly chemical sodium nitrite to teenagers, contributing to the deaths of two teens. The cases underscore the growing danger of unchecked recommendation algorithms.
