French Families Sue TikTok for Allegedly Contributing to Adolescent Suicides

Key Takeaways
  • TikTok is being sued by seven French families who claim that two 15-year-olds committed suicide as a result of toxic content.
  • They contend that the platform’s algorithm exposed children to content about eating disorders and self-harm.
  • At the same time, TikTok is under growing legal scrutiny in the US because of its effects on kids’ mental health.

Seven French families are suing TikTok, claiming that two 15-year-olds committed suicide as a result of toxic content on the app.

On Monday, November 4, their attorney, Laure Boutron-Marmion, will bring the case against the Chinese-owned social network to court. According to the lawsuit, TikTok’s algorithm exposed the seven teenagers to videos promoting eating disorders, suicide, and self-harm, she told Franceinfo.

According to Boutron-Marmion, the complaint, filed in the Créteil judicial court, is the first of its kind in Europe.

Families Say Tragic Events Were Caused by Dangerous Content

The platform, which serves over 1.2 billion users worldwide and over 21 million in France, states in its charter that it offers a secure environment and promotes digital wellness for families and teenagers. Parents counter that their children were exposed to violent material. One mother describes how her daughter, who had been harassed, sought solace on TikTok, where upsetting recommendations about despair and self-harm worsened her distress.

Boutron-Marmion, who is representing the seven families, claims that TikTok failed to protect children and regulate harmful content, and the families ask why there were no warnings about the app’s addictive nature. They assert that the algorithm had a detrimental effect on their children’s mental health, leading to two suicides at the age of 15, four suicide attempts, and one case of anorexia.

The parents want the court to acknowledge TikTok’s legal liability and hold the platform accountable for their children’s deteriorating physical and mental health. Given that it caters to children, Boutron-Marmion stressed that the company must take responsibility for its product. The families also want TikTok to moderate videos more effectively in order to shield users from content that encourages suicide. According to Reuters, TikTok was not immediately available for comment on the claims.

TikTok Is Dealing With Legal Issues Regarding Child Safety

TikTok has stated in the past that it takes children’s mental health seriously. CEO Shou Zi Chew told US legislators that the company had invested in safeguards for its younger users.

Meanwhile, the app faces criticism over its content moderation and, much like Meta’s Facebook and Instagram, has been sued several times in the US for allegedly harming children’s mental health through addictive design.

Thirteen US states and the District of Columbia filed a lawsuit against TikTok in October, alleging that the app promotes addiction and misleads users about its safety, thereby harming children’s mental health. TikTok refuted these claims, highlighting its safeguards for children.

The same month, a US federal judge ruled that Google, TikTok, Snap, and Meta must face lawsuits from school districts alleging that their addictive apps caused a mental health crisis among students. The judge accepted the school districts’ arguments and allowed the lawsuits to proceed, even though Meta had recently prevailed in a separate child safety case.
