TikTok’s algorithms are promoting videos about self-harm and eating disorders to vulnerable teens, according to a report published Wednesday highlighting concerns about social media and its impact on youth mental health.
Researchers at the nonprofit Center for Countering Digital Hate created TikTok accounts for fictional teen personas in the U.S., United Kingdom, Canada, and Australia. The researchers operating the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.
Within minutes, the wildly popular platform recommended videos about losing weight and self-harm, including pictures of models and idealized body types, images of razor blades, and discussions of suicide.
When the researchers created accounts with user names that suggested a particular vulnerability to eating disorders, including the words “lose weight,” the accounts were fed even more harmful content.
“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the center’s CEO Imran Ahmed, whose organization has offices in the U.S. and U.K. “It is pumping the most dangerous possible messages to young people.”
TikTok is not the only platform failing to protect young users from harmful content and aggressive data collection.
In a statement from a company spokesperson, TikTok disputed the findings, saying the researchers didn’t use the platform the way typical users do and that the results were skewed as a result. The company also said that a user’s account name shouldn’t affect the content the user receives.
TikTok prohibits users younger than 13, and its official rules prohibit videos that encourage eating disorders or suicide. Users in the U.S. who search for content about eating disorders on TikTok receive a prompt offering mental health resources and contact information for the National Eating Disorder Association.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said the statement from TikTok.
Despite the platform’s efforts, researchers at the Center for Countering Digital Hate found that content about eating disorders had been viewed on TikTok billions of times. In some cases, researchers found, young TikTok users were using coded language about eating disorders to evade TikTok’s content moderation.
To the report’s authors, the amount of harmful content being fed to teens on TikTok shows that self-regulation has failed.
Ahmed noted that the version of TikTok offered to domestic Chinese audiences is designed to promote content about math and science to young users and limits how long 13- and 14-year-olds can be on the site daily.