FP Staff | Dec 15, 2022 18:04:01 IST
TikTok’s algorithms are promoting videos about self-harm and eating disorders to vulnerable teens, according to a report published Wednesday that highlights concerns about social media and its impact on youth mental health.
Researchers at the nonprofit Center for Countering Digital Hate created TikTok accounts for fictional teen personas in the U.S., United Kingdom, Canada and Australia. The researchers operating the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.
Within minutes, the wildly popular platform was recommending videos about losing weight and self-harm, including ones featuring pictures of models and idealized body types, images of razor blades and discussions of suicide.
When the researchers created accounts with usernames that suggested a particular vulnerability to eating disorders, such as names including the words “lose weight,” the accounts were fed even more harmful content.
“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the center’s CEO Imran Ahmed, whose organization has offices in the U.S. and U.K. “It is literally pumping the most dangerous possible messages to young people.”
TikTok is not the only platform failing to protect young users from harmful content and aggressive data collection.
In a statement, a TikTok spokesperson disputed the findings, saying the researchers didn’t use the platform like typical users and that the results were skewed as a result. The company also said a user’s account name shouldn’t affect the kind of content the user receives.
TikTok prohibits users who are younger than 13, and its official rules prohibit videos that encourage eating disorders or suicide. Users in the U.S. who search for content about eating disorders on TikTok receive a prompt offering mental health resources and contact information for the National Eating Disorder Association.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” said the statement from TikTok.
Despite the platform’s efforts, researchers at the Center for Countering Digital Hate found that content about eating disorders had been viewed on TikTok billions of times. In some cases, researchers found, young TikTok users were using coded language about eating disorders in an effort to evade TikTok’s content moderation.
For critics, the sheer amount of harmful content being fed to teens on TikTok shows that the industry’s self-regulation has failed.
Ahmed noted that the version of TikTok offered to domestic Chinese audiences is designed to promote content about math and science to young users, and limits how long 13- and 14-year-olds can spend on the app each day.