TikTok recommends self-harm and eating disorder content to some users within minutes of joining the platform, according to a new report released Wednesday by the Center for Countering Digital Hate (CCDH).
For the study, researchers created TikTok accounts posing as 13-year-old users interested in content about body image and mental health. Within 2.6 minutes of joining the app, TikTok’s algorithm was recommending suicide-related content; eating disorder content was recommended within 8 minutes.
In the course of the study, researchers found 56 TikTok hashtags hosting eating disorder videos that had collectively drawn more than 13.2 billion views.
“The new report from the Center for Countering Digital Hate underscores why it’s high time TikTok took steps to address the platform’s dangerous algorithmic amplification,” said James P. Steyer, founder and CEO of Common Sense Media, who is unaffiliated with the study. “TikTok’s algorithm is bombarding teens with harmful content promoting suicide, eating disorders, and body image issues that are fueling the teen mental health crisis.”
TikTok, launched globally by the Chinese company ByteDance in 2017, recommends content through algorithms built on personal data: a user’s likes, follows, watch time and interests. It has become the fastest-growing social media app in the world, reaching one billion monthly active users by 2021.
The CCDH report details how TikTok’s algorithms refine the videos shown to users as the app gathers more information about their preferences and interests. The algorithmic suggestions on the “For You” feed are designed, as the app puts it, to be “central to the TikTok experience.” But the new research shows that the platform can push harmful content to vulnerable users in the course of keeping them engaged.
To test the algorithm, CCDH researchers registered as users in the United States, the United Kingdom, Canada and Australia, creating both “standard” and “vulnerable” accounts on TikTok. Eight accounts were created in total, and data was collected from each for its first 30 minutes of use. The CCDH says the short window was chosen to show how quickly the platform can size up each user and begin serving potentially harmful content.
In the report, each researcher, posing as a 13-year-old (the minimum age TikTok allows for signing up to its service), set up two accounts in their designated country. One account was given a generic female username. The other was given a username signaling a concern with body image; the name included the phrase “lose weight.” On all accounts, the researchers paused briefly on body image and mental health videos and “liked” them, as teenagers interested in that content might.
When the “lose weight” accounts were compared with the standard ones, the researchers found that the “lose weight” accounts were served three times as much harmful content overall and 12 times as many self-harm and suicide videos as the standard accounts.
“TikTok is able to recognize users’ vulnerability and tries to exploit it,” said Imran Ahmed, CEO of the Washington, DC-based CCDH, which advocates for the Kids Online Safety Act (KOSA), legislation that would put barriers in place to protect minors online. “It’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing our children’s psychology and adapting to keep them online.”
Content sent to vulnerable accounts included a video with the caption: “Make everyone think your [sic] it’s okay so you can try it in private”.
The video insinuating a suicide attempt had amassed 386,900 likes. The report also cited a video of a teenage girl crying, with on-screen text reading, “You’re not thinking about taking your own life, are you?” and then referencing Sarah Lynn, a character who dies of an overdose in the Netflix animated series “BoJack Horseman.” That video received 327,900 likes. Another, with 17,300 likes, linked to PrettyScale.com, a website where users upload pictures of their bodies and faces to have their attractiveness ranked by a “mathematical formula.”
Reached for comment, a TikTok spokesperson disputed the study’s methodology.
“We regularly consult with health care experts, remove violations of our policies, and provide access to support resources to anyone who needs them,” the representative said.
TikTok’s spokesperson went on to say that the video platform was “aware that opting in to content is unique to each individual” and that the social platform “remain[s] focused on promoting a safe and comfortable space for all.”
As “60 Minutes” reported Sunday, the study comes as more than 1,200 families pursue lawsuits against social media companies, including TikTok. The lawsuits allege that content on social media platforms profoundly harmed their children’s mental health and, in some cases, contributed to their deaths. More than 150 of the lawsuits are expected to move forward next year.
If you or someone you know is experiencing emotional distress or a suicidal crisis, call the National Suicide Prevention Hotline at 1-800-273-TALK (8255).
For more information about mental health resources and support, you can contact the National Alliance on Mental Illness (NAMI) HelpLine Monday through Friday, 10 a.m. to 6 p.m. ET, at 1-800-950-NAMI (6264) or send an email to info@nami.org.