TikTok could be serving potentially harmful content to teens within minutes, according to a study



CNN

TikTok could show teens potentially harmful content related to suicide and eating disorders within minutes of creating an account, a new study suggests, likely adding to the growing scrutiny of the app’s impact on its younger users.

In a report released Wednesday, the non-profit Center for Countering Digital Hate (CCDH) found that it can take less than three minutes after signing up for a TikTok account to see suicide-related content, and about five more minutes to find a community promoting eating disorder content.

The researchers said they created eight new accounts in the US, UK, Canada and Australia at TikTok’s minimum user age of 13. These accounts briefly paused on and liked content about body image and mental health. The CCDH said the app recommended body image and mental health videos roughly every 39 seconds within a 30-minute period.

The report comes as state and federal lawmakers look for ways to crack down on TikTok over privacy and security concerns, as well as to determine whether the app is appropriate for teenagers. It also comes more than a year after executives from social media platforms, including TikTok, faced tough questions from lawmakers during a series of congressional hearings about how their platforms can direct younger users, particularly teenage girls, to harmful content, damaging their mental health and body image.

After those hearings, which followed revelations from Facebook whistleblower Frances Haugen about Instagram’s impact on teens, the companies vowed to change. But the latest CCDH findings suggest more work may still need to be done.

“The results are every parent’s nightmare: Young people’s feeds are bombarded with harmful and heartbreaking content that can have a significant cumulative impact on their understanding of the world around them and their physical and mental health,” Imran Ahmed, CEO of the CCDH, said in the report.

A TikTok spokesperson disputed the study, saying it was an inaccurate representation of the viewing experience on the platform for various reasons, including the small sample size, the limited 30-minute window for testing, and the way the accounts scrolled past a number of unrelated topics in order to find other content.

“This activity and the resulting experience do not reflect genuine behaviors or viewing experiences of real people,” a TikTok spokesperson told CNN. “We consult regularly with healthcare experts, remove content that violates our policies, and provide access to support resources for anyone who needs them. We understand that triggering content is unique to each individual, and we remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others about these important topics.”

The spokesperson said the CCDH study doesn’t distinguish between positive and negative videos on certain topics, adding that people often share inspiring stories about recovering from eating disorders.

TikTok said it continues to roll out new safeguards for its users, including ways to filter out mature or “potentially problematic” videos. In July, it added a “maturity score” to videos detected as potentially containing mature or complex themes, as well as a feature to help people decide how much time they want to spend on TikTok videos, set regular screen-time breaks, and see a dashboard showing how many times they have opened the app. TikTok also offers a handful of parental controls.

This isn’t the first time social media algorithms have been tested this way. In October 2021, Sen. Richard Blumenthal’s staff registered an Instagram account as a 13-year-old girl and proceeded to follow a few diet and eating disorder accounts (the latter of which are supposed to be banned by Instagram). The Instagram algorithm soon began almost exclusively recommending that the young teen’s account follow increasingly extreme diet accounts, the senator told CNN at the time.

(After CNN sent a sample from this list of five accounts to Instagram for comment, the company removed them, saying they all violated Instagram’s policies against encouraging eating disorders.)

TikTok said it doesn’t allow content that depicts, promotes, normalizes, or glorifies activities that could lead to suicide or self-harm. Of the videos removed for violating its policies on suicide and self-harm content from April to June of this year, 93.4% were removed with zero views, 91.5% were removed within 24 hours of posting, and 97.1% were removed before any user reports, according to the company.

The spokesperson told CNN that when someone searches for banned words or phrases like #selfharm, they won’t see any results and will instead be redirected to local support resources.

Still, the CCDH says more needs to be done to restrict specific content on TikTok and strengthen protections for young users.

“This report underlines the urgent need for online reform,” said the CCDH’s Ahmed. “Unsupervised, TikTok’s opaque platform will continue to profit by serving its users — 13-year-olds, remember — ever more intense and distressing content without oversight, resources or support.”
