December 15, 2022

TikTok pushes harmful content to teens every 39 seconds, new report claims


A new NGO report released Thursday claims TikTok's algorithm targets "vulnerable teens" and recommends "harmful" content to them, in some cases as often as every 27 to 39 seconds.

The report, from the Center for Countering Digital Hate, defines harmful content as material on topics such as eating disorders, weight-shaming, self-harm and sexual assault.

The center's researchers conducted the study by creating eight new accounts in the U.S., Canada, the U.K. and Australia, registering each at the platform's minimum age of 13, and watching and liking videos that discussed body image and mental health. Half of the accounts were given usernames containing "loseweight" and were deemed "vulnerable" accounts by the researchers; the other half, without "loseweight" usernames, were deemed "standard" accounts.

"For each account, we recorded the first 30 minutes of algorithmically recommended content on each account's 'For You' feed, watching and liking any videos about body image, mental health or eating disorders. The resulting recordings were analyzed to examine the frequency of recommendations for body image, mental health, selfharm and eating disorder content."

According to the researchers' findings, once the social media app's algorithm kicked in, content about suicide surfaced on the "standard" accounts' feeds as early as 2.6 minutes in, and content about eating disorders popped up within eight minutes. "Vulnerable" accounts, meanwhile, were recommended three times as many harmful videos and 12 times as many self-harm videos as the "standard" accounts.

"The harmful content TikTok recommended to the Vulnerable Teen Accounts was more extreme than content shown to the Standard Teen Accounts," the researchers added. "An increase in videos about eating disorders was accompanied by an increase in self-harm and suicide content, which included thinspo, methods to self-harm, and videos of teens discussing plans to commit suicide."

Imran Ahmed, CEO of the Center for Countering Digital Hate, called the results "every parent's nightmare."

A photo illustration of a TikTok logo displayed on a smartphone in Brussels, Belgium, Sept. 18, 2022. (NurPhoto via Getty Images, FILE)

When reached by "Good Morning America," TikTok pushed back on the report, pointing to the study's small sample size and arguing that the researchers' use of the app did not reflect real users' behavior patterns and activity on TikTok.

"This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people," a TikTok spokesperson said in a statement. "We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We're mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics."

The company reiterated that content about disordered eating and similar topics violates the app's community guidelines and is taken down once identified by its keyword detection systems or flagged by community members or external partners to TikTok's moderation team.

TikTok also said it worked with partners such as the National Eating Disorders Association to provide resources to users, for example by redirecting searches for topics such as self-harm or eating disorders to the 988 suicide and crisis phone and text lines or to NEDA's phone and text lines.

TikTok has come under scrutiny in the past as it has grown into one of the most popular social media apps. Last December, mental health professionals spoke out about a trend in which teen users attempted to diagnose themselves with mental disorders using TikTok, and earlier this spring, the National Institutes of Health launched research into how apps like TikTok may be linked to addiction and affect teenagers' brains.


The Center for Countering Digital Hate called for more government regulation and accountability from social media companies in the report's conclusion.

Earlier this year, Senate lawmakers introduced the Kids Online Safety Act, which would require social media companies to give users more privacy options, the ability to disable addictive features and the ability to opt out of algorithmic recommendations. It would also require companies to provide parents with tools to track app usage and limit app purchases, and would force them to "prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors posed by materials on, or engagement with, the platform," among other things.

The legislation has not yet passed out of committee or been brought to a vote.

The state of California also passed new legislation in August, the California Age-Appropriate Design Code Act, which requires apps and websites to strengthen safety protections for children specifically and threatens companies with fines if they do not comply.

If you are experiencing suicidal thoughts, substance use or other mental health crises, please call or text the new three-digit code, 988. You will reach a trained crisis counselor for free, 24 hours a day, seven days a week.

If you or someone you know is battling an eating disorder, contact the National Eating Disorders Association (NEDA) at 1-800-931-2237 or NationalEatingDisorders.org.