TikTok pushes harmful content to kids every 39 seconds

Marieta Dippenaar | Psychologist

Study shows that TikTok pushes potentially harmful content to children as often as every 39 seconds

A new report published recently by the Center for Countering Digital Hate (CCDH) in the United States has shown that TikTok recommends self-harm and eating disorder content to some users within minutes of joining the platform.

As part of the study, researchers set up TikTok accounts posing as 13-year-old users interested in content about body image and mental health. The study found that within as little as 2.6 minutes of joining the app, TikTok's algorithm recommended content about suicide. The report also showed that eating disorder content was recommended within as little as eight minutes.

Over the course of the study, researchers found 56 TikTok hashtags and account handles hosting eating disorder videos, with over 13 billion views between them.

“The new report underscores why it is way past time for TikTok to take steps to address the platform’s dangerous algorithmic amplification,” said James P. Steyer, Founder and CEO of Common Sense Media, which is unaffiliated with the study. “TikTok’s algorithm is bombarding teens with harmful content that promotes suicide, eating disorders, and body image issues that is fuelling the teens’ mental health crisis.”

To test TikTok’s algorithm, CCDH researchers registered as users in the United States, United Kingdom, Canada, and Australia, creating “standard” and “vulnerable” accounts on TikTok. Eight accounts were created in total, and data was gathered from each account for the first 30 minutes of use. The Center says this small recording window was chosen to show how quickly the video platform can profile each user and push out potentially harmful content.

CBS’s 60 Minutes reported that each researcher, posing as a 13-year-old, made two accounts in their designated country. One account was given a female username; the other, a username indicating a concern about body image – the name included the phrase “loseweight.” Across all accounts, the researchers paused briefly on videos about body image and mental health and “liked” them, as a teen interested in that content would. When the “loseweight” accounts were compared with the standard ones, the researchers found that the “loseweight” accounts were served three times more harmful content overall, and 12 times more videos specific to self-harm and suicide, than the standard accounts.

According to CBS, a TikTok spokesperson challenged the study’s methodology: “We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.”

More than 1,200 American families are pursuing lawsuits against social media companies, including TikTok, alleging that content on social media platforms profoundly affected their children’s mental health and, in some cases, contributed to their children’s deaths.

TikTok already has over 6.4 million users in South Africa, mainly aged between 15 and 34. Of that number, roughly 6 million use the app every single day.

Raising a teenager can be challenging enough without the effects of social media. Platforms such as WhatsApp and TikTok can make the journey even more daunting. Evexia’s proactive adolescent programme has been developed to help particularly vulnerable adolescents and their parents understand and deal with their daily challenges, including the effects of social media messaging.

In addition, the Evexia parent support group sessions offer a platform for parents to express their challenges and fears and, through support from other parents, find solutions and better ways of handling certain situations.

You can contact the support teams at Evexia by emailing info@evexia.co.za or calling 012 348 8200.

Do you need help or advice about social media and your children?

The teams at Evexia are ready to help you.

Simply complete the form and we’ll get in touch.