Report: Teenagers may quickly encounter harmful posts on TikTok after signing up

Posted at 1:49 PM, Dec 16, 2022

An organization working to counter hate and disinformation online is raising concerns about the types of videos young people may be seeing on TikTok.

The Center for Countering Digital Hate set out to test TikTok's algorithm after hearing concerns from parents about what their children were seeing on the platform. The nonprofit says it set up eight new accounts in the U.S., U.K., Canada and Australia, each registered as a 13-year-old user, the youngest age TikTok allows.

For the experiment, the accounts briefly watched and liked videos about body image and mental health.

"Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content related to eating disorders. Every 39 seconds, TikTok recommended videos about body image and mental health to teens," the organization claims.

The Center for Countering Digital Hate says some of the videos showed teens expressing a desire to attempt suicide. Other self-harm videos featured razor blades.

In a statement about the report, a TikTok spokesperson said the experiment does not reflect how real people use the platform.

"This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people," the spokesperson said. "We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We're mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics."

TikTok's Community Guidelines state that the platform does not allow content "depicting, promoting, normalizing, or glorifying activities that could lead to suicide, self-harm, or disordered eating." The company vowed to remove content cited in the report that violated those guidelines.

While TikTok acknowledges it will not catch every piece of content that violates its guidelines, the company says a team of more than 40,000 safety professionals helps keep the platform safe.

TikTok says that from April to June of this year, more than 90% of content violating its suicide and self-harm policies was removed before receiving a single view.

Still, the Center for Countering Digital Hate believes more should be done to protect teens from dangerous content. It created a guide to help parents understand the potential risks on TikTok, and it is pushing policymakers to force social media companies to be more transparent about their algorithms and economic incentives. The organization also believes companies should be held accountable when they fail to enforce policies aimed at stopping harm.

TikTok launched what it calls an industry-leading Transparency Center two years ago. The company says it regularly publishes transparency reports to hold itself accountable.