TikTok’s powerful recommendation algorithm quickly exposes young users to extremist content, including videos promoting suicide, according to a report by the corporate accountability group Ekō, as covered by Vice.
Despite TikTok’s promises to crack down on harmful content, such videos remain easily accessible to new users.
Ekō’s research discovered that within just 10 minutes of using the platform, the “For You” page for new accounts begins pushing increasingly extreme content, including suicide-related material.
Maen Hammad, Ekō campaigner and co-author of the report, expressed concern about the ease with which children can be exposed to such content: “Ten minutes and a few clicks on TikTok is all that is needed to fall into the rabbit hole of some of the darkest and most harmful content online.”
In response to the report, TikTok said, “We work hard to prevent the surfacing of harmful content on our platform by removing violations.”
However, the platform has consistently failed to address problematic content, including self-harm, terrorism, and disinformation.
The report highlights how simple it is for young users to find suicide-related content on TikTok using various hashtags.
One test account was shown a video featuring actor Jake Gyllenhaal with a rifle in his mouth and text that read: “Get shot or see her with someone else?”
The video, since removed, had accumulated over 440,000 likes, 2.1 million views, 7,200 comments, and 11,000 shares, with many commenters endorsing the video’s suggestion of suicide.
Ciaran O’Connor, a researcher at the Institute for Strategic Dialogue (ISD), said users continue to exploit gaps in TikTok’s enforcement: “We have repeatedly called on TikTok to evolve its policies beyond narrow hashtag and keyword bans yet, as documented in other research related to terrorist content, it appears users are exploiting this enforcement gap with ease.”
TikTok CEO Shou Zi Chew is scheduled to appear before Congress later this week, as the platform faces a possible ban in the U.S. over its parent company ByteDance’s links to the Chinese government.