Far-right groups are using music and other creative features to circumvent TikTok's attempts to curb hate speech, masking hateful content behind catchy pop songs, a study by a UK-based think tank has found.
Authored by Ciaran O'Connor of the Institute for Strategic Dialogue, the report – "Hatescape: An In-Depth Analysis of Extremism and Hate Speech on TikTok" – "aims to provide an in-depth analysis of the state of extremism and hate on TikTok. It is the culmination of three months of research on a sample of 1,030 videos, equivalent to just over eight hours of content, posted on the social media platform." These videos were used to promote hatred, as well as to glorify extremism and terrorism.
The Institute for Strategic Dialogue states that it is "an independent, non-profit organization dedicated to safeguarding human rights and reversing the rising tide of polarization, extremism, and disinformation worldwide."
The ISD further said that the report "set out to examine the state of hate and extremism on TikTok in two ways: The first objective involved analyzing how individuals or groups promote hateful ideologies and target people on the platform based on numerous protected attributes such as ethnicity, religion, gender or others. Second, using the same framework, ISD investigated how features on TikTok like profiles, hashtags, share functions, video effects and music are used to spread hate."
"This report seeks to start a conversation around how platforms like TikTok can improve their own practices to protect users from harm. Additionally, it underscores the clear need for independent oversight of such platforms, which currently leave users and the wider public open to significant risks to their health, security and rights," the think tank said.
The review of the videos sampled as part of the study revealed a troubling trend on the video-sharing platform: many of the posts contained antisemitic and other hateful content that garnered millions of views and, at times, tens of thousands of interactions.
The study found that many extremist TikTok creators use music and video effects available on the platform, such as the "duet" and "stitch" video creation features, to get around the regulations seeking to muzzle hate speech.
Such "evasion tactics" include banned users opening new accounts with nearly identical usernames, making strategic use of privacy functions and comment restrictions, using alternative hashtag spellings, and exploiting a profile's video grid layout to promote hatred.
"TikTok has a content moderation problem," the study concluded. "Extremist and even terrorist-related footage is easily discoverable on TikTok, despite contravening the platform's terms of service.
"Content that praises and celebrates the actions of terrorists and extremists, or denies the existence of historically violent incidents such as genocides, is captured in our data, demonstrating how the platform operates as a new arena for violence-endorsing, hateful ideologies. The breadth of this enforcement gap is concerning.
"The platform enables hatred targeting Muslims, Jews, Asians, Black people, refugees, women and members of the LGBTQ+ community, including content celebrating the death of people within these communities, mocking victims of racism and persecution, and both implicit and explicit manifestations of hate against these communities," the study said.
"The research also demonstrates how features of the platform – profiles, Sounds, video filters and effects – are all part and parcel of the system that has enabled targeted hate and extremism to proliferate on the platform. Products built by TikTok to encourage engagement, creativity and virality are being exploited to help spread hate and extremism."