(Image: Adobe)

Social media darling TikTok has become famous for short, flashy dance moves, Scott Morrison and Donald Trump memes and a sock facemask, but its public image has taken a darker, more dangerous turn in recent days with the posting of a suicide video.

TikTok management has been trying to remove versions of the video in response to user outrage. Parents pointed out that the footage had been posted in a WhatsApp group set up to help secondary school students get to know one another. School principals have been warning parents about the post.

But that’s not the only thing challenging TikTok. Terrorism analysts and researchers say far-right extremists are using it to spread white supremacist propaganda and other hate-based materials.

Mollie Saltskog, a senior intelligence analyst with the Soufan Group in the United States, told Crikey that far-right extremist groups such as the Atomwaffen Division use gaming and social media platforms to draw a younger crowd to their ideologies.

Atomwaffen Division calls for race wars, and began its life on an online forum called Iron March.

Social media platforms are not created as inherently good or bad forums, Saltskog says, but can be used by state-based or non-state-based “evil and rogue actors”.

“If you have created something you also need to figure out how to make sure that your platform cannot be used to co-ordinate or organise and conduct violent attacks against innocent civilians,” Saltskog says.

Gabriel Weimann and Natalie Masri from Haifa University’s Institute for Counter Terrorism published research looking at the spread of far-right ideologies on TikTok in the June 2020 edition of Studies in Conflict and Terrorism.

Their research provides further empirical support for Saltskog’s concerns.

Weimann and Masri said TikTok was used by young children and that, given its relative infancy, the platform had yet to work out how best to protect users from harmful content.

“The far right’s online presence had developed over three decades, using bulletin board systems, websites, online forums, and more recently, social media,” they wrote. 

Social media had given far-right groups platforms where they could use text, memes and videos to draw people into their anti-immigration, anti-Semitic and white supremacist dialogue.

They found posts dealing with fascism, racism, anti-Semitism, anti-immigration, chauvinism, nativism and xenophobia.

“We found multiple profile pictures featuring far-right symbols, including four accounts using SS bolts, three accounts with the Stormfront logo, three pictures depicting the Totenkopf used by Hitler’s SS, 12 accounts featuring swastikas, five accounts depicting the Nazi eagle and six accounts depicting the Nazi flag,” they wrote.

“We also found 13 accounts depicting the sonnenrad, prominently used in Nazi Germany and today popular among groups such as Atomwaffen Division.”

They also found posts that glorified the acts of killers such as the recently sentenced Australian man responsible for murdering 51 people attending two mosques in Christchurch in March 2019, and the Norwegian far-right terrorist responsible for murdering 77 people in 2011.