Far-right extremists’ use of social media: An explainer
Authored by Pool Re
Our new cyber terrorism series explores how terrorist organisations are harnessing the Internet to radicalise and train potential recruits, disseminate illegal content, facilitate communication, and incite acts of terrorism. Our third article examines how far-right extremists are exploiting the unique features of social media platforms to spread hatred and build communities of racist, homophobic, anti-Islamic, and nationalist individuals.
The Director General of MI5 and the UK Security Minister have identified far-right extremism as the fastest-growing threat to the UK. Like Islamist militants, these actors exploit social media platforms to recruit and radicalise others, targeting young people in particular.
The far-right exploit the distinctive features of several social media platforms to spread propaganda and recruit new followers. In this piece we examine the most prominent sites used by these extremists.
Facebook/Twitter/Instagram
The far-right feature prominently on Facebook. They use the platform to recruit, radicalise, and provide live-streamed fitness classes to motivate supporters and prepare them for a “race war”. They also use Facebook, along with Twitter, to organise protests and rallies such as the ‘Unite the Right’ rally in Charlottesville, Virginia, in 2017, which led to the death of a counter-protester.
White supremacists and neo-Nazis can speak relatively openly on the platform, needing little effort to disguise their messages and circumvent content moderation. On Facebook and Instagram in particular, there are also many pages for online stores advertising and selling white supremacist clothing, music, and accessories.
YouTube
YouTube is used by the far-right to attract attention through propaganda videos, documentaries, and speeches. While some overt far-right content is removed by platform moderators, extremists have learned where the line sits between what is permitted and what is likely to be removed. They use this understanding to share low-level propaganda that is unlikely to breach platform policies, then direct viewers who want to ‘learn more’ towards less moderated platforms and messaging services such as BitChute or Telegram.
By doing this, far-right extremists and white supremacists exploit the global reach and mainstream credibility of YouTube to direct users to lesser-known, unrestricted platforms which they might not otherwise have found. YouTube therefore acts as a portal through which the far-right funnel users towards sites filled with extreme and violent messaging and imagery.
Telegram
Telegram became a favoured messaging platform because of its encryption and light moderation: its one-to-one ‘secret chats’ are end-to-end encrypted, meaning only the sender and recipient can view the messages. Far-right extremists and neo-Nazis use group channels and private messaging to openly call for violence, share propaganda, recruit, and spread misinformation or inflammatory videos to radicalise potential recruits.
Telegram chats have also been used by far-right extremists around the world to plan attacks and rallies, as seen in the organisation of numerous trucker convoys across the US in February 2022 protesting COVID-19 vaccine mandates.
Reddit/4chan/8chan
Anonymous and pseudonymous message and image boards such as Reddit, 4chan, and 8chan have played a key role in the sharing of far-right ideology on their politics boards. Notably, the perpetrators of the Christchurch mosque attacks and the Poway synagogue shooting near San Diego, both in 2019, shared their manifestos on 8chan.
The far-right have also used these platforms to ‘dox’ individuals who oppose them, sharing their private personal information without permission. This doxing led Reddit to ban two of the largest far-right subreddits, r/altright and r/alternativeright; doxing is one of the few violations Reddit consistently acts on in its content moderation.
BitChute/Odysee
Both BitChute and Odysee are alternative video-sharing platforms with fewer restrictions and less content moderation than mainstream platforms like YouTube. They are used by far-right extremists and neo-Nazis to share videos containing white supremacist propaganda, COVID-19 and election conspiracy theories, racism, and violence.
Marketed as “free speech” platforms, they are used by extremists to post dangerous far-right and racist content which not only avoids restriction but also regularly features on the daily “trending” lists. This gives extremist content in-house promotion and pushes it towards potential recruits as well as those already radicalised.
Gab/Parler
Functioning as microblogging, social networking, and messaging platforms, Gab and Parler have become alternatives to Twitter, which has taken steps to crack down on the posts and accounts of far-right supporters. Their free-speech, libertarian approach has been exploited by neo-Nazis and white supremacists looking to share extreme views, call for violence, and express violent intent following mass bans on mainstream platforms.
Twitter in particular functions as a gateway for the far-right to share links to extremist content on Gab. Research in 2021 found that between 7 June and 22 August, more than 112,000 tweets included links to Gab’s website, with a potential reach of over 254 million views, demonstrating the pathway from mainstream social media to lesser-known platforms linked to extremism.
Gab was criticised after it emerged that the suspected shooter in the 2018 attack on a Pittsburgh synagogue had posted his violent intent on Gab immediately prior to the attack. The platform also played a significant role in facilitating the planning of the 6 January 2021 riot at the US Capitol. Despite the criticism, Gab has remained a key platform within far-right communities, particularly for British far-right activist Tommy Robinson and extremist groups like Britain First, who relocated to Gab following their bans from Twitter and Facebook.
VKontakte (VK)
VK is a Russia-based equivalent of Facebook. It is predominantly used by Russian citizens but has become a hotbed for Western neo-Nazis, who openly use racist and antisemitic language and imagery to promote their extreme viewpoints.
Unlike some other Facebook substitutes such as Minds and Gab, VK is not a fringe platform: it is the 20th most visited website in the world. Some far-right extremists use VK to openly display white supremacist beliefs that they would not share on Facebook, whether because of Facebook’s content moderation or because they want to keep the two aspects of their lives separate.
Regardless of the reason, extremists use VK without any form of self-censorship, sharing images of swastikas and Hitler, common Nazi phrases, and their group associations. They use their more moderate Facebook accounts to push users towards their extremist VK accounts, where other far-right supporters can join the neo-Nazi echo chamber that VK has become.
It is likely that the far-right favour VK because it is not a Western-based platform and therefore applies less stringent content moderation. Users get a Facebook-style platform with very little policy enforcement, which they use to communicate with users from numerous countries and to share openly racist and extremist profiles and posts.
Gaming platforms (Twitch, Discord, Steam, DLive)
The far-right have recently begun exploiting gaming-related platforms to recruit and radicalise children and teenagers. They befriend young people through game-related chat features, or use propaganda videos framed as first-person shooter games.
Some far-right extremists treat gaming chats and message boards as social media platforms in their own right, sharing propaganda unrelated to gaming and recruiting young gamers. They also link their profiles to extremist games, largely as a badge of honour rather than to actually play them.
On game-streaming platforms like Twitch, neo-Nazis stream popular games while dressed up as white supremacist or racist versions of ordinary video game characters. They play first-person shooter games while narrating the gameplay as if it fulfilled their racist, homophobic, and antisemitic fantasies. Young people viewing this content are likely unaware of its extremist framing when they first encounter it; they are drawn in under the false pretence that they are simply watching live gameplay.