We define social media platforms broadly and include YouTube, Reddit and 4chan /pol/, as these allow not just the dissemination of content but interaction between users, and can thus be considered social. The platforms were selected based on the size of their user bases as well as their importance to online antisemitism as understood by the authors of this report. We also aimed to explore as wide a variety of platforms as possible, in order to compare and to better understand how different media influence the expression of antisemitism. For practical reasons we have therefore excluded sites that are similar in their functionality, for example including Parler but not Gab.
The goal of this analysis is to examine how antisemitism presents itself across different platforms: which types of antisemitism predominate on each, and which platform functions are used most actively to propagate antisemitic ideas. The aim is not to precisely quantify and compare antisemitism across the internet; rather, it is to provide an up-to-date and accurate picture of how antisemitism is expressed and what tactics are used to spread it on these different platforms.
The case studies find a range of types of antisemitism, the most prominent being conspiratorial antisemitism, which is found on every single platform. We also find large amounts of Holocaust denial, trivialisation of the Holocaust and extreme anti-Zionism on both mainstream and niche platforms. More extreme, genocidal and directly violent antisemitic content is most prominently, but not exclusively, found in niche communities on platforms like Telegram, Parler and 4chan /pol/. These ideas are expressed in a variety of ways, ranging from pseudo-scientific arguments about race and historical events to transgressive ‘jokes’.
Conspiratorial antisemitism is the form of antisemitism that our case studies identify as the most prominent across all platforms in this report. Both mainstream platforms, such as Facebook, and largely unmoderated forums, such as 4chan /pol/ and Telegram, are awash with antisemitic conspiracy theories built on common theses such as supposed Jewish influence over governments and world politics (often referred to as ‘ZOG’, for “Zionist Occupied Government”, alongside related ideas such as the ‘Deep State’).
On mainstream platforms we found that antisemitic conspiracy theories make up the largest proportion of the antisemitism identified. A likely explanation is that, as opposed to more explicit antisemitism (which in many cases constitutes a criminal offence), conspiratorial antisemitism is often harder to recognise as hate, and is therefore more prevalent on more heavily moderated platforms like Facebook and YouTube.
In recent years, online conspiracy theory communities have grown. Reddit hosts a subreddit called r/conspiracy, active since 2008 and dedicated to the discussion of any and all forms of conspiracy theory. In the following case study we find that users in the forum frequently discuss antisemitic conspiracy theories about Jewish families and individuals such as the Rothschilds and George Soros, to whom they ascribe vast and malign influence over world politics and finance.
The ongoing COVID-19 pandemic is also a frequent topic of discussion and may have contributed to attracting new members to the forum. Antisemitic conspiracy theories are spread through text, video and images, which makes quantification difficult. However, we can gauge the spread of individual pieces of influential media and identify the platforms on which they are most commonly found. Several videos are, at the time of writing, regularly shared in conspiracy theory groups and channels as well as in far-right contexts.
EUROPA: The Last Battle is currently one of the most shared antisemitic conspiracy theory films. It is a 12-hour-long “documentary” from 2017 detailing the supposedly undue influence of Jews in Europe. It was produced by a Swedish far-right activist who has been associated with the Nazi street group the Nordic Resistance Movement. The film is now spread widely in antisemitic and conspiracy theory groups, and therefore highlights how conspiracy theories can be a route into more extreme far-right sympathies. Uploads of the film have been blocked and moderated on YouTube and Facebook, but links to the video on other sites can still be shared. On the far-right video sharing site BitChute it has received over 900,000 views as of September 2021.
Using CrowdTangle, Facebook’s social media insights tool, to gather statistics on the number of shares of EUROPA shows that Instagram is one of the most important sources of traffic to the video. This highlights how even a primarily mobile social media platform focused on sharing visual content can be an effective tool for spreading conspiracy theories and far-right propaganda. Across posts on public profiles on Instagram, links to the video have received over 8,000 ‘likes’ and 37,000 views. By comparison, among far-right and conspiracy theory channels on Telegram, the link to the BitChute video has been shared at least 594 times, with the most viewed post reaching 221,000 views. The film has also been cut into hour-long segments and uploaded to a dedicated Telegram channel.
Coded language is another practice that our case studies identify across the multiple platforms in this report. The tactic is simple, aiming to evade moderation and automated detection by modifying spelling or changing antisemitic vocabulary. It can involve explicit coded language, such as using the word ‘skypes’ to refer to Jews (as promoted by a far-right campaign in 2016), or words that are spelled differently but sound similar, for example ‘juice’ for ‘Jews’. Simpler versions might just replace individual letters with an asterisk. On TikTok, our case study identifies emoji combinations as a form of coded language.
While simple in nature and easily recognisable to anyone attuned to current far-right campaigns, the ever-changing vocabulary means that antisemitic coded language presents difficulties for moderation at scale. Coded language was most often found on mainstream, moderated platforms, including Facebook, Reddit and Twitter. The likely explanation is that these larger platforms are more proficient at removing antisemitic language, and that using it can result in bans, whereas 4chan /pol/ and Telegram do not punish users who engage in explicit antisemitism.
Coded language also goes beyond the purely practical goal of avoiding moderation. It functions as far-right jargon that signals belonging to the movement, meaning that individuals even on largely unmoderated platforms might occasionally use it as well, especially language connected to far-right campaigns, like the word ‘skypes’.
By its nature coded language is difficult to quantify, as it explicitly attempts to avoid automated ways of categorising it as antisemitism. Keyword search and even more sophisticated text categorisation methods are therefore often insufficient.
Advocacy of terrorism
Largely unmoderated platforms like Telegram and 4chan /pol/, and to a somewhat lesser degree Parler, are home to some of the most extreme antisemitism on any of the platforms studied in this report. Telegram and 4chan /pol/ have both facilitated far-right terrorist subcultures, of which antisemitism is a core component.
4chan /pol/, and previously its counterpart on 8chan, alongside other similar imageboards, have been used to pre-announce terror attacks and to spread manifestos and letters ahead of the attacks in Christchurch, New Zealand and Halle, Germany. These manifestos paint progressives, the media and minorities as the enemies of the ‘white race’, and they are filled with antisemitic ideas and conspiracy theories. Especially concerning is the often direct encouragement and support that terrorism receives on these forums. On Telegram and 4chan /pol/ there is a subculture that cheerleads for and deifies terrorists, and regards mass murder not only as a means to revolution and retribution, but as a form of entertainment.
This is the audience to which these gunmen addressed their sprees; it enables budding far-right activists to radicalise and network, and allows new groupings to coalesce. After the attacks, live-streamed videos and images of the attackers and their victims, as well as the manifestos, spread rapidly across far-right channels on all platforms studied in this report. However, while moderated platforms like Facebook and YouTube eventually managed to remove most references to the material, it continues to circulate on Telegram and similarly unmoderated forums.
The previous section on the growing conspiracy communities during the pandemic, across the platforms examined for this report, demonstrates the role of news events in influencing online discussions. A more extreme form of this phenomenon is the role of terrorism and attacks on minority communities in outbursts of antisemitism online.
Looking closer at antisemitic attacks, we find that these trigger events cause a rise in antisemitic content in the far-right communities on 4chan /pol/. The forum is consistently an antisemitic platform; based on keyword matching of antisemitic terminology we find that, on average, 5.5% of all posts contain antisemitic slurs. This number might not sound high, but considering that many posts on /pol/ are simply images or contain very little text, it is considerable, and higher than on any other platform studied for this report. It also likely underestimates the amount of antisemitism on the forum, as other, less common terms are also used, and antisemitism can be expressed through conspiratorial ideas that never directly mention Jewish people, or through images and videos, which we have not analysed for this report. Moreover, many posts contain other forms of racism.
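The keyword-matching measure described above can be illustrated with a minimal sketch. The report's actual lexicon of antisemitic terms is not reproduced here; the `TERMS` list below is a neutral placeholder, and the function name is our own.

```python
# Hypothetical sketch of the keyword-matching approach: the share of posts
# containing at least one term from a lexicon. TERMS is a placeholder, not
# the lexicon actually used for the report.
TERMS = ["term_a", "term_b"]

def share_of_matching_posts(posts, terms=TERMS):
    """Return the percentage of posts containing at least one keyword."""
    if not posts:
        return 0.0
    hits = sum(1 for post in posts if any(t in post.lower() for t in terms))
    return 100.0 * hits / len(posts)
```

Note that this kind of measure shares the limitations the report itself flags: image-only posts contribute nothing to match against, and coded spellings slip past any fixed keyword list.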
2019 saw the deadliest far-right terror attack since the Oslo attack in 2011. In March, a far-right activist published a screed that railed against Muslims and was filled with extreme antisemitic conspiracy theories, and then killed 51 people in two mosques in Christchurch, New Zealand. Activity on /pol/ spiked by 84% on the day of the attack and remained elevated for several days. Although the attack targeted two mosques, antisemitic language similarly rose to its highest level of 2019.
It should be noted that the amount of antisemitism relative to the total number of posts on the forum remained around the average. These observations speak to the fact that /pol/ provides a forum for the expression of antisemitic ideas, and that terror attacks, as well as other prominent news events, provide topics for discussion even if they are only partially connected to Jews or antisemitism. When there is more to discuss on the site generally, more antisemitism will be expressed, because antisemitic language is part of its culture and its members find ways to connect news events to the existing vast repertoire of antisemitic tropes and conspiracy theories.
Though the amount of antisemitism expressed is deeply worrying, part of the explanation for why it does not rise (in relative terms) even more could also be that the boards are already so saturated with antisemitic content that it may not be possible for it to rise significantly above its current level. However, it is important to underline that the type of antisemitism being expressed could still vary over time and take on a more violent character after terror attacks, as indicated in the previous section.
All social media platforms in the following case studies display almost every kind of antisemitism, ranging from conspiracy theories to calls for direct violence. However, different sorts of antisemitic language are present to different degrees and are expressed through a variety of media and tones, whether more coded or more overt. In other words, the platforms play different roles in spreading antisemitism.
Users have adapted their discourse to the moderation practices, anonymity and functions of the different platforms. The following case studies look closer at the role of comment functions, recommendation systems, humorous images, short video clips and the many other ways in which antisemitism is perpetuated today. Together they demonstrate the malleability of antisemitic discourse and the creativity of those who seek to spread its ideas.
You can find the detailed case studies on Facebook, Instagram, Parler, Reddit, Telegram, TikTok, Twitter, YouTube and 4chan /pol/ in the report:
Antisemitism in the Digital Age
Online Antisemitic Hate, Holocaust Denial, Conspiracy Ideologies and Terrorism in Europe
A Collaborative Research Report by Amadeu Antonio Foundation, Expo Foundation and HOPE not hate
- Executive Summary
- The Report in Numbers
- Conspiracy Ideologies, COVID-19 and Antisemitism
- Superconspiracies: QAnon and the New World Order
- Case study: Path of radicalisation into antisemitism
- The Changing Nature of Holocaust Denial in the Digital Age
- Case Studies of Antisemitism on Social Media:
- “4chan /pol/”
- Glossary of Antisemitic Terms
- Learnings from Project
Download the report as a PDF here: