From Mainstream to Extreme: The Transmission of Harmful Messaging Online

April 19, 2024


In today’s digital age, online spaces have transformed how people communicate. The internet has become a breeding ground for radicalization, often fostering echo chambers and hosting fringe communities. This blog piece explores the dynamics of these virtual environments, extremist messaging, and information dissemination through a conversation with Anna George, a scholar at the University of Oxford specialising in online political behaviours. We speak about fostering digital resilience in an era of misinformation and disinformation, and about how online spaces can both mobilise and minimise extremism.

During my time at the DRIVE project, I was able to observe the myriad ways in which polarisation and life on the fringes can drive exclusion and extremist views. How and why are people increasingly disenchanted with the world around them? How can online spaces provide solace, or further divide us?

Background and context

I had the opportunity to speak with and interview Anna George, a Social Data Science scholar at the Oxford Internet Institute researching online political behaviours and computational approaches to studying online harms. As she finalises her doctorate, her research has largely focused on the transmission of harmful messaging, online communities, alternative news, and extremist sentiments. Her ambition is to make the internet a safer space.

Mainstream social media and alternative spaces

We delved deep into our discussion, examining trends in low-quality news sources and the framing of COVID-19, the Ukraine war, climate change, and migration in extremist online spaces. What Anna’s research has found is that as movements become more mainstream, they are more likely to mobilise offline. Online spaces allow people to meet other like-minded people. This can be as innocent as joining your local knitting club; however, given the vast nature of the internet, it can quickly become more sinister. Extremist people and organisations exploit the ease of the internet to reach broad cross-sections of individuals who might share common views, facilitating radicalization or providing communities where extremist sentiments can be freely shared.

While individuals associated with extremist ideologies use virtual spaces for several reasons, such as community and a feeling of inclusion, Anna points to examples of groups and lone actors mobilising and moving offline. With examples such as the attack in Charleston, USA, the Capitol insurrection, and even climate change conspiracy networks in her local environment of Oxford, we can see once-disparate groups beginning to morph together. Anna marks QAnon as a landmark turning point for rallying conspiratorial groups, representing a conspiracist milieu lying adjacent to other extremist beliefs capable of violent goals, such as the New World Order or Great Replacement theories within white supremacist spaces.

However, what is emphasised throughout our interview is the need to consider not only the capacity for violence but also the detrimental effects of online disinformation spread by a diverse base of supporters. We can observe conspiracy theories “going mainstream” through the creation of subcultures, the boosting of alternative media, the persuasion of the masses, and proxy wars, all of which have political effects.

Conspiracy narratives: trust, COVID-19, and anti-institutional extremism

As we continue our discussion, we land on a central insight into why people are increasingly turning to extremist spaces: trust. Over the past 50 years or so, public trust in institutions and the news media has collapsed. The consumption, distribution, and production of news have been transformed by the digital era and social media. This phenomenon is not limited to conspiratorial people; many are starting to lose trust in democratic governments. Anna emphasises this as a major gateway for foreign state entities to play into these claims, as at the core of extremist ideals lies the perception that you cannot fully trust the government.

We discuss the effects that online misinformation has on undermining public trust. Anna notes that “it is political, because the topics that are discussed are political issues: lockdown, climate change,” citing the US as the clearest example of this, while the trend is beginning to grow in the UK political sphere. These effects point to the weaponisation of such topics.

The COVID-19 pandemic sparked a feeling of alienation and mistrust in institutions. The uncertainty around COVID brought questioning and fear, and the corresponding measures taken by governments heightened anti-government action in Europe. Low-quality news outlets and fringe movements used over-reporting on issues like vaccine hesitancy to spread anti-institutional sentiments, much of which spilled over onto mainstream social media. This was an environment in which conspiracies could thrive; as Anna points out, conspiracy theorising increases with uncertainty. It also provided an opportunity for extremist groups to spread disinformation, gaining exposure and recruitment benefits.

Accountability and Policy Initiatives: Turning Points and Lessons Learned

Our discussion turns to the incentives and stakes that actors and governments have in information sharing and online spaces. Anna contends that considerable confusion remains around this landscape and that online safety, and its effect on the rise of domestic terrorism, is a paramount policy avenue. We must continue to fund research into the impact of these movements, as well as into why mistrust in governments and institutions is on the rise while extremist movements, populist movements, and foreign state media benefit from these narratives. She calls for a multipronged approach among governments, social media companies, and civil society, engaging from an educational standpoint to teach young people, in particular, the importance of online literacy and critical thought. In her own work, Anna performed a systematic review of computational approaches, identifying methods and techniques social media companies can use to combat misinformation and disinformation. Interventions such as content labelling (as true, false, or trustworthy) or fact-checking videos can prime users to question the accuracy of content, so that people can discern for themselves the validity of sources. The UK government's Online Safety Act (2023) is an example of a novel governmental approach to enforcing online safety measures and defining the duties of internet platforms in managing harmful or illegal content.

Closing thoughts

While we covered a wide range of topics during our meeting, this is without doubt an area of research that academics such as Anna George should continue to push forward in creating safer and more critical online spaces. Examining mistrust and disinformation in online spaces can give us insight into the role of social exclusion and emotion in why individuals are attracted to, exploited by, or recruited into these movements through intersecting beliefs; perceptions are powerful.

Written by:

The opinions expressed here are the author's own and should not be taken to represent the views of the DRIVE project.