Cutting through the chaos – how disinformation is shaping our reality
Professor Martin Innes’ research on disinformation explores its impact on democratic societies in the information age.
Early in 2024, when false rumours began circulating about the Princess of Wales – before she announced she was stepping back from public duties due to her cancer diagnosis – experts at Cardiff University’s Security, Crime, and Intelligence Innovation Institute (SCIII) were carefully studying social media.
Among the deluge of speculation and conjecture, Professor Martin Innes and his team quickly identified something more sinister: an organised, foreign state-backed operation to amplify those stories online. The revelations would go on to hit headlines around the world, with coverage from the BBC, the New York Times, and more than 350 other news outlets internationally.
In October 2024, six Russian agencies and individuals involved in this campaign were sanctioned by the UK government. The UK Foreign Office said the so-called Doppelganger group were part of a “vast malign online network” intended to cause disruption and confusion, distributing fake news and undermining democracy.
Professor Innes says: “The massive international interest in this story was the ideal breeding ground for such a campaign. Doppelganger’s signature methodology is deploying very large numbers of disposable social media accounts to flood the information space around particular stories. This can prove especially influential when they are able to amplify narratives that appear less overtly political.
“And this is precisely what they did in trying to exploit the rumours and conspiracies about the Princess of Wales. In repeating and reheating these, they were able to disperse their anti-Ukrainian messaging, whilst also attacking a key British institution: the Royal Family.
“What we've seen over an extended period of time is the many ways that the techniques being pioneered by foreign state actors are becoming normalised on social media.”
A growing and evolving threat
“The stakes have never been higher when it comes to disinformation,” says Professor Innes. “And it’s our role as academics to help people understand that.”
As co-director of SCIII, Professor Innes has been leading major studies into disinformation for a decade. The Disinformation, Strategic Communications and Open Source (DISCOS) research group has now worked on or in more than 40 countries, blending social, behavioural, data and computer science methods, supported by governments, research funding councils and civil society organisations. The scale of the work reflects the growing concern about the cumulative impacts of disinformation upon democratic societies. And after a year of major conflicts and elections, the prevalence of disinformation certainly shows no sign of abating.
“Disinformation is information that has been sent into the world with the direct intention of sowing distrust and chaos,” explains Professor Innes.
“Disinformation is not a new thing – but the speed and scale at which it can spread now thanks to technology is what has made it one of the most pressing challenges of modern times. Indeed, at the start of 2024 the World Economic Forum highlighted disinformation as the number one short-term risk to global prosperity and security. Major world events over the past few years have created ‘a goldilocks environment’ for disinformation – we’ve seen Brexit, COVID-19 and the war in Ukraine, among others, used as conduits for disinformation to enter our social media feeds.
“In the run-up to the recent US election, we saw lots of different claims being thrown around. In an environment where views have become so polarised, it presents an opportunity for false narratives – by both domestic and foreign actors – to be planted and amplified more easily. And in Russia, we’re seeing that this isn’t just a technique used in the shadows by its intelligence community; it’s becoming an increasingly commercial, highly organised and well-regarded professional industry. As the pace of technology moves quicker and quicker, these methods are only going to become more sophisticated.”
An information revolution
So, when did online disinformation become such a powerful force? Professor Innes’ work initially focused on UK policing – including pioneering in-depth studies of how police investigators organise and conduct their work.
But as technology and social media became more central to police investigations, they also became integral to the work of Professor Innes and his team.
He explains: “Following the murder of Lee Rigby, a lot of work was being done to understand the role of mass and social media in shaping public reactions and understanding during and after terrorist events. It was clear that changes in the way people were obtaining information were having an influence on how terrorism was being carried out, what press and broadcast journalism were subsequently reporting, and how the public was receiving that information. Police communication strategies therefore needed to reflect that to keep pace with the changes.
“In the years since that work was carried out, the use of social media has completely transformed how we consume information. Our work has had to quickly evolve to respond to that. The threat has grown and become more complex – technology now holds the keys to all aspects of our lives. It’s a central tool for us – but as the geopolitical climate has become more volatile, we’ve seen social media being hijacked for more sinister purposes.”
Research at the Security, Crime, and Intelligence Innovation Institute
We bring together world-leading inter-disciplinary expertise to develop new insights, evidence and knowledge to manage the key crime and security challenges of our age.
Joining the dots
Tara Flores is part of the research team sifting daily through vast numbers of online posts across a variety of platforms, looking for the signals and traces that can pinpoint instances of disinformation. With a background in public relations and strategic communications, she made the move into the study of disinformation following the pandemic.
“For me, coming to the work with a PR and media relations background is maybe a bit different to the usual academic route and it gives me a different perspective. I might see something and think about it in terms of what audience it’s trying to reach or have a view on how that narrative might develop.
“It was initially a little bit of a black box in some ways for me,” she adds. “Everyone says it’s happening – but I wanted to delve deeper to see what that means and to better understand the impact of disinformation. Rigorously studying it gives us the opportunity to educate major stakeholders – such as governments, policymakers and technology companies.
“The excitement comes from seeing these social media profiles that you think don’t seem quite right, but you don’t understand why. When you dig deeper and start finding connections and patterns which tie into a bigger picture, there’s a real buzz – providing those missing insights that haven’t been published or seen before.”
But Tara admits that finding disinformation is only the first piece of the puzzle. “The pace of technology is changing at an incredible rate and there are lots of different parties taking advantage of that,” she says.
“It requires lots of different teams like ours keeping track of those developments. If nobody is watching, then the other side is at a complete advantage. But what happens next is the bigger, more difficult question. How do you protect democracy and communicate effectively to people who are understandably distrustful and confused by the content they see? It’s a huge challenge – and one that needs effective policies in place to navigate it.”
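The kind of pattern-finding Tara describes – spotting accounts that don’t “seem quite right” and tracing the connections between them – can be made concrete with a toy example. The sketch below is an illustrative heuristic only, not the team’s actual tooling: it flags groups of accounts posting near-identical text within a short time window, one commonly cited signal of the “disposable account” amplification described earlier. All data, field names and thresholds here are invented for illustration.

```python
# Illustrative sketch only: a simple coordination heuristic, not the research
# team's actual method. It flags normalised post texts shared by several
# distinct accounts within a short window -- one common signal of coordinated
# "disposable account" amplification. All data and thresholds are invented.
from collections import defaultdict
from datetime import datetime, timedelta

posts = [
    # (account_id, timestamp, text) -- toy data standing in for a platform feed
    ("acct_001", datetime(2024, 3, 18, 9, 0), "Shocking new royal rumour..."),
    ("acct_002", datetime(2024, 3, 18, 9, 2), "Shocking NEW royal rumour..."),
    ("acct_003", datetime(2024, 3, 18, 9, 4), "Shocking new royal rumour..."),
    ("acct_004", datetime(2024, 3, 20, 9, 0), "Unrelated holiday photos"),
]

def normalise(text: str) -> str:
    """Collapse case and whitespace so trivially edited copies still match."""
    return " ".join(text.lower().split())

def flag_coordinated(posts, window=timedelta(minutes=30), min_accounts=3):
    """Return texts posted by >= min_accounts distinct accounts within `window`."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[normalise(text)].append((ts, account))
    flagged = {}
    for text, hits in by_text.items():
        hits.sort()  # order by timestamp
        for start_ts, _ in hits:
            accounts = {a for t, a in hits if start_ts <= t <= start_ts + window}
            if len(accounts) >= min_accounts:
                flagged[text] = sorted(accounts)
                break
    return flagged

print(flag_coordinated(posts))
# {'shocking new royal rumour...': ['acct_001', 'acct_002', 'acct_003']}
```

In practice, analysts layer many such signals – account creation dates, posting cadence, shared links and imagery – before attributing anything to a coordinated operation.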
Emotional triggers
Researchers aren’t just using reactive techniques to track the methods used to spread disinformation: they’re also working to understand the psychological factors that make us all more susceptible to false narratives.
Bella Orpen is a research assistant working on survey data to understand the psychological influences that lead people to believe and share false stories online. “There’s a lot of evidence to suggest that people tend to think they are able to spot misinformation and that it’s others who get sucked in,” she says. “But actually, our studies show that people often overestimate their ability to distinguish fake news, particularly when certain triggers are used.”
For one study, the team devised a fake news story and showed it to 8,630 people to assess their reactions. The fabricated story, about a ‘spy dolphin’ heading to a popular holiday resort, was intentionally similar to previous media reports about the use of dolphins and whales for government espionage activities.
The results showed more than half (53%) of those who thought they recognised the news story believed the content to some degree.
The data showed emotional impact was a key driver of engagement, with 78% of those who felt very ‘fearful’, 70% of those who were very ‘surprised’ and 84% of those who were very ‘excited’ saying they would have interacted with the story in some way.
Bella says: “Recognition and repetition is a powerful way of getting people to believe disinformation, and this creates issues for some protective methods, such as fact checking, which typically involve repeating the false claim. There are also certain emotional factors that will get people to share content. If they have a psychological reaction, even a positive one, to what they’re seeing, they are more likely to amplify it in some way – either by sharing or commenting on the post.”
A growing industry
Academics in the West aren’t alone in their quest to understand disinformation. In Russia, political technologists are also studying the causes and impact of disinformation.
This sector is becoming increasingly well-respected, with its insights used to inform Russia’s geopolitical strategy.
Analysis by Professor Innes and his team showed that, at an online conference held last year, political technologists were planning influence operations designed to impact the upcoming US elections by exploiting “wedge issues” around immigration, identity politics and culture wars.
They also constructed detailed profiles of senior US political figures who they predicted might feature prominently in the electoral race.
One political technologist even visited London for his research into British electoral processes during the Brexit vote in 2016. Yevgeny Minchenko’s publication, ‘How elections are won in the USA, Great Britain and the European Union: analysis of political technologies’, includes survey materials provided to him by UK politicians, campaign staff, political consultants, and journalists.
The team’s analysis of open-source data shows that Minchenko was in Mayfair conducting “participant observation” on the day of the historic Brexit vote in 2016, sharing pictures of polling stations with his social media followers in Russia.
Professor Innes says: “For a number of years, our conversations about these issues have pivoted around the discovery of fake social media accounts, and the disinforming narratives and visuals that they disseminate. This evidence adds a new element, shifting our attention upstream to show who is responsible for designing and deploying these campaigns in the first place.
“In many ways, political technologists mirror the techniques and approaches being used by Western ‘open-source intelligence’ analysts, albeit refracted through a Russian lens. The level of resource and training we are seeing Russia pour into this field is indicative of the increasing emphasis it puts on digital political technologies as a strategic instrument for geopolitical influence.
“There is no doubt that information warfare is evolving at pace – and we need to be armed with the knowledge so that we can find effective strategies to counter it.”
Professor Martin Innes
Co-Director (Lead) of the Security, Crime and Intelligence Innovation Institute
Our research
Read more about our research into disinformation.
Reports
Putin’s ‘Little Grey Men’: Russia’s political technologists and their methods
Report from Security, Crime and Intelligence Innovation Institute
How a Kremlin-Linked Influence Operation is Systematically Manipulating Western Media to Construct & Communicate Disinformation
Minutes to Months: A rapid evidence assessment of the impact of media and social media during and after terror events
Articles
- Innes, M., Davies, B. and Lowe, T. 2023. Counter-governance and 'post-event prevent': regulating rumours, fake news and conspiracy theories in the aftermath of terror. International Journal of Law, Crime and Justice 72, 100370. (10.1016/j.ijlcj.2019.100370)
- Innes, H. and Innes, M. 2023. De-platforming disinformation: conspiracy theories and their control. Information, Communication and Society 26(6), pp. 1262-1280. (10.1080/1369118X.2021.1994631)
- Innes, M., Dobreva, D. and Innes, H. 2021. Disinformation and digital influencing after terrorism: spoofing, truthing and social proofing. Contemporary Social Science 16(2), pp. 241-255. (10.1080/21582041.2019.1569714)
- Innes, M. et al. 2021. The normalisation and domestication of digital disinformation: on the alignment and consequences of far-right and Russian State (dis)information operations and campaigns in Europe. Journal of Cyber Policy 6(1), pp. 31-49. (10.1080/23738871.2021.1937252)
- Dobreva, D., Grinnell, D. and Innes, M. 2020. Prophets and loss: how "soft facts" on social media influenced the Brexit campaign and social reactions to the murder of Jo Cox MP. Policy and Internet 12(2), pp. 144-164. (10.1002/poi3.203)
- Grinnell, D. et al. 2020. Normalisation et domestication de la désinformation numérique : les opérations informationnelles d'interférence et d'influence de l'extrême droite et de l'État russe en Europe [Normalisation and domestication of digital disinformation: far-right and Russian state interference and influence information operations in Europe]. Herodote: Revue de Geographie et de Geopolitique 2-3(177/17), pp. 101-123.