The misinformation epidemic
This year, the World Economic Forum identified the spread of digital misinformation as one of the most serious threats to human society. The good news? King’s researchers have found a surprisingly fun way to strengthen our defences against it.
False information spreads fast.
In a world where we’re constantly online, a handful of posts and news stories can quickly grow into an epidemic of misinformation and disinformation.
With mobile apps like ChatGPT, Midjourney and Gemini putting artificial intelligence in the pockets of millions of users worldwide, fake content can now be generated in just a few clicks.
Social media platforms are particularly fertile ground for misleading content: recent studies have shown that fake news spreads faster, further and deeper than the truth on social media.
What’s the difference between misinformation and disinformation?
Misinformation
Misinformation is false content that is created and spread unintentionally – think miscalculated statistics or misinterpretations of real events or news. People sharing misinformation genuinely don’t know that what they’re passing on isn’t true; they are not setting out to deceive others.
Disinformation
Disinformation, on the other hand, is deliberately crafted to mislead. Whether it’s clickbait headlines, doctored images or fraudulent claims, the goal is often to influence the thoughts and behaviour of others. This might be for financial gain, political power or ideological reasons.
In an analysis of 126,000 rumours spread by approximately 3 million people, researchers found that false news diffused to up to 100 times more people than the truth.
62% of internet users report seeing content on news sites or social media that they consider doubtful or untrue.
When presented with a list of five conspiracy theories, ranging from the existence of a secret group controlling world events to contact with aliens, the majority of Britons believed at least one to be true.
Once misinformation and disinformation spread, they can become a contagion.
While some fake news may seem harmless – like false rumours about celebrity relationships or football transfer gossip – in other contexts, it can be dangerous.
False claims of election fraud during the 2020 United States Presidential election roused protestors to storm the Capitol building, resulting in deaths, injuries and over a thousand arrests. Similar false narratives have taken off around elections across the globe, from the EU to Brazil and India, fuelling growing distrust in democratic institutions.
Health misinformation gaining traction online poses a serious threat to global wellbeing.
Dr Rachael Kent, Senior Lecturer in Digital Economy & Society Education, highlights attitudes towards sunscreen as an example. Dermatologists report treating more patients with severe sunburns or suspicious moles after those patients stopped using sunscreen on the strength of misleading TikTok and Instagram videos. Online creators touting ‘natural’ cures for diseases like cancer, diabetes and asthma have led to a rise in patients rejecting safe, clinically proven treatments in favour of bogus alternatives.
‘Misinformation created by social media influencers is spreading and this isn’t just a random trend – social media platforms [are becoming] unregulated public health platforms. They influence what users see and believe about health, but unlike public health institutions, they’re not bound by standards for accuracy or harm reduction. The consequences can be serious.’
Dr Rachael Kent
Inoculation can stop the spread of misinformation and disinformation.
Researchers in the Department of War Studies at King’s have found a creative way to protect people from fake news and promote a healthier information environment.
Working with the Department of Psychology at the University of Cambridge, King’s researchers have developed a suite of award-winning games designed to counter misinformation and disinformation, and improve media literacy in a fun and approachable way.
The games are based on inoculation theory, a psychological concept where exposure to small doses of common disinformation tactics helps build resistance against fake news in the real world.
‘Fake news thrives on its virality. People believe it and spread it, infecting others. But if you can inoculate people against believing and sharing fake news in the first place, it works like a vaccine. The virus of disinformation can be contained – and those creating content in bad faith lose their power.
‘Rather than simply telling people what’s true or false, we focus on preemptively debunking misinformation and disinformation by revealing the tactics used to create them. The goal is to empower the public to recognise when someone may be trying to mislead them – and to make their own informed decisions. When you can walk a mile in the shoes of a disinformation creator, you gain a deeper understanding of how it works.’
Dr Jon Roozenbeek
Players take on the persona of a disinformation agent, mastering real-world manipulation tactics like polarisation, sowing conspiracy theories and trolling. Each of the games focuses on different domains of misinformation, including online fake news, political disinformation and intergroup polarisation.
In Bad News, players are given the role of a fake news media tycoon, trying to gain as many followers as possible without completely losing all credibility.
To advance their mission, players create their own alarmist headlines and memes, preying on people’s emotions to go viral. They can try out impersonating real news sources to piggyback on their credibility, and defend themselves against attacks from fact checkers by going on the counteroffensive.
Game over for fake news.
By exposing exactly how disinformation techniques work – and why they’re so effective – the games help players recognise manipulation tactics in the real world and build lasting immunity against them.
And the data shows these games work. In an experiment, participants played either Bad News or Tetris (as a control) and were then asked to rate the reliability of news headlines that used manipulation techniques. The group that had played Bad News rated the manipulative headlines as significantly less reliable than the control group did. The effects of inoculation were also consistent across people with different levels of education, beliefs and personality types.
Bad News, Harmony Square and Cat Park have been played by millions of people around the world, have been translated into multiple languages, and have made their way into hundreds of classrooms as part of new digital literacy curricula.
Researchers leading the project have also advised governments, international bodies and companies like Google and Meta on psychological approaches to understanding and fighting misinformation.
‘Our research shows these games confer real psychological resistance against manipulation techniques. Players from around the world find social media content making use of these techniques significantly less reliable after playing, they are more confident in their ability to spot such content, and less likely to report sharing it with others in their network. They are effective for anyone who does not appreciate being manipulated.’
Dr Jon Roozenbeek
Ready to tackle the fake information epidemic?
Step into the shoes of a disinformation creator, spot the techniques they use to misdirect and mislead, and get inoculated against the spread of fake news.

EDITORIAL TEAM
Teresa Richards
Ellie Stone
WRITERS
Kelly Archer
Paul Brooks
Hermione Cameron
Kate Denereaz
Kate Hazlehurst
Joely Langston
DESIGN
Principal design by Jonathan Vickers
Additional design by Harpoon Productions and Carly Yung
Photography by Nathan Clarke and David Tett
WITH SPECIAL THANKS TO
JH Norris
ALUMNI & EDITORIAL OFFICE
King’s College London
57 Waterloo Road,
London,
SE1 8WA
© King’s College London 2025
InTouch is published by the University’s Philanthropy & Alumni Engagement Office. The opinions expressed in it are those of the writers and not necessarily those of the University.
If you have a story for our Spring 2026 issue, email us at forever@kcl.ac.uk

