People can be “vaccinated” and at least partially protected from misinformation online, according to a new study, which found that becoming familiar with common manipulation tactics helped them spot harmful content.
By explaining these tactics through short cartoons played during YouTube ad slots, the researchers were able to “vaccinate” users against misinformation.
The researchers behind the Inoculation Science project liken the approach to a vaccine: by giving people a “microdose” of misinformation in advance, they help prevent them from falling for it in the future.
The technique, described by the study authors as a “psychological inoculation,” was found to be consistently effective with users across the political spectrum.
The study included what the authors describe as the first real-world field test of inoculation theory on a social media platform.
And the authors say the short cartoons sidestep any preconceptions viewers might have about where a piece of information came from or whether it is true. Instead, they offer general advice on how to spot potential misinformation.
“Our interventions make no claims about what is true or fact, which is often disputed,” said lead author Dr. Jon Roozenbeek of the University of Cambridge’s Social Decision-Making Lab.
“They are effective for anyone who doesn’t appreciate being manipulated.”
He added: “The inoculation effect was the same for liberals and conservatives. It worked for people with different levels of education and different personality types. This is the basis of a general inoculation against misinformation.”
Working with Jigsaw, a team within Google that the company says works to combat threats to open societies, psychologists from the Universities of Cambridge and Bristol created 90-second animated clips that familiarize people with common misinformation techniques.
These include the use of emotional language, intentional incoherence, false dichotomies, scapegoating, and ad hominem attacks.
The idea is to prebunk misinformation before it is consumed by viewers, rather than debunking misinformation after it has already spread.
The authors argue that debunking cannot be done at scale, and that prebunking may prove more effective.
“YouTube has well over 2 billion active users worldwide. Our videos could easily be embedded in advertising space on YouTube to prevent misinformation,” said study co-author Professor Sander van der Linden, head of the Social Decision-Making Lab.
“Our research provides the necessary proof of concept that the principle of psychological inoculation can readily be scaled to hundreds of millions of users worldwide.”
The study consisted of seven experiments involving nearly 30,000 participants.
Six initial controlled experiments involved 6,464 participants, with the sixth conducted a year after the first five to confirm that the earlier results replicated.
Researchers collected data on each participant, including basic demographic information as well as numeracy skills, conspiratorial thinking, and interest in news, among other measures.
They found that the inoculation videos improved people’s ability to spot misinformation and boosted their confidence in being able to do so again.
Two of the videos were then tested in advertising space on YouTube as part of a large experiment.
Around 5.4 million people in the US saw the videos, with almost a million watching for at least 30 seconds. Some of these users were then asked to answer an optional test question.
“Manipulation tactics are predictable”
“Harmful misinformation takes many forms, but the manipulative tactics and narratives are often repeated and therefore can be predicted,” said Beth Goldberg, co-author and head of research and development for Google’s Jigsaw unit.
“Teaching people about manipulation techniques such as ad hominem attacks can help build resilience to believing and spreading misinformation in the future.
“We have shown that video ads can be used as a delivery method for prebunking messages to reach millions of people, potentially before harmful narratives take hold,” Goldberg said.
The ability to spot manipulation techniques at the heart of misinformation increased by an average of 5 percent among participants in the experiments.
“On average, users took the tests about 18 hours after watching the videos, so the inoculation seems to have stuck,” said van der Linden.
Researchers say extending this concept to social media platforms could have far-reaching implications for reducing the impact of misinformation.
Google says it’s already acting on the findings and that its Jigsaw team will conduct a cross-platform prebunking campaign in Poland, Slovakia and the Czech Republic in late August to forestall emerging disinformation about Ukrainian refugees.