Vaccination Against Disinformation

A new technique developed by Cambridge University academics, called “prebunking”, which pre-emptively exposes people to weakened doses of malicious or false propaganda, proved effective in developing greater critical awareness and promises to become a useful tool in combating disinformation.

What Is “Inoculation Science” and How Does It Work to Combat Disinformation?

By Gabriel Levy

www.galevy.com

One of the greatest challenges facing humanity at the start of this third decade of the 21st century is disinformation [1]. The last pandemic taught us that the infodemic proved to be as serious as the virus itself: belief in misinformation about coronavirus disease 2019 (COVID-19) was significantly associated with a lower willingness to be vaccinated against the disease and a lower intention to comply with public health measures [2].

In this context, a team of psychologists from the universities of Cambridge and Bristol, in partnership with Jigsaw [3], a Google unit dedicated to addressing threats to open societies, decided to investigate misinformation with the aim of developing successful pre-emptive strategies.

The Science of Information Inoculation

To create a sort of vaccine or antidote against fake news, the academics from both universities produced 90-second clips designed to familiarize users with manipulation techniques such as “scapegoating” and “deliberate incoherence.”

Using a technique called “prebunking,” the scientists pre-emptively expose Internet users to small doses of obvious and deliberately malicious false information, so that users can better identify online falsehoods, regardless of the subject matter.

The project, named “inoculation science against misinformation,” is based on the same principles biology uses to develop vaccines, but applied in the field of psychology: giving people “micro-doses” of misinformation in advance proved very effective in preventing them from falling into the trap of fake news in the future. This approach rests on a principle that social psychologists call “inoculation theory.”

The results of the research were recently published in the journal Science Advances [4], detailing a total of seven experiments involving nearly 30,000 participants.

Six experiments were conducted initially, and a seventh was run a year later to ensure that the earlier findings could be replicated.

Data collection for each participant was exhaustive, ranging from basic information (gender, age, education, political leaning) to numeracy, conspiratorial thinking, news and social media consumption, receptivity to fake news, and a personality inventory, among other variables.

Taking all this into account, the team found that the inoculation videos improved people’s ability to detect misinformation and increased their confidence in being able to do so again. The clips also improved the quality of “sharing decisions”: whether or not to disseminate harmful content.

Successful Results

This is the first “real-world field study” of inoculation theory on a social media platform, with a very large sample, published in an indexed journal.

The study shows that even a single viewing of one of these deliberately crafted inoculation clips increases awareness of misinformation.

“Our research provides the necessary proof of concept that the principle of psychological inoculation can be easily scaled among hundreds of millions of users worldwide.” [5]

Professor Sander van der Linden

The paper’s lead author, Dr. Jon Roozenbeek of Cambridge’s Social Decision-Making Lab (SDML), describes the team’s videos as “source independent,” avoiding the biases people have about where information comes from and how it matches, or doesn’t match, what they already believe.

“YouTube has more than two billion active users worldwide. Our videos could easily be embedded in YouTube’s advertising space to prebunk misinformation,” said study co-author Professor Sander van der Linden, director of the Social Decision-Making Lab (SDML) at Cambridge, who led the work [6].

Practical Applications

Google, YouTube’s parent company, is already capitalizing on the findings. In late August, Jigsaw will launch a multi-platform campaign in Poland, Slovakia and the Czech Republic to get ahead of emerging disinformation related to Ukrainian refugees.

“Harmful misinformation takes many forms, but manipulative tactics and narratives are often repeated and therefore predictable.

Teaching people about techniques such as ad hominem attacks that set out to manipulate them can help build resilience against believing and spreading misinformation in the future.

We have shown that video ads, as a method of delivering prebunking messages, can be used to reach millions of people, potentially before harmful narratives take hold.”

Beth Goldberg, co-author and Director of Research and Development for Google’s Jigsaw unit.

Better Sooner than Later

The researchers found that prebunking can be more effective at combating the deluge of disinformation than verifying each falsehood after it spreads — classic “debunking” — which is impossible to do at scale and can entrench conspiracy theories because corrections can feel like personal attacks on those who believe them.

“Propaganda, lies and misdirection are almost always created from the same playbook,” said co-author and professor Stephan Lewandowsky of the University of Bristol.

“We developed the videos by analyzing the rhetoric of demagogues, who deal in scapegoating and false dichotomies (…) Fact-checkers can only refute a fraction of the falsehoods circulating online. We need to teach people to recognize the misinformation playbook, so that they understand when they are being misled.”

In conclusion, the study led by the universities of Cambridge and Bristol, in partnership with Jigsaw, demonstrated that with a technique based on “inoculation theory,” psychologists can activate users’ defensive mechanisms against fake news, increasing their critical capacity to identify it and thereby hindering its propagation.


[1] Lewandowsky, S.; Ecker, U., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. In J. Appl. Res. Mem. Cogn. 6: 353–369.

[2] Roozenbeek, J.; Schneider, C.; Dryhurst, S.; Kerr, J.; Freeman, A.; Recchia, G.; van der Bles, A., & van der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world. In R. Soc. Open Sci. 7: 201199.

[3] See the official site at https://jigsaw.google.com/

[4] Roozenbeek, J. et al. (2022). Psychological inoculation improves resilience against misinformation on social media. In Science Advances, 8(34). Available at https://www.science.org/doi/10.1126/sciadv.abo6254

[5] See the full article “Social media experiment reveals potential to ‘inoculate’ millions of users against misinformation,” University of Cambridge, at https://www.cam.ac.uk/stories/inoculateexperiment

[6] Op. cit. Available at https://www.cam.ac.uk/stories/inoculateexperiment