A vaccine against misinformation

In parallel with the rollout of the vaccination program against the SARS-CoV-2 virus, researchers in the psychological sciences have been investigating the possibility of conferring immunity on the population against misinformation and fake news. In this article, get your own “vaccine for misinformation”.

Despite countermeasures implemented at different levels of the population, it continues to spread, and every day we see new outbreaks everywhere. We could be talking about the SARS-CoV-2 virus, whose sudden appearance just under a year ago resulted in the current pandemic.

However, the opening sentence of this text applies equally to disinformation and fake news, which have found in the dynamics underlying modern social networks an “ecology” suited to their emergence, propagation and adverse effects.

Surprisingly, the analogy between SARS-CoV-2 and misinformation on social media can go beyond the superficial parallelism rehearsed in these lines: at a time when vaccines against SARS-CoV-2 have just been developed and vaccination programs rolled out, researchers in the psychological sciences have been studying ways to implement similar strategies to combat misinformation.

The idea of fostering individual psychological resistance to false information is not new; it goes back to the 1960s.

At the time, social psychologists involved in the Attitude and Persuasion Program at Yale University conducted studies in an attempt to address fears about the “brainwashing” and persuasion of American soldiers captured in the Far East.

It was in this context that William McGuire developed the so-called Inoculation Theory (1970), which has re-emerged as the subject of several recent studies as a potential strategy for combating modern forms of disinformation.

A “vaccine for misinformation” is particularly promising, as alternative solutions have proven either ineffective (such as fake-news correction initiatives which, by necessarily focusing on particular pieces of information, are more time-consuming and have narrower reach than the misinformation itself) or prone to adverse effects (such as computer algorithms that filter fake news, or legislative and regulatory initiatives).

The notion of psychological inoculation closely follows the basic idea of its biomedical analogue: if a person is exposed to small, relatively “weakened” samples of false information, this will trigger reasoning processes which, like “mental antibodies”, can be reactivated upon future exposure to misinformation, resulting in an equivalent of “psychological immunization”.

In its classic version, a “psychological vaccination” generally involves two basic components, one affective and one cognitive: (i) the person is warned that they will be exposed to a skewed and false piece of information, similar to those they may find in their day-to-day life – the purpose here is to trigger an emotional response to a possible “threat” and the consequent activation of resistance reasoning processes (affective component); (ii) the information is then presented, possibly accompanied by counterarguments and rebuttals (cognitive component).

A meta-analysis of 54 classic studies on “psychological inoculation” revealed that it is more effective in building resistance to misinformation than the mere provision of reliable information, and that the “immunization” effect lasts for at least two weeks.

Obviously, as with its immunological analogue, the success of psychological inoculation critically depends on a clear understanding not only of the mechanisms and processes underlying the main forms of misinformation against which immunization is desired, but also of the psychological phenomena associated with vulnerability to them.

Interestingly, some recent studies (e.g., Pennycook et al., 2020) showed that, in general and despite individual differences, the ability to discern false news and information does not align with the intention to share it on social networks.

Indeed, when participants in a study were asked to indicate the degree to which they believed a piece of disinformation and the degree to which they would share it on social media, the two answers differed from each other.

In other words, what motivates someone to share misinformation is not necessarily the degree to which they believe in its veracity, but rather the degree to which they agree with part of its content, or the degree to which it is consonant with their sociocultural affiliation. This is easily understood if one notes that we tend to be positively reinforced (with likes and social interactions, in the form of comments) for sharing content that is congruent with the social group we identify with, and not necessarily for its veracity and accuracy.

At the same time, when asked about this, most people indicate that it is important to them to share only credible and accurate information on social media.

Furthermore, in the same study, when people were first asked to indicate how credible a single piece of news seemed to them, their later-reported intentions to share pieces of disinformation on social media, even ones with different content, tended to correlate with those credibility judgments.

Apparently, the mere fact of having been previously asked to indicate “how much they believed” a piece of news was enough to highlight that dimension and, consequently, trigger the cognitive processes associated with discerning false news.

This result encapsulates the minimum conditions for a “psychological inoculation” – the presentation of a sample of disinformation accompanied by a warning that it may not be trustworthy, implicitly present in the question posed to the participant.

In the same vein, other authors have sought to implement the logic of Inoculation Theory in small interactive games, which present themselves as “broad-spectrum vaccines” (not yet available in Portuguese) against misinformation on social networks: Bad News (focused on fake news in general), Harmony Square (with an openly political context) and Go Viral! (focused on the current pandemic and related misinformation).

All of these share the same game mechanics: the player is invited to impersonate an agent of disinformation with the aim of sowing discord, confusion and division in the “population”, implementing strategies similar to those used to spread false news and information on social networks. The player's performance is translated into virtual likes (similar to a score) and into badges earned upon mastering each of several common strategies.

The first of these, Bad News, developed in collaboration with Sander van der Linden and Jon Roozenbeek, researchers at the University of Cambridge, UK, and leaders in contemporary research on Inoculation Theory, has proven effective in improving the ability to discern and resist misinformation in a large-scale study with 15,000 participants.

Although the active component, as implemented in the game, is a relevant aspect of “psychological inoculation”, it is its informational aspect, instantiated in the game in the form of badges, that confers immunity to false news and biased content.

The reader can therefore, even without the playful aspect, immediately benefit from their “vaccine for misinformation” by learning the following strategies commonly used in false news. Note that while each may seem innocuous in isolation, a successful disinformation campaign tends to use them together. Can you identify one or more of these disinformation strategies in the news feed of your social networks?

Source forgery: Nowadays, on the internet and social networks, it is particularly easy and cheap to adopt a profile or create a “news” page that superficially simulates an expert or a legitimate, professional institution, by mimicking their appearance and adopting related logos and/or names.

When sharing information online, people rarely pay due attention to the source, and a page that only superficially mimics a legitimate or trustworthy one is enough to disseminate and spread disinformation.

Emotion: Emotions such as fear, anger and empathy are intrinsically motivating and compel people to act – either by sharing material that activated these emotional states or by reacting to that material (in the form of comments).

Obviously, not all emotional content on social media is necessarily fake. However, knowing that people tend to suppress analytical reflection when emotionally aroused, it is relatively easy to instill fear, anger or empathy into misinformation – consequently, people will reflect less on the veracity of the information and act based on how it makes them feel, especially if the emotions imply some urgency.

Polarization and false amplification: Disinformation agents, whether driven by a specific agenda or simply by the aim of disseminating false information, don't always need to create original content.

Contemporary society and social networks are rich in cleavages between social groups and perspectives on numerous issues. Often, these splits and oppositions are relatively subtle and manageable on a day-to-day basis.

However, it is also relatively easy to exploit these to polarize opinions to the extreme and to manufacture conflicts. A good metaphor is a wooden board that, when subjected to pressure, breaks at its weakest point. It is common for disinformation content to exploit these fault lines, amplifying them and often forging opinions and news that support both sides.

A typical strategy is the use of bots – autonomous computer programs that “share” information on social networks simulating legitimate users. A small army of bots can be enough for an issue to gain traction on social media and appear far more relevant and prevalent than it actually is.
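The arithmetic behind false amplification can be sketched in a few lines. The function and figures below are purely illustrative assumptions, not data from any study cited here: they simply show how a modest number of bots, all posting on one topic, can inflate its apparent prevalence.

```python
def apparent_prevalence(real_users: int, bots: int, organic_rate: float) -> float:
    """Fraction of accounts posting about a topic once `bots` accounts
    (which all post about it) are mixed in with `real_users` accounts
    (which post about it only at `organic_rate`)."""
    organic_posters = real_users * organic_rate
    return (organic_posters + bots) / (real_users + bots)

# 10,000 real users with 2% organic interest, plus just 500 bots:
# (200 + 500) / 10,500 ≈ 0.067 — the topic looks more than three
# times as prevalent as it really is.
print(apparent_prevalence(10_000, 500, 0.02))
```

Because ranking algorithms and human readers alike take raw volume as a cue for relevance, even this toy calculation suggests why a small, coordinated fleet of accounts can push a fringe topic into apparent prominence.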

Conspiracy: Conspiracy theories are seductive – they provide a view of the world that endows their followers with a sense of understanding and mastery over it.

The idea that we know or grasp something that most people ignore can be heady and make someone feel superior and/or more capable. The internet and social media provide a basis for interactions and information sharing that easily feeds conspiracy theories. When orchestrated to oppose or cast doubt on an “official narrative”, they can be purposefully used as a vehicle for disinformation.

Discrediting: Inevitably, any piece of disinformation that gains prominence in society and on social networks will be the target of refutation, whether by collective fact-checking efforts or by individual citizens.

When this occurs, the typical strategy is to try to discredit the refuters or question their legitimacy. Note that, to maintain and amplify misinformation, it is not necessary (and is even counterproductive) to respond to critical voices or their arguments – it is enough to deflect attention onto their credibility, thereby shifting the focus and preserving the disinformation one intends to disseminate.

The discrediting doesn't even need to be credible – the logic is rather to foster an idea of “where there's smoke, there's fire”.

Trolling: This English term comes from a fishing technique (known in Portuguese as “pesca de corrico”) in which a baited line is dragged behind a slow-moving boat to attract fish.

It came to be adopted and widely used on the internet and social networks to designate comments that are deliberately provocative or controversial, made with the intention of triggering emotional responses and dragging people into an argument.

Frequent on virtually any social network, trolling can easily be exploited by disinformation campaigns to undermine the credibility of opposing voices, to shift the focus of a discussion as a distraction, or to attract followers and comments to a given topic they want to amplify.


Author: Nuno Alexandre de Sá Teixeira holds a degree in Psychology from the Faculty of Psychology and Educational Sciences of the University of Coimbra, and a Ph.D. in Experimental Psychology from the same institution.
He has worked as a PhD researcher at the Department of General Experimental Psychology at Johannes Gutenberg University, Mainz, Germany, at the Institute of Cognitive Psychology at the University of Coimbra, and at the Centre of Space Bio-medicine at the University of Rome 'Tor Vergata', Italy.
He is currently Visiting Assistant Professor at the Department of Education and Psychology at the University of Aveiro.
His scientific work has focused on how physical variables (in particular, gravity) are instantiated by the brain as “internal models” to support perceptual and motor functions in interaction with the world. His interests thus sit at the intersection of areas such as the Psychology of Perception, Psychophysics and the Neurosciences.


Help us keep making Sul Informação!