We live inundated with information; many call us "infoxicados" (information-intoxicated). What is certain is that this abundance of information increases our chances of obtaining knowledge… but also of consuming information that is incorrect.
And the consumption of incorrect information is a major problem, since it is not at all easy to correct once assimilated. There are some famous examples of this phenomenon: to this day, a non-negligible fraction of Americans believe that Barack Obama was not born in the United States, when in fact he was born in Hawaii; also alarming is the proportion of people who doubt that climate change is caused by human activities, when in fact there is considerable scientific consensus that human beings are indeed responsible for climate change; and so on.
A few months ago, John Cook of the Global Change Institute (University of Queensland) and Stephan Lewandowsky of the School of Psychology of The University of Western Australia created an interesting document called The Debunking Handbook, a short nine-page guide aimed at communicators from various fields who want to confront misinformation.
At the time, I already reviewed The Debunking Handbook on this blog. Recently, Cook and Lewandowsky, together with Ullrich K. H. Ecker, Colleen M. Seifert, and Norbert Schwarz, have developed the ideas of The Debunking Handbook into a full article for the journal Psychological Science in the Public Interest. In the article, the authors answer two questions: how is misinformation generated, and why is it so difficult to correct? And what are the most appropriate strategies to combat it?
In this post I will summarize the main points of the article (although I recommend that the reader read it in full, because it is rich in references to interesting studies on misinformation).
The authors begin by introducing the main sources of misinformation in today's societies:
First, there are rumors and fiction. Rumors are an obvious source of misinformation, given our tendency to pass on information that causes an emotional impact on the receiver (a phenomenon especially visible in the media). Works of fiction, on the other hand, may not be what first comes to mind as a source of misinformation, yet their effects are very important. Fictional stories often contain accurate information about the world, but they may also contain wrong or invented facts that go unnoticed by the reader and end up incorporated into his or her vision of the world.
Second, an obvious source of misinformation is governments and politicians. Although the public seems to have some awareness of the presence of political bias in society, this awareness does not seem to make us better at differentiating between correct and false information, so it is no protection against the effects of misinformation.
Third, special interests, which lead companies and corporations to disseminate false information, a phenomenon well observed in areas that have to do with public health or the environment.
Fourth, another obvious source of misinformation: the media. Their tendency to simplify complex news, and to present "balanced" points of view on controversial issues, often ends up favoring the dissemination of imprecise (when not openly false) information. The authors give special mention to the Internet, where social networks and blogging platforms allow the creation of "cyber-ghettos", in which individuals are exposed only to information consistent with their preconceived points of view, which need not be the most correct.
Pointing out the most common sources of misinformation is important, because people cannot recognize incorrect information unless they are warned about it. In other words: we tend to take for granted the veracity of the information we consume, unless we have a strong motivation to examine it more closely.
In fact, people tend to take information as correct if:
It is consistent with other things they already assume: the well-known phenomenon of confirmation bias.
It forms part of a larger story that gives meaning and coherence to its elements: in fact, coherent stories are easier to process and remember than those with gaps in their internal coherence.
It comes from a credible source: unfortunately, our judgments about the credibility of a source may not be very accurate (for example, the mere repetition of a name can make it more familiar, and familiarity tends to be associated with a reputation for credibility).
Other people consider it correct: a factor that in itself guarantees nothing, since the repetition of a piece of information, even a false one, can create an illusion of social consensus (that is, we may believe that more people hold the information to be correct than actually do).
At the beginning of the post I said that incorrect information is not easy to correct once it has been assimilated. Why? The authors identify a series of cognitive processes that make correcting false information difficult:
First, people tend to build mental models of how the world works: if correcting a piece of false information that is part of a mental model creates a "gap" in that model, we may prefer to keep the model we had already formed in order to maintain its consistency.
Second, we may retain information but not remember exactly where we got it (what is known as source amnesia): as a result, we may attribute the origin of the (false) information to a source that is normally objective and credible.
Third, and as already mentioned above, coherent stories are processed and remembered more easily; however, the fact that a story is coherent does not mean it is correct.
Fourth, the phenomenon of reactance can occur: as a general rule, people do not like being told what to do or think, which can lead to rejection of the correction (especially when it comes from authoritative sources).
Fifth, pre-existing beliefs: our particular ideologies can affect the degree of truth we attribute to a piece of false information once it has been corrected (they may even end up reinforcing our belief in the false information).
Studies on the cognitive factors that affect the persistence of misinformation paint a complex picture. Even so, the authors, as Cook and Lewandowsky did in The Debunking Handbook, offer a few principles of action for communicators who want to combat misinformation:
1. Consider what "gaps" the correction may create in the person's mental model, and fill each gap with an alternative explanation.
2. Repeat corrections, but keep in mind that too much repetition can cause people to increase their trust in the misinformation.
3. Emphasize the facts you want to communicate, not the incorrect information, since repeated exposure to a piece of information makes it more accessible in memory.
4. If you must mention a myth (a piece of incorrect information), provide an explicit warning that false information is about to be mentioned.
5. Give priority to short and simple corrections: if the myth is simpler and more attractive than the correction, it will be cognitively easier to process and thus more appealing.
6. Keep in mind whether the correction may threaten the audience's worldview: such a threat can make the audience cling to their particular vision, blocking out the correction. In that case, the correction can be presented in a way that, to some extent, affirms the listeners' worldview.
The authors mention a very interesting implication of their study: the techniques for combating misinformation can also be used to misinform the population. Indeed, correcting misinformation is cognitively indistinguishable from asserting false information: for example, providing a coherent narrative is a technique both for spreading false information and for correcting it. Thus, the authors argue that it is important for the public to have a basic knowledge of the effects of misinformation, as a means of developing a healthy skepticism toward the information we receive and of protecting ourselves from its effects.