
Where does misinformation come from, and what does it do?

Misinformation spreads rapidly and sticks tenaciously. Moon_son/Flickr

Obama is a Muslim, vaccinations cause autism, asylum seekers are breaking the law, GM foods cause cancer.

These are all pieces of unsubstantiated misinformation that are commonly encountered on TV, talk-back radio, blogs and other websites.

In a recent review paper in Psychological Science in the Public Interest, we follow the trails of such misinformation: where it originates, how it is spread, how it is processed, how it affects our cognition, and how its effects can be alleviated.

Misinformation comes in many guises. It can come from social media or from works of fiction (if you wonder whether people really extract information from fiction, consider that fiction author Michael Crichton was invited to testify as a climate “expert” before a US Senate committee).

The evening news may report something believed to be accurate at the time, but overnight further investigation may reveal new evidence. What is genuinely believed to be a clandestine biological weapons lab one day may turn out to be a legitimate commercial laboratory the next.

And then of course there is intentional fabrication and propaganda.

Misinformation is also spread for a variety of reasons. People prefer to pass on information that is likely to evoke an emotional response in the recipient; whether it is true is not always the top criterion. (How many lives does Charlie Sheen actually have? According to Facebook, he keeps dying!)

Sometimes misinformation is spread deliberately: claims that Obama was born outside the US, or that there is no evidence that humans are causing climate change, have a clear aim and purpose.

Unfortunately, the media often contribute to the spread of unsubstantiated myths through their focus on “balanced” coverage. When the evidence is heavily one-sided, the “two sides of a story” simply don’t deserve equal space.

So why is that a problem? Surely, people can tease apart the truths from the falsehoods, right? Unfortunately, no.

Our research has shown that people continue to rely on misinformation even after clear retractions: retracted misinformation keeps affecting people’s memory, inferential reasoning, and decision making. For example, even when people know that health concerns associated with some treatment have been thoroughly debunked, they will hesitate to get that treatment.

But maybe better access to ever more information will eventually solve the problem? Probably not: I was recently asked whether I found it strange that, at a time of unprecedented access to credible information, there were so many “truthers”, “birthers”, and science deniers. Apart from the fact that a minority group can be very vocal without actually being very large, I don’t find it strange, for a number of reasons.

First, the amount of misinformation available grows proportionally with the availability of valid information. In fact, it may grow even faster because of the lack of fact-checking in much of the new media.

Second, the now common idea that we can “check facts ourselves” is often an illusion. Being able to “look things up on the net” can give people the impression that they understand something when they are in fact overlooking important domain-specific details, or trusting the wrong sources. This ultimately erodes trust in genuine experts.

Third, it’s easy to get bogged down in a misinformation “echo chamber”. The same misinformation can appear on many interlinked websites, creating the impression of corroborating evidence from multiple independent sources when the sources are not independent at all.

Fourth, the more information is available, the less feasible it becomes to critically evaluate every piece of it. Sometimes we just have to use “heuristics”, or rules of thumb: we believe what fits with what we already know, or with what others believe. Being a skeptic in the true sense of the word (critically assessing evidence and questioning people’s motives, not to be confused with denial!) requires effort and time, and often we lack one or both.

Usually, these heuristics are benign: they save cognitive effort while still maintaining a reasonably coherent and accurate view of the world. However, when people’s beliefs are very strong, those beliefs will bias information processing and lead to what is known as “motivated reasoning”: people with strong beliefs and motivations preferentially attend to, and interpret, information in ways that support those beliefs.

Motivated reasoning is a major obstacle to rational argument. If someone wants to believe that asylum seekers are breaking the law, or that virtually all of the world’s climate scientists have conspired to fabricate a huge global “climate change hoax”, then it is very difficult to change their mind even when the evidence is very, very clear.

Misinformation is an issue for any citizen who wants to form opinions and make decisions based on facts. If we want evidence-based practice and policy in a democratic society, then science communication, journalism, and education will have to take on the challenges that misinformation presents. Some guidelines on how best to do that can be found here.

Author note: A related audio piece will appear on the Australian ABC’s Radio National “Future Tense” program, and a similar story was published in the Huffington Post.
