Confirmation bias – The Conversation (feed updated 2023-12-07T13:30:02Z)

How new reports reveal Israeli intelligence underestimated Hamas and other key weaknesses (published 2023-12-07)

<figure><img src="https://images.theconversation.com/files/563705/original/file-20231205-17-4sueqx.jpg?ixlib=rb-1.1.0&rect=5%2C0%2C1272%2C720&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Israeli Prime Minister Benjamin Netanyahu, center, meets with his security cabinet on Oct. 7, 2023, the day of the Hamas attack.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/israeli-prime-minister-benjamin-netanyahu-holds-a-meeting-news-photo/1711825822">Haim Zach (GPO) / Handout/Anadolu Agency via Getty Images</a></span></figcaption></figure><p>After the surprise Hamas terrorist attack on Israel from the Gaza Strip on Oct. 7, 2023, <a href="https://theconversation.com/how-did-israeli-intelligence-miss-hamas-preparations-to-attack-a-us-counterterrorism-expert-explains-how-israeli-intelligence-works-215410">many observers were puzzled</a> about how Israel could have been caught completely off-guard.</p>
<p><a href="https://www.cmu.edu/cmist/about-us/people/faculty/haleigh-bartos.html">We</a> <a href="https://scholar.google.com/citations?user=eHm-LrkAAAAJ&hl=en&oi=ao">were</a> among those puzzled, and <a href="https://mwi.westpoint.edu/what-went-wrong-three-hypotheses-on-israels-massive-intelligence-failure/">proposed three possible reasons</a>:</p>
<ol>
<li>Israeli leaders may have underestimated Hamas’ capabilities and misunderstood its intentions.</li>
<li>Israeli intelligence may have been tricked by Hamas’ secrecy, missing signs that it was planning and training.</li>
<li>Israeli intelligence leaders may have been so wedded to their prior conclusion that Hamas was not a major threat that they dismissed mounting evidence that it was preparing for war.</li>
</ol>
<p>New revelations from recent media coverage have shed additional light on what happened, which mostly confirm the role of faulty threat assessments, Hamas’ improved operational security, and <a href="https://www.jstor.org/stable/10.3366/j.ctv182jrtn.10">confirmation bias</a>. </p>
<h2>An official assessment</h2>
<p>On Oct. 29, The New York Times reported that since May 2021, Israel’s military intelligence leaders and National Security Council had <a href="https://www.nytimes.com/2023/10/29/world/middleeast/israel-intelligence-hamas-attack.html">officially assessed</a> that “Hamas had no interest in launching an attack from Gaza that might invite a devastating response from Israel.” </p>
<p>As a result, Israeli Prime Minister Benjamin Netanyahu and security leaders diverted attention and resources away from Hamas and toward what they saw as more existential threats: Iran and Hezbollah. For instance, in 2021, the Israeli military <a href="https://www.timesofisrael.com/top-israeli-intel-unit-wasnt-operational-on-october-7-due-to-personnel-decision/">cut personnel and funding for Unit 8200, a key military surveillance unit</a> watching Gaza. In 2022, the unit <a href="https://www.nytimes.com/2023/10/29/world/middleeast/israel-intelligence-hamas-attack.html">stopped listening in on Hamas militants’ radio communications</a>, though it apparently gathered other intelligence.</p>
<p>The U.S. made a similar shift, <a href="https://www.wsj.com/politics/national-security/u-s-all-but-stopped-spying-on-hamas-in-years-after-9-11-ebe8d61d">focusing on the Islamic State group and other militants</a>, leaving intelligence gathering on Hamas to Israel.</p>
<h2>Revealing surveillance</h2>
<p>Within days of Oct. 7, Egypt revealed that it had shared with Israel high-level warnings of impending Hamas violence – “<a href="https://www.timesofisrael.com/egypt-intelligence-official-says-israel-ignored-repeated-warnings-of-something-big/">something big</a>.” </p>
<p>A Guardian report in early November revealed that Hamas leaders who had planned the attack <a href="https://www.theguardian.com/world/2023/nov/07/secret-hamas-attack-orders-israel-gaza-7-october">took special measures</a> to avoid being detected by Israeli intelligence, including passing orders only by word of mouth, rather than by radio or internet communication. But Hamas’ planning did not totally escape detection. </p>
<p>The Times of Israel reported in late October that Israeli troops of the Combat Intelligence Corps surveilling the Israel-Gaza border months before Oct. 7 saw Hamas militants <a href="https://www.timesofisrael.com/surveillance-soldiers-warned-of-hamas-activity-on-gaza-border-for-months-before-oct-7/">digging holes, placing explosives, training frequently</a> and even practicing blowing up a mock fence. Their warnings were ignored. The Financial Times reported in early November that Israeli security leaders had also ignored specific alerts of Hamas training exercises from civilian <a href="https://www.ft.com/content/f1ec2502-8220-491c-95e0-5f1504ce9554">volunteers in southern Israel who eavesdropped</a> on Hamas communications.</p>
<p>The Financial Times also reported that weeks before the Hamas attack, <a href="https://www.ft.com/content/277573ae-fbbc-4396-8faf-64b73ab8ed0a">Israeli border guards</a> sent a classified warning to the top military intelligence officer in the southern command. They had detected a high-ranking Hamas military commander overseeing rehearsals of hostage-taking and warned that Hamas was training to imminently “blow up border posts at several locations, enter Israeli territory and take over kibbutzim.” The officer who received the message dismissed it as an “imaginary scenario.” Other leaders considered the warning unremarkable.</p>
<h2>A detailed plan</h2>
<p>On Nov. 30, The New York Times reported that <a href="https://www.nytimes.com/2023/11/30/world/middleeast/israel-hamas-attack-intelligence.html">Israeli intelligence obtained a detailed Hamas plan of attack</a> more than a year before Oct. 7. The plan ran to 40 pages and included specifics that were in fact part of the attack, including an opening rocket barrage, drones knocking out security cameras and automated weapons at the border, and gunmen crossing into Israel by paraglider as well as on foot and by motorcycle.</p>
<p>The newspaper also reported that in July 2023, a Unit 8200 analyst observed Hamas training activities that lined up with the Hamas plan, which Israeli officials had code-named “Jericho Wall.” The analyst determined that Hamas was preparing an attack designed to provoke a war with Israel. Superior officers dismissed her assessment as merely aspirational, primarily because they thought Hamas lacked the capacity to carry the “Jericho Wall” plan out.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="People walk past a fortified tower with cameras and weapons on top." src="https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/563714/original/file-20231205-28-ftbv9b.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Israel’s defenses include stations like this guard tower in the West Bank, with robotic weapons that can fire tear gas, stun grenades and sponge-tipped bullets, using artificial intelligence to track targets.</span>
<span class="attribution"><a class="source" href="https://newsroom.ap.org/detail/IsraelRobotGun/8473ec72946f4d8eacbb3da6acbd7171/photo">AP Photo/Mahmoud Illean</a></span>
</figcaption>
</figure>
<h2>A reflection on the Israeli intelligence community</h2>
<p>These recent reports make clear that Israeli officials had enough intelligence to step up security. The fact that they did not suggests they may have dismissed all that evidence in favor of other information they had, which suggested Hamas was not interested in or capable of going to war with Israel.</p>
<p>But that may not have been the only problem. Recent studies point to <a href="https://doi.org/10.1177/0095327X20903072">increasing fissures</a> in civil-military relations in Israel. For example, populist right-wing Israeli politicians in recent years have <a href="https://doi.org/10.1093/psquar/qqad121">viewed senior intelligence officials with skepticism</a> as potential leftist rivals, which could have led Netanyahu’s Likud government to be hostile to alternative viewpoints and various intelligence warnings on Hamas. </p>
<p>Although we cannot observe the extent of politicization among the senior Israeli intelligence ranks, the behavior of intelligence leaders who dismissed warnings prior to Oct. 7 is <a href="https://www.jstor.org/stable/26593670">consistent with groupthink</a>, a phenomenon that experts say may occur when social pressure, a leader’s influential position or self-censorship leads groups to express homogeneous views and make uniform – and usually poorer – decisions. </p>
<p>The fact that superiors ignored warnings from the Unit 8200 analyst and the Border Defense Corps is consistent with the idea that groupthink about <a href="https://www.newsnationnow.com/world/war-in-israel/ex-israeli-ambassador-intelligence-military-surprise-failure/">Hamas’ capabilities and intentions</a> fed a confirmation bias that dismissed Hamas as an imminent threat.</p>
<p>Some of the ignored intelligence analysts were young women, who have said they <a href="https://www.timesofisrael.com/surveillance-soldiers-say-oct-7-warnings-ignored-charge-sexism-played-a-role/">believe sexism could have been a reason</a> male superiors ignored their warnings.</p>
<p>Another form of bias may also have been at play. Israel has focused intensely on its <a href="https://doi.org/10.1177/0095327X20903072">technological advantages over its enemies</a>, assigning large numbers of personnel to electronic and cyber warfare units. Perhaps <a href="https://www.ft.com/content/f1ec2502-8220-491c-95e0-5f1504ce9554">technological optimism</a>, faith in what the Financial Times described as “aerial drones that eavesdrop on Gaza and the sensor-equipped fence that surrounds the strip,” won out. A reliance on technology may have bred a false sense of security, and even the dismissal of other forms of intelligence that, it turned out, had uncovered Hamas’ real plans.</p>
<h2>A turn toward the future</h2>
<p>In the wake of the Hamas attacks, Israel’s security apparatus will need to investigate these weaknesses further and undertake reforms. So far, it remains unclear how many people, and at what levels of the Israeli government, received the various warnings in advance of Oct. 7. Therefore, it’s not yet clear what specific changes in Israel might prevent a similar failure in the future.</p><img src="https://counter.theconversation.com/content/219187/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>

Summary: Recent media coverage mostly confirms the role of faulty threat assessments, Hamas’ improved operational security, and confirmation bias.
Authors: John Joseph Chin, Assistant Teaching Professor of Strategy and Technology, Carnegie Mellon University; Haleigh Bartos, Associate Professor of the Practice in Strategy and Technology, Carnegie Mellon University
Licensed as Creative Commons – attribution, no derivatives.

#GirlMaths: a seemingly innocent and fun way to justify expenses that can have serious financial consequences (published 2023-08-27)

<p>These shoes are perfect, made for me! I have to get them! But really, I should be paying off my car loan instead. I can’t justify this purchase. Or can I …?</p>
<p>We all know this feeling, this tension between what you really want to do and what you really should, or shouldn’t, do. What you are experiencing is <a href="https://www.britannica.com/biography/Leon-Festinger/Cognitive-dissonance">cognitive dissonance</a>.</p>
<p>It’s a psychological discomfort we feel when our behaviours and our values or beliefs do not match. Not to worry, we can make that discomfort simply disappear with a good dose of #GirlMaths! </p>
<h2>So what is #GirlMaths?</h2>
<p>#GirlMaths recently became a viral phenomenon on TikTok after Fletch, Vaughan and Hayley, hosts on New Zealand radio station ZM, used #GirlMaths to justify one host’s mother’s expensive dress purchase as basically free because the dress was going to be worn at least four times.</p>
<p><iframe id="tc-infographic-904" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/904/f0b5e215a804bb450e609c397b96c7fcbf46172f/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>Since then, influencers have added to the #GirlMaths trend with gems such as “If I buy it for $100, wear it, and then resell it for $80 then I basically wore it for free”, “If I pay with cash, it means it’s free”, and “If I just returned something, then purchase something new for the same amount of money, then it’s free”. </p>
<p>The reason #GirlMaths resonates so widely – and goes viral so readily – is that we are all familiar with this type of thinking. The mental gymnastics needed to justify cost-per-wear or cash-is-free is a perfect display of behavioural biases and heuristics, such as confirmation bias and denomination bias, being applied to everyday consumption decisions.</p>
<h2>The psychology of decision-making</h2>
<p>Behavioural biases and heuristics are shortcuts in our thinking that help us make decisions quicker and easier, and are great for reducing the cognitive dissonance we sometimes experience.</p>
<p>Our brain has a lot of decisions to make in a day and simply doesn’t have the power to scrutinise every little detail of every <a href="https://theconversation.com/what-shall-we-have-for-dinner-choice-overload-is-a-real-problem-but-these-tips-will-make-your-life-easier-193317">decision</a>. These mental shortcuts may speed up the decision-making process, but they don’t always lead us to the best decisions.</p>
<p>Confirmation bias leads you to justify your decisions by considering only the evidence that supports what you want and ignoring the evidence that would mean you’d have to make a different decision. Cost-per-wear does sound quite financially savvy. It is just like bulk-buying pantry essentials, right?</p>
<p>The issue is you are ignoring facts such as: 1) your disposable income does not cover this expense in light of your utility bills, 2) you could rewear a cheaper dress all the same, and 3) by spending money on a fancy dress, you lose the opportunity to put that money toward better investments for wealth accumulation, or toward paying off your car loan.</p>
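To make the flawed arithmetic concrete, here is a minimal sketch (all prices, wear counts and the loan rate are hypothetical, not taken from this article) contrasting the cost-per-wear framing with the money that actually leaves your account:

```python
# "#GirlMaths" cost-per-wear reasoning vs. the actual cash outflow.
# All figures below are hypothetical, for illustration only.

def cost_per_wear(price: float, wears: int) -> float:
    """The #GirlMaths framing: price spread across expected wears."""
    return price / wears

dress_price = 200.0
expected_wears = 4

# Framing 1: cost-per-wear makes the dress feel cheap ...
per_wear = cost_per_wear(dress_price, expected_wears)  # 50.0 per wear

# Framing 2: ... but the full amount still leaves your account, plus
# the interest you keep paying on the car loan you didn't reduce.
loan_rate = 0.10  # hypothetical 10% annual interest on the car loan
interest_forgone = dress_price * loan_rate  # 20.0 in the first year

print(f"Feels like: ${per_wear:.2f} per wear")
print(f"Actually spent: ${dress_price:.2f} "
      f"(+ ${interest_forgone:.2f} extra loan interest this year)")
```

The point is not the exact numbers but that the "per wear" framing hides both the real outflow and the opportunity cost.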
<h2>The financial and social costs</h2>
<p>But it’s all a bit of innocent fun, right? Surely people won’t take #GirlMaths that seriously? We beg to differ. </p>
<p>First, the term is unnecessarily gendered. Gendered language reinforces societal expectations associated with a particular gender and can promote stereotypes, biases and binary categories.</p>
<p>In this case, the term “girl maths” reinforces problematic stereotypes that equate women with consumption, frivolity and extravagant spending. When stereotypes are reinforced within our own social circles, we are more likely to <a href="https://journals.sagepub.com/doi/abs/10.1177/0146167299025007004?casa_token=dOhnQVtFwPsAAAAA:XSBdix5AB6bDfGjNgfbX9OIjstw4KE071GP0l60mAxvHJMaEwkyPERqHXf3z9PhctWJUl6h7TgTHg_U">internalise these as part of our identity</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Two women showing each other shirts in a shop" src="https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/544150/original/file-20230823-23-t4fl7p.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The term ‘girl maths’ reinforces the idea that women are frivolous with money.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/search/fashion-shopping?image_type=photo">Shutterstock</a></span>
</figcaption>
</figure>
<p>By representing women in a less favourable way, the term operates to both demean and discriminate on a gendered basis. This is heightened by the use of “girl” as opposed to “woman”, which implies someone is childlike or lacking in knowledge or experience. It also raises the question of what “boy maths” – set up as something opposing and different – might connote.</p>
<p>Second, the #GirlMaths trend reminds us of the power of “<a href="https://theconversation.com/fintok-and-finfluencers-are-on-the-rise-3-tips-to-assess-if-their-advice-has-value-161406">finfluencers</a>” – social media content creators amassing huge online followings by sharing advice on anything from budgeting to buying a house, to investing.</p>
<p>These online gurus appeal to Gen Z and millennials, simplifying complex financial concepts into digestible nuggets, much like #GirlMaths simplifies purchases based on cost-per-wear or cash-as-free.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/are-you-financially-literate-here-are-7-signs-youre-on-the-right-track-202331">Are you financially literate? Here are 7 signs you're on the right track</a>
</strong>
</em>
</p>
<hr>
<p>Just as regulators such as <a href="https://moneysmart.gov.au/other-ways-to-borrow/buy-now-pay-later-services">ASIC</a> repeatedly warn us of the dangers of buy-now-pay-later (BNPL) services, we must caution that the #GirlMaths trend is a dangerous cocktail for young women who are susceptible to the “advice” of finfluencers.</p>
<p>The trend resembles BNPL by breaking down expenses into smaller, more palatable portions, making purchases seem justifiable and affordable in the moment.</p>
<p>Denomination bias describes this tendency to spend more money when it is denominated in small amounts rather than large amounts. We find it much easier to spend $50 four times than $200 all at once. </p>
<p>However, the convenience of these mental shortcuts can obscure hidden financial risks. You may overlook the bigger picture of your financial health and spend more than you can afford. That’s why so many BNPL users end up in a <a href="https://www.choice.com.au/money/credit-cards-and-loans/personal-loans/articles/bnpl-submission-to-treasury">modern debt trap</a>.</p>
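The instalment arithmetic behind denomination bias can be sketched directly (the price, fee and missed-payment figures are hypothetical): splitting $200 into four $50 payments changes how the spend feels, not what it costs – and fees can push the total higher.

```python
# Denomination bias: the same total, split into smaller instalments,
# "feels" more affordable – which is exactly how BNPL framing works.
# All figures below are hypothetical.

total_price = 200.0
instalments = 4

per_instalment = total_price / instalments  # 50.0

# The split changes nothing about the total owed ...
assert per_instalment * instalments == total_price

# ... but add a flat late fee per missed instalment (a common BNPL
# structure) and the "palatable portions" cost more than paying upfront.
late_fee = 10.0   # hypothetical flat fee per missed payment
missed = 2
total_with_fees = total_price + late_fee * missed  # 220.0

print(f"${per_instalment:.2f} x {instalments} feels easier "
      f"than ${total_price:.2f} upfront")
print(f"With {missed} missed instalments: ${total_with_fees:.2f}")
```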
<h2>The perils of #GirlMaths</h2>
<p>The danger of #GirlMaths to young women lies in a cocktail of three ingredients: biased thinking that feels oddly familiar and socially reinforced, problematic stereotypes that shape identities, and finfluencers who wield increasing influence over the financial choices and decision-making of young women.</p>
<p>While the term may initially come across as innocent fun, it’s crucial not to underestimate its potential harms. Instead, let’s champion the use of inclusive language in finance that doesn’t perpetuate gender biases.</p>
<p>And if you’re a staunch supporter of #GirlMaths, we strongly urge you to take into account the possible adverse financial consequences of these quick-fix spending habits.</p><img src="https://counter.theconversation.com/content/211903/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>

Summary: Justifying purchases can make parting with money easier but a viral TikTok trend could leave girls spending more than they have.
Authors: Janneke Blijlevens, Senior Lecturer in Marketing, RMIT University; Angel Zhong, Associate Professor of Finance, RMIT University; Lauren Gurrieri, Associate Professor in Marketing, RMIT University
Licensed as Creative Commons – attribution, no derivatives.

‘Whose side are you on mate?’ How no one is free from bias – including referees (published 2023-05-25)

<figure><img src="https://images.theconversation.com/files/527926/original/file-20230524-17-w6g0dp.jpg?ixlib=rb-1.1.0&rect=13%2C0%2C2968%2C1980&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption"></span> <span class="attribution"><span class="source">Getty Images</span></span></figcaption></figure><p>When Jason Paris, head of the company that sponsors the New Zealand Warriors NRL team, <a href="https://www.newshub.co.nz/home/sport/2023/05/nrl-one-nz-chief-executive-jason-paris-stands-by-accusations-of-referee-bias-against-warriors-despite-nrl-investigation.html">complained recently</a> about Australian referee bias, more than a few heads will have nodded in agreement.</p>
<p>Sports fans often think the ref is biased against their team – penalising them for the very same actions the other side is getting away with.</p>
<p>But taking the element of trans-Tasman rivalry out of the argument for the moment, it’s worth asking whether it’s even possible for referees to operate without being unconsciously influenced by factors beyond their immediate control. </p>
<p>The honest answer is probably not – despite most professional sporting bodies <a href="https://www.newshub.co.nz/home/sport/2022/04/nrl-2022-referees-boss-graham-annesley-denies-unconscious-bias-against-nz-warriors-but-experts-disagree.html">regularly rejecting claims of bias</a>.</p>
<p>It’s clear from a wide range of research that, while it’s unlikely professional referees consciously cheat, they are likely to be affected by unconscious biases. In fact, referee bias has been reported in pretty much <a href="https://www-tandfonline-com.ezproxy.auckland.ac.nz/doi/full/10.1080/17461391.2020.1845814">every aspect of most sports</a>, including the use of yellow cards, red cards and penalty kicks. </p>
<p>None of this is surprising, or even particularly critical of referees. Humans are all subject to unconscious bias, and it’s very difficult to overcome.</p>
<h2>Confirmation bias is real</h2>
<p>We all use a range of reasoning shortcuts – also known as “heuristics” – to make decisions and assessments. While useful, many of these shortcuts can lead us astray, despite our best efforts. </p>
<p>For example, <a href="https://www.donchristoff.com/wp-content/uploads/2022/11/nickerson1998.pdf">one such heuristic</a> leads us to notice evidence that confirms positions we already hold and to overlook evidence that is inconsistent with those views. </p>
<p>This tendency – known as confirmation bias – has its uses. It lets us make quick decisions when we don’t have the time to consider all the evidence. And it may reduce mental conflict and increase self-esteem, since it reduces how often we have to acknowledge we were wrong. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/two-refs-are-better-than-one-so-why-does-the-nrl-want-to-drop-one-138722">Two refs are better than one, so why does the NRL want to drop one?</a>
</strong>
</em>
</p>
<hr>
<p>However, confirmation bias can also be problematic. In one striking <a href="https://www-sciencedirect-com.ezproxy.auckland.ac.nz/science/article/pii/S0379073805005876?casa_token=Y9036DWsDzMAAAAA:8v5PA8G2_llUk5uCyPLZ-TYlWqCwDplgNboc2ZbWKn3dTP-uxjA8PPG_pTSkG2KcEpd6jJEthRLC">non-sports experiment</a>, researchers asked five fingerprint experts to say whether a suspect’s fingerprints matched those from a crime scene. They didn’t tell the experts that they’d seen those same fingerprints five years earlier. </p>
<p>The experts had no reason to remember them, and they didn’t realise that five years earlier, they’d said they were a match. This time they were told they were looking into a probable case of mistaken identity; that the prints taken from the crime scene probably didn’t match those taken from the suspect. </p>
<p>Now only one of the five experts said they matched. Given exactly the same prints, but primed to look for evidence that the fingerprints didn’t match, their judgement changed.</p>
<h2>Expectations influence outcomes</h2>
<p>What does all this have to do with referees? Well, they’re only human. Even if not consciously biased, they will have expectations about how players and teams will perform, and there is evidence that this influences their judgements.</p>
<p>In one <a href="https://ebookcentral.proquest.com/lib/auckland/reader.action?docID=819267&ppg=144">experiment</a>, researchers took advantage of the common practice in gymnastics of coaches ordering their competitors from weakest first to strongest last. </p>
<p>Films of competitors were reordered and the judges were asked to rank them. Where in this lineup a competitor appeared significantly affected the scoring, with the same routine receiving a higher or lower score depending on where it was positioned.</p>
<p>We suspect those expectations are one reason dominant teams and players tend to have close calls go their way. </p>
<p>Referees expect to see some players pull off moves that bring them close to infringing but which don’t cross that line. They are more likely to make a call against a journeyman player who they don’t expect to pull off the miracle play. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/split-second-decisions-with-little-praise-so-what-does-it-take-to-ref-a-game-of-nrl-57553">Split-second decisions with little praise: so what does it take to ref a game of NRL</a>
</strong>
</em>
</p>
<hr>
<h2>Refs aren’t superhuman</h2>
<p>Just like the fingerprint experts, referees can be led by confirmation bias to see the same evidence differently. And if referees do have these kinds of expectations, it would be very difficult for them to factor these out of their decision making.</p>
<p>The fingerprint experts didn’t intend to tailor their judgements to suit the views they’d been primed to hold. Further, they made their judgements under calm laboratory conditions, with the evidence in front of them and plenty of time and equipment to examine and consider it.</p>
<p>It would be truly remarkable if referees – obliged to make calls in the heat of the moment, with pressure from players and crowds – were not at least equally affected. Referees would need to be superhuman to be immune to these dangers.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/cognitive-biases-and-brain-biology-help-explain-why-facts-dont-change-minds-186530">Cognitive biases and brain biology help explain why facts don’t change minds</a>
</strong>
</em>
</p>
<hr>
<p>There are also more straightforward sources of bias. <a href="https://www-sciencedirect-com.ezproxy.auckland.ac.nz/science/article/pii/S0165176520303815">Recent research</a> into the Bundesliga, German football’s highest division, took advantage of empty stadiums during the COVID pandemic to explore the influence of vocal crowd support on referees. Unsurprisingly, the evidence suggests it does have an influence. </p>
<p>Pre-COVID, referees gave fewer fouls and yellow cards for the home team relative to the away team. These differences changed during the crowd-free matches, so that home teams were treated less favourably than before.</p>
<p>None of this is meant as a dig at referees. They are surely aware of the research on bias, and receive training and support to address it. But confirmation bias is difficult, if not impossible, to beat. Maybe we just have to accept it as part of the game.</p><img src="https://counter.theconversation.com/content/206273/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>

Summary: Referees would need to be superhuman to be immune to the risk of bias – maybe that’s something all sports fans could agree on.
Authors: Tim Dare, Professor of Philosophy, University of Auckland, Waipapa Taumata Rau; Justine Kingsbury, Senior Lecturer in Philosophy, University of Waikato
Licensed as Creative Commons – attribution, no derivatives.

Lots of people believe in Bigfoot and other pseudoscience claims – this course examines why (published 2023-01-24)

<figure><img src="https://images.theconversation.com/files/505185/original/file-20230118-22-sxk00c.png?ixlib=rb-1.1.0&rect=11%2C0%2C1979%2C997&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Don’t believe the hype about Bigfoot, a flat Earth or ancient aliens.</span> <span class="attribution"><span class="source">Collage from Getty Images sources</span>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><figure class="align-right ">
<img alt="Text saying: Uncommon Courses, from The Conversation" src="https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=375&fit=crop&dpr=1 600w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=375&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=375&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=471&fit=crop&dpr=1 754w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=471&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/499014/original/file-20221205-17-kcwec8.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=471&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
</figcaption>
</figure>
<p><em><a href="https://theconversation.com/topics/uncommon-courses-130908">Uncommon Courses</a> is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.</em> </p>
<h2>Title of course</h2>
<p>“Psychology of Pseudoscience”</p>
<h2>What prompted the idea for the course?</h2>
<p>While teaching a course on research methods at the United States Air Force Academy, I concluded that the course needed a bigger emphasis on broad scientific reasoning skills.</p>
<p>So I incorporated material about the difference between science – the <a href="https://sciencecouncil.org/about-science/our-definition-of-science/">systematic process of evidence-based inquiry</a> – and <a href="https://press.uchicago.edu/ucp/books/book/chicago/P/bo15996988.html">pseudoscience</a>, which is the promotion of unreliable scientific claims as if they are more reliable than other explanations. </p>
<p>I wanted to understand why people promote claims that conflict with science. I jumped at the opportunity to develop this type of course at SUNY Cortland.</p>
<h2>What does the course explore?</h2>
<p>We look at some of the common scientific reasoning failures that pseudoscience exploits. These include <a href="https://aiptcomics.com/2021/02/01/stormtroopers-science-evidence-anecdotes/">hand-picking anecdotes</a> to support a belief, <a href="https://plato.stanford.edu/entries/pseudo-science/">developing a set of beliefs</a> that explain every possible outcome, <a href="https://skepticalinquirer.org/2017/05/vaccines-autism-and-the-promotion-of-irrelevant-research-a-science-pseudosc/">promoting irrelevant research</a>, <a href="https://press.uchicago.edu/ucp/books/book/chicago/P/bo15996988.html">ignoring contradictory information</a> and <a href="https://doi.org/10.1177/1948550614567356">believing in unsubstantiated conspiracies</a>.</p>
<p>We particularly highlight <a href="https://doi.org/10.1037/0033-2909.108.3.480">motivated reasoning</a>, the tendency for people to process information in a way that helps them confirm what they already want to believe. For example, someone might accept scientific consensus about cancer treatments but question it with regard to vaccines – even though both are supported by strong scientific evidence and expert consensus. </p>
<p>We also review <a href="https://doi.org/10.1037/0022-3514.50.6.1141">group polarization</a>, in which people develop more extreme positions after interacting with similarly minded group members.</p>
<p>Some of the topics we examine include the <a href="https://blogs.scientificamerican.com/observations/yes-flat-earthers-really-do-exist/">flat-Earth</a> belief, <a href="https://global.oup.com/academic/product/denying-evolution-9780878936595?cc=us&lang=en&">creationism</a>, <a href="https://skepticalinquirer.org/2002/03/bigfoot-at-50-evaluating-a-half-century-of-bigfoot-evidence/">Bigfoot and other cryptozoology ideas</a>, <a href="https://skepticalinquirer.org/2022/02/the-great-australian-psychic-prediction-project-pondering-the-published-predictions-of-prominent-psychics/">psychic ability</a>, <a href="https://www.apa.org/pi/lgbt/resources/therapeutic-response.pdf">conversion therapy</a>, <a href="http://cup.columbia.edu/book/vaccines-and-your-child/9780231153072">anti-vaccination</a>, <a href="https://www.nature.com/articles/318419a0">astrology</a>, <a href="https://skepticalinquirer.org/2003/01/amityville-the-horror-of-it-all/">ghosts</a> and <a href="http://cup.columbia.edu/book/the-madhouse-effect/9780231177863">climate change denial</a>.</p>
<p>Students complete two papers to reinforce their knowledge. First, students develop their own bogus scientific claims and a corresponding plan to convince people that their claims are legitimate. Allowing students to invent and promote novel forms of pseudoscience gives them a safe context in which to examine specious scientific arguments.</p>
<p>Second, students review old issues of <a href="https://skepticalinquirer.org/">Skeptical Inquirer</a>, the leading national magazine about science and critical thinking, to summarize the topics that were being addressed at that time. Students also dive more deeply into a specific topic like unexplained cattle mutilations or the Bermuda Triangle. Then they write a paper based on an <a href="https://skepticalinquirer.org/2022/12/on-the-origin-of-skeptical-inquirer/">example I recently published</a> in Skeptical Inquirer. I’m hopeful that future column installments will include students’ work.</p>
<h2>Why is this course relevant now?</h2>
<p>The internet has provided pseudoscience communities with the unprecedented ability to promote their false claims.</p>
<p>For instance, flat-Earthers have <a href="https://theconversation.com/i-watched-hundreds-of-flat-earth-videos-to-learn-how-conspiracy-theories-spread-and-what-it-could-mean-for-fighting-disinformation-184589">relied on YouTube</a> to create doubt about Earth as a globe. The Bigfoot Field Researchers Organization uses Facebook to support Bigfoot belief. These platforms take advantage of people’s tendency to believe material posted by their <a href="https://doi.org/10.1080/13527266.2011.620764">friends</a> or <a href="https://doi.org/10.1007/s10503-011-9219-6">authoritative-sounding sources</a>.</p>
<p>This course is also relevant now because the consequences of poor scientific reasoning are so significant. People who believe these sorts of false claims risk their own health and that of the planet, by <a href="https://doi.org/10.1038/s41598-022-17430-6">avoiding helpful, safe vaccines</a> or <a href="http://cup.columbia.edu/book/the-madhouse-effect/9780231177863">useful discussions about the problems presented by climate change</a>.</p>
<h2>What’s a critical lesson from the course?</h2>
<p>It’s important for students to understand that <a href="https://centerforinquiry.org/video/why-were-all-susceptible-to-pseudoscience-craig-foster/">reasonable, intelligent people promote pseudoscience</a>. When people encounter pseudoscience they don’t personally believe, they sometimes conclude that the pseudoscience supporters are unintelligent or mentally unwell. This type of explanation is shortsighted. </p>
<p>Everyday people are drawn into believing pseudoscience because they have limited cognitive resources and they use cognitive strategies, like relying on anecdotes, that can lead to erroneous belief. Human scientific reasoning is particularly flawed when humans really <a href="https://www.jstor.org/stable/44085270">want to reach a particular conclusion</a>.</p>
<p>Belief in pseudoscience also develops out of social interactions. Friends and family members commonly share their reasons for believing in creationism, ghosts, fad diets and so forth. This type of social influence goes into overdrive when people <a href="https://doi.org/10.1177/13684302211050323">join communities that collectively promote pseudoscience</a>. I have attended Bigfoot and flat-Earth conferences. These conferences create powerful social experiences, because so many friendly people are available to explain that Bigfoot is alive or the Earth is flat, both of which are, clearly, false.</p>
<h2>What materials does the course feature?</h2>
<p>The “Defining Pseudoscience and Science” chapter by Sven Ove Hansson in “<a href="https://press.uchicago.edu/ucp/books/book/chicago/P/bo15996988.html">Philosophy of Pseudoscience: Reconsidering the Demarcation Problem</a>” sets up what I call the psychological puzzle of pseudoscience: How do people convince themselves and others that an unreliable scientific claim is actually reliable?</p>
<p>We also have guest speakers, including philosophy of science scholar <a href="https://massimopigliucci.org/">Massimo Pigliucci</a>, journalist and folklorist <a href="http://benjaminradford.com/">Ben Radford</a>, exposer of psychics <a href="https://skepticalinquirer.org/exclusive/susan-gerbic-back-on-tour/">Susan Gerbic</a>, a local Bigfoot enthusiast, and Janyce Boynton, who discussed <a href="https://www.facilitatedcommunication.org">facilitated communication</a>, a discredited communication technique in which some people physically assist nonverbal people with their communication, for example, by guiding their hands as they type.</p>
<h2>What will the course prepare students to do?</h2>
<p>The course prepares students to identify dubious scientific claims. In so doing, they should <a href="https://doi.org/10.1007/s11162-018-9513-3">become less vulnerable</a> to being drawn into pseudoscience. The course also enhances familiarity with specific forms of pseudoscience. I expect climate change denial, anti-vaccination and creationism to remain major points of contention in American society for decades. Educated people should understand the discussions that occur around these kinds of social problems.</p>
<p class="fine-print"><em><span>Craig Foster is affiliated with facilitatedcommunication.org.
Anything else to declare: I am a Committee for Skeptical Inquiry Fellow.</span></em></p>A university course teaches students why people believe false and evidence-starved claims, to show them how to determine what’s accurate and real and what’s neither.Craig A. Foster, Professor and Chair, Department of Psychology, State University of New York CortlandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1974892023-01-15T14:36:14Z2023-01-15T14:36:14ZInformation literacy courses can help students tackle confirmation bias and misinformation<figure><img src="https://images.theconversation.com/files/504497/original/file-20230113-26-659oki.jpg?ixlib=rb-1.1.0&rect=14%2C73%2C4898%2C2987&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Understanding our confirmation biases can help us tackle fake news and misinformation.</span> <span class="attribution"><span class="source">(Shutterstock)</span></span></figcaption></figure><p>When it comes to the news these days, what we choose to regard as trustworthy <a href="https://theconversation.com/republicans-and-democrats-see-news-bias-only-in-stories-that-clearly-favor-the-other-party-192282">has more to do with our own world view</a> than what kinds of news practices are worthy of trust. </p>
<p>Many people are seeking out news that <a href="https://www.asc.upenn.edu/news-events/news/cable-news-networks-have-grown-more-polarized-study-finds">aligns with their politics</a>. But there’s just one problem with this: we are not always good judges of what constitutes trustworthy information and news.</p>
<p>That’s why learning about <a href="https://doi.org/10.5206/cjsotl-rcacea.2020.2.9472">news and information literacy</a> is so important. An information literacy course I teach at the University of Windsor, <a href="https://ctl2.uwindsor.ca/cuma/public/courses/pdf/71241738-66f2-4eea-9a74-e8759c306c53">Information Searching and Analysis</a>, tries to show students that the same phenomenon which makes us poor judges can also be turned around to make us better, more critical consumers of news and information. </p>
<p>The process I use in this information literacy course does not encourage “trust” in mainstream or legacy news media per se. Rather, students learn to assess news based on the characteristics of a news story: multiple, adversarial sources, the use of statistics and data in which the sources are named and can be accessed independently, the kinds of advertising present and whether it is related to the story.</p>
<h2>First lesson: Check your confirmation bias</h2>
<p><a href="https://www.britannica.com/science/confirmation-bias">Confirmation bias</a> is the tendency to favour information that fits our prior knowledge, experiences and opinions. By becoming aware of our confirmation bias tendencies, however, we can begin to self-critique the way we process information and learn more about ourselves and how we interpret news and information.</p>
<p>The solution comes in the form of an experiential assignment in which students realize their confirmation bias tendencies. Students are tasked with a weekend assignment in which they look for and report on examples of confirmation bias around them and in media reports. They are told to focus mostly on themselves — how they often engage in confirmation bias. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A row of newspapers." src="https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=399&fit=crop&dpr=1 600w, https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=399&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=399&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=501&fit=crop&dpr=1 754w, https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=501&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/504500/original/file-20230113-20-ounndj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=501&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">By becoming aware of our confirmation biases, we can self-critique the way we process information and news.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>The assignment is an eye opener. In their end-of-semester papers, <a href="https://doi.org/10.1177/1077695819893171">80 per cent of students in the Information Searching and Analysis class</a> noted that the assignment was an important element of the course. Here are a few examples:</p>
<p>“I knew that in some aspects of my life, I may have exhibited confirmation bias towards certain ideas. However, I did not think it was as prominent as it was after the completion of the assignment.”</p>
<p>“…relating to my personal life, this was the most important assignment.”</p>
<p>“I think it was the most impactful and (will) stick with me the longest.”</p>
<p>“It was an insanely enriching experience for me to pull my biases out of the woodwork, particularly for someone like myself who regards themselves as quite unbiased when it comes to anything.”</p>
<p>“…extremely valuable was the consciousness I developed in regard to (how) social media was exclusively forming my opinions… I believe this is perhaps the most universal function of the class.” </p>
<p>The course uses a <a href="https://teaching.berkeley.edu/flipping-your-classroom">flipped classroom approach</a>. Flipped classrooms use class time for discussion, group activities and experiential education instead of lectures and passive forms of learning.</p>
<p>The key is self-confrontation. The many ways people engage in confirmation bias cannot be conveyed through a dry explanation of the concept. The point is not to preach to or lecture students about their “faults.” Rather, it is about letting them discover for themselves how confirmation biases can result in inaccurate learning that may have negative effects.</p>
<h2>Media framing</h2>
<p>Over the rest of the semester students explore a social justice issue by looking at how interest groups, journalists and academic researchers have treated the issue. This serves to give them a holistic view of the information field and leads to a better understanding of both the issue and the social dynamics that inform debate about it.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man scrolls through a webpage on a smartphone." src="https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=338&fit=crop&dpr=1 600w, https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=338&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=338&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=424&fit=crop&dpr=1 754w, https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=424&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/504499/original/file-20230113-21-oevkr3.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=424&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Greater information literacy enables us to assess how trustworthy the news we see on social media is.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>It is also crucial that students understand the nature of <a href="https://www.indeed.com/career-advice/career-development/what-is-sponsored-content">sponsored content</a> and other <a href="https://support.google.com/admanager/answer/6366845?hl=en">native ads</a> which may look like news but embed a point of view.</p>
<p>News, information and misinformation play a significant role in both improving and undermining democratic discourse and decision-making. Educators at all levels will need to give news and information literacy greater attention to ensure students know how to critique the news they encounter.</p>
<p class="fine-print"><em><span>James Wittebols does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Teaching students about information literacy can help them determine what kinds of practices make news reports trustworthy.James Wittebols, Professor of Political Science, University of WindsorLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1935822022-11-04T12:29:56Z2022-11-04T12:29:56ZInoculate yourself against election misinformation campaigns – 3 essential reads<figure><img src="https://images.theconversation.com/files/492837/original/file-20221101-26-ck41ft.jpg?ixlib=rb-1.1.0&rect=90%2C36%2C5916%2C3971&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Get a shot of preparation and protect yourself from malicious information warriors.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/doctor-hold-syringe-prepare-for-injection-epidural-royalty-free-image/950747286">boonchai wedmakawand/Moment via Getty Images</a></span></figcaption></figure><p>As the midterm elections approach, Americans are already being subjected to <a href="https://www.nytimes.com/2022/09/23/technology/midterm-elections-misinformation.html">misinformation campaigns</a>, often online, that are intended to <a href="https://www.texastribune.org/2022/10/13/spanish-latino-misinformation-2022-elections/">provoke confusion</a>, anger or even action. When the election is over, it’s almost certain there will be even more misleading material competing for people’s attention.</p>
<p>You can defend yourself against this onslaught and help curb both the spread and the effect of misinformation. Several scholars have written for The Conversation U.S. about this process, often called “inoculation,” because it prepares your mind to repel infectious, harmful ideas. Here are some of their key pieces of advice.</p>
<h2>1. Learn about misinformation’s effects</h2>
<p>Misinformation not only gives people incorrect material – it leads them to disbelieve facts. As <a href="https://scholar.google.com/citations?user=ZEN_Z2UAAAAJ&hl=en&oi=sra">John Cook</a>, a cognitive psychologist at George Mason University, explained: “<a href="https://theconversation.com/inoculation-theory-using-misinformation-to-fight-misinformation-77545">When people were presented with both the facts</a> and misinformation about climate change, there was no net change in belief. The two conflicting pieces of information canceled each other out.”</p>
<p>He went on to explain that “when they collide, there’s a burst of heat followed by nothing. This reveals the subtle way that misinformation does damage. It doesn’t just misinform. It stops people believing in facts.”</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/inoculation-theory-using-misinformation-to-fight-misinformation-77545">Inoculation theory: Using misinformation to fight misinformation</a>
</strong>
</em>
</p>
<hr>
<h2>2. Know yourself</h2>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&rect=14%2C14%2C4778%2C3671&q=45&auto=format&w=1000&fit=clip"><img alt="A hand stopping a Pinocchio-nosed person" src="https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&rect=14%2C14%2C4778%2C3671&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=462&fit=crop&dpr=1 600w, https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=462&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=462&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=581&fit=crop&dpr=1 754w, https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=581&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=581&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Identify and stop the lies.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/big-hand-with-cartoon-character-stop-sign-royalty-free-illustration/1292878719">NLshop/iStock via Getty Images Plus</a></span>
</figcaption>
</figure>
<p>It’s useful to note, as social psychology scholar <a href="https://www.cci.msstate.edu/osil/bio.php?d=-1">H. Colleen Sinclair</a> at Mississippi State University did, that “<a href="https://theconversation.com/7-ways-to-avoid-becoming-a-misinformation-superspreader-157099">research has found people are more susceptible to misinformation</a> that aligns with their preexisting views.”</p>
<p>So, Sinclair recommends, “be particularly critical of information from groups or people with whom you agree or find yourself aligned – whether politically, religiously, or by ethnicity or nationality. Remind yourself to look for other points of view, and other sources with information on the same topic.”</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/7-ways-to-avoid-becoming-a-misinformation-superspreader-157099">7 ways to avoid becoming a misinformation superspreader</a>
</strong>
</em>
</p>
<hr>
<h2>3. Seek help</h2>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/xSIkkza9TVI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Russia is not the only source of misinformation, but here is a look at its propaganda machine.</span></figcaption>
</figure>
<p>The Baltic nations – Estonia, Latvia and Lithuania – are small countries right next to Russia, and former parts of the Soviet Union. Their people have decades of experience with misinformation campaigns and are among the best in the world at resisting them.</p>
<p><a href="https://theconversation.com/profiles/terry-thompson-660173">Terry Thompson</a>, a cybersecurity scholar at the University of Maryland, Baltimore County, explained how: First, they cooperate with other nations to report what’s going on, including “<a href="https://theconversation.com/countering-russian-disinformation-the-baltic-nations-way-109366">analyz[ing] Russian social media activities</a> targeting Baltic nations … and provid[ing] insight into identifying and detecting Russian disinformation campaigns” so regular citizens can be informed.</p>
<p>There are also “‘Baltic elves’ – volunteers who monitor the internet for Russian disinformation” and spread the word, Thompson explained. </p>
<p>Further, those nations are part of a collective European Union project that “identifies disinformation efforts and publicizes accurate information” that disinformation warriors would like to undermine.</p>
<p>It’s all part of a wide-ranging effort to help people understand what’s real and what’s out there to mislead them.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/countering-russian-disinformation-the-baltic-nations-way-109366">Countering Russian disinformation the Baltic nations' way</a>
</strong>
</em>
</p>
<hr>
<p><em>Editor’s note: This story is a roundup of articles from The Conversation’s archives.</em></p>
As elections approach – and even after they’re done – there’s a lot of confusing, and deliberately misleading, information out there. Learn how to protect yourself.Jeff Inglis, Politics + Society Editor, The Conversation USLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1865302022-08-11T12:14:03Z2022-08-11T12:14:03ZCognitive biases and brain biology help explain why facts don’t change minds<figure><img src="https://images.theconversation.com/files/478603/original/file-20220810-15-x8t51l.jpg?ixlib=rb-1.1.0&rect=179%2C300%2C4503%2C3436&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">It can feel safer to block out contradictory information that challenges a belief.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/shaven-headed-man-with-fingers-in-ears-royalty-free-image/84437014">Peter Dazeley/The Image Bank via Getty Images</a></span></figcaption></figure><p>“<a href="https://www.cnn.com/factsfirst/politics">Facts First</a>” is the tagline of a CNN branding campaign which contends that “<a href="https://www.cnncreativemarketing.com/project/cnn_factsfirst/">once facts are established, opinions can be formed</a>.” The problem is that while it sounds logical, this appealing assertion is a fallacy not supported by research.</p>
<p>Cognitive psychology and neuroscience studies have found that the <a href="https://doi.org/10.1111/pops.12394">exact opposite is often true when it comes to politics</a>: People form opinions based on emotions, such as fear, contempt and anger, rather than relying on facts. New facts often do not change people’s minds.</p>
<p><a href="https://scholar.google.com/citations?user=LB6MiT4AAAAJ&hl=en&oi=ao">I study human development, public health and behavior change</a>. In my work, I see firsthand how hard it is to change someone’s mind and behaviors when they encounter new information that runs counter to their beliefs.</p>
<p>Your worldview, including beliefs and opinions, starts to form during childhood as you’re socialized within a particular cultural context. It gets reinforced over time by the social groups you keep, the media you consume, even how your brain functions. It influences how you think of yourself and how you interact with the world.</p>
<p>For many people, a challenge to their worldview feels like an attack on their personal identity and can cause them to harden their position. Here’s some of the research that explains why it’s natural to resist changing your mind – and how you can get better at making these shifts.</p>
<h2>Rejecting what contradicts your beliefs</h2>
<p>In an ideal world, rational people who encounter new evidence that contradicts their beliefs would evaluate the facts and change their views accordingly. But that’s generally not how things go in the real world. </p>
<p>Partly to blame is a cognitive bias that can kick in when people encounter evidence that runs counter to their beliefs. Instead of reevaluating what they’ve believed up until now, people tend to <a href="https://doi.org/10.1111/j.1540-5907.2006.00214.x">reject the incompatible evidence</a>. Psychologists call this phenomenon belief perseverance. Everyone can fall prey to this ingrained way of thinking. </p>
<p>Being presented with facts – whether via the news, social media or one-on-one conversations – that suggest their current beliefs are wrong causes people to feel threatened. This reaction is particularly strong when the beliefs in question are aligned with your political and personal identities. It can feel like an attack on you if one of your strongly held beliefs is challenged.</p>
<p>Confronting facts that don’t line up with your worldview may trigger a “<a href="https://doi.org/10.1073/pnas.1804840115">backfire effect</a>,” which can end up strengthening your original position and beliefs, particularly with politically charged issues. Researchers have identified this phenomenon in a number of studies, including ones about <a href="https://doi.org/10.1177/0093650211416646">opinions toward climate change mitigation policies</a> and <a href="https://doi.org/10.1542/peds.2013-2365">attitudes toward childhood vaccinations</a>. </p>
<h2>Focusing on what confirms your beliefs</h2>
<p>There’s another cognitive bias that can get in the way of changing your mind, called confirmation bias. It’s the natural tendency to seek out information or interpret things in a way that <a href="https://doi.org/10.1111/j.1540-5907.2006.00214.x">supports your existing beliefs</a>. <a href="https://doi.org/10.1371/journal.pone.0210423">Interacting with like-minded people and media</a> reinforces confirmation bias. The problem with confirmation bias is that it <a href="https://doi.org/10.1037/0033-295X.100.2.298">can lead to errors in judgment</a> because it keeps you from looking at a situation objectively from multiple angles. </p>
<p>A 2016 Gallup poll provides a great example of this bias. In just one two-week period spanning the 2016 election, both Republicans and Democrats <a href="https://news.gallup.com/poll/197474/economic-confidence-surges-election.aspx">drastically changed their opinions</a> about the state of the economy – in opposite directions.</p>
<p><iframe id="J9K34" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/J9K34/4/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>But nothing about the economy itself had changed. What had changed was that a new political leader from a different party had been elected. The election outcome changed survey respondents’ interpretation of how the economy was doing – confirmation bias led Republicans to rate it much higher now that their guy would be in charge, and Democrats to do the opposite.</p>
<h2>Brain’s hard-wiring doesn’t help</h2>
<p>Cognitive biases are predictable patterns in the way people think that can keep you from objectively weighing evidence and changing your mind. Some of the basic ways your brain works can also work against you on this front.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="woman with smirking look holding a cellphone" src="https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/478604/original/file-20220810-6805-26kwy4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">It can feel really satisfying to get the better of an opponent, even if you’re not actually right.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/studio-portrait-of-businesswoman-text-messaging-royalty-free-image/136597526">Rob Lewine/Tetra images via Getty Images</a></span>
</figcaption>
</figure>
<p>Your brain is hard-wired to protect you – which can lead to reinforcing your opinions and beliefs, even when they’re misguided. Winning a debate or an argument triggers a flood of hormones, including dopamine and adrenaline. In your brain, they contribute to the feeling of pleasure you get during sex, eating, roller-coaster rides – and yes, <a href="https://us.macmillan.com/books/9781250013644/the-winner-effect">winning an argument</a>. That rush makes you feel good, maybe even invulnerable. It’s a feeling many people want to have more often.</p>
<p>Moreover, in situations of high stress or distrust, your body releases <a href="https://www.ncbi.nlm.nih.gov/books/NBK538239/">another hormone, cortisol</a>. It can <a href="https://doi.org/10.1001/archpsyc.64.7.810">hijack your advanced thought processes, reason and logic</a> – what psychologists call the executive functions of your brain. Your brain’s amygdala, which <a href="https://doi.org/10.3390/biom11060823">controls your innate fight-or-flight reaction</a> when you feel under threat, also becomes more active.</p>
<p>In the context of communication, people tend to raise their voice, push back and stop listening when these chemicals are coursing through their bodies. Once you’re in that mindset, it’s hard to hear another viewpoint. The desire to be right combined with the brain’s protective mechanisms make it that much harder to change opinions and beliefs, even in the presence of new information.</p>
<h2>You can train yourself to keep an open mind</h2>
<p>In spite of the cognitive biases and brain biology that make it hard to change minds, there are ways to short-circuit these natural habits. </p>
<p>Work to keep an open mind. Allow yourself to learn new things. Search out perspectives from multiple sides of an issue. Try to form, and modify, your opinions based on evidence that is accurate, objective and verified.</p>
<p>Don’t let yourself be swayed by outliers. For example, give more weight to the numerous doctors and public health officials who describe the preponderance of evidence that vaccines are safe and effective than to one fringe doctor on a podcast who suggests the opposite.</p>
<p>Be wary of repetition, as repeated statements are often <a href="https://doi.org/10.1186/s41235-021-00301-5">perceived as more truthful</a> than new information, no matter how false the claim may be. Social media manipulators and politicians know this all too well. </p>
<p>Presenting things in a nonconfrontational way allows people to evaluate new information without feeling attacked. Insulting others and suggesting someone is ignorant or misinformed, no matter how misguided their beliefs may be, will cause the people you are trying to influence to reject your argument. Instead, try asking questions that lead the person to question what they believe. While opinions may not ultimately change, the <a href="https://doi.org/10.1177/1529100612451018">chance of success is greater</a>.</p>
<p>Recognize we all have these tendencies and respectfully listen to other opinions. Take a deep breath and pause when you feel your body ramping up for a fight. Remember, it’s OK to be wrong at times. Life can be a process of growth.</p>
<p class="fine-print"><em><span>Keith M. Bellizzi receives funding from the National Institutes of Health. </span></em></p>Here are some reasons for the natural human tendency to avoid or reject new information that runs counter to what you already know – and some tips on how to do better.Keith M. Bellizzi, Professor of Human Development and Family Sciences, University of ConnecticutLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1761342022-03-20T11:43:17Z2022-03-20T11:43:17ZMajor study shows the need to improve how scientists approach early-stage cancer research<figure><img src="https://images.theconversation.com/files/452401/original/file-20220316-15-1tmf3hp.jpg?ixlib=rb-1.1.0&rect=726%2C0%2C4837%2C2952&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Preclinical research — the kind that takes place before testing on humans — often guides decisions about which potential treatments should continue to clinical trials. But attempts to replicate 50 studies found the odds of getting the same results were only about 50-50.</span> <span class="attribution"><span class="source">(Pexels/Artem Podrez)</span></span></figcaption></figure><iframe style="width: 100%; height: 175px; border: none; position: relative; z-index: 1;" allowtransparency="" src="https://narrations.ad-auris.com/widget/the-conversation-canada/major-study-shows-the-need-to-improve-how-scientists-approach-early-stage-cancer-research" width="100%" height="400"></iframe>
<p>Preclinical studies, the kind that scientists perform before testing in humans, don’t get as much attention as their clinical counterparts. But they are the vital first steps to eventual treatments and cures. It’s important to get preclinical findings right. When they are wrong, scientists waste resources pursuing false leads. Worse, false findings can trigger <a href="https://doi.org/10.1186/s41231-019-0050-7">clinical studies with humans</a>. </p>
<p>Last December, the Center for Open Science (COS) released the worrying results of its eight-year, US$1.5-million <em><a href="https://doi.org/10.7554/eLife.71601">Reproducibility Project: Cancer Biology</a></em> study. In that study, done in collaboration with research marketplace <a href="https://ww2.scienceexchange.com/s/about">Science Exchange</a>, independent scientists found that the odds of replicating the results of 50 preclinical experiments from 23 high-profile published studies were no better than a coin toss. </p>
<p>Praise and controversy have followed the project from the beginning. The journal <em>Nature</em> applauded the replication studies as “<a href="https://doi.org/10.1038/541259b">the practice of science at its best</a>.” But the journal <em>Science</em> noted that reactions from some scientists whose studies were chosen ranged from “<a href="https://doi.org/10.1126/science.348.6242.1411">annoyance to anxiety to outrage</a>,” impeding the replications. None of the original experiments was described in enough detail to allow scientists to repeat it, <a href="https://doi.org/10.7554/eLife.67995">a third of the original authors were unco-operative</a>, and some were even <a href="https://www.sciencenews.org/article/cancer-biology-studies-research-replication-reproducibility">hostile</a> when asked for assistance.</p>
<figure class="align-center ">
<img alt="A person wearing PPE using a multi-channel pipette in a laboratory" src="https://images.theconversation.com/files/452293/original/file-20220315-15-60adun.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/452293/original/file-20220315-15-60adun.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=231&fit=crop&dpr=1 600w, https://images.theconversation.com/files/452293/original/file-20220315-15-60adun.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=231&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/452293/original/file-20220315-15-60adun.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=231&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/452293/original/file-20220315-15-60adun.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=291&fit=crop&dpr=1 754w, https://images.theconversation.com/files/452293/original/file-20220315-15-60adun.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=291&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/452293/original/file-20220315-15-60adun.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=291&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">It’s important to get preclinical findings right. When they are wrong, scientists waste resources pursuing false leads.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>COS executive director Brian Nosek cautioned that the findings pose “<a href="https://www.science.org/content/article/more-half-high-impact-cancer-lab-studies-could-not-be-replicated-controversial-analysis">challenges for the credibility of preclinical cancer biology</a>.” In a tacit acknowledgement that biomedical research has not been universally rigorous or transparent, the American National Institutes of Health (NIH), the largest funder of biomedical research in the world, has announced that it will <a href="https://www.chemistryworld.com/news/replication-failures-cast-doubt-on-some-cancer-studies/4014881.article">raise its requirements for rigour and transparency</a>.</p>
<p>I have taught classes and written about good scientific practice in psychology and biomedicine for over 30 years. I’ve reviewed more grant applications and journal manuscripts than I can count, and I’m not surprised.</p>
<figure class="align-right ">
<img alt="A stack of journal articles, with passages highlighted in the top one, with a pen resting on top." src="https://images.theconversation.com/files/452304/original/file-20220315-21-1j5qntp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/452304/original/file-20220315-21-1j5qntp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/452304/original/file-20220315-21-1j5qntp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/452304/original/file-20220315-21-1j5qntp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/452304/original/file-20220315-21-1j5qntp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/452304/original/file-20220315-21-1j5qntp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/452304/original/file-20220315-21-1j5qntp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Independent scientists found that the odds of replicating results of 50 preclinical experiments from 23 high-profile published studies were no better than a coin toss.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>The twin pillars of trustworthy science — transparency and dispassionate rigour — have wobbled under the stress of <a href="https://royalsocietypublishing.org/doi/10.1098/rsos.160384">incentives that</a> enhance careers at the expense of reliable science. Too often, proposed preclinical studies — and surprisingly, published peer-reviewed ones — <a href="https://doi.org/10.1161/CIRCRESAHA.114.303819">don’t follow the scientific method</a>. Too often, <a href="https://doi.org/10.1089/bio.2020.0037">scientists do not share</a> their government-funded data, even when required by the publishing journal.</p>
<h2>Controlling for bias</h2>
<p>Many preclinical experiments <a href="https://doi.org/10.1007/164_2019_279">lack the rudimentary controls against bias</a> that are taught in the social sciences, though <a href="https://www.cshlpress.com/default.tpl?cart=1646145461247203111&fromlink=T&linkaction=full&linksortby=oop_title&--eqSKUdatarq=1020">rarely in biomedical disciplines</a> such as medicine, cell biology, biochemistry and physiology. Controlling for bias is a key element of the scientific method because it allows scientists to disentangle experimental signal from procedural noise. </p>
<p>Confirmation bias, the tendency to see what we want to see, is one type of bias that good science controls by “blinding.” Think of the “double-blind” procedures in clinical trials in which neither the patient nor the research team knows who is getting the placebo and who is getting the drug. In preclinical research, blinding experimenters to samples’ identities minimizes the chance that they will alter their behaviour, however subtly, in favour of their hypothesis. </p>
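As a rough illustration of the blinding idea described above (a sketch, not taken from any of the studies discussed; the sample names and code format are invented), identities can be replaced with opaque codes that stay sealed until all measurements are recorded:

```python
import random

def blind_samples(samples, seed=None):
    """Assign each sample an opaque code at random; return the coded
    worklist the experimenter sees, plus a key (code -> true identity)
    that stays sealed until all measurements are recorded."""
    rng = random.Random(seed)
    codes = [f"S{i:03d}" for i in range(len(samples))]
    rng.shuffle(codes)                 # random code-to-sample assignment
    key = dict(zip(codes, samples))
    worklist = sorted(key)             # process in neutral code order
    return worklist, key

# Hypothetical sample names, purely for illustration.
worklist, key = blind_samples(
    ["treated-1", "treated-2", "control-1", "control-2"], seed=1)
print(worklist)   # codes only; group identity is hidden from the experimenter
```

The point of the design is that whoever runs the measurements works only from `worklist`, while the key is held by someone else until the data are final.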
<p>Seemingly trivial differences, such as whether a sample is processed in the morning or afternoon or whether an animal is caged in the upper or lower row, can also change results. This is not as unlikely as you might think. Moment-to-moment changes in the micro-environment, such as exposure to light and air ventilation, for example, <a href="https://arriveguidelines.org/arrive-guidelines/randomisation#:%7E:text=Using%20a%20validated%20method%20of,valid%20%5B4%2C5%5D">can change physiological responses</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A row of clear acrylic animal cages, each housing a white rat." src="https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=420&fit=crop&dpr=1 600w, https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=420&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=420&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=527&fit=crop&dpr=1 754w, https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=527&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/452275/original/file-20220315-15-39otqq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=527&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Seemingly trivial differences, such as whether an animal is caged in the upper or lower row, can change results.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>If all animals who receive a drug are caged in one row and all animals who do not receive the drug are caged in another row, any difference between the two groups of animals may be due to the drug, to their housing location or to an interaction between the two. You can’t honestly choose between the alternative explanations, and neither can the scientists.</p>
<p>Randomizing sample selection and processing order minimizes these procedural biases, makes the interpretation of the results clearer, and makes them more likely to be replicated. </p>
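A minimal sketch of that randomization in Python (the animal IDs and treatment labels are invented for illustration): shuffle the animals first, then interleave treatments down the shuffled list so neither group ends up stuck in one cage row or processing slot.

```python
import random

def randomized_plan(animal_ids, treatments=("drug", "control"), seed=None):
    """Randomly order the animals, then alternate treatment labels down
    the shuffled list. Cage position and processing order follow the list,
    so treatment is not confounded with row or time of day."""
    rng = random.Random(seed)
    order = list(animal_ids)
    rng.shuffle(order)
    return [(animal, treatments[i % len(treatments)])
            for i, animal in enumerate(order)]

plan = randomized_plan(range(8), seed=42)
for slot, (animal, treatment) in enumerate(plan):
    print(f"slot {slot}: animal {animal} -> {treatment}")
```

Because the shuffle happens before treatments are assigned, any housing or scheduling effect is spread evenly across both groups on average.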
<p>Many of the replication experiments used blinding and randomization, but it’s not known whether the original experiments did. All that is known is that of the 15 animal experiments, only <a href="https://doi.org/10.7554/eLife.71601">one of the original studies reported randomization and none reported blinding</a>. It would not be surprising if many of the original studies did neither.</p>
<h2>Study design and statistics</h2>
<p>According to one estimate, over half of the one million articles published each year <a href="https://doi.org/10.1016/S0140-6736%2809%2960329-9">have biased study designs</a>, contributing to an estimated 85 per cent of the US$100 billion spent each year on (mostly preclinical) research being wasted. </p>
<p>In a widely reported commentary, industry scientist and former academic Glenn Begley reported being able to reproduce the results of only <a href="https://doi.org/10.1038/483531a">six of 53</a> academic studies (11 per cent). He listed <a href="https://doi.org/10.1038/497433a">six practices</a> of reliable research, including blinding. All six of the studies that replicated followed all six practices. The 47 studies that failed to replicate followed few or, sometimes, none of the practices. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Three people in white coats with a microscope in the foreground, superimposed with bar graphs and data points." src="https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=372&fit=crop&dpr=1 600w, https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=372&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=372&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=468&fit=crop&dpr=1 754w, https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=468&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/452283/original/file-20220315-19-vympx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=468&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Misuse of statistics is common in biomedical research despite calls for better data analysis practices.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Another way to bias findings is by misusing statistics. As with blinding and randomization, it’s not known which, if any, of the original studies in the reproducibility project misused statistics, because of the studies’ lack of transparency. But that, too, is common practice.</p>
<p>A whole dictionary of terms describes the poor data analysis practices that can manufacture statistically significant (but false) findings, such as <a href="https://doi.org/10.1207/s15327957pspr0203_4">HARKing</a> (Hypothesizing After the Results are Known), p-hacking (<a href="https://doi.org/10.1177%2F0956797611417632">repeating statistical tests until a desired result is produced</a>) and following a series of data-dependent analysis decisions known as a “<a href="https://doi.org/10.1511/2014.111.460">garden of forking paths</a>” to publishable findings. </p>
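To see why repeated testing manufactures “significant” findings, here is a small, self-contained Python simulation (a sketch, not tied to any of the studies discussed): every dataset is pure noise, yet measuring 20 outcomes per experiment makes a false positive more likely than not.

```python
import math
import random

def z_test_p(sample, mu0=0.0, sigma=1.0):
    """Two-sided p-value for the sample mean under H0: mean == mu0,
    assuming a known standard deviation sigma (a simple z-test)."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def hacked_experiment(n_outcomes=20, n=30, rng=random):
    """Measure many outcomes on pure noise; report True if *any* test
    reaches p < 0.05, the 'significant' finding that gets published."""
    for _ in range(n_outcomes):
        sample = [rng.gauss(0, 1) for _ in range(n)]
        if z_test_p(sample) < 0.05:
            return True
    return False

random.seed(0)
trials = 1000
false_positive_rate = sum(hacked_experiment() for _ in range(trials)) / trials
# Theory predicts 1 - 0.95**20, roughly 0.64, even though every dataset is noise.
print(round(false_positive_rate, 2))
```

Each individual test is perfectly valid at the 5 per cent level; it is the freedom to keep testing and report only the “hit” that inflates the error rate.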
<p><a href="https://link.springer.com/chapter/10.1007/164_2019_278#Sec4">These practices</a> are <a href="https://acmedsci.ac.uk/policy/policy-projects/reproducibility-and-reliability-of-biomedical-research">common in biomedical research</a>. <a href="https://doi.org/10.1136/bmj.308.6924.283">Decades of pleas</a> from <a href="https://doi.org/10.1371/journal.pmed.0020124">methodologists</a> and an <a href="https://magazine.amstat.org/blog/2021/08/01/task-force-statement-p-value/">unprecedented statement</a> from the American Statistical Association urging changes to data analysis practices have, however, <a href="https://doi.org/10.1111/1740-9713.01505">gone unheeded</a>.</p>
<h2>A better future</h2>
<figure class="align-center ">
<img alt="A woman wearing a lab coat and safety glasses and green gloves examining lab samples" src="https://images.theconversation.com/files/452295/original/file-20220315-25-11qxmpb.jpg?ixlib=rb-1.1.0&rect=353%2C0%2C4871%2C3371&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/452295/original/file-20220315-25-11qxmpb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/452295/original/file-20220315-25-11qxmpb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/452295/original/file-20220315-25-11qxmpb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/452295/original/file-20220315-25-11qxmpb.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/452295/original/file-20220315-25-11qxmpb.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/452295/original/file-20220315-25-11qxmpb.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Incentives and standards should reward practices that produce trustworthy science and censure practices that do not, without killing innovation.</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Those who are anti-science should not take heart in these findings. Preclinical science’s accomplishments are real and impressive. Decades of preclinical research led to the <a href="https://www.nytimes.com/2022/01/15/health/mrna-vaccine.html">development of the COVID-19 mRNA vaccines</a>, for example. And most scientists are doing the best they can within a system that rewards <a href="https://www.theguardian.com/commentisfree/2013/dec/09/how-journals-nature-science-cell-damage-science">quick flashy results</a> over slower reliable ones. </p>
<p>But science is done by humans with all the strengths and weaknesses that go with it. The trick is to reward practices that produce trustworthy science and to censure practices that do not, without killing innovation. </p>
<p>Changing incentives and enforcing standards are the most effective ways to improve scientific practice. The goal is to improve efficiency by ensuring scientists who value transparency and rigour over speed and flash are given a chance to thrive. It’s been <a href="https://doi.org/10.1038/505612a">tried before</a>, with <a href="https://doi.org/10.1080/08989621.2020.1855427">minimal success</a>. This time may be different. The <em>Reproducibility Project: Cancer Biology</em> study and the NIH policy changes it prompted may be just the push needed to make it happen.</p>
<p class="fine-print"><em><span>Robert Nadon does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Preclinical studies are an important part of biomedical research, often guiding future trials in humans. Failure to replicate research results suggests a need to increase the quality of studies.Robert Nadon, Associate Professor, Department of Human Genetics, Faculty of Medicine, McGill UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1752312022-02-07T19:08:15Z2022-02-07T19:08:15Z4 reasons why you should never say ‘do your research’ to win the argument<figure><img src="https://images.theconversation.com/files/444442/original/file-20220203-23-1pajxu8.jpg?ixlib=rb-1.1.0&rect=138%2C292%2C4641%2C3072&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>It’s fairly common to see claims or arguments end with a curt “do your research”. In some ways, it’s a bold call to action. </p>
<blockquote>
<p>“Come on people! Wake up! You’ll see the truth of the matter if only you see it with your own eyes!”</p>
</blockquote>
<p>This type of statement is highly evocative and persuasive – in an emotionally manipulative way. Here are four reasons why we should avoid telling others to “do your research” when discussing a topic.</p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1378479031151042560&quot;}"></div></p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/how-to-make-good-arguments-at-school-and-everywhere-else-121305">How to make good arguments at school (and everywhere else)</a>
</strong>
</em>
</p>
<hr>
<h2>1. Burden of proof</h2>
<p>There’s a general rule in argumentation: “What can be asserted without evidence can also be dismissed without evidence.” This means that if we make a claim about the world, we bear the burden of proving that our claim is true. Carl Sagan <a href="https://link.springer.com/article/10.1007/s11406-016-9779-7#auth-David-Deming">famously put it</a> as “extraordinary claims require extraordinary evidence”.</p>
<p>This is an essential part of public discourse – if we want the public to agree with us, we must accept the burden of proof for demonstrating our ideas. </p>
<p>Say we want to make a claim like:</p>
<blockquote>
<p>“The COVID-19 vaccine is poison.”</p>
</blockquote>
<p>This is an extraordinary claim. We have a well-established track record of safe vaccines. To begin to take the “poison” claim seriously, we’ll need some serious facts to back it up. </p>
<p>Perhaps there are studies that demonstrate that a vaccine is poisonous or causes significant adverse reactions. But it’s still our job to provide that evidence – no one is required to take us seriously until we do. </p>
<p>Once that evidence is provided, we can evaluate whether that evidence is reliable and whether it relates to the main claim. </p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1368501856398417921&quot;}"></div></p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/feel-free-to-disagree-on-campus-by-learning-to-do-it-well-151019">Feel free to disagree on campus ... by learning to do it well</a>
</strong>
</em>
</p>
<hr>
<h2>2. Confirmation bias</h2>
<p><a href="https://link.springer.com/article/10.1007/s10670-019-00128-z">Our minds don’t always work by being slow, reasonable and deliberate</a> – that would be exhausting. Instead, we use what are called heuristics (mental shortcuts) to enable us to act and behave quickly. </p>
<p>We use heuristics to make choices while driving in traffic, or deciding which way to dodge in a football game, or when to turn down the heat when cooking. There are simply too many tiny decisions to make every day to not have these shortcuts.</p>
<p>A cognitive bias is similar to a heuristic but with an important distinction – it comes with an error embedded in the decision. </p>
<p>A specific type of cognitive bias is a confirmation bias: the tendency to interpret facts and information in a way that supports what we already believe. For example, if we’re distrustful of government, we’re more likely to believe news stories about corruption and fraud on the part of our elected officials. </p>
<p>The problem with confirmation bias is that it leads us to irrationally privilege certain types of information over others. It’s much harder to change our minds when they’re already <a href="https://www.scielosp.org/article/csp/2020.v36suppl2/e00136620/en/">primed to believe certain things – about vaccines</a>, for example. In our search for information, we’ll look to sources that support claims we already agree with or deny claims we don’t like. If we are already suspicious or fearful of a vaccine and someone says “do your research on the harms of the vaccine”, we’re more likely to cherry-pick individual cases of adverse vaccine effects. </p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;1487850816392527872&quot;}"></div></p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/first-impressions-count-and-have-an-impact-on-the-decisions-we-make-later-on-175034">First impressions count, and have an impact on the decisions we make later on</a>
</strong>
</em>
</p>
<hr>
<h2>3. Poor intellectual virtue</h2>
<p>Someone who tells others to do the research is looking for others to come to the same conclusions they’ve already drawn. That’s not discussion or debate. It’s seeking uncritical agreement and social acceptance. </p>
<p>We all seek validation of our perspectives and beliefs, but we need to do more than this. We should welcome sincere engagement and criticism. </p>
<p><a href="https://www.tandfonline.com/doi/full/10.1080/00131857.2020.1811680?casa_token=bhBSUXcvnf8AAAAA%3AHxrtenEgshLuxyoR5CjvTz_a5pkCWZA3ikdWn4hO8EsKE3tMLifZ9eRqyEjQ7fZ_ETlXsSzBkEO7uw">Effective democracies require</a> that we engage with each other using intellectual virtues like honesty, open-mindedness and rigorousness. We should aim to be truth-seekers, looking to evaluate evidence and determine credibility in all things. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/changing-your-mind-about-something-as-important-as-vaccination-isnt-a-sign-of-weakness-being-open-to-new-information-is-the-smart-way-to-make-choices-167856">Changing your mind about something as important as vaccination isn't a sign of weakness – being open to new information is the smart way to make choices</a>
</strong>
</em>
</p>
<hr>
<h2>4. Unreasonable expectations</h2>
<p>We can’t expect that everyone has the time to thoroughly examine every publication on a given topic. Even if it took only ten minutes to read a scientific article on vaccination safety (a huge underestimate for a paper thousands of words long), effective research would have us reading at least half a dozen of them to see what experts in the field are saying. </p>
<p>And that’s just reading. It isn’t counting the time to learn various terms and vocabulary in that field, to learn about the disagreements and schools of thought, or to form our own opinion on the quality of that research. </p>
<p>At a minimum, we’d be looking at hours of investigation to evaluate someone else’s argument. If the arguer puts forward their evidence, we’d still need to do our own research on whether that evidence was accurate – but at least now we’re talking about minutes, not hours.</p>
<figure class="align-center ">
<img alt="Pencil placed on scientific journal paper with highlighted sections" src="https://images.theconversation.com/files/444464/original/file-20220204-15-1xnssd6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/444464/original/file-20220204-15-1xnssd6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/444464/original/file-20220204-15-1xnssd6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/444464/original/file-20220204-15-1xnssd6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/444464/original/file-20220204-15-1xnssd6.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/444464/original/file-20220204-15-1xnssd6.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/444464/original/file-20220204-15-1xnssd6.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Proper research would require that a person has the time and expertise to read and assess lengthy articles by genuine experts.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<h2>Becoming better at arguing</h2>
<p>One of the most fundamental virtues in listening to each other and improving the quality of our discourse is curiosity. One of the real dangers is becoming uninterested in other perspectives – or, worse still, uninterested in the truth itself. </p>
<p>We’ll never have a full picture of complex social and scientific problems. Our own lives are busy and complex, and we simply don’t have the time to properly investigate every topic put before us. If someone wants to be taken seriously, the least they can do is present their argument in full. </p>
<p>We can still meaningfully engage with each other, but we have to be honest about our information and where we got it from. </p>
<p>It’s no good telling others to do our homework for us.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/actually-its-ok-to-disagree-here-are-5-ways-we-can-argue-better-121178">Actually, it's OK to disagree. Here are 5 ways we can argue better</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Luke Zaphir is affiliated with the University of Queensland's Critical Thinking Project</span></em></p>We’ve all heard an exasperated “do your research!” from people who want to persuade us to accept their claim or point of view. The problem is it’s not likely to convince anyone.Luke Zaphir, Researcher, UQ Critical Thinking Project, The University of QueenslandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1709112021-11-11T15:41:51Z2021-11-11T15:41:51ZHow cognitive biases and adverse events influence vaccine decisions (maybe even your own)<figure><img src="https://images.theconversation.com/files/430453/original/file-20211105-25-wktftj.jpg?ixlib=rb-1.1.0&rect=1041%2C431%2C3862%2C2110&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Vaccine hesitancy has been a growing challenge for more than a decade. Concerns about vaccine safety and adverse events are the most commonly cited reasons.</span> <span class="attribution"><span class="source">(AP Photo/Rogelio V. Solis) </span></span></figcaption></figure><iframe style="width: 100%; height: 175px; border: none; position: relative; z-index: 1;" allowtransparency="" src="https://narrations.ad-auris.com/widget/the-conversation-canada/how-cognitive-biases-and-adverse-events-influence-vaccine-decisions--maybe-even-your-own-" width="100%" height="400"></iframe>
<p>The <a href="https://www.who.int/immunization/research/forums_and_initiatives/1_RButler_VH_Threat_Child_Health_gvirf16.pdf">World Health Organization</a> recognized vaccine hesitancy as a growing challenge in 2011, and identified it as <a href="https://www.who.int/wer/2011/wer8621.pdf">a new priority topic</a>. This was mostly because of the return of vaccine-preventable diseases like <a href="https://doi.org/10.1038/s41390-019-0354-3">measles in Europe and the United States</a>. </p>
<p>Ten years later, in 2021, vaccine hesitancy has become an even more significant challenge despite all these efforts. The COVID-19 pandemic has brought it to a peak: every effort to manage the pandemic depends on people’s willingness to get vaccinated. However, <a href="https://doi.org/10.3390/vaccines9020160">the numbers are not promising, as a portion of the population in every country remains reluctant to vaccinate</a>.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/410911/original/file-20210712-19-geybnm.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/410911/original/file-20210712-19-geybnm.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=600&fit=crop&dpr=1 600w, https://images.theconversation.com/files/410911/original/file-20210712-19-geybnm.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=600&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/410911/original/file-20210712-19-geybnm.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=600&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/410911/original/file-20210712-19-geybnm.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=754&fit=crop&dpr=1 754w, https://images.theconversation.com/files/410911/original/file-20210712-19-geybnm.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=754&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/410911/original/file-20210712-19-geybnm.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=754&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="https://theconversation.com/ca/topics/vaccine-confidence-in-canada-107061">Click here for more articles in our series about vaccine confidence.</a></span>
</figcaption>
</figure>
<p>Vaccine hesitancy means “<a href="https://doi.org/10.1016/S2352-4642(19)30092-6">delay in acceptance or refusal of vaccines despite availability of vaccination services</a>.” Vaccine-hesitant people cite <a href="https://doi.org/10.1016/j.vaccine.2015.01.068">distrust in vaccine safety and concerns over vaccine adverse events</a> as the most common reasons for reluctance to get vaccinated. </p>
<p>Vaccines are given to healthy people to prevent a disease that might harm them in the future. However, because recipients are healthy at the time of vaccination, they may worry about the vaccine’s safety.</p>
<p>Our team of business analytics and artificial intelligence researchers at Concordia University, along with a professor of epidemiology at McGill University, has published a paper in the <a href="https://doi.org/10.1186/s12889-021-11745-1"><em>BMC Public Health</em></a> journal that investigated this critical concern from two perspectives. </p>
<p>First, we addressed vaccine safety concerns by analyzing data from vaccine adverse events systems. These are vaccine surveillance systems where adverse events following immunization are reported, monitored and stored in a database. Canada’s system is called the <a href="https://www.canada.ca/en/public-health/services/immunization/canadian-adverse-events-following-immunization-surveillance-system-caefiss.html#_About_the_system">Canadian Adverse Events Following Immunization Surveillance System (CAEFISS)</a>.</p>
<p>Second, we focused on cognitive science and highlighted the critical role of cognitive biases in people’s vaccination decision-making that might lead to vaccine hesitancy.</p>
<h2>Data-driven evidence to address vaccine safety</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A vaccination centre with Ontario Premier Doug Ford in the background touring the facility and a line of people waiting to greet him in the foreground" src="https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=900&fit=crop&dpr=1 600w, https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=900&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=900&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1131&fit=crop&dpr=1 754w, https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1131&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/430455/original/file-20211105-19-6xhu5j.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1131&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ontario Premier Doug Ford tours a vaccine centre in Windsor, Ont. Distrust in vaccine safety and concerns over vaccine adverse events are the most cited reasons for vaccine hesitancy.</span>
<span class="attribution"><span class="source">THE CANADIAN PRESS/ Geoff Robins</span></span>
</figcaption>
</figure>
<p>A solution to mitigate distrust in vaccines safety is to <a href="https://doi.org/10.1177/0272989X15607855">provide evidence-based meaningful information about vaccine safety and adverse events</a>. We followed this path and analyzed all the adverse events reported to the <a href="https://vaers.hhs.gov/">U.S. Vaccine Adverse Event Reporting System (VAERS)</a>.</p>
<p>We analyzed almost 294,000 reports covering 87 vaccine types over the eight years from 2011 to 2018. That equals roughly 115 reports per million people per year. The most frequently reported vaccines were those for chickenpox, influenza, pneumococcal bacteria and human papillomavirus (HPV).</p>
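The per-capita figure can be sanity-checked with quick arithmetic. This sketch assumes an average U.S. population of about 327 million over the study period, which is our assumption and not a figure stated in the article:

```python
# Back-of-the-envelope check of the VAERS reporting rate.
total_reports = 294_000    # reports analyzed, 2011-2018
years = 8
population_millions = 327  # assumed average U.S. population (not from the article)

reports_per_year = total_reports / years                       # ~36,750 per year
rate_per_million_per_year = reports_per_year / population_millions

print(round(rate_per_million_per_year))  # ~112, close to the ~115 cited
```

The result only matches the cited rate if it is read as reports per million people per year; the total over all eight years would be closer to 900 per million.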
<p>Each VAERS report (representing one incident) involved an average of three adverse events, the most common being rashes, fever, swelling, pain and headaches. Only 5.5 per cent of the reports were marked as serious, resulting in hospitalization, disability, threats to life or death. The top adverse events in this group also include fever, pain, vomiting, headaches and shortness of breath. </p>
<p>We also analyzed the vaccine adverse events reported to <a href="https://www.canada.ca/en/health-canada/services/drugs-health-products/medeffect-canada/canada-vigilance-program.html">Canada Vigilance</a>. Our findings were consistent with those from the VAERS.</p>
<p>We have provided our results in an <a href="https://public.tableau.com/app/profile/aefi/viz/VAERSAdverseEventFollowingImmuinzationAEFIReports2011-2018/Dashboard1">interactive dashboard</a>. Health-care professionals and others involved in vaccine communication can use this dashboard to provide evidence-based information to the public. Research suggests that <a href="https://doi.org/10.1016/j.vaccine.2016.03.087">summarized data is the best format for communicating vaccine safety information</a>, so using this dashboard in vaccination communication can help mitigate vaccine hesitancy and safety concerns, and increase trust in vaccines.</p>
<h2>The role of cognitive biases in vaccine hesitancy</h2>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man in camouflage T-shirt and hat holding an anti-vaccine sign in the foreground, with a group of people in the background" src="https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=425&fit=crop&dpr=1 600w, https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=425&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=425&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=535&fit=crop&dpr=1 754w, https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=535&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/430454/original/file-20211105-17-1g2k9jz.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=535&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An anti-vaccine demonstrator in front of a hospital in Montréal in September 2021.</span>
<span class="attribution"><span class="source">THE CANADIAN PRESS/Paul Chiasson</span></span>
</figcaption>
</figure>
<p>In the second part of our study, after addressing concerns about vaccine adverse events, we examined the role of cognitive biases on vaccine hesitancy. We identified cognitive biases that might affect vaccine communication and decision-making. </p>
<p>As mentioned earlier, vaccines are administered to healthy people. When people make decisions about vaccination, they may feel some degree of risk, ambiguity and uncertainty about the outcome, which can instigate cognitive biases in the decision-making process. Such cognitive biases might <a href="https://doi.org/10.1016/j.vaccine.2015.03.048">nudge people toward vaccine hesitancy</a>.</p>
<p>For example, while providing people with summarized vaccine safety information increases trust in vaccines, detailed vaccine adverse event reports can decrease it, because of two cognitive biases. </p>
<p>First, when vaccine-hesitant people read a detailed report about a vaccine adverse event, it gives them the chance to see what they want to see. This is an example of confirmation bias, the <a href="https://doi.org/10.1037/1089-2680.2.2.175">tendency to recall and interpret information in ways that confirm our existing beliefs</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&rect=89%2C40%2C2748%2C1859&q=45&auto=format&w=1000&fit=clip"><img alt="An upper arm bearing a heart tattoo and a small round bandage over an injection site" src="https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&rect=89%2C40%2C2748%2C1859&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/430452/original/file-20211105-23-1m8sflu.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Research suggests that summarized vaccine safety information is the best format for increasing trust in vaccines.</span>
<span class="attribution"><span class="source">THE CANADIAN PRESS/Kayle Neis</span></span>
</figcaption>
</figure>
<p>Second, a detailed adverse event report will also increase the event’s vividness, making it easier to recall the next time there is a decision to be made about taking a vaccine. That is the effect of availability bias, <a href="https://doi.org/10.1016/0010-0285(73)90033-9">the tendency to attribute more weight to factors that are easier to recall</a>.</p>
<p>We identified 15 cognitive biases in the vaccine decision-making process and categorized them into three groups:</p>
<ul>
<li><p><strong>Cognitive biases triggered by processing vaccine-related information</strong> include availability bias, as in the above example, as well as the framing effect, base rate neglect, the anchoring effect and authority bias.</p></li>
<li><p><strong>Cognitive biases triggered in vaccination decision-making</strong> include omission bias, which is when the results of not taking an action are viewed as less damaging than the results of taking action, even when this is not the case. Others include ambiguity aversion, optimism bias, present bias and protected values. </p></li>
<li><p><strong>Cognitive biases triggered by prior beliefs regarding vaccination</strong> include confirmation bias, as in the example above, as well as belief bias, shared information bias and the false consensus effect.</p></li>
</ul>
<p>The <a href="https://bmcpublichealth.biomedcentral.com/articles/10.1186/s12889-021-11745-1/tables/1">full list of cognitive biases affecting vaccination decision-making and their examples is available here</a>. Public health officials and practitioners can use this list and customize their plans, interventions and other forms of vaccine communication to decrease vaccine hesitancy. </p>
<p>You also can check the list and see if these biases have influenced your own vaccination decisions.</p>
<p><em>Do you have a question about COVID-19 vaccines? Email us at <a href="mailto:ca-vaccination@theconversation.com">ca-vaccination@theconversation.com</a> and vaccine experts will answer questions in upcoming articles.</em></p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>To help increase trust in vaccines, researchers analyzed data on adverse events to address safety concerns, and then used cognitive science to show how cognitive biases feed vaccine hesitancy.Hossein Azarpanah, PhD Candidate in Business Technology Management, Concordia UniversityLouise Pilote, Professor of Medicine, James McGill Chair, McGill UniversityMohsen Farhadloo, Assistant professor, John Molson School of Business, Concordia UniversityRustam Vahidov, Professor, Dept. of Supply Chain & Business Technology Management, Concordia UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1617132021-06-29T12:06:22Z2021-06-29T12:06:22ZScience denial: Why it happens and 5 things you can do about it<figure><img src="https://images.theconversation.com/files/408741/original/file-20210628-25-dhlbk1.jpg?ixlib=rb-1.1.0&rect=171%2C171%2C5433%2C3829&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Are you open to new ideas and willing to change your mind?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/entrepreneur-with-arms-crossed-at-modern-workplace-royalty-free-image/1210533708">Klaus Vedfelt/DigitalVision via Getty Images</a></span></figcaption></figure><p>Science denial became deadly in 2020. Many political leaders <a href="https://www.scientificamerican.com/article/the-failure-of-public-health-messaging-about-covid-19/">failed to support what scientists knew to be effective</a> prevention measures. 
Over the course of the pandemic, people <a href="https://www.washingtonpost.com/health/2020/11/16/south-dakota-nurse-coronavirus-deniers/">died from COVID-19 still believing it did not exist</a>.</p>
<p><a href="https://www.simonandschuster.com/books/Galileo/Mario-Livio/9781501194740">Science denial is not new</a>, of course. But it is more important than ever to understand why some people deny, doubt or resist scientific explanations – and what can be done to overcome these barriers to accepting science.</p>
<p>In our book “<a href="https://global.oup.com/academic/product/science-denial-9780190944681">Science Denial: Why It Happens and What to Do About It</a>,” we offer ways for you to understand and combat the problem. As <a href="https://scholar.google.com/citations?user=LzHZpAEAAAAJ&hl=en&oi=ao">two research</a> <a href="https://scholar.google.com/citations?user=VBvoFacAAAAJ&hl=en&oi=ao">psychologists</a>, we know that everyone is susceptible to forms of it. Most importantly, we know there are solutions.</p>
<p>Here’s our advice on how to confront five psychological challenges that can lead to science denial.</p>
<h2>Challenge #1: Social identity</h2>
<p>People are social beings and tend to align with those who hold <a href="https://doi.org/10.1002/9781119011071.iemp0153">similar beliefs and values</a>. Social media <a href="https://www.nature.com/articles/d43978-021-00019-4">amplify alliances</a>. You’re likely to <a href="https://www.penguinrandomhouse.com/books/309214/the-filter-bubble-by-eli-pariser/">see more of what you already agree with</a> and fewer alternative points of view. People live in information filter bubbles created by <a href="https://www.pewresearch.org/internet/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/">powerful algorithms</a>. When those in your social circle share misinformation, you are more likely to believe it and share it. Misinformation multiplies and science denial grows.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="two seated men in discussion" src="https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Can you find common ground to connect on?</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/W3Jl3jREpDY">LinkedIn Sales Solutions/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Action #1: Each person has multiple social identities. One of us talked with a climate change denier and discovered he was also a grandparent. He opened up when thinking about his grandchildren’s future, and the conversation turned to economic concerns, the root of his denial. Or maybe someone is vaccine-hesitant because so are mothers in her child’s play group, but she is also a caring person, concerned about immunocompromised children.</p>
<p>We have found it effective to listen to others’ concerns and try to find common ground. Someone you <a href="https://doi.org/10.1007/s11109-015-9312-x">connect with is more persuasive</a> than those with whom you share less in common. When one identity is blocking acceptance of the science, leverage a second identity to make a connection.</p>
<h2>Challenge #2: Mental shortcuts</h2>
<p>Everyone’s busy, and it would be exhausting to be vigilant deep thinkers all the time. You see an article online with a clickbait headline such as “Eat Chocolate and Live Longer” and you share it, because you assume it is true, want it to be or think it is ridiculous. </p>
<p>Action #2: Instead of sharing that article on how GMOs are unhealthy, learn to slow down and monitor the quick, intuitive responses that psychologist <a href="https://us.macmillan.com/books/9780374533557">Daniel Kahneman calls System 1 thinking</a>. Turn on the rational, analytical mind of System 2 and ask yourself, <a href="https://doi.org/10.1080/00461520.2020.1730181">how do I know this is true</a>? Is it plausible? Why do I think it is true? Then do some fact-checking. Learn not to immediately accept information you already believe, a tendency called <a href="https://doi.org/10.1037/1089-2680.2.2.175">confirmation bias</a>. </p>
<h2>Challenge #3: Beliefs on how and what you know</h2>
<p>Everyone has <a href="https://www.routledge.com/Handbook-of-Epistemic-Cognition/Greene-Sandoval-Braten/p/book/9781138013421">ideas about what they think knowledge is</a>, where it comes from and whom to trust. <a href="https://www.taylorfrancis.com/chapters/edit/10.4324/9781315795225-9/epistemic-cognition-psychological-construct-advancements-challenges-barbara-hofer">Some people think dualistically</a>: There’s always a clear right and wrong. But scientists view <a href="https://doi.org/10.1080/0163853X.2019.1629805">tentativeness as a hallmark</a> of their discipline. Some people may not understand that scientific claims will change as more evidence is gathered, so they may be distrustful of how public health policy shifted around COVID-19.</p>
<p>Journalists who present “both sides” of settled scientific agreements can unknowingly persuade readers that the science is more uncertain than it actually is, turning <a href="https://doi.org/10.1016/j.gloenvcha.2003.10.001">balance into bias</a>. Only 57% of Americans surveyed accept that climate change is caused by human activity, compared with <a href="https://climate.nasa.gov/faq/17/do-scientists-agree-on-climate-change/">97% of climate scientists</a>, and only <a href="https://climatecommunication.yale.edu/visualizations-data/ycom-us/">55% think that scientists are certain that climate change is happening</a>. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="man with book looking off into distance" src="https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">How did you come to know what you know?</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/man-reading-book-on-the-table-royalty-free-image/980285120">ridvan_celik/E+ via Getty Images</a></span>
</figcaption>
</figure>
<p>Action #3: Recognize that other people (or possibly even you) may be operating with misguided beliefs about science. You can help them adopt what philosopher of science <a href="https://leemcintyrebooks.com">Lee McIntyre</a> calls a <a href="https://mitpress.mit.edu/books/scientific-attitude">scientific attitude</a>, an openness to seeking new evidence and a willingness to change one’s mind. </p>
<p>Recognize that very few individuals rely on a single authority for knowledge and expertise. Vaccine hesitancy, for example, has been successfully <a href="https://www.ama-assn.org/delivering-care/public-health/time-doctors-take-center-stage-covid-19-vaccine-push">countered by doctors</a> who persuasively contradict erroneous beliefs, as well as by friends who explain why they <a href="https://addisonindependent.com/joanna-colwell-i-didnt-vaccinate-my-child-and-then-i-did-0">changed their own minds</a>. <a href="https://www.churchleadership.com/leading-ideas/5-ways-churches-can-play-a-critical-role-in-vaccination-efforts/">Clergy can step forward</a>, for example, and some have offered places of worship as vaccination hubs.</p>
<h2>Challenge #4: Motivated reasoning</h2>
<p>You might not think that how you interpret a simple graph could depend on your political views. But when people were asked to look at the same charts depicting either housing costs or the rise in carbon dioxide in the atmosphere over time, interpretations differed by political affiliation. Conservatives were more likely than progressives to <a href="https://apadiv15.org/wp-content/uploads/2020/08/APA-2020-Hockey-Stick-1.pdf">misinterpret the graph</a> when it depicted a rise in CO2, but not when it displayed housing costs. When people reason not just by examining facts, but with an unconscious bias toward a preferred conclusion, <a href="https://www.discovermagazine.com/the-sciences/what-is-motivated-reasoning-how-does-it-work-dan-kahan-answers">their reasoning will be flawed</a>.</p>
<p>Action #4: Maybe you think that eating food from genetically modified organisms is harmful to your health, but have you really examined the evidence? Look at articles with both pro and con information, evaluate the source of that information, and be open to the evidence leaning one way or the other. If you give yourself the time to think and reason, you can short-circuit your own motivated reasoning and open your mind to new information.</p>
<h2>Challenge #5: Emotions and attitudes</h2>
<p>When Pluto got <a href="https://theconversation.com/nasa-missions-may-re-elevate-pluto-and-ceres-from-dwarf-planets-to-full-on-planet-status-36081">demoted to a dwarf planet</a>, many children and some adults responded with anger and opposition. Emotions and attitudes are linked. Reactions to hearing that humans influence the climate can range from anger (if you do not believe it) to frustration (if you are concerned you may need to change your lifestyle) to anxiety and hopelessness (if you accept it is happening but think it’s too late to fix things). How you feel about climate mitigation or GMO labeling aligns with whether you are for or against these policies.</p>
<p>Action #5: Recognize the role of emotions in decision-making about science. If you react strongly to a story about stem cells used to develop Parkinson’s treatments, ask yourself if you are overly hopeful because you have a relative in early stages of the disease. Or are you rejecting a possibly lifesaving treatment because of your emotions?</p>
<p>Feelings shouldn’t (and can’t) be put in a box separate from how you think about science. Rather, it’s important to understand and recognize that emotions are <a href="https://wwnorton.com/books/9780393709810">fully integrated ways of thinking and learning</a> about science. Ask yourself if your attitude toward a science topic is based on your emotions and, if so, give yourself some time to think and reason as well as feel about the issue. </p>
<p>Everyone can be susceptible to these five psychological challenges that can lead to science denial, doubt and resistance. Being aware of these challenges is the first step toward taking action to meet them.</p>
<p class="fine-print"><em><span>Barbara K. Hofer has received research funding from the National Science Foundation and Vermont EPSCOR. </span></em></p><p class="fine-print"><em><span>Gale Sinatra has received funding from the National Science Foundation (NSF), Social Sciences and Humanities Research Council (SSHRC) of Canada, Mattel Children's Foundation. </span></em></p>Science denial is not new, but researchers have learned a lot about it. Here’s why it exists, how everyone is susceptible to it in one way or another and steps to take to overcome it.Barbara K. Hofer, Professor of Psychology Emerita, MiddleburyGale Sinatra, Professor of Education and Psychology, University of Southern CaliforniaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1570992021-03-18T12:19:39Z2021-03-18T12:19:39Z7 ways to avoid becoming a misinformation superspreader<figure><img src="https://images.theconversation.com/files/389858/original/file-20210316-16-1ifjiq8.jpg?ixlib=rb-1.1.0&rect=14%2C14%2C4778%2C3671&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Identify and stop the lies.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/illustration/big-hand-with-cartoon-character-stop-sign-royalty-free-illustration/1292878719">NLshop/iStock via Getty Images Plus</a></span></figcaption></figure><p>The problem of misinformation isn’t going away. Internet platforms like Facebook and Twitter have <a href="https://www.reuters.com/article/us-usa-election-twitter-idUSKCN2590FU">taken some steps to curb its spread</a> and say they are working on doing more. But no method yet introduced has been completely successful at removing all misleading content from social media. The best defense, then, is self-defense. </p>
<p>Misleading or outright false information – broadly called “misinformation” – can come from websites pretending to be news outlets, political propaganda or “<a href="http://source.sheridancollege.ca/fhass_huma_publ/1">pseudo-profound</a>” reports that seem meaningful but are not. Disinformation is a type of misinformation that is deliberately generated to mislead people. It is shared intentionally, in the knowledge that it is false, whereas misinformation can be <a href="https://doi.org/10.1371/journal.pone.0239666">shared by people who don’t know it’s not true</a>, especially because people often share links online <a href="https://www.wired.com/story/dont-want-to-fall-for-fake-news-dont-be-lazy/">without thinking</a>.</p>
<p>Emerging psychology research has revealed some tactics that can help protect our society from misinformation. Here are seven strategies you can use to avoid being misled, and to prevent yourself – and others – from spreading inaccuracies.</p>
<h2>1. Educate yourself</h2>
<p>The best inoculation against what the World Health Organization is calling the “<a href="https://www.who.int/news-room/feature-stories/detail/immunizing-the-public-against-misinformation">infodemic</a>” is to understand the <a href="https://theconversation.com/10-ways-to-spot-online-misinformation-132246">tricks that agents of disinformation are using</a> to try to manipulate you.</p>
<p>One strategy is called “<a href="https://www.spsp.org/news-center/blog/roozenbeek-van-der-linden-resisting-digital-misinformation">prebunking</a>” – a type of debunking that happens before you hear myths and lies. Research has shown that <a href="https://doi.org/10.5334/joc.91">familiarizing yourself with the tricks of the disinformation trade</a> can help you <a href="https://doi.apa.org/doi/10.1037/xap0000315">recognize false stories</a> when you encounter them, making you less susceptible to those tricks.</p>
<p>Researchers at the University of Cambridge have developed an online game called “<a href="https://www.getbadnews.com/">Bad News</a>,” which their studies have shown can <a href="https://doi.org/10.1057/s41599-019-027">improve players’ identification of falsehoods</a>.</p>
<p>In addition to the game, you can also learn more about how <a href="https://doi.org/10.1073/pnas.1920498117">internet and social media platforms work</a>, so you better understand the tools available to people seeking to manipulate you. You can also learn more about <a href="https://doi.org/10.1002/sce.21581">scientific research and standards of evidence</a>, which can help you be <a href="https://doi.org/10.1098/rsos.201199">less susceptible to lies and misleading statements</a> about health-related and scientific topics. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="Badges identify ways misinformation exploits people's minds" src="https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=504&fit=crop&dpr=1 600w, https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=504&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=504&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=633&fit=crop&dpr=1 754w, https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=633&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/389863/original/file-20210316-13-j493c8.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=633&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Playing the ‘Bad News’ online game illustrates different ways information warriors can prey on people’s psychological vulnerabilities.</span>
<span class="attribution"><a class="source" href="https://www.getbadnews.com/">Screenshot of Get Bad News</a></span>
</figcaption>
</figure>
<h2>2. Recognize your vulnerabilities</h2>
<p>The prebunking approach works for people across the political spectrum, but it turns out that people who underestimate their biases are actually more vulnerable to being misled than people who acknowledge their biases. </p>
<p>Research has found people are more <a href="https://www.scientificamerican.com/article/biases-make-people-vulnerable-to-misinformation-spread-by-social-media/">susceptible to misinformation</a> that aligns with their preexisting views. This is called “<a href="https://www.usatoday.com/story/money/columnist/2018/05/15/fake-news-social-media-confirmation-bias-echo-chambers/533857002/">confirmation bias</a>,” because a person is biased toward believing information that confirms what they already believe.</p>
<p>The lesson is to be particularly critical of information from groups or people with whom you agree or find yourself aligned – whether politically, religiously, or by ethnicity or nationality. Remind yourself to <a href="https://doi.org/10.3389/fdata.2019.00011">look for other points of view</a>, and other sources with information on the same topic. </p>
<p>It is especially important to be honest with yourself about <a href="https://www.allsides.com/rate-your-bias">what your biases are</a>. Many people assume others are biased, but <a href="https://www.cmu.edu/news/stories/archives/2015/june/bias-blind-spot.html">believe they themselves are not</a> – and imagine that <a href="https://onlinelibrary.wiley.com/doi/abs/10.1002/poi3.214">others are more likely to share misinformation</a> than they themselves are.</p>
<h2>3. Consider the source</h2>
<p>Media outlets have a range of biases. The <a href="https://www.adfontesmedia.com/">Media Bias Chart</a> describes which outlets are <a href="https://observer.com/2018/06/media-bias-can-readers-trust-media-pew-research-center-knight-foundation/">most and least partisan</a> as well as how reliable they are at <a href="https://www.poynter.org/fact-checking/media-literacy/2021/should-you-trust-media-bias-charts/">reporting facts</a>.</p>
<p>You can play an online game called “<a href="https://fakey.osome.iu.edu/">Fakey</a>” to see how susceptible you are to different ways news is presented online.</p>
<p>When consuming news, make sure you know how trustworthy the source is – or whether it’s <a href="https://www.cjr.org/fake-beta">not trustworthy at all</a>. Double-check stories against other sources that have low bias and high fact-reporting ratings to find out who – and what – you can actually trust, rather than just <a href="https://doi.org/10.1111/pops.12586">what your gut tells you</a>. </p>
<p>Also, be aware that some disinformation agents <a href="https://www.forbes.com/sites/christopherelliott/2019/02/21/these-are-the-real-fake-news-sites/">make fake sites</a> that look like real news sources – so make sure you’re conscious of which site you are actually visiting. Engaging in this level of <a href="http://dx.doi.org/10.1073/pnas.1806781116">thinking about your own thinking</a> has been shown to improve your ability to tell fact from fiction.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A man leans back from his desk" src="https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/389867/original/file-20210316-17-1xajml4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Take a moment to think before you decide to share something online.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/man-in-office-royalty-free-image/641199968">10'000 Hours/Digital Vision via Getty Images</a></span>
</figcaption>
</figure>
<h2>4. Take a pause</h2>
<p>When most people go online, especially on social media, they’re there for <a href="https://www.searchenginejournal.com/seo-101/why-do-people-visit-websites-today/">entertainment, connection or even distraction</a>. Accuracy isn’t always high on the priority list. Yet <a href="https://doi.org/10.1177/1461444820969893">few people want to be liars</a>, and the <a href="https://www.hbo.com/documentaries/after-truth-disinformation-and-the-cost-of-fake-news">costs of sharing misinformation</a> can be high – to individuals, their relationships and society as a whole. Before you decide to share something, take a moment to remind yourself of the <a href="https://doi.org/10.1177%2F0956797620939054">value you place on truth and accuracy</a>. </p>
<p>Thinking “is what I am sharing true?” can help you stop the spread of misinformation and will encourage you to <a href="https://www.patheos.com/blogs/nosacredcows/2018/09/study-confirms-most-people-share-articles-based-only-on-headlines/">look beyond the headline</a> and potentially fact-check before sharing. </p>
<p>Even if you don’t think specifically about accuracy, <a href="http://dx.doi.org/10.1037/xge0000729">just taking a pause before sharing</a> can give you a chance for your mind to catch up with your emotions. Ask yourself whether you really want to share it, and if so, <a href="https://doi.org/10.37016/mr-2020-009">why</a>. Think about what the potential consequences of sharing it might be. </p>
<p>Research shows that most misinformation is shared quickly and <a href="https://doi.org/10.1016/j.cognition.2018.06.011">without much thought</a>. The impulse to share without thinking can <a href="https://www.apa.org/news/apa/2020/02/fake-news">even be more powerful</a> than partisan sharing tendencies. Take your time. There is no hurry. You are not a <a href="https://www.niemanlab.org/2013/11/sharing-fast-and-slow-the-psychological-connection-between-how-we-think-and-how-we-spread-news-on-social-media/">breaking-news</a> organization upon whom thousands depend for immediate information. </p>
<h2>5. Be aware of your emotions</h2>
<p>People often share things because of their gut reactions, rather than the conclusions of critical thinking. In a <a href="https://www.spsp.org/news-center/blog/martel-emotion-misinformation-social-media">recent study</a>, researchers found that people who viewed their social media feed while in an emotional mindset were <a href="https://doi.org/10.1186/s41235-020-00252-3">significantly more likely to share misinformation</a> than those who went in with a more rational state of mind. </p>
<p><a href="https://doi.org/10.1111/jcom.12164">Anger and anxiety</a>, in particular, make people more vulnerable to falling for misinformation.</p>
<h2>6. If you see something, say something</h2>
<p>Stand up to misinformation publicly. It may feel uncomfortable to challenge your friends online, especially if you fear conflict. The person to whom you respond with a link to a <a href="https://snopes.com">Snopes post</a> or other fact-checking site may not appreciate being called out. </p>
<p>But evidence shows that <a href="https://doi.org/10.1037/xge0000635">explicitly critiquing the specific reasoning</a> in the post and <a href="http://dx.doi.org/10.1080/1369118X.2017.1313883">providing counterevidence like a link</a> about how it is fake is <a href="https://doi.org/10.1080/10810730.2020.1838671">an effective technique</a>.</p>
<p>Even <a href="https://doi.org/10.1111/bjop.12383">short-format refutations</a> – like “this isn’t true” – are more effective than saying nothing. <a href="https://doi.org/10.1177%2F1077699017710453">Humor – though not ridicule of the person</a> – can work, too. When <a href="http://dx.doi.org/10.1016/j.chb.2019.03.032">actual people correct misinformation online</a>, it can be <a href="http://dx.doi.org/10.1080/10410236.2017.1331312">as effective</a>, if not <a href="https://doi.org/10.1080/10410236.2020.1794553">more so</a>, as when a social media company labels something as questionable. </p>
<p>People <a href="https://doi.org/10.1177%2F2056305120935102">trust other humans</a> more than algorithms and bots, especially those in our own social circles. That’s particularly true if you have <a href="https://doi.org/10.1177%2F1075547017731776">expertise in the subject</a> or are a <a href="http://dx.doi.org/10.1080/10584609.2017.1334018">close connection</a> with the person who shared it. </p>
<p>An additional benefit is that public debunking notifies other viewers that they may want to look more closely before choosing to share it themselves. So even if you don’t discourage the original poster, you are discouraging others.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="A child raises a finger" src="https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/389871/original/file-20210316-22-1ehva9y.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Even kids know to speak up when they see something wrong.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/latinx-toddler-points-his-index-finger-while-royalty-free-image/1198721866">Mireya Acierto/DigitalVision via Getty Images</a></span>
</figcaption>
</figure>
<h2>7. If you see someone else stand up, stand with them</h2>
<p>If you see someone else has posted that a story is false, don’t say “well, they beat me to it so I don’t need to.” When more people chime in on a post as being false, it signals that sharing misinformation is <a href="https://doi.org/10.1016/j.chb.2019.03.032">frowned upon by the group more generally</a>.</p>
<p>Stand with those who stand up. If you don’t and something gets shared over and over, that <a href="https://www.sciencedaily.com/releases/2019/12/191203094813.htm">reinforces people’s beliefs that it is OK</a> to share misinformation – because everyone else is doing it, and only a few, if any, are objecting.</p>
<p>Allowing misinformation to spread also makes it more likely that even more people will start to believe it – because people come to <a href="http://dx.doi.org/10.1037/xge0000098">believe things they hear repeatedly</a>, even if they know at first <a href="https://theconversation.com/unbelievable-news-read-it-again-and-you-might-think-its-true-69602">they’re not true</a>.</p>
<p>There is no perfect solution. Some misinformation is <a href="https://doi.org/10.1080/03637751.2018.1467564">harder to counter than others</a>, and some countering tactics are more effective at different times or for different people. But you can go a long way toward protecting yourself and those in your social networks from confusion, deception and falsehood.</p>
<p class="fine-print"><em><span>H. Colleen Sinclair receives funding from the Department of Defense.</span></em></p>A social psychologist explains how to avoid being misled, and how to prevent yourself – and others – from spreading inaccurate information.H. Colleen Sinclair, Associate Professor of Social Psychology, Mississippi State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1537082021-03-15T12:56:48Z2021-03-15T12:56:48Z6 tips to help you detect fake science news<figure><img src="https://images.theconversation.com/files/389103/original/file-20210311-20-90hym5.jpg?ixlib=rb-1.1.0&rect=781%2C889%2C4508%2C3098&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">If what you're reading seems too good to be true, it just might be.</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/dhCGbPx8wpk">Mark Hang Fung So/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>I’m a professor of chemistry, have a Ph.D. and <a href="https://scholar.google.com/citations?user=RpiSPiwAAAAJ&hl=en&oi=ao">conduct my own scientific research</a>, yet when consuming media, even I frequently need to ask myself: “Is this science or is it fiction?”</p>
<p>There are plenty of reasons a science story might not be sound. Quacks and charlatans take advantage of the complexity of science, some content providers can’t tell bad science from good and some politicians peddle fake science to support their positions.</p>
<p>If the science sounds too good to be true or too wacky to be real, or very conveniently supports a contentious cause, then you might want to check its veracity.</p>
<p>Here are six tips to help you detect fake science.</p>
<h2>Tip 1: Seek the peer review seal of approval</h2>
<p>Scientists rely on journal papers to share their scientific results. They let the world see what research has been done, and how.</p>
<p>Once researchers are confident of their results, they write up a manuscript and send it to a journal. Editors forward the submitted manuscripts to at least two external referees who have expertise in the topic. These reviewers can suggest the manuscript be rejected, published as is, or sent back to the scientists for more experiments. That process is called “peer review.”</p>
<p>Research published in <a href="https://undsci.berkeley.edu/article/howscienceworks_16">peer-reviewed journals</a> has undergone rigorous quality control by experts. Each year, about <a href="https://www.stm-assoc.org/2012_12_11_STM_Report_2012.pdf">28,000 peer-reviewed journals</a> publish roughly 1.8 million scientific papers. The body of scientific knowledge is constantly evolving and updating, but you can trust that the science these journals describe is sound. Retraction policies help correct the record if mistakes are discovered post-publication.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="man in white coat in lab at laptop" src="https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/389321/original/file-20210312-15-1iumcql.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">‘Peer-reviewed’ means other scientific experts have checked the study over for any problems before publication.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/scientist-using-computer-in-laboratory-royalty-free-image/1194829395">ljubaphoto/E+ via Getty Images</a></span>
</figcaption>
</figure>
<p>Peer review takes months. To get the word out faster, scientists sometimes post research papers on what’s called a preprint server. These often have “RXiv” – pronounced “archive” – in their name: MedRXiv, BioRXiv and so on. These articles have not been peer-reviewed and so are <a href="https://doi.org/10.1080/10410236.2020.1864892">not validated by other scientists</a>. Preprints provide an opportunity for other scientists to evaluate and use the research as building blocks in their own work sooner.</p>
<p>How long has this work been on the preprint server? If it’s been months and it hasn’t yet been published in the peer-reviewed literature, be very skeptical. Are the scientists who submitted the preprint from a reputable institution? During the COVID-19 crisis, with researchers scrambling to understand a dangerous new virus and rushing to develop lifesaving treatments, preprint servers have been littered with immature and unproven science. <a href="https://arstechnica.com/science/2020/05/a-lot-of-covid-19-papers-havent-been-peer-reviewed-reader-beware/">Fastidious research standards have been sacrificed for speed</a>.</p>
<p>A last warning: Be on the alert for research published in what are called <a href="https://www.nature.com/articles/d41586-019-03759-y">predatory journals</a>. They don’t peer-review manuscripts, and they charge authors a fee to publish. Papers from any of the <a href="https://guides.library.yale.edu/c.php?g=296124&p=1973764">thousands of known predatory journals</a> should be treated with strong skepticism.</p>
<h2>Tip 2: Look for your own blind spots</h2>
<p>Beware of biases in your own thinking that might predispose you to fall for a particular piece of fake science news.</p>
<p>People give their own memories and experiences more credence than they deserve, making it hard to accept new ideas and theories. Psychologists call this quirk the availability bias. It’s a useful built-in shortcut when you need to make quick decisions and don’t have time to critically analyze lots of data, but it messes with your fact-checking skills.</p>
<p>In the fight for attention, sensational statements beat out unexciting, but more probable, facts. The tendency to overestimate the likelihood of vivid occurrences is called the salience bias. It leads people to mistakenly believe overhyped findings and trust confident politicians in place of cautious scientists.</p>
<p>A confirmation bias can be at work as well. People tend to give credence to news that fits their existing beliefs. This tendency helps climate change denialists and anti-vaccine advocates believe in their causes in spite of the scientific consensus against them.</p>
<p>Purveyors of fake news know the weaknesses of human minds and try to take advantage of these natural biases. <a href="https://www.huffpost.com/entry/how-to-overcome-cognitive-bias-and-use-it-to-your-advantage_b_5900fff3e4b00acb75f1844f">Training can help you</a> <a href="https://hbr.org/2015/05/outsmart-your-own-biases">recognize and overcome</a> your own cognitive biases.</p>
<h2>Tip 3: Correlation is not causation</h2>
<p>Just because you can see a relationship between two things doesn’t necessarily mean that one causes the other.</p>
<p>Even if surveys find that people who live longer drink more red wine, it doesn’t mean a daily glug will extend your life span. It could just be that red-wine drinkers are wealthier and have better health care, for instance. Look out for this error in nutrition news.</p>
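The wine example can be made concrete with a small simulation (an illustration, not from the article): a hypothetical confounder – wealth – independently drives both wine consumption and lifespan, so the two correlate strongly even though wine has no causal effect at all.

```python
import random

random.seed(42)

# Hypothetical model: wealth (the confounder) drives both variables;
# wine consumption has NO causal effect on lifespan here.
n = 10_000
wealth = [random.gauss(0, 1) for _ in range(n)]
wine = [w + random.gauss(0, 1) for w in wealth]               # wealthier -> more wine
lifespan = [75 + 3 * w + random.gauss(0, 2) for w in wealth]  # wealthier -> longer life

def corr(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A strong correlation appears despite zero causation.
print(f"corr(wine, lifespan) = {corr(wine, lifespan):.2f}")
```

Controlling for wealth (e.g., comparing wine drinkers and non-drinkers at the same wealth level) would make the apparent wine-lifespan link vanish – which is exactly what a correlation alone cannot tell you.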
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="gloved hand holds a mouse" src="https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/389322/original/file-20210312-20-1s3fgpp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">What works well in rodents might not work at all in you.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/face-of-tiny-white-mouse-peeps-out-royalty-free-image/157440932">sidsnapper/E+ via Getty Images</a></span>
</figcaption>
</figure>
<h2>Tip 4: Who were the study’s subjects?</h2>
<p>If a study used human subjects, check to see whether it was placebo-controlled. That means some participants are randomly assigned to get the treatment – like a new vaccine – and others get a fake version that they believe is real, the placebo. That way researchers can tell whether any effect they see is from the drug being tested. </p>
<p>The best trials are also double blind: To remove any bias or preconceived ideas, neither the researchers nor the volunteers know who is getting the active medication or the placebo.</p>
<p>The size of the trial is important too. When more patients are enrolled, researchers can identify safety issues and beneficial effects sooner, and any differences between subgroups are more obvious. Clinical trials can enroll thousands of subjects, but some scientific studies involving people are much smaller; those should explain how they achieved the statistical confidence they claim.</p>
<p>Check that any health research was actually done on people. Just because a certain drug works <a href="https://twitter.com/justsaysinmice">in rats or mice</a> does not mean it will work for you.</p>
<h2>Tip 5: Science doesn’t need ‘sides’</h2>
<p>Although a political debate requires two opposing sides, a scientific consensus does not. When the media interpret objectivity to mean equal time, it undermines science. </p>
<h2>Tip 6: Clear, honest reporting might not be the goal</h2>
<p>To get their audience’s attention, morning shows and talk shows need something exciting and new; accuracy may be less of a priority. Many science journalists are doing their best to accurately cover new research and discoveries, but plenty of science media are better classified as entertaining rather than educational. <a href="https://www.bmj.com/content/349/bmj.g7346">Dr. Oz</a>, Dr. Phil and Dr. Drew should not be your go-to medical sources. </p>
<p>Beware of medical products and procedures that sound too good to be true. Be skeptical of testimonials. Think about the key players’ motivations and who stands to make a buck.</p>
<p>If you’re still suspicious of something in the media, make sure the news being reported reflects what the research actually found by <a href="https://www.sciencemag.org/careers/2016/03/how-seriously-read-scientific-paper">reading the journal article itself</a>.</p>
<p>[<em>Deep knowledge, daily.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=deepknowledge">Sign up for The Conversation’s newsletter</a>.]</p>
<p class="fine-print"><em><span>Marc Zimmer does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Whenever you hear about a new bit of science news, these suggestions will help you assess whether it’s more fact or fiction.Marc Zimmer, Professor of Chemistry, Connecticut CollegeLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1505092020-12-01T13:25:17Z2020-12-01T13:25:17ZYour brain’s built-in biases insulate your beliefs from contradictory facts<figure><img src="https://images.theconversation.com/files/372118/original/file-20201130-21-q7ey1o.jpg?ixlib=rb-1.1.0&rect=126%2C183%2C7539%2C4884&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">These psychological tendencies explain why an onslaught of facts won't necessarily change anyone's mind.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/conflict-royalty-free-image/1061219956">Francesco Carta fotografo/Moment via Getty Images</a></span></figcaption></figure><p>A rumor started circulating back in 2008 that Barack Obama was not born in the United States. At the time, I was serving as chair of the Hawaii Board of Health. The director and deputy director of health, both appointed by a Republican governor, <a href="https://www.nbcnews.com/id/wbna42519951">inspected Obama’s birth certificate</a> in the state records and certified that it was real.</p>
<p>I would have thought that this evidence would settle the matter, but it didn’t. Many people thought the birth certificate was a fabricated document. Today, many <a href="https://www.theatlantic.com/ideas/archive/2020/05/birtherism-and-trump/610978/">people still believe</a> that President Obama was not born in the U.S.</p>
<p>I once listened to a “Science Friday” <a href="https://www.npr.org/2011/01/07/132740175/paul-offit-on-the-anti-vaccine-movement">podcast on the anti-vaccination movement</a>. A woman called in who didn’t believe that vaccines were safe, despite <a href="https://www.vaccines.gov/basics/safety">overwhelming scientific evidence that they are</a>. The host asked her how much proof she would need in order to believe that vaccines were safe. Her answer: No amount of scientific evidence could change her mind.</p>
<p><a href="https://scholar.google.com/citations?user=87v4Nk4AAAAJ&hl=en&oi=ao">As a psychologist</a>, I was bothered, but not shocked, by this exchange. There are several well-known mechanisms in human psychology that enable people to continue to hold tight to beliefs even in the face of contradictory information.</p>
<h2>Cognitive shortcuts come with biases</h2>
<p>In its early days, the science of psychology assumed that people would make rational decisions. But over the decades, it’s become clear that many decisions people make – about choices ranging from romantic partners and finances to <a href="https://doi.org/10.1016/j.dr.2008.01.002">risky health behaviors</a> like unsafe sex, as well as <a href="https://doi.org/10.2105/AJPH.2008.155382">health-promoting behaviors</a> – are not made rationally.</p>
<p>Instead, human minds have a tendency toward several <a href="https://www.theatlantic.com/magazine/archive/2018/09/cognitive-bias/565775/">cognitive biases</a>. These are systematic errors in the way you think about the world. Given the complexity of the world around you, your brain cuts a few corners to help you process complex information quickly.</p>
<p>For example, the availability bias refers to the tendency to use information you can quickly recall. This is helpful when you’re ordering ice cream at a place with 50 flavors; you don’t need to think about all of them, just one you recently tried and liked. Unfortunately, these shortcuts can mean you end up at a nonrational decision.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="looking at camera man holds up a finger" src="https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In efficiency mode, your mind may discount contradictory information.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/young-businessman-rejecting-your-offer-royalty-free-image/1165905568">DjelicS/E+ via Getty Images</a></span>
</figcaption>
</figure>
<p>One form of cognitive bias is called <a href="https://www.sup.org/books/title/?id=3850">cognitive dissonance</a>. This is the feeling of discomfort you can experience when your beliefs are not in line with your actions or new information. When in this state, people can reduce their dissonance in one of two ways: changing their beliefs to be in line with the new information or interpreting the new information in a way that justifies their original beliefs. In many cases, people choose the latter, whether consciously or not.</p>
<p>For example, maybe you think of yourself as active, not at all a couch potato – but you spend all of Saturday lying on the couch bingeing reality TV. You can either start thinking about yourself in a new way or justify your behavior, maybe by saying you had a really busy week and need to rest up for your workout tomorrow.</p>
<p>The <a href="https://doi.org/10.1002/asi.23274">confirmation bias</a> is another process that helps you justify your beliefs. It involves favoring information that supports your beliefs and downplaying or ignoring information to the contrary. Some researchers have called this “<a href="https://www.hup.harvard.edu/catalog.php?isbn=9780674237827">my side blindness</a>” – people see the flaws in arguments that are contradictory to their own but are unable to see weaknesses in their own side. Picture fans of a football team that went 7-9 for the season, arguing that their team is actually really strong, spotting failings in other teams but not in theirs.</p>
<p>With the decline of mass media over the past few decades and the increase in niche media and social media, it’s become easier to <a href="https://theconversation.com/misinformation-and-biases-infect-social-media-both-intentionally-and-accidentally-97148">surround yourself with messages you already agree with</a> while minimizing your exposure to messages you don’t. These information bubbles reduce cognitive dissonance but also make it harder to change your mind when you are wrong.</p>
<h2>Shoring up beliefs about yourself</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="woman seething behind the wheel of a car" src="https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=416&fit=crop&dpr=1 600w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=416&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=416&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=522&fit=crop&dpr=1 754w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=522&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=522&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">I’m nice, so this confrontation must be their fault.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/road-rage-royalty-free-image/1070981954">Petri Oeschger/Moment via Getty Images</a></span>
</figcaption>
</figure>
<p>It can be especially hard to change certain beliefs that are central to your <a href="https://doi.org/10.1016/S0065-2601(08)60075-1">self-concept</a> – that is, who you think you are. For example, if you believe you’re a kind person and you cut someone off in traffic, instead of thinking that maybe you’re not all that nice, it’s easier to think the other person was driving like a jerk.</p>
<p>This relationship between beliefs and self-concept can be reinforced by affiliations with groups like political parties, cults or other like-minded thinkers. These groups are often belief bubbles where the majority of members believe the same thing and repeat these beliefs to one another, strengthening the idea that their beliefs are right.</p>
<p>Researchers have found that people generally think they are <a href="https://www.vox.com/science-and-health/2019/1/31/18200497/dunning-kruger-effect-explained-trump">more knowledgeable</a> about certain issues than they really are. This has been demonstrated across a variety of studies looking at vaccinations, Russia’s invasion of Ukraine and <a href="https://www.penguinrandomhouse.com/books/533524/the-knowledge-illusion-by-steven-sloman-and-philip-fernbach/">even how toilets work</a>. These ideas then get passed from person to person without being based on fact. For example, <a href="https://www.politico.com/news/2020/11/09/republicans-free-fair-elections-435488">70% of Republicans</a> say they don’t believe the 2020 presidential election was free and fair despite a lack of any evidence of widespread voter fraud.</p>
<p>[<em>The Conversation’s science, health and technology editors pick their favorite stories.</em> <a href="https://theconversation.com/us/newsletters/science-editors-picks-71/?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=science-favorite">Weekly on Wednesdays</a>.]</p>
<p>Belief bubbles and the defenses against cognitive dissonance can be hard to break down. And they can have important downstream effects. For instance, these psychological mechanisms affect the ways people have chosen whether or not to follow public health guidelines around social distancing and wearing masks during the COVID-19 pandemic, sometimes with <a href="https://www.theatlantic.com/ideas/archive/2020/07/role-cognitive-dissonance-pandemic/614074/">deadly consequences</a>.</p>
<p>Changing people’s minds is difficult. Given the confirmation bias, evidence-based arguments counter to what someone already believes are likely to be discounted. The best way to change a mind is to start with yourself. With as open a mind as you can summon, think about why you believe what you do. Do you really understand the issue? Could you think about it in a different way?</p>
<p>As a professor, I like to have my students debate ideas from the side that they personally disagree with. This tactic tends to lead to deeper understanding of the issues and makes them question their beliefs. Give it an honest try yourself. You might be surprised by where you end up.</p>
<p class="fine-print"><em><span>Jay Maddock does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Cognitive shortcuts help you efficiently move through a complicated world. But they come with an unwelcome side effect: Facts aren’t necessarily enough to change your mind.Jay Maddock, Professor of Public Health, Texas A&M UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1433002020-07-29T10:22:30Z2020-07-29T10:22:30ZCoronavirus shows how to get people to act on climate change – here’s the psychology<figure><img src="https://images.theconversation.com/files/349585/original/file-20200727-15-18o5stn.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Climate campaigner Greta Thunberg.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/greta-thunberg-famous-swedish-climate-activist-1401066575">Daniele COSSU/Shutterstock</a></span></figcaption></figure><p>Climate change and COVID-19 are the two most significant crises faced by the modern world – and widespread behaviour change is essential to cope with both. This means that official messaging by government and other authorities is critical. To succeed, leaders need to communicate the severe threat effectively and elicit high levels of public compliance, without causing undue panic. </p>
<p>But the extent to which people comply depends on their psychological filters when receiving the messages – as the coronavirus pandemic has shown. </p>
<p>With COVID-19, the early messaging attempted to circumscribe the nature of the threat. In March, the <a href="https://www.euro.who.int/en/health-topics/health-emergencies/coronavirus-covid-19/news/news/2020/3/who-announces-covid-19-outbreak-a-pandemic">WHO announced</a> that: “COVID-19 impacts the elderly and those with pre-existing health conditions most severely.” Similar statements <a href="https://publichealthmatters.blog.gov.uk/2020/03/04/coronavirus-covid-19-what-is-social-distancing/">were made by the UK government</a>.</p>
<p>A reasonable interpretation of this would be that the virus does not “affect” young people. But as new clinical data came in, this message was changed to emphasise that the virus could affect people of all ages and <a href="https://news.sky.com/video/coronavirus-virus-does-not-discriminate-gove-11964771">doesn’t discriminate</a>. </p>
<p>But human beings are not necessarily entirely rational in terms of processing information. Experimental psychology has uncovered many situations where our reasoning is, in fact, <a href="https://science.sciencemag.org/content/211/4481/453">limited or biased</a>. </p>
<p>For example, a mental process called the “affect heuristic” allows us to make decisions and solve problems quickly and (often) efficiently, but based on our feelings rather than logic. The bias <a href="http://bear.warrington.ufl.edu/brenner/mar7588/Papers/slovic-affect-heuristic-2002.pdf">has been shown to influence</a> both judgements of risk and behaviour. For COVID-19, the official messaging would have established a less negative reaction in young people compared to older people. This would have made them more likely to take more risks – even when new authoritative data about the actual risks came in. Researchers call this “psychophysical numbing”.</p>
<p>Another mental obstacle is <a href="https://theconversation.com/how-to-check-if-youre-in-a-news-echo-chamber-and-what-to-do-about-it-69999">confirmation bias</a>. This makes us blind to data that disagree with our beliefs and overly attentive to messages that agree with them. It influences (among other things) automatic visual attention to certain aspects of messages. In other words, if you are young, you may, without any conscious awareness, pay little visual attention to the news that the virus is serious for people of all ages.</p>
<p>The initial positive message for young people also created an “<a href="https://www.theguardian.com/science/2012/jan/01/tali-sharot-the-optimism-bias-extract">optimism bias</a>”. This bias is very powerful – we know of various brain mechanisms that can ensure that a positive mood persists. One study found that people tend to have a <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3204264/">reduced level of neural coding</a> of more negative than anticipated information (in comparison with more positive than anticipated information) in a critical region of the prefrontal cortex, which is involved in decision making. This means that we tend to miss the incoming bad news and, even if we don’t, we hardly process it.</p>
<p>All of these biases affect our behaviour, and there is clear evidence that young people were more likely to fail to comply with the government’s directives about COVID-19. A survey conducted on March 30 by polling firm Ipsos MORI found that <a href="https://www.lshtm.ac.uk/newsevents/news/2020/uk-social-interaction-data-help-predict-virus-transmission-and-inform">nearly twice as many</a> 16-24 year-olds had low or limited concern about COVID-19 compared with adults who were 55 or older. The younger group was also four times as likely as older adults to ignore government advice.</p>
<h2>Lessons for climate change</h2>
<p>Our own research has shown that significant cognitive biases also operate with messaging about climate change. One is confirmation bias – those who don’t believe that climate change is a real threat simply don’t take in messages saying that it is.</p>
<figure class="align-center ">
<img alt="Picture of polar bears" src="https://images.theconversation.com/files/349584/original/file-20200727-35-17c66t8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/349584/original/file-20200727-35-17c66t8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=371&fit=crop&dpr=1 600w, https://images.theconversation.com/files/349584/original/file-20200727-35-17c66t8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=371&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/349584/original/file-20200727-35-17c66t8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=371&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/349584/original/file-20200727-35-17c66t8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=466&fit=crop&dpr=1 754w, https://images.theconversation.com/files/349584/original/file-20200727-35-17c66t8.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=466&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/349584/original/file-20200727-35-17c66t8.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=466&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Using polar bears as symbols for climate change can create bias.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/polar-bear-sow-cub-walk-on-703003783">FloridaStock/Shutterstock</a></span>
</figcaption>
</figure>
<p>What’s more, unlike coronavirus messages, most climate change messages inadvertently accentuate what we call “temporal” and “spatial” biases. The UK government campaign “<a href="https://www.campaignlive.co.uk/article/act-co2-voted-best-green-online-campaign/983666">Act on CO2</a>” used images of adults reading bedtime stories to children, which implied that the real threat of climate change will <a href="https://www.researchgate.net/publication/329070757_The_Psychology_of_Climate_Change">present itself in the future</a> – a temporal bias. </p>
<p>Other campaigns have used the perennial polar bear in the associated images, which strengthens spatial bias – polar bears are in a different geographical location (to most of us). These messages therefore allow for a high degree of optimism bias – with people thinking that climate change won’t affect them and their own lives. </p>
<p>Research using eye-tracking to analyse how people process climate change messages <a href="https://research.edgehill.ac.uk/en/publications/staying-over-optimistic-about-the-future-uncovering-attentional-b-2">demonstrates the effects of such biases</a>. For example, optimistic people tend to fix their gaze on the more “positive” aspects of climate change messages (especially any mentions of disputes about the underlying science – there is less to worry about if the science isn’t definitive).</p>
<p>These gaze fixations can also affect what you remember from such messages and how vulnerable they make you feel. If you don’t think that climate change will affect you personally, the affect heuristic will not be guiding you directly to appropriate remedial action.</p>
<p>To make climate change messages more effective, we need to target these cognitive biases. To prevent temporal and spatial biases, for example, we need a clear message as to why climate change is bad for individuals in their own lives in the here and now (establishing an appropriate affect heuristic). </p>
<p>And to prevent optimism bias, we also need to avoid presenting “both sides of the argument” in the messaging – the science tells us that there’s only one side. There also needs to be a clear argument as to why recommended, sustainable behaviours will work (establishing a different sort of confirmation bias).</p>
<p>We also need everyone to get the message, not just some groups – that’s an important lesson from COVID-19. There can be no (apparent) exceptions when it comes to climate change.</p>
<p class="fine-print"><em><span>Geoff Beattie has received funding from the British Academy and Edge Hill University for research on psychological aspects of climate change.
</span></em></p><p class="fine-print"><em><span>Laura McGuire has received funding from Edge Hill University (with Beattie) for research on the development of attitudes to climate change in children. </span></em></p>To fight climate change, we need to take people’s cognitive biases into account.Geoff Beattie, Professor of Psychology, Edge Hill UniversityLaura McGuire, Research Fellow in Education, Edge Hill UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1413352020-06-25T12:18:56Z2020-06-25T12:18:56ZCoronavirus responses highlight how humans are hardwired to dismiss facts that don’t fit their worldview<figure><img src="https://images.theconversation.com/files/343846/original/file-20200624-132961-fwo33u.jpg?ixlib=rb-1.1.0&rect=165%2C285%2C4547%2C3051&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The more politicized an issue, the harder it is for people to absorb contradictory evidence.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/news-photo/flanked-by-white-house-coronavirus-response-coordinator-dr-news-photo/1213154746">Drew Angerer/Getty Images News via Getty Images</a></span></figcaption></figure><p>Bemoaning uneven individual and state compliance with public health recommendations, top U.S. COVID-19 adviser Anthony Fauci <a href="https://www.cnn.com/2020/06/18/politics/anthony-fauci-coronavirus-anti-science-bias/index.html">recently blamed</a> the country’s ineffective pandemic response on an American “anti-science bias.” He called this bias “inconceivable,” because “science is truth.” Fauci compared those discounting the importance of masks and social distancing to “anti-vaxxers” in their “amazing” refusal to listen to science. </p>
<p>It is Fauci’s profession of amazement that amazes me. As well-versed as he is in the science of the coronavirus, he’s overlooking the <a href="https://www.motherjones.com/politics/2011/04/denial-science-chris-mooney/">well-established science</a> of “anti-science bias,” or science denial.</p>
<p>Americans increasingly exist in highly polarized, informationally insulated ideological communities occupying their own <a href="https://www.vox.com/policy-and-politics/2019/11/16/20964281/impeachment-hearings-trump-america-epistemic-crisis">information universes</a>. </p>
<p>Within segments of the political blogosphere, <a href="https://www.desmogblog.com/heartland-institute">global warming</a> is dismissed as either a hoax or so uncertain as to be unworthy of response. Within other geographic or online communities, the science of <a href="https://www.npr.org/tags/399145964/anti-vaccination-movement">vaccine safety</a>, <a href="https://blogs.scientificamerican.com/but-not-simpler/why-portland-is-wrong-about-water-fluoridation/">fluoridated drinking water</a> and <a href="https://www.nationalgeographic.com/culture/food/the-plate/2016/05/17/scientists-say-gmo-foods-are-safe-public-skepticism-remains/">genetically modified foods</a> is distorted or ignored. There is a <a href="https://theconversation.com/coronavirus-new-survey-shows-how-republicans-and-democrats-are-responding-differently-138394">marked gap in expressed concern</a> over the coronavirus depending on political party affiliation, apparently based in part on partisan disagreements over factual issues like the <a href="https://www.pewresearch.org/science/2020/06/03/partisan-differences-over-the-pandemic-response-are-growing/ps_2020-06-03_sci-am-trust_00-3/">effectiveness of social distancing</a> or <a href="https://news.gallup.com/poll/311408/republicans-skeptical-covid-lethality.aspx">the actual COVID-19 death rate</a>.</p>
<p>In theory, resolving factual disputes should be relatively easy: Just present strong evidence, or evidence of a strong expert consensus. This approach succeeds most of the time, when the issue is, say, the atomic weight of hydrogen.</p>
<p>But things don’t work that way when scientific advice presents a picture that threatens someone’s perceived interests or ideological worldview. In practice, it turns out that one’s political, religious or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.</p>
<p>“<a href="https://www.psychologytoday.com/us/basics/motivated-reasoning">Motivated reasoning</a>” is what social scientists call the process of deciding what evidence to accept based on the conclusion one prefers. As I explain in my book, “<a href="https://www.amazon.com/Truth-About-Denial-Self-Deception-Politics/dp/0190062274">The Truth About Denial</a>,” this very human tendency applies to all kinds of facts about the physical world, economic history and current events.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=445&fit=crop&dpr=1 600w, https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=445&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=445&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=559&fit=crop&dpr=1 754w, https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=559&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/312935/original/file-20200130-41527-1q4zuso.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=559&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The same facts will sound different to people depending on what they already believe.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Nightclub-Shooting-Florida/4d33732e41f34ce89a416c03d669a0b0/1/0">AP Photo/John Raoux</a></span>
</figcaption>
</figure>
<h2>Denial doesn’t stem from ignorance</h2>
<p>The interdisciplinary study of this phenomenon has made one thing clear: The failure of various groups to acknowledge the truth about, say, climate change, is <a href="https://theconversation.com/facts-versus-feelings-isnt-the-way-to-think-about-communicating-science-80255">not explained by a lack of information</a> about the scientific consensus on the subject.</p>
<p>Instead, what strongly predicts denial of expertise on many controversial topics is simply one’s political persuasion.</p>
<p>A <a href="https://doi.org/10.1177/0002716214558393">2015 metastudy</a> showed that ideological polarization over the reality of climate change actually increases with respondents’ knowledge of politics, science and/or energy policy. The chances that a conservative is a climate science denier are <a href="http://www.people-press.org/2008/05/08/a-deeper-partisan-divide-over-global-warming/">significantly higher</a> if he or she is college educated. Conservatives scoring highest on tests for <a href="http://dx.doi.org/10.2139/ssrn.2182588">cognitive sophistication</a> or <a href="http://dx.doi.org/10.2139/ssrn.2319992">quantitative reasoning skills</a> are most susceptible to motivated reasoning about climate science. </p>
<p>Denialism is not just a problem for conservatives. Studies have found <a href="https://doi.org/10.1080/13669877.2010.511246">liberals are less likely to accept</a> a hypothetical expert consensus on the possibility of safe storage of nuclear waste, or on the effects of concealed-carry gun laws.</p>
<h2>Denial is natural</h2>
<p>The human talent for rationalization is a product of many hundreds of thousands of years of adaptation. Our ancestors evolved in small groups, where <a href="https://doi.org/10.1017/S0140525X10000968">cooperation and persuasion</a> had at least as much to do with reproductive success as holding accurate factual beliefs about the world. Assimilation into one’s tribe required assimilation into the group’s ideological belief system – regardless of whether it was grounded in science or superstition. An instinctive bias in favor of one’s “<a href="https://www.simplypsychology.org/social-identity-theory.html">in-group</a>” and its worldview is deeply ingrained in human psychology. </p>
<p>A human being’s very sense of self <a href="https://doi.org/10.1080/10463280701592070">is intimately tied up with</a> his or her identity group’s status and beliefs. Unsurprisingly, then, people respond automatically and defensively to information that threatens the worldview of groups with which they identify. We respond with rationalization and selective assessment of evidence – that is, we engage in “<a href="https://www.psychologytoday.com/us/blog/science-choice/201504/what-is-confirmation-bias">confirmation bias</a>,” giving credit to expert testimony we like while finding reasons to reject the rest.</p>
<p>Unwelcome information can also threaten in other ways. “<a href="https://www.apa.org/science/about/psa/2017/06/system-justification">System justification</a>” theorists like psychologist <a href="https://scholar.google.com/citations?user=Zh1vTeMAAAAJ&hl=en&oi=ao">John Jost</a> have shown how situations that represent a perceived threat to established systems trigger inflexible thinking. For example, populations experiencing economic distress or an external threat have often turned to <a href="https://doi.org/10.1037/tps0000122">authoritarian leaders</a> who <a href="https://medium.com/@bardona/varieties-of-bullsh-t-6fd1cfeb102f?source=friends_link&sk=b6096254e8c3873da683a9dbbc165ac1">promise security and stability</a>.</p>
<p>In ideologically charged situations, one’s prejudices end up affecting one’s factual beliefs. Insofar as you define yourself in terms of your <a href="https://doi.org/10.1080/13669877.2010.511246">cultural affiliations</a>, your attachment to the social or economic status quo, or a combination, information that threatens your belief system – say, about the negative effects of industrial production on the environment – can threaten your sense of identity itself. If trusted political leaders or partisan media are telling you that the COVID-19 crisis is overblown, factual information about a scientific consensus to the contrary can feel like a personal attack. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=366&fit=crop&dpr=1 600w, https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=366&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=366&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=460&fit=crop&dpr=1 754w, https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=460&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/312934/original/file-20200130-41490-1fn1e5d.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=460&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Everyone sees the world through one partisan lens or another, based on their identity and beliefs.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/3d-cinema-glasses-isolated-on-white-62373739">Vladyslav Starozhylov/Shutterstock.com</a></span>
</figcaption>
</figure>
<h2>Denial is everywhere</h2>
<p>This kind of affect-laden, motivated thinking explains a wide range of examples of an extreme, evidence-resistant rejection of historical fact and scientific consensus.</p>
<p>Have tax cuts been shown to pay for themselves in terms of economic growth? Do communities with high numbers of immigrants have higher rates of violent crime? Did Russia interfere in the 2016 U.S. presidential election? Predictably, expert opinion regarding such matters is treated by partisan media as though evidence is itself <a href="https://www.realclearpolitics.com/video/2014/04/28/george_will_global_warming_is_socialism_by_the_back_door.html">inherently partisan</a>.</p>
<p>Denialist phenomena are many and varied, but the story behind them is, ultimately, quite simple. Human cognition is inseparable from the unconscious emotional responses that go with it. Under the right conditions, universal human traits like in-group favoritism, existential anxiety and a desire for stability and control combine into a toxic, system-justifying identity politics. </p>
<p>Science denial is notoriously resistant to facts because it isn’t about facts in the first place. Science denial is an expression of identity – usually in the face of perceived threats to the social and economic status quo – and it typically manifests in response to elite messaging.</p>
<p>I’d be very surprised if Anthony Fauci were actually unaware of the significant impact of politics on COVID-19 attitudes, or of what signals are being sent by <a href="https://www.texastribune.org/2020/04/21/texas-dan-patrick-economy-coronavirus/">Republican state government officials’ statements</a>, <a href="https://slate.com/news-and-politics/2020/06/pelosi-enforce-new-mask-rule-congress-republicans-committee-hearings.html">partisan mask refusal in Congress</a>, or the recent <a href="https://theconversation.com/trump-rally-in-tulsa-a-day-after-juneteenth-awakens-memories-of-1921-racist-massacre-140915">Trump rally in Tulsa</a>. Effective science communication is critically important because of the profound effects partisan messaging can have on public attitudes. Vaccination, resource depletion, climate and COVID-19 are life-and-death matters. To successfully tackle them, we must not ignore what the science tells us about science denial. </p>
<p><em>This is an updated version of <a href="https://theconversation.com/humans-are-hardwired-to-dismiss-facts-that-dont-fit-their-worldview-127168">an article originally published</a> on Jan. 31, 2020.</em></p>
<p>[<em>Get our best science, health and technology stories.</em> <a href="https://theconversation.com/us/newsletters/science-editors-picks-71/?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=science-best">Sign up for The Conversation’s science newsletter</a>.]</p><img src="https://counter.theconversation.com/content/141335/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Adrian Bardon received funding from the Humility and Conviction in Public Life project at the University of Connecticut.</span></em></p>Whether in situations relating to scientific consensus, economic history or current political events, denialism has its roots in what psychologists call ‘motivated reasoning.’Adrian Bardon, Professor of Philosophy, Wake Forest UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1381612020-05-18T17:32:35Z2020-05-18T17:32:35ZAfter the crisis: how to avoid (some of) our misleading beliefs<p>Nobel Prize winner Daniel Kahneman and his colleague and friend Amos Tversky formalised the concept of “cognitive bias” in 1972, and considerable research since then has shown that our brain finds it remarkably difficult to make rational decisions. Cognitive biases refer to deviations from a rational treatment of information. They can have dramatically negative consequences in the business, military, political and medical sphere. </p>
<p>A blatant illustration of how dangerous cognitive biases can be is confirmation bias – the tendency we all have to disproportionately seek information that confirms our existing beliefs. This bias actively contributed to the Iranian decision to shoot down flight PS752 on January 8, 2020, <a href="https://www.businessinsider.fr/us/ukraine-iran-crash-panic-bad-training-may-led-missile-attack-2020-1">killing the 176 passengers on board</a>. The media called it “human error” at the time. Confirmation bias was also deemed <a href="https://www.theatlantic.com/international/archive/2012/07/how-confirmation-bias-can-lead-to-war/260347/">partly responsible for the United States’ decision to invade Iraq in 2003</a>.</p>
<p>With the coronavirus crisis, this bias has hit us with all its might. Consider the delays in deciding to close countries’ borders, the tardy decision to start (or restart) mass-producing masks and ventilators, the discovery of Covid-19 in Italy only weeks after it had already begun to kill Italians, the decision each of us makes to go outside because “we need to”, and the remarkably solid belief in some parts of the world that things are “not as bad as they seem”. The consequences are devastating: the (underestimated) death tolls that every affected country continues to report.</p>
<h2>How to mitigate confirmation bias</h2>
<p>While most manifestations of this bias are hard to take control of, this article focuses on two types of decisions that each of us can actively work on to mitigate confirmation bias. The first is the decision to leave your home. Let’s start with a simple, if unrealistic, observation: if we could freeze everyone in the world in place for 15 days, with at least 2 meters between each person, the virus would be eradicated. We are living out an observation that French philosopher, scientist and theologian Blaise Pascal shared more than 300 years ago: <a href="https://www.pri.org/stories/2014-07-19/new-study-found-people-are-terrible-sitting-alone-their-thoughts-how-about-you">“All of humanity’s problems stem from man’s inability to sit quietly in a room alone”</a>. </p>
<p>Of course, we cannot freeze in place for 15 days – there are a number of outings we must make, whether to get food or to help vulnerable others through the confinement. But are these the only reasons we go out? Can’t you further reduce the number of times you go out shopping? Must you go out for exercise? Could you work out at home instead? If you have people confined with you, ask them how necessary your going out truly is, in the light of disconfirming arguments. Every time we choose to go out, it feels like an insignificant micro-decision, but we know the significant impact it can have down the transmission chain. This we all understand rationally – the media have rehashed it endlessly since lockdown started. #StayHome, #IoRestoaCasa, #JeRestealaMaison. To what extent do you rationally apply it to yourself?</p>
<p>A second type of decision that we urgently need to tackle is what we will actually do the “day after”. Here, it is noteworthy that a “return to normal” fantasy is spreading fast. Companies project <a href="https://www.nasdaq.com/articles/mccormick-projects-a-return-to-normal-in-2021-2020-02-04">“a return to normal in 2021”</a>. “Our annual meeting gathering thousands of attendees will take place right after things get back to normal”, an event organiser tells me, inviting me to join him there… </p>
<p>But what does going back to normal even mean, when what awaits is a possibly unprecedented <a href="https://time.com/5827348/great-depression-coronavirus-after/">worldwide recession</a>, directly causing millions to experience financial and material hardship? Should we even wish for things to go back to normal? Plenty of data show that going back to “normal” would be one of the worst possible exits from the crisis that we could envision. The old “normal” was famously problematic for too many reasons: </p>
<ul>
<li><p>10% of the world’s population lived on less than $1.90 a day. This translates into <a href="http://www.nccp.org/topics/childpoverty.html">21% of US children living below the federal poverty threshold</a>; in the UK, the figure is almost a third, while in France it hovers around 20%. These figures will likely rise markedly as a result of the coronavirus crisis. </p></li>
<li><p>33% of the worldwide agricultural production destined for human consumption was wasted, while <a href="https://www.minnpost.com/foreign-concept/2015/06/795-million-people-don-t-have-enough-eat-why-s-actually-good-news/">1 in every 9 people</a> did not eat as much as they needed. </p></li>
<li><p>Between <a href="https://www.europarl.europa.eu/RegData/etudes/STUD/2017/607350/IPOL_STU(2017)607350_EN.pdf">15 and 125 million Europeans suffered from energy poverty</a>, due not only to high energy prices but also to vast energy inefficiency – computers, for example, spend only about 60% of their powered-on time doing work deemed “useful”. </p></li>
<li><p>Air traffic was projected to double between 2019 and 2037, when, in 2018, it already accounted for 5% of greenhouse gas emissions worldwide. In spite of these crystal-clear data and the numerous calls for rationing flights, in the old “normal” every one of us could fly all over the globe <em>ad libitum</em>.</p></li>
</ul>
<h2>Creating sustainable value</h2>
<p>In 1962, the British philosopher John L. Austin reminded us that we “do things with words.” How about we forbid ourselves to talk about going back to normal, and instead rethink our economy in order to create sustainable value for most of us? Time is running out: the virus will disappear, and the old “normal” will quickly spread back into our days, with its hectic pace and its disastrous consequences.</p>
<p>Lockdown time has a unique quality to it, one that fascinated French historian Fernand Braudel, who wrote his masterpiece <em>The Mediterranean</em> from memory while in a German prison between 1940 and 1945. In telegraphic style, he wrote to a friend: “Believe that without captivity, would never have obtained this lucidity. […] captivity […] allows long meditation of a topic.” Since March, half of humanity has been living under a bell jar. The Earth is closed. Maybe this is our chance to truly step off and catch our breath for what’s to come.</p>
<p class="fine-print"><em><span>Anne-Laure Sellier has received funding from the HEC Foundation for her research. She is a lecturer-researcher at HEC Paris.</span></em></p>Involving family and friends in decisions or rethinking the meaning of “getting back to normal” helps protect against cognitive bias and its harmful consequences.Anne-Laure Sellier, Professeur Associé en marketing et membre du groupe de recherche CNRS-GREGHEC, HEC Paris Business SchoolLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1359272020-04-09T02:21:35Z2020-04-09T02:21:35ZThe psychology of lockdown suggests sticking to rules gets harder the longer it continues<figure><img src="https://images.theconversation.com/files/326686/original/file-20200408-42853-12pvtb.jpg?ixlib=rb-1.1.0&rect=87%2C252%2C5717%2C3376&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>The COVID-19 pandemic has forced millions of people to live under strict lockdown conditions, but the psychology of human behaviour predicts they will find it harder to stick to the rules the longer the situation continues. </p>
<p>New Zealand has now reached a midway point of a comprehensive four-week lockdown and there have already been some rule breakers. Most prominent among them was the country’s health minister, David Clark, who almost lost his job this week <a href="https://www.rnz.co.nz/news/political/413617/david-clark-offers-to-resign-after-revealing-he-took-a-trip-to-beach-during-covid-19-lockdown">for flouting lockdown rules</a> by going mountain biking and <a href="https://www.beehive.govt.nz/release/statement-david-clark">driving his family 20km to a beach</a>.</p>
<p>He won’t be the last to break the rules. During a pandemic, fear is one of the central emotional responses and up to this point, most people have complied with lockdown conditions out of fear of becoming infected. But as time passes, people’s resolution may begin to fray.</p>
<h2>Psychology of a pandemic</h2>
<p>A group of more than 40 psychologists are currently <a href="https://psyarxiv.com/y38m9">reviewing research relevant to people’s behaviour during a pandemic</a> to advance the fight against COVID-19. </p>
<p>The psychological factors that motivate us to stay in our bubble are a mix of individual, group and societal considerations.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/coronavirus-and-you-how-your-personality-affects-how-you-cope-and-what-you-can-do-about-it-134037">Coronavirus and you: how your personality affects how you cope and what you can do about it</a>
</strong>
</em>
</p>
<hr>
<p><img src="https://cdn.theconversation.com/static_files/files/993/Covid-19-Bubbles-apartment-gymandpool-02.gif?1586386492" width="100%"></p>
<p>At a very basic level, human behaviour is governed by <a href="https://www.ncbi.nlm.nih.gov/books/NBK92792/">reward principles</a>. </p>
<p>If what we do is followed by a perceived reward, we’re more likely to keep doing it. Not getting sick is a reward, but it may not be perceived as such for much longer as most of us weren’t sick in the first place. </p>
<p>This lack of reward reinforcement could be intensified by an <a href="https://www.vice.com/en_in/article/a3an4a/it-wont-happen-to-me-the-psychology-behind-optimism-bias">optimism bias</a> – “It won’t happen to me” – which may become stronger than our anxiety as time passes and the perceived threat reduces.</p>
<p>Outside of our individual psychology, broader social factors come into play. In times of uncertainty we look to others to guide our own behaviour as they set our social norms. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/facing-the-coronavirus-crisis-together-could-lead-to-positive-psychological-growth-134289">Facing the coronavirus crisis together could lead to positive psychological growth</a>
</strong>
</em>
</p>
<hr>
<p>Often, there is a degree of confusion about guidelines on what people are allowed to do, for example when exercising during lockdown. Seeing others out surfing, mountain biking and picnicking in a park can lead to a mindset of “if they’re doing it, why can’t I?” </p>
<p>To counter this, the government should continue to appeal to our sense of shared identity and highlight examples of punishment for rule breakers. But an over-emphasis on punishment risks people sticking to rules merely for social approval, which means they may conform in public but not in private. Being punished can also build resentment and may lead people to seek out loopholes in the rules.</p>
<h2>Group behaviour</h2>
<p>In order to last the distance at the highest level of lockdown, people need to cooperate as a group. If everyone complies, we’ll all be OK. </p>
<p>The reverse was evident in the early stages of the COVID-19 pandemic with the <a href="https://theconversation.com/a-toilet-paper-run-is-like-a-bank-run-the-economic-fixes-are-about-the-same-133065">panic-induced buying of toilet paper</a>, face masks and other “essentials”. Here we saw decision making based on emotion and the government attempting to counter it with fact-based information. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/psychology-can-explain-why-coronavirus-drives-us-to-panic-buy-it-also-provides-tips-on-how-to-stop-134032">Psychology can explain why coronavirus drives us to panic buy. It also provides tips on how to stop</a>
</strong>
</em>
</p>
<hr>
<p>There is <a href="https://www.nature.com/articles/ncomms10915">evidence</a> that in times of major crises groups may prioritise their local interests, such as keeping your family, neighbourhood or wider community safe. An example of such local activity in New Zealand is the initiative of some iwi (tribal groups) to <a href="https://www.stuff.co.nz/national/health/coronavirus/120470952/sweeping-measures-including-stay-home-edict-for-most-vulnerable-as-covid-cases-rise">set up road blocks around their communities</a> to control access by people who are not local residents. </p>
<p>But this has the potential to spill over into vigilantism if local protection interests combine with fear. It can prioritise the interests of a few over the greater good.</p>
<h2>Cultural factors</h2>
<p>Cultural and political psychology also has an impact on our behaviour during lockdown. Broadly speaking, different <a href="https://www.ncbi.nlm.nih.gov/pubmed/21617077">cultures can be categorised</a> as “tight” or “loose”. </p>
<p>Tight cultures (China, Singapore) tend to be more rule bound and less open but are also associated with more order and self-regulation. In contrast, looser cultures (UK, USA) place more emphasis on individual freedoms and rights, and are correspondingly slow to self-regulate in the face of government requirements.</p>
<p>Australia appears to fall towards the looser end of the spectrum, while New Zealand sits somewhere in the middle. The challenge will be how we respond as our society continues to “tighten” with strict rules while boredom and annoyance set in.</p>
<p>Political polarisation, which has <a href="https://news.gallup.com/opinion/polling-matters/268982/impact-increased-political-polarization.aspx">increased markedly in recent years</a>, may be exacerbated by being physically distant from others. There is a danger that as we stay in our bubbles, both physical and virtual, we fall into “echo chambers” wherein we only hear similar voices and opinions to our own. </p>
<p>If this chamber becomes filled with resentment at ongoing restraints on our freedom, it can break down our motivation to stay home. But polarisation can be overcome by helping people <a href="https://cpb-us-w2.wpmucdn.com/web.sas.upenn.edu/dist/9/244/files/2016/10/JOP_Americans-17e7zuk.pdf">identify with a bigger cause</a> – an approach often invoked during times of war. </p>
<p>New Zealanders will eventually emerge from the level 4 lockdown, but it may be into a brave new world. It’s hard to know what to expect as alerts are relaxed. People will need clear guidelines at each stage and help to adjust to a new normal.</p><img src="https://counter.theconversation.com/content/135927/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Dougal Sutherland works for Victoria University of Wellington and is an Associate of Umbrella Health</span></em></p>Fear is a central emotional response during a pandemic and it’s why most people have complied with lockdown conditions. But as anxiety eases and boredom sets in, people’s resolution may fray.Dougal Sutherland, Clinical Psychologist, Te Herenga Waka — Victoria University of WellingtonLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1126502019-03-27T10:33:02Z2019-03-27T10:33:02ZExtreme weather news may not change climate change skeptics’ minds<figure><img src="https://images.theconversation.com/files/265960/original/file-20190326-36270-1eb53yy.jpg?ixlib=rb-1.1.0&rect=59%2C101%2C1892%2C1245&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How do people respond to media coverage of weather influenced by climate change?</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Psychology-of-Hurricanes/a3b42217902f4d07af6481fff9f87243/6/0">AP Photo/Andy Newman</a></span></figcaption></figure><p>The year 2018 brought <a href="https://reliefweb.int/report/world/counting-cost-year-climate-breakdown">particularly devastating natural disasters</a>, including hurricanes, droughts, floods and fires – just the kinds of extreme weather events scientists predict will be <a href="https://doi.org/10.17226/21852">exacerbated by climate change</a>.</p>
<p>Amid this destruction, some people see an opportunity to finally quash climate change skepticism. After all, it seems hard to deny the realities of climate change – and object to policies fighting it – while its effects visibly wreck communities, maybe even your own.</p>
<p>News outlets have hesitated to connect natural disasters and climate change, though these connections are increasing, thanks to <a href="https://www.nbcnews.com/news/us-news/climate-experts-now-cite-global-warming-during-extreme-weather-disasters-n895976">calls from experts</a> combined with <a href="https://www.nature.com/articles/d41586-018-05849-9">more precise data about the effects of climate change</a>. Media voices like The Guardian <a href="https://www.theguardian.com/commentisfree/2017/dec/07/climate-change-media-coverage-media-matters">advocate for more coverage of the weather events</a> “when people can see and feel climate change.” Harvard’s Nieman Foundation <a href="http://www.niemanlab.org/2019/01/the-year-of-the-climate-reporter/">dubbed 2019</a> “The Year of the Climate Reporter.” Even conservative talk radio host <a href="https://www.newsweek.com/rush-limbaugh-says-hurricane-florence-forecast-trying-heighten-belief-climate-1117416">Rush Limbaugh worried</a> that media predictions about Hurricane Florence were attempts to “heighten belief in climate change.”</p>
<p>But a recent study from Ohio State University <a href="https://scholar.google.com/citations?user=WFIpOQEAAAAJ&hl=en&oi=sra">communications</a> <a href="https://scholar.google.com/citations?user=ffk3cpwAAAAJ&hl=en&oi=sra">scholars</a> found that news stories connecting climate change to natural disasters <a href="https://doi.org/10.1080/17524032.2018.1546202">actually backfire among skeptics</a>. As someone who also studies scientific communication, I find these results fascinating. It’s easy to assume that presenting factual information will automatically change people’s minds, but messages can have complex, frustrating persuasive effects.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/265961/original/file-20190326-36276-1k49f7o.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The flooded Platte River in Nebraska is one example of a recent extreme weather event.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Winter-Weather-Flooding/b30f053a699846399613db1f67cfb427/1/0">AP Photo/Nati Harnik</a></span>
</figcaption>
</figure>
<h2>Investigating how skeptics hear the news</h2>
<p>Social scientists have an unclear understanding of how climate change news affects public opinion, as not enough research has specifically explored that question. To address this gap, researchers from Ohio State recruited 1,504 volunteers. They divided them into groups who read news stories about natural disasters – fires, hurricanes or blizzards – that either emphasized or omitted the role of climate change.</p>
<p>Cleverly, the researchers recruited participants from geographic areas most likely to experience the disasters they read about; for instance, participants in hurricane-prone areas read the news articles about hurricanes. Further, the researchers ran the study in fall 2017, during hurricane and wildfire season, when these sorts of disasters are presumably top of mind.</p>
<p>After reading, participants answered 11 questions meant to measure their resistance to the article, including “Sometimes I wanted to ‘argue back’ against what I read” and “I found myself looking for flaws in the way information was presented.”</p>
<p>It turned out that climate change skeptics – whether politically conservative or liberal – showed more resistance to the stories that mentioned climate change. Climate change themes also made skeptics more likely to downplay the severity of the disasters. Meanwhile, the same articles made people who accept climate change perceive the hazards as more severe.</p>
<p>The study findings suggest that reporting the relationship between climate change and hazardous weather may actually increase the skepticism of skeptics, even in the face of blatant contrary evidence. Psychologists call this the <a href="https://en.wikipedia.org/wiki/Boomerang_effect_(psychology)">boomerang effect</a>, because the message ultimately sends people in the opposite direction.</p>
<h2>Who’s hearing the message matters</h2>
<p>The boomerang effects seen in this latest study are less surprising than you might think. Researchers have tried a variety of strategies, including <a href="https://doi.org/10.1080/17524032.2018.1548369">emphasizing scientific consensus around climate change</a> and describing the <a href="https://doi.org/10.1177/0093650211416646">negative health impacts of climate change</a> on people near and far, only to find that skeptics often end up more entrenched after reading attempts to persuade them. </p>
<p>Messages can work when they use place to increase people’s concern and willingness to act on climate change, but individual studies show inconsistent results. <a href="https://doi.org/10.1016/j.gloenvcha.2019.01.002">One new study</a> gave Bay Area participants maps showing the increased flood risk in their zip code due to projected sea level rise. The maps made no difference in people’s concern about the effects of climate change on future generations, developing countries or the Bay Area. But the maps did make people who accept climate change less concerned that it would personally harm them. These participants may have replaced their abstract, apocalyptic assumptions about climate change threats with the more tangible predictions, causing them to feel less vulnerable.</p>
<p><a href="https://doi.org/10.1175/WCAS-D-16-0119.1">Another study</a>, also involving Californians, generated slightly more success for place-based climate change news, but only among participants who were already <a href="http://climatecommunication.yale.edu/about/projects/global-warmings-six-americas/">concerned about climate change</a>. Study participants read news articles explaining that climate change would increase droughts either globally or in California. The global message made people more likely to want policy changes, while the local messages made people more likely to say they would change their personal behavior.</p>
<p>Place-based appeals often have some <a href="https://doi.org/10.1080/13549839.2017.1385002">positive effect on people’s willingness to act</a> on climate change and environmental issues.</p>
<p>But most studies about local messaging suggest that you cannot persuade everyone with the same message. A complex interplay of factors – including prior beliefs about climate change, political affiliation, place attachment and gender – can all play a role.</p>
<p>And psychologists offer compelling reasons <a href="https://doi.org/10.1038/nclimate2760">why persuasive attempts sometimes backfire</a>. Messages about the local impact of climate change might actually replace people’s abstract, altruistic values with utilitarian concerns. In the case of skeptics resisting news about climate-driven disasters, the researchers from Ohio State suggest that these people are engaged in <a href="https://www.psychologytoday.com/us/basics/motivated-reasoning">motivated reasoning</a>, a cognitive bias where people force new and threatening information to conform to their pre-existing knowledge.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=414&fit=crop&dpr=1 600w, https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=414&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=414&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=520&fit=crop&dpr=1 754w, https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=520&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/265963/original/file-20190326-36260-1fk66ya.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=520&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">More and more info doesn’t necessarily convince.</span>
<span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Southern-Flood-Threat/0e91c7a501174d83816f6742d7e5243e/5/0">AP Photo/Jay Reeves</a></span>
</figcaption>
</figure>
<h2>More news may not convince</h2>
<p>Resistance to news about climate change disasters might be frustrating, but even the media often ignore the role of climate change in disasters, according to an analysis by the nonprofit consumer advocacy organization <a href="https://www.citizen.org/sites/default/files/public-citizen-carbon-omission-extreme-weather-2018.pdf">Public Citizen</a>. They found only 7 percent of American news stories about hurricanes mentioned climate change in 2018. Percentages increase for stories about wildfires (27.8 percent of stories), extreme heat (34 percent of stories) and drought (35 percent of stories). But an overwhelming amount of extreme weather news coverage never mentions climate change.</p>
<p>Some omissions are particularly striking. Liberal research organization <a href="https://www.mediamatters.org/blog/2018/07/12/Major-broadcast-TV-networks-mentioned-climate-change-just-once-during-two-weeks-of-heat-wa/220651">Media Matters</a> found only one mention of climate change in 127 broadcast news stories during two weeks of extreme heat in 2018. Only about 4 percent of stories about Hurricanes Irma and Harvey mentioned climate change, according to an <a href="http://lifescienceglobal.com/pms/index.php/IJCC/article/view/5215">academic analysis</a> that included The Houston Chronicle and the Tampa Bay Times.</p>
<p>Despite these low numbers, U.S. climate change coverage related to extreme weather and disasters actually rose in 2018, according to the report from Public Citizen. This increase aligns with a trend of news slowly improving its climate reporting. For instance, U.S. print media has <a href="https://doi.org/10.1177/0963662515612276">dropped some of the skepticism</a> from its climate change reporting, both in terms of outright skepticism of the basic science and a subtler version that involved creating a false balance by <a href="https://doi.org/10.1016/j.gloenvcha.2016.11.004">including voices which both affirm and deny</a> the reality of climate change. </p>
<p>Even if the media continues to increase and improve its climate change coverage, it might not change skeptics’ minds. Of course, the media has a responsibility to report the news accurately, regardless of how some people process it. But those hoping that climate change news will convert skeptics might end up disappointed. </p>
<p>Given this resistance to news, other approaches, such as <a href="https://doi.org/10.1177/0013916515574085">avoiding fear-inducing and guilt-based messaging</a>, <a href="https://doi.org/10.1177/1075547017715473">creating targeted messages about free-market solutions</a>, or deploying a kind of <a href="http://dx.doi.org/10.1037/a0040437">“jiu jitsu” persuasion</a> that aligns with pre-existing attitudes, may prove more effective at influencing skeptics. In the meantime, social scientists will continue to investigate ways to combat the stubborn boomerang effect, even as the consequences of climate change intensify all around us.</p><img src="https://counter.theconversation.com/content/112650/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Ryan Weber does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Media reports are starting to directly connect climate change to its weather effects in local communities. But how you respond to those linkages depends on what you already think about climate change.Ryan Weber, Associate Professor of English, University of Alabama in HuntsvilleLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1060412018-11-19T11:36:51Z2018-11-19T11:36:51ZThe equivalence test: A new way for scientists to tackle so-called negative results<figure><img src="https://images.theconversation.com/files/245434/original/file-20181113-194497-6etpxk.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A new statistical test lets scientists figure out if two groups are similar to one another. </span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/brush-skeleton-dig-dinosaur-fossil-sedimentary-426848773?src=OQSTlxRXMh_sqKQNzj6Y4w-1-11">paleontologist natural/shutterstock.com</a></span></figcaption></figure><p>A paleontologist returns to her lab from a summer dig and sets up a study comparing tooth length in two dinosaur species. She and her team work meticulously to avoid biasing their results. They remain blind to the species while measuring, the sample sizes are large, and the data collection and the analysis are rigorous. </p>
<p>The scientist is surprised to find no significant difference in canine tooth length between the two species. She realizes that these unexpected results are important and sends a paper off to the appropriate journals. But journal after journal rejects the paper, since the results aren’t significantly different. Eventually, the scientist gives up, and the paper with its so-called negative results is placed in a drawer and buried under years of other work.</p>
<p>This scenario and many others like it have played out across all scientific disciplines, leading to what has been dubbed “<a href="https://undark.org/article/loss-of-confidence-project-replication-crisis/">the file drawer problem</a>.” Research journals and funding agencies are often biased toward research that shows “positive” or significantly different results. This unfortunate bias contributes to many other issues in the scientific process, such as <a href="http://dx.doi.org/10.1037/1089-2680.2.2.175">confirmation bias</a>, in which data are interpreted incorrectly to support a desired outcome. </p>
<h2>A new method: Equivalence</h2>
<p>Unfortunately, publication bias issues have been prevalent in science for a long time. Due to <a href="http://onlinestatbook.com/2/tests_of_means/difference_means.html">the structure of the scientific method</a>, scientists often focus only on differences between groups – like the dinosaur teeth from two different species, or a public health comparison of two different neighborhoods. This leaves studies that focus on similarities completely hidden. </p>
<p>However, <a href="https://doi.org/10.1016/0163-7258(94)90004-3">pharmaceutical trials</a> have found a solution for this problem. In these trials, researchers sometimes use a test known as TOST (two one-sided tests) to look for equivalence between treatments. </p>
<p>For example, say a company develops a generic drug that is cheaper to produce than the name-brand drug. Researchers need to demonstrate that the new drug functions in a statistically equivalent manner to the name brand before selling it on the market. That’s where equivalence testing comes in. If the test shows equivalence between the effects of the two drugs, then the FDA can approve the new drug’s release on the market. </p>
<p>While traditional equivalence testing is very helpful for preplanned and controlled pharmaceutical tests, it isn’t versatile enough for other types of studies. The original TOST cannot be used to test equivalence in experiments <a href="http://blog.minitab.com/blog/adventures-in-statistics-2/repeated-measures-designs-benefits-challenges-and-an-anova-example">where the same individuals are in multiple treatment groups</a>, nor does it work if the two test groups have different sample sizes. </p>
<p>Additionally, the TOST used in pharmaceutical testing does not typically address multiple variables simultaneously. For example, a traditional TOST would be able to analyze similarities in biodiversity at several river locations before and after a temperature change. However, our new TOST allows researchers to test for similarities in multiple variables – such as biodiversity, water pH, water depth and water clarity – at all of the river sites simultaneously.</p>
<p>The limitations of the traditional TOST and the pervasiveness of the “file drawer problem” led our team <a href="https://doi.org/10.1016/j.anbehav.2018.09.004">to develop a multivariate equivalence test</a>, capable of addressing similarities in systems with repeated measures and unequal sample sizes. </p>
<p>Our new equivalence test, <a href="https://www.sciencedaily.com/releases/2018/10/181016150725.htm">published in October</a>, flips the traditional null hypothesis framework on its head. Now, rather than assuming similarity, a researcher starts with the assumption that the two groups are different. The burden of proof now lies with evaluating the degree of similarity, rather than the degree of difference. </p>
<p>Our test also allows researchers to set their own acceptable margin for declaring similarity. For example, if margin were set to 0.02, then the results would tell you if the means of the two groups were similar within plus or minus 2 percent.</p>
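<p>The TOST logic described above can be sketched in a few lines of Python. This is a simplified illustration of the classic two-sample equivalence test, not the authors’ multivariate method; the function name, the use of an absolute (rather than percentage) margin, and the reliance on NumPy and SciPy are our own choices for the sketch.</p>

```python
import numpy as np
from scipy import stats

def tost(a, b, margin):
    """Two one-sided t-tests (TOST) for equivalence of two sample means.

    The null hypothesis is that the group means differ by at least
    `margin`; a small returned p-value supports equivalence within
    plus or minus `margin`.
    """
    diff = np.mean(a) - np.mean(b)
    se = np.sqrt(np.var(a, ddof=1) / len(a) + np.var(b, ddof=1) / len(b))
    df = len(a) + len(b) - 2
    # Test 1: reject "difference <= -margin" when t is large and positive
    p_lower = stats.t.sf((diff + margin) / se, df)
    # Test 2: reject "difference >= +margin" when t is large and negative
    p_upper = stats.t.cdf((diff - margin) / se, df)
    # Equivalence is declared only if BOTH one-sided tests reject,
    # so the overall p-value is the larger of the two
    return max(p_lower, p_upper)
```

<p>Note how the burden of proof is reversed: a small p-value here is evidence that the two groups are the <em>same</em> to within the chosen margin, rather than evidence that they differ.</p>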
<h2>A step in the right direction</h2>
<p>Our modification means that equivalence testing can now be applied across a wide range of disciplines. For example, we used this test to demonstrate equivalent acoustic structure in the songs of male and female eastern bluebirds. Equivalence testing has also already been used in some areas of <a href="https://doi.org/10.1007/s11219-013-9196-0">engineering</a> and <a href="https://doi.org/10.1177%2F1948550617697177">psychology</a>. </p>
<p>The method could be applied even more broadly. Imagine a group of researchers who want to examine two different teaching methods. In one classroom there is no technology, and in another all of the students’ assignments are done online. Equivalence testing might help a school district decide whether it should invest more in technology or if the two methods of teaching are equivalent. </p>
<p>The development of a broadly applicable equivalence test represents what we think will be a huge step forward in scientists’ long struggle to present real and unbiased results. This test provides another avenue for exploration and allows researchers to examine and publish the results from studies on similarities that have not been published or funded in the past. </p>
<p>The prevalence of publication bias, including the <a href="http://dx.doi.org/10.1037/0033-2909.86.3.638">file drawer problem</a>, confirmation bias and accidental <a href="https://projects.fivethirtyeight.com/p-hacking/">false positives</a>, is a major stumbling block for scientific progress. In some fields of research, up to half of results are missing from the published literature. </p>
<p>Equivalence testing provides another tool in the toolbox for scientists to present “positive” results. If the scientific community takes hold of this test and utilizes it to its full potential, we think it may help mitigate one of the major limitations in the way science is currently practiced. </p>
<p><em>This article has been updated to correct the margin of declaring similarity.</em></p><img src="https://counter.theconversation.com/content/106041/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Evangeline Shank receives funding from The Maryland Ornithological Society. </span></em></p><p class="fine-print"><em><span>Kevin Omland receives funding from the National Science Foundation and the American Bird Conservancy.</span></em></p><p class="fine-print"><em><span>Thomas Mathew receives funding from NIH (past funding). </span></em></p>A new statistical test lets researchers search for similarities between groups. Could this help keep new important findings out of the file drawer?Evangeline Rose, Ph.D Candidate in Biological Sciences, University of Maryland, Baltimore CountyKevin Omland, Professor of Biological Sciences, University of Maryland, Baltimore CountyThomas Mathew, Professor of Mathematics and Statistics, University of Maryland, Baltimore CountyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1015242018-09-24T06:25:04Z2018-09-24T06:25:04ZIt’s better light, not worse behaviour, that explains crimes on a full Moon<figure><img src="https://images.theconversation.com/files/237651/original/file-20180924-129844-hy0pa7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">When people know it's a full moon, they tend to use it to explain all sorts of human behaviour. </span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/x9TZjFdvr0Y">Todd Diemer/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>It’s a full Moon on September 25. </p>
<p>If past months have been anything to go by, this will be accompanied by a round of public chat about how this affects human behaviour – claims ranging from more hospital admissions and arrests to crazy antics in children. </p>
<p><div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;948657284024070144&quot;}"></div></p>
<p>Beliefs in the Moon’s behavioural effects are <a href="https://theconversation.com/mondays-medical-myth-hospitals-get-busier-on-full-moons-5383">not new</a> and date back to ancient times. But what evidence is there that the Moon has an impact on behaviour?</p>
<p>As a criminologist, I look at evidence related to arrests and behaviour linked with criminal activity. </p>
<p>The only explanation I can see that links criminology with Moon phases is just about the practicalities of being a criminal: when it’s a full Moon, there’s more light. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/five-reasons-india-china-and-other-nations-plan-to-travel-to-the-moon-87589">Five reasons India, China and other nations plan to travel to the Moon</a>
</strong>
</em>
</p>
<hr>
<p>While somewhat dated, one of the most significant studies looking at Moon phases and linking this with behaviour is a 1985 <a href="http://psycnet.apa.org/record/1985-19152-001">meta-analysis</a> – a study of the findings of 37 published and unpublished studies. The paper concludes it is not sound to infer that people behave any more – or less – strangely between Moon phases. The authors write:</p>
<blockquote>
<p>Alleged relations between phases of the moon and behavior can be traced to inappropriate analyses […] and a willingness to accept any departure from chance as evidence of a lunar effect. </p>
</blockquote>
<p>Two more recent studies have looked at links between criminal activity and phases of the Moon. </p>
<p>A <a href="https://www.sciencedirect.com/science/article/pii/S0010440X09000030">study published in 2009</a> looked at more than 23,000 cases of aggravated assaults that took place in Germany between 1999 and 2005. The authors found no correlation between battery and the various lunar phases. </p>
<p>A <a href="https://link.springer.com/article/10.1007/s12103-016-9351-9">study reported in 2016</a> was careful to make a distinction between indoor and outdoor crime committed in 13 US states and the District of Columbia in 2014.</p>
<p>The authors found no link between lunar phases and total crime or indoor crime. </p>
<p>But they did find the intensity of moonlight to have a substantive positive effect on outdoor criminal activity. As Moon illumination increased, they saw an escalation in criminal activity.</p>
<p>One explanation for this finding is what is referred to as the “illumination hypothesis” – suggesting that criminals like enough light to ply their trade, but not so much as to increase their chance of apprehension. </p>
<p>It may also be that there is greater movement of people during lighter nights, thus providing a bigger pool of victims. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/confirmation-bias-a-psychological-phenomenon-that-helps-explain-why-pundits-got-it-wrong-68781">Confirmation bias: A psychological phenomenon that helps explain why pundits got it wrong</a>
</strong>
</em>
</p>
<hr>
<p>Why do some people still cling to the belief that the Moon causes criminal or other antisocial behaviour? The answer most likely lies in human cognition and our tendency to focus on that which we expect or predict to be true.</p>
<p>During an expected lunar event – such as a full or super Moon – we expect that there will be a change in behaviour so we pay more attention when we see it. In the area of cognitive psychology this is known as <a href="https://theconversation.com/confirmation-bias-a-psychological-phenomenon-that-helps-explain-why-pundits-got-it-wrong-68781">confirmation bias</a>.</p>
<p>But other questions remain, including why any behavioural effects must be inherently negative. Even if there was a direct effect, explanations as to why acts of kindness and altruism do not increase or decrease during Moon phases are conspicuously absent. </p>
<p>It is likely that we just assume the folklore is true, and believe that we become the werewolf and not the sheep.</p><img src="https://counter.theconversation.com/content/101524/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Wayne Petherick does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>The ‘illumination hypothesis’ – suggests that criminals like enough light to ply their trade, but not so much as to increase their chance of apprehension.Wayne Petherick, Associate professor of criminology, Bond UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1033832018-09-23T09:03:44Z2018-09-23T09:03:44ZFourth industrial revolution: sorting out the real from the unreal<figure><img src="https://images.theconversation.com/files/237279/original/file-20180920-129859-boz2s8.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Our predictive skills are about as reliable as a crystal ball.</span> <span class="attribution"><span class="source">Andrey_Popov/Shutterstock</span></span></figcaption></figure><p>The phrase “fourth industrial revolution” has become ubiquitous. It’s meant to denote a huge shift in the socioeconomic fabric of society, driven by the availability of increasingly intelligent machines. These will be able to do things we can’t do as well as take care of things we can do. Jobs <a href="https://theconversation.com/many-south-african-jobs-could-soon-be-automated-and-the-country-isnt-prepared-99689">will be lost</a>. And new jobs will be created.</p>
<p>The fourth industrial revolution idea owes much of its credibility to <a href="https://www.penguinrandomhouse.com/books/551710/the-fourth-industrial-revolution-by-klaus-schwab/9781524758868">a book</a> by engineer, economist and World Economic Forum founder Klaus Schwab. He argues that an interconnected world, a cheapening of computer power and storage, developments in artificial intelligence, and advances in areas of biology will have revolutionary effects on our world. </p>
<p>He lays out a range of predictions, of greater or lesser confidence, about what these effects may be. And he argues compellingly that we need to apply ourselves to the human dimension of the revolution: to considering, and taking control of, the effects of it on social inequalities, poverty levels, political structures, labour, the way we assess productivity, and, deepest of all, what it really means to be human, given that so many formerly human tasks will be done by machines, some even via augmentation of human bodies.</p>
<p>It’s a good book, but has its weaknesses. It’s historically not very nuanced; it focuses on economics at the expense of politics. Most importantly, it appears to suffer from “confirmation bias” – the tendency to see any evidence as supporting your view, and to discount evidence that doesn’t.</p>
<p>These strengths and weaknesses reflect the strengths and weaknesses of the wider debate around the fourth industrial revolution. When the idea is used as a stimulus to reconsider what we are doing and think about the future, that’s great. When the narrative morphs into a series of predictions about life in two, 20 and 200 years, it’s easy to lose the plot.</p>
<p>Allocating resources and design strategies based on the predictive content of the fourth industrial revolution narrative would be dangerous given that even two decades ago it was impossible to predict the pace of technological development we’ve seen. </p>
<p>So caution is necessary. We can’t simply work out what is going to happen during the fourth industrial revolution, and place our bets. That’s because people’s predictive powers, never strong, become much worse when we are in the grip of a “big idea”. They become not merely bad, but <em>worse than random</em>.</p>
<h2>The tortoise and the hare</h2>
<p>Psychologist <a href="https://en.wikipedia.org/wiki/Philip_E._Tetlock">Philip Tetlock</a> has conducted large multi-decade studies of socio-political predictions since the 1980s. For example, he asked people to make predictions about the future of communism and capitalism. His results, presented in his book <em><a href="https://press.princeton.edu/titles/11152.html">Expert Political Judgment</a></em>, are striking. </p>
<p>It makes no difference whether you are intelligent, a subject expert, have access to classified information, have a PhD, are left or right wing – none of the traditional markers of expertise translate into improved prediction performance.</p>
<p>The only significant variation relates to cognitive traits that Tetlock characterises as “fox” and “hedgehog”. </p>
<p>A fox has many ideas. A hedgehog has one big idea. In the <a href="https://fablesofaesop.com/the-fox-and-the-hedgehog.html">original fable by Aesop</a>, from which Tetlock draws these creatures, the point is that this one big idea (rolling up into a ball and sticking your spikes out) is enough to defeat the quick-witted fox. But Tetlock draws the opposite moral for prediction. Having one big idea to which you are fundamentally committed makes you far less likely to be a good predictor.</p>
<p>This result has important consequences. It explains why pundits are so often wrong, missing the huge events of recent times and miscalling others. Pundits succeed because they exude confidence, which is characteristic of the hedgehog, who sees the world in clear and simple terms, and is usually absent from the fox, whose world is complex and uncertain.</p>
<p>Fox-thinkers aren’t exactly <em>great</em> as predictors. But they are better than random, and certainly better than hedgehogs. Their scepticism, uncertainty and humility mean they will change their minds when new data come in. This is obviously rational, and the data show that looking for opportunities to change your mind – asking <a href="https://doi.org/10.1016/j.ypmed.2011.09.009">what could possibly go wrong</a> – makes for a far better prediction strategy than hedgehog-like adherence to a single idea.</p>
<h2>Beware of hedgehog thinking</h2>
<p>There’s a great deal to applaud in efforts like Schwab’s to consciously review contemporary circumstances. But we need to be careful of the temptation to adopt a single lens, whether rose-tinted or grimy, for understanding a complex world. </p>
<p>A critical stance is essential if the fourth industrial revolution is to be a stimulus for debate rather than a dogma.</p>
<p>So, if you see the fourth industrial revolution everywhere, beware: you may be in the grip of hedgehog thinking – just as you are if you reject the entire notion. </p>
<p>As Tetlock’s work shows, if you see certain future events as inevitable, and wonder how others can’t see that too, then you’re probably wrong. It’s better to remain inquisitive, uncertain, critical, and <a href="https://www.goodreads.com/quotes/617523-a-wise-man-apportions-his-beliefs-to-the-evidence">apportion your belief to the evidence</a>. This is how humans will benefit from the fourth industrial revolution, and how we will take control of it.</p><img src="https://counter.theconversation.com/content/103383/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Alex Broadbent works for the University of Johannesburg.</span></em></p>We can’t simply try to work out what’s going to happen during the fourth industrial revolution.Alex Broadbent, Executive Dean, Faculty of Humanities and Director, African Centre for Epistemology and Philosophy of Science, University of JohannesburgLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/971482018-06-20T10:28:04Z2018-06-20T10:28:04ZMisinformation and biases infect social media, both intentionally and accidentally<figure><img src="https://images.theconversation.com/files/223361/original/file-20180615-85822-5fqwo4.png?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">People who share potential misinformation on Twitter (in purple) rarely get to see corrections or fact-checking (in orange).</span> <span class="attribution"><a class="source" href="https://arxiv.org/abs/1801.06122">Shao et al.</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure><p>Social media are among the <a href="http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/">primary sources of news in the U.S.</a> and across the world. Yet users are exposed to content of questionable accuracy, including <a href="https://conspiracypsychology.com/2018/02/22/every-mass-shooting-produces-the-same-conspiracy-theories-more-or-less/">conspiracy theories</a>, <a href="https://www.polygon.com/2018/4/13/17231470/fortnite-strip-clickbait-touchdalight-ricegum-youtube">clickbait</a>, <a href="https://www.pbs.org/newshour/show/online-anger-is-gold-to-this-junk-news-pioneer">hyperpartisan content</a>, <a href="https://www.newyorker.com/science/elements/looking-for-life-on-a-flat-earth">pseudoscience</a> and even <a href="https://www.smithsonianmag.com/history/age-old-problem-fake-news-180968945/">fabricated “fake news” reports</a>.</p>
<p>It’s not surprising that there’s so much disinformation published: Spam and online fraud <a href="https://www.symantec.com/connect/blogs/dridex-financial-trojan-aggressively-spread-millions-spam-emails-each-day">are lucrative for criminals</a>, and government and political propaganda yield <a href="https://www.ned.org/issue-brief-distinguishing-disinformation-from-propaganda-misinformation-and-fake-news/">both partisan and financial benefits</a>. But the fact that <a href="http://doi.org/10.1126/science.aap9559">low-credibility content spreads so quickly and easily</a> suggests that people and the algorithms behind social media platforms are vulnerable to manipulation.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/BIv9054dBBI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Explaining the tools developed at the Observatory on Social Media.</span></figcaption>
</figure>
<p>Our research has identified three types of bias that make the social media ecosystem vulnerable to both intentional and accidental misinformation. That is why our <a href="http://osome.iuni.iu.edu/">Observatory on Social Media</a> at Indiana University is building <a href="http://osome.iuni.iu.edu/tools/">tools</a> to help people become aware of these biases and protect themselves from outside influences designed to exploit them. </p>
<h2>Bias in the brain</h2>
<p>Cognitive biases originate in the way the brain processes the information that every person encounters every day. The brain can deal with only a finite amount of information, and too many incoming stimuli can cause <a href="https://hbr.org/2009/09/death-by-information-overload">information overload</a>. That in itself has serious implications for the quality of information on social media. We have found that steep competition for users’ limited attention means that <a href="https://doi.org/10.1038/srep00335">some ideas go viral despite their low quality</a> – <a href="https://arxiv.org/abs/1701.02694v4">even when people prefer to share high-quality content</a>.</p>
<p>To avoid getting overwhelmed, the brain uses a <a href="https://global.oup.com/academic/product/simple-heuristics-that-make-us-smart-9780195143812">number of tricks</a>. These methods are usually effective, but may also <a href="https://www.psychologytoday.com/us/blog/fulfillment-any-age/201210/avoiding-emotional-traps-is-easier-you-think">become biases</a> when applied in the wrong contexts. </p>
<p>One cognitive shortcut happens when a person is deciding whether to share a story that appears on their social media feed. People are <a href="https://doi.org/10.1007/978-3-642-22309-9_5">strongly swayed by the emotional connotations of a headline</a>, even though that’s not a good indicator of an article’s accuracy. Much more important is <a href="https://digitalliteracy.cornell.edu/tutorial/dpl3221.html">who wrote the piece</a>.</p>
<p>To counter this bias, and help people pay more attention to the source of a claim before sharing it, we developed <a href="http://fakey.iuni.iu.edu">Fakey</a>, a mobile news literacy game (free on <a href="https://play.google.com/store/apps/details?id=com.cnets.fakey">Android</a> and <a href="https://itunes.apple.com/us/app/id1386410642?mt=8">iOS</a>) simulating a typical social media news feed, with a mix of news articles from mainstream and low-credibility sources. Players get more points for sharing news from reliable sources and flagging suspicious content for fact-checking. In the process, they learn to recognize signals of source credibility, such as hyperpartisan claims and emotionally charged headlines. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=429&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=429&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=429&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=540&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=540&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=540&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Screenshots of the Fakey game.</span>
<span class="attribution"><span class="source">Mihai Avram and Filippo Menczer</span></span>
</figcaption>
</figure>
<h2>Bias in society</h2>
<p>Another source of bias comes from society. When people connect directly with their peers, the social biases that guide their selection of friends come to influence the information they see.</p>
<p>In fact, in our research we have found that it is possible to <a href="http://doi.org/10.1109/PASSAT/SocialCom.2011.34">determine the political leanings of a Twitter user</a> by simply looking at the partisan preferences of their friends. Our analysis of the structure of these <a href="http://www.aaai.org/ocs/index.php/ICWSM/ICWSM11/paper/view/2847">partisan communication networks</a> found social networks are particularly efficient at disseminating information – accurate or not – when <a href="http://doi.org/10.1140/epjds6">they are closely tied together and disconnected from other parts of society</a>.</p>
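The intuition behind inferring a user's leaning from their network can be sketched in a few lines of Python: guess an unlabeled account's label from the majority label among its friends. The users and labels below are invented, and this toy majority-vote scheme only illustrates the idea, not the method actually used in the study.

```python
# Toy illustration: predict a user's political leaning as the majority
# label among their friends. Users and labels here are hypothetical.

from collections import Counter

def predict_leaning(user, friends, known_labels):
    """Guess a user's label by majority vote over their labeled friends."""
    votes = Counter(
        known_labels[f] for f in friends.get(user, ()) if f in known_labels
    )
    if not votes:
        return None  # no labeled friends to learn from
    return votes.most_common(1)[0][0]

friends = {
    "alice": {"bob", "carol", "dan"},
    "eve": {"frank", "carol"},
}
known_labels = {"bob": "left", "carol": "left", "dan": "right", "frank": "right"}

print(predict_leaning("alice", friends, known_labels))  # majority of alice's friends: left
```

The more tightly clustered and politically homogeneous a user's neighborhood is, the more confident such a guess becomes, which is exactly what makes dense partisan communities so predictable.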
<p>People’s tendency to evaluate information more favorably when it comes from within their own social circles creates “<a href="https://arstechnica.com/science/2017/03/the-social-media-echo-chamber-is-real/">echo chambers</a>” that are ripe for manipulation, either conscious or unintentional. This helps explain why so many online conversations devolve into <a href="http://www.pewinternet.org/2016/10/25/the-tone-of-social-media-discussions-around-politics/">“us versus them” confrontations</a>. </p>
<p>To study how the structure of online social networks makes users vulnerable to disinformation, we built <a href="http://hoaxy.iuni.iu.edu">Hoaxy</a>, a system that tracks and visualizes the spread of content from low-credibility sources, and how it competes with fact-checking content. Our analysis of the data collected by Hoaxy during the 2016 U.S. presidential elections shows that Twitter accounts that shared misinformation were <a href="http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0196087">almost completely cut off</a> from the corrections made by the fact-checkers.</p>
<p>When we drilled down on the misinformation-spreading accounts, we found a very dense core group of accounts retweeting each other almost exclusively – including several bots. The only times users in the misinformed group quoted or mentioned fact-checking organizations were to question their legitimacy or to claim the opposite of what the fact-checkers wrote.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=748&fit=crop&dpr=1 600w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=748&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=748&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=940&fit=crop&dpr=1 754w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=940&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=940&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A screenshot of a Hoaxy search shows how common bots – in red and dark pink – are spreading a false story on Twitter.</span>
<span class="attribution"><span class="source">Hoaxy</span></span>
</figcaption>
</figure>
<h2>Bias in the machine</h2>
<p>The third group of biases arises directly from the algorithms used to determine what people see online. Both social media platforms and search engines employ them. These personalization technologies are designed to select only the most engaging and relevant content for each individual user. But in doing so, they may end up reinforcing users’ cognitive and social biases, making them even more vulnerable to manipulation.</p>
<p>For instance, the detailed <a href="https://theconversation.com/solving-the-political-ad-problem-with-transparency-85366">advertising tools built into many social media platforms</a> let disinformation campaigners exploit <a href="https://www.psychologytoday.com/us/blog/science-choice/201504/what-is-confirmation-bias">confirmation bias</a> by <a href="https://www.washingtonpost.com/news/powerpost/paloma/the-cybersecurity-202/2018/05/11/the-cybersecurity-202-the-facebook-ad-dump-shows-the-true-sophistication-of-russia-s-influence-operation/5af4733a30fb04258879944e/">tailoring messages</a> to people who are already inclined to believe them. </p>
<p>Also, if a user often clicks on Facebook links from a particular news source, Facebook will <a href="https://www.wired.com/story/take-back-your-facebook-news-feed/">tend to show that person more of that site’s content</a>. This so-called “<a href="https://www.brainpickings.org/2011/05/12/the-filter-bubble/">filter bubble</a>” effect may isolate people from diverse perspectives, strengthening confirmation bias.</p>
<p>Our own research shows that social media platforms expose users to a less diverse set of sources than do non-social media sites like Wikipedia. Because this is at the level of a whole platform, not of a single user, we call this the <a href="https://doi.org/10.7717/peerj-cs.38">homogeneity bias</a>.</p>
<p>Another important ingredient of social media is information that is trending on the platform, according to what is getting the most clicks. We call this <a href="https://arxiv.org/abs/1707.00574">popularity bias</a>, because we have found that an algorithm designed to promote popular content may negatively affect the overall quality of information on the platform. This also feeds into existing cognitive bias, reinforcing what appears to be popular irrespective of its quality.</p>
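A toy simulation makes this feedback loop concrete: if a feed ranks items purely by accumulated clicks, an early popularity advantage compounds while quality never enters the ranking at all. The items, click probability and round count below are invented for illustration and are not any real platform's algorithm.

```python
# Toy illustration of popularity bias: ranking purely by engagement
# lets an early click advantage compound, regardless of quality.
# All items and parameters here are hypothetical.

import random

def run_feed(items, rounds=1000, seed=42):
    """Each round, a simulated user clicks the top-ranked item with
    probability 0.7, otherwise a random item; the ranking considers
    click counts alone."""
    rng = random.Random(seed)
    clicks = {name: 0 for name in items}
    for _ in range(rounds):
        ranked = sorted(items, key=lambda n: clicks[n], reverse=True)
        choice = ranked[0] if rng.random() < 0.7 else rng.choice(items)
        clicks[choice] += 1
    return clicks

# Note that nothing in run_feed() measures quality: whichever item is
# clicked first tends to stay on top for the rest of the simulation.
print(run_feed(["high_quality", "low_quality"]))
```

Running the sketch shows one item absorbing the large majority of clicks purely through the rich-get-richer dynamic, which is the sense in which popularity-driven ranking can degrade average information quality.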
<p>All these algorithmic biases can be manipulated by <a href="https://cacm.acm.org/magazines/2016/7/204021-the-rise-of-social-bots/fulltext">social bots</a>, computer programs that interact with humans through social media accounts. Most social bots, like Twitter’s <a href="https://twitter.com/big_ben_clock">Big Ben</a>, are harmless. However, some conceal their real nature and are used for malicious purposes, such as <a href="https://newsroom.fb.com/InfoOps">boosting disinformation</a> or falsely <a href="http://www.businessinsider.com/astroturfing-grassroots-movements-2011-9">creating the appearance of a grassroots movement</a>, also called “astroturfing.” We found <a href="http://www.aaai.org/ocs/index.php/ICWSM/ICWSM11/paper/view/2850">evidence of this type of manipulation</a> in the run-up to the 2010 U.S. midterm election.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=432&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=432&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=432&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=542&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=542&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=542&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A screenshot of the Botometer website, showing one human and one bot account.</span>
<span class="attribution"><span class="source">Botometer</span></span>
</figcaption>
</figure>
<p>To study these manipulation strategies, we developed a tool to detect social bots called <a href="http://botometer.org">Botometer</a>. Botometer uses machine learning to detect bot accounts, by inspecting thousands of different features of a Twitter account, such as when it posts, how often it tweets, and the accounts it follows and retweets. It is not perfect, but it has revealed that as many as <a href="https://aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15587">15 percent of Twitter accounts show signs of being bots</a>.</p>
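The general idea of feature-based bot scoring can be sketched as a logistic score over a handful of account features. The features, weights and thresholds below are invented for illustration only; Botometer's actual model is a supervised machine-learning classifier trained on thousands of features.

```python
# Hand-rolled sketch of the *idea* behind feature-based bot scoring:
# combine account features into a probability-like score in (0, 1).
# Every feature and weight here is invented for illustration.

import math

def bot_score(account):
    """Map hypothetical account features to a score between 0 and 1."""
    # Invented weights: a high tweet rate, round-the-clock activity and
    # a skewed following-to-followers ratio all push the score up.
    z = (
        0.8 * account["tweets_per_day"] / 100
        + 1.5 * account["active_hours_per_day"] / 24
        + 1.2 * account["following_to_followers"]
        - 2.0
    )
    return 1 / (1 + math.exp(-z))  # logistic squashing into (0, 1)

human = {"tweets_per_day": 5, "active_hours_per_day": 8,
         "following_to_followers": 0.9}
bot = {"tweets_per_day": 400, "active_hours_per_day": 24,
       "following_to_followers": 5.0}

print(f"human-like: {bot_score(human):.2f}, bot-like: {bot_score(bot):.2f}")
```

A real classifier learns such weights from labeled examples of known bots and humans rather than having them set by hand, but the output is the same kind of graded score, which is why bot detection reports likelihoods rather than certainties.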
<p>Using Botometer in conjunction with Hoaxy, we analyzed the core of the misinformation network during the 2016 U.S. presidential campaign. We found many bots exploiting both the cognitive, confirmation and popularity biases of their victims and Twitter’s algorithmic biases.</p>
<p>These bots are able to construct filter bubbles around vulnerable users, feeding them false claims and misinformation. First, they can attract the attention of human users who support a particular candidate by tweeting that candidate’s hashtags or by mentioning and retweeting the person. Then the bots can amplify false claims smearing opponents by retweeting articles from low-credibility sources that match certain keywords. This activity also makes the algorithm highlight for other users false stories that are being shared widely.</p>
<h2>Understanding complex vulnerabilities</h2>
<p>Even as our research, and others’, shows how individuals, institutions and even entire societies can be manipulated on social media, there are <a href="http://doi.org/10.1126/science.aao2998">many questions</a> left to answer. It’s especially important to discover how these different biases interact with each other, potentially creating more complex vulnerabilities.</p>
<p>Tools like ours offer internet users more information about disinformation, and therefore some degree of protection from its harms. The solutions will <a href="https://www.hewlett.org/newsroom/hewlett-knight-koch-foundations-with-other-funders-will-support-independent-research-on-facebooks-role-in-elections-and-democracy/">likely not be purely technological</a>, though they will probably have technical aspects. They must also take into account <a href="https://doi.org/10.1016/j.jarmac.2017.07.008">the cognitive and social aspects</a> of the problem.</p>
<p><em>Editor’s note: This article was updated on Jan. 10, 2019, to replace a link to a study that had been retracted. The text of the article is still accurate, and remains unchanged.</em></p><img src="https://counter.theconversation.com/content/97148/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Giovanni Luca Ciampaglia has received funding from the Office of the Vice Provost for Research at Indiana University, the Democracy Fund, and the Swiss National Science Foundation. Currently, he is supported by the Indiana University Network Science Institute.</span></em></p><p class="fine-print"><em><span>Filippo Menczer has received funding from the National Science Foundation, DARPA, US Navy, Yahoo Research, the J.S. McDonnell Foundation, and Democracy Fund. </span></em></p>Information on social media can be misleading because of biases in three places – the brain, society and algorithms. Scholars are developing ways to identify and display the effects of these biases.Giovanni Luca Ciampaglia, Assistant professor, department of Computer Science and Engineering, University of South FloridaFilippo Menczer, Professor of Computer Science and Informatics; Director of the Center for Complex Networks and Systems Research, Indiana UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/862852017-11-06T01:23:47Z2017-11-06T01:23:47ZWhy social media may not be so good for democracy<figure><img src="https://images.theconversation.com/files/193234/original/file-20171103-1008-1kvuik5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Some of the Facebook and Instagram ads used in 2016 election released by members of the U.S. House Intelligence committee. </span> <span class="attribution"><span class="source">AP Photo/Jon Elswick</span></span></figcaption></figure><p>Recent revelations about how Russian agents <a href="https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-election.html">inserted ads on Facebook</a>, in an attempt to influence the 2016 election, present a troubling question: Is Facebook bad for democracy? </p>
<p>As a scholar of the social and political implications of technology, I believe that the problem is not about Facebook alone, but much larger: Social media is actively undermining some of the social conditions that have historically made democratic nation states possible. </p>
<p>I understand that’s a huge claim, and I don’t expect anyone to believe it right away. But, considering that <a href="https://www.nbcnews.com/news/us-news/russian-backed-election-content-reached-126-million-americans-facebook-says-n815791">nearly half</a> of all eligible voters received Russian-sponsored fake news on Facebook, it’s an argument that needs to be on the table. </p>
<h2>How we create a shared reality</h2>
<p>Let’s start with two concepts: an “imagined community” and a “filter bubble.”</p>
<p>The late political scientist Benedict Anderson famously argued that the modern nation-state is best understood as an “<a href="https://www.versobooks.com/books/2259-imagined-communities">imagined community</a>” partly enabled by the rise of mass media such as newspapers. What Anderson meant is that the sense of cohesion that citizens of modern nations felt with one another – the degree to which they could be considered part of a national community – was one that was both artificial and facilitated by mass media. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Mass media is one way to create a shared community.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/wikidave/4078540081/in/photolist-7dpzSz-RngzfQ-GmP7b-mdxj3-5U5dgp-j2gfsH-chjenQ-4sfYJw-Vzp9b2-qSZUs-kKuJFD-qSZJk-WxNtJ2-UAK8QD-X7EGHs-qVKLJW-qQmUVE-4kFjVm-5U3TPj-5Pfj8r-7S5oRH-eMg9KX-5mPnbW-8kXFD-96dsKy-8kXL7-6XDmDy-WuUsFL-ahCjFS-8kXGA-aakTBT-8kXEm-8kXJp-XYgTmJ-dKFgyg-b2ySR-MSmNxc-rjbN8d-7JtacE-de9qGq-oPoaFy-aDzm92-34atc1-kARY93-3R6LB-Vzp9Dr-VzpaNv-i9Kyz4-vZ7vv-8kXMX">Dave Crosby</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Of course there are many things that enable nation-states like the U.S. to hold together. We all learn (more or less) the same national history in school, for example. Still, the average lobster fisherman in Maine doesn’t actually have that much in common with the average schoolteacher in South Dakota. But, the <a href="https://www.versobooks.com/books/2259-imagined-communities">mass media contribute</a> toward helping them view themselves as part of something larger: that is, the “nation.” </p>
<p>Democratic polities depend on this shared sense of commonality. It enables what we call “national” policies – an idea that citizens see their interests aligned on some issues. Legal scholar <a href="http://hls.harvard.edu/faculty/directory/10871/Sunstein">Cass Sunstein</a> <a href="https://books.google.com/books/about/Republic_com.html?id=O7AG9TxDJdgC">explains this idea</a> by taking us back to the time when there were only three broadcast news outlets and they all said more or less the same thing. As Sunstein says, we have historically depended on these “general interest intermediaries” to frame and articulate our sense of shared reality. </p>
<h2>Filter bubbles</h2>
<p>The term <a href="https://www.penguinrandomhouse.com/books/309214/the-filter-bubble-by-eli-pariser/9780143121237/">“filter bubble”</a> emerged in a 2011 book by activist <a href="https://www.opensocietyfoundations.org/people/eli-pariser">Eli Pariser</a> to characterize an internet phenomenon.</p>
<p>Legal scholar <a href="http://hls.harvard.edu/faculty/directory/10519/Lessig">Lawrence Lessig</a> and Sunstein too had <a href="http://codev2.cc/download+remix/Lessig-Codev2.pdf">identified</a> this phenomenon of group isolation on the internet in the late 1990s. Inside a filter bubble, individuals basically receive only the kinds of information that they have either preselected, or, more ominously, that third parties have decided they want to hear. </p>
<p>The targeted advertising behind Facebook’s newsfeed helps to create such filter bubbles. Advertising on Facebook works by determining its user’s interests, based on data it collects from their browsing, likes and so on. This is a very sophisticated operation. </p>
<p>Facebook does not disclose its own algorithms. However, research led by psychologist and data scientist at Stanford University <a href="http://www.michalkosinski.com/">Michael Kosinski</a> <a href="http://www.pnas.org/content/110/15/5802.full">demonstrated</a> that automated analysis of people’s Facebook likes was able to identify their demographic information and basic political beliefs. Such targeting can also apparently be extremely precise. There is <a href="http://www.cnn.com/2017/10/03/politics/russian-facebook-ads-michigan-wisconsin/index.html">evidence</a>, for example, that anti-Clinton ads from Russia were able to micro-target specific voters in Michigan. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Is Facebook creating filter bubbles?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/download/confirm/569244514?src=K4gWpNnYIIplqxveBwOlHQ-1-32&size=small_jpg">sitthiphong/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>The problem is that inside a filter bubble, you never receive any news that you do not agree with. This poses two problems: First, there is never any independent verification of that news. Individuals who want independent confirmation will have to actively seek it out. </p>
<p>Second, psychologists have known for a long time about “<a href="http://psy2.ucsd.edu/%7Emckenzie/nickersonConfirmationBias.pdf">confirmation bias</a>,” the tendency of people to seek out only information they agree with. Confirmation bias also limits people’s ability to question information that confirms or upholds their beliefs.</p>
<p>Not only that, research at Yale University’s <a href="http://www.culturalcognition.net/">Cultural Cognition Project</a> strongly suggests that people <a href="http://www.culturalcognition.net/blog/2012/11/15/is-cultural-cognition-the-same-thing-as-or-even-a-form-of-co.html">are inclined</a> to interpret new evidence in light of beliefs associated with their social groups. This can <a href="http://www.culturalcognition.net/blog/2015/6/12/politically-motivated-reasoning-paradigm-pmrp-what-it-is-how.html">tend to polarize</a> those groups.</p>
<p>All of this means that if you are inclined to dislike President Donald Trump, any negative information on him is likely to further strengthen that belief. Conversely, you are likely to discredit or ignore pro-Trump information.</p>
<p>It is this pair of features of filter bubbles – preselection and confirmation bias – that fake news exploits with precision.</p>
<h2>Creating polarized groups?</h2>
<p>These features are also hardwired into the business model of social media like Facebook, which is predicated precisely on the idea that one can create a group of “friends” with whom one shares information. This group is largely insular, separated from other groups. </p>
<p>The software very <a href="http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html">carefully curates</a> the transfer of information across these social networks and tries very hard to be the primary portal through which its users – about <a href="http://money.cnn.com/2017/02/01/technology/facebook-earnings/index.html">2 billion</a> of them – access the internet.</p>
<p>Facebook depends on advertising for its revenue, and that advertising can be readily exploited: A recent <a href="https://www.propublica.org/article/facebook-enabled-advertisers-to-reach-jew-haters">ProPublica investigation</a> shows how easy it was to target Facebook ads to “Jew Haters.” More generally, the site also wants to keep users online, and it <a href="http://www.pnas.org/content/111/24/8788.full">knows</a> that it is able to manipulate the emotions of its users – who are happiest when they see things they agree with. </p>
<figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Is social media creating more polarization?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/download/confirm/566694994?src=4VpU8SnctQx7i6ZTJKQ7AA-1-0&size=small_jpg">Chinnapong/ Shutterstock.com</a></span>
</figcaption>
</figure>
<p>As the Washington Post <a href="https://www.washingtonpost.com/news/the-switch/wp/2017/11/01/how-russian-trolls-got-into-your-facebook-feed/?utm_term=.aa0c53a633a1">documents</a>, it is precisely these features that were exploited by Russian ads. As a writer at Wired <a href="https://www.wired.com/2016/11/filter-bubble-destroying-democracy/">observed</a> in an ominously prescient commentary immediately after the election, he never saw a pro-Trump post that had been shared over 1.5 million times – and neither did any of his liberal friends. They saw only liberal-leaning news on their social media feeds.</p>
<p>In this environment, a recent Pew Research Center survey should not come as a surprise. The survey <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">shows</a> that the American electorate is both deeply divided on partisan grounds, even on fundamental political issues, and is becoming more so. </p>
<p>All of this combines to mean that the world of social media tends to create small, deeply polarized groups of individuals who will tend to believe everything they hear, no matter how divorced from reality. The filter bubble sets us up to be vulnerable to polarizing fake news and to become more insular. </p>
<h2>The end of the imagined community?</h2>
<p>At this point, two-thirds of Americans get <a href="http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/">at least some of their news</a> from social media outlets. This means that two-thirds of Americans get at least some of their news from highly curated and personalized black-box algorithms. </p>
<p>Facebook remains, by a significant margin, the <a href="https://www.salon.com/2017/10/05/atone-hed-better-facebook-is-still-the-biggest-source-of-right-wing-fake-news/">most prevalent</a> source of fake news. Not unlike forced, false <a href="http://press.uchicago.edu/ucp/books/book/chicago/D/bo3628714.html">confessions of witchcraft</a> in the Middle Ages, these stories get repeated often enough that they could appear legitimate. </p>
<p>What we are witnessing, in other words, is the potential collapse of a significant part of the imagined community that is the American polity. Although the U.S. is also divided demographically and there are sharp demographic differences between regions within the country, <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">partisan differences are dwarfing other divisions</a>
in society.</p>
<p>This is a recent trend: In the mid-1990s, partisan divisions were <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">similar in size to demographic divisions</a>. For example, then and now, women and men would be about the same modest distance apart on political questions, such as whether government should do more to help the poor. In the 1990s, this was also true for Democrats and Republicans. In other words, partisan divisions were no better than demographic factors at predicting people’s political views. Today, if you want to know someone’s political views, <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">you would first want to find out</a> their partisan affiliation. </p>
<h2>The reality of social media</h2>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/jasonahowie/8583949219/in/photolist-e5wZ3t-VMLTZk-8TxrKw-8h6sWa-6sydbF-8tftq9-5XNfPs-bns6av-c9FLYw-6AX2Qo-fgx2YY-eMnSfC-7TGMs7-9robx3-deoZtb-8ddFEZ-8adSFu-8onC9R-bns6J4-7arjcX-7rbSNc-dWUWcb-fiFCow-bns6D8-6ZckCr-98AHcS-gaAzZ6-nZUnJL-qi7hrH-7R7gx2-dXX1bU-9arafo-dPQbk9-7Kh7bs-nX21xv-8adShf-dAgDr2-8emLep-6sARCS-8emR1B-9um8Qd-4se7dy-8emLB2-dhZgBu-73qnar-5n4D99-81gMx7-6ZhLm1-4KvCf1-8emRJF">Jason Howie</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>To be sure, it would be overly simplistic to lay all of this at the feet of social media. Certainly the structure of the American political system, which tends to polarize the political parties in primary elections, plays a major role. And it is true that plenty of us also still get news from other sources, outside of our Facebook filter bubbles. </p>
<p>But, I would argue that Facebook and social media offer an additional layer: Not only do they tend to create filter bubbles on their own, they offer a rich environment for those who want to increase polarization to do so. </p>
<p>Communities share and create social realities. In its current role, social media risks abetting a social reality where differing groups could disagree not only about what to do, but about what reality is.</p>
<p class="fine-print"><em><span>Gordon Hull does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Gordon Hull, Associate Professor of Philosophy, Director of Center for Professional and Applied Ethics, University of North Carolina – Charlotte. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<h1>Why people believe in conspiracy theories – and how to change their minds</h1>
<figure><img src="https://images.theconversation.com/files/182329/original/file-20170816-17651-y40efh.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Oh please. There's no wind on the moon.</span> <span class="attribution"><span class="source">wikipedia</span></span></figcaption></figure>
<p>I’m sitting on a train when a group of football fans streams on. Fresh from the game – their team has clearly won – they occupy the empty seats around me. One picks up a discarded newspaper and chuckles derisively as she reads about the latest “alternative facts” peddled by Donald Trump.</p>
<p>The others soon chip in with their thoughts on the US president’s fondness for conspiracy theories. The chatter quickly turns to other conspiracies and I enjoy eavesdropping while the group brutally mock flat Earthers, <a href="http://knowyourmeme.com/memes/chemtrail-conspiracy-theories">chemtrails memes</a> and <a href="https://drjengunter.wordpress.com/2017/05/22/dear-gwyneth-paltrow-were-not-fcking-with-you/">Gwyneth Paltrow’s latest idea</a>. </p>
<p>Then there’s a lull in the conversation, and someone takes it as an opportunity to pipe in with: “That stuff might be nonsense, but don’t try and tell me you can trust everything the mainstream feeds us! Take the moon landings, they were obviously faked and not even very well. I read this blog the other day that pointed out there aren’t even stars in any of the pictures!”</p>
<p>To my amazement the group joins in with other “evidence” supporting the moon landing hoax: inconsistent shadows in photographs, a fluttering flag when there’s no atmosphere on the moon, how Neil Armstrong was filmed walking on to the surface when no-one was there to hold the camera.</p>
<p>A minute ago they seemed like rational people capable of assessing evidence and coming to a logical conclusion. But now things are taking a turn down crackpot alley. So I take a deep breath and decide to chip in.</p>
<p>“Actually all that can be explained quite easily … ”</p>
<p>They turn to me aghast that a stranger would dare to butt into their conversation. I continue undeterred, hitting them with a barrage of facts and rational explanations.</p>
<p>“The flag didn’t flutter in the wind, it just moved as Buzz Aldrin planted it! Photos were taken during lunar daytime – and obviously you can’t see the stars during the day. The weird shadows are because of the very wide-angle lenses they used, which distort the photos. And nobody took the footage of Neil descending the ladder. There was a camera mounted on the outside of the lunar module which filmed him making his giant leap. If that isn’t enough then the final clinching proof comes from the <a href="https://www.nasa.gov/mission_pages/LRO/news/apollo-sites.html">Lunar Reconnaissance Orbiter</a>’s photos of the landing sites where you can clearly see the tracks that the astronauts made as they wandered around the surface.”</p>
<p>“Nailed it!” I think to myself.</p>
<p>But it appears my listeners are far from convinced. They turn on me, producing more and more ridiculous claims. Stanley Kubrick filmed the lot, key personnel have died in mysterious ways, and so on …</p>
<p>The train pulls up in a station, it isn’t my stop but I take the opportunity to make an exit anyway. As I sheepishly mind the gap I wonder why my facts failed so badly to change their minds. </p>
<p>The simple answer is that facts and rational arguments really aren’t very good at altering people’s beliefs. That’s because our rational brains are fitted with not-so-rational evolutionary hard wiring. One of the reasons why conspiracy theories spring up with such regularity is our desire to impose structure on the world and our incredible ability to recognise patterns. Indeed, a recent study showed a correlation between an individual’s need for structure and <a href="https://link.springer.com/article/10.1007/s11002-014-9332-z">tendency to believe in a conspiracy theory</a>. </p>
<p>Take this sequence for example: </p>
<p>0 0 1 1 0 0 1 0 0 1 0 0 1 1 </p>
<p>Can you see a pattern? Quite possibly – and you aren’t alone. A quick <a href="https://twitter.com/Mark_Lorch/status/897510134661992452">Twitter poll</a> (replicating <a href="https://pdfs.semanticscholar.org/f472/0326b81d5528c0458510cd87ea7b57418c54.pdf">a much more rigorous</a> study) suggested that 56% of people agree with you – even though the sequence was generated by me flipping a coin. </p>
<p>It seems our need for structure and our pattern recognition skill can be rather overactive, causing a tendency to spot patterns – like constellations, <a href="https://s-media-cache-ak0.pinimg.com/736x/71/f4/f6/71f4f6b065a5c868183c521fc9e03c84.jpg">clouds that look like dogs</a> and vaccines causing autism – where in fact there are none. </p>
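<p>Our readiness to read structure into noise is easy to demonstrate in code. The short Python sketch below is my own illustration, not part of the studies cited above: it counts how often a purely random 14-flip sequence, like the one printed earlier, contains a streak of three or more identical results – exactly the kind of run people point to as a “pattern”.</p>

```python
import random

def longest_run(flips):
    """Length of the longest streak of identical outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(42)
trials = 10_000
# Count how many random 14-flip sequences contain a streak of 3 or more.
with_streak = sum(
    longest_run([random.randint(0, 1) for _ in range(14)]) >= 3
    for _ in range(trials)
)
print(f"{100 * with_streak / trials:.0f}% of random sequences contain a run of 3+")
```

<p>The overwhelming majority of random 14-flip sequences contain such a streak (the exact proportion is 1 − 1220/16384 ≈ 93%), so “spotting a pattern” in a sequence this short tells you almost nothing about whether it is random.</p>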
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/182276/original/file-20170816-32624-fudmuu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/182276/original/file-20170816-32624-fudmuu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=406&fit=crop&dpr=1 600w, https://images.theconversation.com/files/182276/original/file-20170816-32624-fudmuu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=406&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/182276/original/file-20170816-32624-fudmuu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=406&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/182276/original/file-20170816-32624-fudmuu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=510&fit=crop&dpr=1 754w, https://images.theconversation.com/files/182276/original/file-20170816-32624-fudmuu.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=510&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/182276/original/file-20170816-32624-fudmuu.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=510&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Can you see what I see?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/blue-sky-background-white-cloud-look-462248776?src=L991bCv37XHH-RRYqdoa6Q-1-17">prapann/Shutterstock</a></span>
</figcaption>
</figure>
<p>The ability to see patterns was probably a useful survival trait for our ancestors – better to mistakenly spot signs of a predator than to overlook a real big hungry cat. But plonk the same tendency in our information rich world and we see nonexistent links between cause and effect – conspiracy theories – all over the place. </p>
<h2>Peer pressure</h2>
<p>Another reason we are so keen to believe in conspiracy theories is that we are social animals and our status in that society is much more important (from an evolutionary standpoint) than being right. Consequently we constantly compare our actions and beliefs to those of our peers, and then alter them to fit in. This means that if our social group believes something, we are more likely to follow the herd.</p>
<p>This effect of social influence on behaviour was nicely demonstrated back in 1961 by the <a href="https://www.researchgate.net/publication/232493453_Note_on_the_Drawing_Power_of_Crowds_of_Different_Size">street corner experiment</a>, conducted by the US social psychologist Stanley Milgram (better known for his work on <a href="https://www.simplypsychology.org/milgram.html">obedience to authority figures</a>) and colleagues. The experiment was simple (and fun) enough for you to replicate. Just pick a busy street corner and stare at the sky for 60 seconds. </p>
<p>Most likely very few folks will stop and check what you are looking at – in this situation Milgram found that about 4% of the passersby joined in. Now get some friends to join you with your lofty observations. As the group grows, more and more strangers will stop and stare aloft. By the time the group has grown to 15 sky gazers, about 40% of the passersby will have stopped and craned their necks along with you. You have almost certainly seen the same effect in action at markets, where you find yourself drawn to the stand with the crowd around it. </p>
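<p>Milgram’s two reported figures – roughly 4% for a lone sky-gazer and 40% for a group of 15 – are loosely consistent with a simple compounding model, in which each gazer independently has a small chance of catching a passerby’s attention. The sketch below is my own toy illustration, not Milgram’s analysis, and the 3.5% per-gazer figure is an assumed parameter chosen to fit those two numbers.</p>

```python
def stop_fraction(group_size, per_gazer_pull=0.035):
    """Toy model: each sky-gazer independently has a small chance of
    catching a passerby's attention; the chances compound with group size."""
    return 1 - (1 - per_gazer_pull) ** group_size

for k in (1, 5, 15):
    print(f"group of {k:2d}: {stop_fraction(k):.1%} of passersby stop")
```

<p>Under these assumptions a lone gazer pulls in about 3.5% of passersby and a group of 15 about 41% – close to the figures in the study, though the real mechanism is social, not a set of independent coin flips.</p>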
<p>The principle applies just as powerfully to ideas. If <a href="http://journals.sagepub.com/doi/abs/10.1177/001872675400700202">more people believe a piece of information</a>, then we are more likely to accept it as true. And so if, via our social group, we are overly exposed to a particular idea then it becomes embedded in our world view. In short, <a href="http://www.pnas.org/content/111/Supplement_4/13650.full">social proof</a> is a much more effective persuasion technique than purely evidence-based proof, which is of course why it is so popular in advertising (“80% of mums agree”).</p>
<p>Social proof is just one of a host of <a href="http://utminers.utep.edu/omwilliamson/ENGL1311/fallacies.htm">logical fallacies</a> that also cause us to overlook evidence. A related issue is the ever-present <a href="https://theconversation.com/confirmation-bias-a-psychological-phenomenon-that-helps-explain-why-pundits-got-it-wrong-68781">confirmation bias</a>, that tendency for folks to seek out and believe the data that supports their views while discounting the stuff that doesn’t. We all suffer from this. Just think back to the last time you heard a debate on the radio or television. How convincing did you find the argument that ran counter to your view compared to the one that agreed with it? </p>
<p>The chances are that, whatever the rationality of either side, you largely dismissed the opposition arguments while applauding those who agreed with you. Confirmation bias also manifests as a tendency to select information from sources that already agree with our views (which probably comes from the social group that we relate to). Hence your political beliefs probably dictate your preferred news outlets.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=1130&fit=crop&dpr=1 600w, https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=1130&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=1130&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1420&fit=crop&dpr=1 754w, https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1420&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/182127/original/file-20170815-26751-1sc8bar.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1420&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">The difference.</span>
</figcaption>
</figure>
<p>Of course there is a belief system that recognises logical fallacies such as confirmation bias and tries to iron them out. Science, through repetition of observations, turns anecdote into data, reduces confirmation bias and accepts that theories can be updated in the face of evidence. That means that it is open to correcting its core texts. Nevertheless, confirmation bias plagues us all. Star physicist <a href="http://calteches.library.caltech.edu/51/2/CargoCult.htm">Richard Feynman</a> famously described an example of it that cropped up in one of the most rigorous areas of sciences, particle physics. </p>
<blockquote>
<p>“Millikan measured the charge on an electron by an experiment with falling oil drops and got an answer which we now know not to be quite right. It’s a little bit off, because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of the electron, after Millikan. If you plot them as a function of time, you find that one is a little bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.”</p>
<p>“Why didn’t they discover that the new number was higher right away? It’s a thing that scientists are ashamed of – this history – because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan’s value they didn’t look so hard.”</p>
</blockquote>
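<p>Feynman’s story is, at heart, an anchoring process: individually honest measurements get “re-checked” and corrected toward the accepted value, so the consensus creeps toward the truth rather than jumping there. The simulation below is entirely my own caricature – the shrinkage factor, noise level and starting values are assumptions, not historical data – but it reproduces the slow drift he describes.</p>

```python
import random

random.seed(1)
TRUE_VALUE = 1.602      # true electron charge (in units of 10^-19 C)
anchor = 1.56           # Millikan-style low starting value
published = [anchor]

for _ in range(20):
    measured = random.gauss(TRUE_VALUE, 0.02)   # an unbiased experiment
    # Confirmation bias: results well above the accepted value get
    # "re-checked" and nudged back toward the anchor before publication.
    if measured > anchor + 0.01:
        measured = anchor + 0.6 * (measured - anchor)
    published.append(measured)
    anchor = sum(published) / len(published)    # consensus drifts slowly

print(f"final consensus: {anchor:.3f} (true value {TRUE_VALUE})")
```

<p>Each published number sits a little above the last, yet after twenty rounds the consensus still falls short of the true value – the staircase pattern Feynman describes, produced purely by asymmetric scrutiny of the results.</p>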
<h2>Myth-busting mishaps</h2>
<p>You might be tempted to take a lead from popular media by tackling misconceptions and conspiracy theories via the myth-busting approach. Naming the myth alongside the reality seems like a good way to compare fact and falsehood side by side so that the truth will emerge. But once again this turns out to be a bad approach: it appears to elicit something that has come to be known as the <a href="http://journals.sagepub.com/doi/pdf/10.1177/1075547015613523">backfire effect</a>, whereby the myth ends up becoming more memorable than the fact. </p>
<p>One of the most <a href="https://clinicaltrials.gov/ct2/show/NCT00296270">striking examples of this</a> was seen in a study evaluating a “Myths and Facts” flyer about flu vaccines. Immediately after reading the flyer, participants accurately remembered the facts as facts and the myths as myths. But just 30 minutes later this had been completely turned on its head, with the myths being much more likely to be remembered as “facts”.</p>
<p>The thinking is that merely mentioning the myths actually helps to reinforce them. And then as time passes you forget the context in which you heard the myth – in this case during a debunking – and are left with just the memory of the myth itself.</p>
<p>To make matters worse, presenting corrective information to a group with firmly held beliefs can actually <a href="https://link.springer.com/article/10.1007%2Fs11109-010-9112-2">strengthen their view</a>, despite the new information undermining it. New evidence creates inconsistencies in our beliefs and an associated emotional discomfort. But instead of modifying our belief we tend to invoke self-justification and even stronger dislike of opposing theories, which can make us more <a href="https://theconversation.com/brexit-trump-and-post-truth-the-science-of-how-we-become-entrenched-in-our-views-69228">entrenched in our views</a>. This has become known as the “boomerang effect” – and it is a huge problem when trying to nudge people towards better behaviours. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/182295/original/file-20170816-11616-hzca81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/182295/original/file-20170816-11616-hzca81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/182295/original/file-20170816-11616-hzca81.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/182295/original/file-20170816-11616-hzca81.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/182295/original/file-20170816-11616-hzca81.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/182295/original/file-20170816-11616-hzca81.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/182295/original/file-20170816-11616-hzca81.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Pick your words wisely.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/distraught-looking-conspiracy-believer-suit-aluminum-241641235?src=LH0GW9KFblvG5Adt9DqG3A-1-11">Kunertus/Shutterstock</a></span>
</figcaption>
</figure>
<p>For example, studies have shown that public information messages aimed at reducing smoking, alcohol and drug consumption <a href="https://books.google.co.uk/books?id=Q16NAgAAQBAJ&pg=PA3&lpg=PA3&dq=The+%E2%80%9Cboomerang%E2%80%9D+effect:+A+synthesis+of+findings+and+a+preliminary+theoretical+framework&source=bl&ots=1RRJ4FXBI5&sig=5de1GynumjUQGW_MAbX4BmzTpM8&hl=en&sa=X&ved=0ahUKEwi2hvLr5djVAhUqBsAKHaZ7AJ0Q6AEIUDAH#v=onepage&q=The%20%E2%80%9Cboomerang%E2%80%9D%20effect%3A%20A%20synthesis%20of%20findings%20and%20a%20preliminary%20theoretical%20framework&f=false">all had the reverse effect</a>. </p>
<h2>Make friends</h2>
<p>So if you can’t rely on the facts how do you get people to bin their conspiracy theories or other irrational ideas?</p>
<p>Scientific literacy will probably help in the long run. By this I don’t mean a familiarity with scientific facts, figures and techniques. Instead what is needed is literacy in the scientific method, such as analytical thinking. And indeed <a href="http://www.sciencedirect.com/science/article/pii/S0010027714001632">studies show</a> that dismissing conspiracy theories is associated with more analytic thinking. Most people will never do science, but we come across it and use it daily, so <a href="http://journals.sagepub.com/doi/abs/10.1177/0270467614529707">citizens need the skills</a> to critically assess scientific claims. </p>
<p>Of course, altering a nation’s curriculum isn’t going to help with my argument on the train. For a more immediate approach, it’s important to realise that being part of a tribe helps enormously. Before starting to preach the message, find some common ground.</p>
<p>Meanwhile, to avoid the backfire effect, ignore the myths. Don’t even mention or acknowledge them. Just make the key points: vaccines are safe and <a href="https://www.cdc.gov/flu/about/qa/vaccineeffect.htm">reduce the chances of getting flu by between 50% and 60%</a>, full stop. Don’t mention the misconceptions, as they tend to be better remembered.</p>
<p>Also, don’t get your opponents’ dander up by challenging their worldview. Instead offer explanations that chime with their preexisting beliefs. For example, conservative climate-change deniers are much <a href="http://journals.sagepub.com/doi/abs/10.1177/0146167209351435">more likely to shift their views</a> if they are also presented with pro-environment business opportunities.</p>
<p>One more suggestion. Use stories to make your point. People engage with <a href="http://www.pnas.org/content/111/Supplement_4/13614.full">narratives</a> much more strongly than with argumentative or descriptive dialogues. Stories link cause and effect making the conclusions that you want to present seem almost inevitable.</p>
<p>All of this is not to say that the facts and a scientific consensus aren’t important. They are critically so. But an awareness of the flaws in our thinking allows you to present your point in a far more convincing fashion. </p>
<p>It is vital that we challenge dogma, but instead of linking unconnected dots and coming up with a conspiracy theory we need to demand the evidence from decision makers. Ask for the data that might support a belief and hunt for the information that tests it. Part of that process means recognising our own biased instincts, limitations and logical fallacies. </p>
<p>So how might my conversation on the train have gone if I’d heeded my own advice? Let’s go back to that moment when I observed that things were taking a turn down crackpot alley. This time, I take a deep breath and chip in with:</p>
<p>“Hey, great result at the game. Pity I couldn’t get a ticket.”</p>
<p>Soon we’re deep in conversation as we discuss the team’s chances this season. After a few minutes’ chatter I turn to the lunar landing conspiracy theorist “Hey, I was just thinking about that thing you said about the moon landings. Wasn’t the sun visible in some of the photos?”</p>
<p>He nods.</p>
<p>“Which means it was daytime on the moon, so just like here on Earth would you expect to see any stars?”</p>
<p>“Huh, I guess so, hadn’t thought of that. Maybe that blog didn’t have it all right.”</p>
<p class="fine-print"><em><span>Mark Lorch does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p class="fine-print"><em>Mark Lorch, Professor of Science Communication and Chemistry, University of Hull. Licensed as Creative Commons – attribution, no derivatives.</em></p>