Cultural cognition theory – The Conversation (topic feed, updated 2017-07-05)
<h1>Facts versus feelings isn’t the way to think about communicating science</h1>
<p class="fine-print"><em>Published 2017-07-05.</em></p>
<figure><img src="https://images.theconversation.com/files/176990/original/file-20170705-3057-4s5j0t.jpg?ixlib=rb-1.1.0&rect=449%2C368%2C5550%2C3440&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">The message might not come through if you put all your communication eggs in one theoretical basket.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/many-eggs-one-basket-bubble-on-336910895">buydeephoto/Shutterstock.com</a></span></figcaption></figure><p>In a world where <a href="https://www.oxforddictionaries.com/press/news/2016/12/11/WOTY-16">“post-truth” was 2016’s word of the year</a>, many people are starting to doubt the efficacy of facts. Can science make sense of anti-science and post-truthism? More generally, how can we understand what drives people’s beliefs, decisions and behaviors?</p>
<p>Scientists have developed many theories to describe <a href="https://doi.org/10.1146/annurev.psych.59.103006.093629">how people process and think about information</a>. Unfortunately, there’s an increasing tendency to see people as creatures whose reasoning mechanisms are largely dependent on a narrow set of processes. For example, one popular theory suggests that if we just communicate more accurate information to people, their behavior will change accordingly. Another suggests that people will reject evidence if it threatens their deeply held cultural worldviews and associated feelings. </p>
<p>It’s more important than ever that our approach to communication be evidence-based and built on a strong theoretical foundation. Many of these models contribute valuable insights and can help us design better communication, but each on its own is incomplete. Yet science communicators have a tendency to oversimplify, focusing on a single model and disregarding other theories.</p>
<p>We suggest that this is a dangerous practice and less effective than <a href="https://doi.org/10.1038/nclimate3323">a more nuanced and holistic view</a>. The apparent choice between “fact” and “feeling,” or between “cognition” and “culture,” is a false dilemma. In reality, both are related and address different pieces of the decision-making puzzle. </p>
<h2>Thinking versus feeling</h2>
<p>One well-known theory about how people absorb new facts is the “<a href="https://doi.org/10.1177/0963662504042690">information deficit model</a>.” The main idea here is straightforward: If you throw more facts at people, they’ll eventually come around on an issue.</p>
<p>Most behavioral science scholars agree that this model of human thinking and behavior <a href="https://doi.org/10.17226/23674">is clearly incomplete</a> – people rely on a range of other cues besides facts in guiding their attitudes and behavior. For example, sometimes we simply act based on how we feel about an issue. Unfortunately, the facts don’t always convince.</p>
<p>But the term “information deficit” is problematic, too. People tend to have limited information in most areas of life. For example, we often don’t know the thoughts and feelings of other people we trust and value. Similarly, we might have limited knowledge about appropriate cultural norms when traveling to a new country, and so on. “Information deficit,” then, isn’t a very meaningful term for theorizing about human thinking.</p>
<p>Another theory about human thinking is called “cultural cognition.” In brief, it suggests that our cultural values and worldviews shape <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1123807">how we think about science and society</a>.</p>
<p>It’s easy to be duped into thinking of the human brain as a sponge that soaks up only the information it wants to believe. For example, the theory suggests that people’s position on divisive issues such as climate change is not informed by scientific evidence but rather by the strong commitment people have to their political groups and ideologies. The idea is that to protect our cultural worldviews, we actively reject evidence that threatens them – think of someone who fears that government action on climate change undermines the free market.</p>
<p>This narrative sounds appealing on the surface: Humans organize themselves in groups, and much psychological research has shown that we derive part of our <a href="http://psycnet.apa.org/psycinfo/1990-98968-000">social identities from the group affiliations</a> we maintain. </p>
<p>Yet <a href="https://doi.org/10.1177/1075547015614970">its focus is overly narrow</a>, and there’s a logical fallacy in this conception of human thinking. We belong to many groups at any given time, and we juggle many different public and private identities. The real question is one of nuance: When and under what conditions is someone motivated to reject scientific facts in favor of their cultural worldview? </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=334&fit=crop&dpr=1 600w, https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=334&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=334&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=419&fit=crop&dpr=1 754w, https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=419&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/176996/original/file-20170705-9733-1olaqr7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=419&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">It’s not a zero-sum contest between feelings and facts.</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/image-illustration/brain-heart-on-balance-scale3d-rendering-542556157">haryigit/Shutterstock.com</a></span>
</figcaption>
</figure>
<h2>Either/or misses the point</h2>
<p>To throw all our fact-disseminating eggs into one or the other theoretical basket is overly simplistic and deprives us of important insights. </p>
<p>A more nuanced perspective recognizes that facts and information are embedded in social and cultural contexts. For example, people’s perception of expert consensus matters a great deal, especially on contested issues, and is often described as a <a href="https://doi.org/10.1371/journal.pone.0118489">gateway belief that influences a range of other attitudes</a> about an issue. That vaccines do not cause autism and that climate change is human-caused are scientific facts, backed by near-unanimous expert consensus. At the same time, consensus information is also inherently social: It describes the extent of agreement within an influential group of experts.</p>
<p>People often want to be <a href="https://books.google.co.uk/books?hl=en&lr=&id=lsF8zLomQOoC&oi=fnd&pg=PA195&dq=chaiken+accuracy+motivation+1980&ots=30E-57CZyN&sig=zsV8X1SjojZd-_7dyNel5wCPj54#v=onepage&q=chaiken%20accuracy%20motivation%201980&f=false">accurate</a> in their views, and, in an uncertain world bounded by limited time and effort, we make strategic bets on what information to take into account. Consensus acts as a natural heuristic, or mental shortcut, for complicated scientific issues. <a href="https://doi.org/10.1371/journal.pone.0118489">Numerous</a> <a href="https://doi.org/10.1038/nclimate1720">studies</a> have found that highlighting scientific agreement on human-caused global warming can help neutralize and reduce conflicting views about climate change.</p>
<p>Similarly, while some studies have found a limited effect of knowledge on judgment, when you dig deeper into the data, a more nuanced and insightful picture emerges. For example, some studies claim that a <a href="https://doi.org/10.1038/nclimate1547">deficit in scientific “knowledge” does not explain</a> why people are divided on contested issues such as climate change. But what’s being measured in these experiments matters. Indeed, indicators such as how well people understand numbers or their scientific literacy – which is what some of these studies actually quantify – are categorically different from measuring specific knowledge people have about a topic such as climate change. In fact, a survey across six countries found that when people <a href="https://doi.org/10.1038/nclimate2997">understand the causes of climate change</a>, their concern increases accordingly, irrespective of their values. Similarly, <a href="https://doi.org/10.1111/tops.12187">other</a> <a href="https://doi.org/10.1016/j.jenvp.2017.04.008">studies</a> show that explanations about the mechanisms of climate change can reduce biased evaluations of evidence as well as political polarization. </p>
<p>In short, facts do matter.</p>
<h2>How people think is complex and nuanced</h2>
<p>Indeed, there is no need to throw out <a href="https://bppblog.com/2017/06/01/save-the-baby-in-the-bath-water/">the baby with the bathwater</a>. Instead, we need to dispel the false dichotomies and folk psychology about human thinking that currently dominate the media. Repeating the story that people don’t care about facts runs the risk of becoming a self-fulfilling prophecy. A holistic view acknowledges that people rely on cognitive shortcuts and emotions, care about social norms and group identities, and are sometimes motivated in their reasoning, but it also recognizes the research showing that most people fundamentally want to hold accurate perceptions of the world. </p>
<p>This is particularly important as the public is currently hampered by misinformation and fake news. In two <a href="https://doi.org/10.1002/gch2.201600008">separate</a> <a href="https://doi.org/10.1371/journal.pone.0175799">studies</a>, we each found that misinformation about climate change has a disproportionate influence on public attitudes and opinion. However, we also found that inoculating people against the false arguments neutralized misinformation’s influence, across the political spectrum. In essence, teaching people which false arguments might be deployed helped them overcome their cultural biases. Other work similarly <a href="http://onlinelibrary.wiley.com/doi/10.1111/jcom.12171/abstract">shows</a> that the politicization of science can be counteracted with inoculation.</p>
<p>People are complex, social and affected by a diverse range of influences depending on the situation. We want to hold accurate views, but emotion, group identities and conflicting goals can get in the way. Incorporating these different insights into human thinking enriches our understanding of how people form opinions and make decisions.</p>
<p>Effective science communication requires an inclusive, holistic approach that integrates different social science perspectives. Focusing simplistically on a single perspective paints a limited and increasingly inaccurate view of how humans form judgments about social and scientific issues.</p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointments.</span></em></p>
<p class="fine-print"><em>Reports of facts’ death have been greatly exaggerated. Effective communication jettisons the false dilemma in favor of a more holistic view of how people take in new information on contentious topics.</em></p>
<p class="fine-print">John Cook, Research Assistant Professor, Center for Climate Change Communication, George Mason University; Sander van der Linden, Director, Cambridge Social Decision-Making Lab, University of Cambridge. Licensed as Creative Commons – attribution, no derivatives.</p>
<h1>Communicating climate change: Focus on the framing, not just the facts</h1>
<p class="fine-print"><em>Published 2017-03-06.</em></p>
<figure><img src="https://images.theconversation.com/files/159340/original/image-20170303-29002-1h47na1.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How you package the information matters.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/picture-frame-desert-386830909">Frame image via www.shutterstock.com.</a></span></figcaption></figure><p>Humans are currently in a war against global warming. Or is it a race against global warming? Or maybe it’s just a problem we have to deal with?</p>
<p>If you already consider climate change a pressing issue, you might not think carefully about the way you talk about it – regardless of how you discuss it, you already think of global warming as a problem. But the way we talk about climate change affects the way people think about it.</p>
<p>For scientific evidence to shape people’s actions – both personal behaviors like recycling and choices on policies to vote for – it’s crucial that science be communicated to the public effectively. Social scientists have been increasingly studying the science of science communication, to better understand what does and does not work for discussing different scientific topics. It turns out the language you use and how you frame the discussion can make a big difference.</p>
<h2>The paradox of science communication</h2>
<p>“Never have human societies known so much about mitigating the dangers they faced <a href="http://dx.doi.org/10.2139/ssrn.2562025">but agreed so little</a> about what they collectively know,” writes Yale law professor <a href="http://www.culturalcognition.net/kahan/">Dan Kahan</a>, a leading researcher in the science of science communication.</p>
<p><a href="http://dx.doi.org/10.1111/pops.12244">Kahan’s work</a> shows that just because someone has scientific knowledge, he or she won’t necessarily hold science-supported beliefs about controversial topics like global warming, private gun possession or fracking.</p>
<p>Instead, <a href="http://dx.doi.org/10.1038/488255a">beliefs are shaped by the social groups</a> people consider themselves to be a part of. We’re all simultaneously members of many social groups – based, for example, on political or religious affiliation, occupation or sexuality. If people are confronted with scientific evidence that seems to attack their group’s values, they’re likely to become defensive. They may consider the evidence they’ve encountered to be flawed, and strengthen their conviction in their prior beliefs.</p>
<p>Unfortunately, scientific evidence does sometimes contradict some groups’ values. For example, some religious people trust a strict reading of the Bible: God said there would be four seasons, and hot and cold, so they don’t worry about the patterns in climate that alarm scientists. In cases like this one, how can communicators get their message across? </p>
<p>A growing body of research suggests that instead of bombarding people with piles of evidence, science communicators can focus more on how they present it. The problem isn’t that people haven’t been given enough facts. It’s that they haven’t been given facts in the right ways. Researchers often refer to this packaging as framing. Just as picture frames enhance and draw attention to parts of an image inside, linguistic frames can do the same with ideas.</p>
<p>One framing technique Kahan encourages is <a href="http://www.nature.com/news/why-we-are-poles-apart-on-climate-change-1.11166">disentangling facts from people’s identities</a>. Biologist Andrew Thaler describes one way of doing so in a post called <a href="http://www.southernfriedscience.com/when-i-talk-about-climate-change-i-dont-talk-about-science/">“When I talk about climate change, I don’t talk about science</a>.” Instead, he talks about things that are important to his audiences, such as fishing, flooding, farming, faith and the future. These issues that matter to the people with whom he’s communicating become an entry into discussing global warming. Now they can see scientific evidence as important to their social group identity, not contradictory to it.</p>
<h2>Let me rephrase that</h2>
<p>Metaphors also provide frames for talking about climate change. Recent work by psychologists <a href="http://www.stephenflusberg.com/">Stephen Flusberg</a>, <a href="https://sites.google.com/a/oberlin.edu/thibodeau/home">Paul Thibodeau</a> and <a href="http://teeniematlock.com/">Teenie Matlock</a> suggests that the metaphors we use to describe global warming can influence people’s beliefs and actions.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=961&fit=crop&dpr=1 600w, https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=961&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=961&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1207&fit=crop&dpr=1 754w, https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1207&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/159332/original/image-20170303-29032-144rzey.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1207&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Ready for combat?</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/thomashawk/2346593616">Thomas Hawk</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span>
</figcaption>
</figure>
<p>The researchers asked 3,000 Americans on an online platform to read a short fictional news article about climate change. The articles were exactly the same, but they used different metaphors: One referred to the “war against” and another to the “race against” climate change. For example, each article included phrases about the U.S. seeking to either “combat” (war) or “go after” (race) excessive energy use.</p>
<p>After reading just one of these passages, participants answered questions about their global warming beliefs, like how serious global warming is and whether they would be willing to engage in more pro-environmental behaviors.</p>
<p>Metaphors mattered. Reading about the “war” against global warming led to greater agreement with scientific evidence showing it is real and human-caused. This group of participants indicated more urgency for reducing emissions, believed global warming poses a greater risk and responded that they were more willing to change their behaviors to reduce their carbon footprint than people who read about the “race” against global warming.</p>
<p>The only difference between the articles that participants read was the metaphors they included. Why would reading about a war rather than a race affect people’s beliefs about climate change in such important ways?</p>
<p>The researchers suggest that when we encounter war metaphors, we are reminded (though not always consciously) of other war-related concepts like death, destruction, opposition and struggle. These concepts affect our emotions and remind us of the negative feelings and consequences of defeat. With those war-related thoughts in mind, we may be motivated to avoid losing. If we have these war thoughts swimming around in our minds when we think about global warming, we’re more likely to believe it’s important to defeat the opponent, which, in this case, is global warming. </p>
<p>Other analogies are good at conveying the causes and consequences of global warming. Work by psychologists <a href="http://www-personal.umich.edu/%7Ekraimi/">Kaitlin Raimi</a>, <a href="https://www.wilsoncenter.org/person/paul-c-stern">Paul Stern</a> and <a href="http://www.vanderbilt.edu/viee/profiles/Alex-Maki.php">Alexander Maki</a> suggests it helps to point out how global warming is <a href="http://dx.doi.org/10.1371/journal.pone.0171130">similar to many medical diseases</a>. For both, risks are often caused or aggravated by human behaviors; the processes are often progressive; they produce symptoms outside the normal range of past experiences; there are uncertainties in the prognosis of future events; treatment often involves trade-offs or side effects; it’s usually most effective to treat the underlying problem instead of just alleviating symptoms; and they’re hard to reverse.</p>
<p>People who read the medical disease analogy for climate change were more likely to agree with the science-backed explanations for global warming causes and consequences than those who read a different analogy or no analogy at all.</p>
<h2>Golden past or rosy future?</h2>
<p>Climate change messages can also be framed by focusing on different time periods. Social psychologists <a href="http://soccco.uni-koeln.de/matthew-baldwin.html">Matthew Baldwin</a> and <a href="https://lammers.socialpsychology.org/">Joris Lammers</a> asked people to <a href="http://dx.doi.org/10.1073/pnas.1610834113">read either a past-focused climate change message</a> (like “Looking back to our nation’s past… there was less traffic on the road”) or a similar future-focused message (“Looking forward to our nation’s future… there is increasing traffic on the road”).</p>
<p>The researchers found that self-identified conservatives, who <a href="http://www.pewinternet.org/2016/10/04/the-politics-of-climate/">tend to resist climate change messages more than liberals</a>, agreed that we should change how we interact with the planet more after reading the past-focused passage. Liberals, on the other hand, reported liking the future-focused frame better, but the frames had no influence on their environmental attitudes.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=825&fit=crop&dpr=1 600w, https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=825&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=825&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1037&fit=crop&dpr=1 754w, https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1037&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/159338/original/image-20170303-29034-1fqougm.jpeg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1037&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Example of a past-focused image (top) and a future-focused image (bottom) of a reservoir.</span>
<span class="attribution"><a class="source" href="http://www.pnas.org/content/113/52/14953/F3.expansion.html">Image courtesy of NASA. Used in Baldwin and Lammers, PNAS December 27, 2016 vol. 113 no. 52 14953-14957.</a></span>
</figcaption>
</figure>
<p>And the frames didn’t have to be words. Conservatives also shifted their beliefs to be more pro-environmental after seeing past-focused images (<a href="http://www.pnas.org/content/113/52/14953/F3.expansion.html">satellite images</a> that progressed from the past to today) more than after seeing future-focused ones (satellite images that progressed from today into the future). Liberals showed no differences in their attitudes after seeing the two frames.</p>
<p>Many climate change messages focus on the potential future consequences of not addressing climate change now. This research on time-framing suggests that such a forward-looking message may in fact be unproductive for those who already tend to resist the idea. </p>
<p>There’s no one-size-fits-all frame for motivating people to care about climate change. Communicators need to <a href="http://collabra.org/articles/10.1525/collabra.68/">know their audience and anticipate their reactions</a> to different messages. When in doubt, though, these studies suggest science communicators might want to bring out the big guns and encourage people to fire away in this war on climate change, while reminding them how wonderful the Earth used to be before our universal opponent began attacking full force.</p>
<p class="fine-print"><em><span>Rose Hendricks receives funding from the NSF GRFP.</span></em></p>
<p class="fine-print"><em>Are we in a race against climate change? Or is it a war? How does thinking of the past or the future affect your support for the science? Researchers are learning how metaphors and context matter.</em></p>
<p class="fine-print">Rose Hendricks, Ph.D. Candidate in Cognitive Science, University of California, San Diego. Licensed as Creative Commons – attribution, no derivatives.</p>
<h1>Why do science issues seem to divide us along party lines?</h1>
<p class="fine-print"><em>Published 2016-10-17.</em></p>
<figure><img src="https://images.theconversation.com/files/141672/original/image-20161013-3944-2e6h1d.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">There’s more to it than political beliefs.</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-323678543.html">Buttons image via www.shutterstock.com.</a></span></figcaption></figure><p>Much has been made about the <a href="http://www.vox.com/science-and-health/2016/10/10/13227682/trump-clinton-climate-energy-difference">predictable partisan split</a> between presidential candidates Hillary Clinton and Donald Trump on <a href="http://sciencedebate.org/20questions">issues of science and public policy</a>. But what about their supporters? Can Americans really be that far apart in terms of science?</p>
<p>That liberals and conservatives have different opinions toward science is taken as a given. Typically, <a href="http://www.motherjones.com/environment/2014/09/left-science-gmo-vaccines">conservatives are painted as anti-science</a>, with some studies suggesting their <a href="http://doi.org/10.1177/0003122412438225">mistrust of science is increasing</a>. Liberals, on the other hand, are usually assumed to be more <a href="http://www.motherjones.com/environment/2014/09/left-science-gmo-vaccines">receptive to science in general and more supportive of using science to shape policy</a>. </p>
<p>Noting that party affiliation is different from political ideology – not everyone who identifies as liberal is a Democrat and not everyone who identifies as conservative is a Republican – these characterizations certainly seem to be true when we look at major leaders of the political parties. <a href="https://www.washingtonpost.com/news/the-fix/wp/2015/02/26/jim-inhofes-snowball-has-disproven-climate-change-once-and-for-all/">Many</a> <a href="http://www.motherjones.com/environment/2016/03/marco-rubio-had-some-really-dumb-things-say-about-climate-change-last-nigh">Republican</a> <a href="http://www.motherjones.com/environment/2016/01/ted-cruz-satellite-date-climate-change">politicians</a> <a href="https://www.theguardian.com/us-news/2016/apr/15/sarah-palin-bill-nye-climate-change-hustle-film">have</a> <a href="http://www.desmoinesregister.com/story/opinion/columnists/kathie-obradovich/caucus/2015/05/05/ben-carson-climate-change-renewable-fuel-standard/26945261/">publicly</a> <a href="http://www.motherjones.com/politics/2015/06/rick-perry-climate-change-skeptic-oops">expressed</a> <a href="http://www.cbsnews.com/news/paul-ryan-whos-to-blame-climate-change/">doubts</a> over the scientific consensus on climate change, for instance. At the top of the Republican presidential ticket is Donald Trump, who has <a href="http://www.politifact.com/truth-o-meter/statements/2016/jun/03/hillary-clinton/yes-donald-trump-did-call-climate-change-chinese-h/">called climate change a Chinese hoax</a> and is on the record as supporting any number of <a href="https://theconversation.com/the-rise-of-a-conspiracy-candidate-65514">other conspiracy theories</a>. Conversely, Hillary Clinton’s line at the Democratic National Convention – “<a href="http://www.slate.com/articles/health_and_science/science/2016/07/hillary_clinton_believes_in_science_that_shouldn_t_be_noteworthy.html">I believe in science</a>” – was met with resounding applause.</p>
<p>Assuming that the stated views of outspoken politicians reflect the personal beliefs of voters within their parties is tempting. After all, voters elect politicians, presumably on the basis of having comparable worldviews. But research suggests that the <a href="http://www.sciencemag.org/news/2015/02/politics-science-and-public-attitudes-what-we-re-learning-and-why-it-matters">link between partisanship and views on science may not be so cut and dried</a>. Buried in the data is a much more nuanced relationship that’s well worth examining. As a sociologist who focuses on ways to communicate science issues to the public, I’m interested in how a more clear-eyed view of this connection could be used to help combat anti-science attitudes.</p>
<h2>Quantifying the science trust gap</h2>
<p>In 2015, researchers asked 2,000 registered voters <a href="http://doi.org/10.1177/0002716214554756">how deferential they felt politicians should be to science</a> when creating public policy on a variety of issues. On a 10-point scale, participants rated whether politicians should follow the advice of scientists (10), consider scientific findings in conjunction with other factors (5) or ignore scientific findings completely (1). Issues included climate change, legalizing drug usage, fetal viability, regulating nuclear power and teaching evolution, among other topics.</p>
<p>The participants then responded to questions about their political affiliation and ideological views, religious beliefs and other demographic variables.</p>
<p>Most people supported trusting the recommendations of scientists on policy issues, even politically contentious ones. The average score for all participants across all issues was 6.4, and the lowest-scoring issue (letting same-sex couples adopt children) was 4.9. The results suggest, in other words, that even on divisive issues, Americans think that politicians should take scientific recommendations into consideration when making public policy.</p>
<p>Breaking down responses based on political leanings did reveal some partisan differences. When it comes to deferring to scientific experts on policy issues, conservatives and independents look a lot alike. Averaged across issues, independents said policymakers should weigh science and other factors more or less evenly (5.84), only slightly more than conservatives did (5.58). Liberals, on the other hand, expressed much higher rates of deference to science – across issues, they averaged 7.46.</p>
<p>These findings are interesting because we tend to think of independents as the middle of the road in American politics. If conservatives and independents are on the same page, though, then liberals are the outliers. Rather than most people emphasizing science while conservatives steadfastly ignore it, the truth is that many people want other factors included in policy discussions – and it’s liberals who stray furthest from the pack, wanting more emphasis on science than their peers. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=501&fit=crop&dpr=1 600w, https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=501&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=501&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=630&fit=crop&dpr=1 754w, https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=630&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/141688/original/image-20161013-3944-11x94ia.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=630&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Do these stem cells strike you as more liberal or conservative?</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/pennstatelive/8972110324">Penn State</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-nd/4.0/">CC BY-NC-ND</a></span>
</figcaption>
</figure>
<h2>It’s not their politics, it’s their values</h2>
<p>Other research has similarly found that science denial can run the political spectrum. For instance, <a href="http://doi.org/10.1177/2158244013518932">another study examined</a> attitudes about climate change, evolution and stem cell research and found that partisan identification was not necessarily a good predictor of how someone will feel about these controversial issues. In fact, very few participants were found to be skeptical of science across the board. And reactions to these specific issues were more tightly linked with religious attitudes than with political ones. </p>
<p>Other scholarship <a href="http://doi.org/10.1177/0002716214554756">echoes these findings</a>. Indeed, <a href="http://doi.org/10.1177/0003122414558919">research does suggest</a> that a certain segment of the population places more trust in religion than in science for understanding the world. But even among this group, science and religion are seen as conflicting only on certain topics, including the Big Bang and evolution.</p>
<p>One area in which political beliefs do have an impact is the <a href="http://doi.org/10.1088/1748-9326/8/4/044029">kinds of scientists that liberals and conservatives are likely to trust</a>. A 2013 study of 798 participants found that conservatives put more faith in scientists involved in economic production – food scientists, industrial chemists and petroleum geologists, for instance – than in scientists involved in areas associated with regulation, such as public health and environmental science. The opposite was true for liberals. Again, this suggests that it’s not simply a matter of conservatives being skeptical of science in general; there’s a much more nuanced relationship between political leanings and trust in scientific expertise.</p>
<p>So why does it appear that liberals and conservatives are living in different worlds when it comes to issues of science? Partisanship clearly plays some role in how people view science and their willingness to trust scientific information. And because these disagreements tend to come on high-profile issues like climate change and evolution, about which there is already so much controversy, it’s easy to get the impression that the liberal and conservative divide on science must run incredibly deep.</p>
<h2>It comes down to cultural cognition</h2>
<p>To help explain why people fall in line with their fellow partisans on these high-profile issues, consider the theory of <a href="http://www.culturalcognition.net/browse-papers/cultural-cognition-as-a-conception-of-the-cultural-theory-of.html">cultural cognition</a>. This social sciences concept suggests it’s hard for people to <a href="http://doi.org/10.1038/488255a">accept new information that poses a threat to their values system</a>. Addressing climate change, for instance, is <a href="http://dukespace.lib.duke.edu/dspace/bitstream/handle/10161/9256/Campbell%20et%20al._Solution%20Aversion.pdf">often talked about in terms of government regulation</a> of carbon pollution. For conservatives who oppose government involvement in the economy, this poses a threat to an idea they hold very dear.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/141682/original/image-20161013-3958-flsylr.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">People like to stick together and share beliefs commonly held within their group.</span>
<span class="attribution"><a class="source" href="http://www.shutterstock.com/pic.mhtml?id=390406471">Sign image via www.shutterstock.com.</a></span>
</figcaption>
</figure>
<p>No one likes to be wrong, of course. <a href="http://doi.org/10.1038/463296a">Cultural cognition theorists take this a step further and argue</a> that there are social consequences to taking a position about a political issue that runs counter to what your community believes – just ask conservative former congressman <a href="http://www.washingtontimes.com/news/2015/mar/24/bob-inglis-advocates-action-to-fight-climate-chang/">Bob Inglis</a>, who was defeated by a primary challenger in 2010 after speaking out on climate change. </p>
<p>From loss of business to strained interpersonal relationships, being the black sheep is hard. Rather than changing their beliefs about government regulation, then, it’s cognitively more comfortable for conservatives in conservative social circles to maintain skepticism about climate change. It’s less an inherent distrust of science than a need to discount the science that supports policies threatening a deeply held belief.</p>
<p>Everyone is subject to this effect. There are studies that suggest <a href="http://doi.org/10.1177/0002716214555474">it’s stronger for conservatives</a>, but liberals, too, come to mistrust scientific information when it challenges their worldviews. For instance, a 2014 study found that <a href="http://dx.doi.org/10.1037/a0037963">liberals will display the same sort of evidence-ignoring behaviors</a> as their conservative counterparts when faced with arguments that go against their beliefs about policies like gun control. (Claims about <a href="http://www.theatlantic.com/politics/archive/2013/11/the-republican-party-isnt-really-the-anti-science-party/281219/">liberals exhibiting anti-science bias</a> on the issues of vaccination and genetically modified organisms are increasing, though they <a href="https://www.washingtonpost.com/news/energy-environment/wp/2015/01/26/the-biggest-myth-about-vaccine-deniers-that-theyre-all-a-bunch-of-hippie-liberals/?utm_term=.2b8dad5caf78">are challenged by recent</a> <a href="http://blogs.discovermagazine.com/gnxp/2013/06/do-liberals-oppose-genetically-modified-organisms-more-than-conservatives/">studies</a>.)</p>
<p>In other words, these divides may not reflect Americans’ attitudes toward science so much as other cultural and personal beliefs.</p>
<h2>Get past assumptions to common ground</h2>
<p>Having a more complete understanding of when and why liberals and conservatives trust science helps avoid oversimplifications. It’s an important stopgap against using oversimplified assumptions to denigrate those who disagree with us politically.</p>
<p>None of this is to suggest that the <a href="http://www.slate.com/blogs/bad_astronomy/2016/07/12/gop_party_platform_claims_coal_is_clean.html">anti-science viewpoints</a> exhibited by Republican politicians on issues such as climate change should be ignored. Nor is it an argument that since “both sides” can fall for anti-science rhetoric, it can be waved away. </p>
<p>Rather, these findings indicate that, in theory, it’s possible liberals and conservatives could work together to encourage politicians to base policy recommendations on sound science, at least on some issues.</p>
<p>Maybe even more importantly, understanding the social and cultural issues surrounding the acceptance or rejection of science is a first step toward crafting messages that resonate with members of the public who question the science on hot-button issues. Research suggests <a href="https://vimeo.com/121145322">using the right kind of messenger</a> – someone who is trusted within the community – can be key to moving the needle. Science communications scholars have been <a href="http://frank.jou.ufl.edu/frankology/14532/">hard</a> <a href="http://frank.jou.ufl.edu/frankology/vaccine-myths/">at</a> <a href="http://frank.jou.ufl.edu/frankology/climate-change-science/">work</a> devising other tactics to help reach people on issues of science. Hopefully they’ll trust the growing body of social science evidence to help guide their efforts.</p>
<p class="fine-print"><em><span>Lauren Griffin does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p>Lauren Griffin, Co-Director of Research for frank and Manager of the Journal of Public Interest Communications, College of Journalism and Communications, University of Florida</p>
<p>Licensed as Creative Commons – attribution, no derivatives.</p>