Filter bubble – The Conversation (2021-07-13)

Targeted ads isolate and divide us even when they’re not political – new research<figure><img src="https://images.theconversation.com/files/411034/original/file-20210713-23-6dc9jt.jpeg?ixlib=rb-1.1.0&rect=17%2C116%2C5973%2C3871&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/miniature-people-separated-office-cubicles-sterile-1565199103">Zenza Flarini/Shutterstock</a></span></figcaption></figure><p>Five years since the <a href="https://edition.cnn.com/2021/06/23/uk/brexit-five-years-on-analysis-intl-cmd/index.html">Brexit vote</a> and three since the <a href="https://www.theguardian.com/news/video/2018/mar/19/everything-you-need-to-know-about-the-cambridge-analytica-expose-video-explainer">Cambridge Analytica scandal</a>, we’re now familiar with the role that targeted political advertising can play in fomenting <a href="https://theconversation.com/heres-what-happens-when-political-bubbles-collide-121856">polarisation</a>. It was <a href="https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html">revealed</a> in 2018 that Cambridge Analytica had used data harvested from 87 million Facebook profiles, without users’ consent, to help Donald Trump’s 2016 election campaign target key voters with online adverts.</p>
<p>In the years since, we’ve learned how these kinds of targeted adverts can create political <a href="https://www.techopedia.com/definition/28556/filter-bubble">filter bubbles</a> and <a href="https://www.techopedia.com/definition/23423/echo-chamber">echo chambers</a>, suspected of <a href="https://journals.sagepub.com/doi/abs/10.1177/0266382117722446?casa_token=2ayQdG9GskoAAAAA%3ArSCq4yOZ5x33tVexv0EX4jPRqQ7SNCK7z8Pfm42ooHea4Y_VdILuTGVEe7lC3CqJg8Cv1QM9mOx43g&journalCode=bira">dividing people</a> and increasing the circulation of <a href="https://www.tandfonline.com/doi/abs/10.1080/10584609.2021.1910887?journalCode=upcp20">harmful disinformation</a>.</p>
<p>But the vast majority of the ads exchanged online are commercial, not political. Commercial targeted advertising is the primary source of revenue in the <a href="https://www.bbc.com/future/article/20140509-how-much-is-your-facebook-worth">internet economy</a>, but we know little about how it affects us. We know our personal data is collected to support targeted advertising in a way that <a href="https://theconversation.com/explainer-what-is-surveillance-capitalism-and-how-does-it-shape-our-economy-119158">violates our privacy</a>. But aside from privacy considerations, how else might targeting be harming us – and how could these harms be prevented?</p>
<p>These questions motivated <a href="https://www.nature.com/articles/s42256-021-00358-3">our recent research</a>. We found that online targeted advertising also divides and isolates us by preventing us from collectively flagging ads we object to. We do this in the physical world (perhaps when we see an advert at a bus stop or train station) by alerting regulators to harmful content. But online consumers are isolated because the information they see is limited to what is targeted at them.</p>
<p>Until we address this flaw, preventing targeted adverts from isolating us from the feedback of others, regulators won’t be able to protect us from online adverts that could cause us harm.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/Q91nvbJSmS4?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Due to the sheer volume of ads exchanged online, human supervisors cannot vet each campaign. So increasingly, <a href="https://www.reuters.com/technology/ibm-explores-ai-tools-spot-cut-bias-online-ad-targeting-2021-06-24/">machine learning algorithms</a> screen the content of ads, predicting the likelihood that they may be harmful or fail to conform to standards. But these predictions can be <a href="https://www.ibm.com/blogs/research/2018/02/mitigating-bias-ai-models/">biased</a>, and they typically only ban the clearest violations. Among the many ads that pass these controls, a significant portion still contain potentially harmful content.</p>
<p>Traditionally, advertising standards authorities have taken a reactive approach to regulating advertising, relying upon consumer complaints. Take the 2015 case of Protein World’s “<a href="https://www.campaignlive.co.uk/article/asa-bans-protein-world-ad-launches-social-responsibility-probe/1345269?src_site=marketingmagazine">Beach Body</a>” campaign, which was displayed across the London Underground on billboards featuring a bikini-clad model next to the words: “Are you beach body ready?” Many commuters <a href="https://www.asa.org.uk/rulings/protein-world-ltd-a15-300099.html#.VZNwQ_ldWiQ">complained</a>, saying that it promoted harmful stereotypes. Shortly after, the ad was <a href="https://www.theguardian.com/media/2015/apr/29/beach-body-ready-ad-faces-formal-inquiry-as-campaign-sparks-outrage">banned</a> and a <a href="https://www.asa.org.uk/resource/depictions-perceptions-and-harm.html">public probe</a> into socially responsible advertising was launched.</p>
<h2>Regulating adverts</h2>
<p>The Protein World case illustrates how regulators work. Because they respond to consumer complaints, the regulator is open to considering how adverts conflict with perceived social norms. As social norms evolve over time, this helps regulators keep up with what the public considers to be harmful. </p>
<p>Consumers complained about the ad because they felt it promoted and normalised a harmful message. But it was reported that only <a href="https://www.bbc.co.uk/news/uk-33340301">378 commuters</a> raised complaints with the regulator, out of the hundreds of thousands likely to have seen the posters. This raises the question: what about all the others? If the campaign had taken place online, people wouldn’t have seen posters defaced by disgruntled commuters, and they might not have been prompted to question its message.</p>
<div data-react-class="Tweet" data-react-props="{&quot;tweetId&quot;:&quot;593819614829654019&quot;}"></div>
<p>What’s more, if the ad could have been targeted to just the subset of consumers most receptive to its message, they might not have raised any complaints. As a result, the harmful message would have gone unchallenged, missing an opportunity for the regulator to update their guidelines in keeping with current social norms.</p>
<p>Sometimes ads are harmful in a specific context, as when ads for high-fat-content foods are targeted to children, or when gambling ads target those who suffer from a gambling addiction. Targeted ads can also harm by omission. This is the case, for example, if ads for shoes crowd out job ads or public health announcements that someone might find more useful or even vital. </p>
<p>These cases can be described as contextual harms: they’re not tied to specific content, but rather depend on the context in which the ad is presented to the consumer.</p>
<p>Machine learning algorithms are bad at identifying contextual harms. Worse, the way targeting works actually amplifies them. Several <a href="https://arxiv.org/ftp/arxiv/papers/2008/2008.09656.pdf">audits</a>, for example, have uncovered how Facebook has allowed <a href="https://venturebeat.com/2020/08/28/ai-weekly-facebooks-discriminatory-ad-targeting-illustrates-the-dangers-of-biased-algorithms/">discriminatory targeting</a> that worsens socioeconomic inequalities.</p>
<h2>Digging deeper</h2>
<p>The root cause of all these issues can be traced to the fact that consumers have a very isolated experience online. We call this a state of “<a href="https://www.nature.com/articles/s42256-021-00358-3">epistemic fragmentation</a>”, where the information available to each individual is limited to what is targeted at them, without the opportunity to compare with others in a shared space like the London Underground.</p>
<p>Because of personalised targeting, each of us sees different ads. This makes us more vulnerable. Ads can play on our personal vulnerabilities, or they can withhold opportunities from us that we never knew existed. Because we don’t know what other users are seeing, our ability to look out for other vulnerable people is also limited.</p>
<p>Currently, regulators are adopting a combination of two strategies to address these challenges. First, we see an increasing focus on <a href="https://privacyinternational.org/explainer/2828/how-minimise-targeted-ads-social-media-instagram-which-owned-facebook">educating consumers</a> to give them “control” over how they’re targeted. Second, there’s a push towards monitoring ad campaigns proactively, automating screening mechanisms before ads are published online. Both of these strategies are too limited.</p>
<p>Instead, we should focus on restoring the role of consumers as active participants in the regulation of online advertising. This could be achieved by blunting the precision of targeting categories, by instituting targeting quotas, or by banning targeting altogether. This would ensure that at least a portion of online ads are seen by more diverse consumers, in a shared context where objections to them can be raised and shared.</p>
<p>In the wake of the Cambridge Analytica scandal, efforts were made by <a href="https://www.electoralcommission.org.uk/who-we-are-and-what-we-do/changing-electoral-law/transparent-digital-campaigning">The Electoral Commission</a> to prise open the hidden world of targeted political ads in the run up to the UK’s 2019 election. <a href="https://www.channel4.com/news/factcheck/factcheck-conservatives-admit-ads-in-key-marginal-seats-are-wrong">Some broadcasters</a> asked their audience to send in targeted ads on their social media feeds, in order to share them with a wider audience. <a href="https://www.bbc.co.uk/news/technology-50726500">Campaign groups</a> and <a href="https://blogs.lse.ac.uk/medialse/2019/12/12/online-political-advertising-in-the-uk-2019-general-election-campaign/">academics</a> were able to analyse targeting campaigns in greater detail, exposing where ads could be harmful or untrue.</p>
<p>These strategies could also be used for commercial targeted advertising, which would break the epistemic fragmentation that currently prevents us from collectively responding to harmful adverts. Our research shows it’s not just political targeting that produces harms – commercial targeting requires our attention too.</p>
<p class="fine-print"><em><span>Silvia Milano received funding from Miami Foundation and Luminate Group. </span></em></p><p class="fine-print"><em><span>Brent Mittelstadt has received funding from the British Academy, Miami Foundation, and Luminate Group.</span></em></p><p class="fine-print"><em><span>Sandra Wachter received funding from the British Academy, Miami Foundation, and Luminate Group. </span></em></p>We know targeted political adverts contribute to polarisation, but commerical ones leave us fragmented too.Silvia Milano, Postdoctoral Researcher in AI Ethics, University of OxfordBrent Mittelstadt, Research Fellow in Data Ethics, University of OxfordSandra Wachter, Associate Professor and Senior Research Fellow, Oxford Internet Institute, University of OxfordLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1617132021-06-29T12:06:22Z2021-06-29T12:06:22ZScience denial: Why it happens and 5 things you can do about it<figure><img src="https://images.theconversation.com/files/408741/original/file-20210628-25-dhlbk1.jpg?ixlib=rb-1.1.0&rect=171%2C171%2C5433%2C3829&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Are you open to new ideas and willing to change your mind?</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/entrepreneur-with-arms-crossed-at-modern-workplace-royalty-free-image/1210533708">Klaus Vedfelt/DigitalVision via Getty Images</a></span></figcaption></figure><p>Science denial became deadly in 2020. Many political leaders <a href="https://www.scientificamerican.com/article/the-failure-of-public-health-messaging-about-covid-19/">failed to support what scientists knew to be effective</a> prevention measures. Over the course of the pandemic, people <a href="https://www.washingtonpost.com/health/2020/11/16/south-dakota-nurse-coronavirus-deniers/">died from COVID-19 still believing it did not exist</a>.</p>
<p><a href="https://www.simonandschuster.com/books/Galileo/Mario-Livio/9781501194740">Science denial is not new</a>, of course. But it is more important than ever to understand why some people deny, doubt or resist scientific explanations – and what can be done to overcome these barriers to accepting science.</p>
<p>In our book “<a href="https://global.oup.com/academic/product/science-denial-9780190944681">Science Denial: Why It Happens and What to Do About It</a>,” we offer ways for you to understand and combat the problem. As <a href="https://scholar.google.com/citations?user=LzHZpAEAAAAJ&hl=en&oi=ao">two research</a> <a href="https://scholar.google.com/citations?user=VBvoFacAAAAJ&hl=en&oi=ao">psychologists</a>, we know that everyone is susceptible to forms of it. Most importantly, we know there are solutions.</p>
<p>Here’s our advice on how to confront five psychological challenges that can lead to science denial.</p>
<h2>Challenge #1: Social identity</h2>
<p>People are social beings and tend to align with those who hold <a href="https://doi.org/10.1002/9781119011071.iemp0153">similar beliefs and values</a>. Social media <a href="https://www.nature.com/articles/d43978-021-00019-4">amplify alliances</a>. You’re likely to <a href="https://www.penguinrandomhouse.com/books/309214/the-filter-bubble-by-eli-pariser/">see more of what you already agree with</a> and fewer alternative points of view. People live in information filter bubbles created by <a href="https://www.pewresearch.org/internet/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/">powerful algorithms</a>. When those in your social circle share misinformation, you are more likely to believe it and share it. Misinformation multiplies and science denial grows.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="two seated men in discussion" src="https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/408706/original/file-20210628-21-1xk8f82.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Can you find common ground to connect on?</span>
<span class="attribution"><a class="source" href="https://unsplash.com/photos/W3Jl3jREpDY">LinkedIn Sales Solutions/Unsplash</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>Action #1: Each person has multiple social identities. One of us talked with a climate change denier and discovered he was also a grandparent. He opened up when thinking about his grandchildren’s future, and the conversation turned to economic concerns, the root of his denial. Or maybe someone is vaccine-hesitant because so are mothers in her child’s play group, but she is also a caring person, concerned about immunocompromised children.</p>
<p>We have found it effective to listen to others’ concerns and try to find common ground. Someone you <a href="https://doi.org/10.1007/s11109-015-9312-x">connect with is more persuasive</a> than those with whom you share less in common. When one identity is blocking acceptance of the science, leverage a second identity to make a connection.</p>
<h2>Challenge #2: Mental shortcuts</h2>
<p>Everyone’s busy, and it would be exhausting to be a vigilant deep thinker all the time. You see an article online with a clickbait headline such as “Eat Chocolate and Live Longer” and you share it, because you assume it is true, want it to be, or think it is ridiculous.</p>
<p>Action #2: Instead of sharing that article on how GMOs are unhealthy, learn to slow down and monitor the quick, intuitive responses that psychologist <a href="https://us.macmillan.com/books/9780374533557">Daniel Kahneman calls System 1 thinking</a>. Then engage the rational, analytical mind of System 2 and ask yourself: <a href="https://doi.org/10.1080/00461520.2020.1730181">how do I know this is true</a>? Is it plausible? Why do I think it is true? Then do some fact-checking. Learn to not immediately accept information you already believe, a tendency called <a href="https://doi.org/10.1037/1089-2680.2.2.175">confirmation bias</a>.</p>
<h2>Challenge #3: Beliefs on how and what you know</h2>
<p>Everyone has <a href="https://www.routledge.com/Handbook-of-Epistemic-Cognition/Greene-Sandoval-Braten/p/book/9781138013421">ideas about what they think knowledge is</a>, where it comes from and whom to trust. <a href="https://www.taylorfrancis.com/chapters/edit/10.4324/9781315795225-9/epistemic-cognition-psychological-construct-advancements-challenges-barbara-hofer">Some people think dualistically</a>: There’s always a clear right and wrong. But scientists view <a href="https://doi.org/10.1080/0163853X.2019.1629805">tentativeness as a hallmark</a> of their discipline. Some people may not understand that scientific claims will change as more evidence is gathered, so they may be distrustful of how public health policy shifted around COVID-19.</p>
<p>Journalists who present “both sides” of settled science can unknowingly persuade readers that the science is more uncertain than it actually is, turning <a href="https://doi.org/10.1016/j.gloenvcha.2003.10.001">balance into bias</a>. Only 57% of Americans surveyed accept that climate change is caused by human activity, compared with <a href="https://climate.nasa.gov/faq/17/do-scientists-agree-on-climate-change/">97% of climate scientists</a>, and only <a href="https://climatecommunication.yale.edu/visualizations-data/ycom-us/">55% think that scientists are certain that climate change is happening</a>.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="man with book looking off into distance" src="https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/408752/original/file-20210628-21-19bxio7.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">How did you come to know what you know?</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/man-reading-book-on-the-table-royalty-free-image/980285120">ridvan_celik/E+ via Getty Images</a></span>
</figcaption>
</figure>
<p>Action #3: Recognize that other people (or possibly even you) may be operating with misguided beliefs about science. You can help them adopt what philosopher of science <a href="https://leemcintyrebooks.com">Lee McIntyre</a> calls a <a href="https://mitpress.mit.edu/books/scientific-attitude">scientific attitude</a>, an openness to seeking new evidence and a willingness to change one’s mind. </p>
<p>Recognize that very few individuals rely on a single authority for knowledge and expertise. Vaccine hesitancy, for example, has been successfully <a href="https://www.ama-assn.org/delivering-care/public-health/time-doctors-take-center-stage-covid-19-vaccine-push">countered by doctors</a> who persuasively contradict erroneous beliefs, as well as by friends who explain why they <a href="https://addisonindependent.com/joanna-colwell-i-didnt-vaccinate-my-child-and-then-i-did-0">changed their own minds</a>. <a href="https://www.churchleadership.com/leading-ideas/5-ways-churches-can-play-a-critical-role-in-vaccination-efforts/">Clergy can step forward</a>, for example, and some have offered places of worship as vaccination hubs.</p>
<h2>Challenge #4: Motivated reasoning</h2>
<p>You might not think that how you interpret a simple graph could depend on your political views. But when people were asked to look at the same charts depicting either housing costs or the rise in carbon dioxide in the atmosphere over time, interpretations differed by political affiliation. Conservatives were more likely than progressives to <a href="https://apadiv15.org/wp-content/uploads/2020/08/APA-2020-Hockey-Stick-1.pdf">misinterpret the graph</a> when it depicted a rise in CO2, but not when it displayed housing costs. When people reason not just by examining facts, but with an unconscious bias toward a preferred conclusion, <a href="https://www.discovermagazine.com/the-sciences/what-is-motivated-reasoning-how-does-it-work-dan-kahan-answers">their reasoning will be flawed</a>.</p>
<p>Action #4: Maybe you think that eating food from genetically modified organisms is harmful to your health, but have you really examined the evidence? Look at articles with both pro and con information, evaluate the source of that information, and be open to the evidence leaning one way or the other. If you give yourself the time to think and reason, you can short-circuit your own motivated reasoning and open your mind to new information.</p>
<h2>Challenge #5: Emotions and attitudes</h2>
<p>When Pluto got <a href="https://theconversation.com/nasa-missions-may-re-elevate-pluto-and-ceres-from-dwarf-planets-to-full-on-planet-status-36081">demoted to a dwarf planet</a>, many children and some adults responded with anger and opposition. Emotions and attitudes are linked. Reactions to hearing that humans influence the climate can range from anger (if you do not believe it) to frustration (if you are concerned you may need to change your lifestyle) to anxiety and hopelessness (if you accept it is happening but think it’s too late to fix things). How you feel about climate mitigation or GMO labeling aligns with whether you are for or against these policies.</p>
<p>Action #5: Recognize the role of emotions in decision-making about science. If you react strongly to a story about stem cells used to develop Parkinson’s treatments, ask yourself if you are overly hopeful because you have a relative in early stages of the disease. Or are you rejecting a possibly lifesaving treatment because of your emotions?</p>
<p>Feelings shouldn’t (and can’t) be put in a box separate from how you think about science. Rather, it’s important to understand and recognize that emotions are <a href="https://wwnorton.com/books/9780393709810">fully integrated ways of thinking and learning</a> about science. Ask yourself if your attitude toward a science topic is based on your emotions and, if so, give yourself some time to think and reason as well as feel about the issue. </p>
<p>[<em>You’re smart and curious about the world. So are The Conversation’s authors and editors.</em> <a href="https://theconversation.com/us/newsletters/the-daily-3?utm_source=TCUS&utm_medium=inline-link&utm_campaign=newsletter-text&utm_content=youresmart">You can read us daily by subscribing to our newsletter</a>.]</p>
<p>Everyone can be susceptible to these five psychological challenges that can lead to science denial, doubt and resistance. Being aware of these challenges is the first step toward taking action to meet them.</p>
<p class="fine-print"><em><span>Barbara K. Hofer has received research funding from the National Science Foundation and Vermont EPSCOR. </span></em></p><p class="fine-print"><em><span>Gale Sinatra has received funding from the National Science Foundation (NSF), Social Sciences and Humanities Research Council (SSHRC) of Canada, Mattel Children's Foundation. </span></em></p>Science denial is not new, but researchers have learned a lot about it. Here’s why it exists, how everyone is susceptible to it in one way or another and steps to take to overcome it.Barbara K. Hofer, Professor of Psychology Emerita, MiddleburyGale Sinatra, Professor of Education and Psychology, University of Southern CaliforniaLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1505092020-12-01T13:25:17Z2020-12-01T13:25:17ZYour brain’s built-in biases insulate your beliefs from contradictory facts<figure><img src="https://images.theconversation.com/files/372118/original/file-20201130-21-q7ey1o.jpg?ixlib=rb-1.1.0&rect=126%2C183%2C7539%2C4884&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">These psychological tendencies explain why an onslaught of facts won't necessarily change anyone's mind.</span> <span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/conflict-royalty-free-image/1061219956">Francesco Carta fotografo/Moment via Getty Images</a></span></figcaption></figure><p>A rumor started circulating back in 2008 that Barack Obama was not born in the United States. At the time, I was serving as chair of the Hawaii Board of Health. The director and deputy director of health, both appointed by a Republican governor, <a href="https://www.nbcnews.com/id/wbna42519951">inspected Obama’s birth certificate</a> in the state records and certified that it was real.</p>
<p>I would have thought that this evidence would settle the matter, but it didn’t. Many people thought the birth certificate was a fabricated document. Today, many <a href="https://www.theatlantic.com/ideas/archive/2020/05/birtherism-and-trump/610978/">people still believe</a> that President Obama was not born in the U.S.</p>
<p>I once listened to a “Science Friday” <a href="https://www.npr.org/2011/01/07/132740175/paul-offit-on-the-anti-vaccine-movement">podcast on the anti-vaccination movement</a>. A woman called in who didn’t believe that vaccines were safe, despite <a href="https://www.vaccines.gov/basics/safety">overwhelming scientific evidence that they are</a>. The host asked her how much proof she would need in order to believe that vaccines were safe. Her answer: No amount of scientific evidence could change her mind.</p>
<p><a href="https://scholar.google.com/citations?user=87v4Nk4AAAAJ&hl=en&oi=ao">As a psychologist</a>, I was bothered, but not shocked, by this exchange. There are several well-known mechanisms in human psychology that enable people to continue to hold tight to beliefs even in the face of contradictory information.</p>
<h2>Cognitive shortcuts come with biases</h2>
<p>In its early days, the science of psychology assumed that people would make rational decisions. But over the decades, it’s become clear that many decisions people make – about choices ranging from romantic partners and finances to <a href="https://doi.org/10.1016/j.dr.2008.01.002">risky health behaviors</a> like unsafe sex and <a href="https://doi.org/10.2105/AJPH.2008.155382">health-promoting behaviors</a> – are not made rationally.</p>
<p>Instead, human minds have a tendency toward several <a href="https://www.theatlantic.com/magazine/archive/2018/09/cognitive-bias/565775/">cognitive biases</a>. These are systematic errors in the way you think about the world. Given the complexity of the world around you, your brain cuts a few corners to help you process complex information quickly.</p>
<p>For example, the availability bias refers to the tendency to use information you can quickly recall. This is helpful when you’re ordering ice cream at a place with 50 flavors; you don’t need to think about all of them, just one you recently tried and liked. Unfortunately, these same shortcuts can also lead you to a nonrational decision.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="looking at camera man holds up a finger" src="https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/372119/original/file-20201130-21-1iks1yz.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">In efficiency mode, your mind may discount contradictory information.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/young-businessman-rejecting-your-offer-royalty-free-image/1165905568">DjelicS/E+ via Getty Images</a></span>
</figcaption>
</figure>
<p>One form of cognitive bias is called <a href="https://www.sup.org/books/title/?id=3850">cognitive dissonance</a>. This is the feeling of discomfort you can experience when your beliefs are not in line with your actions or new information. When in this state, people can reduce their dissonance in one of two ways: changing their beliefs to be in line with the new information or interpreting the new information in a way that justifies their original beliefs. In many cases, people choose the latter, whether consciously or not.</p>
<p>For example, maybe you think of yourself as active, not at all a couch potato – but you spend all of Saturday lying on the couch bingeing reality TV. You can either start thinking about yourself in a new way or justify your behavior, maybe by saying you had a really busy week and need to rest up for your workout tomorrow.</p>
<p>The <a href="https://doi.org/10.1002/asi.23274">confirmation bias</a> is another process that helps you justify your beliefs. It involves favoring information that supports your beliefs and downplaying or ignoring information to the contrary. Some researchers have called this “<a href="https://www.hup.harvard.edu/catalog.php?isbn=9780674237827">my side blindness</a>” – people see the flaws in arguments that are contradictory to their own but are unable to see weaknesses in their own side. Picture fans of a football team that went 7-9 for the season, arguing that their team is actually really strong, spotting failings in other teams but not in theirs.</p>
<p>With the decline of mass media over the past few decades and the increase in niche media and social media, it’s become easier to <a href="https://theconversation.com/misinformation-and-biases-infect-social-media-both-intentionally-and-accidentally-97148">surround yourself with messages you already agree with</a> while minimizing your exposure to messages you don’t. These information bubbles reduce cognitive dissonance but also make it harder to change your mind when you are wrong.</p>
<h2>Shoring up beliefs about yourself</h2>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="woman seething behind the wheel of a car" src="https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=416&fit=crop&dpr=1 600w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=416&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=416&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=522&fit=crop&dpr=1 754w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=522&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/372120/original/file-20201201-20-1cyeluq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=522&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">I’m nice, so this confrontation must be their fault.</span>
<span class="attribution"><a class="source" href="https://www.gettyimages.com/detail/photo/road-rage-royalty-free-image/1070981954">Petri Oeschger/Moment via Getty Images</a></span>
</figcaption>
</figure>
<p>It can be especially hard to change certain beliefs that are central to your <a href="https://doi.org/10.1016/S0065-2601(08)60075-1">self-concept</a> – that is, who you think you are. For example, if you believe you’re a kind person and you cut someone off in traffic, instead of thinking that maybe you’re not all that nice, it’s easier to think the other person was driving like a jerk.</p>
<p>This relationship between beliefs and self-concept can be reinforced by affiliations with groups like political parties, cults or other like-minded thinkers. These groups are often belief bubbles where the majority of members believe the same thing and repeat these beliefs to one another, strengthening the idea that their beliefs are right.</p>
<p>Researchers have found that people generally think they are <a href="https://www.vox.com/science-and-health/2019/1/31/18200497/dunning-kruger-effect-explained-trump">more knowledgeable</a> about certain issues than they really are. This has been demonstrated across a variety of studies looking at vaccinations, Russia’s invasion of Ukraine and <a href="https://www.penguinrandomhouse.com/books/533524/the-knowledge-illusion-by-steven-sloman-and-philip-fernbach/">even how toilets work</a>. These ideas then get passed from person to person without being based on fact. For example, <a href="https://www.politico.com/news/2020/11/09/republicans-free-fair-elections-435488">70% of Republicans</a> say they don’t believe the 2020 presidential election was free and fair despite a lack of any evidence of widespread voter fraud.</p>
<p>Belief bubbles and the defenses against cognitive dissonance can be hard to break down. And they can have important downstream effects. For instance, these psychological mechanisms affect the ways people have chosen whether or not to follow public health guidelines around social distancing and wearing masks during the COVID-19 pandemic, sometimes with <a href="https://www.theatlantic.com/ideas/archive/2020/07/role-cognitive-dissonance-pandemic/614074/">deadly consequences</a>.</p>
<p>Changing people’s minds is difficult. Given the confirmation bias, evidence-based arguments counter to what someone already believes are likely to be discounted. The best way to change a mind is to start with yourself. With as open a mind as you can summon, think about why you believe what you do. Do you really understand the issue? Could you think about it in a different way?</p>
<p>As a professor, I like to have my students debate ideas from the side that they personally disagree with. This tactic tends to lead to deeper understanding of the issues and makes them question their beliefs. Give it an honest try yourself. You might be surprised by where you end up.</p>
<p class="fine-print"><em><span>Jay Maddock does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Cognitive shortcuts help you efficiently move through a complicated world. But they come with an unwelcome side effect: Facts aren’t necessarily enough to change your mind.Jay Maddock, Professor of Public Health, Texas A&M UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1465652020-09-22T02:42:01Z2020-09-22T02:42:01ZGoogle News favours mainstream media. Even if it pays for Australian content, will local outlets fall further behind?<p>Google’s role in delivering audiences to news outlets has been under scrutiny of late. The Australian Competition and Consumer Commission’s <a href="https://www.accc.gov.au/focus-areas/digital-platforms/draft-news-media-bargaining-code">initiative</a> to redirect advertising revenue from Google and Facebook to news publishers has led to threats of a <a href="https://theconversation.com/if-facebook-really-pulls-news-from-its-australian-sites-well-have-a-much-less-compelling-product-145380">news boycott</a> by both companies. </p>
<p>Australia’s news media businesses have faced revenue loss and <a href="https://www.theguardian.com/media/2020/jun/09/news-corp-cuts-more-jobs-this-time-at-its-metropolitan-newspapers">job</a> <a href="https://www.abc.net.au/news/2020-06-30/job-losses-coronavirus-australia-covid-19/12401232">cuts</a> for some time now, blaming Google and Facebook for poaching advertising revenue. </p>
<p>But rather than share revenue with the publishers whose content they feature, it seems the tech behemoths would rather remove Australian news content from their platforms altogether. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/in-a-world-first-australia-plans-to-force-facebook-and-google-to-pay-for-news-but-abc-and-sbs-miss-out-143740">In a world first, Australia plans to force Facebook and Google to pay for news (but ABC and SBS miss out)</a>
</strong>
</em>
</p>
<hr>
<p>Into this <a href="https://theconversation.com/googles-open-letter-is-trying-to-scare-australians-the-company-simply-doesnt-want-to-pay-for-news-144573">heated debate</a> arrives a new study of Google News search recommendations in the US. The research, <a href="https://www.nature.com/articles/s41562-020-00954-0">published today in Nature Human Behaviour</a>, examines Google News search results across more than 3,000 US counties – evaluating the balance between local and national news outlets in search results on a wide range of topics. </p>
<p>The findings show Google News generally privileges national news outlets over local ones, especially for topics of national interest. This makes it even more difficult for local outlets to compete with their larger national counterparts – but shifting the balance between the two isn’t easy.</p>
<h2>A handful of winners</h2>
<p>In one sense, the research findings merely show Google News is working as advertised: it points readers interested in major issues to leading national outlets. Larger, better-funded media businesses are likely to have more in-depth coverage than local publishers.</p>
<p>Meanwhile, Google News will feature more local content when users search for issues with a local angle. And while the study didn’t cover Australia, it probably works similarly here, too.</p>
<p>Nevertheless, the research found the three most prominent national US outlets account for about one-sixth of all search results. This echoes <a href="http://www.sciencedirect.com/science/article/pii/S0747563218303650">research published last year</a>, which also documented Google News featuring a very narrow range of leading news outlets. </p>
<p>The authors of that study worried this “highly concentrated” set of results was “empowering a handful of prominent outlets and marginalising others”, rather than offering a comprehensive range of perspectives on the news.</p>
<h2>The ‘filter bubble’ argument</h2>
<p>The two studies mentioned above offer a powerful argument against the persistent (but unsubstantiated) idea that search engines and social media place us in “<a href="http://theconversation.com/the-myth-of-the-echo-chamber-92544">filter bubbles</a>”. </p>
<p>This is the idea that the information we encounter online depends on our personal identities, ideologies and geographical location. If the filter-bubble hypothesis were true, it would indeed threaten to deepen social divides.</p>
<p>But an increasing number of <a href="https://www.blm.de/files/pdf2/bericht-datenspende---wer-sieht-was-auf-google.pdf">timely</a> <a href="https://doi.org/10.1080/21670811.2017.1338145">studies</a> suggest something different: if there is a filter bubble, we’re all in it together. </p>
<p>In other words, when different users search for news on Google, they likely see the same results from the same handful of media outlets – regardless of who and where they are.</p>
<h2>Tweaking the results</h2>
<p>From this perspective, the uniformity and predominantly national focus of Google News results may even be welcome, as it ensures searchers of all backgrounds have access to a shared stock of information. </p>
<p>At the same time, however, Google’s channelling of users towards major national news outlets affects their local competitors’ ability to generate advertising revenue. The rich (in readership) get richer (from advertising), while outlets featured less in search results struggle.</p>
<p>In a market already suffering from substantial pandemic-induced downturns, this undermines smaller outlets’ ability to survive in the long term. “News deserts” (areas without local news outlets) are growing rapidly in the <a href="https://www.usnewsdeserts.com/">US</a> and <a href="https://anmp.piji.com.au/">in Australia</a>. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/local-news-sources-are-closing-across-australia-we-are-tracking-the-devastation-and-some-reasons-for-hope-139756">Local news sources are closing across Australia. We are tracking the devastation (and some reasons for hope)</a>
</strong>
</em>
</p>
<hr>
<p><iframe id="tc-infographic-525" class="tc-infographic" height="400px" src="https://cdn.theconversation.com/infographics/525/a19e65efd484188bedc6b9f9703ad2d61a4e0fbf/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>Policy makers might be tempted to arrest this decline by forcing Google News to provide more links to local rather than national news outlets. But even if Google agreed to this, it would come at a cost. </p>
<p>Major national outlets are prominent because local outlets simply can’t provide the same comprehensive coverage of non-local issues. Instead, they draw on wire services and syndicated content. </p>
<p>Making Google feature more content from local outlets would direct more revenue towards those news organisations, but could also reduce the quality and diversity of news provided to users. They might end up only seeing local adaptations of content from a small number of wire services.</p>
<p>While this approach might save some local news outlets, it would undermine citizens’ understanding of the world around them.</p>
<h2>The lion and the mouse</h2>
<p>The Australian initiative to make Google (and Facebook) pay for the news they show on their sites could be seen as a more sensible alternative. </p>
<p>Revenue generated from the <a href="https://www.accc.gov.au/focus-areas/digital-platforms/draft-news-media-bargaining-code">news media bargaining code</a> could be used to increase the strength and diversity of the domestic news industry, enabling smaller outlets to provide a better range of content for Google News to feature.</p>
<p>But even if Google was willing to share advertising revenue, the devil lies in the detail. If that money was distributed based on current Google News recommendation patterns, major news outlets would receive the lion’s share. Local news organisations would still miss out – along with the ABC and SBS, <a href="https://theconversation.com/in-a-world-first-australia-plans-to-force-facebook-and-google-to-pay-for-news-but-abc-and-sbs-miss-out-143740">which are not included</a> in the ACCC’s proposal. </p>
<p>So it would be good news for News Corp and Nine Entertainment, but not so much for everyone else.</p>
<p>To rebuild Australia’s local news industry, the industry heavyweights would have to give up some of their own hard-fought share of the money. But you don’t need to consult Google to work out how likely that is.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/platform-regulation-in-australia-is-just-the-start-facebook-and-google-are-fighting-a-global-battle-145748">Platform regulation in Australia is just the start. Facebook and Google are fighting a global battle</a>
</strong>
</em>
</p>
<hr>
<p class="fine-print"><em><span>Axel Bruns receives funding from the Australian Research Council through Discovery projects Journalism beyond the Crisis: Emerging Forms, Practices and Uses and Evaluating the Challenge of 'Fake News' and Other Malinformation, and the ARC Centre of Excellence for Automated Decision-Making and Society. He is a member of the expert research panel of the Public Interest Journalism Initiative (PIJI).</span></em></p>Research shows Google News results often prioritise mainstream media over smaller news businesses. It’s a double-edged sword. While local outlets suffer, it’s actually better for readers.Axel Bruns, Professor, Creative Industries, Queensland University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1412662020-06-30T16:00:25Z2020-06-30T16:00:25ZTikTok teens and the Trump campaign: How social media amplifies political activism and threatens election integrity<figure><img src="https://images.theconversation.com/files/344649/original/file-20200629-155303-h4esrb.jpg?ixlib=rb-1.1.0&rect=7%2C7%2C5168%2C3437&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">President Trump's campaign rally in Tulsa, Okla. had thousands of empty seats, thanks at least in part to the actions of teenagers who mobilized on the social media platform TikTok.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/Election-2020-Trump/d0e10e8a467a499fbe9b81bed6414e5e/51/0">AP Photo/Evan Vucci</a></span></figcaption></figure><p>The lower-than-expected attendance at President Trump’s rally in Tulsa on June 20 was attributed, at least in part, to an online army of K-pop fans who used the social network <a href="https://www.tiktok.com/en/">TikTok</a> to organize and reserve tickets for the rally as a means of <a href="https://www.nytimes.com/2020/06/22/arts/music/k-pop-fans-trump-politics.html">pranking the campaign</a>. </p>
<p>Similarly, the historically unprecedented scale of the George Floyd protests can be <a href="https://theconversation.com/how-the-pandemic-changed-social-media-and-george-floyds-death-created-a-collective-conscience-140104">attributed in part</a> to social media. By some estimates <a href="https://www.theatlantic.com/politics/archive/2020/06/todays-protest-movements-are-as-big-as-the-1960s/613207/">25 million Americans participated</a> at protests. </p>
<p>Social media has proven itself as a <a href="https://lens.monash.edu/@politics-society/2020/05/14/1379701/social-media-giving-voice-to-online-activists">tool for political activism</a>, from online boycotts to offline gatherings. It also has implications for how political campaigns operate. Social media can aid campaigns with <a href="https://www.cs.yale.edu/homes/jf/IF-ASONAM19.pdf">voter targeting efforts</a>, but it can also make the electoral process <a href="https://www.kofiannanfoundation.org/app/uploads/2020/01/f035dd8e-kaf_kacedda_report_2019_web.pdf">vulnerable to misinformation and manipulation</a>, including from foreign actors.</p>
<h2>Hijacking hashtags</h2>
<p>Social media has <a href="https://www.jstor.org/stable/24461703">enabled protests and meaningful political action</a> by capturing public attention, and by its decentralized nature, which makes it easier for activists to evade censorship and coordinate actions. K-pop fans’ action through TikTok spanned more than a week and stayed off the radar of mainstream media. </p>
<p>TikTok teens and K-pop fans took over anti-Black Lives Matter hashtags such as #WhiteLivesMatter, drowning out their messages with GIFs and memes. When people on social media platforms look for these hashtags, they’re met with seemingly unending <a href="https://www.cnn.com/2020/06/04/us/kpop-bts-blackpink-fans-black-lives-matter-trnd/index.html">images and fan videos of popular K-pop groups</a> such as Twice and EXO. </p>
<p>This, in turn, leads algorithms on social media platforms to classify <a href="https://www.forbes.com/sites/lisettevoytko/2020/06/22/twitter-categorizes-whitelivesmatter-as-k-pop-trend-as-fans-flood-it-with-gifs-memes/#2796296613d1">such trending hashtags as K-pop</a> trends rather than political trends, thwarting the anti-Black Lives Matter activists who tried to use the hashtags to promote their messages.</p>
<p>K-pop fans likewise <a href="https://www.nbcnews.com/politics/donald-trump/after-trump-rally-falls-flat-tiktok-teens-take-victory-lap-n1231675">responded to a call</a> from the Dallas Police Department, who were trying to collect information about Black Lives Matter protesters from social media, and bombarded them with images and videos of their favorite K-pop stars.</p>
<h2>Influencers and like-minded connections</h2>
<p><a href="https://scholar.google.com/citations?user=JpFHYKcAAAAJ&hl=en">My own research</a> shows that there are <a href="https://pubsonline.informs.org/doi/abs/10.1287/isre.1100.0339">two mechanisms</a> that make social media influential in digital activism. </p>
<p>First, social media gives <a href="https://pubsonline.informs.org/doi/abs/10.1287/isre.1100.0339">an opinion-making role</a> to a few influencers – people who have extensive social media networks. The furor companies such as <a href="https://www.theguardian.com/technology/2017/jan/30/deleteuber-how-social-media-turned-on-uber">Uber</a> and <a href="https://www.marketwatch.com/story/united-continental-backlash-wont-die-down-newunitedairlinesmottos-2017-04-11">United Airlines</a> aroused on social media for misbehaving was initiated <a href="https://theconversation.com/how-social-media-turned-uniteds-pr-flub-into-a-firestorm-76210">by a handful of individuals</a>. </p>
<p>Second, on social media people engage with like-minded people, a <a href="https://pubsonline.informs.org/doi/abs/10.1287/isre.1100.0339">phenomenon called homophily</a>. </p>
<p>Together, these mechanisms provide a wide audience to both influencers and their followers who are enmeshed in densely connected online networks. As my research shows, once a meme, hashtag or video goes viral, <a href="https://www.jmis-web.org/articles/1281">passive sharing can turn into active broadcasting</a> of the trending idea. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/9qBR_IIZw2o?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>For example, when celebrity Jane tweets in support of a viral hashtag such as #BlackOutTuesday, if fan Alyssa retweets this, it is more likely to be retweeted by people like Alyssa. Jane’s influence is magnified by Alyssa’s ability to influence her social connections. The resulting activism spirals into a large-scale online movement that is hard to ignore. </p>
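<p>The cascade described above can be sketched as a toy simulation. This is a minimal illustration, not a model from the research: the follower counts, retweet probability and cascade depth are invented numbers, and "Jane" and the ordinary user stand in for the influencer and follower roles named in the text.</p>

```python
import random


def cascade(followers: int, p_retweet: float, depth: int, rng: random.Random) -> int:
    """Count total shares when each sharer's like-minded followers
    independently retweet with probability p_retweet (homophily is
    crudely folded into a single retweet probability)."""
    if depth == 0:
        return 0
    shares = 0
    for _ in range(followers):
        if rng.random() < p_retweet:
            # Each retweeter exposes their own followers in turn.
            shares += 1 + cascade(followers, p_retweet, depth - 1, rng)
    return shares


rng = random.Random(42)

# An influencer ("Jane") with 1,000 followers vs. an ordinary user
# with 50, using the same per-follower retweet probability.
influencer_reach = cascade(1000, 0.02, 3, rng)
ordinary_reach = cascade(50, 0.02, 3, rng)
```

With 1,000 followers and a 2% retweet rate, each sharer spawns about 20 new sharers, so the influencer's cascade snowballs; with 50 followers the branching factor is about 1 and the cascade fizzles. That gap is the "magnified influence" the paragraph describes.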
<h2>Social media and political campaigning</h2>
<p>Social media’s opinion-making power and preference for like-minded connections also lead to <a href="https://en.wikipedia.org/wiki/Filter_bubble">online filter bubbles</a>, echo chambers that amplify information people are predisposed to agree with and filter out information that contradicts people’s points of view. Recent elections in the U.S. and the Brexit vote in the U.K. might have been <a href="https://www.theguardian.com/technology/2017/may/22/social-media-election-facebook-filter-bubbles">influenced by filter bubbles</a>.</p>
<p>Social media also makes it easier to narrowly target classes of voters. In 2016 Hillary Clinton’s presidential campaign significantly outspent Donald Trump’s campaign, and the effectiveness of the Trump campaign has been attributed to its <a href="https://www.bloomberg.com/politics/graphics/2016-presidential-campaign-fundraising/">ability to target specific groups</a> of Clinton voters with negative ads. </p>
<p>With online advertising in general, and with the ability to <a href="https://www.fastcompany.com/90318247/users-need-to-play-a-role-in-how-we-regulate-tech-giants">micro-target voters</a> via social media based on <a href="https://theconversation.com/facebook-begins-to-shift-from-being-a-free-and-open-platform-into-a-responsible-public-utility-101577">detailed demographic data</a>, social media can both help and hinder political campaigns’ ability to target their voters. </p>
<p>Also, political campaigns need good data to create models of likely voters, which they use to get voters to turn out and persuade likely voters to vote for their candidates. It looks like TikTok users <a href="https://www.marketplace.org/2020/06/22/tiktok-users-kpop-stans-deluge-trump-campaign-bad-data/">produced a deluge of bad data</a> for the Trump campaign. This kind of activity forces campaigns to spend time and money cleaning up their data.</p>
<h2>Social media and election integrity</h2>
<p>The power of social media also poses a challenge for election integrity. An entity linked to the Russian government was reportedly <a href="https://theconversation.com/how-the-russian-government-used-disinformation-and-cyber-warfare-in-2016-election-an-ethical-hacker-explains-99989">responsible for spreading a massive disinformation campaign</a> that likely influenced the 2016 elections. A Senate committee <a href="https://www.intelligence.senate.gov/sites/default/files/documents/Report_Volume2.pdf">concluded that</a> “these operatives used targeted advertisements, intentionally falsified news articles, self-generated content, and social media platform tools” to intentionally manipulate the perceptions of millions of Americans. </p>
<p>Likewise, the Tulsa phenomenon raises a pointed question: if it’s this easy for a group of teens to influence turnout at a campaign rally, how easy would it be for a foreign actor to interfere in the election process? The election process, including how campaigns and observers gather political information, is vulnerable to misinformation and coordinated trolling. </p>
<p>Social media amplifies both the reach and range of actions available to well-organized, engaged and networked political actors, whatever their intentions. With the pandemic significantly <a href="https://www.nytimes.com/interactive/2020/04/07/technology/coronavirus-internet-use.html">increasing society’s dependence on the internet</a>, these concerns are likely to increase. The question is, when combined with algorithmic filters and disinformation, how will these forces shape the politics of protest and democratic action in the years ahead?</p>
<p class="fine-print"><em><span>Anjana Susarla does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If teenagers organizing on social media can hamper a presidential campaign rally, how challenging is it to manipulate elections?Anjana Susarla, Associate Professor of Information Systems, Michigan State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/1203972019-07-31T11:43:41Z2019-07-31T11:43:41ZPolitical polarization is about feelings, not facts<figure><img src="https://images.theconversation.com/files/284836/original/file-20190718-116590-1lz8db3.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Animosity between partisan voters has grown in recent years.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-photo/democrats-republicans-campaign-symbolized-boxing-gloves-530147431?src=7oPTAFdbbm8sDifI7_fRqA-1-0&studio=1">Gutzemberg/Shutterstock.com</a></span></figcaption></figure><p>Politicians and pundits from all quarters often lament democracy’s polarized condition.</p>
<p>Similarly, citizens frustrated with polarized politics also demand <a href="https://www.people-press.org/2019/06/19/public-highly-critical-of-state-of-political-discourse-in-the-u-s/">greater flexibility from the other side</a>. </p>
<p>Decrying polarization has become a way of impugning adversaries. Meanwhile, the political deadlock and resentment that polarization produces goes unaddressed. Ironic, right?</p>
<p>Commentators rarely say what they mean by polarization. But if Americans are to figure out how to combat it, they need to begin from a clear understanding of what polarization is. </p>
<p>My 2019 book, <a href="https://global.oup.com/academic/product/overdoing-democracy-9780190924195?cc=us&lang=en&">“Overdoing Democracy</a>,” argues that polarization isn’t about where you get your news or how politicians are divided – it’s about how a person’s political identity is wrapped up with almost everything they do. </p>
<img src="https://cdn.theconversation.com/static_files/files/658/PP_16.04.26_polarization.gif?1563482968">
<h2>Polarization, three ways</h2>
<p>Start with the obvious: Polarization is the political distance separating partisans. But this intuitive idea is not so simple, as political scientists have at least three ways of measuring political distance. </p>
<p>One compares the platforms of competing parties. Polarization is the extent to which these are opposed. </p>
<p>A second assesses each party’s ideological homogeneity. This definition of polarization concerns how many of the party’s officials are “moderates” or bridge-builders.</p>
<p>A third involves neither platforms nor officials, but instead the emotions of ordinary citizens who affiliate with a political party. It tracks the extent to which citizens dislike affiliates of other parties.</p>
<p><a href="https://stanfordmag.org/contents/polarization-is-not-the-problem">Research suggests</a> that, although the major U.S. parties are severely polarized along the first two dimensions, the American public is no more divided now over policy than it was 30 years ago. In fact, on certain hot-button issues such as <a href="https://www.pewforum.org/fact-sheet/public-opinion-on-abortion/">abortion</a> and <a href="https://www.people-press.org/2019/05/14/majority-of-public-favors-same-sex-marriage-but-divisions-persist/">gay rights</a>, rank-and-file citizens who identify with a political party have moved closer together.</p>
<p>Nonetheless, Americans <a href="https://www.thenation.com/article/america-is-less-polarized-than-you-think/">believe that their policy divisions are especially pronounced</a>. Polarization in the third sense has <a href="https://doi.org/10.1111/ajps.12152">skyrocketed</a> with interparty animosity <a href="https://www.people-press.org/2016/06/22/partisanship-and-political-animosity-in-2016/">more intense now than it has been for the past 25 years</a>.</p>
<p><iframe id="IWNyA" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/IWNyA/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>In other words, though Americans are less divided over the issues, we see ourselves as profoundly at odds. We more intensely dislike those we regard as politically different from ourselves.</p>
<p>This suggests to me that, when citizens detest those with opposing affiliations, political parties are driven to overstate their differences, stress ideological purity and vilify the opposition. </p>
<p>For example, consider the popular slur among Republicans, “RINO” – or Republican In Name Only – which derides GOP members who are seen to be insufficiently devoted to the party line. </p>
<p>A similar dynamic could be seen in discussions of those vying for the Democratic nomination, where hopefuls were often assessed according to the extent of their anti-Trump sentiments. </p>
<p>And, on more than one occasion, <a href="https://www.washingtonpost.com/politics/2019/07/15/trumps-prepared-notes-democrats-he-criticized-are-dangerous-may-hate-america/">President Trump declared</a> that certain Democrats are “dangerous” and may “hate America.” </p>
<h2>Thinking as a group</h2>
<p>Here’s an easy fix to this kind of polarization: Stop hating your political adversaries. But that’s easier said than done.</p>
<p>Why do people despise those who are politically different from themselves?</p>
<p>The answer lies with a widespread cognitive phenomenon called <a href="https://dx.doi.org/10.2139/ssrn.199668">group polarization</a>. When you talk only to those you agree with, or listen only to news that affirms your opinions, you become more radical in your beliefs. </p>
<p>As people radicalize like this, they grow less able to comprehend opposing views, more likely to dismiss objections to their opinions and increasingly prone to regarding dissenters as incompetent and depraved.</p>
<p>Recall the last time you were present in a packed arena watching your favorite team win a home game. As you roared along with your fellow fans, everyone’s enthusiasm for the team spiked. At the same time, animosity for the opposing team and its fans intensified. Your mood was elevated and your identity was affirmed. Cheering with fellow fans makes us feel good about ourselves. </p>
<h2>Echo chambers</h2>
<p>Online environments function as immense polarization machines. They enable individuals to select their information sources and filter out challenging or unfamiliar messages.</p>
<p>Many have suggested that people would become less polarized if they could only break out of their <a href="https://www.theguardian.com/science/blog/2017/dec/04/echo-chambers-are-dangerous-we-must-try-to-break-free-of-our-online-bubbles">“echo chambers”</a> and <a href="https://www.huffpost.com/entry/how-to-break-out-of-an-echo-chamber-your-bubble_b_58e1be20e4b03c2b30f6a7ff">expose themselves to more diverse opinions</a>. </p>
<p><iframe id="28bcN" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/28bcN/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>However, there’s a crucial difference between prevention and cure. Diversifying your media diet could help to prevent group polarization, but it may not reverse the polarization once it has taken effect. </p>
<p>A <a href="https://doi.org/10.1073/pnas.1804840115">2018 social media study</a> exposed both Democrats and Republicans to Twitter messages from people with moderate, but opposing, viewpoints. By the end, participants actually expressed more partisan views than they had when the study began. Once group polarization has taken hold, a person tends to regard the expression of opposing viewpoints as an attack on their identity, which reinforces their negative attitude toward the political opposition.</p>
<p><a href="https://psycnet.apa.org/doi/10.1006/jesp.1996.0024">People radicalize in concert</a> with like-minded others due to the mutual affirmation of a shared identity. This behavior intensifies their shared attitudes, including a negative view of outsiders. This, in turn, generates the <a href="https://doi.org/10.1111/ajps.12152">polarization of party platforms and officials</a>. </p>
<p>From my perspective, there’s no easy fix. The trouble lies with people regarding political affiliations as group identities, and their political parties as warring teams in a winner-take-all death match. </p>
<p class="fine-print"><em><span>Robert B. Talisse does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>
<p><em>A person’s political identity is wrapped up in almost everything they do. Exposure to opinions from the other side actually makes it worse.</em></p>
<p><em>Robert B. Talisse, W. Alton Jones Professor of Philosophy, Vanderbilt University. Licensed as Creative Commons – attribution, no derivatives.</em></p>
<hr>
<h2>Oxford-style debate: Trump, education, identity and the perpetual feedback loop</h2>
<p><em>Published 2018-09-19</em></p>
<figure><img src="https://images.theconversation.com/files/234464/original/file-20180831-195307-1axlpyh.jpg?ixlib=rb-1.1.0&rect=0%2C30%2C2916%2C1907&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Donald Trump during the 2016 presidential campaign.</span> <span class="attribution"><a class="source" href="https://en.wikipedia.org/wiki/Make_America_Great_Again#/media/File:Donald_Trump_(25953705015).jpg">Gage Skidmore/Wikipedia</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure>
<p><em>This article was co-authored with and based on the debate arguments of <a href="http://www.americanschoolgrenoble.com/about/">Carol-Margaret Bitner</a>. It is the fourth in the debate series <a href="https://theconversation.com/fake-news-meets-fact-in-an-oxford-style-debate-revival-96253">“The impact reflected by Trump is here to stay”</a> and argues for the motion. The third article in this series, <a href="https://theconversation.com/oxford-style-debate-morning-after-reflections-on-the-ephemerality-of-trump-100227">“Morning-after reflections on the ephemerality of Trump”</a>, expounds the role of biases, emotions and social norms in arguing against the motion.</em></p>
<hr>
<blockquote>
<p>“Every [society] in a state of equilibrium tends to remain in that state unless an external force is applied to it.” (adapted from I. Newton)</p>
</blockquote>
<p>Today, we are in a world where it is becoming increasingly difficult to talk with one another. This did not start with the Trump administration, nor with the rise of social media usage. This is a direction that we seem to have been heading in for a very long time. As French philosopher and sociologist Edgar Morin reminds us in a <a href="http://unesdoc.unesco.org/images/0013/001331/133120e.pdf">2004 UNESCO interview</a>:</p>
<blockquote>
<p>“Dialogue is only possible between individuals who recognise one another as subjects with the same dignity and the same rights.”</p>
</blockquote>
<p>We can only see another individual as deserving of the same dignity and the same rights as ourselves if we see the other person as autonomous. We must get reacquainted with the rights that come with membership in a society, and with how government and society uphold them. Civic education provides a vibrant answer to at least part of the problem.</p>
<p>Civic education in middle and high school was once a required part of all public school curricula. Why? Because civic education is about fostering and maintaining a democracy. The fact that we no longer require civic education, or that we have reduced it to a multiple-choice test, indicates that we have overlooked <a href="https://www.annualreviews.org/doi/abs/10.1146/annurev.polisci.4.1.217">the reality that developing responsible, engaged and active citizens takes time and education</a>.</p>
<h2>Resentment of the professional class</h2>
<p>Furthermore, over the past 50 years, the perception that all high school graduates should be encouraged to pursue a university degree has created an attitude that <a href="https://www.brookings.edu/research/what-we-know-about-career-and-technical-education-in-high-school/">devalues vocational education</a>. In many public schools, classes such as carpentry, metalwork, electrical circuitry and what used to be called “home economics” have completely vanished. The perception of vocational education as innately inferior to academic education translates to an attitude that devalues the vocational worker with respect to those holding a Bachelor’s degree. We have ended up with large portions of the population feeling undervalued, disrespected and misunderstood, with <a href="https://hbr.org/2016/11/what-so-many-people-dont-get-about-the-u-s-working-class">deep resentment in turn for the professional class</a>. Trump has branded himself as someone who will speak to that experience, making those who have previously felt ignored, undervalued or overlooked by the “establishment” finally feel acknowledged and valued.</p>
<p>In addition, <a href="https://www.brookings.edu/opinions/the-paradox-of-identity-politics/">identity politics means we treat a point of view as a fixed position</a>. Instead of discussing ideas, we place ourselves into “us” vs. “them” camps, where the other is vilified and challenges to our ideas are deemed as illegitimate. Yet if we don’t talk to one another, nor learn how to discuss difficult issues in a respectful, nonviolent manner, then the polarisation and divisiveness that Trump reflects is indeed here to stay.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/235971/original/file-20180912-133889-1dxvfcv.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/235971/original/file-20180912-133889-1dxvfcv.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=511&fit=crop&dpr=1 600w, https://images.theconversation.com/files/235971/original/file-20180912-133889-1dxvfcv.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=511&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/235971/original/file-20180912-133889-1dxvfcv.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=511&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/235971/original/file-20180912-133889-1dxvfcv.JPG?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=642&fit=crop&dpr=1 754w, https://images.theconversation.com/files/235971/original/file-20180912-133889-1dxvfcv.JPG?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=642&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/235971/original/file-20180912-133889-1dxvfcv.JPG?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=642&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Carol-Margaret Bitner at the 2018 Festival de Geopolitique at Grenoble Ecole de Management debating for the motion</span>
<span class="attribution"><span class="source">Author</span></span>
</figcaption>
</figure>
<p>While education is crucial for meaningful dialogue, it is insufficient without the ability to see dialogue through in a manner that allows participants to share, analyse and engage with their own knowledge and opinions and those of others. Debate is a tool for practising “perspective taking” – taking another’s point of view and examining one’s own. Debate is learning to structure one’s ideas through a process. Diane Ravitch, in her bestseller, <a href="https://www.nytimes.com/2010/05/16/books/review/Wolfe-t.html"><em>The Death and Life of the Great American School System</em></a>, documents the decline of debate in American schools. Sadly, this is reflected in our politics and in how we treat each other. We do not make time in our jam-packed, over-scheduled lives to foster engaged citizens, nor do we have the courage to “risk” debate at school or elsewhere in public discourse. And in turn, by spurning the craft of persuasion, we transform outrage and vilification into virtues.</p>
<h2>In the bubble</h2>
<p>More and more, we live in our own little bubbles. Advances in communication technology, such as social media, have surprisingly served to polarise us further, with <a href="https://en.wikipedia.org/wiki/Filter_bubble">algorithms that show us only information we like or find controversial</a>. This means that while countries integrate more and more diversity, their citizens have the potential to become more insular, self-centred and nationalistic. The <a href="https://www.theguardian.com/news/series/cambridge-analytica-files">Cambridge Analytica scandal</a> paints a picture of the Trump presidency as a result of this trend, and through his constant attacks on his “others”, President Trump normalises and feeds back into the polarised societal discourse from which he rose to power. Until dialogue restores a measure of tolerance, the impact reflected by Trump will be, as we are seeing now, a breakdown in “talking” and an increase in bullying, censoring and division among us.</p>
<p>Many of us are living in a country other than our birthplace. Do we try to understand the culture we live in, its norms and nuances? Or do we stand outside it, looking in? How has it affected our view of “the Other”? Only when we are part of a culture, and genuinely engaged with it, are we able to see the differences (our different beliefs, customs, religions and so on) as a starting point in the search for common ground.</p>
<p>The legacy of leaders like Trump will be that we let the differences that have created such leaders define us. If we choose this path, which we seem to be heading down already, we provide more feedback to the cycle and we ensure that their impact is here to stay.</p>
<blockquote>
<p>“Real dialogue is when you recognise the same dignity in the other… dialogue assumes equality.” (E. Morin)</p>
</blockquote>
<hr>
<p><em>The next article in this series, “Symptoms of the present: Ethno-nationalism and systemic crisis”, will argue against the motion “The impact reflected by Trump is here to stay”.</em></p>
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any organization that would benefit from this article, and have declared no affiliations other than their research institution.</span></em></p>
<p><em>The fourth in the Oxford-style debate series, this article argues that “the impact reflected by Trump is here to stay”.</em></p>
<p><em>Prince C. Oguguo, Doctoral Researcher, Management of Technology and Strategy, Grenoble École de Management (GEM). Licensed as Creative Commons – attribution, no derivatives.</em></p>
<hr>
<h2>The predicament of diversity: re-boot for diversity 3.0</h2>
<p><em>Published 2018-06-22</em></p>
<figure><img src="https://images.theconversation.com/files/222862/original/file-20180612-112623-suksqm.jpg?ixlib=rb-1.1.0&rect=0%2C134%2C2048%2C1226&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Pro-tolerance march in Des Moines, Iowa, in 2015.</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/tabor-roeder/16351899436/in/photolist-qUXL1C-ckAFmh-ckAEaj-qqzFyD-qn2LEX-2gmcuc-9X7YQM-qXbids-RFaeux-RQpPPN-7bWhzG-RH6uTK-dhcA7B-SkeSmg-q5tqAG-S93UyB-RgMacW-QDDdV2-4vAyK1-RjKbKS-5i7QV4-e9kfAC-atZrsM-9XaVJ7-RMEbX9-RDBRdY-SkdKwB-QyPkKf-ckAJEd-TgHuFT-ckAYsW-Rh826L-ckAQDw-4W6Zsn-cHu4B-9XaRdU-ckADkq-8d8o6u-RH6uSH-atZfi6-9XaR29-cHu4C-9jaUcb-9X7YsB-ckAMSG-qXbimU-qEFdHA-ckAHBm-9XaQuU-SgxRvN">Phil Roeder/Flickr</a></span></figcaption></figure>
<p>“Diversity” as a concept has a lexical and political value all its own, with widespread appeal. The problem, however, is that no one actually has the same idea of what diversity <em>means</em>. There is some consensus that the concept has, over time, morphed into something it was not originally intended to be. 
Denise Green’s <a href="http://journals.sagepub.com/doi/abs/10.1177/0042085904265109?journalCode=uexa">2004 study</a> looks at the University of Michigan’s response to a 1997 affirmative action case, and argues that legal precedents such as this one shifted the focus away from social and racial justice towards a narrower, simplified idea of diversity. </p>
<p>Walter Benn Michaels’s 2006 book <a href="https://www.amazon.com/dp/1250099331/ref=as_at?creativeASIN=1250099331&linkCode=w61&imprToken=EafxYZKQYtqCBq7uYfTgKg&slotNum=2&tag=thneyo0f-20"><em>The Trouble with Diversity</em></a> views it as a conservative concept that shifts the focus from social and racial <em>inequality</em> to the diversity of <em>identity</em>, sweeping the important issues under the carpet. <a href="https://www.theatlantic.com/business/archive/2015/05/the-weakening-definition-of-diversity/393080/">Millennials have been shown</a> to associate the concept more with diversity of “experiences” and viewpoints than with issues of race and gender. <a href="https://press.princeton.edu/titles/8757.html">Scott Page’s work</a> has demonstrated that among all diversity metrics, one in particular – cognitive diversity – is the real game-changer in the workplace. </p>
<p>This diversity of definitions illustrates the precise problem with diversity: it cannot be “all things to all people” without losing some of its earlier focus. Longer-term struggles for equality and civil rights get diluted in this eclectic mix, and <a href="http://theconversation.com/are-identity-politics-emancipatory-or-regressive-94434">identity politics</a> clouds the path forward. It seems that diversity as a concept is so appealing, and so emblematic of our global era, that it has simply brought “too many cooks into the kitchen”, distracting us from the pressing social issues we face in modern democracies.</p>
<h2>Desperately seeking diversity</h2>
<p>Sociologist Ellen Berrey’s 2015 study, <a href="http://press.uchicago.edu/ucp/books/book/chicago/E/bo19910067.html"><em>The Enigma of Diversity</em></a>, examines how diversity actually plays out in three different sectors of society – a large publicly traded company, a mixed neighbourhood in Chicago, and the University of Michigan. Berrey’s six-year ethnography reveals once more that diversity clearly means <em>different things to different constituencies</em>. Her more worrying conclusions demonstrate that the diversity concept is mobilized by different groups with different interests in a way which has significantly <em>weakened the demand for racial and social justice</em>.</p>
<p>So despite all the positive evidence of its value, and the noble efforts to make it work, diversity remains more an aspirational ideal than a reality in the global workplace today. Calls for <a href="https://www.independent.co.uk/news/business/comment/its-time-for-diversity-20-more-women-from-different-backgrounds-9969623.html">“diversity 2.0”</a> have focused on <em>gender equality</em> and diversity of <em>experience</em>, specifically in Silicon Valley, where a <a href="https://www.pbs.org/newshour/show/how-silicon-valley-is-trying-to-fix-its-diversity-problem">diversity drama</a> has been playing out among tech firms, even inspiring the popular HBO series <a href="https://www.newyorker.com/culture/culture-desk/how-silicon-valley-nails-silicon-valley"><em>Silicon Valley</em></a>. Adding to the drama, a polemical and provocative <a href="https://assets.documentcloud.org/documents/3914586/Googles-Ideological-Echo-Chamber.pdf">anti-diversity manifesto</a> written by Google engineer James Damore was leaked in the summer of 2017.</p>
<p>Finally, with the explosion of digital content and connected online users, we have paradoxically come to lack a diversity of <em>viewpoints</em>. When Google introduced personalised search in 2009, it meant that no two people obtain the same search results, creating an information <a href="https://books.google.fr/books?hl=fr&lr=&id=-FWO0puw3nYC&oi=fnd&pg=PT3&dq=calls+for+information+diversity+filter+bubble&ots=g4IuFvuWNZ&sig=i_uqFeupZbD3Tsp4FI-m0SQ7fHg#v=onepage&q&f=false.">filter bubble</a> in which we cannot capture the ideas of others as easily. Living in this bubble prevents us from gaining access to the same online information as our family, friends and close acquaintances. Imagine, then, what this means when it comes to viewing the same content as those who are very different from us. By generating overwhelmingly one-sided content tailored to our individual preferences, Google funnels us into social “silos” where we are not exposed to a diversity of opinions and ideas.</p>
<p><a href="http://cascade.cs.illinois.edu/publication/p2359-liao.pdf">Studies demonstrate</a> that these silos further entrench our preferences when it comes to information selection. So whereas we aspire to diversity of opinion and information, the very tools we use to connect to others prevent us from realising our full diversity potential.</p>
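<p>The mechanism behind such silos can be sketched in a few lines of code. This is a hypothetical scoring scheme for illustration only – not Google’s actual algorithm – in which a result’s base relevance is boosted by the user’s learned topic affinity, so two users issuing the same query see different orderings:</p>

```python
def personalised_rank(results, user_profile):
    """Toy illustration of personalised ranking: each result's base
    relevance is boosted by how strongly its topic matches the user's
    past behaviour, so identical queries yield different orderings."""
    def score(r):
        affinity = user_profile.get(r["topic"], 0.0)  # learned preference
        return r["relevance"] + 2.0 * affinity        # personal boost
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Climate policy debate", "topic": "politics", "relevance": 0.6},
    {"title": "Sea-level data explained", "topic": "science", "relevance": 0.5},
]

# Two users, same query, different click histories:
alice = {"politics": 0.4}  # clicks mostly political stories
bob = {"science": 0.4}     # clicks mostly science stories
```

<p>Under this toy scheme, Alice’s top result is the political story and Bob’s is the science story: each user’s own history quietly reorders the same pool of information, which is exactly the silo effect described above.</p>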
<h2>Race: a human invention</h2>
<p>If information diversity has been sabotaged by digital media platforms, questions of <em>social and racial diversity</em> need to get put back on the front burner to address issues of fairness and justice. Nowhere does the unfinished business of diversity play out more visibly and dramatically than in the United States, with its long and violent history of race that, for Princeton historian Nell Irvin Painter, is itself “an idea” based neither on science nor fact, constructed by humans for human purposes. Painter’s <a href="https://www.nytimes.com/2010/03/28/books/review/Gordon-t.html"><em>History of White People</em></a> traces a long and tortured heritage of “whiteness” dating from Antiquity up to the present-day America of mass incarceration and the <a href="https://twitter.com/hashtag/blacklivesmatter?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Ehashtag">#blacklivesmatter</a> movement. The implications of race-as-an-invention are startling, because it means that we are actually perpetuating and giving currency to a flawed, “imagined” concept in day-to-day life.</p>
<p>We should question the terms we use – preferring, instead of <em>race</em>, terms like <em>ethnicity</em> or <em>skin colour</em> that have observable scientific grounding. This raises a broader question: do we actually believe that race exists in reality? Do we have to use the word <em>race</em> at all? We may question its existence today, but a good many white European and American male scientists certainly believed it existed in the past. If we turn our attention to the <a href="https://www.tandfonline.com/doi/abs/10.1080/03612759.2006.10526967">history of science and its intersection with “race”</a> from the 18th through the 20th centuries, we see how “race” as a concept paved the way not only for slavery and the <a href="https://yalebooks.yale.edu/book/9780300181364/american-genocide">genocide of Native Americans</a>, but also for Hitler’s <a href="http://motlc.wiesenthal.com/site/pp.asp?c=gvKVLcMVIuG&b=395043">racist ideology</a> against the Jews and other minority groups. Although social Darwinism, <a href="https://www.jstor.org/stable/2967206?seq=1">scientific racism</a> and <a href="http://books.wwnorton.com/books/The-Mismeasure-of-Man/">biological determinism</a> have been thoroughly debunked, we remain heirs to these defective, racial-supremacist ideas, which infiltrate the very ways we talk about diversity.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/222856/original/file-20180612-112637-12uso42.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/222856/original/file-20180612-112637-12uso42.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=390&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222856/original/file-20180612-112637-12uso42.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=390&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222856/original/file-20180612-112637-12uso42.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=390&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222856/original/file-20180612-112637-12uso42.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=490&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222856/original/file-20180612-112637-12uso42.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=490&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222856/original/file-20180612-112637-12uso42.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=490&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Solidarity march in London, February 4, 2017.</span>
<span class="attribution"><span class="source">Alisdare Hickson/Flickr</span>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<h2>Racial justice: the diversity elephant in the room</h2>
<p>So while most people think that diversity is a good idea, it remains, like race, more an <em>idea</em> than a fact. And it appears to have effectively shifted our attention away from the festering and indisputable problem of racial inequality. European countries continue to struggle with racial discrimination due to postcolonial legacies, as well as the influx of desperate asylum-seekers and migrant workers. France began to respond to its racial inequity problems <a href="https://www.sss.ias.edu/files/pdfs/Fassin/Racialization.pdf">in the 1990s</a>, and Germany has seen a shift from a <a href="https://www.tandfonline.com/doi/abs/10.1080/01419870.1995.9993862">class-based to an ethnicity-based</a> welfare state. Scholars in the UK have focused on <a href="https://www.tandfonline.com/doi/full/10.1080/01419870.2018.1409902?src=recsys">popular culture and black youth</a> and on the <a href="https://www.tandfonline.com/doi/full/10.1080/01419870.2018.1409902?src=recsys">problems of blackness</a> in the academic environment. Again, the United States stands out among advanced economies in terms of its racial inequities. </p>
<p>These disparities are manifested everywhere in the US – in the <a href="http://lj.uwpress.org/content/26/1/10.short">urban space</a>, in the way <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4678876/">people think about their health</a>, and simply in the ways racism is shown to be deeply <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1446334/pdf/10936998.pdf">embedded in institutions</a>. Drawing attention to this elephant in the room, Bryan Stevenson’s <a href="https://eji.org/racial-justice">Equal Justice Initiative</a> has made strides to clarify those areas of society where the inequities reside and to educate citizens about their shared history of racial injustice.</p>
<p>Yet this is nothing new. A <a href="https://www.brookings.edu/articles/american-racial-and-ethnic-politics-in-the-21st-century-a-cautious-look-ahead/">report</a> put out by the Brookings Institution in 1998, years before Obama’s election, envisioned a tenuous future for black-white relations, noting that even affluent and successful African-Americans expressed a particular rage at their consistently unequal treatment. Today, thanks to <a href="http://www.pewinternet.org/2016/08/15/social-media-conversations-about-race/">social media conversations about race</a> and their viral nature, we seem to be experiencing a crescendo effect in the number of reported incidents involving minorities. These conversations and video evidence continue to pile up, spanning everyday discriminatory grievances, <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5761867/">fatal shootings by police</a> and retaliatory actions taken in the aftermath. <a href="https://www.washingtonpost.com/...us-race.../66548936-4aa8-11e6-90a8-fb84201e06...">Polls</a> show that Americans are more cynical than ever about race relations.</p>
<p>Diversity and race keep getting confused, amalgamated or co-opted for different political gains and purposes. Harvard President Drew Faust published a letter on June 12, 2018 in support of <a href="https://www.harvard.edu/president/news/2018/defending-diversity">“Defending Diversity”</a> as the school prepares to defend the integrity of its diversity policy and admissions process in an upcoming <a href="https://projects.iq.harvard.edu/diverse-education">legal battle</a>. Student body diversity is for Faust the inclusion of “people of <em>different backgrounds, experiences, and perspectives</em>”, an ideal that just about anyone would find laudable and worthwhile. And yet this concept of diversity is, as we have seen, just broad enough for anti-diversity special interest groups like <a href="https://studentsforfairadmissions.org/">Students for Fair Admissions</a> to cherry-pick admissions data and instrumentalize <em>race</em> once more as the culprit behind what they allege are unfair, biased and unconstitutional admissions processes.</p>
<h2>Diversity 3.0</h2>
<p>And yet there are signs of hope for the future of diversity. Generation Z is said to be <a href="http://www.businessinsider.fr/us/generation-z-profile-2017-9">even more inclusive and tolerant</a> than its predecessor, the Millennial generation. There has been a shift from studying race relations and racism towards understanding racialization, a process of <a href="https://www.tandfonline.com/doi/abs/10.1080/01419870120049806">“ascribing physical and cultural differences to individuals and groups”</a> which demonstrates a deeper and broader understanding of society’s unfinished business. Studies show that <a href="https://www.researchgate.net/publication/304747383_Racism_Racial_Resilience_and_African_American_Youth_Development_Person-Centered_Analysis_as_a_Tool_to_Promote_Equity_and_Justice">minority youth can be extraordinarily resilient</a> in the face of racism, and that novel forms of therapy can help them cope. </p>
<p>Movements such as <a href="https://www.facebook.com/NeverAgainMSD/">#neveragainMSD</a> show us that young people can rise to their political calling, organize a grassroots movement, and inspire an entire nation to pressure government and special-interest groups. Activist groups like <a href="http://www.showingupforracialjustice.org/">Showing Up for Racial Justice</a> demonstrate that majority groups can take a stand to speak out forcefully against racism and challenge the permission structures that make it possible. There remains so much more to be done. </p>
<p>Next on the agenda, in my upcoming article, I will explore how educators can play a vital role in raising awareness and in moving the conversation about diversity in more productive directions through practice scenarios in the classroom.</p>
<p class="fine-print"><em><span>Michelle Mielly does not work for, consult, own shares in or receive funding from any organization that would benefit from this article, and has declared no affiliations other than her research institution.</span></em></p>
<p><em>Diversity is an enormously appealing and powerful concept, yet it can also distract us from the focus we need to face today’s pressing social issues. So what’s the way forward?</em></p>
<p><em>Michelle Mielly, Associate Professor in People, Organizations, Society, Grenoble École de Management (GEM). Licensed as Creative Commons – attribution, no derivatives.</em></p>
<hr>
<h2>Misinformation and biases infect social media, both intentionally and accidentally</h2>
<p><em>Published 2018-06-20</em></p>
<figure><img src="https://images.theconversation.com/files/223361/original/file-20180615-85822-5fqwo4.png?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">People who share potential misinformation on Twitter (in purple) rarely get to see corrections or fact-checking (in orange).</span> <span class="attribution"><a class="source" href="https://arxiv.org/abs/1801.06122">Shao et al.</a>, <a class="license" href="http://creativecommons.org/licenses/by-nd/4.0/">CC BY-ND</a></span></figcaption></figure>
<p>Social media are among the <a href="http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/">primary sources of news in the U.S.</a> and across the world. 
Yet users are exposed to content of questionable accuracy, including <a href="https://conspiracypsychology.com/2018/02/22/every-mass-shooting-produces-the-same-conspiracy-theories-more-or-less/">conspiracy theories</a>, <a href="https://www.polygon.com/2018/4/13/17231470/fortnite-strip-clickbait-touchdalight-ricegum-youtube">clickbait</a>, <a href="https://www.pbs.org/newshour/show/online-anger-is-gold-to-this-junk-news-pioneer">hyperpartisan content</a>, <a href="https://www.newyorker.com/science/elements/looking-for-life-on-a-flat-earth">pseudoscience</a> and even <a href="https://www.smithsonianmag.com/history/age-old-problem-fake-news-180968945/">fabricated “fake news” reports</a>.</p>
<p>It’s not surprising that there’s so much disinformation published: Spam and online fraud <a href="https://www.symantec.com/connect/blogs/dridex-financial-trojan-aggressively-spread-millions-spam-emails-each-day">are lucrative for criminals</a>, and government and political propaganda yield <a href="https://www.ned.org/issue-brief-distinguishing-disinformation-from-propaganda-misinformation-and-fake-news/">both partisan and financial benefits</a>. But the fact that <a href="http://doi.org/10.1126/science.aap9559">low-credibility content spreads so quickly and easily</a> suggests that people and the algorithms behind social media platforms are vulnerable to manipulation.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/BIv9054dBBI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Explaining the tools developed at the Observatory on Social Media.</span></figcaption>
</figure>
<p>Our research has identified three types of bias that make the social media ecosystem vulnerable to both intentional and accidental misinformation. That is why our <a href="http://osome.iuni.iu.edu/">Observatory on Social Media</a> at Indiana University is building <a href="http://osome.iuni.iu.edu/tools/">tools</a> to help people become aware of these biases and protect themselves from outside influences designed to exploit them. </p>
<h2>Bias in the brain</h2>
<p>Cognitive biases originate in the way the brain processes the information that every person encounters every day. The brain can deal with only a finite amount of information, and too many incoming stimuli can cause <a href="https://hbr.org/2009/09/death-by-information-overload">information overload</a>. That in itself has serious implications for the quality of information on social media. We have found that steep competition for users’ limited attention means that <a href="https://doi.org/10.1038/srep00335">some ideas go viral despite their low quality</a> – <a href="https://arxiv.org/abs/1701.02694v4">even when people prefer to share high-quality content</a>.</p>
<p>To avoid getting overwhelmed, the brain uses a <a href="https://global.oup.com/academic/product/simple-heuristics-that-make-us-smart-9780195143812">number of tricks</a>. These methods are usually effective, but may also <a href="https://www.psychologytoday.com/us/blog/fulfillment-any-age/201210/avoiding-emotional-traps-is-easier-you-think">become biases</a> when applied in the wrong contexts. </p>
<p>One cognitive shortcut happens when a person is deciding whether to share a story that appears on their social media feed. People are <a href="https://doi.org/10.1007/978-3-642-22309-9_5">strongly affected by the emotional connotations of a headline</a>, even though emotion is not a good indicator of an article’s accuracy. Much more important is <a href="https://digitalliteracy.cornell.edu/tutorial/dpl3221.html">who wrote the piece</a>.</p>
<p>To counter this bias, and help people pay more attention to the source of a claim before sharing it, we developed <a href="http://fakey.iuni.iu.edu">Fakey</a>, a mobile news literacy game (free on <a href="https://play.google.com/store/apps/details?id=com.cnets.fakey">Android</a> and <a href="https://itunes.apple.com/us/app/id1386410642?mt=8">iOS</a>) simulating a typical social media news feed, with a mix of news articles from mainstream and low-credibility sources. Players get more points for sharing news from reliable sources and flagging suspicious content for fact-checking. In the process, they learn to recognize signals of source credibility, such as hyperpartisan claims and emotionally charged headlines. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=429&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=429&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=429&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=540&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=540&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222513/original/file-20180610-191951-l5i1yd.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=540&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Screenshots of the Fakey game.</span>
<span class="attribution"><span class="source">Mihai Avram and Filippo Menczer</span></span>
</figcaption>
</figure>
<h2>Bias in society</h2>
<p>Another source of bias comes from society. When people connect directly with their peers, the social biases that guide their selection of friends come to influence the information they see.</p>
<p>In fact, in our research we have found that it is possible to <a href="http://doi.org/10.1109/PASSAT/SocialCom.2011.34">determine the political leanings of a Twitter user</a> simply by looking at the partisan preferences of their friends. Our analysis of the structure of these <a href="http://www.aaai.org/ocs/index.php/ICWSM/ICWSM11/paper/view/2847">partisan communication networks</a> found that social networks are particularly efficient at disseminating information – accurate or not – when <a href="http://doi.org/10.1140/epjds6">they are closely tied together and disconnected from other parts of society</a>.</p>
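As a toy illustration of that network signal – our own sketch, far simpler than the network analysis in the linked study, with labels and data invented for the example – even a simple majority vote over a user’s friends makes a crude prediction:

```python
# Toy sketch: predict a user's leaning from the most common label among
# their friends. Labels and data are invented for illustration; the linked
# study analyzes real network structure, not simple majority votes.
from collections import Counter

def infer_leaning(friend_labels):
    """Return the most common partisan label among a user's friends,
    or None if no friends are labelled."""
    if not friend_labels:
        return None
    return Counter(friend_labels).most_common(1)[0][0]

# A user whose friends are mostly labelled "left" is predicted "left".
friends = ["left", "left", "right", "left"]
print(infer_leaning(friends))  # -> left
```

The point is not the rule itself but how strong the signal is: when friendship networks are highly partisan, even this crude vote is informative.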
<p>People’s tendency to evaluate information more favorably when it comes from within their own social circles creates “<a href="https://arstechnica.com/science/2017/03/the-social-media-echo-chamber-is-real/">echo chambers</a>” that are ripe for manipulation, whether conscious or unintentional. This helps explain why so many online conversations devolve into <a href="http://www.pewinternet.org/2016/10/25/the-tone-of-social-media-discussions-around-politics/">“us versus them” confrontations</a>. </p>
<p>To study how the structure of online social networks makes users vulnerable to disinformation, we built <a href="http://hoaxy.iuni.iu.edu">Hoaxy</a>, a system that tracks and visualizes the spread of content from low-credibility sources, and how it competes with fact-checking content. Our analysis of the data collected by Hoaxy during the 2016 U.S. presidential election shows that Twitter accounts that shared misinformation were <a href="http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0196087">almost completely cut off</a> from the corrections made by the fact-checkers.</p>
<p>When we drilled down on the misinformation-spreading accounts, we found a very dense core group of accounts retweeting each other almost exclusively – including several bots. The only times that fact-checking organizations were ever quoted or mentioned by the users in the misinformed group were when questioning their legitimacy or claiming the opposite of what they wrote.</p>
<figure class="align-right zoomable">
<a href="https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=748&fit=crop&dpr=1 600w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=748&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=748&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=940&fit=crop&dpr=1 754w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=940&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/223109/original/file-20180613-32327-126thdk.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=940&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A screenshot of a Hoaxy search shows how common bots – in red and dark pink – are spreading a false story on Twitter.</span>
<span class="attribution"><span class="source">Hoaxy</span></span>
</figcaption>
</figure>
<h2>Bias in the machine</h2>
<p>The third group of biases arises directly from the algorithms used to determine what people see online. Both social media platforms and search engines employ them. These personalization technologies are designed to select only the most engaging and relevant content for each individual user. But in doing so, they may end up reinforcing users’ cognitive and social biases, making them even more vulnerable to manipulation.</p>
<p>For instance, the detailed <a href="https://theconversation.com/solving-the-political-ad-problem-with-transparency-85366">advertising tools built into many social media platforms</a> let disinformation campaigners exploit <a href="https://www.psychologytoday.com/us/blog/science-choice/201504/what-is-confirmation-bias">confirmation bias</a> by <a href="https://www.washingtonpost.com/news/powerpost/paloma/the-cybersecurity-202/2018/05/11/the-cybersecurity-202-the-facebook-ad-dump-shows-the-true-sophistication-of-russia-s-influence-operation/5af4733a30fb04258879944e/">tailoring messages</a> to people who are already inclined to believe them. </p>
<p>Also, if a user often clicks on Facebook links from a particular news source, Facebook will <a href="https://www.wired.com/story/take-back-your-facebook-news-feed/">tend to show that person more of that site’s content</a>. This so-called “<a href="https://www.brainpickings.org/2011/05/12/the-filter-bubble/">filter bubble</a>” effect may isolate people from diverse perspectives, strengthening confirmation bias.</p>
<p>Our own research shows that social media platforms expose users to a less diverse set of sources than do non-social media sites like Wikipedia. Because this is at the level of a whole platform, not of a single user, we call this the <a href="https://doi.org/10.7717/peerj-cs.38">homogeneity bias</a>.</p>
<p>Another important ingredient of social media is information that is trending on the platform, according to what is getting the most clicks. We call this <a href="https://arxiv.org/abs/1707.00574">popularity bias</a>, because we have found that an algorithm designed to promote popular content may negatively affect the overall quality of information on the platform. This also feeds into existing cognitive bias, reinforcing what appears to be popular irrespective of its quality.</p>
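A toy simulation – our own construction, not the model in the linked paper, with all parameters invented – shows the mechanism: when each new share goes mostly to items that are already popular, early random hits snowball, and the most-shared items need not be the best ones.

```python
# Toy "popularity bias" model. Each item has a fixed quality in [0, 1];
# each new share goes to an item chosen mostly by its current share count
# (rich-get-richer) and only partly by its quality. All parameters here
# are illustrative, not drawn from the cited research.
import random

def mean_top_quality(n_items=50, n_shares=5000, popularity_weight=0.9, seed=1):
    """Run the model and return the mean quality of the ten most-shared items."""
    rng = random.Random(seed)
    quality = [rng.random() for _ in range(n_items)]
    shares = [1] * n_items  # every item starts with one share
    for _ in range(n_shares):
        # Selection weight mixes current popularity with intrinsic quality.
        weights = [popularity_weight * s + (1 - popularity_weight) * q
                   for s, q in zip(shares, quality)]
        shares[rng.choices(range(n_items), weights=weights)[0]] += 1
    top = sorted(range(n_items), key=lambda i: -shares[i])[:10]
    return sum(quality[i] for i in top) / len(top)
```

With `popularity_weight` near 1 the ranking is driven almost entirely by the rich-get-richer loop; lowering it ties visibility back to quality.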
<p>All these algorithmic biases can be manipulated by <a href="https://cacm.acm.org/magazines/2016/7/204021-the-rise-of-social-bots/fulltext">social bots</a>, computer programs that interact with humans through social media accounts. Most social bots, like Twitter’s <a href="https://twitter.com/big_ben_clock">Big Ben</a>, are harmless. However, some conceal their real nature and are used for malicious purposes, such as <a href="https://newsroom.fb.com/InfoOps">boosting disinformation</a> or falsely <a href="http://www.businessinsider.com/astroturfing-grassroots-movements-2011-9">creating the appearance of a grassroots movement</a>, also called “astroturfing.” We found <a href="http://www.aaai.org/ocs/index.php/ICWSM/ICWSM11/paper/view/2850">evidence of this type of manipulation</a> in the run-up to the 2010 U.S. midterm election.</p>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=432&fit=crop&dpr=1 600w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=432&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=432&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=542&fit=crop&dpr=1 754w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=542&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/222527/original/file-20180611-191940-17sdjut.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=542&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">A screenshot of the Botometer website, showing one human and one bot account.</span>
<span class="attribution"><span class="source">Botometer</span></span>
</figcaption>
</figure>
<p>To study these manipulation strategies, we developed a tool to detect social bots called <a href="http://botometer.org">Botometer</a>. Botometer uses machine learning to detect bot accounts by inspecting thousands of different features of a Twitter account, such as the times of its posts, how often it tweets, and the accounts it follows and retweets. It is not perfect, but it has revealed that as many as <a href="https://aaai.org/ocs/index.php/ICWSM/ICWSM17/paper/view/15587">15 percent of Twitter accounts show signs of being bots</a>.</p>
<p>Using Botometer in conjunction with Hoaxy, we analyzed the core of the misinformation network during the 2016 U.S. presidential campaign. We found many bots exploiting their victims’ cognitive, confirmation and popularity biases, as well as Twitter’s algorithmic biases.</p>
<p>These bots are able to construct filter bubbles around vulnerable users, feeding them false claims and misinformation. First, they can attract the attention of human users who support a particular candidate by tweeting that candidate’s hashtags or by mentioning and retweeting the person. Then the bots can amplify false claims smearing opponents by retweeting articles from low-credibility sources that match certain keywords. Because those stories are then being shared widely, this activity also prompts the platform’s algorithms to highlight them for other users.</p>
<h2>Understanding complex vulnerabilities</h2>
<p>Even as our research, and others’, shows how individuals, institutions and even entire societies can be manipulated on social media, there are <a href="http://doi.org/10.1126/science.aao2998">many questions</a> left to answer. It’s especially important to discover how these different biases interact with each other, potentially creating more complex vulnerabilities.</p>
<p>Tools like ours offer internet users more information about disinformation, and therefore some degree of protection from its harms. The solutions will <a href="https://www.hewlett.org/newsroom/hewlett-knight-koch-foundations-with-other-funders-will-support-independent-research-on-facebooks-role-in-elections-and-democracy/">likely not be only technological</a>, though they will probably have technical aspects. They must also take into account <a href="https://doi.org/10.1016/j.jarmac.2017.07.008">the cognitive and social aspects</a> of the problem.</p>
<p><em>Editor’s note: This article was updated on Jan. 10, 2019, to replace a link to a study that had been retracted. The text of the article is still accurate, and remains unchanged.</em></p><img src="https://counter.theconversation.com/content/97148/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Giovanni Luca Ciampaglia has received funding from the Office of the Vice Provost for Research at Indiana University, the Democracy Fund, and the Swiss National Science Foundation. Currently, he is supported by the Indiana University Network Science Institute.</span></em></p><p class="fine-print"><em><span>Filippo Menczer has received funding from the National Science Foundation, DARPA, US Navy, Yahoo Research, the J.S. McDonnell Foundation, and Democracy Fund. </span></em></p>Information on social media can be misleading because of biases in three places – the brain, society and algorithms. Scholars are developing ways to identify and display the effects of these biases.Giovanni Luca Ciampaglia, Assistant professor, department of Computer Science and Engineering, University of South FloridaFilippo Menczer, Professor of Computer Science and Informatics; Director of the Center for Complex Networks and Systems Research, Indiana UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/971402018-06-01T13:26:49Z2018-06-01T13:26:49ZYour personal space is no longer physical – it’s a global network of data<figure><img src="https://images.theconversation.com/files/221190/original/file-20180531-69487-x2vgjx.png?ixlib=rb-1.1.0&rect=101%2C8%2C1686%2C1008&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Silvio Carta</span>, <span class="license">Author provided</span></span></figcaption></figure><p>In the digital world, any action we do generates data – whether browsing the internet, answering emails or messaging our friends. Translated into radio waves, this information can travel almost effortlessly through space in a split second. Data <a href="https://books.google.co.uk/books?hl=en&lr=&id=noMNgMcZvL0C&oi=fnd&pg=PA9&dq=Greenfield+2010&ots=skGO95vf2h&sig=GgATu-kl_yWiMIXs06t321C6jMM#v=onepage&q=Greenfield%202010&f=false">are all around us</a>, invisibly occupying the space between ourselves and other objects in the built environment. My colleagues and I conducted <a href="http://www.ingentaconnect.com/contentone/intellect/jucs/2018/00000005/00000001/art00007">a study</a> to understand how the presence of all this data alters our understanding of personal and public spaces.</p>
<p>As a case study, we set up an open Wireless Local Area Network (WLAN) in Plaza de Los Palos Grandes in Caracas, for people to connect free of charge for a limited period of time. A total of 123 people connected to our WLAN with their devices, sending and receiving packets of information to and from servers across the world.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/221194/original/file-20180531-69508-tpd43i.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221194/original/file-20180531-69508-tpd43i.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=303&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221194/original/file-20180531-69508-tpd43i.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=303&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221194/original/file-20180531-69508-tpd43i.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=303&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221194/original/file-20180531-69508-tpd43i.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=381&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221194/original/file-20180531-69508-tpd43i.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=381&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221194/original/file-20180531-69508-tpd43i.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=381&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Our participants sent and received information right across the world.</span>
<span class="attribution"><span class="source">Silvio Carta</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>From the packets, we extracted the location of the servers to which each user connected. In the image below, we generated one line for each connection established between the person and the servers. It demonstrates how the data generated by an email to a close friend, composed in the intimate space between you and your device, has the potential to reach across the world. </p>
<p>Here’s how it works: your smartphone converts the email into radio waves and sends the information to a WiFi router. It’s then passed to your email provider’s servers and from there – via the internet’s Transmission Control Protocol (TCP), which controls the movement of data across the web – to your friend’s provider’s servers and on to their inbox. </p>
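The aggregation step described above can be sketched as follows – a simplified stand-in for the study’s packet analysis, with hand-made packet records instead of a real capture (real traffic would come from a tool such as tcpdump or Wireshark):

```python
# Sketch: group captured packets by local device and collect the distinct
# remote servers each one contacted. The packet records here are invented;
# a real capture would also carry ports, payloads and timestamps.
from collections import defaultdict

def servers_per_user(packets):
    """Map each local source address to the set of remote addresses it reached."""
    contacted = defaultdict(set)
    for pkt in packets:
        contacted[pkt["src"]].add(pkt["dst"])
    return contacted

# Hypothetical capture: two local devices, three connections.
packets = [
    {"src": "10.0.0.12", "dst": "172.217.3.5"},
    {"src": "10.0.0.12", "dst": "151.101.1.69"},
    {"src": "10.0.0.31", "dst": "172.217.3.5"},
]
```

Each remote address can then be geolocated (for example with a GeoIP database) to draw one line per connection from the user to the server, as in the visualisations shown here.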
<h2>Changing space</h2>
<p>People tend to think that personal communication originates within the intimate dimension of our personal space. We consider the space immediately around us to be ours and personal – it’s where we think, formulate ideas and speak with others. This study gave us the opportunity to consider how the shape of our personal space is changing as we live our digital lives in public spaces. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/221192/original/file-20180531-69490-qznu83.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221192/original/file-20180531-69490-qznu83.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=302&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221192/original/file-20180531-69490-qznu83.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=302&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221192/original/file-20180531-69490-qznu83.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=302&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221192/original/file-20180531-69490-qznu83.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=380&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221192/original/file-20180531-69490-qznu83.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=380&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221192/original/file-20180531-69490-qznu83.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=380&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The complexity of the data traffic emitted by each of us in the public space.</span>
<span class="attribution"><span class="source">Silvio Carta</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>The term “personal space” has meant different things since architects, urbanists, sociologists and geographers started studying it. In the 1960s, personal space <a href="https://books.google.co.uk/books/about/The_Hidden_Dimension.html?id=p3g0ngEACAAJ&redir_esc=y">was thought of</a> as the distances we maintain from others, to control our interactions with them. The size of this invisible aura could vary, depending on cultural values, the density of people around you <a href="https://books.google.co.uk/books?id=VMrSPQAACAAJ&dq=Sommer+(1969&hl=en&sa=X&ved=0ahUKEwj_v8SW-q_bAhVUOMAKHXhNATwQ6AEIKTAA">and other circumstances</a>. </p>
<p>Scholars have tried to argue that in the digital era, our “<a href="https://books.google.co.uk/books?hl=en&lr=&id=EzRLRKwxCg4C&oi=fnd&pg=PA69&dq=Beslay+and+Hakala+(2007)&ots=eyq9Cy5FZf&sig=bz_2sytGibu26YyTDCAix-pHTTM#v=onepage&q&f=false">personal bubble</a>” is not just physical – it’s also virtual. They claim that your personal bubble is a membrane which filters the data that you send out and the information you receive back. It’s the sum of all the settings and agreements across different digital platforms – including apps, social media and email – as well as the phone itself, which help you to manage your personal, group and public data and communications. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/221191/original/file-20180531-69497-c3rh81.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/221191/original/file-20180531-69497-c3rh81.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=285&fit=crop&dpr=1 600w, https://images.theconversation.com/files/221191/original/file-20180531-69497-c3rh81.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=285&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/221191/original/file-20180531-69497-c3rh81.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=285&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/221191/original/file-20180531-69497-c3rh81.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=358&fit=crop&dpr=1 754w, https://images.theconversation.com/files/221191/original/file-20180531-69497-c3rh81.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=358&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/221191/original/file-20180531-69497-c3rh81.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=358&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The personal space as global net originating from each of us.</span>
<span class="attribution"><span class="source">Silvio Carta</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>But our results show that in the digital realm, personal space isn’t like a bubble which surrounds each person, helping to define the nature of their encounters, relationships, intimacies or invasions. In fact, it’s more like a global network of connections, reaching everywhere, coming from each person whenever they send or receive a packet of data. </p>
<p>Our images show how personal space disperses through the atmosphere and materialises in someone else’s device in a matter of seconds, leaving traces in a dispersed constellation of servers. Because of this, personal space has become dynamic – it changes in real time with our digital interactions.</p>
<p>Given how sensitive we are to invasions of our physical personal space, it’s remarkable that many of us don’t even realise the extent of our digital personal space, which is scattered across servers and devices around the world. Visualising its massive size and dispersed form may make people more protective of their data, taking a greater interest in the level of encryption, privacy and permissions granted to each app they use. </p>
<p>Personal space is no longer the immediate space that surrounds us and moves with us. It is something more abstract – globally distributed and possibly everywhere at any time. The next time we look at our phone to send a text message, we should envision the real extent of our space, which reaches the other side of the globe and back in seconds. Our personal space is not a bubble anymore – it is a global network.</p><img src="https://counter.theconversation.com/content/97140/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>I am very grateful to Jesus Zambrano and Bernardo Morales for their help in making sense of the visualisation logics underpinning this study, and the programming tasks necessary to collect and use the data respectively. The research underpinning this article has been partially funded by the Early Career and returning to Research Staff Research Grants Competition for 2016/17 at the University of Hertfordshire, UK. </span></em></p>How data is changing the shape of our personal ‘bubble’ – in pictures.Silvio Carta, Senior Lecturer and Chair of the Design Research Group, University of HertfordshireLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/955122018-05-14T04:50:08Z2018-05-14T04:50:08ZHow information warfare in cyberspace threatens our freedom<figure><img src="https://images.theconversation.com/files/218236/original/file-20180509-34024-rhe9bv.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Information warfare in cyberspace could replace reason and reality with rage and fantasy.</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p><em>This article is the fourth in a five-part series exploring Australian national security in the digital age. Read parts <a href="https://theconversation.com/explainer-how-the-australian-intelligence-community-works-94422">one</a>, <a href="https://theconversation.com/trust-is-the-second-casualty-in-the-war-on-terror-94420">two</a> and <a href="https://theconversation.com/this-isnt-helter-skelter-why-the-internet-alone-cant-be-blamed-for-radicalisation-94825">three</a> here.</em></p>
<hr>
<p>Just as we’ve become used to the idea of cyber warfare, along come the attacks, via social media, on our polity.</p>
<p>We’ve watched with growing amazement the brazen efforts by the Russian state to influence the <a href="https://www.dni.gov/files/documents/ICA_2017_01.pdf">US elections</a>, the UK’s <a href="https://www.theguardian.com/world/2018/jan/10/russian-influence-brexit-vote-detailed-us-senate-report">Brexit referendum</a> and other <a href="http://www.abc.net.au/news/2018-04-17/australians-caught-up-in-cyber-attacks-blamed-on-russia/9665820">democratic targets</a>. And we’ve tended to conflate them with the seemingly endless cyber hacks and attacks on our businesses, governments, infrastructure, and a long-suffering citizenry.</p>
<p>But these social media attacks are a different beast altogether – more sinister, more consequential and far more difficult to counter. They are the modern realisation of the Marxist-Leninist idea that information is a weapon in the struggle against Western democracies, and that the war is ongoing. There is no peacetime or wartime, there are no non-combatants. Indeed, the citizenry are the main targets.</p>
<h2>A new battlespace for an old war</h2>
<p>These subversive attacks on us are not a prelude to war, they are the war itself; what Cold War strategist George Kennan called <a href="http://academic.brooklyn.cuny.edu/history/johnson/65ciafounding3.htm">“political warfare”</a>.</p>
<p>Perversely, as US cyber experts <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3015680">Herb Lin and Jaclyn Kerr note</a>, modern communication attacks exploit the technical virtues of the internet such as “high connectivity” and “democratised access to publishing capabilities”. What the attackers do is, broadly speaking, not illegal.</p>
<p>The battlespace for this warfare is not the physical, but the cognitive environment – within our brains. It seeks to sow confusion and discord, to reduce our abilities to think and reason rationally.</p>
<p>Social media platforms are the perfect theatres in which to wage political warfare. Their vast reach, high tempo, anonymity, directness and cheap production costs mean that political messages can be distributed quickly, cheaply and anonymously. They can also be tailored to target audiences and amplified quickly to drown out adversary messages.</p>
<h2>Simulating dissimulation</h2>
<p>We built simulation models (for a forthcoming publication) to test these ideas. We were astonished at how effectively this new cyber warfare can wreak havoc in the models, co-opting filter bubbles and preventing the emergence of democratic discourse.</p>
<p>We used agent-based models to examine how opinions shift in response to the insertion of strong opinions (fake news or propaganda) into the discourse. </p>
<p>The agents in these simple models were individuals, each holding a set of opinions. We represented different opinions as axes in an opinion space, with each individual located in the space by the values of their opinions. Individuals close to each other in the opinion space are close to each other in their opinions; their difference in opinion is simply the distance between them.</p>
<p>When an individual links to a neighbour, the two experience a degree of convergence: their opinions are drawn towards each other. An individual’s position is not fixed, but may shift under the influence of the opinions of others.</p>
<p>The dynamics in these models were driven by two conflicting processes:</p>
<ul>
<li><p>Individuals are social (they have a need to communicate), and they will seek to communicate with others with whom they agree; that is, other individuals nearby in the opinion space.</p></li>
<li><p>Individuals have a limited number of communication links they can manage at any time (also known as their <a href="https://en.wikipedia.org/wiki/Dunbar%27s_number">Dunbar number</a>), and they continue to form links until they satisfy this number. Individuals, therefore, are sometimes forced to communicate with individuals with whom they disagree in order to satisfy their Dunbar number. But if they wish to create a new link and have already reached their Dunbar number, they will prune another link.</p></li>
</ul>
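The two processes above can be sketched as a minimal agent-based model. This is an illustration of the mechanism described in the article, not the authors' actual code; the agent count matches the experiment, but the Dunbar number, convergence rate and update rule are assumed values.

```python
import random

N_AGENTS = 100     # individuals in the opinion space, as in the experiment
DUNBAR = 5         # maximum links per individual (assumed value)
CONVERGENCE = 0.1  # fraction of the opinion gap closed per interaction (assumed)
DIMS = 2           # opinion axes: issue X and issue Y

class Agent:
    def __init__(self):
        # random starting position in the 2-D opinion space
        self.opinion = [random.uniform(0, 1) for _ in range(DIMS)]
        self.links = set()

def distance(a, b):
    # difference of opinion is simply distance in the opinion space
    return sum((x - y) ** 2 for x, y in zip(a.opinion, b.opinion)) ** 0.5

def step(agents):
    for agent in agents:
        # seek the nearest like-minded individual not already linked
        candidates = [o for o in agents if o is not agent and o not in agent.links]
        if not candidates:
            continue
        nearest = min(candidates, key=lambda o: distance(agent, o))
        # at the Dunbar limit, prune the most distant existing link first
        if len(agent.links) >= DUNBAR:
            agent.links.discard(max(agent.links, key=lambda o: distance(agent, o)))
        agent.links.add(nearest)
        # linked individuals converge: opinions drawn towards each other
        for i in range(DIMS):
            agent.opinion[i] += CONVERGENCE * (nearest.opinion[i] - agent.opinion[i])

agents = [Agent() for _ in range(N_AGENTS)]
for _ in range(25):  # Figure 1 shows the state after 25 steps
    step(agents)
```

Run over many steps, clusters of nearby agents (filter bubbles) form and dissolve as links are pruned and re-formed.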
<p><strong>Figure 1: The emergence of filter bubbles</strong></p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217877/original/file-20180507-166893-16fg8ve.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217877/original/file-20180507-166893-16fg8ve.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217877/original/file-20180507-166893-16fg8ve.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217877/original/file-20180507-166893-16fg8ve.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217877/original/file-20180507-166893-16fg8ve.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217877/original/file-20180507-166893-16fg8ve.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217877/original/file-20180507-166893-16fg8ve.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Figure 1: Filter bubbles emerging with two dimensions, opinions of issue X and opinions of issue Y.</span>
<span class="attribution"><span class="source">roger.bradbury@anu.edu.au</span></span>
</figcaption>
</figure>
<p>To begin, 100 individuals, represented as dots, were randomly distributed across the space with no links. At each step, every individual attempts to link with a near neighbour up to its Dunbar number, perhaps breaking earlier links to do so. In doing so, it may change its position in opinion space.</p>
<p>Over time, individuals draw together into like-minded groups (filter bubbles). But the bubbles are dynamic. They form and dissolve as individuals continue to prune old links and seek newer, closer ones as a result of their shifting positions in the opinion space. Figure 1, above, shows the state of the bubbles in one experiment after 25 steps.</p>
<p><strong>Figure 2: Capturing filter bubbles with fake news</strong></p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/217879/original/file-20180507-166910-qaybke.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/217879/original/file-20180507-166910-qaybke.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/217879/original/file-20180507-166910-qaybke.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/217879/original/file-20180507-166910-qaybke.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/217879/original/file-20180507-166910-qaybke.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/217879/original/file-20180507-166910-qaybke.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/217879/original/file-20180507-166910-qaybke.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Conversation lobbies figure 2.</span>
<span class="attribution"><span class="source">roger.bradbury@anu.edu.au</span></span>
</figcaption>
</figure>
<p>At time step 26, we introduced two pieces of fake news into the model. These were represented as special sorts of individuals that had an opinion in only one dimension of the opinion space and no opinion at all in the other. Further, these “individuals” didn’t seek to connect to other individuals and they never shifted their opinion as a result of ordinary individuals linking to them. They are represented by the two green lines in Figure 2.</p>
<p>Over time (the figure shows time step 100), each piece of fake news breaks down the old filter bubbles and reels individuals in towards its green line, creating new, tighter filter bubbles that are very stable over time.</p>
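A hypothetical sketch of how such a fake-news "individual" differs from an ordinary agent: it has an opinion on only one axis, never seeks links, and never shifts, so it acts as a fixed attractor. The class name, axis positions and convergence rate are illustrative assumptions, not the authors' implementation.

```python
CONVERGENCE = 0.1  # fraction of the opinion gap closed per interaction (assumed)

class FakeNews:
    """A special 'individual': a fixed opinion on one axis, none on the other."""
    def __init__(self, axis, value):
        self.axis = axis    # the single dimension it addresses
        self.value = value  # this opinion never shifts

    def attract(self, opinion):
        # an ordinary individual linking to this source is reeled in
        # towards the green line, but only along the addressed axis
        opinion = list(opinion)
        opinion[self.axis] += CONVERGENCE * (self.value - opinion[self.axis])
        return opinion

# two pieces of fake news, one per axis, as in Figure 2 (positions assumed)
sources = [FakeNews(axis=0, value=0.9), FakeNews(axis=1, value=0.1)]

individual = [0.5, 0.5]
for _ in range(100):  # repeated exposure over many time steps
    individual = sources[0].attract(individual)
# individual[0] is now drawn very close to 0.9; individual[1] is untouched
```

Because the source never moves, repeated contact pulls ordinary individuals arbitrarily close to its line, which is why the resulting bubbles are so stable.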
<h2>Information warfare is a threat to our Enlightenment foundations</h2>
<p>These are the conventional tools of demagogues throughout history, but this agitprop is now packaged in ways perfectly suited to the new environment. Projected against the West, this material seeks to increase political polarisation in our public sphere. </p>
<p>Rather than actually change an election outcome, it seeks to prevent the creation of any coherent worldview. It encourages the creation of filter bubbles in society where emotion is privileged over reason and targets are immunised against real information and rational consideration.</p>
<p>These models confirm Lin and Kerr’s <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3015680">hypothesis</a>. “Traditional” cyber warfare is not an existential threat to Western civilisation. We can and have rebuilt our societies after kinetic attacks. But information warfare in cyberspace is such a threat. </p>
<p>The Enlightenment gave us reason and reality as the foundations of political discourse, but information warfare in cyberspace could replace reason and reality with rage and fantasy. We don’t know how to deal with this yet.</p>
<p class="fine-print"><em><span>Anne-Marie Grisogono is a member of the National Security College Futures Council.</span></em></p><p class="fine-print"><em><span>Dmitry Brizhinev, John Finnigan, Nicholas Lyall, and Roger Bradbury do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Simulation models show just how effectively fake news and propaganda can shift opinions.Roger Bradbury, Professor, National Security College, Australian National UniversityAnne-Marie Grisogono, Visiting fellow, Crawford School of Public Policy, Australian National UniversityDmitry Brizhinev, Research Assistant, National Security College, Australian National UniversityJohn Finnigan, Leader, Complex Systems Science, CSIRONicholas Lyall, Research Assistant (National Security College), Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/937342018-03-22T09:33:00Z2018-03-22T09:33:00ZWe need to talk about the data we give freely of ourselves online and why it’s useful<figure><img src="https://images.theconversation.com/files/211511/original/file-20180322-165580-17ovg9v.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">How should your social media data be accessed and used by researchers?</span> <span class="attribution"><span class="source">Gil C/Shutterstock</span></span></figcaption></figure><p>The <a href="http://www.abc.net.au/news/2018-03-22/facebook-mark-zuckerberg-admits-mistakes-in-protecting-data/9574778">Cambridge Analytica and Facebook data harvesting scandal</a> has provided yet another reminder of what has long been known: as social media users, we are the product.</p>
<p>Our personal information and behaviour that we divulge via social media use is valuable and used for commercial gain. </p>
<p>Yet those who do research using big data – and I count myself among these – are probably feeling both concerned and conflicted at this latest scandal.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/australia-should-strengthen-its-privacy-laws-and-remove-exemptions-for-politicians-93717">Australia should strengthen its privacy laws and remove exemptions for politicians</a>
</strong>
</em>
</p>
<hr>
<p>Those of us academics who also have a private business interest in data analytics may even be having a “there but for the grace of God go I” moment.</p>
<h2>Gathering Facebook data</h2>
<p>The data used by Cambridge Analytica came from University of Cambridge researcher Aleksandr Kogan, and were not collected as part of his university work. </p>
<p>Kogan created a Facebook app which used Facebook’s “application programming interface” (<a href="https://www.pcmag.com/encyclopedia/term/37856/api">API</a>) to gather data on about 50 million people in the United States.</p>
<p>At the time there were many academics using the Facebook API in a similar way and, as has been <a href="https://newsroom.fb.com/news/2018/03/suspending-cambridge-analytica/">pointed out by Facebook in the Cambridge Analytica case</a>, there was no data breach (Facebook’s servers were not hacked). </p>
<p>Around the same time Kogan was collecting his data, I was using the Facebook API for research and teaching at the ANU. It was understood that the use of a Facebook app for research required ethics clearance and informed consent. Further, the collected data should be de-identified, stored securely and only used for the stated research project.</p>
<p>It is clear that these requirements were not met in the case of the Cambridge Analytica Facebook data.</p>
<p>It has been argued that it was not appropriate for Facebook apps to access the information of friends of the participants (the people who installed the app). But it was precisely the social network data (who are the participant’s friends, and how do they connect with one another?) that made Facebook data so useful for social research. </p>
<p>It is important to recognise that researchers using the Facebook API had to respect Facebook’s privacy settings – it was not possible to access profiles that were private or could only be viewed by friends.</p>
<p>Facebook restricted the API in 2014 to prevent this kind of collection. So a budding Aleksandr Kogan of 2018 would not be able to collect Facebook data that would be of interest to Cambridge Analytica. </p>
<p>But there are several reasons why this latest story is not simply “old news”.</p>
<h2>A shifting, competitive environment</h2>
<p>The Cambridge Analytica scandal highlights that social media companies such as Facebook are faced with often conflicting privacy-related demands from users and advertisers, as well as from civil society, academia and government. </p>
<p>MIT management professor Sinan Aral calls this the “<a href="https://www.technologyreview.com/s/610577/the-cambridge-analytica-affair-reveals-facebooks-transparency-paradox/">transparency paradox</a>”. </p>
<p>It is a quickly changing environment and what was considered ethical and appropriate five or ten years ago (such as the <a href="http://www.adweek.com/digital/succeeding-politics-depends-social-media-savvy/">savvy use of social media</a> by the Obama presidential campaign) may be regarded as unacceptable in the future. </p>
<p>This is just the natural process of technology evolving over time in response to public scrutiny.</p>
<p>But some of Facebook’s privacy missteps have appeared to be wilful, with <a href="https://techcrunch.com/2007/12/05/zuckerberg-saves-face-apologies-for-beacon/">the platform testing the water (and then apologising)</a> in terms of what it could get away with to make itself more valuable to advertisers. </p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/consent-and-ethics-in-facebooks-emotional-manipulation-study-28596">Consent and ethics in Facebook's emotional manipulation study</a>
</strong>
</em>
</p>
<hr>
<h2>Taking risks in the WWW Wild West</h2>
<p>Facebook operates in a highly competitive environment, as do the academics and entrepreneurs who want to make use of social media data. Some will always be more willing than others to take calculated risks in an attempt to leapfrog the competition. </p>
<p>The world of big data analysis is like the Wild West. If we don’t collect and analyse these data, then our competitors will (and they will get the grants, or the big contracts). </p>
<p>Anyway, the API may be turned off next year or the social media platform might go bust, so we had better get in quick. </p>
<p>I recently attended an academic presentation involving potentially sensitive social media data (not Facebook data) collected via an API. I was not the only person in the room shifting uneasily in my seat when we were told “everyone is doing it”.</p>
<h2>Impact on future of access to big data</h2>
<p>Researchers, including myself, who use big data will be concerned that the Cambridge Analytica scandal will contribute to making it even harder to access social media data for legitimate research. </p>
<p>But while the public APIs may be further restricted, social media companies will continue to use the data themselves and to give <a href="https://www.wired.com/story/its-time-for-facebook-to-share-more-data-with-researchers/">preferential access</a> to affiliated university researchers.</p>
<p>Yet public APIs help to level the research playing field. They allow researchers from around the world, who are less likely to have any preferential access to social media companies, to conduct open science using publicly available data.</p>
<p>If I conduct research using data from a public API and you don’t agree with the results, you can use the same API to collect a similar dataset to try to prove me wrong.</p>
<p>Restricting API access will also make it harder for outside researchers to understand the privacy implications of the data being collected by Facebook and similar companies. </p>
<p>I am only able to write about the nature of the Cambridge Analytica scandal because the data were originally collected via the public Facebook API, which I was using at around the same time. </p>
<p>By further restricting public APIs, only the social media companies themselves will be able to conduct research about users’ behaviour on their platform. What is the implication for accountability and transparency, let alone research into important topics such as political filter bubbles and fake news? </p>
<h2>How to govern our online social data?</h2>
<p>These privacy concerns are particularly pertinent to those platforms that want “the real you”. That is, those where it is either against the Terms of Service or doesn’t make sense to create multiple or fake profiles. Facebook, Academia.edu and LinkedIn are prime examples. </p>
<p>But all social media platforms share a common feature that they are only valuable if they have significant market share. Because of the network effect, these platforms can grow very quickly and it is a winner-takes-all proposition. They will “move fast and break things” to get to number one. </p>
<p><a href="http://www.afr.com/technology/kenneth-rogoff-concerned-by-the-dark-side-of-the-technology-revolution-20180308-h0x8n4">Concerns have been raised</a> about the enormous power that platforms such as Google and Facebook have as a result of controlling data on consumer preferences and search behaviour, and how this can reduce competition and innovation.</p>
<hr>
<p>
<em>
<strong>
Read more:
<a href="https://theconversation.com/regulating-facebook-wont-prevent-data-breaches-93697">Regulating Facebook won't prevent data breaches</a>
</strong>
</em>
</p>
<hr>
<p>The Cambridge Analytica scandal will inevitably focus attention on the question of how we should govern our online social data. The European Union’s <a href="https://decodeproject.eu/">DECODE</a> (Decentralised Citizen Owned Data Ecosystems) project is developing tools to give people control over how their data is used, and the ability to share it on their terms. </p>
<p>Social media platforms are walled gardens but data portability is one of the planks of the European Union’s <a href="https://www.i-scoop.eu/gdpr">General Data Protection Regulation</a>, which will come into effect in May 2018. This is the right for an individual to require an organisation to give them back a copy of their personal data or to send this data to another organisation (potentially a competitor).</p>
<p>Maybe this is a step towards a world where users will be able to easily leave a social media platform if they don’t agree with how their data are being used, without suffering the social or career hit that would be associated with the only option available now: delete your account.</p>
<p class="fine-print"><em><span>Robert Ackland is an academic at the Australian National University and is the founder and CEO of Uberlink Corp, which specialises in quantitative analysis of online social and organisational networks.</span></em></p>Harvesting data from Facebook’s users is within the rules, I should know, I’ve done this kind of research myself. But the latest scandal may make it harder for us to get any useful data.Robert Ackland, Associate Professor and Leader of the VOSON Lab, School of Sociology, Australian National UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/931182018-03-21T10:43:14Z2018-03-21T10:43:14ZThink Facebook can manipulate you? Look out for virtual reality<figure><img src="https://images.theconversation.com/files/211198/original/file-20180320-31624-13znwph.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">What these people are seeing isn't real – but they might think it is.</span> <span class="attribution"><a class="source" href="http://www.apimages.com/metadata/Index/APTOPIX-Spain-Wireless-Show-Flagship-Phones/55557e265ea948089fc69dadde97782a/5/0">AP Photo/Francisco Seco</a></span></figcaption></figure><p>As Facebook users around the world are coming to understand, some of their favorite technologies can be used against them. It’s not just the scandal over psychological profiling firm <a href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election">Cambridge Analytica getting access</a> to data from tens of millions of Facebook profiles. People’s filter bubbles are filled with carefully tailored information – and misinformation – altering their <a href="https://www.onlineprivacyfoundation.org/opf-research/psychographic-targeting/">behavior and thinking, and even their votes</a>.</p>
<p>People, both individually and as a society at large, are wrestling to understand <a href="https://techcrunch.com/2018/03/18/move-fast-and-fake-things/">how their newsfeeds turned against them</a>. They are coming to realize exactly how carefully controlled Facebook feeds are, with highly tailored ads. That set of problems, though, pales in comparison to those posed by the next technological revolution, which is already underway: virtual reality. </p>
<p>On one hand, virtual worlds hold almost limitless potential. VR games can <a href="https://www.tennessean.com/picture-gallery/news/2018/02/23/virtual-reality-games-used-in-drug-rehab-therapy/110761470/">treat drug addiction</a> and maybe help solve the <a href="https://theconversation.com/the-opioid-epidemic-in-6-charts-81601">opioid epidemic</a>. Prison inmates can use VR simulations to <a href="https://news.vice.com/en_us/article/bjym3w/this-prison-is-using-vr-to-teach-inmates-how-to-live-on-the-outside">prepare for life after their release</a>. People are racing to enter these immersive experiences, which have the potential to be more psychologically powerful than any other technology to date: The first modern equipment offering the opportunity <a href="https://www.telegraph.co.uk/technology/ces/12085175/Oculus-Rift-to-go-on-sale-in-March-for-599.html">sold out in 14 minutes</a>.</p>
<p>In these new worlds, every leaf, every stone on the virtual ground and every conversation is carefully constructed. In our research into the emerging definition of ethics in virtual reality, my colleagues and I interviewed the developers and early users of virtual reality to understand <a href="http://hdl.handle.net/1903/20513">what risks are coming and how we can reduce them</a>.</p>
<h2>Intensity is going to level up</h2>
<p>“VR is a very personal, intimate situation. When you wear a VR headset … you really believe it, it’s really immersive,” says one of the developers with whom we spoke. If someone harms you in VR, <a href="https://theconversation.com/sexual-assault-enters-virtual-reality-67971">you’re going to feel it</a>, and if someone manipulates you into believing something, it’s going to stick. </p>
<p>This immersion is what users want: “VR is really about being immersed … As opposed to a TV where I can constantly be distracted,” one user told us. That immersiveness is what gives VR unprecedented power: “really, what VR is trying to do here is duplicate reality where it tricks your mind.”</p>
<p>These tricks can be enjoyable – allowing people to <a href="https://vrsource.com/best-vr-flight-simulators-5901/">fly helicopters</a> or journey back to <a href="https://www.virtualiteach.com/single-post/2017/07/24/Uncover-the-Tomb-of-Tutankhamen-in-VR">ancient Egypt</a>. They can be helpful, offering <a href="https://www.tandfonline.com/doi/abs/10.1586/14737175.8.11.1667">pain management</a> or treatment for <a href="http://www.icdvrat.org/2008/papers/ICDVRAT2008_S01_N05_Rizzo_et_al.pdf">psychological conditions</a>.</p>
<p>But they can also be malicious. Even a common prank that friends play on each other online – logging in and posting as each other – can take on a whole new dimension. One VR user explains, “Someone can put on a VR head unit and go into a virtual world assuming your identity. I think that identity theft, if VR becomes mainstream, will become rampant.”</p>
<h2>Data will be even more personal</h2>
<figure class="align-left zoomable">
<a href="https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=454&fit=crop&dpr=1 600w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=454&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=454&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=571&fit=crop&dpr=1 754w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=571&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/210915/original/file-20180318-104673-196iysp.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=571&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">An image of what the Oculus DK2 sees via its infrared sensors.</span>
<span class="attribution"><a class="source" href="https://forums.oculusvr.com/community/discussion/11385/what-can-the-dk2-ir-camera-see">MaglevNL/reddit</a></span>
</figcaption>
</figure>
<p>VR will be able to collect data on a whole new level. Seemingly innocuous infrared sensors designed to help with motion sickness and alignment can capture near-perfect representations of users’ real-world surroundings. </p>
<p>Further, the data and interactions that give VR the power to treat and diagnose <a href="https://futurism.com/ai-and-vr-could-completely-transform-how-doctors-diagnose-and-treat-mental-disorders/">physical and mental health conditions</a> can be used to hyper-personalize experiences and information to the precise vulnerabilities of individual users.</p>
<p>Combined, the intensity of virtual reality experiences and the even more personal data they collect present the specter of fake news that’s much more powerful than text articles and memes. Rather, immersive, personalized experiences may thoroughly convince people of entirely alternate realities, to which they are perfectly susceptible. Such immersive VR advertisements are on the horizon <a href="https://www.wired.com/story/vr-ads-are-almost-here/">as early as this year</a>.</p>
<h2>Building a virtual future</h2>
<p>A person who uses virtual reality is, often willingly, being controlled to far greater extents than were ever possible before. Everything a person sees and hears – and perhaps even feels or smells – is totally created by another person. That surrender brings both promise and peril. Perhaps in carefully constructed virtual worlds, people can solve problems that have eluded us in reality. But these virtual worlds will be built inside a real world that can’t be ignored. </p>
<p>While technologists and users are cleaning up the malicious, manipulative past, they’ll need to go far beyond <a href="https://www.wired.com/story/what-would-healthy-twitter-look-like/">making social media healthier</a>. As carefully as developers are building virtual worlds themselves, society as a whole must intentionally and painstakingly construct the culture in which these technologies exist. </p>
<p>In many cases, developers are the first allies in this fight. Our research found that VR developers were more concerned about their users’ well-being than the users themselves. Yet, one developer admits that “the fact of the matter is … I can count on my fingers the number of experienced developers I’ve actually met.” Even <a href="http://doi.org/10.1145/2580723.2580730">experts have only begun to explore</a> ethics, security and privacy in virtual reality scenarios. </p>
<p>The developers we spoke with expressed a desire for guidelines on where to draw the boundaries, and how to prevent dangerous misuses of their platforms. As an initial step, we <a href="http://hdl.handle.net/1903/20513">invited VR developers and users</a> from nine online communities to work with us to create a set of guidelines for VR ethics. They made suggestions about inclusivity, protecting users from manipulative attackers and limits on data collection. </p>
<p>As the debacle with Facebook and Cambridge Analytica shows, though, people don’t always follow guidelines, or even <a href="https://www.washingtonpost.com/business/economy/facebooks-rules-for-accessing-user-data-lured-more-than-just-cambridge-analytica/2018/03/19/31f6979c-658e-43d6-a71f-afdd8bf1308b_story.html">platforms’ rules and policies</a> – and the effects could be all the worse in this new VR world. But, our initial success reaching agreement on VR guidelines serves as a reminder that people can go beyond reckoning with the technologies others create: We can work together to create beneficial technologies we want.</p>
<p class="fine-print"><em><span>Elissa Redmiles receives research funding from a variety of sources including the National Science Foundation, National Center for Women in Technology, and Facebook.</span></em></p>As the internet-connected world reels from revelations about personalized manipulation based on Facebook data, a scholar of virtual reality warns there’s an even bigger crisis of trust on the horizon.Elissa M. Redmiles, Ph.D. Student in Computer Science, University of MarylandLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/925442018-03-08T22:15:22Z2018-03-08T22:15:22ZThe myth of the echo chamber<figure><img src="https://images.theconversation.com/files/209332/original/file-20180307-146675-msakvo.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">There are widespread fears that so-called echo chambers and filter bubbles are leading to political polarization that poses a danger to democracy. But are the fears unfounded?</span> <span class="attribution"><span class="source">(Melvin Sokolsky/1963 via Creative Commons)</span></span></figcaption></figure><p>There is a common fear that people are using social media to access only specific types of political information and news. <a href="https://kf-site-production.s3.amazonaws.com/media_elements/files/000/000/133/original/Topos_KF_White-Paper_Nyhan_V1.pdf">The echo chamber theory</a> says people select information that conforms to their preferences. </p>
<p>In a <a href="https://www.tandfonline.com/doi/abs/10.1080/1369118X.2018.1428656">recently published study</a>, we show that fears people encounter only information that confirms their existing political views are blown out of proportion. In fact, most people already have media habits that help them avoid echo chambers. </p>
<p><a href="https://www.forbes.com/forbes/welcome/?toURL=https://www.forbes.com/sites/kalevleetaru/2017/12/18/why-was-2017-the-year-of-the-filter-bubble/&refURL=https://www.google.ca/&referrer=https://www.google.ca/">A related theory about “filter bubbles”</a> claims social media companies are incentivized to prioritize likeable and shareable content in an individual’s feed, which in turn puts people in an algorithmically constructed bubble. </p>
<p>The democratic problem with these supposed echo chambers and filter bubbles is that people are empowered to avoid politics if they want. This means they will be less aware of their political system, less informed and in turn less likely to vote — all bad signs for a healthy democracy.</p>
<p>People who like politics aren’t immune either. They might become increasingly polarized in their views since all they see are people confirming their own beliefs. While a lot of the current work is theoretical, a few studies have shown that echo chambers and filter bubbles could exist on Twitter or Facebook, for example.</p>
<h2>People get information from many sources</h2>
<p>But people don’t consume political information and news from only one source or channel. </p>
<p>Individuals have access to a wide range of media, from traditional news outlets on television, radio and newspapers (and their digital versions) to a wide range of social media sites and blogs. This means studies that focus on any one single platform simply cannot speak to the actual experiences of individuals. </p>
<p>We wanted to solve this problem by conducting a study examining the media habits of individuals. We wanted to understand what social media they use on a daily basis, what political information and news sources they incorporate in their daily lives, and whether they do things that might help them avoid echo chambers. </p>
<p>To do this we conducted a nationally representative online survey of 2,000 British adults. This is part of the larger <a href="http://quello.msu.edu/research/the-part-played-by-search-in-shaping-political-opinion-the-quello-search-project/">Quello Search Project</a> that examines the formation of political opinions and the digital media habits of adults in seven different countries. Unfortunately no similar Canadian data set exists at present.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/209237/original/file-20180307-146700-1oqk238.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/209237/original/file-20180307-146700-1oqk238.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/209237/original/file-20180307-146700-1oqk238.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/209237/original/file-20180307-146700-1oqk238.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/209237/original/file-20180307-146700-1oqk238.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/209237/original/file-20180307-146700-1oqk238.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/209237/original/file-20180307-146700-1oqk238.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Are people really only seeking out news and information that conforms to their political views?</span>
<span class="attribution"><span class="source">(Shutterstock)</span></span>
</figcaption>
</figure>
<p>Our analysis suggests that people are rarely caught in echo chambers. Only about 8% of online adults in the UK are at risk of being trapped in an echo chamber. </p>
<p>Individuals actively check additional sources, change their minds based on information they find using search engines and seek out differing views. All of these are ways individuals can avoid that echo chamber effect. </p>
<p>Importantly, political interest and media diversity — how many sources of information and how many social media a person uses — both help people avoid the threats of echo chambers. </p>
<p>People who have more than one source of political information are far more likely to act to avoid echo chambers. </p>
<p>They encounter different perspectives, they verify information and they sometimes change their minds. Even people who are not interested in politics are likely to do things that help them avoid echo chambers as long as they have a diverse media diet.</p>
<h2>Fact-checking is crucial</h2>
<p>These results should also dampen worries about political polarization.</p>
<p>We fret about polarization, but in fact those who are politically interested are more likely to have encountered different opinions, checked facts and changed their minds about a political issue after searching for more information.</p>
<p>This means that most people are already on the right track for avoiding echo chambers. It also means that media literacy programs that emphasize incorporating multiple sources into your daily routines, and fact-checking, are crucial.</p>
<p>Social media platforms also have an important role to play.</p>
<p>Facebook and Twitter could still be home to communities that exchange information in a way that confirms existing beliefs and opinions. This is not necessarily a bad thing. It’s important to remember that people rarely get all their political information from just one place. </p>
<p>That said, social media companies can help promote media literacy in the very design of their platforms, for example by making sources of news content visible, explaining how their personalization algorithms work and offering suggested content that helps users find new perspectives.</p>
<p>Happily, some of this experimentation is going on within social media companies already. <a href="https://newsroom.fb.com/news/2017/12/news-feed-fyi-updates-in-our-fight-against-misinformation/">Facebook has experimented</a> by tinkering with what shows up in news feeds and how content is flagged as false. <a href="https://blog.twitter.com/official/en_us/topics/company/2018/twitter-health-metrics-proposal-submission.html">Twitter</a> recently announced a program to examine the health of conversations. So far there have been varying levels of success and criticism.</p>
<p>While we do not have access to data about the Canadian population, preliminary results from our U.S. data set, and from work others have been doing in <a href="http://onlinelibrary.wiley.com/doi/10.1111/jcom.12315/abstract">different national contexts</a> and with different samples <a href="https://medium.com/oxford-university/where-do-people-get-their-news-8e850a0dea03">from the U.K.</a>, suggest we should expect the same trends in Canada. </p>
<p>Most people have media habits that help them avoid echo chambers. When it comes to our elections, our democracy or information warfare, the threat of social media-enabled echo chambers is not a major concern.</p><img src="https://counter.theconversation.com/content/92544/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Elizabeth Dubois has received funding from The Social Sciences and Humanities Research Council of Canada.
Data collection for this study was supported by Google as part of the Quello Search Project.</span></em></p><p class="fine-print"><em><span>Grant Blank does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Despite fears that so-called echo chambers are causing political polarization, a new study suggests it’s not the case.Elizabeth Dubois, Assistant Professor, L’Université d’Ottawa/University of OttawaGrant Blank, Survey Research Fellow, Oxford Internet Institute, University of OxfordLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/911192018-02-06T20:34:19Z2018-02-06T20:34:19ZDigital public: looking at what algorithms actually do<figure><img src="https://images.theconversation.com/files/205084/original/file-20180206-14100-qetozi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">_Message from the Unseen World_, an installation of a Turing-inspired algorithm reciting a poem
by Nick Drake.
</span> <span class="attribution"><a class="source" href="https://flic.kr/p/QYuzF4">Roger Marks/Flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span></figcaption></figure><p>The development and expansion of today’s communications platforms have led to a radical change in how public discourse is conducted and public opinion formed. In particular, the traditional boundary between personal and public communication has disappeared.</p>
<p>A prime example is a 2017 case in which the American actor William Shatner – best known for having played the character Captain Kirk in the 1960s TV series <em>Star Trek</em> – tweeted about the organization <a href="http://www.slate.com/articles/health_and_science/science/2017/04/what_we_can_learn_from_william_shatner_s_twitter_meltdown.html">Autism Speaks</a>, known for its claims that autism is caused by vaccines. Among others, David Gorski, an oncologist at Wayne State University in Detroit who advocates for evidence-based interventions, replied to Shatner’s tweet and explained why Autism Speaks is a controversial organisation. In response, Shatner searched for Gorski’s name on Google and shared articles about him from a conspiracy-oriented website called TruthWiki. Asked why he had not read and linked to <a href="https://en.wikipedia.org/wiki/David_Gorski">Gorski’s Wikipedia entry</a>, Shatner responded that TruthWiki was higher up in his Google search results. You can find it “all on Google,” he <a href="https://twitter.com/WilliamShatner/status/849773578559959040">maintained</a>, as if that itself was a sign of high quality.</p>
<p>Google and other platforms are incredibly powerful tools that allow all of us – and Shatner, too – to locate information in the blink of an eye. To do so they use computer algorithms that measure “relevance”, but the standards used often do not correspond to the criteria that reputable journalists or researchers would use.</p>
<h2>Custom-fitted ‘relevance’</h2>
<p>Algorithms work mostly <em>descriptively</em> and <em>individually</em>. For example, they adjust relevance for a user based on what links he or she has clicked in the past. Yet many users assume the results are normative (“higher up in the Google results”). In the Shatner/Gorski case, the assertion of a correlation between autism and vaccines encouraged a small but highly motivated user group in its online activities and ensured that a significant divergence occurred between content quality and “relevance” as determined by Google’s algorithms.</p>
<p>This is not simply a matter of a handful of telling cases. Because of their ubiquity, so-called intermediaries such as Google and Facebook now influence how public opinion is formed. <a href="http://www.die-medienanstalten.de/fileadmin/Download/Veranstaltungen/Pr%C3%A4sentation_Intermedi%C3%A4re/TNS_Intermedi%C3%A4re_und_Meinungsbildung_Pr%C3%A4si_Web_Mappe_final.pdf">57% of German Internet users</a> get their information about politics and social affairs from search engines or social networks. And even though the share of those who say social networks are their most important source of news is relatively small – <a href="http://www.die-medienanstalten.de/fileadmin/Download/Veranstaltungen/Pr%C3%A4sentation_Intermedi%C3%A4re/TNS_Intermedi%C3%A4re_und_Meinungsbildung_Pr%C3%A4si_Web_Mappe_final.pdf">6% of all Internet users</a> – it is considerably higher among younger users.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/205085/original/file-20180206-88795-t4c738.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/205085/original/file-20180206-88795-t4c738.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=74&fit=crop&dpr=1 600w, https://images.theconversation.com/files/205085/original/file-20180206-88795-t4c738.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=74&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/205085/original/file-20180206-88795-t4c738.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=74&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/205085/original/file-20180206-88795-t4c738.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=93&fit=crop&dpr=1 754w, https://images.theconversation.com/files/205085/original/file-20180206-88795-t4c738.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=93&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/205085/original/file-20180206-88795-t4c738.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=93&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">The most important intermediaries for information, according to German users.</span>
<span class="attribution"><span class="source">Kantar TNS. Berlin, in Ecke, 2016</span></span>
</figcaption>
</figure>
<p>As researchers at the Hamburg-based Hans Bredow Institute put it in 2016, the formation of public opinion is <a href="http://www.hans-bredow-institut.de/webfm_send/1172">“no longer conceivable without intermediaries”</a>.</p>
<h2>Maximising engagement</h2>
<p>The design principles used by intermediaries are leading to a structural change in public discourse. Anyone can now publish whatever they like, but not everyone will find an audience. Attention is generated only when people interact with algorithmic decision-making (ADM) processes. ADM processes determine the individual relevance of content items on social networks such as Facebook and select the items to be displayed for each user. In assembling an individual user’s feed, Facebook examines which content that person and his or her friends prefer or hide. Both signals are based on actions that are relatively straightforward.</p>
<p>Facebook also undoubtedly deploys signals that users are not consciously aware of sending, such as the amount of time they view a certain entry in the feed. Users who spend more time with any one item <a href="http://www.slate.com/articles/technology/cover_story/2016/01/how_Facebook_s_news_feed_algorithm_works.html">signal approval without explicitly doing so</a>. ADM systems play a significant role in other areas, like assisting in legal matters or determining where and when <a href="https://www.brennancenter.org/legal-work/brennan-center-justice-v-new-york-police-department">police officers are on duty</a>.</p>
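The interplay of explicit signals (likes, hides) and implicit ones (dwell time) can be sketched with a toy ranker. This is not Facebook's actual algorithm, whose details are unpublished; all field names and weights below are invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical engagement signals for one feed item. Real ADM systems
# use many more signals and proprietary weightings.
@dataclass
class ItemSignals:
    likes_from_friends: int   # explicit positive signal
    hides: int                # explicit negative signal
    avg_dwell_seconds: float  # implicit signal: time spent viewing the item

def relevance(sig: ItemSignals) -> float:
    """Toy relevance score: explicit feedback plus implicit dwell time."""
    return (2.0 * sig.likes_from_friends
            - 3.0 * sig.hides
            + 0.5 * sig.avg_dwell_seconds)

def rank_feed(items: dict[str, ItemSignals]) -> list[str]:
    """Order items for display, highest toy relevance first."""
    return sorted(items, key=lambda name: relevance(items[name]), reverse=True)

feed = {
    "news_article": ItemSignals(likes_from_friends=1, hides=2, avg_dwell_seconds=4.0),
    "friend_photo": ItemSignals(likes_from_friends=5, hides=0, avg_dwell_seconds=8.0),
}
print(rank_feed(feed))  # the friend's photo outranks the hidden-by-many article
```

Note that the dwell-time term rewards items merely looked at for longer, whether or not the user explicitly approved of them, which is exactly the kind of unconscious signal described above.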
<p>There is much less diversity among intermediaries than among editorially curated media. Even if each person using the services provided by today’s major intermediaries is given an individual choice, the same selection principles are applied to all users, and these are controlled by centralised curators. The new, crucial role played by users’ reactions and ADM processes is that together they determine how much attention content receives when it is disseminated.</p>
<h2>Negative emotions and cognitive distortions</h2>
<p>Studies of networking platforms show that content that rouses emotion is commented on and shared most often – and above all when negative emotions are involved.</p>
<p>Such polarizing effects seem to depend on a number of additional factors such as a country’s electoral system. Societies with “first past the post” systems such as the United States are potentially <a href="https://5harad.com/papers/bubbles.pdf">more vulnerable to extreme political polarisation</a>. In countries with proportional systems, institutionalised multiparty structures and ruling coalitions tend to balance out competing interests.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/aTxUetlqWmU?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">How YouTube’s algorithm can distort reality.</span></figcaption>
</figure>
<p>Existing societal polarisation presumably influences and is influenced by the algorithmic ranking of media content. A <a href="http://www.pnas.org/content/113/3/554">2016 study published in the Proceedings of the National Academy of Sciences</a> indicates that Facebook users who believe in conspiracy theories tend over time to turn to the community of conspiracy theorists holding the same view. This process is possibly intensified by algorithms that increasingly present them with content “relevant” to their views. These systems could in fact result in the creation of so-called echo chambers among people with extremist views.</p>
<p>Below are three aspects of intermediary platforms that can influence the formation of individual and public opinions:</p>
<ul>
<li><p><strong>Intermediaries measure engagement through users’ automatic, impulsive reactions</strong>. They use numerous variables to calculate relevance, ranging from basic behavioural metrics such as scrolling speed or the duration of page views to the level of interaction among multiple users in a social network. When someone with whom a user has repeatedly communicated on Facebook posts content, the probability that the user will be shown this content is higher than if it had been posted by someone with whom the user has never truly interacted.</p></li>
<li><p><strong>Intermediaries constantly change the variables they measure</strong>. The metrics signalling relevance are potentially problematic. Platform operators are hesitant to provide details of their metrics, both for competitive reasons and because they are constantly changing them. Google and Facebook alter their systems continuously; the operators <a href="https://www.Facebook.com/notes/Facebook-data-science/big-experiments-big-datas-friend-for-making-decisions/10152160441298859/">experiment with and tweak almost every aspect of the user interface</a> and other platform features to achieve specific goals such as increased interactivity.</p></li>
<li><p><strong>Intermediaries with the greatest reach promote unconsidered behaviour</strong>. Clicking on a “like” button or a link demands no cognitive effort, and many users are evidently happy to indulge this lack of effort. <a href="https://hal.inria.fr/hal-01281190/file/sigm095-gabielkov.pdf">Empirical studies</a> by the French National Institute for Computer Science (INRIA) and Columbia University suggest that many articles in social networks forwarded with a click to the user’s circle of friends could not possibly have been read. Users thus disseminate media content after having seen only the headline and introduction. To some extent they deceive the algorithm and, with it, their “friends and followers” into believing that they have engaged with the text.</p></li>
</ul>
<p>The ease of interaction also promotes cognitive distortions that have been known to social psychologists for years. A prime example is the “availability” heuristic: If an event or memory can easily be recalled, it is <a href="http://psiexp.ss.uci.edu/research/teaching/Tversky_Kahneman_1974.pdf">assumed to be particularly probable or common</a>. Users frequently encounter unread media content that has been forwarded on the strength of a headline, and the content is thus later remembered as being “true” or “likely.” This is the case even when the text itself points out that the headline is a grotesque exaggeration or simply misleading.</p>
<h2>The need for diversity and transparency</h2>
<p>Ensuring a diversity of media in the public sphere means ensuring that the ADM processes that assess relevance are diverse as well. Algorithms that rank content and personalise its presentation are the heart of the complex, interdependent process underlying digital discourse. To bring transparency to ADM processes we need to:</p>
<ul>
<li><p>Make platforms and their impacts more open to external researchers.</p></li>
<li><p>Promote diversity among algorithmic processes.</p></li>
<li><p>Establish a code of ethics for developers.</p></li>
<li><p>Make users more aware of the mechanisms now being used to influence public discourse.</p></li>
</ul>
<p>Organisations working for this kind of transparency include <a href="https://algorithmwatch.org/en/">AlgorithmWatch</a>, based in Germany, and the US media watchdog <a href="https://www.propublica.org/article/making-algorithms-accountable">ProPublica</a>, which have published a number of donation-funded studies and articles on the issue.</p>
<p>Through the combination of industry self-regulation and legislative measures, an unbiased understanding of the real social and political consequences of algorithmic ranking has the potential to identify and counter dangers early on.</p><img src="https://counter.theconversation.com/content/91119/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Konrad Lischka is affiliated with D64 – Zentrum für Digitalen Fortschritt e.V., Free Software Foundation Europe (FSFE) e.V, Sozialdemokratische Partei Deutschlands (SPD), Wikimedia Deutschland e. V. </span></em></p><p class="fine-print"><em><span>Christian Stöcker receives funding from the German Ministry for Education and Research (joint research project Propstop <a href="http://www.propstop.de">http://www.propstop.de</a> on covert propaganda attacks in online media).He writes a weekly column for German news website SPIEGEL ONLINE (<a href="http://www.spiegel.de/thema/spon_der_rationalist/">http://www.spiegel.de/thema/spon_der_rationalist/</a>). Together with Konrad Lischka he authored a paper on the digital public and algorithmic sorting of media content funded by the Bertelsmann foundation. </span></em></p>Today’s communications platforms and the algorithms that power them have led to a radical change in how public discourse is conducted and public opinion formed.Konrad Lischka, Project Lead Ethics of Algorithms, Bertelsmann StiftungChristian Stöcker, Professor of Digital Communication, Hamburg University of Applied SciencesLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/862852017-11-06T01:23:47Z2017-11-06T01:23:47ZWhy social media may not be so good for democracy<figure><img src="https://images.theconversation.com/files/193234/original/file-20171103-1008-1kvuik5.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Some of the Facebook and Instagram ads used in 2016 election released by members of the U.S. House Intelligence committee. 
</span> <span class="attribution"><span class="source">AP Photo/Jon Elswick</span></span></figcaption></figure><p>Recent revelations about how Russian agents <a href="https://www.nytimes.com/2017/09/07/us/politics/russia-facebook-twitter-election.html">inserted ads on Facebook</a>, in an attempt to influence the 2016 election, present a troubling question: Is Facebook bad for democracy? </p>
<p>As a scholar of the social and political implications of technology, I believe that the problem is not about Facebook alone, but much larger: Social media is actively undermining some of the social conditions that have historically made democratic nation states possible. </p>
<p>I understand that’s a huge claim, and I don’t expect anyone to believe it right away. But, considering that <a href="https://www.nbcnews.com/news/us-news/russian-backed-election-content-reached-126-million-americans-facebook-says-n815791">nearly half</a> of all eligible voters received Russian-sponsored fake news on Facebook, it’s an argument that needs to be on the table. </p>
<h2>How we create a shared reality</h2>
<p>Let’s start with two concepts: an “imagined community” and a “filter bubble.”</p>
<p>The late political scientist Benedict Anderson famously argued that the modern nation-state is best understood as an “<a href="https://www.versobooks.com/books/2259-imagined-communities">imagined community</a>” partly enabled by the rise of mass media such as newspapers. What Anderson meant is that the sense of cohesion that citizens of modern nations felt with one another – the degree to which they could be considered part of a national community – was one that was both artificial and facilitated by mass media. </p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193237/original/file-20171103-1061-1rk7i5q.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Mass media is one way to create a shared community.</span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/wikidave/4078540081/in/photolist-7dpzSz-RngzfQ-GmP7b-mdxj3-5U5dgp-j2gfsH-chjenQ-4sfYJw-Vzp9b2-qSZUs-kKuJFD-qSZJk-WxNtJ2-UAK8QD-X7EGHs-qVKLJW-qQmUVE-4kFjVm-5U3TPj-5Pfj8r-7S5oRH-eMg9KX-5mPnbW-8kXFD-96dsKy-8kXL7-6XDmDy-WuUsFL-ahCjFS-8kXGA-aakTBT-8kXEm-8kXJp-XYgTmJ-dKFgyg-b2ySR-MSmNxc-rjbN8d-7JtacE-de9qGq-oPoaFy-aDzm92-34atc1-kARY93-3R6LB-Vzp9Dr-VzpaNv-i9Kyz4-vZ7vv-8kXMX">Dave Crosby</a>, <a class="license" href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA</a></span>
</figcaption>
</figure>
<p>Of course there are many things that enable nation-states like the U.S. to hold together. We all learn (more or less) the same national history in school, for example. Still, the average lobster fisherman in Maine, for example, doesn’t actually have that much in common with the average schoolteacher in South Dakota. But, the <a href="https://www.versobooks.com/books/2259-imagined-communities">mass media contribute</a> toward helping them view themselves as part of something larger: that is, the “nation.” </p>
<p>Democratic polities depend on this shared sense of commonality. It enables what we call “national” policies – an idea that citizens see their interests aligned on some issues. Legal scholar <a href="http://hls.harvard.edu/faculty/directory/10871/Sunstein">Cass Sunstein</a> <a href="https://books.google.com/books/about/Republic_com.html?id=O7AG9TxDJdgC">explains this idea</a> by taking us back to the time when there were only three broadcast news outlets and they all said more or less the same thing. As Sunstein says, we have historically depended on these “general interest intermediaries” to frame and articulate our sense of shared reality. </p>
<h2>Filter bubbles</h2>
<p>The term <a href="https://www.penguinrandomhouse.com/books/309214/the-filter-bubble-by-eli-pariser/9780143121237/">“filter bubble”</a> emerged in a 2011 book by activist <a href="https://www.opensocietyfoundations.org/people/eli-pariser">Eli Pariser</a> to characterize an internet phenomenon.</p>
<p>Legal scholar <a href="http://hls.harvard.edu/faculty/directory/10519/Lessig">Lawrence Lessig</a> and Sunstein too had <a href="http://codev2.cc/download+remix/Lessig-Codev2.pdf">identified</a> this phenomenon of group isolation on the internet in the late 1990s. Inside a filter bubble, individuals basically receive only the kinds of information that they have either preselected, or, more ominously, that third parties have decided they want to hear. </p>
<p>The targeted advertising behind Facebook’s newsfeed helps to create such filter bubbles. Advertising on Facebook works by determining its users’ interests, based on data it collects from their browsing, likes and so on. This is a very sophisticated operation. </p>
<p>Facebook does not disclose its own algorithms. However, research led by <a href="http://www.michalkosinski.com/">Michael Kosinski</a>, a psychologist and data scientist at Stanford University, <a href="http://www.pnas.org/content/110/15/5802.full">demonstrated</a> that automated analysis of people’s Facebook likes was able to identify their demographic information and basic political beliefs. Such targeting can also apparently be extremely precise. There is <a href="http://www.cnn.com/2017/10/03/politics/russian-facebook-ads-michigan-wisconsin/index.html">evidence</a>, for example, that anti-Clinton ads from Russia were able to micro-target specific voters in Michigan. </p>
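The kind of inference demonstrated in that research can be sketched with a toy model; all pages, users, and labels below are fabricated, and the actual study used millions of real profiles and far more sophisticated statistics. The idea: represent each user by the set of pages they have liked, learn per-page evidence weights from a labelled sample, and score new users by summing the weights of their likes.

```python
from collections import defaultdict
from math import log

# Fabricated training data: pages each user liked, plus a known
# binary attribute (here, a self-reported political leaning).
train = [
    ({"page_a", "page_b"}, "left"),
    ({"page_a", "page_c"}, "left"),
    ({"page_d", "page_e"}, "right"),
    ({"page_d", "page_b"}, "right"),
]

def like_log_odds(train, label="left", smoothing=1.0):
    """Per-page evidence weights: log of P(liked | left) / P(liked | right),
    with add-one smoothing so rare pages do not produce infinite weights."""
    counts = {True: defaultdict(float), False: defaultdict(float)}
    totals = {True: 0.0, False: 0.0}
    pages = set()
    for likes, lab in train:
        is_label = (lab == label)
        totals[is_label] += 1
        for page in likes:
            counts[is_label][page] += 1
            pages.add(page)
    weights = {}
    for page in pages:
        p_pos = (counts[True][page] + smoothing) / (totals[True] + 2 * smoothing)
        p_neg = (counts[False][page] + smoothing) / (totals[False] + 2 * smoothing)
        weights[page] = log(p_pos / p_neg)
    return weights

def predict(likes, weights):
    """Classify a user as "left" or "right" from the summed evidence."""
    score = sum(weights.get(page, 0.0) for page in likes)
    return "left" if score > 0 else "right"

w = like_log_odds(train)
print(predict({"page_a"}, w))  # "left": page_a appears only among left-leaning users
```

Even this crude sketch shows why a single distinctive like can be revealing, and why advertisers holding such weights can micro-target users who never disclosed the attribute directly.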
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193240/original/file-20171103-1017-oe7f2t.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Is Facebook creating filter bubbles?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/download/confirm/569244514?src=K4gWpNnYIIplqxveBwOlHQ-1-32&size=small_jpg">sitthiphong/Shutterstock.com</a></span>
</figcaption>
</figure>
<p>The problem is that inside a filter bubble, you never receive any news that you do not agree with. This poses two problems: First, there is never any independent verification of that news. Individuals who want independent confirmation will have to actively seek it out. </p>
<p>Second, psychologists have known for a long time about “<a href="http://psy2.ucsd.edu/%7Emckenzie/nickersonConfirmationBias.pdf">confirmation bias</a>,” the tendency of people to seek out only information they agree with. Confirmation bias also limits people’s ability to question information that confirms or upholds their beliefs.</p>
<p>Not only that, research at Yale University’s <a href="http://www.culturalcognition.net/">Cultural Cognition Project</a> strongly suggests that people <a href="http://www.culturalcognition.net/blog/2012/11/15/is-cultural-cognition-the-same-thing-as-or-even-a-form-of-co.html">are inclined</a> to interpret new evidence in light of beliefs associated with their social groups. This can <a href="http://www.culturalcognition.net/blog/2015/6/12/politically-motivated-reasoning-paradigm-pmrp-what-it-is-how.html">tend to polarize</a> those groups.</p>
<p>All of this means that if you are inclined to dislike President Donald Trump, any negative information on him is likely to further strengthen that belief. Conversely, you are likely to discredit or ignore pro-Trump information.</p>
<p>It is this pair of features of filter bubbles – preselection and confirmation bias – that fake news exploits with precision.</p>
<h2>Creating polarized groups?</h2>
<p>These features are also hardwired into the business model of social media like Facebook, which is predicated precisely on the idea that one can create a group of “friends” with whom one shares information. This group is largely insular, separated from other groups. </p>
<p>The software very <a href="http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html">carefully curates</a> the transfer of information across these social networks and tries very hard to be the primary portal through which its users – about <a href="http://money.cnn.com/2017/02/01/technology/facebook-earnings/index.html">2 billion</a> of them – access the internet.</p>
<p>Facebook depends on advertising for its revenue, and that advertising can be readily exploited: A recent <a href="https://www.propublica.org/article/facebook-enabled-advertisers-to-reach-jew-haters">ProPublica investigation</a> shows how easy it was to target Facebook ads to “Jew Haters.” More generally, the site also wants to keep users online, and it <a href="http://www.pnas.org/content/111/24/8788.full">knows</a> that it is able to manipulate the emotions of its users – who are happiest when they see things they agree with. </p>
<figure class="align-left ">
<img alt="" src="https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=401&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=401&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=401&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=504&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=504&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193243/original/file-20171103-1027-11kogxw.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=504&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Is social media creating more polarization?</span>
<span class="attribution"><a class="source" href="https://www.shutterstock.com/download/confirm/566694994?src=4VpU8SnctQx7i6ZTJKQ7AA-1-0&size=small_jpg">Chinnapong/ Shutterstock.com</a></span>
</figcaption>
</figure>
<p>As the Washington Post <a href="https://www.washingtonpost.com/news/the-switch/wp/2017/11/01/how-russian-trolls-got-into-your-facebook-feed/?utm_term=.aa0c53a633a1">documents</a>, it is precisely these features that were exploited by Russian ads. As a writer at Wired <a href="https://www.wired.com/2016/11/filter-bubble-destroying-democracy/">observed</a> in an ominously prescient commentary immediately after the election, he never saw a pro-Trump post that had been shared over 1.5 million times – and neither did any of his liberal friends. They saw only liberal-leaning news on their social media feeds.</p>
<p>In this environment, a recent Pew Research Center survey should not come as a surprise. The survey <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">shows</a> that the American electorate is deeply divided along partisan lines, even on fundamental political issues, and is becoming more so. </p>
<p>All of this combines to mean that the world of social media tends to create small, deeply polarized groups of individuals who will tend to believe everything they hear, no matter how divorced from reality it may be. The filter bubble sets us up to be vulnerable to polarizing fake news and to become more insular. </p>
<h2>The end of the imagined community?</h2>
<p>At this point, two-thirds of Americans get <a href="http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/">at least some of their news</a> from social media outlets. This means that two-thirds of Americans get at least some of their news from highly curated and personalized black-box algorithms. </p>
<p>Facebook remains, by a significant margin, the <a href="https://www.salon.com/2017/10/05/atone-hed-better-facebook-is-still-the-biggest-source-of-right-wing-fake-news/">most prevalent</a> source of fake news. Not unlike forced, false <a href="http://press.uchicago.edu/ucp/books/book/chicago/D/bo3628714.html">confessions of witchcraft</a> in the Middle Ages, these stories get repeated often enough that they could appear legitimate. </p>
<p>What we are witnessing, in other words, is the potential collapse of a significant part of the imagined community that is the American polity. Although the U.S. is also divided demographically and there are sharp demographic differences between regions within the country, <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">partisan differences are dwarfing other divisions</a> in society.</p>
<p>This is a recent trend: In the mid-1990s, partisan divisions were <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">similar in size to demographic divisions</a>. For example, then and now, women and men would be about the same modest distance apart on political questions, such as whether government should do more to help the poor. In the 1990s, this was also true for Democrats and Republicans. In other words, partisan divisions were no better than demographic factors at predicting people’s political views. Today, if you want to know someone’s political views, <a href="http://www.people-press.org/2017/10/05/the-partisan-divide-on-political-values-grows-even-wider/">you would first want to find out</a> their partisan affiliation. </p>
<h2>The reality of social media</h2>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=502&fit=crop&dpr=1 754w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=502&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/193244/original/file-20171103-1068-vhhhgx.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=502&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption"></span>
<span class="attribution"><a class="source" href="https://www.flickr.com/photos/jasonahowie/8583949219/in/photolist-e5wZ3t-VMLTZk-8TxrKw-8h6sWa-6sydbF-8tftq9-5XNfPs-bns6av-c9FLYw-6AX2Qo-fgx2YY-eMnSfC-7TGMs7-9robx3-deoZtb-8ddFEZ-8adSFu-8onC9R-bns6J4-7arjcX-7rbSNc-dWUWcb-fiFCow-bns6D8-6ZckCr-98AHcS-gaAzZ6-nZUnJL-qi7hrH-7R7gx2-dXX1bU-9arafo-dPQbk9-7Kh7bs-nX21xv-8adShf-dAgDr2-8emLep-6sARCS-8emR1B-9um8Qd-4se7dy-8emLB2-dhZgBu-73qnar-5n4D99-81gMx7-6ZhLm1-4KvCf1-8emRJF">Jason Howie</a>, <a class="license" href="http://creativecommons.org/licenses/by/4.0/">CC BY</a></span>
</figcaption>
</figure>
<p>To be sure, it would be overly simplistic to lay all of this at the feet of social media. Certainly the structure of the American political system, which tends to polarize the political parties in primary elections, plays a major role. And it is true that plenty of us also still get news from other sources, outside of our Facebook filter bubbles. </p>
<p>But, I would argue that Facebook and social media offer an additional layer: Not only do they tend to create filter bubbles on their own, they offer a rich environment for those who want to increase polarization to do so. </p>
<p>Communities share and create social realities. In its current role, social media risks abetting a social reality where differing groups could disagree not only about what to do, but about what reality is.</p>
<p class="fine-print"><em><span>Gordon Hull does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>A scholar asks whether democracy itself is at risk in a world where social media is creating deeply polarized groups of individuals who tend to believe everything they hear.Gordon Hull, Associate Professor of Philosophy, Director of Center for Professional and Applied Ethics, University of North Carolina – CharlotteLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/766882017-05-05T18:03:54Z2017-05-05T18:03:54ZFake news, echo chambers and filter bubbles: Underresearched and overhyped<figure><img src="https://images.theconversation.com/files/167927/original/file-20170504-4929-1sx8gvi.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Don't panic: An international survey finds concerns about fake news are overblown.</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/image-vector/businessman-working-on-laptop-pop-art-356914784">studiostoks/shutterstock.com</a></span></figcaption></figure><p>In the early years of the internet, it was revolutionary to have <a href="https://www.theguardian.com/technology/2014/aug/24/internet-lost-its-way-tim-berners-lee-world-wide-web">a world of information just a click away</a> from anyone, anywhere, anytime. Many hoped this <a href="http://www.hup.harvard.edu/catalog.php?isbn=9780674872332">inherently democratic technology</a> could lead to better-informed citizens more easily <a href="http://www.nature.com/news/society-build-digital-democracy-1.18690">participating in debate, elections and public discourse</a>.</p>
<p>Today, though, many observers are <a href="https://theconversation.com/is-googles-eagerness-to-answer-questions-promoting-more-falsehood-online-70894">concerned that search algorithms</a> and <a href="https://www.theguardian.com/technology/2016/dec/04/google-democracy-truth-internet-search-facebook">social media are undermining</a> the <a href="https://www.theguardian.com/technology/2016/dec/16/google-autocomplete-rightwing-bias-algorithm-political-propaganda">quality of online information</a> people see. They worry that bad information may be <a href="https://www.researchgate.net/publication/220427075_Why_the_Internet_is_Bad_for_Democracy">weakening democracy in the digital age</a>.</p>
<p>The problems include online services <a href="https://theconversation.com/how-can-we-learn-to-reject-fake-news-in-the-digital-world-69706">conveying fake news</a>, splitting users into “<a href="http://www.penguinrandomhouse.com/books/309214/the-filter-bubble-by-eli-pariser/9780143121237/">filter bubbles</a>” of <a href="https://www.wired.com/2016/11/filter-bubble-destroying-democracy/">like-minded people</a> and <a href="http://press.princeton.edu/chapters/s8468.html">enabling users to unwittingly</a> <a href="https://theconversation.com/why-do-we-fall-for-fake-news-69829">lock themselves up</a> in <a href="https://www.psychologytoday.com/blog/psych-unseen/201611/fake-news-echo-chambers-filter-bubbles-survival-guide">virtual echo chambers</a> that <a href="http://www.newstatesman.com/helen-lewis/2015/07/echo-chamber-social-media-luring-left-cosy-delusion-and-dangerous-insularity">reinforce their own biases</a>. </p>
<p>These concerns are much discussed, but have not yet been thoroughly studied. <a href="https://doi.org/10.7717/peerj-cs.38">What research does exist</a> has typically been limited to a single platform, such as Twitter or Facebook. Our <a href="http://ssrn.com/abstract=2960697">study of search and politics in seven nations</a> – which surveyed the United States, Britain, France, Germany, Italy, Poland and Spain in January 2017 – found these concerns to be overstated, if not wrong. In fact, many internet users trust search to help them find the best information, check other sources and discover new information in ways that can burst filter bubbles and open echo chambers. </p>
<h2>Surveying internet users</h2>
<p>We sought to learn directly from people about how they used search engines, social media and other sources of information about politics. Through funding from Google, we conducted an <a href="http://ssrn.com/abstract=2960697">online survey of more than 14,000 internet users in seven nations</a>. </p>
<p>We found that the fears surrounding search algorithms and social media are not irrelevant – <a href="https://www.sciencenews.org/blog/science-public/youve-probably-been-tricked-fake-news-and-dont-know-it">there are problems for some users some of the time</a>. However, they are exaggerated, creating unwarranted fears that could lead to inappropriate responses by users, regulators and policymakers. </p>
<h2>The importance of searching</h2>
<p>The survey findings demonstrate the importance of search results over other ways to get information. When people are looking for information, they very often search the internet. Nearly two-thirds of users across our seven nations said they use a search engine to look for news online at least once a day. They view search results as being as accurate and reliable as other key sources, like television news.</p>
<p><iframe id="pfmZV" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/pfmZV/1/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>In line with that general finding, a search engine is the first place internet users go online for information about politics. Moreover, those internet users who are very interested in politics, and who participate in political activities online, are the most likely to use a search engine like Bing or Google to find information online about politics.</p>
<p>But crucially, those same users engaged in search are also very likely to get information about politics on other media, exposing themselves to diverse sources of information, which makes them more likely to encounter diverse viewpoints. Further, we found that people who are interested and involved in politics online are more likely to double-check questionable information they find on the internet and social media, including by searching online for additional sources in ways that will pop filter bubbles and break out of echo chambers.</p>
<h2>Internet-savvy or not?</h2>
<p>It’s not just politically interested people who have these helpful search habits: People who use the internet more often and have more practice searching online do so as well.</p>
<p>That leaves the least politically interested people and the least skilled internet users as most susceptible to fake news, filter bubbles and echo chambers online. These individuals could <a href="http://dx.doi.org/10.1007/978-3-319-40548-3_74">benefit from support</a> and <a href="https://theconversation.com/the-challenge-facing-libraries-in-an-era-of-fake-news-70828">training in digital literacy</a>.</p>
<p>However, for most people, internet searches are critical for checking the reliability and validity of information they come across, whether online, on social media, on traditional media or in everyday conversation. Our research shows that these internet users find search engines useful for checking facts, discovering new information, understanding others’ views on issues, exploring their own views and deciding how to vote.</p>
<h2>International variations</h2>
<p>We found that people in different countries do vary in how much they trust and rely on the internet and searches for information. For example, internet users in Germany, and to a lesser extent those in France and the United Kingdom, are more trusting in TV and radio news, and more skeptical of searches and online information. Internet users in Germany rate the reliability of search engines lower than those in all the other nations, with 44 percent saying search engines are reliable, compared with 50 to 57 percent across the other six countries.</p>
<p><iframe id="nQXkq" class="tc-infographic-datawrapper" src="https://datawrapper.dwcdn.net/nQXkq/2/" height="400px" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>In Poland, Italy and Spain, people trust traditional broadcast media less and are more reliant on, and trusting of, internet and searching. Americans are in the middle; there were greater differences within European countries than between Europe as a whole and the U.S. American internet users were so much more likely to consult multiple sources of information that we called them “media omnivores.”</p>
<p>Internet users generally rely on a diverse array of sources for political information. And they display a healthy skepticism, leading them to question information and check facts. <a href="https://www.theguardian.com/technology/2016/nov/29/facebook-fake-news-problem-experts-pitch-ideas-algorithms">Regulating the internet</a>, as some have proposed, could undermine existing trust and introduce new questions about accuracy and bias in search results.</p>
<p>But panic over fake news, echo chambers and filter bubbles is exaggerated, and not supported by the evidence from users across seven countries.</p>
<p class="fine-print"><em><span>William H. Dutton received funding from Google through Michigan State University to conduct the survey on which these findings are based. However, Google did not design the study, questionnaire, analysis, or findings. The opinions are those of the author and not any organization that supported this research. </span></em></p>Concerns over filter bubbles and fake news are often based on anecdotal evidence. There is relatively little systematic research on the topic; a new survey finds widespread fears are unwarranted.William H. Dutton, Professor of Media and Information Policy, Michigan State UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/768642017-05-02T23:24:44Z2017-05-02T23:24:44ZAustralian Twitter is more diverse than you think<figure><img src="https://images.theconversation.com/files/167350/original/file-20170501-8926-99wi8y.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Tired of seeing the same thing on Twitter?</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>What are the major drivers of Twitter take-up, in Australia and elsewhere? Do we connect around shared interests, shared location, or pre-existing offline relationships? And when, in the eleven-year history of the platform, did these structures form?</p>
<p>These are the questions that guided a new, long-term study of the Australian national Twittersphere that my colleagues and I have undertaken. </p>
<p>Drawing on <a href="http://trisma.org/">TrISMA</a>, a major multi-institutional facility for social media analytics, we identified some 3.7 million Australian Twitter accounts in existence by early 2016, and captured the 167 million follower/followee connections between them.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/167138/original/file-20170428-15091-12mdr8t.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/167138/original/file-20170428-15091-12mdr8t.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/167138/original/file-20170428-15091-12mdr8t.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/167138/original/file-20170428-15091-12mdr8t.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/167138/original/file-20170428-15091-12mdr8t.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/167138/original/file-20170428-15091-12mdr8t.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/167138/original/file-20170428-15091-12mdr8t.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Clusters in the Australian Twittersphere.</span>
<span class="attribution"><a class="source" href="http://socialmedia.qut.edu.au/">Axel Bruns / QUT Digital Media Research Centre</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA</a></span>
</figcaption>
</figure>
<p>There are plenty of assumptions and not a great deal of reliable data about how we use social media. </p>
<p>Twitter, for example, is variously accused of being <a href="http://www.theaustralian.com.au/opinion/columnists/why-the-unbearable-darkness-of-the-twitsphere-has-made-me-quit-twitter/news-story/b44bfb77c50ec5d6f9ae28698b4b4ca4">a haven for leftist outrage</a> and <a href="https://www.buzzfeed.com/markdistefano/gday-pepe?utm_term=.bgpmwAbMm#.foDyn82Vy">a cesspool of alt-right fascists</a>. It is seen as <a href="https://theconversation.com/crisis-communication-saving-time-and-lives-in-disasters-through-smarter-social-media-50403">a crucial tool for crisis communication</a> and <a href="http://www.dailymail.co.uk/sciencetech/article-2020378/Facebook-Twitter-creating-vain-generation-self-obsessed-people.html">a place where millennials share photos of their lunch</a>. Surely, these can’t all be true.</p>
<p>Part of the problem here is that we all design our own filter bubbles. What two random users see on Twitter might be entirely different, depending on what accounts they choose to follow, as journalism researcher Paul Bradshaw <a href="https://onlinejournalismblog.com/2016/06/28/dont-blame-facebook-for-your-own-filter-bubble/">has put it</a>. </p>
<p>If all you ever see is food porn, perhaps you need to make some new connections. (Or perhaps that’s what you’re there for.) But if we could look beyond our own, personal networks, what would we see?</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=362&fit=crop&dpr=1 600w, https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=362&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=362&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=454&fit=crop&dpr=1 754w, https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=454&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/167139/original/file-20170428-15084-1f1nver.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=454&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">New Australian Twitter Accounts per Day.</span>
<span class="attribution"><span class="source">Axel Bruns / QUT Digital Media Research Centre</span></span>
</figcaption>
</figure>
<h2>How Twitter grew</h2>
<p>Data from our study show Twitter took off in Australia in 2009, some three years after its launch, and saw a fairly steady daily sign-up rate of 1,000-2,000 new accounts between 2010 and 2014. Growth has slowed since then, which may indicate market saturation. </p>
<p>There are a number of spikes in sign-ups: the series of natural disasters in early 2011 attracted users to the platform who recognised its role in crisis communication, and the political turmoil of 2013 also seems to have driven take-up.</p>
<p>A major spike in 2015 appears to coincide with the devastating Nepal earthquake, but we’ve yet to determine why that event would lead to new Twitter accounts being created in Australia.</p>
<p>To focus on the core parts of the network, we further filtered this to accounts that have at least 1,000 connections in the global Twittersphere, which left us with the 255,000 best-connected accounts. We visualised their network using <a href="http://gephi.org/">Gephi’s Force Atlas 2 algorithm</a>, which places accounts close to each other if they share many connections, and further apart if they are only poorly connected.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=450&fit=crop&dpr=1 600w, https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=450&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=450&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=566&fit=crop&dpr=1 754w, https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=566&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/167140/original/file-20170428-15086-pbqq3v.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=566&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Clusters in the Australian Twittersphere.</span>
<span class="attribution"><span class="source">Axel Bruns / QUT Digital Media Research Centre</span></span>
</figcaption>
</figure>
<h2>Shared interests</h2>
<p>The network map shows clear clustering tendencies. Dense regions (in bright yellow), where many accounts are closely connected, are separated from each other by lower-density spaces (in darker colours). We systematically examined these clusters, labelling them based on the overarching themes that emerged from an analysis of the profiles in each cluster. </p>
<p>The result is a kind of birds-eye view of the Twitter landscape, from politics to popular culture and from education to sports.</p>
<p>Accounts connecting around teen culture make up the largest part of this network: 61,000 of our 255,000 accounts. There are 26,000 aspirational accounts (including self-declared social media gurus, self-improvement and life-coaching practitioners, and others who sought to use Twitter for professional betterment). There are also 25,000 accounts around sports (including distinct sub-clusters for cycling and horse racing) and 17,000 accounts of netizens, technologists, and software developers.</p>
<p>Shared interests emerge as the central drivers of our connections on Twitter. For the most part, we follow others because of the topics they cover, not because they’re from the same city or state or because we already know them offline. An equivalent map for Facebook, where connections are much more strongly based on prior acquaintance, would likely look very different.</p>
<p>We further found that these accounts also arrived on Twitter at very different times: both the netizen and the aspirational accounts were created very early in the history of the platform. Fully half of the population in both these clusters had arrived on Twitter by mid-2010. </p>
<p>Sports took a year longer, and may well have been helped along by Twitter Australia itself <a href="http://www.theaustralian.com.au/media/twitter-flies-in-to-meet-leaders/story-e6frg996-1226523789033">as it reached out to key sporting codes</a> to get their teams and players signed up.</p>
<p>The teen culture accounts arrived a great deal later. It took until mid-2012 for half that cluster’s population to join – a second, separate Twitter adoption event following the first big influx of Australian users in 2009/10. We suspect active encouragement from key bands like One Direction and 5 Seconds of Summer to have been a major driver here.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=362&fit=crop&dpr=1 600w, https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=362&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=362&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=455&fit=crop&dpr=1 754w, https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=455&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/167141/original/file-20170428-11206-1ynfxc1.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=455&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">New Australian Accounts per Cluster per Month.</span>
<span class="attribution"><span class="source">Axel Bruns / QUT Digital Media Research Centre</span></span>
</figcaption>
</figure>
<p>In spite of Twitter’s reputation as a space for political debate and agitation, politics attracts only some 13,000 accounts (including 1,500 that form a separate, staunchly right-wing cluster). There’s a great deal more to Twitter than political argument.</p>
<p>But if all you ever see on Twitter is partisan bickering, there may be a reason: per capita, the political accounts are some of the most active in the Australian Twittersphere. Over their lifetimes, they’ve each posted an average of 7.2 tweets per day (and the accounts in the hard right cluster even managed 12.5 per day); in the turbulent first quarter of 2017, those averages are even higher. </p>
<p>Most of the other major cluster communities have managed less than half that work rate. Historically, only the teen culture accounts have been similarly active.</p>
<p>In the end, Twitter is what its users make it. Australian users have made it a diverse and dynamic place, even if they’re less aware of each other than they should be. </p>
<p>As users, we should step beyond our networks more often, to avoid becoming trapped in our own filter bubbles – and this goes doubly for politicians, journalists, and others who now treat their immediate Twitter networks as an instant source of popular opinion.</p>
<p>And as a company, Twitter has much work to do to enable its users to experience the full variety of networked communication and culture that the platform has to offer. Changes to how it recommends new accounts to follow, and how it reveals trending topics outside of our existing networks, could help a great deal in combatting the threat of getting stuck in your own filter bubble.</p>
<p>It doesn’t stop there, of course. We can only speculate what the equivalent networks for Facebook, Instagram, or Snapchat would look like, and what they might tell us about how people are using these platforms.</p><img src="https://counter.theconversation.com/content/76864/count.gif" alt="The Conversation" width="1" height="1" />
<h4 class="border">Disclosure</h4><p class="fine-print"><em><span>This research is supported by the ARC Future Fellowship project "Understanding Intermedia Information Flows in the Australian Online Public Sphere", and the ARC LIEF project "TrISMA: Tracking Infrastructure for Social Media Analysis."</span></em></p>Twitter is made up of numerous communities clustered around all manner of topics. If all you see is the same, it’s time to break out of your filter bubble.Axel Bruns, Professor, Creative Industries, Queensland University of TechnologyLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/743882017-03-21T09:30:31Z2017-03-21T09:30:31ZMy Country: a play about Brexit that tries to break the bubble but disappoints<figure><img src="https://images.theconversation.com/files/161603/original/image-20170320-9144-1w0upjx.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">My Country // Sarah Lee</span></span></figcaption></figure><p>Much has been written over recent months about the apparent bubble in which we are all now living, a kind of transparent hamster ball that encircles us and only serves to substantiate, endorse and inwardly broadcast a set of assumptions and prejudices that we already possess thanks to the quotidian role of social media. The left talks largely to the left and the right to the right, each endorsing their respective views with data sets and human narratives that operate at polar ends of a retweet/Facebook like/clickbait culture that both hot houses and self-perpetuates its primary concerns.</p>
<p>This idea has been used to explain both the <a href="https://www.theguardian.com/media/2016/jul/12/how-technology-disrupted-the-truth">Brexit phenomenon</a> and the rise of President Trump’s version of <a href="http://www.newstatesman.com/world/north-america/2016/11/voting-trump-and-brexit-what-working-class-revolt-really-about">Republicanism</a> in the US. The feelings of “real working people” (a much-invoked moniker for a portion of the electorate that a party is seeking to bring to the ballot box) have somehow got lost in the maelstrom of an alienating political discourse that cites elitism as the problem but is itself borne out of an elite, albeit a financial rather than intellectual one.</p>
<p>I was reminded of the bubble syndrome while sitting at the National Theatre taking in Carol Ann Duffy’s new verbatim drama <a href="https://www.nationaltheatre.org.uk/shows/my-country-uk-tour">My Country</a>, which is embarking on a national tour after a month run in London. The play, set on the night of last year’s Brexit referendum, articulates the national mood of the times through its cast of Brechtian characters who speak on behalf of their particular region or in some cases (such as the character Caledonian) for a whole nation. </p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/161611/original/image-20170320-9121-1r12izq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/161611/original/image-20170320-9121-1r12izq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/161611/original/image-20170320-9121-1r12izq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/161611/original/image-20170320-9121-1r12izq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/161611/original/image-20170320-9121-1r12izq.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/161611/original/image-20170320-9121-1r12izq.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/161611/original/image-20170320-9121-1r12izq.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Rule Britannia.</span>
<span class="attribution"><span class="source">My Country // Sarah Lee</span></span>
</figcaption>
</figure>
<p>Duffy and director Rufus Norris offer us a balanced set of views distilled from 70-plus interviews conducted up and down the country. But the show ultimately offers no deeper comprehension of the national psyche than watching an episode of Question Time or listening to a radio talkshow phone-in. It doesn’t quite break out of any bubble.</p>
<h2>Documentary theatre</h2>
<p>Verbatim theatre – where people’s words are transplanted to the stage in a documentary style format – can provide a powerful means of articulating contemporary concerns. Robin Soans’ <a href="http://www.telegraph.co.uk/culture/theatre/drama/3641278/Chilling-moving-and-mesmerising.html">Talking To Terrorists</a> (2005), for example, was a prescient <em>tour de force</em> that offered real insights into the mediating role of constructive dialogue. And performances of <a href="http://www.independent.co.uk/arts-entertainment/theatre-and-nothing-but-the-truth-1045299.html">The Colour of Justice</a> (1999), based on the Macpherson inquiry into the police investigation of the murder of Stephen Lawrence, similarly provided a unique, proximal experience for audiences to engage in a legal, social and political process that articulated the prejudices encountered by an entire community.</p>
<figure class="align-right ">
<img alt="" src="https://images.theconversation.com/files/161613/original/image-20170320-9117-70mrjj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=237&fit=clip" srcset="https://images.theconversation.com/files/161613/original/image-20170320-9117-70mrjj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=846&fit=crop&dpr=1 600w, https://images.theconversation.com/files/161613/original/image-20170320-9117-70mrjj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=846&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/161613/original/image-20170320-9117-70mrjj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=846&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/161613/original/image-20170320-9117-70mrjj.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=1063&fit=crop&dpr=1 754w, https://images.theconversation.com/files/161613/original/image-20170320-9117-70mrjj.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=1063&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/161613/original/image-20170320-9117-70mrjj.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=1063&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">My Country flyer.</span>
<span class="attribution"><span class="source">National Theatre</span></span>
</figcaption>
</figure>
<p>Norris, artistic director of the National Theatre, should be congratulated for his attempt to capture the voices of the nation at a pivotal moment in its history last year. But My Country somehow lacks the vitality of previous verbatim pieces. Interspersed speeches from politicians such as Nigel Farage and Boris Johnson feel clunkily dropped in and unnecessarily presented as embodied caricatures.</p>
<p><a href="http://www.telegraph.co.uk/culture/theatre/theatre-reviews/8453051/London-Road-National-Theatre-review.html">London Road</a> (2011), Norris’s previous encounter with verbatim theatre – the unlikely coupling of musical theatre with the 2006 Ipswich serial killings – was a far more successful enterprise, managing to break new ground in terms of both form and content. Interestingly, the highlights of My Country are located in its musical set pieces, where his directorial flair and experience are most in evidence. Here, the play moves more sharply into a popular mode that melds easy accessibility with an affective experience. This comes far closer to catching the national zeitgeist than many of the interviews themselves.</p>
<h2>A national theatre</h2>
<p>In response to our bubble reality, many involved in the arts are <a href="http://www.huffingtonpost.co.uk/dr-michael-petry/brexit-and-the-arts_1_b_11974492.html">questioning the role and function of their practices</a> as their own hamster balls have come sharply into focus. For Norris, part of the answer seems to be verbatim theatre: representing the voices of the public on stage.</p>
<p>Scrutinising the output of a national theatre at a time of rising nationalism is certainly a worthwhile activity. Guardian theatre critic Michael Billington recently <a href="https://www.theguardian.com/stage/theatreblog/2017/jan/30/national-theatre-new-season-classic-plays-rufus-norris">took Norris to task</a> for almost entirely rejecting the classic repertoire in his desire to promote new work. This severance with the past is dangerous, he argues: “We also see all around us the danger of living in a perpetual present.”</p>
<p>In a similar vein, one might also be tempted to read playwright <a href="https://www.theguardian.com/stage/2017/jan/29/david-hare-classic-british-drama-infected-radical-european-staging">David Hare’s attack</a> on the apparent influx of European directing styles on British theatre as part of a broader move of the theatrical establishment to shore up a sense of nationalism through the active promotion of a kind of cultural homogeneity. Does theatre, as one of our biggest cultural assets, deserve to be bordered and sheltered from foreign influences and distractions?</p>
<p>Ironically perhaps, My Country attempts to deliver the kind of state-of-the-nation play that Hare continues to hanker for. But the nation is not fixed and static. It is a dynamic entity that is currently deeply divided. While Duffy’s play is less concerned with exposing the friction engendered by these divisions, she does facilitate a multitude of voices that speak to a united moment that was anything but unifying. </p>
<p>The facilitation, however, feels ultimately flimsy and lacking in either radical intention or emotional insight. For a National Theatre that is seeking to probe the nation’s current psychological state of mind, there needs to be more. And so two questions for the National remain unanswered. What might an appropriate theatrical response to our times be? And how can we make such a response relevant for all?</p>
<hr>
<p><em>My Country is at the National Theatre until March 22, before going on a nationwide tour March 28 – July 1</em></p><img src="https://counter.theconversation.com/content/74388/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Mark O'Thomas does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Scrutinising the output of a national theatre at a time of rising nationalism is a worthwhile activity, but it needs either radical intention or emotional insight.Mark O'Thomas, Dean, Professor of Theatre and Performance, Newcastle UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/730592017-02-16T04:01:04Z2017-02-16T04:01:04ZHelp us restore trust in experts<p>The Conversation thrives on readers and republishers who help us share our content, but we don’t often ask <em>why</em> they do this.</p>
<p>Is it because they value academic expertise, facts and evidence, or is it because they agree with the sentiment expressed in the article? We hope it’s the former, but we know that’s not always the case.</p>
<p>For the most part we publish analysis, not opinion, aiming to inform rather than persuade. But today’s readers can self-select, filtering out content they disagree with and refusing to engage when they feel discussions have become too vitriolic for them to meaningfully participate. The challenge remains getting people out of their bubble.</p>
<p>Trust in institutions, including government and the media, continues to decline at a rate we should all be worried about. Trust in media dropped from 42% to 32% in Australia last year, according to <a href="http://www.edelman.com/trust2017/trust-asia-pacific-middle-east-africa/">Edelman’s Trust Barometer</a>. This global survey also found disturbing trends in attitudes toward experts, with “a person like me” increasingly considered to be on par, in terms of credibility, with a technical or academic expert. Some 59% of people surveyed would rather believe a search engine than a human editor, and more than half (53%) do not regularly listen to people or organisations they disagree with.</p>
<p>We’re keen to collaborate with more Australian media organisations to help restore some of the trust we’ve all lost.</p>
<p>One way publishers can do this is by republishing our content, including our <a href="https://theconversation.com/au/factcheck">FactChecks</a>, which we think are the most rigorous in the country. There’s also opportunity for more formal collaborations, like sharing our news list in advance and working together on editorial projects. We’ve been working closely with academics for six years and can help reporters identify the most current experts to help bring more evidence to the table.</p>
<p>And for you, our reader, we’re keen to hear how you share our content with your friends and family. How can we make it easier for the public to take in multiple points of view on issues that matter to them? The evidence tells us that simply <a href="https://theconversation.com/how-to-cut-through-when-talking-to-anti-vaxxers-and-anti-fluoriders-72504">shouting more facts</a> at people doesn’t work. Building trust and playing the long game does, something we will continue to do.</p>
<p>We’re also experimenting with different types of storytelling to share facts in different ways. Like our <a href="https://theconversation.com/comic-explainer-how-memory-works-64485">comic explainer</a> on memory.</p>
<p>It’s a complex issue, but simply ignoring it is not an option. If trust in the media continues to erode and people lose faith in experts, our democracies will continue to suffer.</p>
<p>One of the aims of <a href="https://theconversation.com/au/charter">our charter</a> is to provide the public with clarity and insight into society’s biggest problems. The loss of support for evidence-based decision making is one such problem.</p><img src="https://counter.theconversation.com/content/73059/count.gif" alt="The Conversation" width="1" height="1" />
We’re keen to collaborate with more Australian media organisations to help restore some of the trust we’ve all lost.Charis Palmer, Deputy Editor/Chief of StaffLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/699642016-12-12T03:40:16Z2016-12-12T03:40:16ZBreak out of your echo chamber: Technology arranges lunch with someone new<figure><img src="https://images.theconversation.com/files/149473/original/image-20161209-31391-1tc4z0w.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">Eating lunch together is increasingly a thing of the past – but it doesn't have to be.</span> <span class="attribution"><a class="source" href="https://unsplash.com/photos/KR2mdHJ5qMg">Luke Chesser</a></span></figcaption></figure><p>On average, Americans spend <a href="http://www.nytimes.com/2016/05/06/business/facebook-bends-the-rules-of-audience-engagement-to-its-advantage.html">50 minutes a day on Facebook</a>. That’s a lot of online socializing. It’s also about the same amount of time workers take for their <a href="http://officeteam.rhi.mediaroom.com/lunchbreaks">lunch break</a>. Yet there’s not nearly as much socializing then: 65 percent of Americans eat lunch at their desk, and 45 percent report eating lunch alone, the <a href="https://www.washingtonpost.com/news/wonk/wp/2015/08/18/eating-alone-is-a-fact-of-modern-american-life/">highest rate in over 50 years</a>. What if people spent their lunchtime connecting in person, rather than just virtually?</p>
<p>Modernity has not just affected our social connections at mealtimes – it has changed how we feed our minds. Many of us have been sucked into our own “echo chambers,” with large numbers of people <a href="http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016">getting news from similarly minded Facebook friends</a> – at least some of which is <a href="http://www.nytimes.com/2016/11/15/technology/google-will-ban-websites-that-host-fake-news-from-using-its-ad-service.html">fake</a>. Too often, the nuanced complexities of our friendships have been reduced to clever one-liners and carefully curated selfies. In this paradoxical state, we are so much more connected and alone at the same time.</p>
<p>Social media companies won’t solve these problems. Organizations like Facebook want to maximize user time spent in their ecosystem. More time means more clicks, which means more profits. <a href="https://www.wired.com/2016/11/facebook-echo-chamber/">Research shows</a> that people want to consume information that <a href="https://www.wired.com/2016/11/facebook-echo-chamber/">validates their preexisting ideological outlooks</a>, so it’s only sensible that Facebook serves up just that.</p>
<p>The solution is up to all of us. People must connect with each other, honestly, authentically, face to face. We believe that technology can help. As doctoral students at MIT, we developed a service that helps people connect – providing 50 minutes of in-person interaction for every five minutes of screen time.</p>
<h2>The opposite of dating apps</h2>
<p>A year and a half ago, we were disturbed by a series of suicides in the Cambridge community. The tragedies started a conversation about social isolation and the awkwardness of meeting new people. Informal polling of MIT community members at a wide range of quirky events, from hackathons and tea parties to “chocolate soirees,” revealed that many of the students attended events in hopes of making a new friend. Many of them also felt that a group setting was too impersonal to do that successfully.</p>
<p>Inspired, we set out to create a more personal alternative. It became a service called <a href="https://connected.mit.edu">Connect</a>, arranging platonic, face-to-face meetings between interesting people over lunch.</p>
<p>An “interesting” person is one whom we are likely to get along with personally, but who differs from us in some way – ideologically, demographically or socioeconomically. This approach stands in contrast to dating sites and social networking platforms. Instead of trying to find your perfect match, Connect is trying to find a person with similarities that comfort you, but also differences that intrigue you.</p>
<p>Connect users are asked a few profile questions to aid the matching algorithm. These questions capture aspects of people’s identities, including field of study, hobbies and interests, but they also include logistical questions like availability and food preferences. Connect then suggests a venue, time and a conversation starter. As users attend more lunches and provide the algorithm with feedback, it gradually learns more about the kind of attributes in other people that interest them.</p>
<p>An international MBA student may sign up, for example, and tell the system that he enjoys cooking and kayaking. He may get matched with a Ph.D. student in computer science, who is an avid bread-maker. The pair are similar enough – in terms of their level of education and food-related hobbies – that they’ll get along, but they’re also sufficiently different that they might not otherwise meet or share experiences. </p>
<p>A dating app would not match these two people because they’re both straight men, and because their backgrounds differ enough that they are unlikely to be romantically compatible. But they’re not asking Connect to find a life partner – just someone interesting with whom to share a meal. </p>
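<p>The matching logic described above – shared attributes for comfort, partial differences for intrigue, and feedback that gradually reweights what matters – can be sketched in a few lines of code. This is a minimal illustration, not Connect’s actual algorithm: the attribute names, the 0.4 “intrigue” bonus, and the feedback update rule are all hypothetical assumptions.</p>

```python
# Hypothetical sketch of "similar enough to get along, different enough
# to intrigue" matching. Not Connect's real algorithm: attribute names,
# the 0.4 intrigue bonus and the learning rate are illustrative choices.

def match_score(a, b, weights=None):
    """Score a candidate pairing of two profiles.

    Each profile maps an attribute to a set of values. Shared values add
    a full point of "comfort"; non-shared values add a smaller "intrigue"
    bonus, so the best matches overlap partially rather than completely.
    """
    weights = weights or {}
    score = 0.0
    for attr in set(a) | set(b):
        w = weights.get(attr, 1.0)
        va, vb = a.get(attr, set()), b.get(attr, set())
        score += w * (1.0 * len(va & vb) + 0.4 * len(va ^ vb))
    return score

def update_weights(weights, other_profile, rating, lr=0.1):
    """Nudge attribute weights based on post-lunch feedback.

    A rating above 3 (on a 1-5 scale) reinforces the matched person's
    attributes; a rating below 3 discourages them.
    """
    signal = (rating - 3) / 2  # map 1..5 onto -1..1
    for attr in other_profile:
        weights[attr] = weights.get(attr, 1.0) + lr * signal
    return weights

# The pairing from the example above: an MBA student who cooks and kayaks,
# and a CS PhD student who bakes bread.
mba = {"degree": {"mba"}, "hobbies": {"cooking", "kayaking"}}
phd = {"degree": {"phd_cs"}, "hobbies": {"cooking", "bread"}}
print(match_score(mba, phd))
```

<p>With these numbers, the pair scores higher than either an identical clone (no intrigue bonus beyond the overlap) or a total stranger (no comfort points), which is the intuition the article describes.</p>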
<h2>Reclaiming lunch</h2>
<p>It turns out that we like meeting interesting new people. The vast majority – 90 percent – of users over the past year have rated their interactions on the platform a 4 out of 5 or better. Aside from having a nice lunch, about a third of users report having made a lasting friend, someone they keep in touch with regularly.</p>
<p>The success of Connect has spurred us to continue exploring how technology can help build stronger in-person communities. Thanks to funding from <a href="https://connected.mit.edu/funding">various offices at MIT</a>, we have even managed to pay for lunches on campus, further lowering the barriers to meeting an interesting new person.</p>
<p>We plan to take the platform global early next year, so that administrators at schools around the world will be able to help their students, faculty, staff and alumni connect more directly. We are hopeful that from a supportive academic cradle, Connect can grow into a platform that helps anyone, student or otherwise, find their next best friend.</p>
<p>As platforms like Connect continue to grow, perhaps technology can help dismantle the echo chamber it helped create. Perhaps the social network of tomorrow will provide us opportunities to hear different perspectives, and to express ideas that aren’t hidden behind the shadow of anonymity – while also bringing back the diminished tradition of eating lunch together.</p><img src="https://counter.theconversation.com/content/69964/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Mohammad Ghassemi is a co-creator of Connect.</span></em></p><p class="fine-print"><em><span>Tuka Al Hanai is a co-creator of Connect. </span></em></p>Bringing back the diminished tradition of eating lunch together may be the solution.Mohammad Ghassemi, Ph.D. Candidate in Electrical Engineering and Computer Science, Massachusetts Institute of Technology (MIT)Tuka Al Hanai, Ph.D. Candidate in Electrical Engineering and Computer Science, Massachusetts Institute of Technology (MIT)Licensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/696642016-12-05T12:20:44Z2016-12-05T12:20:44ZThe filter bubble isn’t just Facebook’s fault – it’s yours<figure><img src="https://images.theconversation.com/files/148624/original/image-20161205-19369-13bcy4h.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><span class="source">Shutterstock</span></span></figcaption></figure><p>Following the shock results of Brexit and the Trump victory, <a href="http://www.theguardian.com/technology/2016/nov/10/facebook-fake-news-election-conspiracy-theories">a lot of attention</a> has focused on the role that Facebook might have played in creating online political ghettos in which false news can easily spread. Facebook now has serious political influence thanks to its development from a social networking tool into a primary source of news and opinions. <a href="http://www.nytimes.com/2016/11/14/technology/facebook-is-said-to-question-its-influence-in-election.html">And for many</a>, the way it manages this influence is in need of greater scrutiny. But to put the blame solely on the company is to overlook how people use the site, and how they themselves create a filter bubble effect through their actions. </p>
<p>Much of this debate has focused on the design of Facebook itself. The site’s personalisation algorithm, which is programmed to create a positive user experience, feeds people what they want. This creates what the CEO of viral content site Upworthy, Eli Pariser, calls “<a href="http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles">filter bubbles</a>”, which supposedly shield users from views they disagree with. People are increasingly turning to Facebook for their news – <a href="http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/">44% of US adults</a> now report getting news from the site – and fake news is not editorially weeded out. This means that misinformation can spread easily and quickly, hampering the chance people have for making informed decisions.</p>
<p>Over the last few weeks, there have been <a href="http://www.niemanlab.org/2016/11/the-forces-that-drove-this-elections-media-failure-are-likely-to-get-worse">frequent calls</a> for Facebook to address this issue. President Obama himself has <a href="http://www.theguardian.com/media/2016/nov/17/barack-obama-fake-news-facebook-social-media">weighed in on the issue</a>, warning of the perils that rampant misinformation can have for the democratic process.</p>
<p>Much of the debate around this, however, has had an element of technological determinism to it, suggesting that users of Facebook are at the mercy of the algorithm. In fact, our research shows that the actions of users themselves are still a very important element in the way that Facebook gets used. </p>
<p><a href="http://www.palgrave.com/us/book/9781137029300">Our research</a> has been looking specifically at how people’s actions create the context of the space in which they communicate. Just as important as the algorithm is how people use the site and shape it around their own communications. We’ve found that most users have an overwhelming view that Facebook is not ideally suited to political debate, and that posts and interactions should be kept trivial and light-hearted.</p>
<figure class="align-center ">
<img alt="" src="https://images.theconversation.com/files/148625/original/image-20161205-19414-1g0buo4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/148625/original/image-20161205-19414-1g0buo4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=400&fit=crop&dpr=1 600w, https://images.theconversation.com/files/148625/original/image-20161205-19414-1g0buo4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=400&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/148625/original/image-20161205-19414-1g0buo4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=400&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/148625/original/image-20161205-19414-1g0buo4.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=503&fit=crop&dpr=1 754w, https://images.theconversation.com/files/148625/original/image-20161205-19414-1g0buo4.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=503&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/148625/original/image-20161205-19414-1g0buo4.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=503&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px">
<figcaption>
<span class="caption">Shutting down conversation.</span>
<span class="attribution"><span class="source">Shutterstock</span></span>
</figcaption>
</figure>
<p>This isn’t to say that people don’t express political opinions on Facebook. But for many people there’s a reluctance to engage in discussion, and a sense that anything that might be contentious is better handled by face-to-face conversation. People report that they fear the online context will lead to misunderstandings because of the way that written communication lacks some of the non-linguistic cues of spoken communication, such as tone of voice and facial gestures.</p>
<p>There’s strong evidence in our research that people are actually exposed to a great deal of diversity through Facebook. This is because their network includes people from all parts of their life, a finding that <a href="http://poq.oxfordjournals.org/content/early/2016/03/21/poq.nfw006.short">echoes other research</a>. In this respect, the algorithm doesn’t have a marked influence on the creation of filter bubbles. But because they often want to avoid conflict, people report ignoring or blocking posts, or even unfriending people, when confronted with views with which they strongly disagree.</p>
<p>They also report taking care of what they say themselves so as not to antagonise people such as family members or work colleagues whose views differ from theirs, but whose friendship they wish to maintain. And finally, they talk of making a particular effort to put forward a positive persona on social media, which again stops them from engaging in debate which might lead to argument.</p>
<h2>Not so easy to fix</h2>
<p>The idea that algorithms are responsible for filter bubbles suggests it should be easy to fix (by getting rid of the algorithms), which <a href="http://www.scientificamerican.com/article/facebook-s-problem-is-more-complicated-than-fake-news/">makes it an appealing explanation</a>. But this perspective ignores the part played by users themselves, who effectively create their own filter bubbles by withdrawing from political discussions and hiding opinions they disagree with.</p>
<p>This isn’t done with the intention of sifting out diversity but is instead due to a complex mix of factors. These include the perceived purpose of Facebook, how users want to present themselves in an effectively public form, and how responsible they feel for the diverse ties that make up their online network.</p>
<p>The fact that manipulation by the algorithm isn’t the only issue here means that other solutions, for example raising people’s awareness of the possible consequences that their online actions have, can help encourage debate. We have to recognise that the impact of technology comes not just from the innovations themselves but also from how we use them, and that solutions have to come from us as well.</p><img src="https://counter.theconversation.com/content/69664/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Research shows the lack of diverse political views on your Facebook feed is more down to self-censorship than any algorithm.Philip Seargeant, Senior Lecturer in Applied Linguistics, The Open UniversityCaroline Tagg, Lecturer in Applied Linguistics and English Language, The Open UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/690332016-12-02T16:54:03Z2016-12-02T16:54:03ZThree ways Facebook could reduce fake news without resorting to censorship<figure><img src="https://images.theconversation.com/files/148323/original/image-20161201-25685-vzmcdl.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">
</span> <span class="attribution"><a class="source" href="https://www.shutterstock.com/pic-525622510/stock-vector-flat-design-concepts-big-data-filter.html">Filter via shutterstock.com</a></span></figcaption></figure><p>The public gets a lot of its <a href="http://www.journalism.org/2016/05/26/news-use-across-social-media-platforms-2016/">news and information from Facebook</a>. Some of it is fake. That presents a problem for the site’s users, and for the company itself.</p>
<p>Facebook cofounder and chairman Mark Zuckerberg said the company will find ways to address the problem, though he didn’t acknowledge its severity. And without apparent irony, he made this announcement in a <a href="https://www.facebook.com/zuck/posts/10103269806149061">Facebook post</a> surrounded – at least for some viewers – <a href="http://www.businessinsider.com/twitter-cofounder-ev-williams-on-facebook-fake-news-problem-2016-11">by fake news items</a>.</p>
<p>Other technology-first companies with similar power over how the public informs itself, such as Google, have worked hard over the years to <a href="http://searchengineland.com/library/google/google-panda-update">demote low-quality information</a> in their search results. But Facebook has not made similar moves to help users. </p>
<p>What could Facebook do to meet its social obligation to sort fact from fiction for the <a href="http://www.pewinternet.org/fact-sheets/social-networking-fact-sheet/">70 percent of internet users</a> who access Facebook? If the site is increasingly where people are getting their news, what could the company do without taking up the mantle of being a final arbiter of truth? My work as a professor of information studies suggests there are at least three options.</p>
<h2>Facebook’s role</h2>
<p>Facebook says it is a <a href="http://www.wsj.com/articles/facebook-leaders-call-it-a-tech-company-not-media-company-1477432140">technology company, not a media company</a>. The company’s primary motive is profit, rather than a <a href="https://www.americanpressinstitute.org/journalism-essentials/what-is-journalism/purpose-journalism/">loftier goal</a> like producing high-quality information to help the public act knowledgeably in the world.</p>
<p>Nevertheless, posts on the site, and the surrounding conversations both online and off, are increasingly <a href="https://theconversation.com/misinformation-on-social-media-can-technology-save-us-69264">involved with our public discourse</a> and the nation’s political agenda. As a result, the corporation has a social obligation to use its technology to advance the common good.</p>
<p>Discerning truth from falsehood, however, can be daunting. Facebook is not alone in <a href="https://www.facebook.com/zuck/posts/10103269806149061">raising concerns about its ability</a> – and that of other tech companies – to judge the quality of news. The director of <a href="http://factcheck.org/">FactCheck.org</a>, a nonprofit fact-checking group based at the University of Pennsylvania, told Bloomberg News that <a href="https://www.bloomberg.com/news/articles/2016-11-23/facebook-s-quest-to-stop-fake-news-risks-becoming-slippery-slope">many claims and stories aren’t entirely false</a>. Many have <a href="http://www.politifact.com/punditfact/statements/2015/oct/01/viral-image/viral-image-wrongly-accuses-clinton-stealing/">kernels of truth</a>, even if they are very misleadingly phrased. So what can Facebook really do?</p>
<h2>Option 1: Nudging</h2>
<p>One option Facebook could adopt involves using existing lists identifying prescreened reliable and <a href="https://www.washingtonpost.com/posteverything/wp/2016/11/18/my-fake-news-list-went-viral-but-made-up-stories-are-only-part-of-the-problem/">fake-news sites</a>. The site could then alert those who want to share a troublesome article that its source is questionable. </p>
<p>One developer, for example, has created an extension to the Chrome browser <a href="http://mashable.com/2016/11/15/bs-detector-chrome-extension-facebook/">that indicates when a website</a> you’re looking at might be fake. (He calls it the “B.S. Detector.”) In a 36-hour hackathon, a group of college students <a href="http://mashable.com/2016/11/19/facebook-fib-extension-fake-news/">created a similar Chrome browser extension</a> that indicates whether the website the article comes from is on a list of verified reliable sites, or is instead unverified.</p>
<p>These extensions present their alerts while people are scrolling through their newsfeeds. At present, neither of these works directly as part of Facebook. Integrating them would provide a more seamless experience, and would make the service available to all Facebook users, beyond just those who installed one of the extensions on their own computer.</p>
<p>The company could also use the information the extensions generate – or their source material – to warn users before they share unreliable information. In the world of software design, this is known as a “<a href="http://www.jstor.org/stable/40041817">nudge</a>.” The warning system monitors user behavior and notifies people or gives them some feedback to help alter their actions when using the software. </p>
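A pre-share nudge of this kind boils down to checking a link’s domain against a prescreened list before the post goes out. The Python sketch below is purely illustrative: the flagged domains and function name are hypothetical, not part of any extension or Facebook feature described above.

```python
from urllib.parse import urlparse

# Hypothetical list of domains flagged as unreliable by a prescreened source list.
FLAGGED_DOMAINS = {"fakenews.example", "clickbait.example"}

def nudge_before_share(url):
    """Return a warning message if the link's domain is on the flagged list."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    if domain in FLAGGED_DOMAINS:
        return f"Heads up: {domain} has been flagged as a questionable source. Share anyway?"
    return None  # No nudge needed; the share proceeds as usual.
```

Crucially, and in keeping with the nudge philosophy, the function only returns a warning; the decision to share stays with the user.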
<p>This has been done before, for other purposes. For example, colleagues of mine here at Syracuse University <a href="http://repository.cmu.edu/cgi/viewcontent.cgi?article=1335&context=heinzworks">built a nudging application</a> that monitors what Facebook users are writing in a new post. It pops up a notification if the content they are writing is something they might regret, such as an angry message with swear words. </p>
<p>The beauty of nudges is the gentle but effective way they remind people about behavior to help them then change that behavior. Studies that have tested the use of nudges to <a href="http://nudges.org/">improve healthy behavior</a>, for example, find that people are more likely to change their diet and exercise based on gentle reminders and recommendations. Nudges can be effective because they give people control while also giving them useful information. Ultimately the recipient of the nudge still decides whether to use the feedback provided. Nudges don’t feel coercive; instead, they’re potentially empowering.</p>
<h2>Option 2: Crowdsourcing</h2>
<p>Facebook could also use the power of crowdsourcing to help evaluate news sources and indicate when news that is being shared has been evaluated and rated. One important challenge with fake news is that it plays to how our brains are wired. We have mental shortcuts, called <a href="http://us.macmillan.com/thinkingfastandslow/danielkahneman/9780374533557">cognitive biases</a>, that help us make decisions when we don’t have quite enough information (we never do), or quite enough time (we never do). Generally these shortcuts work well for us as we make decisions on everything from which route to drive to work to what car to buy. But, occasionally, they fail us. Falling for fake news is one of those instances.</p>
<p>This can happen to anyone – even me. In the primary season, I was following a Twitter hashtag on which then-primary candidate Donald Trump tweeted. A message appeared that I found sort of shocking. I retweeted it with a comment mocking its offensiveness. A day later, I realized that the tweet was from a parody account whose handle looked identical to Trump’s, but with one letter changed. </p>
<p>I missed it because I had fallen for <a href="https://theconversation.com/confirmation-bias-a-psychological-phenomenon-that-helps-explain-why-pundits-got-it-wrong-68781">confirmation bias</a> – the tendency to overlook some information because it runs counter to my expectations, predictions or hunches. In this case, I had disregarded that little voice that told me this particular tweet was a little too over the top for Trump, because I believed he was capable of producing messages even more inappropriate. Fake news preys on us the same way.</p>
<p>Another problem with fake news is that it can travel much farther than any correction that might come afterwards. This is similar to the challenges that have always faced newsrooms when they have reported erroneous information. Although they publish corrections, often the people originally exposed to the misinformation never see the update, and therefore don’t know what they read earlier is wrong. Moreover, people tend to hold on to the first information they encounter; <a href="http://dx.doi.org/10.1007/s11109-010-9112-2">corrections can even backfire</a> by repeating wrong information and reinforcing the error in readers’ minds.</p>
<p>If people evaluated information as they read it and shared those ratings, the truth scores, like the nudges, could be part of the Facebook application. That could help users decide for themselves whether to read, share or simply ignore. One challenge with crowdsourcing is that people can game these systems to try and drive biased outcomes. But, the beauty of crowdsourcing is that the crowd can also rate the raters, just as happens on Reddit or with Amazon’s reviews, to reduce the effects and weight of troublemakers. </p>
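One way to picture the “rate the raters” idea is that each rating gets weighted by the rater’s own crowd-assigned reputation, so flagged troublemakers count for less. The following is a hypothetical sketch of that aggregation step, not a description of any existing system:

```python
def truth_score(ratings):
    """Aggregate (rating, rater_reputation) pairs into one score.

    Ratings run from 0.0 (judged false) to 1.0 (judged true); reputation
    weights, themselves assigned by the crowd rating the raters, shrink
    the influence of accounts trying to game the outcome.
    """
    total_weight = sum(weight for _, weight in ratings)
    if total_weight == 0:
        return 0.5  # No trusted signal yet: treat the claim as unknown.
    return sum(rating * weight for rating, weight in ratings) / total_weight
```

In this toy model, a high-reputation rater who judges a story true outweighs a low-reputation account voting it false, which is the whole point of letting the crowd rate the raters.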
<h2>Option 3: Algorithmic social distance</h2>
<p>The third way that Facebook could help would be to reduce the algorithmic bias that presently exists in Facebook. The site primarily shows posts from those with whom you have engaged on Facebook. In other words, the Facebook algorithm creates what some have called a <a href="https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles">filter bubble</a>, an online news phenomenon that has <a href="https://www.ft.com/content/3e2ee254-bf96-11dc-8052-0000779fd2ac">concerned scholars</a> for decades now. If you are exposed only to people with ideas that are like your own, it leads to <a href="http://onlinelibrary.wiley.com/doi/10.1111/1467-9760.00148/full">political polarization</a>: Liberals get even more extreme in their liberalism, and conservatives get more conservative. </p>
<p>The filter bubble creates an “echo chamber,” where similar ideas bounce around endlessly, but new information <a href="https://theconversation.com/misinformation-on-social-media-can-technology-save-us-69264">has a hard time finding its way in</a>. This is a problem when the echo chamber blocks out corrective or fact-checking information.</p>
<p>If Facebook were to open up more news to come into a person’s newsfeed from a random set of people in their social network, it would increase the chances that new information, alternative information and contradictory information would flow within that network. The average number of <a href="http://www.pewresearch.org/fact-tank/2014/02/03/6-new-facts-about-facebook/">friends in a Facebook user’s network is 338</a>. Although many of us have friends and family who share our values and beliefs, we also have acquaintances and strangers who are part of our Facebook network who have diametrically opposed views. If Facebook’s algorithms brought more of those views into our networks, the filter bubble would be more porous.</p>
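In code, making the filter bubble more porous could be as simple as reserving a slice of the feed for posts sampled at random from the whole network rather than ranked by past engagement. This is a hypothetical sketch of that mixing step (the function and its parameters are illustrative, not Facebook’s), assuming both lists are already available:

```python
import random

def diversified_feed(ranked_posts, network_posts, mix_ratio=0.2, seed=None):
    """Swap a fraction of the engagement-ranked feed for posts drawn
    uniformly at random from the user's full network."""
    rng = random.Random(seed)
    n_random = int(len(ranked_posts) * mix_ratio)
    kept = ranked_posts[: len(ranked_posts) - n_random]
    # Sample only from network posts not already kept, to avoid duplicates.
    pool = [post for post in network_posts if post not in kept]
    return kept + rng.sample(pool, min(n_random, len(pool)))
```

Even a small `mix_ratio` would routinely surface posts from acquaintances the ranking algorithm has learned to hide, which is exactly the contradictory information an echo chamber filters out.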
<p>All of these options are well within the capabilities of the engineers and researchers at Facebook. They would empower users to make better decisions about the information they choose to read and to share with their social networks. As a leading platform for information dissemination and a generator of social and political culture through talk and information sharing, Facebook need not be the ultimate arbiter of truth. But it can use the power of its social networks to help users gauge the value of items amid the stream of content they face.</p><img src="https://counter.theconversation.com/content/69033/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jennifer Stromer-Galley does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>If the site is increasingly where people are getting their news, what could the company do without taking up the mantle of being a final arbiter of truth?Jennifer Stromer-Galley, Professor of Information Studies, Syracuse UniversityLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/691132016-11-22T03:14:51Z2016-11-22T03:14:51ZHow to bridge the political divide at the holiday dinner table<figure><img src="https://images.theconversation.com/files/146824/original/image-20161121-4552-1r8gkxu.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">A time to join with close ones and, perhaps, open a dialogue?</span> <span class="attribution"><a class="source" href="https://www.flickr.com/photos/quinn/8356557483/in/photolist-dJrxpi-8WgKQ3-6VMrk-8WgKUf-48waiq-5FjBxC-aLYFqr-ibYyjK-aKDMd8-tETXR-5Jx2mt-aLYNR6-7mh1J9-aKDLWa-dGHz33-dGHyL5-5JGFWN-aKDH74-aKwpmr-4r86Gn-sYukG-5Fpkau-taDXL-7jc3k4-8WgNSE-48Zytd-tfR2H-48Co8D-aKKzBx-7jgQo8-dvSt8o-4bufRL-aKKA7i-dvvqXy-8VMbYv-aKKvmF-8WbgqD-8VV18M-t3Vdd-7idh8k-5FFf3g-5FyQX5-24RmEp-48yzuo-7jVBjB-493D9h-7jfHaD-4rcckY-aKKy6r-48Fw2Z">quinn/flickr</a>, <a class="license" href="http://creativecommons.org/licenses/by-nc/4.0/">CC BY-NC</a></span></figcaption></figure><p>We are a divided nation; that is an understatement. What’s more, we increasingly hear we are living in our own “bubble” or echo chamber that differing views cannot penetrate. To correct the problem, many are calling for people to reach out, to talk and above all, to listen. That is all well and good, but what are we supposed to talk about? We can’t hope to listen without a topic for finding common ground.</p>
<p>In my view, there are (at least) two prominent issues in this political season that can serve as a bridge across our political divides. The first is that the political and economic system needs fixing because it favors those with special status or access. The second is that income inequality is reaching an intolerable level.</p>
<p>Might these two topics help mend the <a href="http://www.nytimes.com/2016/11/16/us/political-divide-splits-relationships-and-thanksgiving-too.html">unpleasant Thanksgiving or Christmas dinners</a> that many Americans are dreading? Instead of avoiding that unpleasantness, it may be a time to embrace it. </p>
<h2>Period of flux</h2>
<p>There is an opportunity before us right now. While unpleasant, we live in a period of flux when beliefs can shift. This is how social change happens – in fits and spurts – something I’ve studied in looking at how <a href="https://www.amazon.com/Culture-Shapes-Climate-Change-Debate-ebook/dp/B00U2YM70K/ref=la_B001JS4ZD0_1_1?s=books&ie=UTF8&qid=1479581172&sr=1-1">culture shapes public debates</a> around climate change. </p>
<p>American physicist and historian <a href="https://www.amazon.com/Structure-Scientific-Revolutions-Thomas-Kuhn/dp/0226458083">Thomas Kuhn</a> first described this process as moving between periods of stability and periods of chaos. In the former, one set of beliefs dominates all other beliefs as the “paradigm.” But, periods of flux begin when tumultuous events upset this paradigm and a chaotic search for a new paradigm begins. Social scientists call this process of rapid social change “<a href="http://amj.aom.org/content/37/5/1141.abstract">punctuated equilibrium</a>.” The key is to push for change when things are most chaotic. </p>
<p>Any corporate change agent knows that is easiest to push for change when things are at their worst. As a quotation sometimes attributed to Winston Churchill notes, “Never let a good crisis go to waste.” Try thinking about that over your Thanksgiving dinner.</p>
<h2>We all live in worlds of our own design</h2>
<p>Our country has broken into deeply divided <a href="https://www.ted.com/talks/jonathan_haidt_can_a_divided_america_heal">tribes</a>: left versus right, urban versus rural, the coasts versus the middle. We have become suspicious of each other, <a href="https://www.amazon.com/Culture-Shapes-Climate-Change-Debate/dp/0804794227">questioning motives before considering ideas</a>. </p>
<p>Facts, it seems, have become less important than the political and ideological affiliation of their source. We seem to consider evidence only when it is accepted or, ideally, presented by those who represent our tribe and we dismiss information that is advocated by sources that represent groups whose values we reject. </p>
<p>This divide is ever deeper today because of social media, a relatively new force in our society. Social media has “democratized knowledge” because the gatekeepers for determining the quality of information have been taken down. But social media also creates the conditions for what has been termed <a href="http://www.nytimes.com/2016/11/19/business/media/exposing-fake-news-eroding-trust-in-real-reporting.html?_r=0">fake news</a> to run rampant. </p>
<p>Web-based media sites, and increasingly social media services such as Twitter, Facebook and LinkedIn, allow us to find information to support any position we seek to hold and find a community of people that will share those positions – a phenomenon known as <a href="https://www.amazon.com/Judgment-Managerial-Decision-Making-Bazerman/dp/1118065700">confirmation bias</a>. As a result, the internet doesn’t always make us more informed, but it often makes us more certain. We self-create what Eli Pariser calls our “<a href="https://www.amazon.com/Filter-Bubble-Personalized-Changing-Think/dp/0143121235">filter bubbles</a>.” </p>
<p>In one vivid illustration of this phenomenon, <a href="https://www.semanticscholar.org/paper/Political-Polarization-on-Twitter-Conover-Ratkiewicz/8c817595ceb00c3786a2bec6b33500eba24848b3">a research study</a> of <a href="https://www.semanticscholar.org/paper/Political-Polarization-on-Twitter-Conover-Ratkiewicz/8c817595ceb00c3786a2bec6b33500eba24848b3/figure/1">250,000 tweets</a> during the six weeks leading up to the 2010 U.S. congressional midterm elections found that liberal and conservative populations primarily retweeted only politically similar tweets. </p>
<h2>To engage is not to acquiesce</h2>
<p>A study by the <a href="http://www.people-press.org/2016/06/22/partisanship-and-political-animosity-in-2016/">Pew Research Center</a> found that “49 percent of Republicans say they’re outright afraid of the Democratic Party, with 55 percent of Democrats saying they fear the GOP.” This part of the cultural divide is self-reinforcing: we fear the other so we don’t engage; we don’t engage so we fear the other even more. </p>
<p>To break this loop, we need to do what columnist <a href="http://www.nytimes.com/2016/11/16/opinion/donald-trump-help-heal-the-planets-climate-change-problem.html%22%22">Thomas Friedman</a> calls “principled engagement.” While some may choose to sit on the sidelines or hope that one side or the other fails, there is too much at stake. Others may choose to stand resolute in their defiance of engagement, and in doing so, stake the “<a href="http://irasilver.org/wp-content/uploads/2011/08/Reading-Movement-funding-Haines.pdf">radical flank</a>” and provide a constructive tension in the debates to come. </p>
<p>But some can choose to build bridges, accepting that the mere act of engagement does not mean acceptance or endorsement of the other side, or even that we like them. It is merely a recognition that we have common concerns and interests. Standing in the middle of warring tribes is not easy, as it invites attacks from both sides, but someone has to try to find common ground.</p>
<h2>Where can we start the conversation?</h2>
<p>While <a href="http://www.nationalreview.com/article/436898/income-inequality-clinton-trump-both-wrong">not all experts</a> agree that we have an income inequality problem, the numbers are sobering and, more importantly, many voters on both the left and right believe what those numbers tell us. </p>
<p>Overall, between 1979 and 2013 the share of income earned by the U.S.’ richest 1 percent <a href="http://www.forbes.com/sites/nathanielparishflannery/2016/08/15/trump-nation-does-income-inequality-now-define-the-u-s-economy/#1a6e9e02649c">increased</a> from 10 percent to 20.1 percent of the total economic pie. Between 2009 and 2013 the top 1 percent of U.S. earners captured 85.1 percent of total income growth. Among members of the Organisation for Economic Co-operation and Development (OECD), the U.S. trails only Turkey, Mexico and Chile when it comes to inequality. </p>
<p>This is the source of the disgust and disaffection that many American voters feel – a vein that both Donald Trump and Bernie Sanders tapped into. At its core, it represents a distrust of our political and economic institutions. Some direct their ire at government, some at the corporate sector, and both hold great disdain for the seemingly corrupt relationship between the two.</p>
<p>So, what should you talk about over your holiday dinner? Well, to begin, if there is absolutely no hope of common ground, stay away from politics and talk about football. </p>
<p>But if there is an opportunity to build bridges, maybe the topics of common concern to start the conversation include: the need to invest in upgrading our highways, bridges and transportation infrastructure; the corrupting influence of money in politics and possibilities for campaign finance reform; the practice of influence peddling and the proposal for time limitations on when government officials can become lobbyists; programs to increase opportunities for upward mobility like making college education more affordable; or programs to help ease the burden that workers feel when they are displaced by technology, automation, globalization or policy shifts. It may not be easy or pleasant at first, but it’s at least a start. And maybe you’ll be surprised.</p>
<p>One positive outcome of this election is that everyone seems to be engaged (even though a large percentage of Americans didn’t vote). We just need to find the right way to engage. In my religious tradition, it is said, “blessed are the peacemakers.” Whether or not you share my tradition, I think we can agree that we need more peacemakers. </p>
<p>Healing the country won’t come from Washington. It will come from each of us at our family dinner table, local Kiwanis Club, town hall, workplace and sports league. It will come from each of us as we work to open up our own individual bubbles and remember, in the words of the late <a href="http://www.foxnews.com/entertainment/2016/11/10/singer-songwriter-leonard-cohen-dead-at-82.html">Leonard Cohen</a>: “Ring the bells that still can ring; Forget your perfect offering; There is a crack in everything; That’s how the light gets in.”</p><img src="https://counter.theconversation.com/content/69113/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Andrew J. Hoffman does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>Many are dreading meeting relatives for Thanksgiving after Donald Trump’s surprise victory. A student of the cultural divide around climate change offers tips for opening dialogues on politics.Andrew J. Hoffman, Holcim (US) Professor at the Ross School of Business and School of Environment and Sustainability, University of MichiganLicensed as Creative Commons – attribution, no derivatives.tag:theconversation.com,2011:article/632172016-09-09T14:28:42Z2016-09-09T14:28:42ZHow personalisation could be changing your identity online<figure><img src="https://images.theconversation.com/files/137061/original/image-20160908-25257-17npvfa.jpg?ixlib=rb-1.1.0&rect=0%2C5%2C3840%2C2155&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">(Microsoft) Windows to the soul?</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-426990427/stock-photo-close-up-of-beautiful-young-woman-eyes-looking-at-monitor-working-with-computer-laptop-monitor-blue-light-is-reflected-in-her-eyes-evening-woman-freelancer-working-shopping-online-wa.html?src=SA-pWWKqAoiz2qUXMsZIpA-1-11">Youproduction/Shutterstock</a></span></figcaption></figure><p>Wherever you go online, someone is trying to personalise your web experience. Your preferences are pre-empted, your intentions and motivations predicted. That toaster you briefly glanced at three months ago keeps returning to haunt your browsing in tailored advertising sidebars. And it’s not a one-way street. In fact, the quite impersonal mechanics of some personalisation systems may not only influence how we see the world, but how we see ourselves. </p>
<p>It happens every day, to all of us while we’re online. <a href="https://www.facebook.com/help/327131014036297/">Facebook’s News Feed</a> attempts to deliver tailored content that <a href="https://www.facebook.com/help/327131014036297/">“most interests”</a> individual users. Amazon’s recommendation engine uses personal data tracking combined with other users’ browsing habits to suggest <a href="https://www.cs.umd.edu/%7Esamir/498/Amazon-Recommendations.pdf">relevant products</a>. Google <a href="https://googleblog.blogspot.co.uk/2009/12/personalized-search-for-everyone.html">customises search results</a>, and much more: for example, personalisation app <a href="https://www.google.com/intl/en-GB/landing/now/">Google Now</a> seeks to “give you the information you need throughout your day, before you even ask”. Such personalisation systems don’t just aim to provide relevance to users; through targeted marketing strategies, they also generate profit for many free-to-use web services. </p>
<p>Perhaps the best-known critique of this process is the <a href="http://www.penguinrandomhouse.com/books/309214/the-filter-bubble-by-eli-pariser/9780143121237/">“filter bubble”</a> theory. Proposed by internet activist <a href="https://www.opensocietyfoundations.org/people/eli-pariser">Eli Pariser</a>, this theory suggests that personalisation can detrimentally affect web users’ experiences. Instead of being exposed to universal, diverse content, users are algorithmically delivered material that matches their pre-existing, self-affirming viewpoints. The filter bubble therefore poses a problem for democratic engagement: by restricting access to challenging and diverse points of view, users are unable to participate in collective and informed debate. </p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/6_sim_Wc3mY?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
</figure>
<p>Attempts to find evidence of the filter bubble have produced mixed results. <a href="http://arxiv.org/pdf/1405.1486v1.pdf">Some studies</a> have shown that personalisation can indeed lead to a “myopic” view of a topic; <a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1321962">other studies</a> have found that in different contexts, personalisation can actually help users discover common and diverse content. My research suggests that personalisation does not just affect how we see the world, but how we view ourselves. What’s more, the influence of personalisation on our identities may not be due to filter bubbles of consumption, but because in some instances online personalisation is not very “personal” at all.</p>
<h2>Data tracking and user pre-emption</h2>
<p>To understand this, it is useful to consider how online personalisation is achieved. Although personalisation systems track our individual web movements, they are not designed to “know” or identify us as individuals. Instead, these systems collate users’ real-time movements and habits into mass data sets, and look for patterns and correlations between users’ movements. The found patterns and correlations are then <a href="http://bds.sagepub.com/content/2/2/2053951715608406">translated back</a> into identity categories that we might recognise (such as age, gender, language and interests) and that we might fit into. By looking for mass patterns in order to deliver personally relevant content, personalisation is in fact based on <a href="http://firstmonday.org/article/view/3344/2766">a rather impersonal process</a>. </p>
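To make that impersonality concrete, a sketch like the following matches an individual’s browsing frequencies against aggregate cohort patterns, so the “personal” label a user receives is really just the best-correlating crowd category. The cohort names and profile numbers here are entirely made up for illustration:

```python
# Hypothetical aggregate profiles built from mass data, not from any one user.
COHORT_PROFILES = {
    "women 24-30, fitness-oriented": {"gyms": 0.8, "recipes": 0.6, "stocks": 0.1},
    "commuters interested in stocks": {"gyms": 0.1, "recipes": 0.2, "stocks": 0.9},
}

def assign_cohort(user_visits):
    """Return the cohort whose aggregate profile best matches this user's
    topic-visit frequencies (a crude dot-product similarity)."""
    def similarity(profile):
        return sum(profile.get(topic, 0.0) * freq
                   for topic, freq in user_visits.items())
    return max(COHORT_PROFILES, key=lambda cohort: similarity(COHORT_PROFILES[cohort]))
```

Note that the function never “knows” the individual: it only finds which mass pattern they most resemble, which is why the resulting targeting can feel both eerily personal and entirely wrong.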
<p>When the filter bubble theory first emerged in 2011, Pariser argued that one of the biggest problems with personalisation was that users did not know it was happening. Nowadays, despite objections to data tracking, <a href="https://www.asc.upenn.edu/news-events/publications/tradeoff-fallacy-how-marketers-are-misrepresenting-american-consumers-and">many users are aware</a> that they are being tracked in exchange for use of free services, and that this tracking is used for forms of personalisation. Far less clear, however, are the specifics of what is being personalised for us, how and when. </p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=359&fit=crop&dpr=1 600w, https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=359&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=359&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=451&fit=crop&dpr=1 754w, https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=451&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/137068/original/image-20160908-25253-5k2f5j.jpg?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=451&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Data gathering: less complex than we might think.</span>
<span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-111392354/stock-photo-detailed-planet-earth-at-night-with-embossed-continents-earth-is-surrounded-by-a-luminous-network-representing-the-major-air-routes-based-on-real-data-elements-of-this-image-furnished-b.html?src=vQ1_2R4Rq3XrzJFD1HPUpg-1-33">Anton Balazh/Shutterstock</a></span>
</figcaption>
</figure>
<h2>Finding the ‘personal’</h2>
<p>My research suggests that some users assume their experiences are being personalised to complex degrees. In an in-depth qualitative study of 36 web users, some female users who saw advertising for weight loss products on Facebook reported that they assumed Facebook had profiled them as overweight or fitness-oriented. In fact, these weight loss ads were delivered generically to women aged 24-30. However, because users can be unaware of the impersonal nature of some personalisation systems, such targeted ads can have a detrimental impact on how these users view themselves: to put it crudely, they must be overweight, because Facebook tells them they are. </p>
<p>It’s not just targeted advertising that can have this impact: in an ethnographic and longitudinal study conducted with a handful of 18- and 19-year-old Google Now users, I found that some participants assumed the app was capable of personalisation to an extraordinarily complex extent. Users reported that they believed Google Now showed them stocks information because Google knew their parents were stockholders, or that Google (wrongly) pre-empted a “commute” to “work” because participants had once lied about being over school age on their YouTube accounts. It goes without saying that this small-scale study does not represent the engagements of all Google Now users: but it does suggest that for these individuals, the predictive promises of Google Now were almost infallible. </p>
<p>In fact, <a href="https://www.amazon.co.uk/Sociology-Monsters-Essays-Technology-Domination/dp/0415071399">critiques of user-centred design</a> suggest that the reality of Google’s inferences is much more impersonal: Google Now assumes that its <a href="http://sth.sagepub.com/content/29/1/30.abstract">“ideal user”</a> does – or at least should – have an interest in stocks, and that all users are workers who commute. Such critiques highlight that it is these assumptions which largely structure Google’s personalisation framework (for example through the app’s adherence to <a href="https://www.google.com/search/about/learn-more/now/">predefined “card” categories</a> such as “Sports”, which during my study only allowed users to ‘follow’ men’s rather than women’s UK football clubs). However, rather than questioning the app’s assumptions, my study suggests that participants placed themselves outside the expected norm: they trusted Google to tell them what their personal experiences should look like. </p>
<p>Though these might seem like extreme examples of impersonal algorithmic inference and user assumption, the fact that we cannot be sure what is being personalised, when or how is a more common problem. To me, these user testimonies highlight that the tailoring of online content has implications beyond the fact that it might be detrimental for democracy. They suggest that unless we begin to understand that personalisation can at times operate via highly impersonal frameworks, we may be putting too much faith in personalisation to tell us how we should behave, and who we should be, rather than vice versa.</p><img src="https://counter.theconversation.com/content/63217/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Tanya Kant received funding from the Arts and Humanities Research Council as part of her Doctoral studies (2012-2015). </span></em></p>Attempts to model your web experience led to fears of an echo chamber effect, but rather than reinforcing your sense of self, the process might be altering it.Tanya Kant, Lecturer in Media and Cultural Studies, University of SussexLicensed as Creative Commons – attribution, no derivatives.