Polling companies – The Conversation

What will pollsters do after 2016?

<figure><img src="https://images.theconversation.com/files/146132/original/image-20161115-31138-1uzu85t.jpg?ixlib=rb-1.1.0&q=45&auto=format&w=496&fit=clip" /><figcaption><span class="caption">What will polling look like in the future?</span> <span class="attribution"><a class="source" href="http://www.shutterstock.com/pic-299573114/">Person taking survey via shutterstock.com</a></span></figcaption></figure><p>Clinton defeated Trump much like <a href="http://www.chicagotribune.com/news/nationworld/politics/chi-chicagodays-deweydefeats-story-story.html">Dewey defeated Truman</a>. Both election results were dramatic surprises because pre-election polls created expectations that didn’t match the final outcomes.</p>
<p>Many polls were very accurate. For example, the polling averages in <a href="http://www.realclearpolitics.com/epolls/2016/president/va/virginia_trump_vs_clinton_vs_johnson_vs_stein-5966.html">Virginia</a>, <a href="http://www.realclearpolitics.com/epolls/2016/president/co/colorado_trump_vs_clinton_vs_johnson_vs_stein-5974.html">Colorado</a> and <a href="http://www.realclearpolitics.com/epolls/2016/president/az/arizona_trump_vs_clinton_vs_johnson_vs_stein-6087.html">Arizona</a> were within 0.1 percentage points of the election outcome.</p>
<p>That said, <a href="http://www.nytimes.com/interactive/2016/11/13/upshot/putting-the-polling-miss-of-2016-in-perspective.html">many polls</a> missed the mark in 2016. Polls of Wisconsin in particular performed <a href="http://www.realclearpolitics.com/epolls/2016/president/wi/wisconsin_trump_vs_clinton_vs_johnson_vs_stein-5976.html">very poorly</a>, suggesting Clinton was ahead by 6.5 percentage points before her ultimate loss by 1 point.</p>
<p>If polls are going to remain a <a href="http://www.aapor.org/Education-Resources/Reports/Polling-and-Democracy.aspx">major part of the democratic process</a> both in the United States and globally, pollsters have a professional duty to be as accurate as possible.</p>
<p>How will the polling industry improve accuracy after the 2016 election? The first step is to identify sources of error in polling.</p>
<h2>Potential sources of polling error</h2>
<p>When poll results are reported, they come with a <a href="http://www.pewresearch.org/fact-tank/2016/09/08/understanding-the-margin-of-error-in-election-polls/">margin of error</a> – saying the poll is accurate within plus-or-minus a few percentage points. Those margins are the best-case scenarios. They account for statistically expected error, but not entirely for <a href="http://ropercenter.cornell.edu/support/polling-fundamentals-total-survey-error/">several other sources of error inherent</a> in every poll.</p>
<p>Chief among these sources are the questions we ask, how we collect the data, how we figure out whom to ask and how we interpret the results. Each of these deserves a look.</p>
<p>The first two categories – question wording and data collection – are likely not the source of the systemic problems we saw in 2016. For one thing, pollsters have <a href="http://dx.doi.org/10.1093/poq/nfh008">good techniques to test questions in advance</a> and develop good standards. Interviewers may occasionally misread questions, but this is both rare and not likely systematic enough to cause problems outside of a few surveys each election cycle.</p>
<h2>Sampling errors</h2>
<p>The nastiest of all errors for pollsters happen in sampling – determining which people should be asked the poll’s questions. These errors are both the hardest to detect and the most likely to cause major problems across many polls. </p>
<p>At the most basic level, sampling errors happen when the people being polled are not in fact representative of the wider population. For example, an election poll of Alabama should not include people who are citizens of Mississippi.</p>
<p>It is essentially impossible to have a poll with perfect sample selection. Even with our best efforts at random sampling, not all individuals have an equal probability of selection because some are more likely to respond to pollsters than others.</p>
<p>Sampling errors could have crept into 2016 polls in several ways. First, <a href="http://www.people-press.org/2012/05/15/assessing-the-representativeness-of-public-opinion-surveys/">far fewer people</a> are willing to respond to surveys today than in previous years. That’s in large part because <a href="http://www.pewresearch.org/2010/04/14/is-caller-id-is-increasing-non-response-rates-in-your-surveys/">people are more likely to screen their phone calls</a> than in the past.</p>
<p>Young people and those who aren’t interested in politics are particularly hard to reach. Those who did respond to pollsters may not have been representative of the wider group. Pollsters have ways to adjust their findings to account for these variations. One common technique is weighting. But these adjustments can still fall short: a single young black Trump supporter had a <a href="http://www.latimes.com/politics/la-na-pol-daybreak-poll-questions-20161013-snap-story.html">measurable effect</a> on one poll’s results because of the large weight his responses carried.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/cnXfmOwUwQI?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Weighting survey results, explained.</span></figcaption>
</figure>
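<p>The weighting idea the video describes can be sketched in a few lines of code. This is a minimal, hypothetical example (the group shares and support figures below are invented for illustration, not drawn from any real poll): respondents from an underrepresented group are weighted up so that the sample matches known population shares, and the weighted estimate shifts accordingly.</p>

```python
# Minimal post-stratification weighting sketch (hypothetical numbers).
# Suppose the population is 30% aged 18-29, but only 15% of respondents
# are that young. Each young respondent then counts for 0.30/0.15 = 2.0
# in the weighted estimate.

population_share = {"18-29": 0.30, "30+": 0.70}
sample_share     = {"18-29": 0.15, "30+": 0.85}

weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical support for a candidate within each group:
support = {"18-29": 0.42, "30+": 0.56}

# The unweighted estimate reflects the skewed sample mix;
# the weighted estimate recovers the population mix.
unweighted = sum(sample_share[g] * support[g] for g in support)
weighted   = sum(sample_share[g] * weights[g] * support[g] for g in support)

print(round(unweighted, 3))  # 0.539
print(round(weighted, 3))    # 0.518
```

<p>The sketch also shows why one respondent can move a poll: when a group has very few respondents, each one carries a large weight, so a single unusual respondent – like the young Trump supporter above – can visibly shift the topline number.</p>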
<h2>Who is a ‘likely voter,’ anyway?</h2>
<p>General population surveys, such as those of all adult residents of a geographic area, are not particularly prone to sampling errors. This is because <a href="http://factfinder.census.gov/faces/nav/jsf/pages/index.xhtml">U.S. Census Bureau data</a> tell us the characteristics of any given community. Therefore, we can choose samples and weight responses so that they reflect the specific population.</p>
<p>Election “horse-race” polls are more difficult, primarily because pollsters must first determine which people are actually going to vote. But voter turnout in the United States is <a href="http://dx.doi.org/10.1016/j.electstud.2005.09.002">voluntary and volatile</a>. Pollsters do not know in advance how many members of each politically relevant demographic group will actually turn out to vote.</p>
<p>One way pollsters can seek to identify likely voters is to include several questions in the poll that help them decide whose responses to include in the final analysis. Though the big surprises on election night came from polls biased against Trump, polls also were biased against Clinton in states such as <a href="http://www.realclearpolitics.com/epolls/2016/president/nv/nevada_trump_vs_clinton_vs_johnson-6004.html">Nevada</a>.</p>
<p>When looking back at 2016 polling problems, some pollsters may find that they were too restrictive in <a href="http://www.pewresearch.org/files/2016/01/PM_2016-01-07_likely-voters_FINAL.pdf">identifying likely voters</a>, which <a href="http://fivethirtyeight.blogs.nytimes.com/2012/07/19/does-romney-have-an-edge-from-likely-voter-polls/">often</a> favors Republicans. Others may have been too lax, which generally favors Democrats. The challenge we will face, though, is that a likely voter screening technique that worked well in 2016 might not work well in 2020 because the <a href="https://www.census.gov/content/dam/Census/library/publications/2015/demo/p25-1143.pdf">electorate will change</a>.</p>
<h2>Interpretation challenges</h2>
<p>A major problem polls faced in 2016 was not in their data specifically, but in <a href="https://theconversation.com/reports-of-the-death-of-polling-have-been-greatly-exaggerated-68504">how those data were interpreted</a>, either by pollsters themselves or by the media. At the end of the day, polls are but rough estimates of public opinion. They are the best estimates we have available, but they are still estimates – ballpark figures, not certainties.</p>
<p>Many people expect polls to be highly accurate, and they often are – but how the public often thinks of accuracy is different from how pollsters do. Imagine an election poll that showed a one-point lead for a Democrat, and had a margin of error of four percentage points. If the Republican actually wins the election by one point, many people would think the poll was wrong, off by two points. But that’s not the case: The pollster actually said the race was too close to call given typical margins of error and somewhat <a href="http://abcnews.go.com/Politics/undecided-voters-unpredictable-year-experts/story?id=41946563">unpredictable undecided voters</a>.</p>
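<p>The margin of error quoted with a poll typically comes from the standard sampling formula for a proportion. A rough sketch (assuming simple random sampling, which real polls only approximate, and ignoring the other error sources discussed above):</p>

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from a simple
    random sample of size n. Real polls carry additional error beyond
    this statistical minimum."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 600 respondents with an evenly split race:
moe = margin_of_error(0.5, 600)
print(round(100 * moe, 1))  # 4.0 (percentage points)
```

<p>Note that this margin applies to each candidate’s share separately; the margin on the <em>lead</em> – the difference between two shares – is roughly twice as wide, which is why a one-point lead with a four-point margin is statistically indistinguishable from a tie.</p>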
<p>Organizations that aggregate polls – such as <a href="http://fivethirtyeight.com/">FiveThirtyEight</a>, <a href="http://www.nytimes.com/section/upshot">the Upshot</a> and <a href="http://elections.huffingtonpost.com/pollster">Huffington Post Pollster</a> – have added to this tendency. They combine many polls into one complex statistic, which they then argue is more accurate than any one poll on its own.</p>
<p>Those poll aggregators have been <a href="http://www.huffingtonpost.com/simon-jackman/pollster-predictive-perfo_b_2087862.html">accurate in the past</a>, which led the public to rely on them more heavily than it probably should have. Without an expectation of extremely accurate polling, the surprise of election night would have been far less dramatic.</p>
<p>Personally, I paid less attention to aggregators and more attention to a handful of <a href="http://www.ipspr.sc.edu/publication/Link.htm">high-quality polls</a> in each swing state. As a result, I entered election night realizing that most swing states were really too close to call – despite some aggregators’ claims to the contrary.</p>
<h2>Changes in the context of the race</h2>
<p>Technically speaking, polls are designed to measure opinion at the particular point in time during which interviews were conducted. In practice, however, they are used to gauge opinion in the future – on Election Day, which is usually a week or two after most organizations stop conducting polls.</p>
<p>As a result, late shifts in public opinion won’t always be apparent in polls. For example, many polls were conducted before <a href="http://www.nytimes.com/2016/10/29/us/politics/fbi-hillary-clinton-email.html">the announcements by FBI Director James Comey</a> about Hillary Clinton’s emails.</p>
<p>A shift in public opinion after a poll is taken is not technically an error. But as happened this year, unpredictable events like the Comey announcements can cause polling averages to differ from the actual election outcome.</p>
<h2>‘Secret’ Trump voters?</h2>
<p>It will take time to assess the extent of <a href="http://www.mcclatchydc.com/news/politics-government/election/article98915057.html">supposed “secret” Trump voters</a>, those people who did not appear in polls as Trump voters but did in fact vote for him. Pollsters will need several months to determine whether these voters are explained more by <a href="http://www.vox.com/the-big-idea/2016/11/6/13540646/poll-shifts-misleading-clinton-leads-trump">sampling errors</a>, such as Trump voters being less likely to answer the phone, <a href="http://www.politico.com/story/2016/11/poll-shy-voters-trump-230667">than by</a> people being <a href="http://www.people-press.org/2016/08/03/few-clinton-or-trump-supporters-have-close-friends-in-the-other-camp/#how-open-are-voters-with-their-candidate-preferences">embarrassed about their vote intention</a>. Still, <a href="https://theconversation.com/voters-embarrassment-and-fear-of-social-stigma-messed-with-pollsters-predictions-68640">pollsters need to do more</a> to test this potential form of social desirability bias.</p>
<p>When the 2016 polling postmortem is done, I suspect we will find few “secret” Trump voters were lying to pollsters out of political correctness. Rather, we’ll discover a group of Trump voters who simply didn’t normally take surveys. For example, Christians who believe the Bible is the inerrant word of God are often underrepresented in surveys. It’s not because they are ashamed of their faith. It’s because <a href="http://dx.doi.org/10.1093/socrel/68.1.83">they don’t like to talk to survey researchers, a form of sampling bias</a>.</p>
<h2>Polling after 2016</h2>
<p>Pollsters were aware of the challenges facing them in the 2016 election season. Most notably, they identified declining response rates – fewer people willing to be asked polling questions. They reported that concern, and others, in a <a href="https://doi.org/10.1017/S104909651600144X">poll of academic pollsters</a> I conducted in 2015 with Kenneth Fernandez of the College of Southern Nevada and Maggie Macdonald of Emory University. </p>
<p>Many pollsters (73 percent) in our survey were using the internet in some capacity, a sign they were willing to try new survey methods. A majority (55 percent) of pollsters in our sample agreed that poll aggregators increased interest in survey research among the public and the media, an opinion suggesting a win-win for both aggregators and pollsters. However, some (34 percent) also agreed poll aggregators helped to give low-quality surveys legitimacy.</p>
<p>Many pollsters have embraced an industry-wide <a href="http://www.aapor.org/Standards-Ethics/Transparency-Initiative/Latest-News.aspx">transparency initiative</a> that will include revealing their methods for determining who is a likely voter, and weighting their responses to reflect the population. The polling industry will figure out what happened in places like Wisconsin, but surveys are a complex process and disentangling hundreds of surveys across 50 states will not be immediate. The American Association of Public Opinion Research, the largest professional association of pollsters in the country, has already <a href="https://www.aapor.org/Publications-Media/Press-Releases/AAPOR-to-Examine-2016-Presidential-Election-Pollin.aspx">convened a group of survey methodologists</a> to examine the 2016 results.</p>
<figure>
<iframe width="440" height="260" src="https://www.youtube.com/embed/U1MYM35qUr8?wmode=transparent&start=0" frameborder="0" allowfullscreen=""></iframe>
<figcaption><span class="caption">Surveys are a complex process.</span></figcaption>
</figure>
<p>Polls remain a valuable resource for democracy. Without polls we would base our understanding of elections more on “hunches” and <a href="http://dx.doi.org/10.2307/420860">guesses based on rough trends</a>. We would know little about why people support a given candidate or policy. And we might see <a href="http://www.jstor.org/stable/2131766">more traumatic major swings</a> in the partisan composition of our leaders.</p>
<p>If political polls were weather forecasts, they would be good at saying whether the chance of rain is high or low, but they would not be good at declaring with confidence that the temperature will be 78 degrees instead of 75 degrees. In modern politics with narrow margins of victory, what causes someone to win an election is closer to a minor change in temperature than an unexpected deluge. If I’m planning a large outdoor event, I would still be <a href="http://www.esa.doc.gov/economic-briefings/value-government-weather-and-climate-data">better off</a> with an imperfect forecast than a nonexistent perfect prediction.</p><img src="https://counter.theconversation.com/content/68544/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Jason Husser does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.</span></em></p>

Pollsters must be as accurate as possible. How will they address the challenges revealed in the 2016 election, and other changes in the coming years?

Jason Husser, Director of the Elon University Poll, Elon University

Licensed as Creative Commons – attribution, no derivatives.

Polling history: 40 years of British views on ‘in or out’ of Europe

<p>At Ipsos MORI, we have been asking people in Britain how they would vote in a referendum on membership since 1977. During this time, both pro- and anti-European views have spent time in the majority – but there have been some dramatic swings from side to side.</p>
<p>The UK’s testy relationship with a united Europe goes back further, to the early 1970s when Ted Heath’s government <a href="http://news.bbc.co.uk/onthisday/hi/dates/stories/january/1/newsid_2459000/2459167.stm">took Britain </a> into the European Economic Community without a vote on January 1 1973. Gallup polls <a href="https://books.google.co.uk/books/about/The_Gallup_International_Public_Opinion.html?id=r8LZAAAAMAAJ&redir_esc=y">initially</a> found the public almost evenly divided on the decision, but by the start of the following year there was a two-to-one majority believing the country had been wrong to join. </p>
<p>The Labour leader, Harold Wilson, who became prime minister for the second time after the February 1974 general election, promised a referendum on whether Britain should remain a member – but first, he was going to renegotiate the terms of British membership. <a href="https://theconversation.com/camerons-eu-wish-list-what-can-he-get-and-when-42286">Sound familiar</a>?</p>
<p>In February 1975, Gallup found that 41% of people said they would vote to leave in an immediate referendum and only 33% to stay in. But Gallup also asked a follow-up question: how would people vote if the government negotiated new terms for UK membership and said that they thought it was in Britain’s interests to stay in? In that case, it turned out that 50% would vote to stay and only 22% to leave – an 18% swing. </p>
<p>And that, of course, is what happened. In March 1975, the renegotiation was completed, parliament endorsed it and all the major party leaders <a href="https://theconversation.com/selling-a-new-deal-in-europe-what-the-remain-campaign-can-learn-from-1975-56177">recommended that Britain should stay in</a>. In June, the voters opted by 67% to 33% to do so. (The final Gallup poll <a href="https://books.google.co.uk/books/about/British_Political_Facts.html?id=u_epbwAACAAJ&redir_esc=y">had forecast</a> a 68-32 result.) </p>
<h2>Growing dissatisfaction</h2>
<p>It was soon after this that MORI began to publish regular polls in the newspapers, including <a href="https://www.ipsos-mori.com/researchpublications/researcharchive/2435/European-Union-membership-trends.aspx">periodic polls on membership of the European Community</a> – or, as we called it to reflect the universal usage of the time, the “Common Market”. </p>
<p>As James Callaghan replaced Harold Wilson and his government slid toward eventual defeat in the “<a href="http://news.bbc.co.uk/1/hi/business/7598647.stm">Winter of Discontent</a>”, the Common Market became steadily less popular. By March 1979 the voters were clearly regretting their 1975 decision, with 60% saying they would vote to leave in a referendum and only 32% to stay. A year later, with Margaret Thatcher now prime minister, the gap was even wider: 65% to 26%.</p>
<p>Opposition to Britain’s membership of the European bloc has never subsequently reached the levels at which it peaked in the first years of the Thatcher premiership. It was the Labour Party which was committed to leaving in the early 1980s. Thatcher argued for, and eventually achieved, changes to the European Community budget involving <a href="https://theconversation.com/the-uks-eu-rebate-explained-58019">substantial rebates</a> to Britain which reduced the level of the country’s net funding contribution. The public warmed once more to the communities: opinion was definitely in favour, 47% to 39%, in 1987.</p>
<figure class="align-center zoomable">
<a href="https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=45&auto=format&w=1000&fit=clip"><img alt="" src="https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&fit=clip" srcset="https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=45&auto=format&w=600&h=445&fit=crop&dpr=1 600w, https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=30&auto=format&w=600&h=445&fit=crop&dpr=2 1200w, https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=15&auto=format&w=600&h=445&fit=crop&dpr=3 1800w, https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=45&auto=format&w=754&h=559&fit=crop&dpr=1 754w, https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=30&auto=format&w=754&h=559&fit=crop&dpr=2 1508w, https://images.theconversation.com/files/127515/original/image-20160621-13005-1ro1epj.png?ixlib=rb-1.1.0&q=15&auto=format&w=754&h=559&fit=crop&dpr=3 2262w" sizes="(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px"></a>
<figcaption>
<span class="caption">Changing views on the prospect of a Brexit.</span>
<span class="attribution"><span class="source">Ipsos Mori</span>, <span class="license">Author provided</span></span>
</figcaption>
</figure>
<p>After this, “get out” was not ahead in a poll again until 1999 – although the gap was very narrow throughout the run-up to Tony Blair’s first election victory in 1997. But that is not to say that the 1990s were a period of broadly stable support for the European project. European issues were central to domestic political debate in Britain in this period, causing bitter divisions that were a crucial factor in condemning John Major’s government to certain defeat by Blair.</p>
<p>Our many polls at this time suggested that the lead of those wanting to “stay in” reflected a willingness to remain in a Common Market but that political and monetary union across Europe were much less popular prospects. In all the time <a href="https://www.ipsos-mori.com/researchpublications/researcharchive/78/Joining-The-Euro-Trends-Since-1991.aspx">we polled on joining the single currency</a>, from 1991 until 2007, we never in even a single poll found more people in favour than opposed.</p>
<p>In the ten years that Tony Blair was prime minister, British attitudes to the EU fluctuated between reasonably comfortable majorities for staying in and narrow leads for the “get out” camp. At one point there was a swing of 12% in just three months, from a 53%-32% lead for “stay” to 46%-43% in favour of “leave” between June and September 2000 – as Blair was telling voters that Britain should be playing a role “at the heart of Europe”. This was just after the launch of the euro, which the British public felt had not been a success and would continue to fail, and after Denmark voted against joining it. </p>
<p>But the biggest factor in this shift may have been <a href="https://www.ipsos-mori.com/researchpublications/researcharchive/1647/Public-Attitudes-Towards-Europe-And-The-Euro.aspx">distrust</a> of the then-chancellor of the exchequer, Gordon Brown. </p>
<p>Our only poll on EU membership during Brown’s own premiership, <a href="https://www.ipsos-mori.com/researchpublications/researcharchive/241/Northern-Rock-Metric-Measurements-And-The-EU-Constitutional-Treaty.aspx">taken in September 2007</a> before the shine had worn off his “honeymoon period” and before the global financial crisis, found the public once more in favour of staying in by a wide majority. The EU had recently <a href="http://news.bbc.co.uk/1/hi/uk/6988521.stm">dropped its attempts</a> to force Britain to replace its last few Imperial measurements with metric ones, and most of the public seemed to be pleased with this, although few admitted it made much difference either way to whether they supported EU membership.</p>
<h2>Reaction to European expansion</h2>
<p>More importantly, though, this was before the introduction of a new dimension into the argument – that of the effect of EU membership on immigration policy. Poland had joined the EU in 2004 and that had immediately been followed by an influx of Polish workers into Britain under laws allowing free movement of labour. Britons had been strongly in <a href="https://www.ipsos-mori.com/researchpublications/researcharchive/2948/Britains-Future.aspx">favour of Poland joining the EU</a> after the fall of the Berlin Wall, but when it came to the point, some proved to be much less keen on Poles coming to Britain to take jobs.</p>
<p>Nevertheless, even though immigration was already becoming prominent among the “most important issues facing the country” as measured in our <a href="https://www.ipsos-mori.com/researchpublications/researcharchive/2905/Issues-Index-2012-onwards.aspx">Issues Index polls</a>, few were yet explicitly making the link with EU membership. </p>
<p>Under Cameron’s premiership, the polls have swung first against EU membership then more recently in its favour. From the 2015 general election to May 2016, all of our polls found more wanting to remain than to leave; but <a href="https://www.ipsos-mori.com/researchpublications/researcharchive/2435/European-Union-membership-trends.aspx">our poll in the first half of June</a> had “leave” back in the lead. But as the history of British attitudes to Europe tells us, such swings are by no means unprecedented, and there is no guarantee that the lines will not cross again.</p><img src="https://counter.theconversation.com/content/61250/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>Roger Mortimore works for Ipsos MORI.</span></em></p>

The wane and wax of Euroscepticism in Britain.

Roger Mortimore, Director of Political Analysis at Ipsos MORI and Professor of Public Opinion and Political Analysis, King's College London

Licensed as Creative Commons – attribution, no derivatives.

Brexit campaign is doomed – if bookmakers are right again

<p>As we edge closer to the EU referendum on June 23, the <a href="http://www.ibtimes.co.uk/eu-referendum-polls-show-yes-no-voters-almost-evenly-poised-1552574">latest opinion polls</a> put the
Remain and Leave campaigns either neck and neck or at least close together. </p>
<p>But the reputation of opinion polls <a href="http://www.bbc.co.uk/news/uk-politics-35308129">has plummeted</a> following their abject failure to predict the winner of last year’s general election. According to a <a href="http://www.southampton.ac.uk/news/2016/01/polling-enquiry.page">recent independent review</a> by Professor Patrick Sturgis of the University of Southampton, inadequate sampling procedures led to biased estimates of party support.</p>
<p>Prediction markets, which are often based on betting odds, are an increasingly popular alternative for predicting election outcomes. When you look at their past performance, they have been <a href="https://www.researchgate.net/profile/Robert_Forsythe/publication/2836398_Results_from_a_Dozen_Years_of_Election_Futures_Markets_Research/links/02e7e5150a9e57a0fe000000.pdf">relatively successful</a>. Where opinion polling tends to be irregular and noisy because of the different sampling methods used by the various companies involved, betting data is collected continuously and on a consistent basis.</p>
<p>Prediction markets successfully forecast the outcome of the <a href="http://www.bbc.co.uk/news/events/scotland-decides/results">Scottish referendum</a> of 2014, for example. Whereas the opinion polls <a href="http://whatscotlandthinks.org/opinion-polls">suggested</a> the outcome was uncertain and increasingly hard to call nearer the vote, the betting odds always <a href="https://theconversation.com/hard-evidence-indyref-why-the-bookies-expect-no-to-win-31855">suggested that</a> the probability of a majority vote for independence was quite small. </p>
<p>So what are the odds for the Brexit referendum? The most recent data, as you can see from the chart below, suggests that the probability of a Leave vote is around 30%, with the chance of a Remain vote being correspondingly around 70%. </p>
<p><iframe id="tc-infographic-238" class="tc-infographic" height="610" src="https://cdn.theconversation.com/infographics/238/33b50a32b76a46a5e309241d24d47e9c755bf389/site/index.html" width="100%" style="border: none" frameborder="0"></iframe></p>
<p>These data are based on <a href="http://www.oddschecker.com">www.oddschecker.com</a>, which lists more than 20 companies which have offered odds on the referendum at different times. The most active is <a href="https://www.matchbook.com">Matchbook</a>, which frequently offers multiple small variations in its odds during a single day. The odds from the different companies are transformed into estimates of the average daily probability for each outcome, going back as far as last May if you click the “All” box on the chart. </p>
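<p>The transformation from odds to probabilities is simple arithmetic. A hedged sketch (the odds below are illustrative, not actual referendum prices from any bookmaker): a decimal price implies a probability of 1/price, and because a bookmaker’s implied probabilities deliberately sum to more than 1 (the “overround”, which is its margin), they are normalized to sum to 1.</p>

```python
def implied_probabilities(decimal_odds):
    """Convert decimal betting odds to normalized implied probabilities,
    removing the bookmaker's overround (margin)."""
    raw = {k: 1.0 / price for k, price in decimal_odds.items()}
    total = sum(raw.values())  # > 1 because of the bookmaker's margin
    return {k: p / total for k, p in raw.items()}

# Illustrative prices, not actual 2016 referendum odds:
odds = {"Remain": 1.40, "Leave": 3.20}
probs = implied_probabilities(odds)
print({k: round(v, 3) for k, v in probs.items()})
# {'Remain': 0.696, 'Leave': 0.304}
```

<p>With these illustrative prices the implied probabilities come out near the 70%/30% split described above; averaging such normalized probabilities across bookmakers and across each day yields a chart like the one shown here.</p>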
<h2>Events, dear boy …</h2>
<p>You can also see the timing of some key events which might have been expected to influence the odds, such as when David Cameron <a href="http://www.bbc.co.uk/news/uk-politics-35621079">announced</a> the referendum date. When an event doesn’t make much difference – that announcement did not – it might be because it was expected and has therefore already been discounted by punters.</p>
<p>On the other hand, <a href="http://www.theguardian.com/politics/2016/mar/18/iain-duncan-smith-resigns-from-cabinet-over-disability-cuts">Iain Duncan Smith’s resignation</a> was arguably a surprise, but it had little impact on the odds. This suggests that those placing bets did not feel this event would have a significant effect on the referendum outcome. </p>
<p>Similarly, Tory leadership pretender Boris Johnson’s <a href="http://www.independent.co.uk/news/uk/politics/boris-johnson-confirms-he-will-campaign-for-uk-to-leave-eu-referendum-a6887596.html">decision</a> to throw his weight behind the Leave campaign had little impact.</p>
<p>The only good news for Brexit supporters is that there has been some whittling away of the advantage enjoyed by the Remain campaign since the beginning of March. When the referendum was called, the probability of leaving was around 29%. It fell to 27% in early March but has been rising since, before levelling off in the last few days of that month. </p>
<p>The most recent data at time of writing, for April 7, suggests that the probability of Britain leaving the EU has now reached 33% – a one in three chance. Even so, there is still a clear feeling among those with a monetary interest in the outcome that the UK will remain part of the European Union. </p>
<p>Indeed, the highest odds in favour of a Leave vote were last November, giving a 39% probability. Save this page and we’ll keep updating the chart as we get closer to the referendum. If you want to cut through the noise of the opinion polls, these are probably the numbers to watch. </p>
<h2>Update, April 26</h2>
<p>The latest data from the betting odds show a significant reduction in the probability of a Leave vote in the last week. It started with the <a href="https://theconversation.com/fact-check-do-the-treasurys-brexit-numbers-add-up-58086">Treasury’s gloomy view</a> of Britain’s prospects outside the EU, which brought the slight increase in the leave probability from the beginning of April to a halt. </p>
<p>The downturn accelerated after Barack Obama <a href="http://www.bbc.co.uk/news/uk-politics-eu-referendum-36120808">argued that</a> the UK would be at the back of the queue for a trade deal with the US. Even though he moderated this statement subsequently, the implied probability of leaving the EU fell from 0.34 to 0.31 last week – the sharpest fall in the bookies’ odds so far. It is not possible to show whether <a href="http://www.telegraph.co.uk/education/2016/04/25/boris-johnson-no-platformed-over-obamas-ancestry-comments/">Boris Johnson’s intervention</a> further accelerated the decline, but it is interesting that his decision to join the Leave campaign has also coincided with a period of lengthening odds on that outcome.</p><img src="https://counter.theconversation.com/content/57514/count.gif" alt="The Conversation" width="1" height="1" />
<p class="fine-print"><em><span>David Bell receives funding from the ESRC, but this article does not represent the views of the research councils. </span></em></p>Sorry Boris, those with a bet at stake think we’re staying put.David Bell, Professor of Economics, University of StirlingLicensed as Creative Commons – attribution, no derivatives.