At 10pm on election night, everything the polls had been telling us for weeks was suddenly turned on its head. This was not the closest election in 40 years after all. Instead, according to the broadcasters’ exit poll, the Tories were way out in front. Their coalition partners, the Liberal Democrats, were heading for a humiliation, while every seat but one in Scotland was about to turn yellow.
Few believed what they were seeing. It took the steady thud of early results from Sunderland and Swindon for people to realise that the nails were indeed being banged into Lib Dem and Labour coffins.
At the centre of all this was Professor John Curtice, who has been involved in coverage of elections since 1979. On behalf of the BBC, ITV and Sky News, he fronts an exit-poll team of eight analysts from the universities of Oxford, Plymouth and Manchester and the London School of Economics. Here, he explains how the exit poll was conducted.
How unusual is such a gulf between the exit poll and all the previous polls?
Much the same thing happened five years ago. In that election we forecast that the Liberal Democrats were going to end up with fewer seats than they had in 2005. Nobody believed us. So I have been there and learned to expect the unexpected.
How do you react to coming up with unexpected results?
Rule number one with conducting any exit poll is to forget anything and everything you have read, seen, or looked at in advance. You have to go with the data produced by the polling itself, and have faith in the method and design that has been developed across recent elections.
I said quite clearly in advance of polling day that this is quite a difficult exercise. We could not promise it was going to be exactly right. And indeed it was not. We said 316 seats for the Conservatives, but in the end it was 331. Indeed, because 316 seats was quite close to 326 the headline that was put on the poll was “Tories Largest Party”, thereby leaving open whether the Conservatives would have an overall majority or not.
How did the broadcasters react to the news?
The broadcasters published the results of our analysis without quibble or question, and so far as I am aware made no attempt to distance themselves from it. In doing so they showed faith and displayed courage. The BBC at least, with whose programme I was involved after 10pm, made the poll a central feature of their early coverage, and above all used it to help guide viewers as to the significance of the early results – which indeed is the principal purpose of the whole exercise.
The first few results were mostly in line with the expectations of the poll, including crucially that from Swindon North. And after half a dozen results or so, it began to become apparent that we had the broad story right after all. We began to relax a little at that point – and doubtless the broadcasters did so too!
You said beforehand that this election would be particularly difficult. Did you have to change much in your methodology?
The methodology was exactly the same as in 2010, which in turn was exactly the same as in 2005. The key feature is that wherever possible, we poll outside the same polling locations as we did at the previous election. That means that for each location we can derive an estimate of the change in each party’s support and can paint a picture of how the ups and downs of party support are varying from one kind of constituency to another.
We were able to poll at nearly all of the polling stations at which we also polled in 2010. To these we simply added some new ones to meet the two new key challenges posed by this election – what might happen in Scotland and how well UKIP were doing.
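The change-based estimation described above can be illustrated with a toy sketch. All vote shares and location names below are hypothetical, invented purely for illustration – they are not the exit poll’s actual data or its full model, which is considerably more sophisticated:

```python
# Toy illustration of change-based estimation: for each polling location
# sampled at both elections, compute the change in each party's vote
# share, then average those changes across locations.
# All shares below are hypothetical, not real 2010/2015 figures.

locations = {
    # location: {party: (share_previous, share_current)}
    "Location A": {"Con": (0.38, 0.41), "Lab": (0.30, 0.31), "LD": (0.22, 0.08)},
    "Location B": {"Con": (0.35, 0.37), "Lab": (0.33, 0.30), "LD": (0.20, 0.09)},
    "Location C": {"Con": (0.41, 0.44), "Lab": (0.27, 0.28), "LD": (0.18, 0.07)},
}

def mean_change(party):
    """Average change in a party's share across the matched locations."""
    changes = [new - old
               for shares in locations.values()
               for p, (old, new) in shares.items()
               if p == party]
    return sum(changes) / len(changes)

for party in ("Con", "Lab", "LD"):
    print(f"{party}: {mean_change(party):+.3f}")
```

Because each location is compared with itself at the previous election, any fixed peculiarity of that location cancels out of the change estimate; in the toy numbers above, the Liberal Democrats show a sharp fall everywhere, which is the kind of consistent pattern that would drive a seat projection.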