
22 January 2016

Where do the pollsters go from here?

The polling inquiry has laid out the challenge; now pollsters must meet it.

By Marcus Roberts

Pollsters across Britain have been trying to figure out what went wrong last May. Now the British Polling Council’s independent inquiry, led by the University of Southampton’s Professor Patrick Sturgis, has made it clear: we got our numbers wrong because we got our samples wrong. 

In response, pollsters have a simple choice: pretend the problem never happened, blame the voters, or take responsibility.

YouGov has chosen the last of these, with full and unreserved apologies from the very top for the errors we made in 2015.

So what was the main mistake? Our own internal investigation into our errors concurs with that of the independent inquiry: sample failure. Simply put, our pre-election samples contained too many politically-engaged young people and too few quieter pensioners. Since the election YouGov has spent hundreds of thousands of pounds on panel recruitment to correct this so as to maximise our accuracy in the future. 

Getting the makeup of the sample right, difficult and expensive as it is, is the best way to tackle the problem. 


Other solutions that we considered and rejected included simply adjusting the figures from polling before publication to ensure a larger share of Conservative voters or adopting a probability sampling model for future polling. 


The problem with the first ‘solution’ is that while it may answer questions of voting intention with greater accuracy than the 2015 debacle, it does not provide an accurate window into the opinions of those voters. What’s more, it can result in low-income groups being underrepresented. And as psephologist Matt Singh, one of the few to warn of the polling problem in advance of the election, notes (http://www.ncpolitics.uk/2015/11/new-ncp-analysis-where-the-polls-went-wrong.html/), this approach failed to save pollsters from embarrassment in 2015, because the shifts in voter behaviour were too complicated for weighting to represent accurately.

Indeed, the Sturgis inquiry itself has warned of the problems with such so-called ‘weighting’ solutions, which may catch some problems but do not maximise the chance of accuracy.
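To make the limits of this approach concrete, here is a minimal sketch of what demographic weighting does in its simplest form. The figures are hypothetical, not YouGov's actual targets or method: each respondent is up- or down-weighted so the sample's mix of age groups matches known population totals.

```python
# Hypothetical raw sample shares vs. population shares by age group.
# The sample has too many young respondents and too few pensioners.
sample_share = {"18-34": 0.40, "35-64": 0.45, "65+": 0.15}
population_share = {"18-34": 0.28, "35-64": 0.49, "65+": 0.23}

# A respondent's weight is their group's population share divided by
# its sample share.
weights = {g: population_share[g] / sample_share[g] for g in sample_share}

# A pensioner now counts for more than one respondent, a young
# respondent for less.
print(round(weights["65+"], 2))    # 1.53
print(round(weights["18-34"], 2))  # 0.7
```

The catch, as the paragraph above notes, is that weighting only rescales the people you already have: if the pensioners in your panel are unrepresentative of pensioners generally, multiplying their answers by 1.53 amplifies that bias rather than fixing it.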

Next is the suggestion, based on Professor Curtice’s warm words for probability sampling, that pollsters should ensure an accurate sample of the population through large-scale, in-depth, face-to-face surveys. But even this approach has errors: the British Election Study (a remarkable piece of scholarly work and a must-read for politicos everywhere) still got the UKIP vote share wrong by several points. What’s more, it can take two months or more to complete and cost hundreds of thousands of pounds per poll. Such an approach fails a pollster’s responsibility to provide accuracy as well as timeliness and, indeed, affordability – so that polling does not become the preserve of wealthy clients alone.

The Sturgis inquiry has also raised the question of ‘herding’ in the polling industry. This makes little commercial sense, as any polling company worth its salt desires differentiation in its results rather than similarity, so as to demonstrate superiority. Simply put, we want our results to differ from our competitors’ so that, when we are proved right, they are proved wrong!

Nevertheless, it is possible that some companies adjusted their methodologies in the final days and weeks of the 2015 campaign in ways that gave results more in keeping with the rest of the industry. This could explain why some pollsters said after the fact that they regretted not publishing more of their polls. But this was categorically not the case with YouGov – as the reams of data and cross-tabs publicly available make clear.

The real solution to the polling problem of 2015 is hard and expensive: changing the makeup of panels and samples to ensure they are an accurate reflection of the opinions and voting intentions of the British people all year round.

The Sturgis inquiry rightly diagnosed the problem in polling as an inaccurate representation of the electorate. By making sure our samples include more people who pay little attention to politics, and by increasing the proportion of older voters, YouGov is addressing this. Answering the central charge of what we got wrong last May by correcting that very error ensures that the lessons of 2015 are learned. And it is this approach that we believe will deliver accuracy in future and, over time, restore trust in our polling.