
One of the striking features of polling of referendum vote intentions to date has been that polls conducted by phone have been producing more favourable results for Remain than those undertaken via the internet. Typically, once Don’t Knows are left to one side, the former have been suggesting that support for Remain is around the 60% mark (and thus that for Leave around 40%), whereas the latter have been putting the two sides more or less neck and neck.
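
For readers unfamiliar with the convention, the short sketch below shows the arithmetic of setting Don’t Knows aside and renormalising the remainder; the raw figures are invented for illustration, not taken from any actual poll:

```python
# Illustrative arithmetic only; the raw figures below are invented.
raw = {"Remain": 48, "Leave": 32, "Don't Know": 20}

decided = raw["Remain"] + raw["Leave"]        # 80% of the sample expressed a view
remain_share = 100 * raw["Remain"] / decided  # 48/80 -> 60.0
leave_share = 100 * raw["Leave"] / decided    # 32/80 -> 40.0

print(f"Remain {remain_share:.0f}%, Leave {leave_share:.0f}%")  # Remain 60%, Leave 40%
```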

Inevitably, this has raised the question of which set of polling numbers (if either) should be believed. Providing an answer has not been easy, because close examination of the detailed tables for the two sets of polls simply uncovers the fact that phone polls find more Remain voters than internet polls do within more or less every category of voter. Thus, for example, both sets of polls invariably find that younger voters are keener than older voters on remaining in the EU. It is just that phone polls have typically been finding around ten percentage points more support for Remain amongst younger voters than have internet polls, and around ten points more amongst older voters too.

However, a new paper published today jointly by Populus and Number Cruncher Politics reports the results of some methodological experimentation that attempts to explain the discrepancy between the two sets of polls – and gives us some handle on which might be closer to the truth.

Hypotheses and Evidence

The paper sensibly starts from the premise that, given that differences between the two sets of polls are evident within each of the various demographic characteristics by which pollsters commonly tabulate (and weight) their samples, the explanation for the divergence must lie elsewhere. It looks at two possibilities. The first is that the divergence arises as a result of differences between the two kinds of polls in the number of ‘Don’t Knows’ that they obtain. The second is that it is occasioned by important differences in the composition of their samples that are not captured by the pollsters’ standard demographic variables.

On the first point, the paper reports the outcome of two experiments, one conducted on a phone poll, the other on an internet one, in which roughly half the respondents were offered ‘Don’t Know’ as a possible answer (as is typically the case in internet polls), and half were not (or in the case of the internet sample, it was only available in small type at the bottom of the relevant page). It finds that in both cases more people said ‘Don’t Know’ when it was offered as an option – and that in both instances support for Remain was markedly higher when it was not offered. The implication is that those who say ‘Don’t Know’ tend to be closet supporters of the status quo, and thus polls (such as most internet polls) that offer that option are consequently at risk of underestimating support for Remain.

On the second point, the paper notes that in the academic British Election Study (BES), those who favour remaining in the EU are also more likely to give a socially liberal answer in response to questions on whether attempts to introduce greater equality for women and for racial minorities have gone too far or not gone far enough, and are also more likely to say they are British rather than English. Populus thus included these questions on both a phone and an internet poll of referendum vote intentions, and weighted the resulting samples so that they matched the distribution of responses to these three questions obtained by the face-to-face random probability survey conducted as part of the BES (a survey that was relatively successful in replicating the 2015 UK general election result). This weighting reduced what was otherwise an eleven-point difference between the two surveys in Remain’s estimated lead over Leave to one of just three points.
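
As a rough illustration of the mechanics involved, the sketch below weights an invented sample to an invented benchmark distribution on a single attitudinal question. The actual exercise matched BES distributions on three questions at once, which in practice calls for something like raking rather than the one-variable cell weighting shown here:

```python
from collections import Counter

# A minimal sketch of attitudinal weighting; categories, sample and
# benchmark shares are all invented for illustration.
sample = ["gone too far"] * 300 + ["about right"] * 500 + ["not far enough"] * 200

# Invented stand-in for the BES face-to-face benchmark distribution
benchmark = {"gone too far": 0.25, "about right": 0.45, "not far enough": 0.30}

counts = Counter(sample)
n = len(sample)

# Each respondent's weight = benchmark share / observed share for their answer
weights = {cat: benchmark[cat] / (counts[cat] / n) for cat in benchmark}

for cat, w in weights.items():
    print(f"{cat}: observed {counts[cat] / n:.2f}, target {benchmark[cat]:.2f}, weight {w:.2f}")
```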

Evaluation

This paper undoubtedly makes a most welcome contribution to what has hitherto been an unresolved puzzle. But there are questions to be asked, both about how far it really does explain the difference in the estimates being obtained by internet and phone polls, and about whether its evidence justifies the assertion that the ‘true state of public opinion’ is ‘closer’ to the picture obtained by phone than by internet polls.

Let us look first at the evidence on the prevalence and impact of Don’t Knows. It is certainly the case that most internet polls report more people saying ‘Don’t Know’ than do phone polls. But this is not invariably the case.

First, in two polls that it conducted in February and March by phone, Survation reported 19% saying Don’t Know, a figure that was little different from the 18% and 21% that the company previously reported in two internet polls it conducted in December and January. Yet at 58%, the average level of support for Remain in the two phone polls (after the Don’t Knows were left aside) was still markedly higher than the average of 48% obtained by that company in its two previous internet polls.

Second, there is one company, ORB, whose internet polls have not contained any Don’t Knows – because that option was not available at all. Yet in the six polls it administered in that way, support for Remain stood on average at just 51% – little different from the figure being obtained by other internet polls with many more Don’t Knows.

Meanwhile, if it were the case that those who say Don’t Know to pollsters are disproportionately closet supporters of Remain, we should find that, when pollsters attempt to ‘squeeze’ their respondents by asking which option they are more likely to back, Remain emerges with a clear lead. Yet in the handful of polls where an attempt has been made to squeeze the Don’t Knows, this is not the picture that emerges. In its most recent phone poll released yesterday, Ipsos MORI found only a modest Remain lead of 32% to 25% amongst those who initially said Don’t Know, while in its recent phone poll ORB found this group leaning slightly towards Leave by 37% to 31%. Meanwhile in two internet polls conducted recently by BMG the Don’t Knows broke almost evenly when squeezed, with 23% on average inclining towards Remain and 21% towards Leave.

In any event, even if we are willing to accept that those who say Don’t Know are more likely to support Remain than Leave, it is still far from clear that polls with a lower proportion of Don’t Knows should be assumed to be the more accurate. As both the most recent Ipsos MORI and the most recent ICM polls illustrate (both companies have now started collecting information on reported likelihood of voting), those who say Don’t Know are also less likely to say they will make it to the polling station. A poll that limits the number of Don’t Knows but makes no attempt to estimate or model likely turnout could therefore be giving too much weight to a group of voters who are relatively unlikely to cast a vote.
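
By way of illustration, here is a minimal sketch of the kind of turnout weighting that point calls for, assuming likelihood of voting is reported on a 0–10 scale; the respondents and figures below are invented:

```python
# A minimal sketch of turnout weighting, assuming a 0-10 likelihood-of-voting
# scale; the respondents below are invented for illustration.
respondents = [
    {"vote": "Remain", "likelihood": 9},
    {"vote": "Leave", "likelihood": 10},
    {"vote": "Remain", "likelihood": 6},
    {"vote": "Don't Know", "likelihood": 3},
]

def weighted_shares(rows):
    """Headline shares with each respondent weighted by likelihood / 10,
    so those who say they are unlikely to vote count for less."""
    totals = {}
    for r in rows:
        totals[r["vote"]] = totals.get(r["vote"], 0.0) + r["likelihood"] / 10
    grand = sum(totals.values())
    return {vote: round(100 * t / grand, 1) for vote, t in totals.items()}

print(weighted_shares(respondents))
```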

But what of the paper’s evidence on sample composition? Here we discover that, as compared with the BES face-to-face random probability sample, an online poll conducted by Populus as part of the methodological experiment proved to be markedly less socially liberal, while an identical one conducted by phone proved to be more liberal. In short, neither exercise proved particularly accurate as compared with the BES, and in fact it was the phone sample that was slightly the further astray. Meanwhile, we should bear in mind that advocates of internet polling would argue that the reason their samples appear less socially liberal than those conducted by phone or face-to-face is that, without an interviewer present, respondents feel less inhibited about giving what they think is a socially unacceptable response. In other words, the difference between the internet poll and the BES could be a consequence of the mode of interviewing rather than of a difference in the composition of their samples.

Conclusion

In short, while the paper takes a valuable step forward in identifying the importance of obtaining referendum poll samples that are representative of the balance of social liberalism and social conservatism in our society, it is less clear that it has established that phone polls are proving more accurate than internet polls in that regard. If anything, the paper would seem to point to the conclusion that both may be failing to do so, albeit in different ways.

Meanwhile, we should note, somewhat ironically, that the gap between internet and phone polls has, in fact, now suddenly narrowed. In the phone polls they have conducted this month, ComRes, Ipsos MORI and Survation have on average put Remain on just 55% (once Don’t Knows are left to one side), down five points on the equivalent figure for February. Meanwhile in the first phone poll that they have conducted, ORB actually put Leave narrowly ahead this month by 51% to 49%. In contrast there is no sign of any equivalent movement in those polls conducted via the internet. In their three most recent polls, ICM, TNS and YouGov put Remain on average on 51% and Leave on 49%, little different from the picture of an even contest that has been painted throughout by internet polls. Perhaps this is a sign that as the referendum comes closer into view and voters begin to develop rather firmer views, how the polls are conducted will come to matter less, much as proved to be the case in the Scottish independence referendum.

Alas, however, that will be no guarantee that they will be right.

This piece was written by John Curtice, senior research fellow at NatCen and senior fellow at the UK in a Changing Europe. It was originally published on What UK Thinks: EU.
