The Remainers of the Day. Why are pollsters consistently finding more Remainers than you would expect?

I don’t take opinion polls very seriously and nor should you. For all that, they tell us something and some of the time we have no better clue as to what is going on than what they tell us. Right now, they seem to be telling us something rather interesting.

For many months, pollsters have consistently found appreciably more respondents in their sample who said they voted Remain in 2016 than said they voted Leave. Since Leave won 52:48, this is not what you would expect. For example, the most recent YouGov, Survation and Ipsos MORI polls have all shown this effect. The Ipsos MORI poll was conducted on a sample that on an unweighted basis would have voted 57:43 Remain.

Pollsters can, of course, deal with this by weighting the sample, but there comes a point where such a consistent finding needs to be considered further. Just why are pollsters’ samples so consistently askew in this regard?
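To see what that weighting does in practice, here is a minimal sketch in Python, using the 57:43 unweighted split from the Ipsos MORI example above. It illustrates the general principle of past-vote weighting, not any individual pollster’s actual method: each recalled-vote group is scaled so that the weighted sample reproduces the 48:52 referendum result.

```python
# Minimal illustration of past-vote weighting (not any pollster's actual method).
# The raw sample recalls a 57:43 Remain:Leave vote; the 2016 result was 48:52.

def past_vote_weights(sample_shares, target_shares):
    """Return a weight per recalled-vote group so that the weighted
    sample matches the target (actual) vote shares."""
    return {group: target_shares[group] / sample_shares[group]
            for group in sample_shares}

raw = {"Remain": 0.57, "Leave": 0.43}      # unweighted recalled 2016 vote
actual = {"Remain": 0.48, "Leave": 0.52}   # the real referendum result

weights = past_vote_weights(raw, actual)
print(weights)
# {'Remain': 0.842..., 'Leave': 1.209...}
# Each recalled-Remain respondent counts for roughly 0.84 of a response,
# each recalled-Leave respondent for roughly 1.21.
```

Everything that follows is really an argument about whether those scaling factors are being pointed at the right target and applied to the right people.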

There are quite a few possible explanations, and several of them may contribute to the effect. Let’s have a look at some of them.

Some people who said they voted Remain have forgotten that they voted Leave

As you would expect, the pollsters are well ahead of me when looking at such anomalies. Anthony Wells of YouGov looked at this a while back in relation to recollections of how people voted in 2017.

What he found was that memory is a slippery, plastic thing. If you compare what YouGov respondents now say they did in the 2017 general election with what they said a couple of years ago, an appreciable chunk of respondents have retrospectively edited their vote from Labour to elsewhere.

We might easily be seeing a similar effect when voters are being asked how they voted in the referendum, deciding with the benefit of hindsight that rather than voting for Leave they came out for Remain.

If they have indeed forgotten, it seems tolerably safe to assume that they now identify with the Remain cause. If so, it would be wrong to downweight their responses.

Some people who said they voted Remain are lying about the fact they voted Leave 

Some of those people who have retrospectively changed their votes might not be doing so innocently. Some might be deliberately lying. Why would they do a thing like that?

There are two possible reasons pointing in opposite directions. Some people might be embarrassed about the choice that they made. We can take it that they now identify with Remain and should not be downweighted. Others, however, may regard it as socially unacceptable to be identified as a Leave supporter. If so, pollsters do need to downweight to reflect their concealed preferences.

Some people who said they voted Remain did not in fact vote at all

You know the type. They tweet #FBPE, they sign petitions, they might even go on marches. Voting in the referendum, however, was a step too far. Not that they would admit that, of course.

If the Remain figure includes such people, the pollsters are right to downweight their contribution. There is no reason to assume that they will behave differently next time.

Some people who voted Leave are reluctant to take part in opinion polls

Allied to the shy Leavers who claim to have voted Remain are those Leavers who do not want to take part in opinion poll surveys. There will be some Remain voters who are similarly unenthused, but if the refusals skew in the Leave direction pollsters need to unskew by upweighting their presumed allegiances.   

Some people who voted Leave are harder for pollsters to find

The people who respond to opinion polls are going to be unrepresentative in one way or another. The hope of pollsters is that the ways in which they are unrepresentative don’t matter for the questions being asked.

One big risk – relevant to any political poll but it seems particularly relevant in relation to the EU referendum – is that the respondents to an opinion poll may be disproportionately enthralled by politics. If those who find politics tedious but still troop out to vote react differently from the political wonks, the poll results will be awry.

This seems entirely possible. If it is correct, Leave respondents will need to be upweighted (and Remain respondents correspondingly downweighted). But even this won’t work reliably because engaged Leavers and disengaged Leavers might well be reacting differently.

The electorate has changed since 2016

This is not so much a hypothesis as a fact. It may be distasteful to say so but we can be confident that older Leave-leaning voters have been replaced by debutant Remain-leaning voters. Over a period of three years that might already have wiped out the original Leave lead, all other things being equal.

Moreover, there will have been an influx of naturalising immigrants over the last three years (in part so that they can regularise their legal right to stay in Britain). We can expect this group also to lean heavily Remain.
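As a rough illustration of why three years of churn could matter that much, here is a back-of-the-envelope sketch in Python. Every figure in it is a round, hypothetical assumption of my own (annual deaths, new voters coming of age, their turnout and their leanings), not a measured statistic, and the 2016 margin is rounded to roughly 1.3 million votes.

```python
# Back-of-the-envelope churn calculation. Every number below is a round,
# illustrative assumption, not a measured figure.

YEARS = 3
LEAVE_MARGIN_2016 = 1_300_000          # roughly the 2016 Leave winning margin

deaths_per_year = 600_000              # mostly older, Leave-leaning voters
new_adults_per_year = 700_000          # people reaching voting age

# Assumed turnout and net leaning (share for one side minus the other) per group.
leavers_lost = deaths_per_year * YEARS * 0.80 * 0.20          # 80% turnout, +20pt Leave lean
remainers_gained = new_adults_per_year * YEARS * 0.60 * 0.45  # 60% turnout, +45pt Remain lean

net_swing_to_remain = leavers_lost + remainers_gained
print(f"Net swing to Remain over {YEARS} years: ~{net_swing_to_remain:,.0f} votes")
print(f"2016 Leave margin:                      ~{LEAVE_MARGIN_2016:,} votes")
# On these (debatable) assumptions the churn alone goes a long way towards
# erasing the original margin, before counting newly naturalised voters.
```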

For this reason it seems wrong to reweight the sample back to 48:52 as par. That was then but this is now. Since most pollsters seem to do this, the downweighting of Remain does indeed look at least a bit overdone.

Conclusion

The only sensible conclusion is that it’s all a bit odd. It does look as though Remain supporters’ views are being a bit underrepresented in polls (suggesting that the Conservatives in particular might be a bit overstated and that scepticism about the wisdom of leaving the EU is being understated). But I do have this gnawing doubt that the samples being obtained are potentially unrepresentative in ways that might matter quite a bit. You should always be sceptical of opinion polls’ findings. Right now, the scepticism should be turned up to 11.

Alastair Meeks

