Commentary: US elections – why polls so often seem to get it wrong
The huge win suggested for Joe Biden did not materialise. Perhaps we shouldn’t expect certainty from polls, says the Financial Times' Tim Harford.
LONDON: Irving Fisher, who a century ago was one of the world’s most famous economists, once declared: “The sagacious businessman is constantly forecasting.”
Well, perhaps. But how sagacious is it to be constantly forecasting, when the forecasts seem so often to be wrong?
Significant amounts of money, not to mention incalculable reserves of intellectual and emotional energy, were invested in the problem of figuring out who was going to win this week’s US presidential election.
The polls repeatedly and consistently suggested a huge win for the Democrats’ Joe Biden. That is not how things have panned out.
WE EXPECTED THEM TO BE WRONG
What did we know beforehand? That if the polls were wrong in the same way as in 2016, the election would end up with Donald Trump very close in Florida and Pennsylvania.
A polling error fractionally bigger than in 2016 would put us exactly where we found ourselves — on the edge of our collective seats, if not losing our collective minds.
No one is really that surprised. Yes, Mr Biden’s lead was larger and more stable than Hillary Clinton’s in 2016.
Yes, pollsters had in principle corrected for their earlier mistakes. Yes, while polling errors could still be expected, it was as likely that Mr Biden would overperform and grab Ohio and Texas as that he would underperform, failing to win Florida.
Yes, yes, yes.
But no one could quite believe the polls. And it seems we were right to doubt.
HOW WRONG WERE THEY?
The state-level polls were indeed off in much the same way and in much the same places as they were in 2016 and the 2018 midterms. Pollsters do not want to be wrong, and they particularly dislike being wrong in the same way twice in a row. So while polling errors are common, it is a surprise that lightning struck twice in the same place.
At this early stage one can only guess at what went wrong, but it is worth underlining the difficulty that pollsters face.
Consider the situation in Florida, where polls suggested Mr Biden would win 51 per cent of the vote and Mr Trump 48-49 per cent. The actual result was the reverse.
WHY POLLS WERE SO WRONG
But step back. The pre-election numbers suggest that in a typical poll with 500 positive responses, 255 went for Mr Biden and 243 for Mr Trump.
But typical response rates are around five in 100, says Andrew Gelman, a statistician and prominent election modeller — and often lower, sometimes as few as one in 100.
So now picture 10,000 people: 255 who said they would vote for Mr Biden, 243 for Mr Trump, and 9,500 who never responded at all. How confident are we feeling now?
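The sampling arithmetic alone already leaves a wide margin, even before nonresponse is considered. A rough sketch, using the 500-response poll described above and assuming simple random sampling (the textbook formula, not any pollster's actual method):

```python
import math

# The article's example poll: 500 responses,
# 255 for Mr Biden, 243 for Mr Trump.
n = 500
biden_share = 255 / n   # 51.0%
trump_share = 243 / n   # 48.6%

# Standard error of a proportion under simple random sampling.
se = math.sqrt(biden_share * (1 - biden_share) / n)

# A conventional 95% interval is roughly +/- 1.96 standard errors.
margin = 1.96 * se

print(f"Biden: {biden_share:.1%} +/- {margin:.1%}")
print(f"Lead over Trump: {biden_share - trump_share:.1%}")
```

The 2.4-point lead sits comfortably inside a roughly ±4.4-point sampling margin — and that is the optimistic calculation, treating the 500 respondents as a clean random draw rather than the 5 per cent who chose to answer.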
Worse, the people who do reply will be systematically different from those who do not: older and whiter, more likely to be women.
Pollsters may try to correct for these factors to ensure that the demographics of the poll match the demographics of the census.
Perhaps Cuban-Americans in Florida are under-represented in the poll by a factor of three. Fine. Let’s say the Cuban-Americans who do reply count triple.
But does this help? What assurance do we have that the tiny minority who bother to respond are a good proxy for the vast majority who do not?
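The reweighting described above can be made concrete with a toy calculation. All figures here are invented for illustration — suppose a group making up 6 per cent of the electorate is only 2 per cent of respondents, so its answers are upweighted threefold:

```python
# Toy post-stratification sketch. All numbers are illustrative,
# not real polling data.
poll = {
    # group: (share of respondents, share of group backing Biden)
    "under_represented_group": (0.02, 0.35),
    "everyone_else":           (0.98, 0.52),
}
# Census-based targets: what the electorate actually looks like.
target = {"under_represented_group": 0.06, "everyone_else": 0.94}

# Unweighted estimate: take respondents at face value.
raw = sum(resp * biden for resp, biden in poll.values())

# Weighted estimate: each group counts according to its census share,
# so the 2% who answered stand in for the 6% who exist.
weighted = sum(target[g] * biden for g, (_, biden) in poll.items())

print(f"Raw Biden share:      {raw:.1%}")
print(f"Weighted Biden share: {weighted:.1%}")
```

The adjustment shifts the estimate by less than a point here — and it only helps if the 2 per cent who answered vote like the 4 per cent who did not, which is exactly the assumption being questioned.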
One pollster told me: “Half the time, our adjustments make things better. Half the time they make things worse.”
Complicating matters still further is the question of turnout. Someone may tell the pollsters that they are planning to vote. But will they?
This caused problems for forecasting the Brexit referendum in the UK. Older, less educated voters told pollsters they would show up in force to vote Leave.
Prior elections suggested otherwise. Pollsters who placed more weight on history than on their own raw data were tripped up. Turnout in this US election has been unusually high, giving pollsters another headache.
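The turnout problem can be sketched as a hypothetical likely-voter adjustment, of the broad kind pollsters use: weight each respondent by an estimated probability that they will actually show up, based on their demographic's past turnout. All numbers below are invented:

```python
# Hypothetical likely-voter weighting. Numbers are invented.
# Each respondent: (stated vote, estimated turnout probability
# derived from their demographic's history).
respondents = [
    ("Leave", 0.60),   # historically lower-turnout group
    ("Leave", 0.60),
    ("Leave", 0.60),
    ("Remain", 0.85),  # historically high-turnout group
    ("Remain", 0.85),
]

# Unadjusted: every stated intention counts equally.
raw = sum(1 for v, _ in respondents if v == "Leave") / len(respondents)

# Adjusted: each voice counts in proportion to its turnout estimate.
total_weight = sum(p for _, p in respondents)
adjusted = sum(p for v, p in respondents if v == "Leave") / total_weight

print(f"Raw Leave share:              {raw:.1%}")
print(f"Turnout-adjusted Leave share: {adjusted:.1%}")
```

Here the adjustment drags Leave from 60 per cent down to about 51 per cent. If the historical turnout estimates are wrong — as in the Brexit referendum, where the supposedly low-turnout group showed up in force — the adjustment moves the estimate in exactly the wrong direction.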
MAYBE POLLS JUST DON’T TELL US WHAT WE REALLY WANTED TO KNOW
We shouldn’t exaggerate the problem. Polls do generate information. Every single state that the Financial Times confidently predicted would vote for Mr Biden, voted for Mr Biden.
Every single state that the FT confidently predicted would vote for Mr Trump, voted for Mr Trump. Those calls were not made by leaps of political intuition, but by looking at where the polls predicted a safe margin.
Mr Trump needed to win most of the marginal states to have a chance, and promptly bagged three of the four big ones: Florida, Ohio and Texas, denying Mr Biden the quick and decisive victory for which he might reasonably have hoped.
It’s not that the polls told us nothing. It’s that they could not tell us what we yearned to know. We want certainty, but we can’t always get what we want. In a close-run election where most people refuse to speak to pollsters, opinion polls cannot do away with the uncertainty.
Fisher was quite right to highlight the need to think about the future. We must, after all, weigh up our chances and make our decisions.
But, as his contemporary John Maynard Keynes famously remarked, sometimes “we simply do not know”. And since Fisher was eventually ruined, while Keynes died a millionaire, a little agnosticism comes in very handy.
In any case, we must learn to live with uncertainty. Perhaps we should obsess less about the question, “Will it happen?” and devote more thought to what we would do if it did.