
This Is How Just A Handful Of People Can Influence An Opinion Poll

We throw polling figures at each other in conversation -- but should we? We find out how polls are carried out, and how few people are required to actually change the results.


Britain loves opinion polling. And it guides the decisions of politicians and the media. But it's not completely reliable.


Politicians, the public and the press rely on the reams of polling data churned out by pollsters on a daily basis. It forms a large part of many stories in daily newspapers and on websites including BuzzFeed News.

Right or wrong, there's a perception that numbers are infallible.

But where does the data come from, do we trust it, and how few people does it take to influence an opinion poll?

It doesn't take many people to sway a poll. Most good polls rely on a sample size of between 1,000 and 2,000 people.

A carefully selected sample of a couple of thousand people can be better than a slapdash survey of several million.

There's a sweet spot at which polling is practical to carry out while keeping an acceptable level of reliability. According to Anthony Wells of YouGov, author of the UK Polling Report blog, a bigger sample size doesn't necessarily mean a better poll.

Beyond that "you get to the point of diminishing returns".


"A poll of 1,000 people has, on paper, a margin of error of 3%," says Wells.

"A poll that's twice the size has a margin of error of 2%," he adds. "So it's twice the size but you've only reduced the margin of error by a third. As you keep going up, the benefits of having a larger sample keep getting smaller and smaller."
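Wells's figures track the textbook formula for a poll's margin of error at 95% confidence. A quick sketch (the formula is standard statistics, not something from the article itself) shows why doubling the sample buys so little:

```python
import math

# Standard 95%-confidence margin of error for a polled proportion.
# The worst case is p = 0.5, which is what a quoted "3% margin of
# error" conventionally assumes.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (1000, 2000, 4000, 10000):
    print(f"n = {n:>6}: ±{margin_of_error(n):.1%}")
```

Doubling the sample from 1,000 to 2,000 only shrinks the margin from about ±3.1% to ±2.2%, and getting to ±1% takes roughly 10,000 respondents: the diminishing returns Wells describes.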

What makes a poll worthwhile is how representative the sample is.

Each of us is an individual, and together we make up lots of different tribes, based on our race, our gender, which class we fit into, what our political and personal beliefs are, even down to what kind of car we drive (or whether we can afford one in the first place). And they all need to be represented in polling at the same level as they are in real life.


That can be done by who you select (sampling), and how you treat their views (weighting).

Sampling the right group of people to ensure you get a series of viewpoints that matches the wider populace is vital, as is giving their opinions the correct weight.

A good poll lives or dies on whether, through sampling and weighting, you've collectively got 1,000 or so people who represent the British public in all of the ways we know about.
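As a toy sketch of what weighting does in practice (all the group shares and support figures below are invented for illustration, not drawn from any real poll): if a sample over-represents one group, each respondent in that group is counted a little less, so the group lands at its real-world share.

```python
# Hypothetical shares: who answered the poll vs. a census-style target.
sample_share = {"under_35": 0.20, "35_plus": 0.80}
population_share = {"under_35": 0.35, "35_plus": 0.65}

# Weight = how much each respondent in a group should count.
weights = {g: population_share[g] / sample_share[g] for g in sample_share}

# Suppose 30% of under-35 respondents and 60% of 35-plus
# respondents back some policy:
support = {"under_35": 0.30, "35_plus": 0.60}

raw = sum(sample_share[g] * support[g] for g in support)
weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)

print(f"raw: {raw:.1%}, weighted: {weighted:.1%}")
```

With these made-up numbers the unweighted poll reports about 54% support, but once the under-represented under-35s are weighted up to their population share, the headline figure drops to 49.5% — the same answers, a different result.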

Despite the small sample sizes, 80% of people trust polls to be honest and representative.

A survey by the Worldwide Independent Network of Market Research (WIN) and Gallup, a polling group, of 42,720 people across 47 countries, from Guatemala to Macedonia, the UK to the USA, found that globally, we're generally very trusting of survey results.

Britons, though, are more circumspect. A quarter of us said we distrust polls when asked the same question by ORB International. Scots are particularly wary, with 30% actively distrusting polling results.

Scotland's a pretty good example of how polling works, and how few people are needed to change results – and the political narrative.

The recent independence referendum is a perfect example.

Both the 'Yes' and 'No' campaigns – as well as the huge army of journalists reporting on the Scottish Independence referendum – carefully watched polling. The 'No' campaign responded to a tightening of opinion polls by flooding the country with high-profile politicians near the end of the race.

The polls were so tight that in some cases, the gap between the 'Yes' and 'No' campaigns was less than the margin of error. The entire campaign was swinging on the poll responses of just a handful of people.


Until polling stations opened on September 18th, opinion polls were the best insight we had into how Scotland would vote. And some surveys, such as this one conducted by Panelbase for The Scottish Sunday Times, separated the campaigns by just over a single percentage point before weighting – just 15 people.

Had those 15 people answered differently, the poll's result could have materially changed, and with it the whole tone of the campaign.
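The arithmetic behind that "15 people" figure is simple back-of-envelope stuff. Assuming a round sample of 1,000 (a simplification, ignoring weighting), each percentage point corresponds to about ten respondents:

```python
# Back-of-envelope: how many respondents does one percentage
# point represent in a poll of roughly 1,000 people?
sample_size = 1000
gap_points = 1.5  # roughly the "just over a single point" gap reported

people_per_point = sample_size / 100
gap_in_people = gap_points * people_per_point

print(f"1 point is about {people_per_point:.0f} respondents; "
      f"a {gap_points}-point gap is about {gap_in_people:.0f} people")
```

So a gap of one and a half points in a 1,000-person sample comes down to roughly 15 individuals, which is why a handful of changed minds can flip a headline.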


In the end, pollsters correctly called the result of the referendum, but by and large understated the margin of victory.

Anthony Wells explains: "Scotland was particularly hard because normally, if it's a voting intention poll, our methods will be based at some point on a past vote. When something comes along like the Scottish referendum, you haven't got a previous referendum to go back and draw up methodologies and weighting from."

It could've been worse. This could've happened.


The Chicago Tribune called the 1948 US presidential election for Thomas Dewey, who had consistently led opinion polling in the run-up to voting day. It even printed the result on its front page, which the actual winner, Harry Truman, was happy to hold up and jeer at the following morning.

'Dewey Defeats Truman' isn't the only epic fail based on incorrect or outmoded polling. Literary Digest, a US magazine, polled 2.4 million people ahead of the 1936 election -- and came out with the wrong result.

In the UK, there have been a couple of catastrophic failures.

At the 1970 general election, pollsters called the race for Labour too early and missed a late switch in public opinion towards the Conservatives. They got it wrong again for the same reasons in 1992, in what a post-mortem carried out by the Market Research Society called "the most spectacular failure in the history of British election surveys."

"It’s not a constant improvement from rubbish polls in 1945 to fantastic polls now," says YouGov's Wells. Pollsters get things terribly wrong, look back at how to fix it, and come up with new methods of reaching people, smarter sampling, or ways to account for changing tastes.

But polls aren't infallible, because alongside small sample sizes, sometimes humans lie, or bluff their way through them.

A stubborn minority of people lie in polls. Well, that's according to polls of people asking if they lie in polls.

And just because someone responds to a poll, it doesn't mean that they're necessarily well-informed.

One headline-grabbing poll of 2,022 people carried out by YouGov asked people whether they could recognise members of the Labour shadow cabinet. Among the names was Andrew Farmer, a politician YouGov made up.


15% of respondents said they had heard of 'Andrew Farmer', even though he doesn't exist.

That's a little #awkward.

So polling is tough to do, and we perhaps put too much stock in it.

BuzzFeed News asked Anthony Wells whether politicians and the media are too wedded to the results of opinion polls.

"Yes," he said, laughing.

"I think it's got better, but there is still an awfully big temptation, if a poll shows a sudden change, to report that, when by definition, if a poll shows something unusual, it's probably wrong. Most of the time public opinion moves quite slowly over time, and it rarely jumps.

"The problem is of course in terms of media that everyone wants to be first. In media terms, 'Third poll shows same thing other two polls showed yesterday' is a really rubbish story."

"It’s a constant race to keep up with the British population, the British political system, and technology," he admits.

"You're only as accurate as your last general election. You're fighting to improve that and keep up with whatever has changed in the five years since."

Chris is a freelance writer for BuzzFeed, The Economist, The Sunday Times and the BBC, based in the UK.
