
Essential Facts For Reading Our Poll

11 Aug 2010

The Essential Report has been drawing a bit of comment in recent days, notably for failing to chart a perceived collapse in Labor support in week two of the campaign.

We were in the firing line on Insiders on Saturday, where George Megalogenis noted that, as an online poll, we have a different methodology to the major polls, so should not be treated with the same level of credence.

It is true that the Essential Poll uses a different model to the established pollsters – unlike phone polling, we draw on a community panel of about 100,000 voters established by Your Source.

So why are the Essential numbers different to the phone pollsters?

The first point to make is that all the major polls operate with a margin of error of around three per cent – this means that any movement smaller than this could be nothing more than random noise. The media often miss this, covering single-point changes in forensic detail.
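To illustrate the scale of that noise, here is a rough calculation of the margin of error for a simple random sample, assuming a hypothetical sample of 1,000 respondents (actual sample sizes vary from poll to poll):

import math

n = 1000   # assumed sample size, typical of a national poll
p = 0.5    # worst-case proportion, which gives the widest interval
moe = 1.96 * math.sqrt(p * (1 - p) / n)   # half-width of a 95% confidence interval
print(f"Margin of error: +/- {moe * 100:.1f} points")   # roughly +/- 3.1

A one-point move from one week to the next sits well inside that band, so on its own it tells you very little.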

Because of this, EMC softens these bumps by averaging our sample over a two-week period. That is, each week we combine that week's returns with the previous week's numbers and run an average.
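A minimal sketch of that rolling average, using made-up weekly 2PP figures (the real calculation pools the weighted samples rather than averaging the headline numbers, but with similar weekly sample sizes the effect is much the same):

weekly_2pp = [52.0, 51.0, 48.0, 49.0]   # hypothetical weekly ALP two-party-preferred results

def two_week_average(series):
    # Average each week's figure with the previous week's figure.
    rolled = [series[0]]   # the first week has no earlier week to pool with
    for prev, current in zip(series, series[1:]):
        rolled.append((prev + current) / 2)
    return rolled

print(two_week_average(weekly_2pp))   # [52.0, 51.5, 49.5, 48.5]

A sharp three-point fall in week three shows up as a much gentler slide in the published figure – exactly the smoothing described above.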

The reason Essential didn't dip as fast as the other polls is that, with this method, it takes some time for a trend to wash through – interestingly, by this week our 2PP was largely in line with most phone polls.

In contrast, the phone polling companies concentrate their fieldwork in a short period. They usually conduct their interviews over a couple of days, whereas our poll is in the field for six days. The upside is that this allows them to pick up sharp changes in a short time-frame. But it also means that their results appear more volatile.

There are also differences in how phone polls and online polls draw their participants.

While phone polling needs to reach someone who is available to complete a survey at the time they are called, online participants can fill out the poll at a time that suits them. For many busy people with families, this gives much better access.

Because of this, all polls are weighted to ensure they match national averages for gender, age and geography. That is, if the numbers in one of these categories are too far out, pollsters 'weight' the section that is under-represented.
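A minimal sketch of what that weighting looks like for a single category, with made-up numbers (real pollsters weight on several variables at once):

population_share = {"18-34": 0.30, "35-54": 0.36, "55+": 0.34}   # assumed national age profile
sample_share     = {"18-34": 0.20, "35-54": 0.38, "55+": 0.42}   # hypothetical raw sample profile

# Each group's weight is the ratio of its population share to its sample share.
weights = {group: population_share[group] / sample_share[group] for group in population_share}
print(weights)   # under-represented 18-34s get a weight of 1.5; over-represented 55+ about 0.81

Responses from under-represented groups count for a little more, and responses from over-represented groups for a little less, so the weighted total matches the national profile.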

So which polls are most accurate? It’s really the wrong question – because no-one knows what the “right” answer is at any point in time except for election day.

As the saying goes, the only poll that counts is the one on election day, when the 10 per cent of voters who say ‘don’t know’ to every survey are forced to choose. This means that any result that deviates from the polls can rightly be explained away as a late swing.

Polls are about indicating trends, not predicting results – and on this Essential and the major polls agree: there has been a narrowing in the race, and the current situation is somewhere around line-ball, if not a slight advantage to the ALP.

But we all operate within a statistical margin of error of about two to three per cent either way, and that means any of the published polls that finish within that range of the actual election result will have produced a statistically accurate outcome. The poll that comes closest to the election result will not necessarily be the most reliable or accurate poll – but it will be the luckiest.

National polls can’t really predict a seat-by-seat outcome when there are state, regional or local factors at play in the marginal seats. For example, in 1998 Beazley won the 2PP vote but didn’t win enough seats to take government, and at the South Australian election in March Labor won despite the Liberals winning the 2PP vote – so broad poll numbers do not always translate into seats won and lost.

The parties, however, will be spending serious amounts of money tracking the seats that will determine the election, and as a result will have a clearer idea of the likely result in the key seats.

How that washes out over the next two weeks is, frankly, anyone’s guess.


Peter Lewis, Director EMC
