March 23, 2017
Democracy is coming
We have an election this year, so we are starting to have polling.
To save time, here are some potentially useful StatsChat posts about election polls:
- “All about election polls” (a general explainer by Andrew Balemi)
- A simple cheatsheet for working out the margin of error for minor parties (also including a simple Excel macro); the basic formula is sketched just after this list.
- Why opinion polling is helpful: your neighbours aren’t the whole country.
- Roy Morgan produces more surprising poll results than other companies. That’s because they poll more often.
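For reference, the usual approximation behind the margin-of-error cheatsheet is below, as a minimal sketch with an illustrative sample size (it is not the cheatsheet’s Excel macro):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a polled proportion p from a sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # illustrative sample size
# The often-quoted "maximum margin of error" assumes a party on 50%.
print(f"{margin_of_error(0.50, n):.1%}")  # about 3.1%
# For a minor party on 5%, the margin of error is considerably smaller.
print(f"{margin_of_error(0.05, n):.1%}")  # about 1.4%
```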
Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient.
A quick thought on election polls. Has anyone measured the fluxes from one party to another? E.g. if, say, National is on 45% and Labour on 30% for two polls in a row (ignoring uncertainty), it’s probably largely the same 45% and 30% of the population, but not exactly. All the two polls establish is that the flows into and out of each party balance out, leaving the overall proportions unchanged.
Does anyone try to measure the turnover rather than overall proportions?
8 years ago
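As a minimal sketch of that point (the flows here are made up), two quite different patterns of voter movement can produce identical top-line numbers in successive polls, so the published proportions alone can’t identify the turnover:

```python
# Party shares from the comment above: National 45%, Labour 30%, Other 25%.
shares = {"National": 0.45, "Labour": 0.30, "Other": 0.25}

# Flows are fractions of the whole electorate moving between groups.
# Scenario A: nobody moves at all.
# Scenario B: 2% of the electorate cycles National -> Labour -> Other -> National.
scenario_a = {}
scenario_b = {
    ("National", "Labour"): 0.02,
    ("Labour", "Other"): 0.02,
    ("Other", "National"): 0.02,
}

def apply_flows(shares, flows):
    """Return the new party shares after applying the given flows."""
    new = dict(shares)
    for (src, dst), fraction in flows.items():
        new[src] -= fraction
        new[dst] += fraction
    return new

# Both scenarios print the same top-line numbers, even though scenario B has
# 6% of the electorate changing sides (and its flows are a pure cycle, so no
# pair of parties exchanges voters evenly).
print({p: round(v, 3) for p, v in apply_flows(shares, scenario_a).items()})
print({p: round(v, 3) for p, v in apply_flows(shares, scenario_b).items()})
```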
In the US, there are election panel surveys that I think do this. I don’t think any of the big NZ surveys get the same people each time, so it probably wouldn’t be feasible.
8 years ago
The US election panels: yes, that was the famous one from the LA Times, which was the outlier and predicted a Trump win (of course, the others will say they were within the margin of error).
They say they predicted the previous Obama win as well.
Is the end of the random phone poll nigh?
8 years ago
Yes, if you’re not willing to consider anything other than point estimates, you will find polls pretty useless.
This isn’t new; it has always been the case, and it’s why, right from the start, StatsChat has gone on and on about the margin of error.
8 years ago
I’m pretty sure Colmar Brunton did this in 2014, but I can’t find the report on their new website.
8 years ago
I just saw the dates and realised this post is a month old, but just in case you’ve subscribed, the tweet is here:
https://twitter.com/ColmarBruntonNZ/status/505061791613718528
The report is gone, but most of the info is in the tweeted image anyway.
8 years ago
In his chapter in the book ‘Kicking the Tyres’ (2012, ed. Johansson and Levine), Rob Salmond compared the accuracy of each of the four main telephone pollsters’ last poll before the 2011 election (based on their ‘root mean squared errors’) and found Colmar Brunton/TVNZ had the lowest error (about 1.4, eyeballing his graph) and Roy Morgan and Reid Research the equal highest (about 2.2). Can we read anything into this? Is there evidence to say one pollster is ‘better’ than the others?
8 years ago
*accuracy compared to the actual election results, that is.
8 years ago
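For readers who haven’t met it, the root-mean-squared-error comparison works like this sketch (the poll and result figures are made up, not Salmond’s 2011 data):

```python
import math

# Hypothetical final-poll figures and election results, in percentage points.
poll   = {"National": 50.0, "Labour": 28.0, "Green": 13.0, "NZ First": 5.0}
result = {"National": 47.0, "Labour": 27.0, "Green": 11.0, "NZ First": 7.0}

def rmse(poll, result):
    """Root mean squared error of a pollster's final poll against the election result."""
    errors = [poll[party] - result[party] for party in result]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

print(round(rmse(poll, result), 2))  # 2.12 for these made-up numbers
```

A lower RMSE means the pollster’s final figures sat closer, on average, to the actual result across the parties included.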
That’s the right sort of evidence, but there isn’t enough of it in a single election, and I don’t think the 2014 election showed the same error rankings.
What is clear is that Roy Morgan’s results, which are widely thought to be much more volatile than the others, really aren’t. Accuracy is harder to assess, because there isn’t enough ground truth. Things are easier in the US.
8 years ago
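To round off the Roy Morgan point in the list at the top, here is a quick simulation sketch (the party share and sample size are illustrative) of how, with nothing but sampling noise, a pollster that publishes twice as many polls racks up roughly twice as many results outside the margin of error:

```python
import random

random.seed(1)

true_share = 0.45   # illustrative true party support
n = 1000            # illustrative sample size per poll
moe = 1.96 * (true_share * (1 - true_share) / n) ** 0.5

def surprises(num_polls):
    """Count polls whose estimate lands outside true_share +/- moe by chance."""
    count = 0
    for _ in range(num_polls):
        estimate = sum(random.random() < true_share for _ in range(n)) / n
        if abs(estimate - true_share) > moe:
            count += 1
    return count

print(surprises(1000))  # roughly 50: about 5% of polls are "surprising" by chance alone
print(surprises(2000))  # roughly 100: poll twice as often, get about twice as many surprises
```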