Posts filed under Polls (132)

June 2, 2013

Submissions are for reading, not counting

The Herald, writing about Hamilton’s pending removal of fluoridation from their water supply:

A Hamilton City Council tribunal examining the topic has re-ignited intense public debate on the issue, with 89 per cent of the 1,557 submissions made to it in favour of stopping fluoridation. In 2006, 70 per cent of residents who voted in a referendum backed fluoride.

This actually isn’t evidence or even a suggestion of a change in opinion. All we can tell from the numbers is that 1386 people now want fluoride removed. Public submissions are useful qualitatively, not quantitatively.

It may be true that the people of Hamilton don’t want fluoride in their water, in which case I think they are unwise, but it’s their problem. Confusing self-selected numbers with referendum votes isn’t going to help determine what they want, [and neither is the exclusion from voting of three of twelve council members on the grounds that they also sit on the DHB and so have thought about the issues before].

May 20, 2013

Survey reporting, yet again

The Herald says “Half of NZ workers eyeing new jobs – survey”.

The photo caption even says “Of those working in property and construction, 63 per cent said they were likely to look for a new job in the next year”, so the survey apparently breaks down its sample into a bunch of subcategories.

So, how reliable is this survey?  The Herald doesn’t say much about methodology, except that

The online survey canvassed 260 “professionals” working in jobs ranging from entry-level to senior management.

That’s not a lot, and you really have to wonder how they were sampled. The Herald gets points for linking to the full glossy pamphlet, but its methodology section says, in full:

The Michael Page Employee Intentions Report is based  on the online survey responses of 260 professionals in New Zealand. Participants represent a range of professional occupation groups and hold positions that range from entry level through to senior management. The scope of the report includes key employee insights into preferences for attraction and retention, salary expectations, benefits and work-life balance and their views on the predicted employment outlook.

This might not be a bogus poll, but the lack of information is really not encouraging.  The report doesn’t give any demographic information that might help verify how representative it is, but it does say that 76% of those planning to change jobs intend to use recruitment consultants.
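For a sense of scale, here’s a back-of-envelope calculation. It generously assumes the 260 respondents were something like a random sample of the relevant population, which the report does nothing to establish, and the subgroup size is a pure guess, since the report doesn’t give one.

```python
import math

def max_margin_of_error(n, z=1.96):
    """Worst-case (p = 0.5) margin of error for a simple random sample of size n."""
    return z * math.sqrt(0.25 / n)

# The whole sample: 260 respondents, treated (generously) as a random sample.
print(f"n = 260: +/- {max_margin_of_error(260):.1%}")  # roughly +/- 6%

# A subcategory such as 'property and construction' is necessarily much smaller.
# The 40 here is purely hypothetical -- the report doesn't give subgroup sizes.
print(f"n = 40:  +/- {max_margin_of_error(40):.1%}")   # roughly +/- 15%
```

Even on those charitable assumptions, the subcategory figures come with double-digit uncertainty.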

All in all, clearly a win for the flacks, not the hacks.

April 26, 2013

That TV3 poll on racism? Bogus!

The Herald today ran this story claiming that people think New Zealand is a racist country, based on the results of a survey run for TV3’s new show The Vote. Viewers voted through Facebook, Twitter, The Vote website or by text.

I haven’t watched The Vote, but I would like to know whether its journalist presenters, presumably fans of accuracy, point out that such self-selecting polls are unscientific – the polite term for bogus. The best thing you can say is that such polls allow viewers to feel involved.

But that’s not a good thing if claims made as a  result of these polls lead to way off-beam impressions being planted in the public consciousness; that’s often the way urban myths are born and prejudice stoked.

I’m not saying that racism doesn’t exist in New Zealand, but polls like this offer no insight into the issue or, worse, distort the truth.

It’s disappointing to see that the Herald, which still, presumably, places a premium on accuracy, has swallowed The Vote press release whole, without pointing out its shortcomings or doing its homework to see what reliable surveys exist. TV3 must be very pleased with the free publicity, though.

March 21, 2013

That’s not worth a thousand words

The Herald has an interesting set of displays of the latest DigiPoll political opinion survey.  According to the internets it was even worse earlier in the day, but we can pass over that and only point out that corrections in news stories shouldn’t happen silently (except perhaps for typos).

We can start with the standard complaint: the margin of error for the poll itself is 3.6%, so the margin of error for a change since the last poll is about 1.4 times as large (a factor of √2, because the variances of two independent polls add), or a little over 5%. None of the changes is larger than 5%, and only one comes close.
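A quick sketch of the arithmetic, assuming the usual worst-case (p = 0.5) formula and the sample size of 750 reported for this poll:

```python
import math

n = 750                                   # reported sample size
moe_poll = 1.96 * math.sqrt(0.25 / n)     # worst-case margin of error for one poll
moe_change = math.sqrt(2) * moe_poll      # margin of error for a change between two such polls

print(f"poll:   +/- {moe_poll:.1%}")      # about 3.6%
print(f"change: +/- {moe_change:.1%}")    # a little over 5%
```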

Secondly, there is a big table for the minor parties. I would normally not quote the whole table, but in this case it’s already changed once today.

[Table: DigiPoll minor-party support, broken down by age, gender, and region]

The total reported for the minor parties is 6.1%, and since there were 750 people sampled, 46 of them indicated support for one of these parties. That’s not really enough to split up over 7 parties. These 46 then get split up further, by age and gender. At this point, some of the sample proportions are zero, displayed as “-” for some reason.

[Updated to add: and why does the one male 40-64 yr old Aucklander who supported ACT not show up in the New Zealand total?]

Approximately 1 in 7 New Zealanders is 65+, so that should be about 6 or 7 minor-party supporters in the sample.  That’s really not enough to estimate a split over 7 parties. Actually, the poll appears to have been lucky in recruiting older folks: it looks like 6 NZ First, 2 Conservative, 1 Mana.
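To put numbers on how thinly the sample gets sliced, here’s a rough sketch using the figures quoted above (the counts are back-calculated from published percentages, so treat them as approximate):

```python
n_total = 750                              # poll sample size
minor_share = 0.061                        # combined minor-party support reported

n_minor = round(n_total * minor_share)     # about 46 people across all seven minor parties
n_minor_65plus = round(n_minor / 7)        # roughly 1 in 7 NZers is 65+, so 6 or 7 of them

print(n_minor, "minor-party supporters in total")
print(n_minor_65plus, "of them aged 65+, to be split over seven parties")

# In a cell that small, a single respondent shifts the estimate by about 14 percentage points.
print(f"one person is {1 / n_minor_65plus:.0%} of the 65+ minor-party subgroup")
```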

That’s all pretty standard overtabulating, but the interesting and creative problems happen at the bottom of the page.  There’s an interactive graph, done with the Tableau data exploration software.  From what I’ve heard, Tableau is really popular in business statistics: it gives a nice clear interface to selecting groups of cells for comparison, dropping dimensions, and other worthwhile data exploration activities, and helps analysts present this sort of thing to non-technical managers.

However, the setup that the Herald have used appears to be intended for counts or totals, not for proportions.  For example, if you click on April 2012, and select View Data, you get

[Image: the Tableau “View Data” output for April 2012]

which is unlikely to improve anyone’s understanding of the poll.
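The underlying point is that counts can sensibly be added up across groups, but proportions can’t; they have to be weighted by group size. A minimal illustration, with made-up numbers:

```python
# Made-up subgroups, purely to illustrate the counts-vs-proportions point
groups = [
    {"name": "Auckland",   "n": 250, "support": 0.40},
    {"name": "Rest of NZ", "n": 500, "support": 0.10},
]

# Counts add up across groups...
supporters = sum(round(g["n"] * g["support"]) for g in groups)

# ...but proportions have to be weighted by group size, not summed or averaged.
overall = supporters / sum(g["n"] for g in groups)
naive_total = sum(g["support"] for g in groups)   # what a 'totals' view of percentages gives

print(f"overall support: {overall:.0%}")                 # 20%
print(f"sum of the two percentages: {naive_total:.0%}")  # 50%, which means nothing
```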

I like interactive graphics. I’ve put a lot of time and effort into making interactive graphics. I’ve linked to a lot of good interactive graphics on this blog. The Herald has the opportunity to show the usefulness of interactive graphics to a much wider community than I’ll ever manage. But not this way.

March 15, 2013

Policing the pollsters … your input sought

This is from Kiwiblog:

A group of New Zealand’s leading political pollsters, in consultation with other interested parties, have developed draft NZ Political Polling Guidelines.

The purpose is to ensure that Association of Market Research Organisations and Market Research Society of New Zealand members conducting political polls, and media organisations publishing poll results, adhere to the highest “NZ appropriate” standards. The guidelines are draft and comments, questions and recommendations back to the working group are welcome.

This code seeks to document best practice guidelines for the conducting and reporting of political polls in New Zealand. It is proposed that the guidelines, once approved and accepted, will be binding on companies that are members of  AMRO and on researchers that are members of MRSNZ.

March 6, 2013

Twitter is not a random sample

From Stuff,

If you’ve ever viewed Twitter as a gauge of public opinion, a weathervane marking the mood of the masses, you are very much mistaken.

That is the rather surprising finding of a new US study, which suggests the microblog zeitgeist differs markedly from mainstream public opinion.

Apart from being completely unsurprising, this is a useful thing to have data on.  The Pew Charitable Trusts, who do a lot of surveys, compared actual opinion polls to tweet summaries for some major political and social issues in the US, and found they didn’t agree.

Along the same lines, it was reported last month that Google’s Flu Trends overestimated the number of flu cases this year (after having initially underestimated the H1N1 pandemic), probably because the high level of publicity for the flu vaccine this year made people more aware.

These data summaries can be very useful, because they are much less expensive and give much more detail in space and time than traditional data collection, but they are also sensitive to changes in online behaviour. Getting anything accurate out of them requires calibration to ‘ground truth’, as a previous generation of Big Data systems called it.
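As a sketch of what calibrating to ‘ground truth’ can look like in practice (with invented numbers, not real flu or Twitter data): fit a simple relationship between the cheap signal and the expensive measurement over a period where you have both, then use it to translate new values of the signal.

```python
import numpy as np

# Invented example: a weekly search-query index vs surveyed flu incidence per 100,000
query_index   = np.array([1.2, 1.8, 2.5, 3.1, 4.0, 4.8])  # cheap, timely, plentiful
surveyed_rate = np.array([ 10,  16,  22,  27,  36,  43])   # expensive 'ground truth'

# Calibration: a least-squares line mapping the signal onto the measurement
slope, intercept = np.polyfit(query_index, surveyed_rate, 1)

# Translate a new value of the cheap signal into an estimated rate
new_signal = 5.5
print(f"estimated incidence: {slope * new_signal + intercept:.0f} per 100,000")
```

The catch, as the Flu Trends example shows, is that the calibration itself drifts when online behaviour changes, so it has to be re-checked against fresh ground truth rather than fitted once and trusted forever.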

March 5, 2013

They think they are representing you

An interesting finding from the US (via):  politicians think their electorates are more conservative than they actually are — slightly more conservative for left-wing politicians, much more conservative for right-wing ones.

[Graph from the study: politicians’ estimates of constituent opinion versus actual constituent opinion]

The errors are large: right-wing politicians overestimate the support among their electorate for conservative positions by an average of nearly 20%. The size of the error, and the big differences by the politician’s ideology, mean that it can’t just be explained by actual voters being more conservative than the population at large.

February 20, 2013

We can haz margin of error?

Generally good use of survey data in a story from Stuff about the embattled Education Minister.  They even quote a competing poll, which agrees very well with their overall statistic.

The omission, though, relates to the headline figure: “71pc want Parata gone – survey”.  That’s a proportion “among voters from Canterbury”.   Assuming that they don’t mean “voters” in any electorally-relevant sense, just respondents, we would expect about 120 of the 1000 respondents to be from Canterbury. The maximum margin of error is a little under 10%.

The fact that one region has 71% wanting Ms Parata gone when the overall national average is 60% would actually not be all that notable on its own. Since we already expect her to be less popular in ChCh, the difference is worth writing about, but if it’s worth a headline, it’s worth a margin of error.
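The back-of-envelope version, assuming Canterbury contributes roughly its population share of the 1000 respondents (the story doesn’t give the regional sample size):

```python
import math

n_total = 1000
n_canterbury = round(0.12 * n_total)          # roughly 120, at Canterbury's population share

# Worst-case margin of error for a subsample of that size
moe = 1.96 * math.sqrt(0.25 / n_canterbury)
print(f"n = {n_canterbury}: maximum margin of error about +/- {moe:.0%}")  # just under 10%

# So the regional 71% could plausibly be anywhere from the low 60s to around 80,
# which is why it needs a margin of error before it earns a headline.
print(f"71% +/- {moe:.0%} covers roughly {0.71 - moe:.0%} to {0.71 + moe:.0%}")
```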

February 19, 2013

Terminology

Most of the Stats department is currently moving from the leafy park-like north end of campus back to the glass and concrete Tower of Science. While we’re in transit, here’s a bogus poll on statistical terminology.

Distributions can be classified according to whether they produce more outliers or fewer outliers than a normal distribution. The terms are “platykurtic” (same Greek root as platypus, meaning “flat”) and “leptokurtic” (Greek root meaning “thin”).
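Without giving the answer away, here’s one way to see the distinction empirically: the standard (excess) kurtosis statistic is near zero for the normal distribution, higher for distributions that throw more outliers, and lower for ones that throw fewer. A quick sketch:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)
n = 100_000

samples = {
    "normal":   rng.standard_normal(n),
    "t (5 df)": rng.standard_t(5, n),    # heavier tails: more outliers than normal
    "uniform":  rng.uniform(-1, 1, n),   # lighter tails: fewer outliers than normal
}

# scipy's default is 'excess' kurtosis, which is 0 for the normal distribution
for name, x in samples.items():
    print(f"{name:8s} excess kurtosis = {kurtosis(x):6.2f}")
```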

Update: answer, and potentially discussion, in the comments

February 12, 2013

Unclear on the pie-chart concept

Everyone recognises pie. Everyone likes pie. So pie must be a good representation of numbers, right?

One important detail: you want bigger numbers to translate into more pie. This would be especially important if the numbers meant anything, but it’s not a good look even if they don’t.
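For the record, here’s the arithmetic a pie chart is supposed to respect, with made-up numbers: each wedge’s angle is its share of the total, so bigger numbers always get more pie.

```python
# Made-up results, purely to show the arithmetic a pie chart should follow
results = {"Yes": 79, "No": 21}

total = sum(results.values())
for label, value in results.items():
    share = value / total
    angle = 360 * share              # bigger number -> proportionally bigger wedge
    print(f"{label}: {share:.0%} of responses -> {angle:.0f} degrees of pie")
```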

[Image: the Herald-Sun pie chart in question]

From the Herald-Sun, via Juha Saarinen on Twitter.

[Also: could a cynical reader perhaps think the question was a bit slanted?]

[Update: this looks like exactly the same pie chart they used for 56% vs 44% last month]