Posts filed under Surveys (188)

September 9, 2012

Provenance-free survey data

Stuff reports that 97 per cent of those aged 18-29 are saving for something.  This is attributed to RaboBank:

The myth of feckless youth which does not understand the value of money should be laid to rest, new research from online deposit bank RaboDirect suggests.

and you now know everything they’re prepared to admit about how the survey was done.

There isn’t a press release up at RaboBank yet, but the previous press release (from May) is about another survey. It also says nothing about methodology.  There is a link to a PDF, which might have had more details, but it has been taken down.

With more searching I found a RaboBank press release from 2010 that did still have its ‘full survey results’ PDF link up — it has lots of colourful bar charts, and while it still doesn’t say anything about how the data were collected, the footnotes to the graphs include sample sizes and a little magenta logo that suggests the data were collected by market research firm TNS, probably via some sort of online panel.  So that RaboBank survey wasn’t totally bogus — there’s disagreement about the accuracy of this approach, but at least it was trying to get the right answer.

So, it could be that the youth saving survey is also plausible. Or not. Hard to tell, really. And that’s before you dig into what “saving for something” might mean — is it actual money-in-the-bank or a New Year’s resolution?

August 29, 2012

Surveys again

Last Wednesday I wrote that a survey should always provoke the questions “How was it done? By whom? And what are they selling?”

This week, Stuff and TVNZ tell us that half of New Zealanders don’t have a will, and that there are more wills in the South Island than the North Island.  This survey was commissioned by Public Trust (who do wills), but there’s no information about how the survey was done or by whom.  Public Trust don’t currently have any information on their ‘News’ webpage.

An interesting coincidence, but probably no more: at the moment, the proportion saying they have a will on the clicky poll on the Stuff website is 47%.

August 24, 2012

One in three, one in six, one in eight?

The Herald says

One in three New Zealanders have been harmed by their own alcohol drinking, a survey shows.

The survey, published in the New Zealand Medical Journal, found 33.8 per cent of current drinkers reported they had been adversely affected in the past year.

This is actually true: there is such a survey, it is published in the NZMJ, and it’s a real survey (the response rate isn’t all one could wish, but it’s not out of line with other major health surveys).  The story goes on to say that the harm was reported more often by men.

The interesting point is that the current issue of NZMJ has two articles about frequency of harm from drinking.  The other one says, also based on real survey data,

The prevalence of self-reported harm from others’ drinking was higher than harm from own drinking (18% vs 12% in the past year) and was higher in women and young people.

And this second survey isn’t pro-alcohol in the slightest — for example, it quotes the proportion of criminal offenders who had been drinking as if it were the proportion of crimes resulting from alcohol.  This is a huge overestimate, precisely because there is so much drinking in NZ: people who are drunk when arrested in the major cities tend to be just as drunk about three times a week on average (based on Massey University research we’ve mentioned before), so their drinking accompanies offending without necessarily causing it.

The timing of these articles is perhaps not a coincidence, but in fact there’s pretty wide consensus about the harms from drinking: the real disagreement is about the costs of various sorts of regulation.

August 23, 2012

How was the survey done, by whom, and what are they selling?

Those are the basic questions you need to ask when reading, for example, that “almost half of Australian workers would rather quit a job than deal with office tension.”

Unfortunately, in this case the media release doesn’t answer the first question, and its answer to the second and third questions is

The 2012 R U OK? Australian Workplace Relationships Survey was developed in partnership with the Centre for Corporate Health, one of Australia’s leading workplace mental health service providers. 

Now, workplace mental health is an important issue, and for all I know it may be seriously neglected in Australia (I last worked there in 1995, but it was in a medical research centre, which probably isn’t typical).   That doesn’t mean mental-health advocates get a free pass on data quality — good facts matter more for important issues than for, say, whether we should wear pyjamas in the street.

The Centre for Corporate Health isn’t that enthusiastic about telling us how the survey was done: you need to subscribe to their research news to get it, and even then all they say is

The survey was distributed via online and paper-based to people across a wide range of Australian workplaces including both public and private sectors with all industry groups being represented.  The survey was open for a four week period. The total number of survey respondents was 1554.

which is not encouraging either in what it says or what it omits.  As far as I can tell, there’s no reason to believe the numbers are good estimates of anything about Australian employees as a whole.

The reason that organizations do these surveys seems to largely be marketing: having things that look (from the right angle, in poor light) like statistics makes it easier to get media coverage.  It shouldn’t: if we’re covering West Island business or health news, “R U OK Day” is important enough to be in the papers without needing unconvincing numerical camouflage to get it past the gate.

August 21, 2012

Measuring what you care about

The Herald is reporting on Auckland Transport’s monthly report (you can find the reports here).  One of the recurring surprises in these reports is how high the punctuality figures are, and the Herald comments on these for some of the train services.

The bus punctuality statistics are even shinier


As a regular bus commuter it’s hard to imagine how these could be correct — and if they were, there would be no need for the real-time bus predictions, since the timetables would be more accurate than the predictions.

The solution is in the fine print: “Service punctuality for July 2012 was 99.24%, measured by the percentage of services which commence the journey within 5 minutes of the timetabled start time and reach their destination”.

Or, to quote Lewis Carroll’s Humpty Dumpty: “When I use a word, it means just what I choose it to mean — neither more nor less.”

August 16, 2012

Exactly 100 million pi (roughly)

The US Census Bureau estimates that the population of the United States reached π×100 million on Tuesday afternoon (US time): 314,159,265. (via Stuff)

There’s no margin of error with this estimate, which might seem surprising from a respected national statistics agency.  The reason is that there is no sampling error in the estimate; all the uncertainty is from non-sampling errors.   The Census Bureau started with the 2010 US Census counts, subtracted deaths and emigrations (me, for example) and added births and immigrations.  In principle, the data on all of these are complete and no sampling is used.  That doesn’t mean there isn’t any error — far from it — but it does make it very hard to estimate how much error there is.

The ubiquity of non-sampling error, and the impossibility of estimating it accurately, explain why surveys in New Zealand are about the same size as surveys in the USA, despite the huge difference in population.   In theory, you could afford to collect larger samples in the US, so US statistical agencies could get more precise estimates than Stats New Zealand can afford.  In practice, once surveys get to a certain size, the non-sampling error starts to be more important than the sampling error, and extra sample size stops giving you much increase in accuracy.
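To see the arithmetic behind this, here’s a minimal sketch (the 1,000-person sample and the rounded population figures are illustrative assumptions, not from any particular survey): with simple random sampling, the margin of error depends almost entirely on the sample size, because the finite-population correction is essentially 1 for any large population.

```python
import math

def margin_of_error(n, population, p=0.5):
    """Worst-case 95% margin of error for a simple random sample of size n,
    including the finite-population correction (which barely matters here)."""
    fpc = math.sqrt((population - n) / (population - 1))
    return 1.96 * math.sqrt(p * (1 - p) / n) * fpc

# The same 1,000-person sample in two very different populations:
nz = margin_of_error(1000, 4_400_000)    # New Zealand
us = margin_of_error(1000, 314_000_000)  # United States
print(round(nz * 100, 2), round(us * 100, 2))  # both about 3.1 points
```

The two margins of error agree to four decimal places, which is why a US survey doesn’t need to be seventy times the size of a NZ one.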

August 15, 2012

Is that a kiwi in your pocket?

Stuff has a story based on the latest release from the “Mega Kiwi Sex Survey”, covering all sorts of headline-worthy topics such as infidelity rates, and major turn-ons and turn-offs.

Conducting an accurate and reliable sex survey is difficult: both in taking the sample and in persuading people to give honest answers.  It’s much easier not to bother with all that.  We’ve commented adversely on the Durex Sex Survey before, but that at least made real attempts to get a representative sample.  Durex hired Harris Interactive, who are probably the leaders in trying to get reliable data out of online surveys.

The Mega Kiwi survey, not so much.  The producers say (update: most links here NSFW)

Fitzgerald says they are aiming to capture data from a broad cross section of New Zealanders “We’re working with a couple of key partners to get their audiences to take part, but we’re really trying to build a complete picture of the New Zealand sexual identity, so whether you’re a 60-year-old male in Eketahuna or a 22-year-old female in Ponsonby, we want to hear from you.”

and that’s how they went about it. A little Googling finds some of the links to the online survey form.  It’s a typical bogus poll.

Even if it were a real poll, the sample size of 1500 wouldn’t justify quoting results to the nearest tenth of a percentage point, since the margin of error would be about 2.5%.  In fact, the news release gives numbers to a tenth of a percent even for Gisborne.  Gisborne has about 1% of the Kiwi population, so a representative sample would have just 15 Gisborne respondents.
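For reference, the worst-case margin of error is easy to compute (a sketch; the formula assumes simple random sampling, which a bogus poll doesn’t have anyway):

```python
import math

def moe(n, p=0.5):
    """Worst-case 95% margin of error for a simple random sample of size n."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

print(round(moe(1500) * 100, 1))  # 2.5 -- the nationwide margin of error
print(round(moe(15) * 100, 1))    # 25.3 -- for 15 Gisborne respondents
```

A 25-point margin of error makes tenth-of-a-percent regional figures fairly silly.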

Still, the real point of the survey is to get news coverage rather than having to pay for advertising.  It seems to have worked.

August 6, 2012

Incompetent Australians?

Stuff reports

Lost receipts are costing Australian taxpayers about A$7.3 billion (NZ$9.4b) in total, or about A$1,000 each, according to a Commonwealth Bank survey.

The story in The Australian goes on to mention that Commonwealth Bank is introducing a product to help, so this is basically an advertising press release.  I can’t find out whether the survey is a real survey or some sort of bogus poll (there’s nothing on the Commonwealth Bank media releases page, for example), but there’s clearly something strange about the figures.  If you divide $7.3 billion by $1000, you get 7.3 million.  If you do the same calculations for the time spent looking for receipts, you get about the same figure.  But there are about 12 million Australians who lodge individual tax returns (14.6 million tax returns, 84.7% for individuals), so these figures don’t seem to add up.
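The back-of-the-envelope check is straightforward, using only the figures quoted above:

```python
# Figures from the story and the tax-return statistics:
implied_taxpayers = 7.3e9 / 1000    # A$7.3 billion total / A$1,000 each
individual_filers = 14.6e6 * 0.847  # individual returns among 14.6m lodged

print(implied_taxpayers / 1e6)            # 7.3 million implied by the survey
print(round(individual_filers / 1e6, 1))  # 12.4 million actual filers
```

Either the “total” or the “each” figure (or both) has to be wrong for the numbers to describe the same group of taxpayers.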

[Update (28/8): the media release is up now, but it doesn’t clarify much.  The description suggests this is a bogus poll with reweighting to Census totals, but that doesn’t explain the discrepancy with actual tax returns]

August 5, 2012

One-third of who?

The lead in an otherwise reasonable story about a large employee survey in the Herald today is

Just one-third of New Zealand employees are currently working to their full potential.

If you go and look at the report (the first PDF link on this page), you find that the survey says it’s a stratified random sample, matched on organisation size, and then goes on to say that 93% of respondents “were from private organisations employing 50 or more people”.  A little work with StatsNZ’s business demography tables shows that about 57% of NZ employees work for organisations employing 50 or more people, and when you remove the public-sector employees from the numerator you get down to 42%.  The survey must have specifically targeted people working for large private organisations. Which is fine, as long as you know that and don’t just say “NZ employees”.

Also, the link between “working to their full potential” and what was actually measured is not all that tight.  The 33% is the proportion of respondents who are “engaged”, which means responding in the top two categories of a five-point scale on all eight questions targeting “job engagement” and “organisational engagement”.

Although it’s harder to interpret actual numerical values, since the company seems to use consistent methodology, changes since the last survey really are interpretable (bearing in mind a margin of error for change of around 3%).  And if you bear in mind that the survey was done by people who are trying to sell managers their services, and read the report with a skeptical eye to what was actually measured, it might even be useful.
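For the curious, the margin of error for a change combines the sampling error of both surveys. A sketch, in which the 2,000-per-survey sample size is an assumption chosen for illustration (the report doesn’t state it):

```python
import math

def moe_change(p1, p2, n1, n2):
    """95% margin of error for the change between two independent surveys."""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return 1.96 * se

# Two surveys each finding about 33% 'engaged', with assumed n = 2000 each:
print(round(moe_change(0.33, 0.33, 2000, 2000) * 100, 1))  # 2.9 points
```

Note the change is noisier than either single estimate, since two sampling errors contribute to it.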


July 31, 2012

Commuter survey ‘can’t be trusted’ – statistician

This just in from National Business Review – thanks to journo Caleb Allison and  NBR Online for giving us permission to upload the content, which sits behind a pay wall.

Commuter survey ‘can’t be trusted’ – statistician 

A statistician questions the validity of a survey promoting flexible working conditions for employees.

A recent survey by Regus – which describes itself as “the world’s largest provider of flexible workspaces” – said 67% of New Zealand employees would spend more time with family if they had a shorter commute as a result of flexible working conditions.

While the survey claimed to have polled more than 16,000 people in 80 countries, NBR ONLINE can reveal the company polled just 54 people in New Zealand.

Auckland University’s Dr Andrew Balemi says while it is a very low number, that alone does not suggest the poll is dodgy.

“Most people obsess about the sample size, but what I obsess about is the sample quality,” Dr Balemi says.

The only way to know if the information is credible is to know how the company undertook the survey.

However, the methodology was not included with the poll.

Dr Balemi says not only does this survey have a small sample size, it doesn’t tell the reader how it was obtained.

“In the absence of any explanation of how they’ve collected the data I wouldn’t trust this information.

“If they can’t even do that, I wouldn’t dignify it with any more consideration.”

He says the company may have a valid methodology and the poll could be worthy, but they should have included that information in the survey.

This follows another recent example of dodgy polling by the Auckland Council.

A press release claiming 63% of Aucklanders favour mayor Len Brown’s city rail loop turned out to have surveyed only 112 people.
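As a postscript to the NBR piece: even if those 112 people had been a perfect random sample, the uncertainty around that 63% would be substantial (a quick check, using only the figures in the press release):

```python
import math

n, p = 112, 0.63  # respondents and reported support for the rail loop
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(round(moe * 100))  # 9 -- about nine percentage points either way
```

And that’s the best case; for a self-selected sample the true uncertainty is unknowable.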