Posts written by Rachel Cunliffe (512)


Rachel Cunliffe is the co-director of CensusAtSchool and currently consults for the Department of Statistics. Her interests include statistical literacy, social media and blogging.

December 6, 2011

The Magic of Numbers

Here’s the final poster we sent out to schools around the country this week:

The other posters sent out may be found here and here.

December 5, 2011

Stat of the Week Winner: November 26 – December 2 2011

Thanks for the two nominations for last week’s Stat of the Week competition.

Both were about They Work For You‘s statistical analysis, summarised in a graphic, of how the various political parties voted during the last government:

As both nominations were for the same story (and it’s quite interesting), we have decided to award the prize jointly.

It would have been interesting to see a more subtle analysis interpreting the types of bills (and their signs in the PCA) driving the two principal components.

It is not very interesting that Labour and Progressive vote together, but what is separating them from the Greens and from National?
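For anyone curious what that kind of follow-up would involve, here is a minimal sketch in Python. The party-by-bill vote matrix is entirely invented for illustration (the party names are real, the bills and votes are not); the point is simply that the loadings show which bills, and with what sign, drive the first two principal components.

```python
import numpy as np

# Hypothetical party-by-bill vote matrix: +1 = for, -1 = against.
# The bills and votes are invented purely for illustration.
parties = ["Labour", "Progressive", "Greens", "National", "ACT"]
bills = ["Bill A", "Bill B", "Bill C", "Bill D", "Bill E", "Bill F"]
votes = np.array([
    [ 1,  1, -1,  1, -1,  1],   # Labour
    [ 1,  1, -1,  1, -1,  1],   # Progressive (votes with Labour)
    [ 1, -1, -1,  1,  1,  1],   # Greens
    [-1,  1,  1, -1, -1, -1],   # National
    [-1,  1,  1, -1, -1, -1],   # ACT
], dtype=float)

# Centre each bill (column), then run PCA via the singular value decomposition.
centred = votes - votes.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)

scores = U[:, :2] * s[:2]      # where each party sits on PC1 and PC2
loadings = Vt[:2].T            # each bill's contribution (and sign) on PC1 and PC2

for party, (pc1, pc2) in zip(parties, scores):
    print(f"{party:12s} PC1 = {pc1:+.2f}  PC2 = {pc2:+.2f}")
print()
for bill, (l1, l2) in zip(bills, loadings):
    print(f"{bill:8s} loading on PC1 = {l1:+.2f}, PC2 = {l2:+.2f}")
```

With real voting records in place of the made-up matrix, the bills with the largest (positive or negative) loadings are the ones separating the blocs along each component.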

December 2, 2011

Statistics: Helping you understand our world

It’s that time of year again when students all around the country are figuring out what they’re going to do next year.

Our department has produced a series of posters for careers advisors showing where you can end up working thanks to statistics.  There are some fascinating stories in there – from astrostatistics to sports statistics, from wind energy research to insurance.

Go see the posters (PDF)

 

November 26, 2011

Election Night Graphics

I’ve posted this over on Throng as well, but thought I’d add it here too:

I just saw this graph on TVNZ’s election coverage:

Putting aside the issue that a perspective graph makes comparisons harder, there’s something wrong with this graph: the bar for informal votes (labelled INF) is not correct.

The correct graph should look like this:

Update: there are faint notches in their graphic indicating the “0” mark; however, the distinction is not clear enough.
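For what it’s worth, a plain two-dimensional bar chart with an explicit zero baseline avoids the perspective distortion and leaves no doubt about where zero sits. A minimal sketch in Python/matplotlib, using invented percentages rather than the actual results:

```python
import matplotlib.pyplot as plt

# Invented vote shares purely for illustration; not the actual 2011 results.
parties = ["NAT", "LAB", "GRN", "NZF", "INF"]
percent = [48, 27, 11, 7, 1]

fig, ax = plt.subplots()
ax.bar(parties, percent)
ax.set_ylabel("Share of votes (%)")
ax.set_ylim(bottom=0)   # the baseline is unambiguously zero
ax.set_title("Election night results (illustrative data)")
plt.show()
```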

November 14, 2011

The Nocebo Effect

Everyone has heard of the placebo effect, but a lesser known effect is its counterpart, the so-called nocebo effect, which Wikipedia describes as:

Harmful, unpleasant, or undesirable effects a subject manifests after receiving an inert dummy drug or placebo. Nocebo responses are not chemically generated and are due only to the subject’s pessimistic belief and expectation that the inert drug will produce negative consequences.

In today’s Guardian paper, they’ve printed an edited version of Penny Sarchet’s winning essay on nocebo research in the Wellcome Trust science writing competition. Well worth a read.

November 8, 2011

Political poll with sample size of 47 makes headlines

David Farrar of Kiwiblog criticises a story in the Herald which says:

John Banks has some support in the wealthy suburb of Remuera, but is less popular on the liberal fringes of the Epsom electorate, according to a Herald street survey.

A poll of 47 Epsom voters yesterday found the National candidate ahead of Act’s Mr Banks by 22 votes to 20.

Farrar correctly points out that the poll is in no way random (i.e. is not scientific), and goes on to say:

But even if you overlook the fact it is a street poll, the sample size is ridiculously low. The margin of error is 14.7%! I generally regard 300 as the minimum acceptable for an electorate poll. That gives a 5.8% margin of error. A sample of 47 is close to useless.
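If you want to check figures like these yourself: the margin of error quoted for polls is usually the worst-case (p = 0.5) value for a simple random sample at roughly 95% confidence, often approximated by the 1/√n rule of thumb. A quick sketch using the sample sizes mentioned on this page (and, of course, none of this rescues a non-random street poll):

```python
from math import sqrt

def max_margin_of_error(n, z=1.96):
    """Maximum (worst-case, p = 0.5) margin of error at ~95% confidence
    for a simple random sample of size n."""
    return z * sqrt(0.25 / n)

for n in (47, 300, 1002):
    print(f"n = {n:4d}: exact formula {max_margin_of_error(n):.1%}, "
          f"1/sqrt(n) rule of thumb {1 / sqrt(n):.1%}")
```

A sample of 47 gives a margin of error of roughly 14–15%, 300 gives about 5.8%, and 1002 (the size of the Digipoll discussed further down this page) about 3.1%.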

October 11, 2011

The Best Statistics Blogs of 2011

Here’s a great collection of 50 statistics blogs which were chosen, amongst other factors, for being nice and accessible to non-statisticians. We also have a list of blogs in our sidebar which has many in common with theirs.

(Of course, we’d love to see Stats Chat included in their list next year! Help us spread the word too.)

October 3, 2011

“Unexpected results of a new poll”?

Kiwiblog’s David Farrar has nominated 3 News for the most misleading story of the week in their reporting of a political poll because their story does not mention that the poll was only a sample of Maori voters, not a sample of all voters:

“Labour most popular party in new poll…

Labour leader Phil Goff will be clinging to the unexpected results of a new poll in which his party has picked up twice as much support as National.

But he is well behind John Key in the preferred prime minister stakes, according to the TVNZ Marae Investigates Digipoll, released today.

Labour’s on 38.4 percent support in the poll, followed by the Maori Party on 22.2 percent, while National’s on just 16.4 percent. That is in stark contrast to other media polls, which put National above 50 percent support, with Labour rating at 30 percent or less, and the Maori Party on around one percent support.

…The TVNZ poll interviewed 1002 respondents between August 19 and September 20, and has a margin of error of +/- 3.1 percent.”

The original press release from TVNZ does state this very clearly:

Full release of Digipoll Maori Voter Survey… The TVNZ Marae Investigates Digipoll is one of the most established voter polls in NZ and often the only one to survey Maori voters in an election year.

In a further 3 News article they discuss a different poll and say that “the poll differs greatly to one released by TVNZ’s Marae Investigates earlier today” without explanation for the difference.

UPDATE: 3 News have now updated the headline to: “Labour most popular party among Maori” and added “The TVNZ Marae Investigates Digipoll surveyed Maori listed on both the general and Maori electoral rolls.”

3 News’ Chief Editor James Murray apologised on Kiwiblog:

“Got to put our hands up to a genuine mistake there. This was a story from our wire service, and we didn’t do our due diligence in fact-checking it.

We absolutely understand the importance of getting this right, and the story has now been corrected. My team have been told to be extra vigilant on poll stories in future and NZN have been informed of the error.

Apologies for anyone who may have been misled by this mistake.”

September 5, 2011

How many people will be at the Rugby World Cup’s Opening Night celebrations in Auckland?

Expect to hear a lot of big numbers thrown about during the Rugby World Cup – and not just about those on the field.

When Auckland hosts the Rugby World Cup Opening Night Celebrations this Friday, people will want to know how many turned up. The success of the event (and justification for the expenditure) will be measured in part by estimated crowd size.

Crowd estimation is often not at all scientific. During tonight’s ONE News bulletin, reporter Jack Tame estimated there were “five or ten thousand delirious Tongan fans”.

How are crowd estimate figures obtained and how reliable are they?

The authors of a new study published in Significance, the magazine of the Royal Statistical Society and the American Statistical Association, claim that most crowd estimates are unreliable and that the public should view them with scepticism:

“In the absence of any accurate estimation methods, the public are left with a view of the truth coloured by the beliefs of the people making the estimates,” claims Professor Paul Yip, of the University of Hong Kong, one of the authors of the study.

“It is important to rectify the myth of counting people. The public would be better served by estimates less open to political bias. Our study shows that crowd estimates with a margin of error of less than 10% can be achieved with the proposed method.”
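The excerpt doesn’t spell out the proposed method, but the general flavour of density-based crowd estimation is easy to illustrate: count heads in a few small areas of known size, average the density, and scale up to the whole venue. The sketch below is a generic illustration with invented numbers, not the method from the Significance paper:

```python
import statistics

def estimate_crowd(cell_counts, cell_area_m2, total_area_m2):
    """Density-sampling estimate: average the headcounts observed in a handful
    of cells of known area, then scale that density up to the whole venue.
    Generic illustration only, not the study's actual procedure."""
    density = statistics.mean(cell_counts) / cell_area_m2   # people per square metre
    return density * total_area_m2

# Invented numbers: headcounts from ten 5 m x 5 m cells (25 m^2 each)
# at a venue of 30,000 m^2.
observed = [61, 48, 55, 70, 43, 66, 52, 58, 49, 63]
print(round(estimate_crowd(observed, cell_area_m2=25, total_area_m2=30_000)))
```

The reliability of any such estimate depends on how representative the sampled cells are, which is presumably where the study’s proposed improvements come in.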


August 16, 2011

If the world lived in one city, how large would that city be?

Per Square Mile is a fascinating blog about population density with plenty of research, statistics, graphics and food for thought.

See how much space the people of the world would fit in if they lived in one big city which was as densely populated as cities such as London, Paris or New York:
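The arithmetic behind those maps is simple: divide the world’s population by a city’s population density to get the area required. A rough version, assuming a 2011 world population of about 7 billion and approximate (rounded, city-proper) densities:

```python
WORLD_POPULATION = 7.0e9   # roughly the 2011 world population

# Approximate city-proper densities in people per square kilometre
# (rounded, illustrative figures).
densities = {
    "Paris": 21_000,
    "New York": 10_400,
    "London": 5_200,
}

for city, density in densities.items():
    area_km2 = WORLD_POPULATION / density
    print(f"At {city}'s density: about {area_km2:,.0f} km^2 "
          f"(a square roughly {area_km2 ** 0.5:,.0f} km on a side)")
```

At Paris-like densities the whole world fits in a few hundred thousand square kilometres; at London-like densities it needs several times that.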