Posts filed under Risk (222)

May 8, 2014

Who’s afraid of the NSA?

Two tweets in my timeline this morning linked to this report about this research paper, saying “Americans have stopped searching on forbidden words”.

That’s a wild exaggeration, but what the research found was interesting. They looked at Google Trends search data for words and phrases that might be privacy-related in various ways: for example, searches that might be of interest to the US government security apparat, or searches that might be embarrassing if a friend knew about them.

In the US (but not in other countries) there was a small but definite change in searches at around the time of Edward Snowden’s NSA revelations. Search volume in general kept increasing, but searches on words that might be of interest to the government decreased slightly.


The data suggest that some people in the US became concerned that the NSA might care about them. Since there presumably aren’t enough terrorists in the US to explain the difference, knowing about the NSA surveillance seems to be having an effect on the political behaviour of (a subset of) ordinary Americans.

There is a complication, though. A similar fall was seen in the other categories of privacy-sensitive data, so either the real answer is something different, or people are worried about the NSA seeing their searches for porn.
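The comparison the researchers made is essentially a difference-in-differences: how much did searches on sensitive terms change across the revelations, over and above the change in searches generally? Purely to illustrate the arithmetic, here’s a sketch with entirely made-up weekly search-index numbers (not the study’s data):

```python
# Difference-in-differences sketch with hypothetical Google-Trends-style
# weekly indices (0-100), before and after June 2013 (the Snowden story).
# All numbers are invented, for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical weekly indices for "government-sensitive" search terms
sensitive_before = [62, 64, 63, 65, 66, 64]
sensitive_after  = [60, 59, 61, 58, 60, 59]

# Hypothetical control terms (search volume in general kept increasing)
control_before = [70, 71, 72, 73, 74, 75]
control_after  = [76, 77, 78, 79, 80, 81]

# Change within each group, then the difference-in-differences:
# how much more (or less) the sensitive terms changed than the controls.
sensitive_change = mean(sensitive_after) - mean(sensitive_before)
control_change = mean(control_after) - mean(control_before)
did = sensitive_change - control_change

print(f"sensitive change: {sensitive_change:+.1f}")
print(f"control change:   {control_change:+.1f}")
print(f"difference-in-differences: {did:+.1f}")
```

The point of subtracting the control change is exactly the complication in the post: if *all* privacy-sensitive categories fell, the controls don’t isolate a government-surveillance effect.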

May 6, 2014

Personalised medicine: all the screenings

This piece from the Vancouver Sun exaggerates the current level of usefulness of genetic tests, but is spot on about the problems of scale:

“As a diagnostic tool, personal genomics is invaluable for selecting therapies, but this whole screening issue opens up another can of worms,” said Lynd. “With the new economies of scale … it is just as easy to look for everything as it is to look for the one thing you need to know.”

Every cancer patient sent for a full genome analysis to determine which variant of breast cancer she has could potentially become a patient for any or all of the other diseases indicated on her genome, and the subject of a whole series of expensive tests to disprove the presence of an illness.

May 3, 2014

Optimising for the wrong goal

From Cathy O’Neil, at mathbabe.org:

By contrast, let’s think about how most big data models work. They take historical information about successes and failures and automate them – rather than challenging their past definition of success, and making it deliberately fair, they are if anything codifying their discriminatory practices in code.

That is, data mining approaches to making decisions are blind to the prejudices attached to characteristics such as gender, ethnicity, and age, so they will readily look at historical data, note that women (gays, immigrants, Māori, older people) haven’t been successful in positions like this in the past, and fail to wonder why.

White House report: ‘Big Data’

There’s a new report, “Big Data: Seizing Opportunities, Preserving Values”, from the Office of the President (of the USA). Here’s part of the conclusion (there are detailed recommendations as well):

Big data tools offer astonishing and powerful opportunities to unlock previously inaccessible insights from new and existing data sets. Big data can fuel developments and discoveries in health care and education, in agriculture and energy use, and in how businesses organize their supply chains and monitor their equipment. Big data holds the potential to streamline the provision of public services, increase the efficient use of taxpayer dollars at every level of government, and substantially strengthen national security. The promise of big data requires government data be viewed as a national resource and be responsibly made available to those who can derive social value from it. It also presents the opportunity to shape the next generation of computational tools and technologies that will in turn drive further innovation.

Big data also introduces many quandaries. By their very nature, many of the sensor technologies deployed on our phones and in our homes, offices, and on lampposts and rooftops across our cities are collecting more and more information. Continuing advances in analytics provide incentives to collect as much data as possible not only for today’s uses but also for potential later uses. Technologically speaking, this is driving data collection to become functionally ubiquitous and permanent, allowing the digital traces we leave behind to be collected, analyzed, and assembled to reveal a surprising number of things about ourselves and our lives. These developments challenge longstanding notions of privacy and raise questions about the “notice and consent” framework, by which a user gives initial permission for their data to be collected. But these trends need not prevent creating ways for people to participate in the treatment and management of their information.

You can also read comments on the report by danah boyd, and the conference report and videos from her conference ‘The Social, Cultural & Ethical Dimensions of “Big Data”’ are now online.

May 2, 2014

Animal testing

Labour want to prevent animal testing of legal highs. That’s a reasonable position. They are quoted by the Herald as saying “there is no ethical basis for testing legal highs on animals”. That’s a completely unreasonable position: testing on animals prevents harm to humans, and the fact you don’t agree with something doesn’t mean it lacks an ethical basis.

More important is their proposed legislation on this issue, with the key clause

“Notwithstanding anything in the Psychoactive Substances Act 2013, no animal shall be used in research or testing for the purpose of gaining approval for any psychoactive substance as defined in section 9 of the Psychoactive Substances Act 2013.”

Assuming that the testing is done overseas, which seems to be National’s expectation, this legislation wouldn’t prevent animal use in testing.  The time when a drug dealer would want to use animals in testing is for initial toxicity: does the new drug cause liver or kidney damage, or have obvious long-term neurological effects that might reduce your customer base unduly?  The animal data wouldn’t be sufficient on their own, because there’s some variation between species, especially in side-effects mediated by the immune system (don’t Google “Stevens-Johnson syndrome” while you’re eating). But animal data would be relevant, and many plausible candidates for therapeutic medications fail early in development because of this sort of toxicity.

Whether animals were used for toxicity testing or not, it would still be necessary to test in humans to find the appropriate dose and the psychoactive effects in people. Depending on the regulations, it might well also be necessary to test moderate overdoses in humans — especially as it appears most of the adverse effects of the synthetic cannabis products are in people taking quite high doses.  That’s the sort of data that might be required in an application for approval of a psychoactive substance.

Labour’s proposal would mean that the animal test data could not be used for gaining approval, and would also mean that the regulations could not require animal data.  But I can’t see much reason it would discourage someone from using animals in initial toxicity testing, which is the only place animal testing would really be relevant.

April 23, 2014

Citation needed

I couldn’t have put it less clearly myself, but if you follow the link, you do get to one of those tall, skinny totem-pole infographics, and the relevant chunk of it says:

[infographic excerpt]

What it doesn’t do is tell you why they believe this. Neither does anything else on the web page, or, as far as I can tell, the whole set of pages on distracted driving.

A bit of Googling turns up this New York Times story from 2009

The new study, which entailed outfitting the cabs of long-haul trucks with video cameras over 18 months, found that when the drivers texted, their collision risk was 23 times greater than when not texting

That sounds fairly convincing, though the story also mentions that a study of college students using driving simulators found only an 8-fold increase, and notes that texting might well be more dangerous when driving a truck than a car.

The New York Times doesn’t link, but with the name of the principal researcher we can find the research report, and Table 17, on page 44, does indeed include the number 23. There’s a pretty huge margin of error: the 95% confidence interval goes down to 9.7. More importantly, though, the table header says “Likelihood of a Safety-Critical Event”.

A “Safety-Critical Event” could be a crash, but it could also be a near-crash, or a situation where someone else needed to alter their behaviour to avoid a crash, or an unintentional lane change. Of the 4452 “safety-critical events”, 21 were crashes.  There were 31 safety-critical events observed during texting.
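For the flavour of how a “23 times” figure with a confidence interval gets produced, here’s a textbook rate-ratio calculation. The event counts (31 during texting, 4421 otherwise) come from the report as described above, but the exposure times are hypothetical, and the Wald interval below is just the standard classroom method, not the study’s actual analysis (their interval, going down to 9.7, is wider, presumably because it accounts for the study design):

```python
import math

# Rate ratio with a Wald 95% confidence interval on the log scale.
# Event counts as described in the post; driving hours are hypothetical.

events_texting = 31        # safety-critical events while texting (report)
events_other = 4421        # remaining safety-critical events (4452 - 31)
hours_texting = 100.0      # hypothetical hours of observed texting
hours_other = 330_000.0    # hypothetical hours of other driving

rate_texting = events_texting / hours_texting
rate_other = events_other / hours_other
rr = rate_texting / rate_other

# Standard error of log(rate ratio) is approximately sqrt(1/a + 1/b)
se_log_rr = math.sqrt(1 / events_texting + 1 / events_other)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"rate ratio: {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```

The interval is wide mostly because of the `1/31` term: with only 31 events in the texting condition, the estimate is inherently imprecise, which is why the lower end of the study’s interval matters more than the headline 23.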

So, the figure of 23 is not actually for crashes, but it is at least for something relevant, measured carefully.  Texting, as would be pretty obvious, isn’t a good thing to do when you’re driving. And even if you’re totally rad, hip, and cool like the police tweetwallah, it’s ok to link.  Pretend you’re part of the Wikipedia generation or something.

April 2, 2014

Drug use trends

There’s an interesting piece in Stuff about Massey’s Illegal Drug Monitoring System. I’d like to make two points about it.

First, the headline is that synthetic cannabis use is declining. That’s good, but it’s in a survey of frequent users of illegal drugs.  If you have the contacts and willingness to buy illegal drugs, it isn’t surprising that you’d prefer real cannabis to the synthetics — there seems to be pretty universal agreement that the synthetics are less pleasant and more dangerous.  This survey won’t pick up trends in more widespread casual use, or in use by teenagers, which are probably more important.

Second, the study describes the problems caused by much more toxic new substitutes for Ecstasy and LSD. This is one of the arguments for legalisation. On the other hand, they are also finding increased abuse of prescription oxycodone. This phenomenon, much more severe in the US, weakens the legalisation argument somewhat.  Many people (including me) used to believe, based on reasonable evidence, that a substantial fraction of the adverse health impact of opioid addiction was due to the low and unpredictably-varying purity of street drugs, and that pure, standardised drugs would reduce overdoses. As Keith Humphreys describes, this turns out not to be the case.

March 27, 2014

Individual risk and population risk

The Herald and Stuff both have a story about the most dangerous intersections in the country, based on the Ministry of Transport press release. The Herald continues its encouraging new policy of providing the actual data, so we can look in more detail.

The first thing to note is that no intersection in the country appears to have had more than two fatal crashes in ten years, which is better than I would have expected. That’s why crashes involving even minor injuries need to be included in the ranking.

The second issue is the word ‘dangerous’. These 100 intersections are the ones that most need something done to them; they are where the most crashes happen. That’s not the same as the usual use of ‘most dangerous’ — these aren’t the intersections that pose the greatest risk to someone driving through them. The list is from a population or public health viewpoint: these intersections are more dangerous in the same way that dogs are more dangerous than sharks, or flu is more dangerous than meningitis.
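The dogs-versus-sharks distinction is easy to show with numbers. In this toy example (all figures invented), one intersection tops the list by total crashes while a much quieter one is riskier for any individual driver passing through:

```python
# Toy illustration of population risk vs individual risk.
# name: (injury crashes in 10 years, vehicles per day) -- all hypothetical.
intersections = {
    "busy city junction": (45, 60_000),
    "suburban corner":    (12, 4_000),
    "rural crossroads":   (8, 900),
}

# Population view: where do the most crashes happen?
by_total = max(intersections, key=lambda k: intersections[k][0])

# Individual view: where is the risk per vehicle highest?
by_rate = max(intersections,
              key=lambda k: intersections[k][0] / intersections[k][1])

print("most crashes overall:", by_total)
print("highest risk per vehicle:", by_rate)
```

The busy junction “most needs something done to it” in the Ministry’s sense, even though a driver is far more likely to crash on any single trip through the rural crossroads.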

March 25, 2014

An ounce of diagnosis

The Disease Prevention Illusion: a tragedy in five parts, by Hilda Bastian

“An ounce of prevention is worth a pound of cure.” We’ve recognized the false expectations we inflate with the fast and loose use of the word “cure” and usually speak of “treatment” instead. We need to be just as careful with the P-word.

March 18, 2014

Big Data & privacy presentation

If you have time, there’s an interesting event that will be streamed from New York University this (NZ) morning (10:30am today NZ time, 5:30pm yesterday NY time)

…the Data & Society Research Institute, the White House Office of Science and Technology Policy, and New York University’s Information Law Institute will be co-hosting a public event entitled The Social, Cultural, & Ethical Dimensions of “Big Data.” The purpose of this event is to convene key stakeholders and thought leaders from across academia, government, industry, and civil society to examine the social, cultural, and ethical implications of “big data,” with an eye to both the challenges and opportunities presented by the phenomenon.

The event is being organised by danah boyd, who we’ve mentioned a few times and whose new book I plan to write about soon.