July 9, 2012

Book review: Thinking, Fast and Slow

Daniel Kahneman and Amos Tversky made huge contributions to our understanding of why we are so bad at prediction.  Kahneman won a Nobel Prize[*] for this in 2002 (Tversky failed to satisfy the secondary requirement of still being alive).  Kahneman has now written a book, Thinking, Fast and Slow, about their research.  Unlike some of his previous writing, this book is designed to be shelved in the Business/Management section of bookshops and read by people who might otherwise be looking for their cheese.

The “Fast” and “Slow” of the title are two systems of thought: the rapid preconscious judgement that we use for most of our decision-making, and the conscious and deliberate evaluation of alternatives and probabilities that we like to believe we use.   The “Fast” system relies very heavily on stereotyping — finding the best match for a situation in a library of stories — and so is subject to predictable and exploitable biases.  The “Slow” system can be trained to do much better, but only if we can force it to be used.

A dramatic example of the sort of mischief the “fast” system can get up to is anchoring bias.  Suppose you ask a bunch of people how many UN-member countries are in Africa.  You will get a range of guesses, probably not very accurate, and perhaps a few people who actually know the answer.  Suppose you had first asked people to write down the last two digits of their telephone number, or to spin a roulette wheel and write down the number it lands on, and then to guess how many countries there are in Africa.  Empirically, across a range of situations like this, there is a strong correlation between the obviously irrelevant first number and the guess.  This is an outrageous finding, but it is very well confirmed.  It’s one of the reasons that bogus polls are harmful even if you know they are bogus.
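
The mechanism is easy to see in a toy simulation (my sketch, not Kahneman and Tversky’s data: the strength of the “pull” toward the anchor and the noise in people’s beliefs are both made up).  Each simulated respondent forms a noisy private belief about the answer and then drifts part of the way toward an irrelevant anchor:

```python
import random

random.seed(1)

TRUE_ANSWER = 54   # UN member states in Africa (54 at the time of writing)
N = 1000           # simulated respondents
PULL = 0.3         # assumed: guesses close 30% of the gap toward the anchor

anchors, guesses = [], []
for _ in range(N):
    anchor = random.randint(0, 99)             # e.g. last two digits of a phone number
    belief = random.gauss(TRUE_ANSWER, 15)     # respondent's noisy private belief
    guess = belief + PULL * (anchor - belief)  # the guess drifts toward the anchor
    anchors.append(anchor)
    guesses.append(guess)

def corr(x, y):
    """Pearson correlation, written out to keep the sketch dependency-free."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

print("correlation(anchor, guess) =", round(corr(anchors, guesses), 2))
```

With these made-up settings the correlation comes out around 0.6, even though the anchor carries no information whatsoever about Africa.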

Kahneman gives many other examples of cognitive illusions generated by the “fast” system of the mind.  As with optical illusions, they don’t lose their intuitive force when you understand them, but you can learn not to trust your intuition in situations where it’s going to be biased.

One minor omission of the book is that there’s not much explanation of why we are so stupid: Kahneman points out, and documents, that thinking uses up blood sugar and is biologically expensive, but that doesn’t explain why the mistakes we make are so simple.  Research in computer science and philosophy, by people actually trying to implement thinking, suggests one possible answer, under the general name of “the frame problem”.  We know an enormous number of facts and relationships between them, and we cannot afford to investigate the logical consequences of all these facts when trying to make a decision.  The price of tea in China really is irrelevant to most decisions, but not to decisions about tea purchases, or about souvenir purchases when in Beijing, or to living-wage levels in Fujian.  We need some way of ignoring the price of tea in China, and millions of other facts, except very occasionally when they are relevant, without having to deduce their irrelevance each time.  Not surprisingly, such a shortcut sometimes misfires and treats information as important when it is actually irrelevant.
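
To make that shortcut concrete, here is a deliberately crude sketch (the facts, tags, and decisions are all invented, and this illustrates the idea rather than solving the frame problem): index facts by topic and consult only the facts whose tags overlap the current decision, so that irrelevance is assumed by default rather than deduced.

```python
# Toy relevance filter: facts are indexed by topic tags, and a decision
# consults only the facts sharing a tag with it. Nothing is ever proved
# irrelevant; anything the index fails to match is simply ignored.
FACTS = {
    "price of tea in China":  {"tea", "china", "prices"},
    "living wage in Fujian":  {"china", "wages"},
    "Auckland bus timetable": {"transport", "auckland"},
}

def consulted(decision_tags):
    return [fact for fact, tags in FACTS.items() if tags & decision_tags]

print(consulted({"tea", "shopping"}))  # buying tea: the tea price is consulted, rightly
print(consulted({"transport"}))        # catching a bus: the tea price is ignored at no cost
print(consulted({"china", "food"}))    # the misfire: dinner at a Chinese restaurant drags
                                       # in tea prices and Fujian wages whether or not they matter
```

The cheapness is the whole point: because nothing is ever proved irrelevant, the filter is fast, and for exactly the same reason it sometimes treats irrelevant facts as important.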

Read this book.  It might help you think better, and at least will give you better excuses for your mistakes.

* to quote Daniel Davies: “blah blah blah Sveriges Riksbank. Nobody cares, you know.”

Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient.

Comments

  • Graham White

    The comment which I like best comes from page 77 of the book and reads:

    “The prominence of causal intuitions is a recurrent theme in this book because people are prone to apply causal thinking inappropriately, to situations that require statistical reasoning… System 2 (viz., thinking slow) can learn to think statistically but few people receive the necessary training.”

    11 years ago