Thinking, fast and slow — Daniel Kahneman

My friend David lent me this book after telling me that it had been blowing his mind. I’m not sure if it has blown my mind, but it definitely helped me to understand it a bit better.

Kahneman suggests thinking of the mind as composed of two notional systems: the fast-thinking, intuitive System 1; and the slow, deliberate, accurate but lazy System 2. The interplay between these two results in the amazing, yet often incomprehensible, behaviour of our minds.

There are a lot of ideas in this book, and many of these resonated with me. Here are just a few.

“When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”

Base rate neglect: when presented with evidence for a normally rare occurrence, we often simply evaluate the evidence, while ignoring the rarity of the event. My favourite example is a test for a one-in-a-million disease, where the test is 99.9% accurate. You take the test and it says you have the disease. You may now be worried, but your chance of having the disease is still only 0.1% (that’s one in a thousand). The positive result for the test means you’re now 1000 times more likely to have the disease, but the base rate of 1/1000000 is so low that you’re still almost certainly fine.
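
To make the arithmetic concrete, here's the Bayes' rule calculation behind that example as a little sketch (I'm assuming "99.9% accurate" means both the true positive and true negative rates are 99.9%):

```python
# Bayes' rule for the one-in-a-million disease example.
# Assumption: "99.9% accurate" = sensitivity and specificity are both 0.999.

prior = 1 / 1_000_000        # base rate of the disease
sensitivity = 0.999          # P(positive | disease)
false_positive_rate = 0.001  # P(positive | no disease) = 1 - specificity

# P(positive) by the law of total probability
p_positive = sensitivity * prior + false_positive_rate * (1 - prior)

# P(disease | positive) by Bayes' rule
posterior = sensitivity * prior / p_positive

print(f"Chance of disease given a positive test: {posterior:.4%}")
# Roughly 0.1%, i.e. about one in a thousand, as above.
```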

The idea of regression to the mean is very powerful. It’s a mathematical fact that, given two imperfectly correlated observations, an extreme first observation is likely to be followed by a second one closer to the (mean) average. This pretty much follows from the definition of “mean”. Regression to the mean explains a huge number of otherwise mysterious things, such as why punishment appears to work better than praise, and why intelligent women tend to marry less intelligent men.

However, we tend not to take regression to the mean into account, and it leads to our being surprised when apparent patterns in our observations turn out not to be real.
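
A quick simulation makes the effect visible. This is only a sketch: two noisy scores sharing the same underlying "skill" (correlated at about 0.5 by construction), where the pairs with an extreme first score have a second score that is, on average, much closer to the mean.

```python
import random

random.seed(1)

# Each score = shared "skill" + independent luck, so the two scores
# are positively but imperfectly correlated (correlation of 0.5).
def two_scores():
    skill = random.gauss(0, 1)
    return skill + random.gauss(0, 1), skill + random.gauss(0, 1)

pairs = [two_scores() for _ in range(100_000)]

# Take the pairs whose first score was extreme (top 10%)...
cutoff = sorted(s1 for s1, _ in pairs)[int(0.9 * len(pairs))]
extreme = [(s1, s2) for s1, s2 in pairs if s1 >= cutoff]

avg_first = sum(s1 for s1, _ in extreme) / len(extreme)
avg_second = sum(s2 for _, s2 in extreme) / len(extreme)

# ...and their second score is, on average, only about half as far from
# the overall mean of 0, matching the correlation of 0.5.
print(f"average first score:  {avg_first:.2f}")
print(f"average second score: {avg_second:.2f}")
```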

Kahneman gives a fairly simple way of estimating the expected correlation between events (such as last weekend’s football score and this weekend’s) and thus how to make unbiased predictions. With enough practice you should be able to train your mind to overcome this particular bias.
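
I don't remember his exact recipe, but the gist as I understood it: start from the baseline (the mean), form your intuitive prediction from the evidence, estimate how well the evidence correlates with the outcome, and move that fraction of the way from the baseline towards the intuitive prediction. Something like this (the football numbers are made up):

```python
def regressive_prediction(baseline, intuitive_prediction, correlation):
    """Shrink an intuitive prediction towards the baseline in
    proportion to how weakly the evidence predicts the outcome."""
    return baseline + correlation * (intuitive_prediction - baseline)

# Hypothetical numbers: a team scored 4 goals last weekend, the league
# average is 1.5 goals, and week-to-week scores correlate at about 0.3.
print(regressive_prediction(baseline=1.5, intuitive_prediction=4, correlation=0.3))
# -> 2.25 goals: still above average, but much less extreme than 4.
```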

Hindsight bias and the resulting illusion of understanding lead us to believe we understand a lot more about past events than we really do. Looking back it seems “obvious” that Google or South Korea or Tiger Woods would become so amazingly successful. But in reality it was as unpredictable then as it is obvious now. This is ignored by the purveyors of most business books — Kahneman says, “these stories induce and maintain an illusion of understanding, imparting lessons of little enduring value to readers who are all too eager to believe them.”

Simple formulas are as reliable as complex formulas or even expert judgement in many cases. Kahneman’s example is the Apgar test, a very quick and simple way of assessing a newborn baby’s health. As a rough-and-ready, simple, but broadly accurate measure, it has saved countless lives since its introduction. Simple tests like this are effective because they tend to be accurate enough, and crucially are simple enough that busy people actually use them. Similarly, Atul Gawande describes how checklists can fill the same role in his book The Checklist Manifesto.
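
For flavour, the real Apgar score just rates five signs from 0 to 2 and adds them up; here is a rough sketch of that shape of formula (the clinical criteria themselves are simplified away):

```python
# A sketch of an Apgar-style score: five signs, each rated 0-2, summed.
# The sign names are the real Apgar criteria; the scoring details are simplified.

def apgar_score(appearance, pulse, grimace, activity, respiration):
    signs = (appearance, pulse, grimace, activity, respiration)
    assert all(0 <= s <= 2 for s in signs), "each sign is rated 0, 1 or 2"
    return sum(signs)

score = apgar_score(appearance=2, pulse=2, grimace=1, activity=2, respiration=2)
print(score)  # 9: totals of 7 or above are generally considered reassuring
```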

The Planning Fallacy is simply our mistaken belief that we can make accurate forecasts about our own plans. We plan a project of some sort and try to be as accurate as possible about how long all the steps will take. Invariably the whole thing ends up taking twice as long anyway. This is nicely summed up by Hofstadter’s Law:

It always takes longer than you expect, even when you take into account Hofstadter’s Law.

The way to defeat this vicious circle is to use reference class forecasting, also called the “outside view” or “evidence-based forecasting”. It’s also called “yesterday’s weather” from the idea that you can make a fairly accurate weather forecast simply by predicting that today’s weather will be the same as yesterday’s.

So when forecasting how long a project will take, simply consider the last similar project you did, and that’s how long your new project will take. “But,” I hear you say, “last time that big unexpected disaster happened right in the middle of the project and that’s why it took so long!” And you probably think such a disaster won’t happen this time. Just like you thought last time, which is why you underestimated how long it would take.
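
In code terms, taking the outside view is less a formula than a change of data source: forecast from how long similar past projects actually took, not from your bottom-up plan. A sketch, with made-up numbers:

```python
# Outside view: forecast from the distribution of actual durations of
# similar past projects. Durations below (in weeks) are made up.
import statistics

past_project_weeks = [10, 14, 9, 22, 13, 30, 12, 16]

median = statistics.median(past_project_weeks)
p80 = sorted(past_project_weeks)[int(0.8 * len(past_project_weeks))]

print(f"Typical (median) duration: {median} weeks")
print(f"Budget for a bad case (80th percentile): {p80} weeks")
```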

Kahneman talks a bit about regret, and how to avoid it. He pretty much goes with the strategy I have used for some years. So, often I find myself about to decide on some course of action based on the assumption that some low-probability event won’t happen. For example, should I insure my mobile phone against loss? Should I buy the expensive carseat for my child even though seatbelts are probably good enough? In each case, I imagine the unlikely event occurring, and ask myself how much I will regret making my choice. If I don’t insure my phone and lose it, I will be annoyed, but I will recognise that I took a gamble and hey, life’s too short to worry about insuring your phone. But if my child is injured in a car accident I will not forgive myself for not making a decent effort to keep him safe.

So as Kahneman suggests, I imagine feeling the regret and see if it would make me want to change my decision. Of course, this is not infallible, but it has worked well for me so far in many minor situations.

(On carseats for children, see also Steven Levitt’s TED talk and have fun spotting the flaws in his argument.)

It has long annoyed me that car mileage is often quoted in miles per gallon. One reason this irritates me is that even in the 21st century, so many people still insist on measuring in miles and gallons rather than kilometres and litres. But the other reason is that it’s the wrong way around. People who measure efficiency care about the cost of the fuel and the amount of fuel burned: in other words, they care about the fuel rather than the distance, and the less of it used, the better. The purpose of driving a car is to travel over distances, and the way to measure efficiency is to measure how much fuel is required to drive those distances: in other words, litres per kilometre (or gallons per mile, if you must).

Kahneman points this out too, and gives some good examples of how misleading it is to think in terms of MPG, and how GPM is a far more intuitive concept. I don’t know if we’ll ever make the switch though.
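
Here's the arithmetic that makes the point, with illustrative figures along the lines of the ones Kahneman uses (I may be misremembering the exact numbers): over the same distance, going from 12 to 14 MPG saves more fuel than going from 30 to 40 MPG, which is the opposite of what the MPG framing suggests.

```python
# Fuel used over a fixed distance for different MPG ratings.
# The MPG figures are illustrative, not quoted from the book.

def gallons_used(miles, mpg):
    return miles / mpg

distance = 10_000  # miles per year, say

saving_low = gallons_used(distance, 12) - gallons_used(distance, 14)
saving_high = gallons_used(distance, 30) - gallons_used(distance, 40)

print(f"12 -> 14 MPG saves {saving_low:.0f} gallons")   # ~119 gallons
print(f"30 -> 40 MPG saves {saving_high:.0f} gallons")  # ~83 gallons
```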

So after reading this book, I feel I understand a lot more about how my mind works and how it should work. Even so, I still empathise with my five-year-old son’s precociously wise words:
