Takeaways from 'Thinking, Fast and Slow'

Thinking, Fast and Slow (TFAS, for short) by Daniel Kahneman is one of the most quoted and well-regarded texts in the recent explosion of books on cognitive biases, and for good reason: Kahneman, along with his research partner Amos Tversky, performed much of the original research that backs most of those books (work for which Kahneman received the Nobel Prize in 2002; Tversky died before it was awarded). In TFAS, Kahneman takes the reader on a tour of the research into how people think and how we think we think, demonstrating along the way that the two are less related than we realize.

Kahneman sets the stage for the entire discussion by introducing two modes of thought: System 1 and System 2. System 1 is the reflexive, heuristics-based mind, which attempts to solve problems as quickly and effortlessly as possible by generating interpretations of our everyday experiences and picking the one that seems most plausible. System 2, by contrast, is slower and more methodical, incorporating more information and applying more sophisticated techniques.

For the most part this two-system arrangement works well: System 1 uses mostly-good-enough heuristics to solve problems quickly and efficiently, while System 2 is called in for harder problems that justify the higher effort. Sometimes, however, the downsides of the effort-sharing arrangement appear in the form of systematic biases in our thinking. For example, we are prone to overestimate the probability of rare events if examples of them come easily to mind (the availability heuristic) - the probability of being a victim of a terrorist attack is one example[1]. We are also prone to avoid risk when we stand to gain and to seek risk when we stand to lose; we often fail to look for additional information when what we have forms a coherent story (Kahneman calls this “WYSIATI” - What You See Is All There Is); and we remember our experiences in terms of what was most salient about them, discarding the rest (the peak-end rule[2]).
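To make the gain/loss asymmetry concrete, here's a small sketch of the value function from Kahneman and Tversky's prospect theory. This is my own illustration rather than an example from the book: the parameters are the median estimates from Tversky and Kahneman's 1992 paper, the dollar amounts are made up, and probability weighting is left out for simplicity.

```python
# The prospect-theory value function: concave for gains, convex and
# steeper for losses. ALPHA and LAMBDA are the median estimates from
# Tversky & Kahneman (1992); probability weighting is omitted here.

ALPHA = 0.88   # curvature: diminishing sensitivity to larger amounts
LAMBDA = 2.25  # loss aversion: losses loom ~2.25x larger than gains

def value(x: float) -> float:
    """Subjective value of an outcome x (gain if positive, loss if negative)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# Gains: a sure $500 vs. a 50% chance at $1000 (same expected value).
print(value(500), 0.5 * value(1000))    # ~237 vs ~218
# The sure thing feels better -> risk aversion when we stand to gain.

# Losses: a sure -$500 vs. a 50% chance of losing $1000.
print(value(-500), 0.5 * value(-1000))  # ~-534 vs ~-491
# The gamble feels less bad -> risk seeking when we stand to lose.
```

The curvature alone produces the flip: larger gains are compressed, so a sure gain beats a fair gamble on a bigger one, while larger losses are compressed too, making the gamble feel like the lesser evil.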

There’s a lot of great material in the book, and it’s well worth your time to read through and consider carefully, especially if you’re in a position where recognizing bias can have a large impact. Unfortunately for us, as Kahneman points out in the last chapter, it’s very difficult to realize when we’re engaged in a biased mode of thinking. About the best we can hope for is recognizing when we’re at particularly high risk of committing certain errors and attempting to mitigate them. Thinking, Fast and Slow isn’t a book you’ll pick up and burn through in a day, but I think the insights it gives are worth the effort.


  1. Terrorism depends heavily on this to be effective.  ↩

  2. The peak-end rule is a feature of memory where the “goodness” or “badness” of a remembered experience is determined by its most intense moment (the peak) and its final moment (the end), with the rest largely ignored. For example, if you’re listening to a song you enjoy and having a good time but it ends with a loud screech, you’ll be likely to consider the event ruined in retrospect even though the majority of it was pleasant.  ↩
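The rule is often summarized as: remembered value ≈ the average of the peak and the end. Here's a toy sketch of that summary (my own illustration; the `remembered` function and the numbers are invented for the example):

```python
# Toy illustration of the peak-end rule: recall tracks roughly the
# average of the most intense moment and the final moment, not the
# average of the whole experience.

def remembered(moments: list[float]) -> float:
    """Peak-end approximation of how an experience is recalled.

    Positive numbers are pleasant moments, negative ones unpleasant.
    """
    peak = max(moments, key=abs)  # most intense moment, good or bad
    end = moments[-1]             # how it finished
    return (peak + end) / 2

# A song that's pleasant throughout but ends with a loud screech.
song = [8, 8, 9, 8, 8, -9]
print(sum(song) / len(song))  # lived experience, moment by moment: ~5.3
print(remembered(song))       # peak-end recall: 0.0 (a wash, despite being mostly pleasant)
```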
