These Cognitive Biases are Undermining Your Decisions

One of the most formative books I ever read on the fallibility of human decision-making was Thinking, Fast and Slow by Nobel laureate Daniel Kahneman. Kahneman’s book is chock-full of interesting case studies, as well as examples that exploit a bias and invite you to form conclusions yourself, showcasing the sometimes hilarious, sometimes disturbing mistakes you (and most others) make in your reasoning.


The two systems model of cognition

To lay a little groundwork, we should first identify what cognition, and cognitive biases, actually are. As a quick and dirty definition, cognition is our ability to process information in order to form memories and make decisions. A cognitive bias is some factor that prejudices our cognition toward a certain outcome or conclusion, usually as a result of what Kahneman calls our ‘System 1’ thinking. System 1 and System 2, in Kahneman’s framework, model the ‘fast’ and ‘slow’ parts of thought. System 1 can generally be described as our instinctive, emotional gut reactions, and System 2 as our use of reason and careful thought to come to a rational conclusion. System 1 is our default in most circumstances: it is fast, often unconscious, and right most of the time. In fact, for skilled individuals the entire goal is to relegate as many tasks as possible to a well-trained System 1, since using System 2 is slow and energy intensive. That said, Kahneman illustrates that System 1 has automatic quirks that lead to incorrect conclusions, sometimes dramatically so, and this can result in unfortunate errors if we do not train our System 2 to identify the problem and come to the rescue.

Substitution & Intensity Matching

The bias of substitution lies at the root of many of the errors we make in our day-to-day decision-making, and it feeds into many of the other biases Kahneman discusses in his book. Substitution is exactly what it sounds like: when faced with a complex problem, we have two choices.

  1. Engage System 2, wrestle with the complexity, and carefully come to a conclusion on a chaotic issue which may or may not even have a correct answer.
  2. Substitute an easy problem for the complex one, answer the easy problem, and proceed as if you had answered the complex one.

Laughable as it sounds, most people will choose option 2 most of the time, without even realizing they have done it. For example, if you are asked how much you would donate to a charity that saves dolphins, the question in front of you is to assign a dollar value to that cause. We might consider how much money would actually be of value in that circumstance, how it would be allocated, et cetera. Or we can ask “How much do I like dolphins?”, rate that on a scale of 1 to 10, and match that intensity to a dollar value of “How much would I be willing to spend?” on the same scale. Bam. Done.
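To make the mechanics concrete, here is a toy sketch of that substitution. The 1-to-10 liking scale, the $0–$100 donation range, and the linear mapping are my own illustrative assumptions, not anything from the book:

```python
# A toy sketch of intensity matching: answer the easy question
# ("how much do I like dolphins?") and project it onto the hard
# question's scale ("how many dollars?"). The scales are arbitrary.

def intensity_match(liking: float, max_donation: float = 100.0) -> float:
    """Map a 1-10 liking score linearly onto a 0-to-max_donation dollar scale."""
    liking = max(1.0, min(10.0, liking))        # clamp to the rating scale
    return (liking - 1.0) / 9.0 * max_donation  # linear scale-to-scale projection

# The hard question (what donation is actually useful?) never gets asked:
print(f"${intensity_match(8):.2f}")  # a liking of 8 becomes $77.78
```

Note that all the work happens on the easy question; the actual impact of the donation never enters the computation.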

Sometimes this quick and dirty substitution is good enough; it gets us by in a complex world without leaving us paralyzed by indecision. But substitution is at the heart of many of our prejudices: we may draw far-reaching conclusions about how a large group of people behaves by substituting that question with ‘How pleasant or unpleasant was my latest interaction with a member of that group?’ Even with questions about life satisfaction, we are prone to substitute our current mood for an appraisal of our long-term quality of life! Take this startling example: “Norbert Schwarz and his colleagues invited subjects to the lab to complete a questionnaire on life satisfaction. Before they began that task, however, he asked them to photocopy a sheet of paper for him. Half the respondents found a dime on the copying machine, planted there by the experimenter. The minor lucky incident caused a marked improvement in subjects’ reported satisfaction with their life as a whole!”

Anchoring & Framing

Consider these examples:

  • The survival rate of this surgery is 99%
  • 1% of patients die as a result of this surgery
  • 1 in 100 patients die as a result of this surgery

These are all different ways to say the same thing, but each may have evoked a different emotional reaction. The framing of an issue can lead to significant differences in our conclusions about the same data, and this is true of experts as well as amateurs. Kahneman offers this example:

“Two alternative programs to combat a disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If program A is adopted, 200 people will be saved. If program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved. A substantial majority of respondents choose program A: they prefer the certain option over the gamble. The outcomes of the programs are framed differently in a second version: If program A’ is adopted, 400 people will die. If program B’ is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die. Look closely and compare the two versions: the consequences of programs A and A’ are identical; so are the consequences of programs B and B’. In the second frame, however, a large majority of people choose the gamble.”
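A quick expected-value check (my own arithmetic, not part of the quoted passage) makes the equivalence explicit:

\[
E[\text{saved} \mid A] = 200,
\qquad
E[\text{saved} \mid B] = \tfrac{1}{3}\cdot 600 + \tfrac{2}{3}\cdot 0 = 200
\]

\[
E[\text{dead} \mid A'] = 400,
\qquad
E[\text{dead} \mid B'] = \tfrac{1}{3}\cdot 0 + \tfrac{2}{3}\cdot 600 = 400
\]

With 600 lives at stake, 200 expected saved is exactly 400 expected dead: all four programs have identical expected outcomes, and only the frame (lives saved versus lives lost) changes.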

The Availability Heuristic & Negativity Bias

System 1 cares more about the coherence of a story than about its actual truth or falsehood. The ease with which we retrieve examples matters more to our System 1’s view of the world than the veracity of those examples, or the actual statistics of the issue. This ‘availability heuristic’ results in our weighing things we hear more about, such as the violent crime reported in the news, as more frequent. We have seen in the examples above that our ability to do statistical analysis on the spot is deeply flawed and subject to the emotional frame the numbers arrive in. The double whammy of this heuristic is that we are also primed to notice bad or dangerous things more readily than good ones, resulting in a bias toward negativity. An angry face will stand out to us in a happy crowd, but a happy face will not stand out nearly as starkly in an angry one. We are naturally disposed to be wary of danger, and we weigh it more heavily in our calculations.