Thinking, Fast and Slow, by Daniel Kahneman – Summary and Analysis


Kahneman challenges us to question the biases we carry with us, which colour even what we consider to be pure intuition. He notes that all human beings make certain unconscious assumptions without first carefully examining the truth behind those assumptions. He shows us how to overcome this way of thinking—the trick is to be aware that certain heuristics we carry with us in our brains can lead us astray.

Chapter 1: The Brain’s Two Systems

Summary:

There are two systems in our brains. One of those systems operates very quickly, the other more slowly. The first system functions automatically, intuitively, and involuntarily while the second employs more deliberation, reasoning, and concentration. The two systems can work symbiotically but quite often conflict with each other.

System 1 may employ heuristics that are inaccurate or too hastily formed, while System 2 requires some effort to function correctly and is prone to error and miscalculation. System 1 is the system responsible for our intuitions, feelings, and impressions, which System 2 then turns into beliefs and calculated actions. System 2 is slow to react and System 1 is quick to react. The operations of System 2 require focus and are disrupted when attention is diverted away from those operations. System 2 can alter the way System 1 functions, to some degree. In fact, one of the key purposes of System 2 is to overcome System 1’s impulses. This is where self-control comes in.

System 2, unlike System 1, can follow rules and make intelligent and deliberate choices. System 2 can also create rules based on impressions gleaned from System 1. Crucially, System 2 can program our memories so that we behave in a way that overrides habitual, instinctive, System 1 responses. System 2 is the system responsible for self-control and is the main system in charge when things get difficult. Ultimately, System 2 has the last word over System 1.

Multi-tasking that requires System 2 to simultaneously dedicate itself to more than one task is often overwhelming and unproductive. However, multi-tasking that makes use of System 1 and System 2 concurrently is more manageable and effective.

Analysis:

  • There are two systems in the brain, known as System 1 and System 2.
  • System 1 functions automatically, intuitively, and involuntarily and is responsible for intuition, impressions, and feelings.
  • System 2 requires more deliberation, reasoning, and concentration and is responsible for decision-making, self-control, and deliberation.
  • System 2 creates rules based on impressions from System 1.
  • System 1 works quickly and System 2 works slowly.
  • If you’re going to multi-task, don’t do multiple things that require the use of System 2.

Chapter 2: The Busy System, and the Lazy System

Summary:

System 1 is always on, always busy, always searching for answers. System 1 controls everything we do that is automatic, unconscious, and instinctive. System 1 feeds us information whether we seek that information or not. It creates prejudices, tells us when we like things or dislike things, feeds our impressions, and causes our involuntary behaviours.

System 2 performs tasks of which System 1 simply is not capable. Therefore, the two systems must work together to create a well-functioning mind. System 2 is slow and lazy. The law of effort, which applies to both our physical exertion and our mental exertion, dictates that System 2 seeks to do its jobs with as little effort as possible. Thus, laziness is built into our very nature. This is why our minds employ heuristics to get the job done.

System 2 requires more energy, but many people experience something known as “flow” while in a state of deep System 2 concentration. This occurs when a person becomes so involved in a task requiring concentration that she loses her sense of time, of herself, and of any problems she might be facing. This is a sought-after experience that brings about joy and peace.

Analysis:

  • System 1 feeds us information whether we seek that information or not, thus creating prejudices, emotions, inclinations, and involuntary behaviours.
  • The law of effort means that System 2 seeks to do its jobs with as little effort as possible.
  • “Flow” occurs when a person becomes so involved in a task requiring concentration that she loses her sense of time, of herself, and of any problems she might be facing.

Chapter 3: Associative Activation

Summary:

Science is beginning to recognize that people think not only with their brains, but also with their bodies. Therefore, the body, just like the brain, experiences an emotional and physical response whenever we encounter something that we consider to be ambiguous or contradictory.

The term “associative activation” means that everything we experience reinforces something we have already experienced, some existing heuristic in our brain. Our attention sees only what it wants to see and expects to see and, in much the same way, our associative memory seeks out reinforcement of our existing patterns of association. Not only that, it deliberately ignores any information or experience that would appear to contradict the heuristics and associations we’ve already formed. This, Kahneman declares, is both a victory and a disaster in our intuitive mind.

The fact that our brains insist on a stable representation of reality can make life much simpler. But associative activation also has its costs. It requires us to adopt a generalized interpretation at the expense of an interpretation that is perhaps more accurate or more beneficial. It causes us to suppress ambiguity, even though ambiguities in life can sometimes be enriching. It causes us to discard notions and impressions that don’t fit in with our existing schema, so that we see the world as a much more rational place than it actually is.

In other words, associative activation causes us to be so uncomfortable with ambiguity that we cling to safe, comfortable interpretations, even if those interpretations are partially or completely disconnected from reality.

For example, let’s say that you had a bad experience with a friend’s family dog when you were a child. The dog knocked you over or bit you, causing you injury. You formed a schematic in your brain that dogs, especially big dogs, are dangerous. This schematic, or heuristic, gave you comfort because it gave you direction in how to deal with dogs (avoid at all costs). When, later in life, you begin dating a person who has a beloved Labrador and the dog jumps on you the first time you meet it, your brain tells you, consistent with your association, that the dog is dangerous and may injure you. In reality, the dog is probably simply playful and happy to meet you. It’s simpler for you, though, to cling to your existing associations regarding dogs than to take this new information (a dog jumping on you and wagging its tail) and evaluate it under a brand-new schema.

Analysis:

  • “Associative activation” means that everything we experience reinforces some existing heuristic in our brain.
  • Our associative memory seeks out reinforcement of our existing patterns of association and deliberately ignores any information or experience that would appear to contradict the heuristics and associations we’ve already formed.
  • Associative activation has value, but also requires us to adopt a generalized interpretation at the expense of an interpretation that is perhaps more accurate or more beneficial.

Chapter 4: Cognitive Ease, and Cognitive Strain

Summary:

The systems in our brains are constantly working and processing the information we take in through our senses. System 1 is automatically assessing the information, forming impressions, and determining whether any extra effort is required from System 2. It’s a never-ending computation process—what’s going on? What threats are present? Are things going well? Is my attention appropriately focused? How much effort does this task require? Where is my effort best directed?

When our brains register that things are going pretty well—in other words, we are under no present threat and there is no new and vital information to be processed—we experience cognitive ease. However, if our brains identify a problem we’re facing, an unmet demand to deal with, or important new information to incorporate, we experience cognitive strain. Cognitive ease and cognitive strain, put most simply, have to do with the level of effort required of the brain systems.

When we are in a state of cognitive ease, we are probably in a pretty good mood. We like what we see, we are able to rest in our assumptions and intuition, and we feel comfortable with our situation. However, when we feel cognitively strained, we are more vigilant and suspicious, we exert much more effort in our mental and physical processes, and we feel less comfortable. Importantly, though, we also make fewer errors.

We, as humans, generally try to avoid cognitive strain. This makes us especially vulnerable to forming certain biases. We seek cognitive ease, so we process information in a way that is comfortable with our existing biases, thus requiring less effort on our part. However, this laziness can lead to irrational and badly-thought-out decisions.

Analysis:

  • When we are under no present threat and there is no new and vital information to be processed, we experience cognitive ease.
  • When our brains identify a problem we’re facing, an unmet demand to deal with, or important new information to incorporate, we experience cognitive strain.
  • We generally try to avoid cognitive strain, which makes us especially vulnerable to forming certain biases.

Chapter 5: The Systems and Surprises

Summary:

The main function performed by System 1 is an ongoing assessment of what is normal in a person’s world. When something abnormal happens—when something we expect to happen does not happen or when something we do not expect to happen does happen—System 1 takes note of the abnormality and then System 2 kicks into gear automatically to search for a causal connection between the abnormal event and our existing biases. Eventually, if that particular abnormality occurs repeatedly, System 1 begins to classify the event as normal, rather than abnormal.

System 1 is less sophisticated, in some ways, than System 2, meaning that System 1 functions in large part by jumping to conclusions rather than carefully weighing or deliberating. System 1, unlike System 2, comes up with its answer based almost purely on previous experience. This is a time and energy saver, but it can become detrimental when a person is in a new situation, when the situation has high stakes and requires more careful deliberation, or when there’s little time to gather and assess information.

System 1 is good for combing through already available ideas and interpretations to come up with the most likely understanding of a new situation, but is not good for considering ideas it has not already incorporated into itself. Kahneman uses the acronym WYSIATI, which stands for “what you see is all there is.” This is how System 1 operates. It takes what limited evidence is included in its own system of intuitive thinking and jumps to conclusions based only on that evidence.

Analysis:

  • System 1 functions mostly by jumping to conclusions, rather than carefully weighing or deliberating, and comes up with its answer based almost purely on previous experience.
  • System 1 takes what limited evidence is included in its own system of intuitive thinking and jumps to conclusions based only on that evidence, which is not always ideal when it comes to decision-making.

Chapter 6: Heuristics: The Mental Shotgun

Summary:

When the mind encounters a difficult question and no satisfactory answer immediately presents itself, System 1 engages in a process known as “substitution.” The process of substitution involves System 1 searching for a related but simpler question and answering that question instead of the difficult one presented. Kahneman also characterizes this process as the “mental shotgun,” where we retrieve answers based on a heuristic (rule of thumb) without checking in to make sure that the use of that heuristic is accurate and makes sense.

This explains why we as human beings can have such strong biases and still somehow be completely unaware of them, even denying their existence when pointed out by others. Examples of substitutions include optical illusions where our eyes deceive us, stereotypes based on race and ethnicity, and too-quick judgments about what is fair and moral. All of these phenomena are the result of the way System 1 works—it tries hard to find coherence and consistency in what we see, even where there is little. We reach for causal thinking, even amongst totally random occurrences, and this leads to poor intuition and misunderstanding about how the world really works.

Analysis:

  • Substitution is where System 1 tries to answer a hard question by searching for a related but simpler question and answering that question instead.
  • Substitution means that we retrieve answers based on a heuristic (rule of thumb) without checking in to make sure that the use of that heuristic is accurate and makes sense.
  • System 1 tries hard to find coherence and consistency in what we see, even where there is little.

Chapter 7: A Quick Look at Heuristics

Summary:

Heuristics: A heuristic is a mental device—a rule of thumb—that helps us to answer questions and make decisions as quickly as possible. Without being able to use heuristics as mental shortcuts, we’d have to constantly stop our daily activities to make detailed and thorough analyses of our next course of action.

As helpful as heuristics can be, they can also cause errors in our decision making by leading to biases that cause us to go astray. Many situations simply don’t lend themselves to generic rules-of-thumb, particularly when those rules-of-thumb are hastily or haphazardly constructed. Heuristics can cause us to fall into unproductive patterns of behavior and make it difficult for us to be creative in our problem-solving and decision making.

Some of the heuristics profiled by Kahneman:

  • Anchoring:

Anchoring is a heuristic used when someone is making a numerical estimate. Anchoring involves using a number you’re given—known as the “anchor”—and shifting up or down from that number to reach another number that makes sense to you. The anchoring heuristic indicates that people tend not to move too far away from the anchor number. Therefore, when a question is posed and an anchor is given, the anchor contaminates all possible answers.

An example of anchoring—you’ve probably played a game where you were instructed to guess the number of jelly beans (or marbles, etc) in a jar. The person who guessed the number closest to the actual number of items in the jar won. Let’s say that the instructions tell you to guess how much more or fewer than 200 jelly beans are in the jar. Most of the answers would be fairly close numerically to the number 200, whereas if no anchor number had been provided, the answers would have varied more widely and perhaps been more accurate.
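The pull of an anchor can be pictured with a toy model (the numbers and the weighting below are illustrative assumptions, not figures from the book): treat the final guess as a weighted average of the anchor and the guess the person would have made on her own.

```python
def anchored_guess(own_estimate: float, anchor: float, anchor_weight: float = 0.5) -> float:
    """Toy model of anchoring: the final guess drifts toward the anchor.

    anchor_weight is an illustrative assumption (0 = ignore the anchor,
    1 = repeat the anchor verbatim); it is not a value from the book.
    """
    return anchor_weight * anchor + (1 - anchor_weight) * own_estimate

# A guesser who would otherwise say 450 jelly beans, given an anchor of 200:
print(anchored_guess(450, 200))  # 325.0 -- pulled well below the unanchored estimate
```

With no anchor (weight 0) the guess stays at 450; the stronger the anchor's pull, the closer the answer clusters around 200, which is the contamination effect described above.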

  • Availability and Frequency:

The availability heuristic relates to the ease with which a particular idea comes to a person’s mind when that person is presented with a certain set of circumstances. The availability heuristic often involves a person speculating about how likely an event is to occur, or how frequently it occurs, based on how readily examples of that event come to mind. When a rare event comes to mind easily, its likelihood is overestimated, and the heuristic produces an inaccurate perception.

For example, because violent crimes and natural disasters are featured prominently in the news, people overestimate the likelihood that they themselves will face these dangers, because the media coverage of these events brings the events quickly to mind. On the other hand, the likelihood of under-publicized dangers, like illnesses or health conditions, is underestimated because they receive relatively less attention when they do occur.

  • Representativeness and Stereotypes:

The representativeness heuristic involves people using categories to make decisions and judgments. A common example is the use of stereotypes to make assumptions about a person’s characteristics. This heuristic ignores the actual probability that the characteristic occurs in that person. Like the other heuristics, it breaks the laws of probability.

Another example of the representativeness heuristic occurs when people overestimate the causal relationship between an ostensible cause and an ostensible effect. For example, an athlete may have played an exceptionally good game while wearing a certain pair of socks, erroneously attributed his success to those socks, and then worn those socks, unwashed, at every subsequent game. This is, of course, completely irrational.

Analysis:

  • A heuristic is a mental device that helps us to answer questions and make decisions as quickly as possible.
  • The use of heuristics can cause errors in our decision making by leading to biases that cause us to go astray.
  • Anchoring is used when someone is making a numerical estimate. Anchoring involves using a number you’re given and shifting up or down from that number to reach another number that makes sense to you.
  • Availability is the ease with which a particular idea comes to a person’s mind when that person is presented with a certain set of circumstances.
  • The representativeness heuristic involves people using categories to make decisions and judgments. For example: stereotypes.

Chapter 8: Narrative Fallacy and Outcome Bias

Summary:

Have you ever noticed how we tend to blame people for sound decisions that happened to work out badly, yet give them little credit when a decision that turned out well seems obvious in hindsight? This is the narrative fallacy, which is closely tied to outcome bias. The narrative fallacy occurs when flawed stories from the past shape the way we see the world and what we expect from the future. This fallacy comes from our need to make sense of our lives. Thus, though it seems unfair, we judge past decisions by their outcomes rather than by the circumstances in place at the time the decision was made. This is a fallacy because no one ever knows whether a decision, especially a risky one, will work out; many other factors, over which we have no control, play a role in the outcome.

Outcome bias is sometimes mistaken for hindsight bias. The best way to tell the difference is to remember that hindsight bias has to do with a situation where the “correct” choice at the time of the decision seems obvious in hindsight, while it really was not at the time.

Analysis:

  • Narrative fallacy occurs when flawed stories from the past shape the way we see the world and what we expect from the future.
  • Outcome bias means that we judge past decisions by their outcomes rather than by the circumstances in place at the time the decision was made.
  • Hindsight bias has to do with a situation where the “correct” choice at the time of the decision seems obvious in hindsight, while it really was not at the time.

Chapter 9: The Accuracy of Algorithms

Summary:

When it comes to predicting the future, algorithms are almost always more accurate than people, even experts. Humans, even those who are well-educated on a subject, are very inconsistent and often erroneously trust their intuition or “gut.” Moreover, experts tend to be overconfident in their predictions. Data passed through algorithms is far more reliable when it comes to predicting outcomes.

What is an algorithm? Put simply, an algorithm is a formula for calculating a certain outcome. It’s a step-by-step process designed to perform a certain function that will ultimately lead to the desired result if followed correctly (think of it as a map or flowchart). Algorithms have a set beginning, a set end, and a certain number of steps. An algorithm will always produce the same result if given the same input, and short algorithms may be combined to perform complex tasks. An example of a simple algorithm would be a cookbook recipe.
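The recipe analogy can be made concrete with a short sketch (the scoring scheme here is a hypothetical example, not one from the book): an equal-weight averaging algorithm follows the same fixed steps for every input and always returns the same result for the same data, exactly the consistency that human judges lack.

```python
def simple_score(ratings: list[int]) -> float:
    """A minimal algorithm: a set beginning (take the input), fixed steps
    (sum, then divide), and a set end (return the score)."""
    total = 0
    for r in ratings:                # step 1: accumulate the ratings
        total += r
    return total / len(ratings)      # step 2: average them

# Deterministic: identical input always yields the identical output.
print(simple_score([4, 2, 5, 3]))  # 3.5
print(simple_score([4, 2, 5, 3]))  # 3.5 again -- no "gut feeling" variance
```

Because every run of the same input produces the same output, short building blocks like this can be chained together to perform more complex tasks, as the paragraph above notes.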

Analysis:

  • When it comes to predicting the future, algorithms almost always are more accurate than people, even experts.
  • An algorithm is a step-by-step process designed to perform a certain function that will ultimately lead to the desired result if followed correctly.

Chapter 10: Planning Fallacy

Summary:

The planning fallacy occurs when our predictions about how much time we’ll need to complete a future task show an overly optimistic bias: we underestimate the time required, even when we know that similar tasks have taken longer in the past. We also underestimate the costs and risks of those tasks and overestimate the benefits. The result is time overruns, missed deadlines, cost shortfalls, and disappointment in the ultimate benefit.
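One hedged way to picture a correction for this bias (the figures below are made up for illustration, and the adjustment rule is a sketch, not Kahneman’s prescription): scale today’s optimistic estimate by how badly similar past estimates turned out.

```python
def adjusted_estimate(optimistic_days: float,
                      past_estimates: list[float],
                      past_actuals: list[float]) -> float:
    """Scale an optimistic estimate by the historical overrun ratio
    (how long similar tasks actually took versus what was predicted)."""
    overrun = sum(past_actuals) / sum(past_estimates)
    return optimistic_days * overrun

# Past projects were estimated at 10, 20, and 30 days but took 15, 30, and 45:
print(adjusted_estimate(8, [10, 20, 30], [15, 30, 45]))  # 12.0
```

The point is simply that the record of similar past tasks, which the planning fallacy tempts us to ignore, is usually a better guide than the optimistic inside view.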

Analysis:

  • The planning fallacy is where we underestimate the time, costs, and risks of a future task and overestimate the benefits, even when we should know better.

Chapter 11: Risk and Loss Aversion

Summary:

Risk aversion is a phenomenon in which we, as humans, when exposed to uncertainty, behave in ways that attempt to reduce that uncertainty. We are reluctant to make decisions that lead to an uncertain result and are more likely to choose the course of action that leads to a more certain result, even if the certain result is less favourable than the likely result of the more uncertain course of action.

This is why, for example, we’re more likely to put our money in an investment account with a low but guaranteed interest rate rather than into a high-risk stock where the potential for profit is high but so is the potential for loss. We are so averse to the idea of losing our money that we are willing to forego the opportunity to turn it into even more money.
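The investment example can be put in numbers (the rates and probabilities below are illustrative assumptions, not figures from the book): even when the risky option has the higher expected value, risk aversion predicts that many of us take the guaranteed return anyway.

```python
def expected_value(outcomes: list[tuple[float, float]]) -> float:
    """Expected value of a list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

stake = 1000
safe = expected_value([(1.0, stake * 1.02)])    # guaranteed 2% return
risky = expected_value([(0.5, stake * 1.30),    # 50% chance: +30%
                        (0.5, stake * 0.80)])   # 50% chance: -20%

print(safe)   # 1020.0
print(risky)  # 1050.0 -- higher on average, yet the safe option often wins
```

A purely expected-value-maximizing chooser would pick the stock; the gap between that prediction and what people actually do is the phenomenon the chapter describes.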

Analysis:

  • Risk aversion means that, when exposed to uncertainty, we behave in ways to attempt to reduce the uncertainty.
  • We avoid risk even if the certain result is far less favourable than a likely result under the riskier course of action.

Chapter 12: Sunk Cost Fallacy and Fear of Regret

Summary:

A sunk cost is a cost that we have already incurred and we cannot get back. Sunk costs are often contrasted with prospective costs, which are costs that we may or may not incur in the future.

Sunk costs influence our decisions because of our aversion to risk. Until we irreversibly commit resources to a certain course of action, the cost is merely prospective and avoidable. Once the resources are committed, however, the price we paid, which should be irrelevant to future decisions, becomes a yardstick for the value.
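A sketch of sunk-cost-free decision-making, with made-up numbers: only future benefits and future costs enter the comparison, so money already spent cannot sway the choice.

```python
def best_option(options: dict[str, tuple[float, float]]) -> str:
    """Pick the option with the best net future value (benefit - cost).
    Sunk costs never appear in the inputs, so they cannot sway the choice."""
    return max(options, key=lambda name: options[name][0] - options[name][1])

# Suppose we already spent 5000 on Project A. That amount is sunk, so it is
# deliberately omitted; only what lies ahead is compared:
options = {
    "finish Project A": (6000, 4000),    # (future benefit, future cost): net 2000
    "switch to Project B": (7000, 3000), # net 4000
}
print(best_option(options))  # switch to Project B
```

The sunk cost fallacy is precisely the failure to reason this way: the 5000 already spent gets smuggled back in as a reason to finish Project A.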

Analysis:

  • A sunk cost is a cost that we have already incurred.
  • Sunk costs influence our decisions because of our aversion to risk.
  • Once the resources are committed, the price we paid becomes a yardstick for the value.

Chapter 13: The Experiencing-Self and Remembering-Self

Summary:

There are two ways to measure feelings and circumstances: the experiencing-self, and the remembering-self.

The remembering-self retrospectively rates an experience by its highs or lows and by its outcome, with little attention paid to the duration of the experience.

The experiencing-self, on the other hand, operates quickly, intuitively, and unconsciously in the present moment. The experiencing-self lives life and focuses on the quality of experience as it happens, rather than thinking about it after the fact. Each moment of the experiencing-self lasts about three seconds, then vanishes. The significant moments of the experience are later retrieved by the remembering-self and often altered and coloured by the brain.

Analysis:

  • The remembering-self retrospectively rates an experience by its highs or lows and by its outcome, with little attention paid to its duration.
  • The experiencing-self operates quickly, intuitively, and unconsciously in the present moment.
  • The significant moments of the experience are later retrieved by the remembering-self and often altered and coloured by the brain.

What It All Means

We know now that we can’t always trust our thoughts and intuition, thanks to faulty biases, heuristics, fallacies, and aversions. So what does this all mean? What actionable steps should we take with this new knowledge in mind?

Here are some things to consider:

  • Be careful to approach problems and decisions slowly and systematically, without jumping to conclusions and settling on what appear to be simple answers.
  • Be cognizant of our astonishing susceptibility to influence by biases, stereotypes, prejudices, aversions, and fallacies.
  • Mentally search for internal prejudices and biases and try to refute them with actual evidence.
  • Be disciplined in thoughts, without resorting to lazy assumptions and decision-making.
  • Remember that we do not understand the world as well as we think we do.
  • Whenever possible and especially when it matters, think slowly.

Kahneman’s main premise in Thinking, Fast and Slow is that we should remain very doubtful of our judgment and intuition and take our time in evaluating the thoughts that come into our minds. Our minds are hard-wired for error, prone to exaggeration, and eager to ignore our own ignorance.

The book contains the results of decades of research, and it shows in the detailed case studies and analysis of the evidence. Moreover, Kahneman is clear that understanding fast and slow thinking could help us more effectively deal with the problems we face as a society: issues like racism, poverty, violence, and inequality.

© Copyright William Skea Climbing Photography