Thinking, Fast and Slow

Introduction

Nobel laureate Daniel Kahneman's Thinking, Fast and Slow is a landmark book in behavioural psychology.

  • Daniel Kahneman aims to equip readers with the tools to diagnose their own biases and flawed decision-making, often driven by intuitions, impressions, and feelings, rather than by a careful consideration of equally relevant statistical data.
  • We can be overly confident even when we are wrong. An objective observer is far more likely to detect our errors than we are ourselves.
  • He uses real-world examples to illustrate concepts including the two thinking systems, the halo effect, heuristics and biases, the anchoring effect, prospect theory, and loss aversion, making the content relatable and enlightening.

Thinking, Fast and Slow



Two Systems

Social scientists in the 1970s broadly accepted two ideas about human nature.

  • First, people are generally rational and their thinking is normally sound.
  • Second, emotions such as fear, affection and hatred explain most of the occasions on which people depart from rationality.

However, these traditional views contrast sharply with the model proposed by Daniel Kahneman. He suggests that our minds house two distinct systems that shape how we think and make decisions:

  • System 1 is fast, intuitive and emotional.
    • It operates automatically and quickly, with little or no effort and no sense of voluntary control.
    • Examples
      • Detect that one object is more distant than others
      • Detect hostility in a voice
      • Answer 2 + 2 = ?
      • Drive a car on an empty road
      • Understand simple sentences
      • Recognize a familiar face
    • Encountering an idea immediately triggers associative memory, which in turn evokes emotions, facial expressions, and other reactions. This also triggers the priming effect, where related concepts become more readily available. For example, seeing the word "EAT" might make you more likely to complete the word fragment "SO_P" as "SOUP" rather than "SOAP."
  • System 2 is slower, more deliberate and more logical.
    • It allocates attention to the effortful mental activities that demand it, including complex computations. Because these activities compete for limited attention, it is difficult or impossible to carry out several of them at once.
    • Nonetheless, as you become skilled in a task, its demand for energy diminishes.
    • Examples
      • Focus on the voice of a particular person in a crowded and noisy room
      • Maintain a faster walking speed than is natural for you
      • Count the occurrences of the letter a in a page of text
      • Compare two washing machines for overall value
      • Check the validity of a complex logical argument.
    • System 2 can allocate attention selectively, but when it is intensely focused on a demanding task, people can become effectively blind even to stimuli that would normally grab their attention.
We tend to identify with System 2, the conscious self that reasons, makes choices, and controls our thoughts and actions.

  • However, in reality, the intuitive System 1 is more influential than your experience tells you, dictating many of the choices and judgements you make.
  • System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. In simpler terms, System 1 proposes a story, and System 2 decides whether to believe it.
  • In turn, when System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment.
  • However, System 1's influence goes beyond simply proposing ideas. It can subtly frame the way we approach problems and nudge System 2 towards certain conclusions, even in seemingly deliberate decisions.
  • System 2 also handles self-monitoring, keeping you polite when angry or alert while driving at night.

System 1 and System 2 work together efficiently, minimizing effort and optimizing performance. This division of labour aligns with the law of least effort.

  • However, System 1 is prone to biases in certain situations and jumps to conclusions quickly, without considering logic or statistics. This can lead to systematic errors.
  • For example, in the famous Müller-Lyer illusion, we naturally believe that the bottom line is obviously longer than the one above it, although the two are in fact identical in length. To resist the illusion, System 2 must consciously instruct you to distrust your impression of the lines' lengths whenever fins are attached to them.

Müller-Lyer Illusion
  • Building on the concept of intentional causality, System 1 readily infers intentions and motivations from others' observed behaviours, which facilitates social interaction; but it bypasses statistical reasoning (essentially ignoring randomness) and readily accepts causal links even when they are demonstrably false.
  • The halo effect is another System 1 bias, where a positive initial impression of someone (e.g., friendly and well-dressed) leads us to unconsciously assume they also have other positive qualities (e.g., intelligent, trustworthy) without any evidence to support those assumptions.
  • The WYSIATI principle ("what you see is all there is") implies that System 1 excels at constructing the best possible coherent story from currently active ideas. It does not seek out the information it lacks, even when the available evidence is scanty or unreliable.
  • Because System 1 operates automatically and cannot be turned off at will, preventing errors of intuitive thought is often difficult.
  • Additionally, System 2 is slow and inefficient, making constant questioning of our own thinking impossibly tedious. It is also characterized by laziness, a reluctance to invest more effort than is strictly necessary.
  • Therefore, the best approach is not to be overconfident with our intuition, but to learn to recognize situations prone to errors and exert greater cognitive effort to avoid them.

Furthermore, when System 2 is preoccupied with self-control or cognitive effort, System 1 exerts greater influence on behaviour.

  • This can manifest in various ways. People who are cognitively busy are more likely to make selfish choices, use sexist language, and make superficial judgments in social situations.
  • Similarly, external factors such as alcohol, lack of sleep, and anxious thoughts can all have the same effect, temporarily tipping the balance in favour of System 1.
Psychologist Roy Baumeister and his colleagues, using experiments with successive rather than simultaneous tasks, demonstrated that all the attention and effort exerted by System 2 draws, at least in part, on a shared pool of mental energy.
  • Hence, if you have had to force yourself to do something (exerting self-control), you will be less willing or less able to exert self-control when the next challenge comes around. This phenomenon has been named ego depletion.

Fortunately, cognitive work is not always aversive. People can sometimes expend considerable effort for long periods without feeling the strain.

  • Psychologist Mihaly Csikszentmihalyi proposed the concept of flow (optimal experience), where a person experiences a state of complete absorption (effortless concentration) in an activity, allowing them to devote all their resources to the task at hand.

Surprisingly, your state of cognitive ease or strain dictates which thinking system, 1 or 2, takes the lead.

  • When you are in a state of cognitive ease, a good mood often accompanies it. In this state, you tend to favour what you see, readily believe what you hear, trust your intuitions, and find the current situation comfortably familiar. However, this ease can also lead to relatively casual and superficial thinking.
  • Conversely, when you feel strained, you become more vigilant and suspicious. You invest more effort in your tasks, feel less comfortable, and are less prone to errors. However, this heightened state also comes at the cost of diminished intuition and creativity.
  • Furthermore, the cognitive ease associated with familiar information makes us assume it is true (the mere exposure effect), while novel or far-fetched claims tend to be met with strain and rejected. This is also why frequent repetition can make people accept falsehoods as truth.
  • Also, to craft a truly persuasive message, keep it clear and easy to understand. This leverages the power of cognitive ease.



Heuristics and Biases

Intuitively, we form feelings and opinions about almost everything we encounter, often based on limited information. For example, we might like or dislike a person long before we know much about them.
  • A simple explanation lies in System 1 thinking, which often employs heuristics: mental shortcuts that help us navigate the world quickly and efficiently. However, these shortcuts can lead to biases, because judgements are made without analysing every detail.
  • In essence, System 1 substitutes a simpler, more readily answerable question for the complex one, using intensity matching (e.g., "How happy are you with your life these days?" becomes "What is my mood right now?"). It also readily jumps to conclusions from limited data, overlooking the bigger picture and statistical analysis. A reasonable margin of error is therefore to be expected in such judgements.
The law of small numbers describes our tendency to trust results from small samples as much as those from large ones; sample sizes should be determined by statistical methods, not intuition, to ensure generalizability.
  • Statistically speaking, smaller samples are more likely to produce extreme results that are not representative of the broader population (see the simulation sketch after this list).
  • In other words, randomness can sometimes appear as a regularity or tendency to cluster, even when there's no underlying cause.
  • Intuitively, we might think smaller schools with more personal attention provide a superior education. However, research suggests the relationship is not that simple. While smaller schools can offer advantages, some of the lowest-performing schools are also small. Conversely, larger schools may have more resources and produce a higher number of high-achieving students.
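
A minimal simulation sketch of this point, with made-up numbers (a fair coin and arbitrary sample sizes, chosen purely for illustration): small samples routinely produce lopsided results that large samples almost never do.

```python
import random

def extreme_share(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples in which heads make up at least `threshold`
    of the flips, even though the coin is perfectly fair (p = 0.5)."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            extreme += 1
    return extreme / trials

# Small samples show "70% heads or more" fairly often; large samples almost never do.
for n in (10, 50, 500):
    print(n, extreme_share(n))
```

With 10 flips, roughly one sample in six is that extreme; with 500 flips, essentially none are. The extreme results come from sample size alone, not from any property of the coin.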

Furthermore, System 1 can be swayed by irrelevant information. The anchoring effect demonstrates how initial numbers presented can influence our judgements.

  • When primed with a certain number, System 1 readily accepts it as a reference point. System 2 then uses this anchor as a starting point for its judgments, potentially adjusting the value up or down but often remaining within a range influenced by the initial anchor.
  • For instance, the same house will appear more valuable if its listing price is high than if it is low.

Another common bias is the availability heuristic. This mental shortcut occurs when people judge the probability of events based on how easily they can think of examples (ease of memory search).

  • Frequently mentioned topics populate the mind more readily, making them seem more likely to happen.
  • Imagine you are assessing your assertiveness. If you are asked to list specific assertive behaviours, the ease with which examples come to mind (fluency) can distort your judgment. Recalling six assertive actions might lead you to believe you are highly assertive, while struggling to come up with twelve might lead to an underestimation.
  • Someone who heard about two plane crashes last month may now prefer taking the train, even though the risk of flying has not actually changed.
  • After each significant earthquake, locals rush to purchase insurance and adopt protective and mitigation measures.
  • Strokes cause almost twice as many deaths as all accidents combined, but most respondents judged accidental death to be more likely due to media coverage.

The affect heuristic describes a mental shortcut where people make judgements and decisions based on their emotions.

  • They ask themselves questions like "Do I like it? Do I hate it? How strongly do I feel about it?".
An availability cascade is a self-sustaining cycle in which public fear encourages extensive media coverage, which in turn fuels more emotional reactions and prompts even more coverage.

  • This cycle can lead to an exaggerated sense of danger as media outlets compete for attention-grabbing headlines.
  • Even scientific evidence may struggle to quell the rising fear.
  • This illustrates that our minds either ignore small risks altogether or give them far too much weight, with nothing in between.

The representativeness heuristic leads us to judge probability by how well something fits a stereotype, often while neglecting base rates (prior probabilities).

  • For example, someone might assume a person with tattoos would not succeed academically, or that young men are inherently more aggressive drivers than elderly women.
  • However, while our intuition might tell us that judging probability based on stereotypes is better than random guessing, the representativeness heuristic can often mislead us.
  • Consequently, a description that aligns with our stereotype should not be seen as strong evidence. It merely indicates a greater likelihood, not certainty.

The conjunction fallacy is a cognitive bias in which people judge the likelihood of two events happening together (a conjunction) to be higher than the likelihood of just one of those events happening alone.

  • This is illogical because for the conjunction to occur, the single event must also occur.
  • However, System 1 thinking focuses heavily on the plausibility and coherence of a specific scenario, rather than probability.
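
In probability terms, the step that intuition skips can be written as a one-line inequality (standard notation, not taken from the book's text): P(A and B) = P(A) × P(B given A) ≤ P(A), since P(B given A) can be at most 1. A detailed, plausible-sounding scenario can therefore never be more probable than the broader event it is part of.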

Causal stereotypes are mental shortcuts that trick us into believing there is a cause-and-effect relationship between two things, even when that connection is not necessarily true or statistically supported.

  • Causal stereotypes perfectly suit System 1 thinking, which seeks quick explanations and readily accepts causal links that seem plausible.

We often observe that praising someone's performance can lead to a decrease in their performance the next time, while scolding them might be followed by improvement. This might lead us to conclude that punishment is more effective than reward.

  • However, this fails to consider the role of luck in performance.
  • Over time, performance tends to regress to the mean, meaning that exceptional results (positive or negative) are often followed by a return to a more typical level.
  • System 1 thinking, with its focus on causal interpretations, can contribute to misinterpreting these observations.
  • It is important to remember that whenever the correlation between two factors is imperfect (less than 1), regression to the mean will occur; the simulation sketch below illustrates this with a simple skill-plus-luck model.
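
A minimal simulation sketch of that point, under the assumption (purely illustrative, not from the book) that each performance equals a fixed skill plus random luck. The top performers in round one do noticeably worse, on average, in round two, with no praise or punishment involved.

```python
import random

random.seed(42)

N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]    # stable ability
round1 = [s + random.gauss(0, 1) for s in skill]  # performance = skill + luck
round2 = [s + random.gauss(0, 1) for s in skill]  # same skill, fresh luck

# Take the top 10% of round-1 performers and compare their average scores.
top = sorted(range(N), key=lambda i: round1[i], reverse=True)[: N // 10]
avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)
print(f"Round 1 average of the top group: {avg1:.2f}")
print(f"Round 2 average of the same people: {avg2:.2f} (closer to the overall mean of 0)")
```

Because the luck component does not repeat, the second-round average of the top group falls roughly halfway back toward the mean, exactly the pattern that invites a false causal story about praise and punishment.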

Intuitive judgements can also be made with high confidence despite the fact that they are based on weak evidence.

  • To illustrate, we might confidently predict a student's university GPA from how fluently they read at age four, simply by intensity-matching the two measures without allowing for regression to the mean (a hallmark of System 1 thinking).



Overconfidence

The narrative fallacy is our tendency to continuously make sense of the world through stories, and it underlies the illusion of understanding: in fact, we understand the past less well than we believe we do.

  • People find simple and concrete explanatory stories, like attributing success to talent or failure to stupidity, more persuasive than abstract concepts like luck.
  • However, these flawed narratives of the past inevitably shape a distorted view of the world and our future expectations.
  • The story of Google's rise, for example, is thrilling because of the constant risk of disaster. Yet luck played a key role in the company's success, which makes that success difficult to replicate.
  • In other words, stories of business rise and fall resonate because they offer what our minds crave: a simple narrative of triumph and failure with clear causes. These stories, however, ignore the determining power of luck and the inevitable regression to the mean. They create and sustain an illusion of understanding, offering lessons of little lasting value to readers eager to believe them.

Hindsight bias is a mental shortcut that makes us believe we accurately predicted events after they already happened.

  • It is also sometimes called the "knew-it-all-along" phenomenon.
  • When an unpredicted event occurs, we immediately adjust our view of the world to accommodate it. The adjustment comes at a cost: once a new view is adopted, we lose much of our ability to recall what we used to believe.
  • Imagine expecting a close football game between two evenly matched teams. Now imagine your surprise when one team dominates the other. Hindsight bias kicks in, and you may rewrite your memory so that the winning team's superiority seems to have been obvious all along.
  • Consequently, hindsight bias makes us prone to blaming decision-makers for good decisions that backfired, while giving them too little credit for successes that now seem obvious.
  • In extreme cases, fuelled by the narrative fallacy's need for clear causes, we can even get the causal relationship backward. After a company fails, we might be more likely to view the CEO as incompetent based on the company's performance, even if their leadership was not the root cause of the failure.
  • In summary, the illusion that we understand the past fosters overconfidence in our ability to predict the future.

Our subjective confidence in our opinions often stems from the coherence of the story that System 1 and System 2 construct from limited or even poor-quality evidence, not from whether the story is actually true.
  • This aligns with the WYSIATI rule: "What You See Is All There Is." We consider only the readily available evidence.
  • When presented with more evidence, we readily admit that our forecasts, often based on intuition, were likely flawed.
  • However, even if these intuitive predictions are demonstrably wrong (or only slightly better than random guesses), we may still feel and act as if each specific prediction was valid: The illusion of validity.
  • Essentially, investment involves traders making confident educated guesses in a highly uncertain environment. While those with more knowledge tend to make slightly better forecasts than those with less, the relationship is not straightforward. The reason for this is that as someone acquires more knowledge, they may develop an inflated sense of their own skill and fall victim to overconfidence. This can lead them to overestimate the accuracy of their predictions in this unpredictable world.

Across many studies, the accuracy of expert subjective impressions has been shown to be matched or exceeded by statistical predictions made using simple algorithms.

  • One reason why human decision-makers are often worse than a prediction formula is that experts attempt to be clever, think outside the box, and consider complex combinations of features when making their predictions. However, while complexity may be beneficial in rare cases, it more often reduces validity. Simpler combinations of features tend to be more effective.
  • Another reason expert judgment is inferior is that humans exhibit inherent inconsistency when making judgments about complex information. When presented with the same information twice, they often provide differing evaluations, susceptible to the influence of priming, cognitive ease, and surrounding circumstances.
  • Even though formulas like the Apgar score are widely used in daily practice, we still prize clinical expertise for its dynamic, holistic, and sophisticated approach, which may nonetheless lack a strong statistical foundation.
  • A story of a child dying due to an algorithmic error resonates more powerfully than one caused by human error. This difference in emotional intensity can easily translate into a moral preference for human judgment, despite the potential for human bias.

While heuristics can influence intuitive judgments, particularly under uncertainty, prolonged practice with timely feedback offers a deeper explanation for accurate expert intuition.

  • For example, a seasoned firefighter can assess a dangerous situation based on subtle smoke patterns, sounds, and building materials, and make split-second decisions.
  • Through countless patient interactions, doctors may also develop a refined intuition that allows them to identify potential issues during an initial examination.
  • In essence, the situation provides a cue, unlocking a vast network of experience stored in memory. This retrieved knowledge fuels the expert's seemingly intuitive judgment. Intuition, in these cases, is just recognition from a culmination of experience. In the absence of valid cues, intuitive "hits" are due either to luck or, more concerning, to biases or misconceptions masquerading as insights.
  • Nonetheless, the absence of immediate feedback makes it difficult to connect actions with outcomes, hindering the development of expertise based on past experiences.

In planning, we often get caught up in the inside view, focusing solely on the specifics of the task at hand (WYSIATI) and making predictions based on our current limited understanding, often influenced by optimism bias. This leads to the planning fallacy, where we underestimate the time, cost, and effort required because the forecasts are unrealistically close to best-case scenarios.

  • Our overconfidence stems from neglecting "unknown unknowns" – unforeseen events like illnesses, divorces or unexpected complications that can significantly extend the project timeline. We fail to consider the true odds stacked against us, leading to an overly optimistic outlook. Consequently, these initiatives are unlikely to be completed on budget, on time, or to deliver the expected returns.
  • In contrast, the outside view considers the bigger picture and similar situations. It leverages past experiences, statistics from similar projects and general knowledge to create a more realistic baseline prediction. This broader perspective offers an objective view, less susceptible to biases.
  • However, when confronted with an outside view, we might be surprised but choose to ignore it. This is known as irrational perseverance, where we prioritize our commitment to the project over reason.

Optimists tend to be cheerful, happy, and popular due to their preference for seeing the bright side. However, the benefits of optimism are limited to those with a mild optimistic bias. They can "accentuate the positive" without losing touch with reality.

  • Optimistic bias plays a dominant role in individuals' willingness to take significant risks, making them more likely to become inventors and entrepreneurs. Their confidence in future success fuels a positive mood and persistence in the face of obstacles. However, this overconfidence can morph into an entrepreneurial delusion if they fail to consider the outside view, where failure rates are actually high.
  • Our System 1 thinking often falls prey to the WYSIATI principle (What You See Is All There Is). This leads us to focus solely on our goals, plans and actions while neglecting the high number of failed businesses that exist and failing to consider the actions and innovations of potential competitors (competition neglect). We fall into the trap of assuming we are above average, blind to the reality of the situation.
  • Even after careful evaluation of critical factors like product need, production cost, and demand trends, many optimists persist in the face of discouraging advice. Some even double down on their initial losses before giving up.
  • The truth is, financially, self-employment generally offers mediocre returns. On average, individuals with the same qualifications achieve higher returns by working for employers than by starting their own businesses. The evidence suggests that optimism is widespread, stubborn and costly.

Moreover, clients can inadvertently encourage expert overconfidence, leading to riskier behaviour. At worst, acting on pretended knowledge becomes the preferred solution.

  • For instance, clinicians who express uncertainty might be perceived as weak or vulnerable. This creates pressure to downplay doubt, even when the situation is unclear.
  • Experts who acknowledge the limitations of their knowledge risk being replaced by more confident competitors who appear more trustworthy to clients.



Choices

Every significant choice we make in life involves some uncertainty, inherent in the nature of decision-making.

  • Expected utility theory, adopted by economists, posits that rational logic forms the foundation of decision-making.
  • Expected utility is calculated by multiplying the utility of each possible outcome by its probability of occurring, and then summing the products.
  • For example, the expected value of an 80% chance to win $100 and a 20% chance to win $10 is $82 (0.8 × 100 + 0.2 × 10).
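
A quick check of that arithmetic, generalized into a tiny helper (the function name and the (probability, payoff) representation are just illustrative choices, not anything from the book):

```python
def expected_value(gamble):
    """Expected value of a gamble given as (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in gamble)

# 80% chance to win $100, 20% chance to win $10
print(expected_value([(0.8, 100), (0.2, 10)]))  # 82.0
```

Expected utility theory applies the same weighted sum to the utilities of the outcomes rather than to the dollar amounts themselves.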

NOTE: Theory-induced blindness occurs when our strong belief in a theory makes it difficult to notice its flaws. We might encounter observations that contradict the model, but instead of questioning the theory, we assume there is a missing explanation that fits the existing framework. We give the theory the benefit of the doubt, trusting the established experts who support it.

While expected utility theory assumes rationality, Daniel Kahneman's prospect theory describes how humans actually make risky choices.

  • We define outcomes as gains and losses from a neutral reference point, not just states of wealth.

Risk aversion
  • Most people dislike risk. They prefer a certain outcome, even if it's smaller, over a risky option with a potentially higher payoff.
Loss aversion
  • Moreover, people feel losses more intensely than gains of the same size.
  • This means we tend to be more motivated to avoid losses than to acquire gains; most people will accept a mixed gamble only when the potential gain is significantly larger than the potential loss, typically between 1.5 and 2.5 times as large.
Endowment effect
  • A cognitive bias where people tend to value things they own more highly than things they do not own, even if they are objectively the same.
  • This is especially true for items held for a longer period, where emotional attachment can build over time.
  • The fear of losing the item becomes a bigger factor in its perceived value compared to the potential pleasure of acquiring something new.
  • This differentiates it from standard trading, where decisions are typically based on objective factors and market value.
Bad events
  • Our brain, specifically the amygdala, responds very quickly to bad events (news, words, feedback), prioritizing threats over good news to ensure survival in the wild.
  • Hence, we are driven more strongly to avoid losses than to achieve gains.
  • When a firm is doing well, we do not consider it unfair if it does not share increased profits. However, we perceive it as unfair when the firm exploits its power to break informal agreements or contracts with customers or suppliers, imposing losses on others to boost its own profits.
The Fourfold Pattern
  • We overweight changes in probability at the extremes: a shift from a 0% to a 5% chance (the possibility effect) or from a 95% to a 100% chance (the certainty effect) looms far larger than an equal shift in the middle of the range (e.g., from 60% to 65%). This is why both the mere possibility of a loss and the prospect of a certain loss weigh so heavily on us; a short numerical sketch after the framing-effects notes below illustrates these decision weights.
  • Hence, the weights people assign to outcomes do not necessarily match the likelihood of those outcomes.
  • Risk-seeking for low-probability gains: People are more likely to take risks for a chance at a big win, even if the odds are low. (Imagine buying a lottery ticket for the hope of large gain)
  • Risk-averse for high-probability gains: People tend to be more cautious when the probability of a gain is high, preferring a sure thing over a slightly riskier option with a potentially higher reward. (Imagine choosing a guaranteed 10% return on investment over a 90% chance of an 11% return; for the same reason, we tend to accept unfavourable settlements out of fear of disappointment.)
  • Risk-averse for low-probability losses: People dislike even a small chance of a significant loss and will often go to great lengths to avoid it. (Imagine taking out extensive insurance coverage due to the fear of a large loss or becoming paranoid about the potential risk of bank robberies)
  • Risk-seeking for high-probability losses: When a loss seems very likely, people may take additional risks, figuring they "have nothing to lose" anyway. (Imagine gambling all your remaining money on a slim hope of recovery when you are already down significantly. Risk-taking of this kind often turns manageable failures into disasters.)
Framing effects
  • The way choices are presented, or the framing of the options, can evoke different emotions and preferences in decision-making.
  • One interesting consequence of framing effects is preference reversal.
    • This describes a situation where someone's choices actually switch depending on how the options are presented, even though the underlying value of those options remains the same.
    • For example, a surgery described as having a 90% one-month survival rate sounds far more acceptable than the same surgery described as having 10% mortality in the first month. Likewise, as the fourfold pattern suggests, people are more willing to accept a gamble when it is framed as a way to avoid a sure loss than when the same gamble is framed as a chance for an equivalent gain.
  • Vivid descriptions can also make low-probability events seem much more significant through denominator neglect.
    • Consider a vaccine that protects children from a fatal disease. The risk of permanent disability is statistically low, at 0.001%, and might seem negligible.
    • However, framing the risk differently can have a strong impact. Imagine hearing this instead: "One out of every 100,000 vaccinated children will suffer permanent disability."
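
To make the loss-aversion and probability-weighting ideas above concrete, here is a minimal numerical sketch. The functional forms and parameter values (an exponent of 0.88, a loss-aversion coefficient of 2.25, a weighting parameter of 0.61) are the commonly cited Tversky-Kahneman estimates, used here purely as illustrative assumptions rather than as figures from this summary.

```python
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x relative to the reference point."""
    if x >= 0:
        return x ** alpha                 # diminishing sensitivity to gains
    return -lam * ((-x) ** alpha)         # losses loom larger (loss aversion)

def weight(p, gamma=0.61):
    """Decision weight: small probabilities overweighted, large ones underweighted."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

# Loss aversion: a $100 loss hurts far more than a $100 gain pleases.
print(value(100), value(-100))      # ~57.5 vs ~-129.4

# Fourfold pattern: a 1% chance gets several times its "fair" weight,
# while a 99% chance gets noticeably less than 0.99.
print(weight(0.01), weight(0.99))   # ~0.06 vs ~0.91
```

The asymmetric value function produces loss aversion and the endowment effect, while the distorted decision weights at the two ends of the probability scale generate the fourfold pattern of risk attitudes described above.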

Keeping Score

  • Similar to accounting, our minds mentally track gains and losses from our initial investments.
  • Unlike accounting, however, this mental bookkeeping is swayed by emotions, as in the well-described disposition effect.
  • This effect makes us reluctant to admit "failure" by selling losing investments; instead, we tend to sell winners before losers, potentially harming our long-term financial performance.

Further complicating sound financial decisions are biases like the sunk cost fallacy.

  • This describes how people tend to "throw good money after bad" rather than consider the odds that an incremental investment will produce a positive return.
  • They continue investing in failing projects simply because they have already invested significant resources.
  • Therefore, boards of directors, aware of these biases, may replace a CEO who is unwilling to move on from failing projects, even if the new CEO is not necessarily seen as more competent.

Regret is an emotional state accompanied by feelings that one should have known better.

  • It is triggered by the availability of alternatives to the actual course of events, and it is worse when unusual events are involved.
  • Blame, on the other hand, is directed towards someone who took an unreasonable risk and seems to deserve the negative outcome.
  • Consider two men: one who rarely picks up hitchhikers and another who frequently does. Yesterday, they both gave a man a ride and were robbed. Which man would likely experience greater regret, and who would be blamed more by others?
  • The key is not the difference between commission and omission but the distinction between default options and actions that deviate from the default.



Two Selves

Experienced utility refers to the moment-to-moment pleasure or pain associated with an activity or event, and it shapes our preferences and motivations.

  • It is distinct from decision utility, which focuses on the anticipated value or overall evaluation of an outcome and is primarily influenced by rational considerations.
  • Nonetheless, the two may coincide when people enjoy what they do and do what they enjoy.

The two selves

  • Experiencing self: the "live" version of ourselves, focused on the present moment. It answers the question, "Does it hurt now?"
  • Remembering self: the "memory" version of ourselves, reflecting on the past. It answers the question, "How was it, on the whole?"
  • Memories are all we have left from our experiences. Since we can only reflect on our lives through memories, our perspective is primarily shaped by the remembering self.
  • To illustrate, a long symphony marred by a shocking sound at the very end (exemplifying the peak-end rule; see the small sketch after this list) may ruin the memory of the whole experience, effectively discounting the 40 minutes of musical bliss that preceded it.
  • In summary, the experiencing self is silent, living in the moment without a voice. While the remembering self can be mistaken, it plays a crucial role: it keeps score of our experiences, shaping what we learn from life and ultimately guiding our decisions.
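
A minimal sketch of the peak-end rule with made-up pain ratings (the numbers and the moment-by-moment representation are purely illustrative assumptions): the remembering self scores an episode roughly by its worst moment and its final moment, largely ignoring duration and total experience.

```python
def remembered_pain(pain):
    """Peak-end approximation of remembered pain: mean of the worst and the last moment."""
    return (max(pain) + pain[-1]) / 2

short_intense = [7, 8, 8]             # short episode that ends at its worst
longer_milder = [7, 8, 8, 5, 4, 3]    # same start, plus a milder tail

print(sum(short_intense), remembered_pain(short_intense))  # total 23, remembered 8.0
print(sum(longer_milder), remembered_pain(longer_milder))  # total 35, remembered 5.5
```

The longer episode contains strictly more total pain, yet it is remembered as less unpleasant because it ends mildly, exactly the kind of divergence between the experiencing self and the remembering self described above.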

Life as a story

  • A story is not about the mere passage of time, but about significant events and memorable moments.
  • In storytelling, duration is often neglected, and the ending frequently defines the character of the narrative.
  • For example, a divorce does not necessarily mean the marriage was a complete failure. Even if the good times lasted ten times longer than the bad, the remembering self often focuses on the negative ending.

Our emotional state is largely determined by what captures our attention.

  • We typically focus on our current activity and immediate environment.
  • There are exceptions, however, where the quality of subjective experience is dominated by recurrent thoughts rather than by the events of the moment.
  • When happily in love, we may feel joy even when caught in traffic, and when grieving, we may remain depressed while watching a funny movie.

People's evaluations of their lives and their actual experience may be related, but they are also distinct.

  • Factors like poverty and educational attainment can significantly influence this gap.
  • To illustrate, illness can be a far greater burden for the very poor than for the wealthy.
  • People who seem equally fortunate can vary greatly in happiness because of what they value and what they strive for in life.

On their wedding day, most couples are aware of the high divorce rates and even more common marital dissatisfaction. Yet, they often believe they are somehow immune to these statistics. This exemplifies a classic issue with affective forecasting, where we struggle to predict our future emotions accurately.

  • Newlyweds, basking in the initial euphoria of love, might feel their happiness will last forever. However, as the "honeymoon phase" inevitably fades, their well-being becomes more dependent on daily activities and their current environment, just like everyone else.
  • Married women generally spend less time alone and with friends compared to their single counterparts. While intimacy increases, so do time commitments to housework, meal preparation, and childcare - tasks that often fall outside the category of "popular activities."
  • Consequently, many women may perceive their marriage as unhappy. However, deliberately cultivating positive thoughts about their relationship throughout the day can significantly improve their emotional experience.

The focusing illusion is a cognitive bias that occurs when we overestimate the importance of a single factor in influencing an outcome, especially when we dwell on that particular aspect. We tend to fixate on this one detail and neglect the influence of other relevant factors.

  • Nothing in life is as important as you think it is when you are thinking about it.
  • To illustrate, investing in a comfortable new car can feel like a significant decision, though the initial excitement might fade as it becomes your everyday ride.



Summary

Thinking, Fast and Slow is an influential book whose concepts are often highlighted in other self-help books. For example,

  • Adam Grant's Originals explores the idea of two thinking systems and risk aversion, while
  • Chris Voss's Never Split the Difference highlights the two thinking systems, the anchoring effect, and prospect theory.

However, while Thinking, Fast and Slow's concepts are widely used, it is important to consider the limitations of some of the research findings, given small study samples and the possibility of chance results.

  • Even if an initial experiment with a small sample yields a result, it might not be successfully replicated with a larger, more diverse group.
  • Consequently, applying these findings to real-world situations can lead to a false sense of confidence.

    Moreover, the "System 1" and "System 2" model, while powerful, might be an oversimplification of how our minds actually work.

    • Some argue that our minds function along a continuum rather than in two distinct categories.
