Thinking, Fast and Slow – Daniel Kahneman

Nothing in life is as important as you think it is, while you are thinking about it.

DANIEL KAHNEMAN

Frequent repetition is a reliable way to make people believe falsehoods because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact.

Bat and Ball Problem

Try to solve this problem –

“A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How much does the ball cost?”

If you said $1 for the bat and 10 cents for the ball, you got it wrong – that is the quick, intuitive answer supplied by System 1.

(The answer is in the next section.)

Answer To Bat & Ball Problem

For this kind of problem you need to engage System 2 and spend a little time on it. The answer is 5 cents for the ball and $1.05 for the bat.

Now the bat costs exactly $1 more than the ball, and together they total $1.10.
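A minimal sketch of the algebra in Python (my own check, not from the book):

    # ball + bat = 1.10 and bat = ball + 1.00
    # => ball + (ball + 1.00) = 1.10  =>  2 * ball = 0.10  =>  ball = 0.05
    ball = (1.10 - 1.00) / 2
    bat = ball + 1.00
    print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
    print(f"difference = ${bat - ball:.2f}")        # difference = $1.00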

Framing Effect

Imagine that there is an outbreak of a deadly disease that will kill 600 people and you must choose between the two options –  

Option 1: Guarantees that 200 people will live.

Option 2: Provides a one-third probability that all 600 people will live, but a two-thirds probability that no one will survive.

Most people pick Option 1, the option framed as a sure gain. Yet if you think the choices through, their expected outcomes are identical – both save 200 lives on average. The wording of the options, not the underlying statistics, drives the decision. This is called the framing effect.
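A quick back-of-the-envelope check in Python (my own arithmetic, not from the book) showing the two options are statistically equivalent:

    # Expected number of lives saved under each option
    option_1 = 200                           # 200 people saved for certain
    option_2 = (1 / 3) * 600 + (2 / 3) * 0   # one-third chance all 600 are saved
    print(option_1, option_2)                # 200 200.0 -- identical on average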

Status Quo Bias

System 1 defaults to choices that maintain the status quo because it weighs losses roughly twice as heavily as equivalent gains (loss aversion), and because it is emotionally attached to objects it already owns or has invested in (the endowment effect). Both effects make the current state of affairs look more valuable than it really is.

You can ask yourself – “What opportunities do I lose by maintaining the status quo?” (or, “If I continue to say yes to this, what am I saying no to?”)
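A minimal sketch (my own illustration, not the book's model) of how that roughly two-to-one loss weighting keeps System 1 stuck on the default – switching means giving up what you already have, and that loss is felt about twice as strongly as an equivalent gain:

    # Illustrative numbers; the 2x factor is the approximate loss-aversion
    # weighting Kahneman describes.
    LOSS_AVERSION = 2.0

    current_option = 100   # what the status quo is worth to you
    alternative = 130      # what the new option is objectively worth

    objective_gain = alternative - current_option                # +30: switching is better
    felt_change = alternative - LOSS_AVERSION * current_option   # -70: switching feels like a loss

    print(objective_gain, felt_change)  # 30 -70.0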

Understanding 2 Systems

  1. System 1 runs on the oldest parts of the brain. It operates automatically and involuntarily, is always active, and handles most of our day-to-day activities. It also drives our reactions to danger and novelty, and it is the source of intuition.
  2. System 2 allocates attention and carries out tasks that require effort. It relies on more recently evolved parts of the brain – in particular the prefrontal cortex, which is far more developed in humans than in any other species.

Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

Availability Bias

Availability bias happens when we give too much weight to evidence or experiences that come to mind easily – typically the most recent or most vivid ones.

If we don’t check for availability bias before making an important decision, our preference will be shaped by whatever examples our environment has made most readily available, rather than by the full range of evidence.

So before each decision, just ask “Is this the best option or just the option I’ve been frequently exposed to?”

Loss Aversion

Ask yourself: if a coin flip could lose you $100 on tails or win you $150 on heads, would you take the bet? Did you feel a slight hesitation about the gamble? Most people do, even though it is a favorable bet.

Losses loom larger than gains – relative to a reference point, a loss is more painful than a gain of the same magnitude.
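Applying the same rough two-to-one loss weighting to the coin flip (an illustrative calculation, not from the book) shows why a favorable bet can still feel unattractive:

    # Expected dollar value of the flip vs. how it feels when losses weigh ~2x
    expected_value = 0.5 * 150 + 0.5 * (-100)      # +25.0: the bet is favorable
    felt_value = 0.5 * 150 + 0.5 * (2.0 * -100)    # -25.0: the possible loss looms larger
    print(expected_value, felt_value)              # 25.0 -25.0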

Tunnel Vision Bias

System 1 loves to form quick judgments from limited information and then block out conflicting evidence. Kahneman calls this WYSIATI (What You See Is All There Is).

Kahneman explains that System 1 sees two or three pieces of information and then “infers and invents causes and intentions, then neglects ambiguity and suppresses doubt.”

Just try to ask, “Why might the opposite be true?”

Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.

Confirmation Bias

Within WYSIATI (What You See Is All There Is), people will be quick to seize on limited evidence that confirms their existing perspective. And they will ignore or fail to seek evidence that runs contrary to the coherent story they have already created in their mind.

Just ask yourself – “Why do I hold this particular belief or perspective?”

The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.

Anchoring Effect

The anchoring effect occurs when we consider an arbitrary value for an unknown quantity before estimating that quantity – the estimate then stays biased toward the anchor.

Anchors are known to influence many things, including the amount of money people are willing to pay for products they have not seen. 

So before accepting an anchor, research what the thing is actually worth and then decide.
