THINKING, FAST AND SLOW

Thinking, Fast and Slow is Daniel Kahneman's account of decades of research in behavioural science. Published in October 2011, the book won the 2012 National Academies Communication Award and was a New York Times bestseller. In it, Kahneman takes us on a tour of the mind and describes the two systems that guide the way we think. System 1 is fast and automatic: it operates with little effort and no sense of voluntary control. System 2 is slow and logical: it allocates attention to effortful mental activities.

HOW DOES THIS BOOK HELP US? 

Thinking, Fast and Slow helped us understand how prone we are to neglect facts, others’ failures and what we don’t know in favour of what we do know. The book made us realise that the outcome of our achievements does not lie entirely in our hands; luck also plays a significant role.

THE BOOK EXPLAINED IN UNDER 60 SECONDS 

  1. Thinking Fast and Slow shows how two systems in the brain constantly compete for control of behaviour and action. It catalogues the many ways this tug-of-war leads to errors in memory, judgement and decisions.
  2. The book delineates rational and non-rational motivations associated with each thinking process and how they complement each other.
  3. Thinking, Fast and Slow summarises research suggesting that people place too much confidence in human judgement.

TOP THREE QUOTES 

  1. “A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact.”
  2. “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”
  3. “The easiest way to increase happiness is to control your use of time. Can you find more time to do the things you enjoy doing?”

BOOK SUMMARY AND NOTES

PART ONE: TWO SYSTEMS

Chapter one: The Characters of the Story

The human mind comprises two characters: one that thinks fast, known as System 1, and one that thinks slow, System 2.

System 1

System 1 operates automatically and swiftly, with little effort and no sense of voluntary control. It quickly originates impressions and feelings that are the primary sources of the explicit beliefs and deliberate choices of System 2. The capabilities of System 1 include innate skills that we share with animals: you’re born prepared to perceive the world around you, recognise objects, orient attention and avoid losses. Other mental activities become fast and automatic through prolonged practice. Examples of automatic activities attributed to System 1 include detecting hostility in a voice, reading words on giant billboards and orienting to the source of a sudden sound.

System 2

System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are usually associated with the subjective experience of agency, choice and concentration. These operations are diverse, but they have one attribute in common: they require attention. If your attention is directed inappropriately, chances are you’ll fail to execute them. Examples of diverse activities associated with System 2 include checking the validity of a logical argument, parking in a narrow space and focusing on a single person’s voice in a crowd.

System 2 can alter the way System 1 works by programming the normally automatic functions of attention and memory. For example, you can set your memory to search for cities that start with the letter “N”.

Chapter two: Attention and Effort

Engaging in effortful mental activities affects your body. Pupils, for instance, are sensitive indicators of mental effort: they dilate when you tackle a challenging task. Since thinking slowly requires attention, effort and work, you’re more likely to default to thinking fast, the path of least resistance. Thinking fast is enough to accomplish routine duties and well-practised tasks; you’ll think slow when dealing with complicated tasks, because they require your effort and attention.

Chapter three: The Lazy Controller

In most cases it’s easy, and quite pleasant, to walk and think at the same time, but at the extremes these activities compete for the limited resources of System 2. While walking along, try computing 23×78 in your head, and do so immediately. You’ll almost certainly stop to concentrate on the computation. Thinking while strolling is possible, but you cannot engage in deep mental work that imposes a heavy load on short-term memory. Self-control diminishes when you’re tired, hungry or mentally engaged; because of this, you’re prone to let System 1 take over intuitively and impulsively.

Favourite quote of the part: “The world makes less sense than you think. The coherence comes mostly from the way your mind works.”

PART TWO: HEURISTICS AND BIASES

Chapter one: The Law of Small Numbers

Humans usually have a difficult time with statistics. Small samples are more prone to extreme results than large samples, yet we lend the outcomes of small samples more credence than the statistics warrant. Small samples are not representative of the populations they are drawn from, so a sufficiently large sample is the only way to reduce the risk of error. If you pick small samples, you leave yourself at the mercy of sampling luck.

We have a strong bias toward believing that small samples resemble the population from which they are drawn. This is part of a larger story: we are prone to exaggerate the consistency and coherence of what we see. The law of small numbers is one manifestation of a general bias favouring certainty over doubt.

Chapter two: Anchors

The anchoring effect is prevalent and consequential in everyday life. It occurs when people consider a particular value for an unknown quantity before estimating that quantity. When shown greater or lesser numbers, experimental subjects gave greater or lesser estimates. What happens is one of the most reliable and robust results of experimental psychology: the estimates stay close to the number the subjects considered, hence the image of an anchor.

Anchoring as Adjustment

A strategy for estimating uncertain quantities: start from the anchoring number, assess whether it’s too high or too low, and gradually adjust your estimate by mentally moving away from the anchor. The adjustment typically ends prematurely, because people stop when they’re no longer confident that they should move farther.

Chapter three: The Science of Availability

The availability heuristic means judging the probability of events by how easily instances come to mind. It operates on the principle “if you can think of it, it must be important”. Like other heuristics of judgement, the availability heuristic substitutes one question for another: you wish to estimate the frequency of an event, but you report an impression of the ease with which instances come to mind. Substituting questions inevitably produces errors. You can discover how the heuristic leads to biases with a straightforward procedure: list factors other than frequency that make it easy to retrieve instances. Each item on your list will be a potential source of bias.

The availability bias influences people driven by System 1 more than others, particularly when they’re engaged in another effortful task at the same time or when they score low on a depression scale.

Favourite quote of the part: “In most situations, a direct comparison makes people more careful and more logical. But not always. Sometimes intuition beats logic even when the correct answer stares you in the face.”

PART THREE: OVERCONFIDENCE

Chapter one: The Illusion of Understanding

The narrative fallacy describes our tendency to construct flawed stories of the past, which then shape our views of the world and our expectations for the future. These flawed stories arise inevitably from our continuous attempt to make sense of the world. Explanatory stories that people find compelling are simple and concrete rather than abstract; they assign a larger role to talent and intentions than to luck, and they focus on a few striking events that happened rather than the countless events that failed to happen. As a result, people tend to overestimate skill and underestimate luck. We constantly fool ourselves by constructing flimsy accounts of the past and believing they’re true.

The Social Costs of Hindsight

The mind that creates narratives about the past is a sense-making organ. When an unforeseeable event occurs, you immediately adjust your view of the world to accommodate the surprise. A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have since changed.

When you adopt a new view of the world, you instantly lose the ability to remember what you used to believe before your mind changed.

Chapter two: The Illusion of Validity

We sometimes believe our opinions, predictions and points of view are valid even when that confidence is unjustified; some of us even cling confidently to ideas in the face of counter-evidence. Because confidence is built on coherence, our subjective confidence in an opinion reflects the consistency of the story that Systems 1 and 2 have constructed. The amount and quality of the evidence matter little, because even poor evidence can make a very good story. For some of our most important beliefs we have no evidence at all, except that people we love and trust hold them. Considering how little we know, the confidence we have in our beliefs is remarkable.

Subjective confidence in a judgement is not a reasoned estimate of the probability that the judgement is correct. Confidence is a feeling which reflects the coherence of the information and the cognitive ease of processing it. It’s wise to take acknowledgements of uncertainty seriously; declarations of high confidence mainly tell you that an individual has constructed a coherent story in their mind, not necessarily that the story is true.

Chapter three: The Engine of Capitalism

The planning fallacy is one expression of a pervasive optimistic bias. We tend to view the world as more benign than it is, our attributes as more favourable than they truly are, and the goals we adopt as more achievable than they are likely to be. We are prone to neglect facts, others’ failures and what we don’t know in favour of what we know and how proficient we are. We think the results of our achievements lie entirely in our hands, ignoring the luck factor, and we fail to appraise the uncertainty of our environment.

Exaggerating our ability to forecast the future fosters optimistic overconfidence. In terms of its consequences for decisions, the optimistic bias may well be the most significant cognitive bias. Because optimism is both a blessing and a risk, if you are temperamentally optimistic, be both happy and wary.

Favourite quote of the part: “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

PART FOUR: CHOICES

Chapter one: Bernoulli’s Errors

What governs people’s choices between simple gambles, and between gambles and sure things? Simple gambles, such as a 40% chance to win $300, are to students of decision-making what the fruit fly is to geneticists. Choices between such gambles provide a simple model that shares crucial attributes with more complex decisions. Gambles represent the fact that the consequences of choices are never certain. Say you sign a contract to buy an apartment; you don’t know the price at which you’ll later sell it. Every significant choice we make in life comes with uncertainty, which is why students of decision-making hope that some of the lessons from the model situation will apply to more interesting everyday problems.
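The classical benchmark for such gambles is expected value: the probability-weighted payoff. A minimal sketch of that arithmetic (the helper name is my own illustration, not something from the book):

```python
# Expected value of a simple win-or-nothing gamble.
def expected_value(prob_win: float, payoff: float) -> float:
    """Return the probability-weighted payoff of the gamble."""
    return prob_win * payoff

# The chapter's example gamble: a 40% chance to win $300.
ev = expected_value(0.40, 300)
print(ev)  # 120.0
```

By this purely rational yardstick the gamble is worth $120, yet real people often prefer a smaller sure amount; that gap between the arithmetic and actual behaviour is what prospect theory sets out to explain.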

Chapter two: Prospect Theory

Prospect theory describes how individuals assess losses and gains asymmetrically. For example, for some people the pain of losing $1,000 can only be compensated by the pleasure of earning $2,000. In contrast to expected utility theory, which models the decisions that perfectly rational agents would make, prospect theory aims to describe how people actually behave. Utility theory makes analytical assumptions of economic rationality that do not represent people’s actual choices, and it does not consider cognitive biases. In utility theory, the utility of a gain is assessed by comparing the utilities of two states of wealth: for example, the utility of getting an extra $500 when your wealth is $1 million is the difference between the utility of $1,000,500 and the utility of $1 million.

When set side by side, losses loom larger than gains. This asymmetry between the power of positive and negative expectations has an evolutionary history: organisms that treat threats as more urgent than opportunities have a better chance to survive and reproduce.
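This asymmetry can be sketched numerically with the value function from Tversky and Kahneman’s later (1992) formulation of prospect theory; the parameters α ≈ 0.88 and λ ≈ 2.25 are their published median estimates, not figures taken from this summary:

```python
# Prospect-theory value function (Tversky & Kahneman, 1992 median estimates):
#   v(x) = x**alpha             for gains  (x >= 0)
#   v(x) = -lam * (-x)**alpha   for losses (x < 0)
# alpha < 1 captures diminishing sensitivity; lam > 1 captures loss aversion.
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = value(1000)   # psychological value of gaining $1,000
loss = value(-1000)  # psychological value of losing $1,000
print(abs(loss) / gain)  # 2.25: the loss looms more than twice as large
```

This matches the summary’s example: with λ at 2.25, the pain of losing $1,000 outweighs the pleasure of gaining the same amount by more than a factor of two, so a considerably larger gain is needed to compensate for the loss.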

Chapter three: Bad Events

The concept of loss aversion is, beyond doubt, psychology’s most significant contribution to behavioural economics. This is odd, because the idea that people evaluate many outcomes as gains and losses, and that losses loom larger than gains, surprises no one. People work harder to avoid losses than to achieve gains.

The negative trumps the positive in numerous ways, and loss aversion is one manifestation of a broad negativity dominance. Bad is stronger than good: bad parents, bad emotions and bad feedback have more impact than their good counterparts, and we process bad information more thoroughly than good. Bad impressions and harmful stereotypes are quicker to form and more resistant to disconfirmation than good ones.

Favourite quote of the part: “They will feel better about what happened if they’re able to frame the results in terms of how much money they kept rather than how much they lost.”

PART FIVE: TWO SELVES

Chapter one: Two Selves

We have both an experiencing self and a remembering self, and in most cases the latter takes precedence over the former. That’s to say, you can enjoy nine days of a vacation, but if things go wrong on the tenth day, you’ll remember the whole holiday as negative. When remembering an experience as a whole, you tend to overweight its ending, and your memory overrides your experience. Confusing experience with the memory of it is a compelling cognitive illusion; the substitution makes us believe that a past experience can be ruined.

Our decisions do not always generate the best possible experience, and our forecasts of future feelings are often wrong. You cannot fully trust your preferences to reflect your interests, even when they’re based on personal experience, and even when the memory of that experience was laid down within the last quarter of an hour.

Chapter two: Life as a Story

A story is about remarkable events and memorable moments, not time passing. Duration neglect is routine in a story, and the ending usually defines its character. The same core attributes appear in the rules of narratives and the memories of colonoscopies and movies. This is how the remembering self works: it creates stories and keeps them for future reference.

Most importantly, we all care intensely about the narrative of our own life and want it to be a good story with a decent hero. Caring for people often takes the form of concern for the quality of their stories rather than for their feelings. We can be deeply moved by events that change the stories of people who are already dead.

Favourite quote of the part: “The easiest way to increase happiness is to control your use of time. Can you find more time to do the things you enjoy doing?”

HOW THIS BOOK CAN HELP SOFTWARE DEVELOPERS

“Thinking, Fast and Slow” by Daniel Kahneman can help software developers by highlighting the cognitive biases that affect human thinking and decision-making. The book explains that humans have two thinking systems: System 1, which is automatic, quick, and prone to biases, and System 2, which is slower, more deliberate, and more rational. By being aware of these biases and understanding how they affect our thought processes, software developers can make better decisions, write more effective code, and create products that better serve their users. The book’s insights into cognitive biases, heuristics, and the interplay between our automatic and deliberate thinking processes can help developers design better user interfaces and experiences, identify and avoid common pitfalls in software development, and improve their own problem-solving and decision-making abilities.
