Making better decisions, probably. Thinking, Fast and Slow by Daniel Kahneman.

Kit Teguh
10 min read · Jan 14, 2025


“Subjective confidence in a judgement is not a reasoned evaluation of the probability that this judgement is correct. Confidence is a feeling, which reflects coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”

This is the book that I wish I had read when it was first published in 2011, when I was a struggling Honours student working a dead-end job with the power company, cutting off people’s power when they hadn’t paid their bills. Fuck, it was a shit job. It’s the type of book where the earlier you read it in your life, and the more you study it, the more it’s going to help. I’m pushing towards forty now and I’m still a bit shit at making decisions.

WHERE WERE YOU KAHNEMAN? But enough gaslighting a dead Nobel Laureate for no reason; the fault is on me for not picking up the book earlier, even though about ten people had already recommended it to me. I started reading it before the start of the year and only finished it in the second week of the new one, almost three weeks in total. It is worthwhile to study each page, and if you don’t connect the dots, you gotta keep trying to connect them, because you might get that eureka moment when you do.

I keep coming back to the comma in the title. The comma separates the thinking, which is an independent entity in itself. The fast and slow are two approaches we have to any impression we form in our minds: the automatic and non-automatic ways of thinking. Kahneman separates these into two systems, concisely named system 1 and system 2.

System 1 takes what we see and perceive for granted, glossing over things without much thought. In general, system 1 accepts what we see (What You See Is All There Is, shortened to WYSIATI) without questioning much. Our system 1 is built from our previous learning experiences, culminating in second-nature knowledge. But what we see and read the first time isn’t always right, and that’s where our system 2 comes in, or at the very least, should come in.

System 2, in contrast, questions what system 1 won’t. It examines and analyses below the surface and slows things down a fair bit before jumping to conclusions. It requires more effort and focus, for tasks such as finding where the fuck Waldo is, figuring out whether someone is a Scot rather than Irish from his accent, or multiplying any number except 1 by pi.

I was mistaken about what this book was about, which I thought was how to put the brakes on your system 1 when you really need to, and switch to system 2. The book instead dives deeper into the psychological theories behind the two systems, generously backed with heaps and heaps of experimental discussion and data. Having said that, it is a useful read and shows how silly we can be in the face of what appear to be obvious problems that need more complex solutions.

I think this dude is thinking slow. Must’ve been thinking for a couple of centuries now. Photo by Valentin Kremer on Unsplash

Over the course of their lives, Kahneman and his partner in crime, Amos Tversky (who passed away before the publication of the now seminal book), would posit hypothetical questions to each other, not dissimilar in vein to who you’d marry, fuck or kill, but with a more academic nuance. Plenty of these choices are simple on the surface, but have deep implications about how we think. You will see some of these questions later as we go a bit deeper.

Thinking, Fast and Slow. But mainly slow.

There are too many takeaways to list, but I need to summarise what I find the most useful.

We are dumber than we think

It is the classic bat and ball story which failed half of Harvard undergraduates, and honestly, failed me too the first time I saw it:

If a bat and a ball cost $1.10 together, and the bat costs a dollar more than the ball, how much is each item?

At first glance, the answer is simple enough: the bat is a dollar and the ball is 10c. But check it; if the ball is 10c, a bat costing a dollar more is $1.10, and the total adds up to $1.20. This is the obvious answer that most people would take, and it’s damn wrong. The correct answer is $1.05 for the bat and 5c for the ball. There is a simple algebraic equation to figure this out, but you can figure that out for yourself.
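For the impatient, the algebra can be sketched in a few lines of plain Python (rounding just to dodge floating-point noise):

```python
# bat + ball = 1.10 and bat = ball + 1.00
# Substituting: (ball + 1.00) + ball = 1.10  =>  2 * ball = 0.10
ball = round((1.10 - 1.00) / 2, 2)
bat = round(ball + 1.00, 2)
print(bat, ball)  # 1.05 0.05
```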

The key really is to identify situations where the obvious answer may not be the best answer, but at the end of the book, Kahneman admits that this awareness and its triggers will be different for different people. It’s really up to us to jump-start our system 2 into action, lest our system 1 make a dummy out of us, based on the previous experiences and internal thought processes we have in place.

Other reasons why we’re dumber than we think

Our thinking is coloured by the information most readily available to us. We lean on what comes to mind first instead of looking at the overall picture, and so we miss the flaw in the original statement offered. Consider this statement:

“How many of each animal did Moses take onto the ark?”

Instead of examining the statement and judging its accuracy, we jump straight to answering the question about the animals. On second glance, we should notice that it wasn’t Moses who took the animals onto the ark; it was Noah. But if we replace Moses with another figure such as Nelson Mandela, we would spot the error of the statement right away. The relative closeness of Noah and Moses, both being figures in the Bible, was enough to fool us into accepting the errant question in the first place.

Another brainteaser:

Which is more probable?
John is a postman.
John is a postman and walks to work.

For plenty of people, the second option seems more probable, as the narrative makes sense that a postman would walk to work. However, when you bundle the two events together, the probability of both happening can never exceed that of the more general statement. There is a difference between plausibility and probability: the fact that a story hangs together well doesn’t make it more likely to be true.
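The conjunction rule behind this is mechanical: a joint event can never be more probable than either of its parts. A quick sketch, with probabilities I’ve made up purely for illustration:

```python
# Made-up probabilities, only to show the conjunction rule at work
p_postman = 0.02                 # P(John is a postman), assumed
p_walks_given_postman = 0.60     # P(walks to work | postman), assumed
p_both = p_postman * p_walks_given_postman

# P(postman AND walks) can never exceed P(postman)
assert p_both <= p_postman
print(round(p_both, 3))  # 0.012
```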

Consider also the question:

An individual has been described by a neighbor as follows: “Steve is very shy and withdrawn, invariably helpful but with very little interest in people or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.” Is Steve more likely to be a librarian or a farmer?

At first glance, Steve would be a librarian, as the personality matches the stereotype of what a librarian is supposed to be. But farmers can be quiet, withdrawn and helpful too; the two professions share similar personality traits. And I’d venture to say there are also loud librarians who are not very helpful at all. What matters is that there are far more farmers than librarians in the population, which means there is a greater probability that Steve is a farmer.

We often neglect the base rate, that is, the percentage of a population which displays a certain characteristic. Sometimes we are so focused on the problem that we miss the bigger picture.
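You can see why the base rate dominates with a back-of-the-envelope Bayes calculation. All the numbers below are assumptions I’ve invented for illustration, not figures from the book:

```python
# Base rates and "fit", with numbers made up for illustration only
farmers_per_librarian = 20.0   # assumed: far more farmers than librarians
p_shy_given_librarian = 0.9    # assumed: most librarians fit the sketch
p_shy_given_farmer = 0.2       # assumed: only some farmers do

# Bayes (unnormalised): posterior weight = base rate * fit
weight_librarian = 1.0 * p_shy_given_librarian              # 0.9
weight_farmer = farmers_per_librarian * p_shy_given_farmer  # 4.0

# Even though the description fits librarians far better,
# the base rate still makes "farmer" the better bet
assert weight_farmer > weight_librarian
```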

Kahneman filling the gap where Bernoulli fell short

Bernoulli was a fucking genius, but theories from 300 years ago can show some cracks over time. Bernoulli fails to account for the initial reference state when it comes to winning or losing wealth. The reference point matters relative to someone’s situation: gaining $100 means far more to someone who has only $20 to begin with than to someone who has $1,000. This difference in reference point is the foundation of Kahneman’s prospect theory.
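A toy calculation makes the $20-versus-$1,000 point concrete. I’m using a log utility here as a stand-in (it’s actually Bernoulli’s own curve, not Kahneman’s value function), just to show how much a $100 gain "feels like" from different starting points:

```python
import math

# How big a $100 gain feels from different reference wealths,
# using log utility as a rough stand-in
gain = 100
for wealth in (20, 1000):
    felt = math.log(wealth + gain) - math.log(wealth)
    print(wealth, round(felt, 3))
# The jump from $20 to $120 dwarfs the jump from $1,000 to $1,100
```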

People are more averse to losing money than they are drawn to gaining it.

Another Bernoulli faux pas is to assume that gains and losses carry equal emotional weight, though in reality, losses have more impact on one’s decisions than gains. This is the idea of loss aversion, where individuals are less willing to risk losses than they are keen on gunning for gains. Imagine the following scenario:

If the coin shows tails, you lose $100.
If the coin shows head, you win $150.
Is this gamble attractive? Would you accept it?

Most people would not take the bet even when the payout is larger than what they stand to lose, and if I ask myself the question honestly, neither would I. Of course, when there is nothing to lose, anybody would risk $0 to potentially gain $150.
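On paper the bet is clearly good; what kills it is the extra weight losses carry. A rough sketch below, where the 2x loss-aversion factor is a commonly quoted ballpark, not an exact figure from the book:

```python
# Raw expected value of the flip: positive, so a purely rational
# agent (an "Econ", in Kahneman's terms) would take it
ev = 0.5 * 150 + 0.5 * (-100)                 # +25

# Weight the loss roughly twice as heavily (ballpark loss aversion)
lam = 2.0
felt_value = 0.5 * 150 + 0.5 * lam * (-100)   # -25

print(ev, felt_value)  # 25.0 -25.0
```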

Kahneman plotted this in the now seminal S-curve of prospect theory below. An interesting thing to note is that for gains, the psychological value tapers off after a certain amount. For example, there is little extra benefit in gaining $20 million as opposed to $10 million, depending on how much Fuck You money you need, and this will be different for every person.

People can go all in when desperate, even if this is fucking TERRIBLE for them in the long run.

In the diagram below, which maps our risk attitudes against probability for gains and losses, three of the four boxes behave as you’d expect. The top right box is worth discussing, as it shows that plenty of punters would rather gamble than accept the certainty of a loss. For example, most people would accept a 95% gamble of losing $1,000 (with a 5% chance of losing nothing) rather than just paying $800 straight up.
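The arithmetic shows how irrational that preference is on expectation:

```python
# A sure loss of $800 vs a 95% chance of losing $1,000
sure_loss = -800
gamble_ev = 0.95 * (-1000) + 0.05 * 0   # -950

# The gamble is worse on average, yet in the loss domain most
# people grab the slim chance of escaping unscathed
assert gamble_ev < sure_loss
print(sure_loss, gamble_ev)
```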

In economics, this is related to the sunk cost fallacy, where the person digs themselves into a deeper hole instead of accepting current losses. Gamblers are guilty of this behaviour, seeing a super slim chance to win their money back but instead ending up mortgaging their houses to pay off the debt.

This reminds me of a story I heard from a bartender in Las Vegas, whose grandmother lost both her properties after winning $100,000 on the first day. The nice man from the casino stopped her so she could spend another night free of charge and let the good times roll, maybe double her winnings. Yeah well, that didn’t fucking happen, did it? It is a psychological flaw almost impossible to remedy.

---

Honestly, I couldn’t summarise all the things that I picked up from Thinking, Fast and Slow. It is a book to explore every so often, if you’re feeling stuck with a problem or if you just want to understand how the mind works. Will it make you smarter? I’d like to think so, but it depends on whether you use the information in your day-to-day life. It’s been a few days since I finished the book and I have to say it has made me a little calmer.

Last night, as I was hurrying to wash my clothes so I could pack them into a suitcase, I accidentally washed my AirPods. Thankfully, those things are fucking strong and after fanning them out overnight and putting them into a container of rice, they still work. But I thought about the issue from the book: whether I should purchase another pair if I couldn’t use them again.

In the book, this issue is equivalent to having to pay another RM800 for a new pair, or letting the RM800 already spent go. Though this is a lot of money, it wouldn’t have broken the bank to buy a new pair, and AirPods are one of those essential things you need when you’re out and about walking around foreign streets. In this case, the psychological value of the AirPods exceeds the monetary value, if I do a quick and abstract calculation off the top of my head. But fuck me, I’m glad those bloody things are made strong enough to last a wash cycle.

It is a book which has been much imitated. Think Rolf Dobelli’s The Art of Thinking Clearly, or Manson’s The Subtle Art of Not Giving A F*ck. But Kahneman, though sometimes anecdotal with his personal stories, never relies on them as prime examples to prove his point, which adds to the readability of the book (not that it is an easy book to digest). He lets the findings of academic experiments stand as proof, and though this raises the difficulty of the book, it makes a much more credible argument than whatever story Stephen Covey or Jim Collins cook up to feed their readers.

Speaking of which, I’m damn glad that Kahneman criticised some of these more popular authors, taking jabs at Malcolm Gladwell and Jim Collins. Gladwell, who wrote Blink before Thinking, Fast and Slow, is much too lazy to explain why we have a system 1 and somehow just know when something isn’t right. Then he contradicts himself when he states that our system 1 doesn’t work all the time and we just have to accept it. Kahneman at least offers (or attempts to offer) the mechanics behind this psychological flaw instead of just being a lazy cunt. Collins, well, the companies he selected in Good to Great didn’t do so well after the publication of his famous book. Call it the commentator’s curse.

I know that every so often I will open this book again to read what I’ve underlined and peruse it once more. There is a fair chance that I will reread it, even though it is not an easy book to read and slow reading is very much necessary. Thus, it is a book to be read, to be examined after it’s done, to be lived and to be shared. Don’t keep this book locked up.


Written by Kit Teguh

A full time project manager who loves to read on the side. Connect with me to chat anything tech and lit.
