
The Art of Thinking Clearly


Book by Rolf Dobelli. This book lists 99 psychological traps that humans tend to fall into. The Art of Thinking Clearly aims to make the reader aware of these traps in the hope of avoiding them and thus thinking more clearly.

If you've ever read Nassim Nicholas Taleb's Incerto book series (Fooled by Randomness, The Black Swan, Skin in the Game, Antifragile and The Bed of Procrustes), you'll find that he talks about the same topics. This is no coincidence: Taleb is mentioned in the introduction as an inspiration.

Taleb's books, however, are very digressive, so these psychological traps are discussed in random places. Dobelli's, on the other hand, focuses on them exclusively and has a dedicated chapter for each, so it's a lot more organized.

In this summary I'll list each chapter and try to briefly convey what it's about. I'll title each entry after the trap itself, whereas Dobelli uses catchier names, for example "Why You Should Visit Cemeteries" as opposed to "Survivorship Bias" (which is left as the subtitle).

In Dobelli's book each chapter also contains examples and anecdotes to make the trap more relatable. For brevity, I'll omit examples where I feel I can describe the trap without them.

In some editions of the book, at the end of each chapter he lists related chapters, which I find very useful for forming mental connections (see Zettelkasten in On Memory).

1. Survivorship Bias

Success is more visible than failure, so we might overestimate the chances of success by looking at available information.

2. Swimmer’s Body Illusion

When observing a correlation, getting the direction of causation wrong. The name of the bias stems from the fact that swimmers have muscular and toned bodies, so one might conclude that swimming will give you that body.

In reality, people with the right build and athleticism are the ones more likely to end up in swimming competitions, which is where you're most likely to see them.

3. Clustering Illusion

Seeing patterns where they don’t exist. The visual version of this bias has a name: pareidolia.

4. Social Proof

Is our tendency to follow the actions of the majority. The author suggests this might have helped individuals survive as a group but today it’s used by ads to influence customers’ decisions.

5. Sunken Cost Fallacy

It's when we continue to spend time and money on something because of all the resources already invested, when the optimal decision would be to stop right away.

6. Reciprocity

It’s when we feel obliged to reciprocate a favor. The author claims NGOs exploit this by giving you a small gift and then asking for a donation, calling it a gentle blackmail.

The negative version of reciprocity is retaliation.

7. Confirmation Bias (Part 1)

Confirmation bias is when you cherry-pick data that supports (confirms) your beliefs and convictions.

What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact - Warren Buffett

The author claims that the brain forgets disconfirming evidence after a short time.

8. Confirmation Bias (Part 2)

This is the only bias the author dedicates two chapters to, possibly because he claims that “The confirmation bias is the mother of all misconceptions”.

In this second part he provides a few more examples.

9. Authority Bias

This bias is when we trust people with credentials more, even when there's no data backing their abilities.

He mentions a case where copilots wouldn’t speak up against pilots, even when they believed something was wrong, due to this bias. It reminds me of the Avianca Flight 052 Crash, mentioned in Malcolm Gladwell’s Outliers.

10. Contrast Effect

We have difficulty making absolute judgments so we often do relative ones.

Companies can exploit this by displaying a very expensive product that no one will buy but that makes other products seem cheaper in comparison, or by starting with an inflated price and then discounting it.

11. Availability Bias

We assign probabilities based on how easily we can think of examples.

A nice example: "Are there more English words that start with a k or more words with k as their third letter?" It's the latter, but it's more difficult to recall words by their third letter.

12. It Will Get Worse Before It Gets Better

This applies to a random process that only has two outcomes: improve or get worse.

Saying something will get worse before it gets better without specifying a time frame is a tautology.

13. Story Bias

Humans like narratives. This means that selecting facts and building a coherent simple story with them gives it credibility, even when it’s not true.

14. Hindsight Bias

When things seem obvious and easily explainable after they have occurred.

It makes us believe we're better predictors than we actually are, causing us to (…) take too much risk.

Even people who are aware of hindsight bias fall for it. One antidote is to keep a journal with your predictions and keep yourself honest.

15. Overconfidence Effect

We overestimate our knowledge and ability to predict.

16. Chauffeur Knowledge

People who pretend to know more than they do, having only superficial knowledge. They get away with it through eloquence or by memorizing things.

The term chauffeur comes from an anecdote about Max Planck and his chauffeur. Because his chauffeur attended Planck's talks, he knew the lecture by heart, and one day he decided to give the lecture himself as a prank.

17. Illusion of Control

Is our tendency to want to believe we can influence outcomes over which we have no control.

The author claims that the buttons to close doors in elevators and those for pedestrian crossing lights are fake.

18. Incentive Super-response Tendency

This is when incentives backfire. Usually the incentive is a proxy for achieving an outcome, but people game it and the result is not the expected one; sometimes it's the opposite.

A related concept is “Following the letter rather than the spirit of the law”.

19. Regression to the Mean

Suppose we have a normally distributed random process. When we observe a rare or extreme event, chances are that the next event will not be as extreme.

It’s just probability but people might act on the extreme event and think they had some control over it going back to normalcy (special case of Illusion of Control).
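
A minimal simulation sketch of this (my own, not from the book; the 2-standard-deviation threshold is an arbitrary choice):

```python
import random

# After an extreme draw from a normal distribution, the next independent
# draw is almost always closer to the mean.
random.seed(42)

extreme_threshold = 2.0  # "extreme" = more than 2 standard deviations out
less_extreme = trials = 0

for _ in range(100_000):
    x = random.gauss(0, 1)
    if abs(x) > extreme_threshold:   # we just observed a rare event
        y = random.gauss(0, 1)       # the next observation, independent of x
        trials += 1
        less_extreme += abs(y) < abs(x)

print(f"After an extreme event, the next one was milder "
      f"{100 * less_extreme / trials:.1f}% of the time")
```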

20. Outcome Bias

We tend to evaluate the quality of decisions by their outcome rather than by the decision process itself.

Assuming some degree of randomness and confounding factors of the outcome, we might end up overestimating a bad decision or underestimating a good one.

21. Paradox of Choice

One would think that having more choices is strictly better than having fewer. However, having too many options leads to inner paralysis, poorer decisions and regret.

22. Liking Bias

The more we like someone, the more we're inclined to help them (buying, donating, doing favors, etc.). We like someone if: (a) they're outwardly attractive, (b) they're similar to us in origin, personality or interests, or (c) they like us back.

23. Endowment effect

All else being equal, we value things we own more.

This also applies to near ownership: why a silver medal tastes worse than bronze, or why getting to the final stage of interviews and being rejected feels worse than being rejected at the beginning.

24. Coincidence

We're bad with probabilities. Rare events are rare on their own, but the chance that someone, somewhere, at some time experiences a rare event is high.

25. Groupthink

This is a consequence of Social Proof (Chapter 4), when people are less likely to disagree with consensus, which leads to a large number of people standing behind a bad decision.

26. Neglect of Probability

It's a generalization of Coincidence. TL;DR: we're bad at grokking probabilities.

A more specific way to put it: we respond to the expected magnitude, not to its likelihood. Example: suppose we can choose to win $10 million with a one in 100 million probability, or $10,000 with a one in 10,000 probability. Most people would choose the first, even though its expected value is 10x smaller than the second one's.
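
A quick check of the arithmetic in that example:

```python
# Expected value of the two lotteries from the example above.
p1, prize1 = 1 / 100_000_000, 10_000_000  # $10M at 1-in-100-million odds
p2, prize2 = 1 / 10_000, 10_000           # $10k at 1-in-10,000 odds

print(f"EV of option 1: ${p1 * prize1:.2f}")  # $0.10
print(f"EV of option 2: ${p2 * prize2:.2f}")  # $1.00
```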

27. Scarcity Error

Scarce things seem more valuable. Of course this is exploited by people, for example by artificially limiting supply.

The author doesn’t provide an explanation of why, but I’d speculate it has to do with natural selection.

28. Base Rate Neglect

This is easier to explain via an example. Suppose Mark can be either: (a) a professor of literature in Frankfurt, or (b) a truck driver.

Now suppose we know that (c) Mark is a thin man from Germany with glasses who likes to listen to Mozart. What is Mark’s more likely profession?

Our intuition jumps to (b), because the correlation of (b) and (c) is greater than that of (a) and (c). However, there are way more truck drivers in Germany than professors of literature in Frankfurt.
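
To make the base-rate argument concrete, here's a sketch with made-up numbers (the counts and likelihoods below are my own assumptions, purely for illustration):

```python
# Bayes' rule with assumed numbers: even if the description fits a
# professor much better, the sheer number of truck drivers dominates.
professors    = 10_000    # literature professors in Frankfurt (assumed)
truck_drivers = 500_000   # truck drivers in Germany (assumed)

p_desc_given_prof   = 0.40  # P(thin, glasses, Mozart | professor), assumed
p_desc_given_driver = 0.01  # P(same description | truck driver), assumed

w_prof   = professors * p_desc_given_prof       # 4,000 "matching" professors
w_driver = truck_drivers * p_desc_given_driver  # 5,000 "matching" drivers

print(f"P(professor | description) = {w_prof / (w_prof + w_driver):.2f}")  # 0.44
```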

29. Gambler’s Fallacy

This is another special case of Neglect of Probability. It's when people believe that results will balance out, ignoring the independence of events. For example, if you got 10 heads in a row, you might think the probability of getting another head is lower than that of getting a tail.
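
A minimal simulation of the coin-flip example (my own sketch, not from the book):

```python
import random

# After a streak of 10 heads, the next flip is still 50/50.
random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]

streaks = next_heads = 0
for i in range(10, len(flips)):
    if all(flips[i - 10:i]):  # the previous 10 flips were all heads
        streaks += 1
        next_heads += flips[i]

print(f"{streaks} streaks of 10 heads; the next flip was heads "
      f"{100 * next_heads / streaks:.1f}% of the time")
```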

Somewhat opposite to Regression to the Mean.

30. Anchor Effect

When we need to estimate something we don't know, we use prior information to make educated guesses. This can backfire: if we don't have such information, we'll use whatever data we have to make a guess, even if it's completely irrelevant.

An experiment asked people to estimate how much they would be willing to spend on wine. Before that, participants were asked to write down the last two digits of their social security number. The results showed a correlation between the two numbers!

This is in a way a special case of the Availability bias and of the Contrast Effect.

31. Induction

Generalizing from a small number of samples. This is also related to Confirmation Bias.

32. Loss Aversion

We fear loss more than we value gain. This can be used for manipulation, by framing things as a way to avoid losses instead of obtaining gains.

This can lead to the Sunken Cost Fallacy.

33. Social Loafing

People do less when in a group. The larger the group, the more individuals slack, though only up to a certain limit.

The author suggests that the degree of this effect varies across cultures: in Japan it happens less than in the West, so successful corporate practices should not be blindly copied between countries.

One consequence is the diffusion of responsibility, which might cause people to take more risks as a group than as individuals (for the better or for the worse).

34. Exponential Growth

We’re bad at grokking exponential growth.
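
One way to make this concrete (my own sketch, not from the book): at 7% yearly growth, intuition suggests doubling takes a long time, but it takes only about a decade.

```python
# Something growing 7% a year doubles in roughly a decade.
value, years = 1.0, 0
while value < 2:
    value *= 1.07
    years += 1
print(f"Doubled after {years} years")  # 11 (the "rule of 72" gives 72/7 ≈ 10.3)
```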

35. Winner’s Curse

This is specific to auctions: the winner of an auction usually overpays.

Probably affected by the Sunken Cost Fallacy, the Anchor effect and some degree of irrational competitiveness.

36. Fundamental Attribution Error

We attribute to individuals what is probably chance or a group effort.

37. False Causality

This is a generalization of the Swimmer’s Body Illusion. When people get the direction of causality wrong.

38. Halo Effect

When a single trait outweighs all others in how someone is judged by other people. This is somewhat a special case of Induction: we extrapolate an overall impression from a single characteristic.

The author mentions studies saying that good-looking people are seen as more pleasant, honest and intelligent. This also explains the first item in Liking Bias (we like someone more if they’re attractive).

39. Alternative Paths

We often don’t consider the other possible outcomes of an action.

This seems to be a special case of the Outcome Bias, in which we evaluate decisions by their outcome rather than by the decision process itself. A bad decision might lead to a good outcome by sheer luck.

An example I've seen is leaders claiming credit for a favorable outcome that had nothing to do with them; in an alternative world where they were replaced by a random individual, the outcome would have been the same.

40. Forecast Illusion

There's little downside for experts in making forecasts that turn out to be wrong. Beware when you encounter such predictions.

Somewhat related to Overconfidence Effect and to Authority Bias.

41. Conjunction Fallacy

We find events $A \cap B$ more likely than $A$ or $B$ in isolation if $A$ and $B$ have a strong correlation or are part of a narrative, even though mathematically this is impossible.
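
Why this is impossible, in one line of probability: for any events $A$ and $B$,

$$P(A \cap B) = P(A) \, P(B \mid A) \leq P(A)$$

since $P(B \mid A) \leq 1$, and symmetrically $P(A \cap B) \leq P(B)$.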

Related to Story Bias.

42. Framing

“It’s not what you say but how you say it”.

The author claims that people find '99 percent fat free' healthier than '1 percent fat'. A specific type of framing is glossing, which spins something negative in a more positive light, for example framing a problem as an opportunity. Everything you say contains some element of framing.

There are related concepts such as Loss Aversion (framing something as loss avoidance might be more convincing than framing it as a gain) and Neglect of Probability (framing something as a certainty vs. a probability, even if they have the same expected value).

43. Action Bias

We're biased toward action under uncertainty because "wait and see" looks bad.

An interesting example provided is about goalkeepers. It's claimed that a player can shoot the ball to the left, right or middle with equal probabilities. However, most of the time goalkeepers will jump to the left or to the right, because standing still in the middle waiting for the ball would look bad.

44. Omission Bias

If we have to choose between two negative outcomes, but one involves doing something and the other doing nothing, we choose to not act, even if it’s the worse choice.

This sounds like the other side of the Action bias.

45. Self-serving Bias

We attribute successes to ourselves and failures to external factors.

Related to Framing.

46. Hedonic Treadmill

We eventually bounce back to our baseline level of happiness, regardless of how positive (winning the lottery) or negative (contracting a disease) an event is.

However, some negatives cannot be overcome: commuting, chronic pain. And some positives might last: free time, doing what you like, professional status.

47. Self-selection Bias

By basic probability, you’re likely to be part of a majority. This acts as a selection bias on your own observations.

For example: men complaining about not having enough women at work and vice-versa.

48. Association Bias

Making connections where they don't exist.

Similar to Clustering Illusion (seeing patterns where they don’t exist) and Coincidence.

49. Beginner’s Luck

This is a special case of Association Bias, in which early coincidences / luck might suggest an association.

Similar to the Fundamental Attribution Error as well. People might attribute to talent what is just luck.

50. Cognitive Dissonance

When we justify our decisions, however bad they are.

The author describes a very surprising experiment, devised by Festinger and Carlsmith at Stanford. They asked two groups A and B to perform a very boring task and later asked them to praise the work to a student outside. Finally they asked the groups to rate how they really felt about the work.

Group A was paid $1 while group B was paid $20. Surprisingly, group A rated the work as more enjoyable and interesting than B did. The explanation is that $1 is not enough of an incentive to flat out lie, so students in group A convinced themselves they actually enjoyed the work.

51. Hyperbolic Discounting

Things that are closer to us in time feel more valuable. Basically driven by our desire for instant gratification.

52. “Because” Justification

When you justify your behavior, people are more tolerant towards it, even if the reason is completely bogus.

53. Decision Fatigue

Having to choose drains our energy. Related to the Paradox of Choice.

The book cites a study that tracked judges' parole decisions throughout the day. Early in the morning and right after lunch, decisions were more likely to grant parole. As the day went by, decisions defaulted to the status quo (no early release).

54. Contagion Bias

How we are incapable of ignoring the connection we feel with certain items (positive or negative), however irrational it is, as if the item were contaminated (hence the name).

The author mentions that most people would be reluctant to wear Hitler’s clothes as an example.

55. The Problem with Averages

Averages misrepresent exponential distributions. An interesting saying around this is “don’t cross a river if it’s on average four feet deep”.
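
A small sketch of why (my own; a right-skewed log-normal stands in for the kind of distribution the chapter warns about):

```python
import random
import statistics

# In a right-skewed distribution the mean sits far above the typical value.
random.seed(1)
depths = [random.lognormvariate(0, 1.5) for _ in range(100_000)]

print(f"mean:   {statistics.mean(depths):.2f}")    # ~3.1
print(f"median: {statistics.median(depths):.2f}")  # ~1.0
# "On average four feet deep" says nothing about the deepest point,
# nor even about the typical one.
```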

56. Motivation Crowding

Monetary rewards might erode other more noble motivations.

A good example comes from a day care center. Parents arriving late to pick up their children forced the staff to stay late as well. To fix the issue, the day care instituted a fee for lateness. Tardiness then increased, because parents felt less bad about being late once they paid for it.

Seems very similar to Incentive Super-response Tendency but specific to monetary incentives.

57. Twaddle Tendency

When people use lots of words and complicated but unsound logic, hoping to mask a lack of knowledge.

I like the quote:

“Simplicity is the zenith of a long, arduous journey, not the starting point”.

58. Will Rogers Phenomenon

Will Rogers was a comedian who supposedly joked: "Oklahomans who pack up and move to California raise both states' average IQ".

How is this possible? Suppose we have two sets of integers $A$ and $B$. $A$ has elements $\{10, 20, 30\}$ with average $20$, and $B$ has elements $\{1, 2, 3\}$ with average $2$. Move element $10$ to $B$ and now $A$'s average is $25$ and $B$'s average is $4$. Pretty easy to achieve but counterintuitive.
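
A two-line check of the arithmetic above:

```python
A, B = [10, 20, 30], [1, 2, 3]
avg = lambda xs: sum(xs) / len(xs)
print(avg(A), avg(B))  # 20.0 2.0

A.remove(10)           # element 10 "moves" from A...
B.append(10)           # ...to B
print(avg(A), avg(B))  # 25.0 4.0 -- both averages went up
```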

To explain the joke: Rogers is implying that Oklahomans who move to California have a lower IQ than the average of their origin state, but a higher IQ than the average of their destination state. This in turn implies that, on average, Oklahomans have higher IQs than Californians.

59. Information Bias

More data isn’t always better. Somewhat related to the Paradox of Choice and the Anchor effect.

Quote from Daniel J. Boorstin:

The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge.

60. Effort Justification

We value things more if they cost effort or suffering to achieve or obtain.

Somewhat related to the Sunken Cost Fallacy.

61. The Law of Small Numbers

Smaller samples have more variance.

One interesting experiment: suppose you split a set of normally distributed values into multiple sets of different sizes, then compute the average value of each set. Make a spreadsheet with one column being the average and the other being the set size. If you sort the data by the average, the sets at the bottom and at the top of the list will tend to be the smaller ones.
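
A sketch of that experiment (my own; the group sizes and parameters are arbitrary):

```python
import random
import statistics

# Average a normal variable over groups of varying sizes: the extreme
# averages cluster in the small groups, where variance is highest.
random.seed(7)
rows = []
for _ in range(1_000):
    size = random.randint(2, 200)
    group = [random.gauss(100, 15) for _ in range(size)]
    rows.append((statistics.mean(group), size))

rows.sort()  # sort by average, as in the spreadsheet
extremes = rows[:20] + rows[-20:]
middle = rows[490:510]
print("avg size at the extremes:", statistics.mean(s for _, s in extremes))
print("avg size in the middle:  ", statistics.mean(s for _, s in middle))
```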

62. Expectations

Expectations can lead to irrational behavior, such as a company's stock price dropping because results didn't meet investors' expectations, no matter how good those results are.

This is the same mechanism behind the placebo effect. An interesting experiment mentioned is one where teachers are told that a group of students showed high academic potential on a test. The teachers are not told that these students were actually selected at random. After some time, researchers measured the IQ of the class and this group had a higher-than-average score. One explanation is that the teachers spent more effort on this group because they expected them to be brighter.

63. Simple Logic

When the most immediate and intuitive answer to a question is wrong.

Experiments showed that people who perform well on this kind of puzzle tend to be better at controlling their impulses.

This seems related to Neglect of Probability, in the sense that probability is counterintuitive.

64. Forer Effect

Named after psychologist Bertram Forer. Statements that are general, flattering and positive are relatable to anyone via Confirmation Bias.

This is why some people believe in astrology.

65. Volunteer’s Folly

If you're well paid in your job, it's more efficient for you to earn money and donate it than to volunteer your time.

This is similar to Effective Altruism.

66. Affect Heuristic

Our assessment of risks and benefits, and hence decision making, might depend on our emotional state, which in turn could be influenced by some completely unrelated event.

Related to other items such as Confirmation Bias, Decision Fatigue and Anchor Effect.

67. Introspection Illusion

The illusion that introspecting leads to truth. In reality introspection leads us to stick even more to our beliefs.

This is perhaps a consequence of Confirmation Bias.

68. Inability to Close Doors

When we're unable to let things go, keeping doors open, which costs mental energy and forces us to deal with context switching.

He proposes a good idea: write down a list of options not to pursue, instead of the usual to-do list.

Somewhat related to the Sunken Cost Fallacy.

69. Neomania

The mania for the new.

The case against neomania is that most new things won’t last, so it’s a waste of time to always be on top of the latest gadget or read the news every day.

A rule of thumb is that things that have lasted for X years will last for another X (this heuristic was suggested in the book Algorithms to Live By).

70. Sleeper’s Effect

We forget the source of a message faster than its content.

This can be bad because messages from untrustworthy sources gain credibility over time. That's how propaganda and ads work (fun fact: in Portuguese our word for "ads" is literally "propaganda").

71. Alternative Blindness

We systematically forget to compare an existing outcome with alternative ones.

In other words, we often don't consider opportunity costs. Example: buying a house to stop paying rent, but not taking into account that, in an alternate universe, the down payment could be invested.

72. Social Comparison Bias

The tendency to withhold help from a potential future competitor.

Seems like a natural defensive behavior, but the author claims it’s bad for you long term.

Interesting quote by Guy Kawasaki in the context of companies: "A-players hire people even better than themselves, but B-players hire C-players so they can feel superior to them. C-players hire D-players". That is, the moment you start hiring B-players, the quality of each new generation of hires degrades over time.

73. Primacy and Recency Effects

The primacy effect is when we give a lot more importance to facts that appear first.

The quote “You never get a second chance to make a first impression” captures this idea.

On the other end is the recency effect: we give a lot more importance to facts that appear last. According to the author, the primacy effect is stronger than the recency effect, but it also fades faster, so for things that happened a while back the last impression is more memorable.

74. Not Invented Here Syndrome

Individuals and groups (e.g. companies) tend to value their own ideas more highly.

Somewhat related to Self-serving Bias.

75. Black Swan

This term was coined by Taleb. It denotes super rare events that are unexpected and unpredictable: the unknown unknowns.

76. Domain Dependence

Skills and knowledge don’t necessarily transfer between domains, even when they look very similar.

An example provided is Nobel prize winner Harry Markowitz, who developed a theory of portfolio selection but used a much simpler strategy for his own personal savings.

A popular saying in Portuguese is “Em casa de ferreiro, o espeto é de pau” (lit. “in the blacksmith’s house, the skewers are made of wood”), analogous to “The shoemaker’s son always goes barefoot” in English.

77. False-Consensus Effect

We overestimate how many people share our opinion.

78. Falsification of History

We rewrite our memories to be consistent with our current self.

The reason being that admitting mistakes is emotionally difficult. The author claims:

It is safe to assume that half of what you remember is wrong

even the so-called flashbulb memories, memories so vivid that they seem to come from photographs.

79. In-group Out-group Bias

In-group bias: groups form based on flimsy criteria (e.g. where you were born), but members of the group attach a lot of value to them (examples: sports teams, the military, etc.).

Out-group (homogeneity) bias: You perceive members outside of your group to be more similar amongst themselves than they actually are.

80. Ambiguity Aversion

Risk is when we cannot guarantee an outcome but we know the probabilities. Uncertainty, or ambiguity, is when we don't even know the probabilities.

The Ellsberg Paradox. Consider the following experiment: there are two boxes. Box A has 50 black and 50 red balls; Box B has 100 balls, black or red, but we don't know how many of each.

Participants are asked to choose a box from which to draw a ball. If you draw a red ball, you earn $100. Most people choose Box A. The experiment continues, but this time you earn $100 if you draw a black ball. Consistently, people choose Box A again.

This is not an optimal strategy though. If you chose Box A in the first part, it means you believe Box B has fewer red balls than black ones. But then you'd be better off choosing Box B in the second part.

The conclusion is that we prefer known probabilities over uncertain ones.

81. Default Effect

We stick with defaults.

Seems to be a consequence of Decision Fatigue (making a choice requires energy) and Omission Bias (perhaps to avoid regret we opt to not take an action).

82. Fear of Regret

The author claims that regret can happen either when we take an action (action bias) or when we do nothing (omission bias). The main factor is the fear of making a bad decision and being the exception.

I have a personal anecdote: I don't buy lottery tickets, but whenever friends and colleagues ran a lottery pool I'd feel strongly compelled to join, because of the extremely remote chance that they'd win and I'd regret not joining.

Perhaps this is another consequence of Social Proof, and related to Inability to Close Doors and Loss Aversion.

83. Salience Effect

The idea is that when we're trying to find explanations, we might latch on to some salient feature that is completely unrelated to the actual cause.

The first example given is of a car accident where marijuana is found on the backseat. Even if the drug had no effect on driving skills, it would likely be attributed as a cause, when it’s just coincidence, because it’s a factor that stands out (marijuana would not be found in most car accidents).

I'm not sure salience alone is enough of a factor; it might just arise from prejudice or extrapolation. In this example, alcohol impairs driving and so do many medicines, due to drowsiness, so it wouldn't be surprising if marijuana did as well.

Seems related to Clustering Illusion, Halo Effect, Conjunction Fallacy and Association Bias.

84. House-Money Effect

Depending on how we earned some amount of money, we might value it differently. Hard-earned money feels more valuable than money earned by luck for example.

85. Procrastination

The tendency to delay unpleasant tasks.

Some ways to avoid it are to make deadlines public and to take breaks to restore energy.

86. Envy

This chapter includes a quote from Balzac:

Envy is the most stupid of the vices, for there is no single advantage to be gained from it.

The author suggests we feel more envy towards those in similar situations to ours, quoting Aristotle:

Potters envy potters

This reminds me of Mimetic desire, described in Deceit, Desire and the Novel by René Girard [1].

87. Personification

We react more humanely when we deal with people in person, even if strangers.

This can be used in manipulative ways, for example donation campaigns showing images of starving children instead of actual data.

This is related to Story Bias.

88. Illusion of Attention

The confidence that we notice everything that is in front of us.

A famous example is the "invisible gorilla" experiment, in which people counting basketball passes fail to notice a gorilla walking through the scene.

89. Strategic Misrepresentation

Lying by saying you can do things when you don't think you can. This is used in situations where one-time lies won't have major consequences but there's a lot at stake, for example during an interview.

This reminds me of the saying: "Fake it until you make it".

90. Overthinking

If we overthink, we shut off our "lizard" brain, which is not always an advantage, for example with motor skills or things you have rehearsed many times.

91. Planning Fallacy

We overestimate how quickly we can do something.

According to the author, our wishful thinking makes us too optimistic. We also don’t account for unexpected events that distract us from our plans.

Interestingly, he claims that detailed, step-by-step planning amplifies the effect, because the narrow focus makes us lose sight of external factors even more.

Likely related to the Overconfidence Effect.

92. Déformation Professionnelle

French for "professional deformation", this is the tendency to view things through a specialized but narrow lens.

This is encoded by the saying "When all you have is a hammer, all your problems will look like nails".

93. Zeigarnik Effect

This effect is named after Bluma Zeigarnik, a Russian psychologist who noticed that we seldom forget unfinished tasks, but once we complete them, they're erased from memory.

Coupled with the Inability to Close Doors, this can suck up our mental energy. Fortunately, we don't need to actually complete a task for it to vanish from memory: we just need a clear idea of how we'll deal with it.

94. Illusion of Skill

Luck matters more than skill in many high-up corporate positions.

We tend to attribute people's success to talent, but it's just Survivorship Bias at play: people who failed disappear from the scene.

95. Feature-positive Effect

Absence is much harder to detect than presence.

Seems to be the other side of Availability Bias.

96. Cherry Picking

Also known as selection bias: choosing some facts and omitting others to make a point.

This can be tricky to detect due to Feature-positive Effect, if we don’t realize what’s missing.

97. Fallacy of Single Cause

We tend to want simplistic answers even when the reality is a lot more complex and nuanced.

This is why scapegoats exist: attribution of failure to a single individual.

This is a generalization of Fundamental Attribution Error and related to Story Bias.

98. Intention-to-Treat Errors

In a randomized clinical trial, one group of subjects receives an experimental medicine (A) while the other receives a placebo (B). Imagine a trial where patients must take a pill weekly and are monitored for a few years to measure the survival rate of each group.

However, some patients don't follow the routine properly and skip taking the pill once in a while. To avoid adding confounding factors to the experiment, they're considered a separate cohort (C).

Now suppose the pill has no effect. When we look at the survival rates of (A) and (B), they should show the same results. However, we might find that the survival rate of (C) is much lower. A bad scientist might claim that the treatment works because people who took the pill more regularly had better outcomes.

It's likely that people in (C) didn't take the pill regularly because they were too sick to do so, and it's this factor that contributed to their low survival rate, not the pill.

The key is that we have to stick with the original control and treatment groups when computing statistics. I guess the name "intention to treat" refers to this: you should consider the people you were planning to treat, not the ones you ended up treating.
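
A toy simulation of this point (my own sketch, not from the book): the pill does nothing, but sicker patients are less likely to keep taking it, so a "compliers only" analysis makes the useless pill look effective.

```python
import random

random.seed(3)
N = 100_000
survived_all = survived_compliers = compliers = 0

for _ in range(N):
    sickness = random.random()              # 0 = healthy, 1 = very sick
    complied = random.random() > sickness   # sicker -> less likely to comply
    survived = random.random() > sickness   # survival ignores the pill entirely
    survived_all += survived
    if complied:
        compliers += 1
        survived_compliers += survived

print(f"intention-to-treat survival: {survived_all / N:.2f}")               # ~0.50
print(f"compliers-only survival:     {survived_compliers / compliers:.2f}") # ~0.67
```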

99. News Illusions

The author claims that:

News is to the mind what sugar is to the body.

And that:

Nothing beats books for understanding the world

This seems to be a special case of Neomania.

Summary

There are many different items on this book so it’s hard to summarize, but I’ll attempt to put them in broad categories:

Incomplete Data. Extrapolation from incomplete data. Chapters: 1, 3, 7, 8, 10, 11, 14, 15, 30, 31, 37?, 38, 39, 48, 66, 71, 73, 75, 80, 81, 83, 95 and 96.

Social Natural Selection. Some built-in behavior might have helped us survive as a society early on, but today it can be problematic: 4, 6, 9, 13, 22, 25, 33, 43, 52, 54, 74, 77, 79 and 87.

Individual Natural Selection. Some built-in behavior might have helped us survive as individuals, including fears and egotistical behavior: 18, 21, 23, 27, 32, 44, 68, 69, 72, 80, 81, 82, 86, 89 and 99.

Self Blindness. Not survival related, but the fact that we have trouble putting ourselves outside our heads. Examples: 15, 17, 23, 45, 50, 51, 60, 67, 77, 78, 84, 88, 91 and 94.

Math. Our inability to grok abstract mathematical concepts such as probability, statistics and logic. Chapters: 12, 17, 19, 20, 24, 26, 28, 29, 34, 35, 36, 37, 40, 41, 49, 55, 58, 61, 63, 75 and 98.

My review

Overall I liked this book quite a bit. It's very well organized and provides a lot of examples to make the traps easier to grasp. I learned a bunch of new psychological traps, even though I might not be able to avoid them.

It seems like the author padded the list of traps so they'd add up to 99 items. Some chapters feel like special cases of others, or just variants.

Rating: 5/5.

Thinking in Systems: A Primer by Donella H. Meadows has a chapter called Systems Traps, which is related to these psychological traps. In particular, Rule Beating appears to be the same as the Incentive Super-response Tendency (following the letter rather than the spirit of the law).

References

[1] Deceit, Desire and the Novel, René Girard.