Thinking, Fast and Slow

Who should read this book: Everyone.

I have seen Daniel Kahneman’s book, built on his years of research with Amos Tversky, referenced more in the past few years than any other book. It has been revolutionary in explaining why humans act the way they do, and has had repercussions for practically every industry in the world. Kahneman rightfully received a Nobel Prize for his research.

It took me a while to get through it, and I’m still digesting it. The funny thing is that even with the depth of information and years of research behind the book, Kahneman admits that simply “knowing” about human nature and psychological fallacies does little to prevent you from committing them. We’re wired a certain way, and it’s very hard to change that. The best we can do, then, is to set up systems that lower the chances that we make stupid mistakes.

For teachers of psychology, the implications of this study are disheartening. When we teach our students about the behavior of people in the helping experiment, we expect them to learn something they had not known before; we wish to change how they think about people’s behavior in a particular situation. This goal was not accomplished in the Nisbett-Borgida study, and there is no reason to believe that the results would have been different if they had chosen another surprising psychological experiment. Indeed, Nisbett and Borgida reported similar findings in teaching another study, in which mild social pressure caused people to accept much more painful electric shocks than most of us (and them) would have expected. Students who do not develop a new appreciation for the power of social setting have learned nothing of value from the experiment. The predictions they make about random strangers, or about their own behavior, indicate that they have not changed their view of how they would have behaved. In the words of Nisbett and Borgida, students “quietly exempt themselves” (and their friends and acquaintances) from the conclusions of experiments that surprise them.

Say you’re at the zoo and you see a jaguar within a glass enclosure. The jaguar takes a swipe at you. The primitive part of your brain reacts with fear and flight. It’s only later that the more evolved part of your brain introduces logic: “there’s a sheet of glass between myself and the predator. I’m safe. A fear reaction is not indicated.” From the brain’s perspective, natural selection favors these sorts of “false positives.” From an evolutionary perspective, therefore, we are primed to react on the basis of emotion, and it takes a real, concerted effort to develop mental and emotional fortitude.

Kahneman describes two basic systems, System 1 and System 2:

  • System 1 is “the brain’s fast, automatic, intuitive approach.”
  • System 2 is “the mind’s slower, analytical mode, where reason dominates.”

Kahneman says “System 1 is…more influential…guiding…[and]…steering System 2 to a very large extent.”

Here are key ideas from the book:

The power of repetition

A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact.

The power of priming

This remarkable priming phenomenon—the influencing of an action by the idea—is known as the ideomotor effect. Although you surely were not aware of it, reading this paragraph primed you as well. If you had needed to stand up to get a glass of water, you would have been slightly slower than usual to rise from your chair—unless you happen to dislike the elderly, in which case research suggests that you might have been slightly faster than usual!

Why we jump to conclusions

The exaggerated faith in small samples is only one example of a more general illusion—we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify. Jumping to conclusions is a safer sport in the world of our imagination than it is in reality. Statistics produce many observations that appear to beg for causal explanations but do not lend themselves to such explanations. Many facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.
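
To see how the “exaggerated faith in small samples” works, here is a quick simulation (my own sketch, not from the book): flip a fair coin in samples of different sizes and count how often a sample comes out lopsided purely by chance.

```python
import random

# Sketch: how often does a fair coin look "lopsided" (>= 70% heads)
# purely by chance, at different sample sizes?
random.seed(42)

def lopsided_rate(sample_size, trials=20_000, threshold=0.7):
    """Fraction of samples whose share of heads reaches the threshold."""
    hits = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size >= threshold:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    print(f"n = {n:4d}: P(>= 70% heads) ~ {lopsided_rate(n):.4f}")
```

With ten flips, roughly one sample in six is that lopsided; with a thousand, it essentially never happens. The extreme observations that seem to beg for causal explanations are exactly what chance produces in small samples.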

Anchoring effects

Similar or even larger anchoring effects have been obtained in numerous studies of estimates and of willingness to pay. For example, French residents of the heavily polluted Marseilles region were asked what increase in living costs they would accept if they could live in a less polluted region. The anchoring effect was over 50% in that study. Anchoring effects are easily observed in online trading, where the same item is often offered at different “buy now” prices. The “estimate” in fine-art auctions is also an anchor that influences the first bid.
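
The “over 50%” here is Kahneman’s anchoring index: the difference between the two groups’ mean estimates divided by the difference between the anchors they were shown. An index of 100% means estimates moved one-for-one with the anchor; 0% means the anchor was ignored. A minimal sketch, using invented numbers rather than the Marseilles data:

```python
# Anchoring index: (difference in mean estimates) / (difference in anchors).
# The group means below are made up for illustration, not the study's data.
def anchoring_index(mean_high, mean_low, anchor_high, anchor_low):
    return (mean_high - mean_low) / (anchor_high - anchor_low)

# Hypothetical: one group anchored at a 5% cost-of-living increase, the
# other at 15%; their mean acceptable increases come back at 7% and 12%.
idx = anchoring_index(mean_high=12.0, mean_low=7.0,
                      anchor_high=15.0, anchor_low=5.0)
print(f"anchoring index: {idx:.0%}")  # 50% -- estimates moved halfway
```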

In general, a strategy of deliberately “thinking the opposite” may be a good defense against anchoring effects, because it negates the biased recruitment of thoughts that produces these effects.

Saliency of an event

A salient event that attracts your attention will be easily retrieved from memory. Divorces among Hollywood celebrities and sex scandals among politicians attract much attention, and instances will come easily to mind.

You are therefore likely to exaggerate the frequency of both Hollywood divorces and political sex scandals. A dramatic event temporarily increases the availability of its category. A plane crash that attracts media coverage will temporarily alter your feelings about the safety of flying. Accidents are on your mind, for a while, after you see a car burning at the side of the road, and the world is for a while a more dangerous place. Personal experiences, pictures, and vivid examples are more available than incidents that happened to others, or mere words, or statistics. A judicial error that affects you will undermine your faith in the justice system more than a similar incident you read about in a newspaper.

What is risk?

“Risk” does not exist “out there,” independent of our minds and culture, waiting to be measured. Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as “real risk” or “objective risk.”

Regression to the mean

The golfer who did well on day 1 is likely to be successful on day 2 as well, but less than on the first, because the unusual luck he probably enjoyed on day 1 is unlikely to hold. The golfer who did poorly on day 1 will probably be below average on day 2, but will improve, because his probable streak of bad luck is not likely to continue.
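
The golfer story follows from a two-line model: each day’s score is a fixed skill plus fresh, independent luck. A small simulation (my own sketch, assuming normally distributed skill and luck) makes the regression visible:

```python
import random
from statistics import mean

# Each golfer's daily performance = fixed skill + fresh random luck.
random.seed(0)
N = 10_000
skill = [random.gauss(0, 1) for _ in range(N)]
day1 = [s + random.gauss(0, 1) for s in skill]
day2 = [s + random.gauss(0, 1) for s in skill]

# Follow the top 10% of day-1 performers across both days.
cutoff = sorted(day1, reverse=True)[N // 10]
stars = [i for i in range(N) if day1[i] > cutoff]

print(f"day-1 stars on day 1: {mean(day1[i] for i in stars):+.2f}")
print(f"day-1 stars on day 2: {mean(day2[i] for i in stars):+.2f}")
```

The day-1 stars still beat the overall average of zero on day 2 (they really are more skilled), but by noticeably less than on day 1: the luck component doesn’t repeat.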

Regression effects are ubiquitous, and so are misguided causal stories to explain them. A well-known example is the “Sports Illustrated jinx,” the claim that an athlete whose picture appears on the cover of the magazine is doomed to perform poorly the following season. Overconfidence and the pressure of meeting high expectations are often offered as explanations. But there is a simpler account of the jinx: an athlete who gets to be on the cover of Sports Illustrated must have performed exceptionally well in the preceding season, probably with the assistance of a nudge from luck—and luck is fickle.

 
