Book Review on “Thinking, Fast and Slow”
Thinking, Fast and Slow, a book about the dual system in our brain that informs human thinking, received a great review from the WSJ. Check out this article to learn about Daniel Kahneman and Amos Tversky’s fascinating work.
Spotted by Daniel Lubetzky, by Adeena Schlussel
By CHRISTOPHER F. CHABRIS
There’s a scene in a "Seinfeld" episode in which Kramer, after reacting emotionally to a movie about a woman in a coma, sits down with a lawyer to prepare a living will. The lawyer goes through a list of situations in which a person might want his life support terminated. "You have a liver, kidneys and a gall bladder but no central nervous system," the lawyer intones. Kramer votes to pull the plug, explaining: "I gotta have a central nervous system!" The lawyer goes on: "One lung, blind, and you’re eating through a tube." Kramer declines this life, too: "That’s not my style." "All right," the lawyer says, offering one more scenario. "You can eat, but machines do everything else." Kramer decides that, in this case, life would still be worth living, "because I could still go to the coffee shop."
Behind the silliness lies a serious question: Can our healthy selves predict how we will feel in unhealthy circumstances with enough certainty to choose whether we would want to live or die? Can our present selves, in general, make reliable choices for our future selves? How good are our decisions anyway, and how do we make them?
Daniel Kahneman, a cognitive psychologist who won the Nobel Prize in economics in 2002, faced a problem somewhat like Kramer’s when he and his wife were debating whether to move from Berkeley, Calif., to Princeton, N.J. His wife claimed that people were less happy on the East Coast than in California; Mr. Kahneman thought this unlikely. But rather than just argue the point, he conducted a study. Sure enough, while most people in California—and elsewhere—believed that Californians were happier, Californians themselves reported being no more satisfied with their lives than people in Ohio and Michigan.
Why do people think Californians are happier than Ohioans? Because they focus on the most salient difference between the two places: climate. The "focusing illusion," according to Mr. Kahneman, happens when we call up a specific attribute of a thing or experience (e.g., climate) and use it to answer a broader and more difficult question (what makes life enjoyable, in California or anywhere else?).
Mr. Kahneman describes the California study and much else in "Thinking, Fast and Slow," a tour de force of psychological insight, research explication and compelling narrative that brings together in one volume the high points of Mr. Kahneman’s notable contributions, over five decades, to the study of human judgment, decision-making and choice.
Many of these contributions were the result of work that Mr. Kahneman did with Amos Tversky, a fellow psychologist who would no doubt have shared Mr. Kahneman’s Nobel had he not died—prematurely, at the age of 59—in 1996. The two men began to collaborate in 1969, after meeting at a seminar. Mr. Kahneman had been researching perception and attention while Mr. Tversky was applying mathematical models to decision-making. The melding of their approaches and personalities led to a creative partnership rarely paralleled in the history of science. Mr. Kahneman is careful to give full credit to their collaboration even as he writes—with estimable clarity and wit—in his own voice.
The first article they wrote together, titled "Belief in the Law of Small Numbers," showed that even trained research psychologists had poor judgment about statistical inferences: The sample sizes of their experiments were often too small to support their conclusions. This problem, like so many others, had broad implications. Crucial policy decisions are often based on statistical inferences, but as Mr. Kahneman notes, we "pay more attention to the content of messages than to information about their reliability." The effect is "a view of the world around us that is simpler and more coherent than the data justify."
One major effect of the work of Messrs. Kahneman and Tversky has been to overturn the assumption that human beings are rational decision-makers who weigh all the relevant factors logically before making choices. When the two men began their research, it was understood that, as a finite device with finite time, the brain had trouble calculating the costs and benefits of every possible course of action and that, separately, it was not very good at applying rules of logical inference to abstract situations. What Messrs. Kahneman and Tversky showed went far beyond this, however. They argued that, even when we have all the information that we need to arrive at a correct decision, and even when the logic is simple, we often get it drastically wrong.
Consider the "Linda problem," which became Messrs. Kahneman and Tversky’s most famous experiment. Subjects were asked to read a description of Linda, a single, 31-year-old woman with a philosophy degree who had espoused left-wing causes like nuclear disarmament in college. The subjects were asked which of two possibilities was more probable: that Linda was a bank teller or that Linda was a bank teller and an active feminist. Logically, the conjunction of two things must be less probable than one of those two things alone. In this case, there have to be more bank tellers than feminist bank tellers, and there must be at least a chance that Linda is one of the nonfeminist bank tellers, so "bank teller" is the right answer. But 85% of college students chose the wrong answer. Similarly, people asked to estimate the probability of a stock-market crash next year will likely give a greater probability to a stock-market crash that is precipitated by a European debt crisis than to a stock-market crash by itself—even though the latter is a single event and the former a conjunction of two.
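The arithmetic behind the conjunction rule is easy to verify; here is a minimal Python sketch (the probabilities are invented for illustration and appear nowhere in the experiment itself):

```python
# Conjunction rule: P(A and B) can never exceed P(A) alone.
p_teller = 0.05                 # P(Linda is a bank teller) -- assumed value
p_feminist_given_teller = 0.9   # generous: most such tellers are feminists

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_feminist_teller = p_teller * p_feminist_given_teller

# Holds for ANY probabilities in [0, 1], since the second factor is <= 1.
assert p_feminist_teller <= p_teller
print(f"P(teller) = {p_teller}, P(teller and feminist) = {p_feminist_teller}")
```

Whatever numbers one plugs in, the conjunction comes out no larger than the single event, which is why "bank teller" must be the more probable answer.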
This "conjunction fallacy" (like the focusing illusion) illustrates a broader pattern—of human reasoning being distorted by systematic biases. To understand one source of such errors, Mr. Kahneman divides the mind into two broad components. "System 1" makes rapid, intuitive decisions based on associative memory, vivid images and emotional reactions. "System 2" monitors the output of System 1 and overrides it when the result conflicts with logic, probability or some other decision-making rule. Alas, the second system is a bit lazy—we must make a special effort to pay attention, and such effort consumes time and energy.
You can get an idea of the two-system distinction by trying to solve this simple problem, from the work of the psychologist Shane Frederick: "If it takes 5 machines 5 minutes to make 5 widgets, how many minutes does it take 100 machines to make 100 widgets?" The answer "100 minutes" leaps to mind (System 1 at work), but it is wrong. But a bit of reflective thought (by System 2) leads to "five minutes," the right answer.
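The rate arithmetic System 2 must do can be sketched in a few lines of Python (the helper function and its name are illustrative, not Frederick's):

```python
import math

# 5 machines x 5 minutes = 5 widgets, so each machine makes
# one widget every 5 minutes, regardless of how many machines run.
MINUTES_PER_WIDGET = 5

def minutes_needed(machines, widgets):
    # Widgets are produced in parallel batches of `machines` at a time.
    batches = math.ceil(widgets / machines)
    return batches * MINUTES_PER_WIDGET

assert minutes_needed(5, 5) == 5
assert minutes_needed(100, 100) == 5   # not the intuitive 100
```

One hundred machines turn out one hundred widgets in a single five-minute batch; the intuitive "100 minutes" mistakes total output for elapsed time.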
The divided mind is evident in other situations where we are not as "rational" as we might assume. Most people require a larger expected outcome to take a risk when a sure thing is available as an alternative (risk aversion), and they dislike losses much more than they like gains of equivalent size (loss aversion). These now-commonplace concepts are central to prospect theory, perhaps the most influential legacy of Messrs. Kahneman and Tversky.
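Loss aversion is usually illustrated with the prospect-theory value function; the sketch below uses the parameter estimates Tversky and Kahneman later published (alpha of about 0.88, loss-aversion coefficient of about 2.25), though any curve that is concave for gains and steeper for losses makes the same point:

```python
# Prospect-theory value function (illustrative parameterization).
ALPHA = 0.88    # diminishing sensitivity to both gains and losses
LAMBDA = 2.25   # losses loom roughly 2.25x larger than equivalent gains

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# A $100 loss hurts more than a $100 gain pleases.
assert abs(value(-100)) > value(100)
print(value(100), value(-100))
```

The asymmetry captured by `LAMBDA` is loss aversion; the curvature captured by `ALPHA` is what makes a sure thing more attractive than a gamble of equal expected value.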
Mr. Kahneman notes that we harbor two selves when it comes to happiness, too: one self that experiences pain and pleasure from moment to moment and another that remembers the emotions associated with complete events and episodes. The remembering self does not seem to care how long an experience was if it was getting better toward the end—so a longer colonoscopy that ended with decreasing pain will be seen later as preferable to a shorter procedure that involved less total pain but happened to end at a very painful point. Complications like this should make us wary of letting simplistic measures of happiness determine national policy and social goals.
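The peak-end pattern can be mimicked with a toy calculation; the pain scores below are invented, and averaging the worst moment with the last moment is only a rough model of the remembering self:

```python
# Peak-end rule sketch: remembered pain ~ average of the worst moment
# and the final moment, ignoring how long the experience lasted.
def remembered_pain(moments):
    return (max(moments) + moments[-1]) / 2

short_procedure = [2, 5, 8]           # less total pain, but ends at the peak
long_procedure = [2, 5, 8, 6, 4, 1]   # more total pain, but tapers off

# The experiencing self suffers more in the long procedure...
assert sum(long_procedure) > sum(short_procedure)
# ...yet the remembering self prefers it, because it ended gently.
assert remembered_pain(long_procedure) < remembered_pain(short_procedure)
```

The divergence between the two sums and the two remembered scores is exactly the duration neglect Mr. Kahneman describes.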
Mr. Kahneman stresses that he is just as susceptible as the rest of us to the cognitive illusions he has discovered. He tries to recognize situations when mistakes are especially likely to occur—such as when he is starting a big project or making a forecast—and then act to rethink his System 1 inclinations. The tendency to underestimate the costs of future projects, he notes, can be countered by taking an "outside view": looking at your own project as an outsider would. To avoid overconfidence, Mr. Kahneman recommends an exercise called the "premortem," developed by the psychologist Gary Klein: Before finalizing a decision, imagine that, a year after it has been made, it has turned out horribly, then write a history of how it went wrong and why.
Many books about the mind—a crowded genre these days—combine vivid stories with accounts of seminal experiments and then proceed to argue for changes in law, policy or business practices. "Thinking, Fast and Slow" is different. It is almost defiantly focused on the science, with a leavening of memoir and personal observation. Mr. Kahneman’s stated goals are minimalist: to "enrich the vocabulary that people use" when they talk about decisions, so that his readers benefit from his work at the "proverbial watercooler, where opinions are shared and gossip is exchanged."
Such modesty is rare and inspiring. All scientists, not least social scientists, should be wary of adhering to any belief system in their professional lives other than the one that requires fidelity to their data. As soon as you have a cause, you have a conflict of interest. Mr. Kahneman has kept his attention focused on doing research and, now, on explaining it as well. Thanks to the elegance and force of his ideas, and the robustness of the evidence he offers for them, he has helped us to a new understanding of our divided minds—and our whole selves.
related posts
-
Decreasing Memory or Ossification of Critical Thinking?
An interesting set of studies suggests that as we grow older, we forget things because our brains lose the ability to remember prior incidents as well and come to treat similar but non-identical experiences as identical (as seen here). But another possibility was ignored – that part of the problem is connected to [...]
-
Informed or Wishful Thinking?
A very interesting article from Reuel Marc Gerecht on Iraq, Afghanistan, Al Qaeda and what I call "pseudo-Islamic terrorists" (militant extremist assassins who usurp and tarnish a noble religion to advance their absolutist aims) appears in the Washington Post. He is either extraordinarily well informed, and providing some real hope, or just engaging in wishful [...]
-
Critical Thinking on Presidential Candidates
I have been arguing to all who’d listen that at the end of the day, the 3 remaining candidates are all formidable possible leaders and we are lucky to end with them. They all have some weaknesses, but overall their strengths greatly outweigh their weaknesses. There is nothing like a well-thought-out set of op-eds to [...]
-
Short-Term Thinking Leads to Long Term Costs
Sometimes in business you are faced with the decision to invest more capital resources up front but ensure that over the long term you see savings, vs. saving up front but paying a steadily higher cost of production per widget on an ongoing basis. The problem with choosing the path that is "inexpensive" up front [...]
-
An interesting book about resilient communities
Meg Wheatley and Deborah Frieze just came out with a book that is sure to be inspiring. Titled “Walk Out Walk On: A Learning Journey into Communities Daring to Live the Future Now,” the book conveys stories of the people that the women met from their work at The Berkana Institute. It hopes to mobilize [...]