Thinking, Fast and Slow

Metadata
- Title: Thinking, Fast and Slow
- Author: Daniel Kahneman
- Book URL: https://amazon.com/dp/B00555X8OA?tag=malvaonlin-20
- Open in Kindle: kindle://book/?action=open&asin=B00555X8OA
- Last Updated on: Thursday, July 30, 2020
Highlights & Notes
People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media.
“The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”
This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.
System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
The highly diverse operations of System 2 have one feature in common: they require attention and are disrupted when attention is drawn away.
The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.
The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people’s mistakes than our own.
As you become skilled in a task, its demand for energy diminishes. Studies of the brain have shown that the pattern of activity associated with an action changes as skill increases, with fewer brain regions involved. Talent has similar effects. Highly intelligent individuals need less effort to solve the same problems, as indicated by both pupil size and brain activity.
The most effortful forms of slow thinking are those that require you to think fast.
People who experience flow describe it as “a state of effortless concentration so deep that they lose their sense of time, of themselves, of their problems,” and their descriptions of the joy of that state are so compelling that Csikszentmihalyi has called it an “optimal experience.”
self-control requires attention and effort. Another way of saying this is that controlling thoughts and behaviors is one of the tasks that System 2 performs.
Baumeister’s group has repeatedly found that an effort of will or self-control is tiring; if you have had to force yourself to do something, you are less willing or less able to exert self-control when the next challenge comes around. The phenomenon has been named ego depletion.
activities that impose high demands on System 2 require self-control, and the exertion of self-control is depleting and unpleasant. Unlike cognitive load, ego depletion is at least in part a loss of motivation. After exerting self-control in one task, you do not feel like making an effort in another, although you could do it if you really had to.
Ego depletion is not the same mental state as cognitive busyness.
A bat and ball cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost?
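A quick check of the arithmetic, since the intuitive answer (10 cents) is the famous trap; the book gives the correct answer as 5 cents. A minimal sketch:

```python
# Check: if the ball costs 5 cents, both constraints hold.
ball = 0.05
bat = ball + 1.00                        # "the bat costs one dollar more than the ball"
assert abs((bat + ball) - 1.10) < 1e-9   # total is $1.10

# The intuitive answer (10 cents) fails: 0.10 + 1.10 = 1.20, not 1.10.
```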
Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed.
Speaking of Control “She did not have to struggle to stay on task for hours. She was in a state of flow.” “His ego was depleted after a long day of meetings. So he just turned to standard operating procedures instead of thinking through the problem.” “He didn’t bother to check whether what he said made sense. Does he usually have a lazy System 2 or was he unusually tired?” “Unfortunately, she tends to say the first thing that comes into her mind. She probably also has trouble delaying gratification. Weak System 2.”
you think with your body, not only with your brain.
This remarkable priming phenomenon—the influencing of an action by the idea—is known as the ideomotor effect.
You can see why the common admonition to “act calm and kind regardless of how you feel” is very good advice: you are likely to be rewarded by actually feeling calm and kind.
Money-primed people become more independent than they would be without the associative trigger. They persevered almost twice as long in trying to solve a very difficult problem before they asked the experimenter for help, a crisp demonstration of increased self-reliance. Money-primed people are also more selfish: they were much less willing to spend time helping another student who pretended to be confused about an experimental task.
The general theme of these findings is that the idea of money primes individualism: a reluctance to be involved with others, to depend on others, or to accept demands from others.
Speaking of Priming “The sight of all these people in uniforms does not prime creativity.” “The world makes much less sense than you think. The coherence comes mostly from the way your mind works.” “They were primed to find flaws, and this is exactly what they found.” “His System 1 constructed a story, and his System 2 believed it. It happens to all of us.” “I made myself smile and I’m actually feeling better!”
The impression of familiarity is produced by System 1, and System 2 relies on that impression for a true/false judgment.
A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.
More advice: if your message is to be printed, use high-quality paper to maximize the contrast between characters and their background. If you use color, you are more likely to be believed if your text is printed in bright blue or red than in middling shades of green, yellow, or pale blue. If you care about being thought credible and intelligent, do not use complex language where simpler language will do.
In addition to making your message simple, try to make it memorable. Put your ideas in verse if you can; they will be more likely to be taken as truth.
Finally, if you quote a source, choose one with a name that is easy to pronounce.
How do you know that a statement is true? If it is strongly linked by logic or association to other beliefs or preferences you hold, or comes from a source you trust and like, you will feel a sense of cognitive ease. The trouble is that there may be other causes for your feeling of ease—including the quality of the font and the appealing rhythm of the prose—and you have no simple way of tracing your feelings to their source.
The bat-and-ball problem was mentioned earlier as a test of people’s tendency to answer questions with the first idea that comes to their mind, without checking it.
If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets? 100 minutes OR 5 minutes
In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake? 24 days OR 47 days
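For reference, the correct answers the book reports are 5 minutes and 47 days; a quick check of each:

```python
# Widgets: 5 machines -> 5 widgets in 5 minutes, so each machine makes
# one widget per 5 minutes; 100 machines make 100 widgets in those same 5 minutes.
rate = 5 / (5 * 5)            # widgets per machine per minute
print(100 / (100 * rate))     # 5.0 minutes

# Lily pads: the patch doubles daily, so it covered half the lake
# one day before full coverage (day 48).
print(48 - 1)                 # 47 days
```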
The results tell a clear story: 90% of the students who saw the CRT in normal font made at least one mistake in the test, but the proportion dropped to 35% when the font was barely legible. You read this correctly: performance was better with the bad font. Cognitive strain, whatever its source, mobilizes System 2, which is more likely to reject the intuitive answer suggested by System 1.
As expected, easily pronounced words evoke a favorable attitude. Companies with pronounceable names do better than others for the first week after the stock is issued, though the effect disappears over time. Stocks with pronounceable trading symbols (like KAR or LUNMOO) outperform those with tongue-twisting tickers like PXG or RDO—and they appear to retain a small advantage over time.
The link between positive emotion and cognitive ease in System 1 has a long evolutionary history.
Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition.
A happy mood loosens the control of System 2 over performance: when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors.
Speaking of Cognitive Ease “Let’s not dismiss their business plan just because the font makes it hard to read.” “We must be inclined to believe it because it has been repeated so often, but let’s think it through again.” “Familiarity breeds liking. This is a mere exposure effect.” “I’m in a very good mood today, and my System 2 is weaker than usual. I should be extra careful.”
Speaking of Norms and Causes “When the second applicant also turned out to be an old friend of mine, I wasn’t quite as surprised. Very little repetition is needed for a new experience to feel normal!” “When we survey the reaction to these products, let’s make sure we don’t focus exclusively on the average. We should consider the entire range of normal reactions.” “She can’t accept that she was just unlucky; she needs a causal story. She will end up thinking that someone intentionally sabotaged her work.”
When uncertain, System 1 bets on an answer, and the bets are guided by experience. The rules of the betting are intelligent: recent events and the current context have the most weight in determining an interpretation. When no recent event comes to mind, more distant memories govern.
Uncertainty and doubt are the domain of System 2.
The moral is significant: when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.
A simple rule can help: before an issue is discussed, all members of the committee should be asked to write a very brief summary of their position. This procedure makes good use of the value of the diversity of knowledge and opinion in the group. The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them.
System 1 excels at constructing the best possible story that incorporates ideas currently activated, but it does not (cannot) allow for information it does not have.
The measure of success for System 1 is the coherence of the story it manages to create. The amount and quality of the data on which the story is based are largely irrelevant. When information is scarce, which is a common occurrence, System 1 operates as a machine for jumping to conclusions.
It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.
Overconfidence: As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is. Furthermore, our associative system tends to settle on a coherent pattern of activation and suppresses doubt and ambiguity.
Framing effects: Different ways of presenting the same information often evoke different emotions. The statement that “the odds of survival one month after surgery are 90%” is more reassuring than the equivalent statement that “mortality within one month of surgery is 10%.” Similarly, cold cuts described as “90% fat-free” are more attractive than when they are described as “10% fat.” The equivalence of the alternative formulations is transparent, but an individual normally sees only one formulation, and what she sees is all there is.
Base-rate neglect: Recall Steve, the meek and tidy soul who is often believed to be a librarian. The personality description is salient and vivid, and although you surely know that there are more male farmers than male librarians, that statistical fact almost certainly did not come to your mind when you first considered the question. What you saw was all there was.
Speaking of Jumping to Conclusions “She knows nothing about this person’s management skills. All she is going by is the halo effect from a good presentation.” “Let’s decorrelate errors by obtaining separate judgments on the issue before any discussion. We will get more information from independent assessments.” “They made that big decision on the basis of a good report from one consultant. WYSIATI—what you see is all there is. They did not seem to realize how little information they had.” “They didn’t want more information that might spoil their story. WYSIATI.”
System 2 receives questions or generates them: in either case it directs attention and searches memory to find the answers. System 1 operates differently. It continuously monitors what is going on outside and inside the mind, and continuously generates assessments of various aspects of the situation without specific intention and with little or no effort.
Good mood and cognitive ease are the human equivalents of assessments of safety and familiarity.
Speaking of Judgment “Evaluating people as attractive or not is a basic assessment. You do that automatically whether or not you want to, and it influences you.” “There are circuits in the brain that evaluate dominance from the shape of the face. He looks the part for a leadership role.” “The punishment won’t feel just unless its intensity matches the crime. Just like you can match the loudness of a sound to the brightness of a light.” “This was a clear instance of a mental shotgun. He was asked whether he thought the company was financially sound, but he couldn’t forget that he likes their product.”
The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions. The word comes from the same root as eureka.
Our answer was that when called upon to judge probability, people actually judge something else and believe they have judged probability. System 1 often makes this move when faced with difficult target questions, if the answer to a related and easier heuristic question comes readily to mind.
“If you can’t solve a problem, then there is an easier problem you can solve: find it.”
The present state of mind looms very large when people evaluate their happiness.
Speaking of Substitution and Heuristics “Do we still remember the question we are trying to answer? Or have we substituted an easier one?” “The question we face is whether this candidate can succeed. The question we seem to answer is whether she interviews well. Let’s not substitute.” “He likes the project, so he thinks its costs are low and its benefits are high. Nice example of the affect heuristic.” “We are using last year’s performance as a heuristic to predict the value of the firm several years from now. Is this heuristic good enough? What other information do we need?”
As I described earlier, System 1 is not prone to doubt. It suppresses ambiguity and spontaneously constructs stories that are as coherent as possible. Unless the message is immediately negated, the associations that it evokes will spread as if the message were true. System 2 is capable of doubt, because it can maintain incompatible possibilities at the same time. However, sustaining doubt is harder work than sliding into certainty. The law of small numbers is a manifestation of a general bias that favors certainty over doubt, which will turn up in many guises in following chapters.
The exaggerated faith in small samples is only one example of a more general illusion—we pay more attention to the content of messages than to information about their reliability, and as a result end up with a view of the world around us that is simpler and more coherent than the data justify. Jumping to conclusions is a safer sport in the world of our imagination than it is in reality.
Statistics produce many observations that appear to beg for causal explanations but do not lend themselves to such explanations. Many facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong.
Speaking of the Law of Small Numbers “Yes, the studio has had three successful films since the new CEO took over. But it is too early to declare he has a hot hand.” “I won’t believe that the new trader is a genius before consulting a statistician who could estimate the likelihood of his streak being a chance event.” “The sample of observations is too small to make any inferences. Let’s not follow the law of small numbers.” “I plan to keep the results of the experiment secret until we have a sufficiently large sample. Otherwise we will face pressure to reach a conclusion prematurely.”
However, a key finding of anchoring research is that anchors that are obviously random can be just as effective as potentially informative anchors.
anchors do not have their effects because people believe they are informative.
The psychologists Adam Galinsky and Thomas Mussweiler proposed more subtle ways to resist the anchoring effect in negotiations. They instructed negotiators to focus their attention and search their memory for arguments against the anchor. The instruction to activate System 2 was successful. For example, the anchoring effect is reduced or eliminated when the second mover focuses his attention on the minimal offer that the opponent would accept, or on the costs to the opponent of failing to reach an agreement. In general, a strategy of deliberately “thinking the opposite” may be a good defense against anchoring effects, because it negates the biased recruitment of thoughts that produces these effects.
However, you should assume that any number that is on the table has had an anchoring effect on you, and if the stakes are high you should mobilize yourself (your System 2) to combat the effect.
Speaking of Anchors “The firm we want to acquire sent us their business plan, with the revenue they expect. We shouldn’t let that number influence our thinking. Set it aside.” “Plans are best-case scenarios. Let’s avoid anchoring on plans when we forecast actual outcomes. Thinking about ways the plan could go wrong is one way to do it.” “Our aim in the negotiation is to get them anchored on this number.” “Let’s make it clear that if that is their proposal, the negotiations are over. We do not want to start there.” “The defendant’s lawyers put in a frivolous reference in which they mentioned a ridiculously low amount of damages, and they got the judge anchored on it!”
We defined the availability heuristic as the process of judging frequency by “the ease with which instances come to mind.”
Maintaining one’s vigilance against biases is a chore—but the chance to avoid a costly mistake is sometimes worth the effort.
Self-ratings were dominated by the ease with which examples had come to mind. The experience of fluent retrieval of instances trumped the number retrieved.
Speaking of Availability “Because of the coincidence of two planes crashing last month, she now prefers to take the train. That’s silly. The risk hasn’t really changed; it is an availability bias.” “He underestimates the risks of indoor pollution because there are few media stories on them. That’s an availability effect. He should look at the statistics.” “She has been watching too many spy movies recently, so she’s seeing conspiracies everywhere.” “The CEO has had several successes in a row, so failure doesn’t come easily to her mind. The availability bias is making her overconfident.”
The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.
Frightening thoughts and images occur to us with particular ease, and thoughts of danger that are fluent and vivid exacerbate fear.
The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).
An inability to be guided by a “healthy fear” of bad consequences is a disastrous flaw.
“The emotional tail wags the rational dog.”
“Risk” does not exist “out there,” independent of our minds and culture, waiting to be measured. Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as “real risk” or “objective risk.”
As I know from experience, it is difficult to reason oneself into a state of complete calm. Terrorism speaks directly to System 1.
Speaking of Availability Cascades “She’s raving about an innovation that has large benefits and no costs. I suspect the affect heuristic.” “This is an availability cascade: a nonevent that is inflated by the media and the public until it fills our TV screens and becomes all anyone is talking about.”
The essential keys to disciplined Bayesian reasoning can be simply summarized: Anchor your judgment of the probability of an outcome on a plausible base rate. Question the diagnosticity of your evidence.
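A minimal sketch of both keys using the book's well-known cab problem (85% of cabs are Green, 15% Blue; a witness who identifies colors correctly 80% of the time); the numbers are from that example, the code itself is an illustration:

```python
# Bayes' rule: anchor on the base rate (the prior), then adjust by the
# diagnosticity of the evidence (hit rate vs. false-alarm rate).
prior_blue = 0.15               # base rate of Blue cabs
p_id_blue_if_blue = 0.80        # witness hit rate
p_id_blue_if_green = 0.20       # witness false-alarm rate

posterior = (prior_blue * p_id_blue_if_blue) / (
    prior_blue * p_id_blue_if_blue + (1 - prior_blue) * p_id_blue_if_green
)
print(round(posterior, 2))      # 0.41, far below the witness's 80% accuracy
```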
Speaking of Representativeness “The lawn is well trimmed, the receptionist looks competent, and the furniture is attractive, but this doesn’t mean it is a well-managed company. I hope the board does not go by representativeness.” “This start-up looks as if it could not fail, but the base rate of success in the industry is extremely low. How do we know this case is different?” “They keep making the same mistake: predicting rare events from weak evidence. When the evidence is weak, one should stick with the base rates.” “I know this report is absolutely damning, and it may be based on solid evidence, but how sure are we? We must allow for that uncertainty in our thinking.”
When you specify a possible event in greater detail you can only lower its probability. The problem therefore sets up a conflict between the intuition of representativeness and the logic of probability.
The most coherent stories are not necessarily the most probable, but they are plausible, and the notions of coherence, plausibility, and probability are easily confused by the unwary.
This is a trap for forecasters and their clients: adding detail to scenarios makes them more persuasive, but less likely to come true.
In the absence of a competing intuition, logic prevails.
The laziness of System 2 is an important fact of life, and the observation that representativeness can block the application of an obvious logical rule is also of some interest.
Speaking of Less is More “They constructed a very complicated scenario and insisted on calling it highly probable. It is not—it is only a plausible story.” “They added a cheap gift to the expensive product, and made the whole deal less attractive. Less is more in this case.” “In most situations, a direct comparison makes people more careful and more logical. But not always. Sometimes intuition beats logic even when the correct answer stares you in the face.”
The cab example illustrates two types of base rates. Statistical base rates are facts about a population to which a case belongs, but they are not relevant to the individual case. Causal base rates change your view of how the individual case came to be.
Stereotypes are statements about the group that are (at least tentatively) accepted as facts about every member.
but the psychological facts cannot be avoided: stereotypes, both correct and false, are how we think of categories.
Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular.
You are more likely to learn something by finding surprises in your own behavior than by hearing surprising facts about people in general.
rewards for improved performance work better than punishment of mistakes.
the feedback to which life exposes us is perverse.
our mind is strongly biased toward causal explanations and does not deal well with “mere statistics.”
Speaking of Regression to Mediocrity “She says experience has taught her that criticism is more effective than praise. What she doesn’t understand is that it’s all due to regression to the mean.” “Perhaps his second interview was less impressive than the first because he was afraid of disappointing us, but more likely it was his first that was unusually good.” “Our screening procedure is good but not perfect, so we should anticipate regression. We shouldn’t be surprised that the very best candidates often fail to meet our expectations.”
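A sketch of why criticism seems to outperform praise: when performance is skill plus luck, the worst performers on one attempt improve on the next with no intervention at all. The parameters below are illustrative, not from the book.

```python
# Performance = skill + luck. Select the worst scores on trial 1 (the ones
# that draw criticism); they improve on trial 2 by regression alone.
import random
from statistics import mean

random.seed(0)
n = 100_000
skill = [random.gauss(0, 1) for _ in range(n)]
trial1 = [s + random.gauss(0, 1) for s in skill]
trial2 = [s + random.gauss(0, 1) for s in skill]

worst = [i for i in range(n) if trial1[i] < -1.5]
print(mean(trial1[i] for i in worst))   # about -2.2
print(mean(trial2[i] for i in worst))   # about -1.1: "criticism worked"
```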
Following our intuitions is more natural, and somehow more pleasant, than acting against them.
your intuitions will deliver predictions that are too extreme and you will be inclined to put far too much faith in them.
Speaking of Intuitive Predictions “That start-up achieved an outstanding proof of concept, but we shouldn’t expect them to do as well in the future. They are still a long way from the market and there is a lot of room for regression.” “Our intuitive prediction is very favorable, but it is probably too high. Let’s take into account the strength of our evidence and regress the prediction toward the mean.” “The investment may be a good idea, even if the best guess is that it will fail. Let’s not say we really believe it is the next Google.” “I read one review of that brand and it was excellent. Still, that could have been a fluke. Let’s consider only the brands that have a large number of reviews and pick the one that looks best.”
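The corrective recipe from the chapter on taming intuitive predictions, as I understand it: start from the reference-class mean and move toward the intuitive prediction only in proportion to the correlation between the evidence and the outcome. The function mirrors that recipe; the example numbers are hypothetical.

```python
def regressed_prediction(baseline, intuitive, correlation):
    # Move from the class mean toward the intuition, in proportion
    # to how diagnostic the evidence actually is (correlation in [0, 1]).
    return baseline + correlation * (intuitive - baseline)

# Hypothetical: class-average GPA is 3.0, intuition says 3.8, and the
# evidence correlates only ~0.3 with the outcome.
print(regressed_prediction(3.0, 3.8, 0.3))   # 3.24, less extreme, as advised
```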
Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.
Taleb suggests that we humans constantly fool ourselves by constructing flimsy accounts of the past and believing they are true.
…knowable than it is. It helps perpetuate a pernicious illusion. The core of the illusion is that we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.
To think clearly about the future, we need to clean up the language that we use in labeling the beliefs we had in the past.
Learning from surprises is a reasonable thing to do, but it can have some dangerous consequences.
A general limitation of the human mind is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or of any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.
Your inability to reconstruct past beliefs will inevitably cause you to underestimate the extent to which you were surprised by past events.
When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.
The illusion that one has understood the past feeds the further illusion that one can predict and control the future. These illusions are comforting. They reduce the anxiety that we would experience if we allowed ourselves to fully acknowledge the uncertainties of existence. We all have a need for the reassuring message that actions have appropriate consequences, and that success will reward wisdom and courage.
Because of the halo effect, we get the causal relationship backward: we are prone to believe that the firm fails because its CEO is rigid, when the truth is that the CEO appears to be rigid because the firm is failing. This is how illusions of understanding are born.
In the presence of randomness, regular patterns can only be mirages.
Stories of how businesses rise and fall strike a chord with readers by offering what the human mind needs: a simple message of triumph and failure that identifies clear causes and ignores the determinative power of luck and the inevitability of regression. These stories induce and maintain an illusion of understanding, imparting lessons of little enduring value to readers who are all too eager to believe them.
Speaking of Hindsight “The mistake appears obvious, but it is just hindsight. You could not have known in advance.” “He’s learning too much from this success story, which is too tidy. He has fallen for a narrative fallacy.” “She has no evidence for saying that the firm is badly managed. All she knows is that its stock has gone down. This is an outcome bias, part hindsight and part halo effect.” “Let’s not fall for the outcome bias. This was a stupid decision even though it worked out well.”
For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous—and it is also essential.
Subjective confidence in a judgment is not a reasoned evaluation of the probability that this judgment is correct. Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.
Most of the buyers and sellers know that they have the same information; they exchange the stocks primarily because they have different opinions. The buyers think the price is too low and likely to rise, while the sellers think the price is high and likely to drop. The puzzle is why buyers and sellers alike think that the current price is wrong. What makes them believe they know more about what the price should be than the market does? For most of them, that belief is an illusion.
Facts that challenge such basic assumptions—and thereby threaten people’s livelihood and self-esteem—are simply not absorbed. The mind does not digest them. This is particularly true of statistical studies of performance, which provide base-rate information that people generally ignore when it clashes with their personal impressions from experience.
We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.
The illusion that we understand the past fosters overconfidence in our ability to predict the future.
Those who know more forecast very slightly better than those who know less. But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident.
The first lesson is that errors of prediction are inevitable because the world is unpredictable. The second is that high subjective confidence is not to be trusted as an indicator of accuracy (low confidence could be more informative).
The line that separates the possibly predictable future from the unpredictable distant future is yet to be drawn.
Speaking of Illusory Skill “He knows that the record indicates that the development of this illness is mostly unpredictable. How can he be so confident in this case? Sounds like an illusion of validity.” “She has a coherent story that explains all she knows, and the coherence makes her feel good.” “What makes him believe that he is smarter than the market? Is this an illusion of skill?” “She is a hedgehog. She has a theory that explains everything, and it gives her the illusion that she understands the world.” “The question is not whether these experts are well trained. It is whether their world is predictable.”
Several studies have shown that human decision makers are inferior to a prediction formula even when they are given the score suggested by the formula!
intuition adds value even in the justly derided selection interview, but only after a disciplined collection of objective information and disciplined scoring of separate traits.
Suppose that you need to hire a sales representative for your firm. If you are serious about hiring the best possible person for the job, this is what you should do. First, select a few traits that are prerequisites for success in this position (technical proficiency, engaging personality, reliability, and so on). Don’t overdo it—six dimensions is a good number. The traits you choose should be as independent as possible from each other, and you should feel that you can assess them reliably by asking a few factual questions. Next, make a list of those questions for each trait and think about how you will score it, say on a 1–5 scale. You should have an idea of what you will call “very weak” or “very strong.” These preparations should take you half an hour or so, a small investment that can make a significant difference in the quality of the people you hire. To avoid halo effects, you must collect the information on one trait at a time, scoring each before you move on to the next one. Do not skip around. To evaluate each candidate, add up the six scores. Because you are in charge of the final decision, you should not do a “close your eyes.” Firmly resolve that you will hire the candidate whose final score is the highest, even if there is another one whom you like better—try to resist your wish to invent broken legs to change the ranking. A vast amount of research offers a promise: you are much more likely to find the best candidate if you use this procedure than if you do what people normally do in such situations, which is to go into the interview unprepared and to make choices by an overall intuitive judgment such as “I looked into his eyes and liked what I saw.”
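A small sketch of that procedure in code, with hypothetical traits and scores; only the mechanics (a fixed set of traits, 1-5 scores collected one trait at a time, decision by highest total) come from the text.

```python
# Six prerequisite traits; the last three are illustrative stand-ins.
TRAITS = ["technical proficiency", "engaging personality", "reliability",
          "diligence", "communication", "judgment"]

def total_score(scores: dict) -> int:
    # Score each trait independently on 1-5 before looking at the total,
    # to avoid halo effects; the decision is then purely the sum.
    assert set(scores) == set(TRAITS)
    assert all(1 <= s <= 5 for s in scores.values())
    return sum(scores.values())

candidates = {
    "A": dict(zip(TRAITS, [4, 3, 5, 4, 4, 4])),
    "B": dict(zip(TRAITS, [3, 5, 3, 3, 4, 3])),   # more charming in the room
}
# Commit in advance to hiring the highest total score.
print(max(candidates, key=lambda c: total_score(candidates[c])))   # A
```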
Speaking of Judges vs. Formulas “Whenever we can replace human judgment by a formula, we should at least consider it.” “He thinks his judgments are complex and subtle, but a simple combination of scores could probably do better.” “Let’s decide in advance what weight to give to the data we have on the candidates’ past performance. Otherwise we will give too much weight to our impression from the interviews.”
Little repetition is needed for learning.
We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario. But ease and coherence do not guarantee that a belief held with confidence is true.
Statistical algorithms greatly outdo humans in noisy environments for two reasons: they are more likely than human judges to detect weakly valid cues and much more likely to maintain a modest level of accuracy by using such cues consistently.
Remember this rule: intuition cannot be trusted in the absence of stable regularities in the environment.
Whether professionals have a chance to develop intuitive expertise depends essentially on the quality and speed of feedback, as well as on sufficient opportunity to practice.
Speaking of Expert Intuition “How much expertise does she have in this particular task? How much practice has she had?” “Does he really believe that the environment of start-ups is sufficiently regular to justify an intuition that goes against the base rates?” “She is very confident in her decision, but subjective confidence is a poor index of the accuracy of a judgment.” “Did he really have an opportunity to learn? How quick and how clear was the feedback he received on his judgments?”
I was following a procedure that we already planned to incorporate into our curriculum: the proper way to elicit information from a group is not by starting with a public discussion but by confidentially collecting each person’s judgment. This procedure makes better use of the knowledge available to members of the group than the common practice of open discussion.
This embarrassing episode remains one of the most instructive experiences of my professional life. I eventually learned three lessons from it. The first was immediately apparent: I had stumbled onto a distinction between two profoundly different approaches to forecasting, which Amos and I later labeled the inside view and the outside view. The second lesson was that our initial forecasts of about two years for the completion of the project exhibited a planning fallacy. Our estimates were closer to a best-case scenario than to a realistic assessment. I was slower to accept the third lesson, which I call irrational perseverance: the folly we displayed that day in failing to abandon the project. Facing a choice, we gave up rationality rather than give up the enterprise.
There are many ways for any plan to fail, and although most of them are too improbable to be anticipated, the likelihood that something will go wrong in a big project is high.
The argument for the outside view should be made on general grounds: if the reference class is properly chosen, the outside view will give an indication of where the ballpark is, and it may suggest, as it did in our case, that the inside-view forecasts are not even close to it.
This is a common pattern: people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.
Errors in the initial budget are not always innocent. The authors of unrealistic plans are often driven by the desire to get the plan approved—whether by their superiors or by a client—supported by the knowledge that projects are rarely abandoned unfinished merely because of overruns in costs or completion times. In such cases, the greatest responsibility for avoiding the planning fallacy lies with the decision makers who approve the plan. If they do not recognize the need for an outside view, they commit a planning fallacy.
A well-run organization will reward planners for precise execution and penalize them for failing to anticipate difficulties, and for failing to allow for difficulties that they could not have anticipated—the unknown unknowns.
When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or on time or to deliver the expected returns—or even to be completed.
Speaking of the Outside View “He’s taking an inside view. He should forget about his own case and look for what happened in other cases.” “She is the victim of a planning fallacy. She’s assuming a best-case scenario, but there are too many different ways for the plan to fail, and she cannot foresee them all.” “Suppose you did not know a thing about this particular legal case, only that it involves a malpractice claim by an individual against a surgeon. What would be your baseline prediction? How many of these cases succeed in court? How many settle? What are the amounts? Is the case we are discussing stronger or weaker than similar claims?” “We are making an additional investment because we do not want to admit failure. This is an instance of the sunk-cost fallacy.”
We focus on our goal, anchor on our plan, and neglect relevant base rates, exposing ourselves to the planning fallacy. We focus on what we want to do and can do, neglecting the plans and skills of others. Both in explaining the past and in predicting the future, we focus on the causal role of skill and neglect the role of luck. We are therefore prone to an illusion of control. We focus on what we know and neglect what we do not know, which makes us overly confident in our beliefs.
The consequence of competition neglect is excess entry: more competitors enter the market than the market can profitably sustain, so their average outcome is a loss.
Overconfidence is another manifestation of WYSIATI: when we estimate a quantity, we rely on information that comes to mind and construct a coherent story in which the estimate makes sense. Allowing for the information that does not come to mind—perhaps because one never knew it—is impossible.
Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred solution.
The effects of high optimism on decision making are, at best, a mixed blessing, but the contribution of optimism to good implementation is certainly positive. The main benefit of optimism is resilience in the face of setbacks.
The main obstacle is that subjective confidence is determined by the coherence of the story one has constructed, not by the quality and amount of the information that supports it.
The procedure is simple: when the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”
The premortem has two main advantages: it overcomes the groupthink that affects many teams once a decision appears to have been made, and it unleashes the imagination of knowledgeable individuals in a much-needed direction.
Speaking of Optimism “They have an illusion of control. They seriously underestimate the obstacles.” “They seem to suffer from an acute case of competitor neglect.” “This is a case of overconfidence. They seem to believe they know more than they actually do know.” “We should conduct a premortem session. Someone may come up with a threat we have neglected.”
If you prefer an apple to a banana, then you also prefer a 10% chance to win an apple to a 10% chance to win a banana.
Bernoulli observed that most people dislike risk (the chance of receiving the lowest possible outcome), and if they are offered a choice between a gamble and an amount equal to its expected value they will pick the sure thing. In fact a risk-averse decision maker will choose a sure thing that is less than expected value, in effect paying a premium to avoid the uncertainty.
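Bernoulli's own resolution was that utility is logarithmic in wealth; under that assumption a sure amount below the expected value can still be preferred. A sketch with illustrative numbers:

```python
import math

wealth = 10_000
gamble = [0, 1_000]                    # equal chances to win $1,000 or nothing
expected_value = sum(gamble) / 2       # 500

# Expected log-utility of taking the gamble, and the sure amount
# (certainty equivalent) that yields the same utility.
eu = sum(math.log(wealth + x) for x in gamble) / 2
certainty_equivalent = math.exp(eu) - wealth
print(expected_value, round(certainty_equivalent, 2))   # 500 vs ~488.09
# The ~$12 gap is the premium this agent would pay to avoid the uncertainty.
```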
The happiness that Jack and Jill experience is determined by the recent change in their wealth, relative to the different states of wealth that define their reference points (1 million for Jack, 9 million for Jill). This reference dependence is ubiquitous in sensation and perception. The same sound will be experienced as very loud or quite faint, depending on whether it was preceded by a whisper or by a roar.
theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing.
disbelieving is hard work, and System 2 is easily tired.
Speaking of Bernoulli’s Errors “He was very happy with a $20,000 bonus three years ago, but his salary has gone up by 20% since, so he will need a higher bonus to get the same utility.” “Both candidates are willing to accept the salary we’re offering, but they won’t be equally satisfied because their reference points are different. She currently has a much higher salary.” “She’s suing him for alimony. She would actually like to settle, but he prefers to go to court. That’s not surprising—she can only gain, so she’s risk averse. He, on the other hand, faces options that are all bad, so he’d rather take the risk.”
The reason you like the idea of gaining $100 and dislike the idea of losing $100 is not that these amounts change your wealth. You just like winning and dislike losing—and you almost certainly dislike losing more than you like winning.
A principle of diminishing sensitivity applies to both sensory dimensions and the evaluation of changes of wealth. Turning on a weak light has a large effect in a dark room. The same increment of light may be undetectable in a brightly illuminated room. Similarly, the subjective difference between $900 and $1,000 is much smaller than the difference between $100 and $200. The third principle is loss aversion. When directly compared or weighted against each other, losses loom larger than gains. This asymmetry between the power of positive and negative expectations or experiences has an evolutionary history. Organisms that treat threats as more urgent than opportunities have a better chance to survive and reproduce.
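These ideas fit in one function. Below is the standard prospect-theory value function, using the commonly cited Tversky & Kahneman (1992) parameter estimates (alpha ≈ 0.88, lambda ≈ 2.25); those numbers are not quoted in this section.

```python
def value(x, alpha=0.88, lam=2.25):
    # x is a gain or loss relative to the reference point.
    # Exponent < 1: diminishing sensitivity; lam > 1: loss aversion.
    return x**alpha if x >= 0 else -lam * (-x)**alpha

print(value(100), value(-100))      # ~57.6 vs ~-129.6: the loss looms larger
print(value(1000) - value(900))     # ~39: a $100 step far from the reference point
print(value(200) - value(100))      # ~48: the same step near it feels bigger
```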
Many of the options we face in life are “mixed”: there is a risk of loss and an opportunity for gain, and we must decide whether to accept the gamble or reject it.
For most people, the fear of losing $100 is more intense than the hope of gaining $150. We concluded from many such observations that “losses loom larger than gains” and that people are loss averse.
You can measure the extent of your aversion to losses by asking yourself a question: What is the smallest gain that I need to balance an equal chance to lose $100? For many people, the answer is about $200, twice as much as the loss. The “loss aversion ratio” has been estimated in several experiments and is usually in the range of 1.5 to 2.5.
Speaking of Prospect Theory “He suffers from extreme loss aversion, which makes him turn down very favorable opportunities.” “Considering her vast wealth, her emotional response to trivial gains and losses makes no sense.” “He weighs losses about twice as much as gains, which is normal.”
First, tastes are not fixed; they vary with the reference point. Second, the disadvantages of a change loom larger than its advantages, inducing a bias that favors the status quo.
Your leisure time and the standard of living that your income supports are also not intended for sale or exchange.
Evidence from brain imaging confirms the difference. Selling goods that one would normally use activates regions of the brain that are associated with disgust and pain. Buying also activates these areas, but only when the prices are perceived as too high—when you feel that a seller is taking money that exceeds the exchange value. Brain recordings also indicate that buying at especially low prices is a pleasurable event.
Being poor, in prospect theory, is living below one’s reference point. There are goods that the poor need and cannot afford, so they are always “in the losses.” Small amounts of money that they receive are therefore perceived as a reduced loss, not as a gain. The money helps one climb a little toward the reference point, but the poor always remain on the steep limb of the value function.
People who are poor think like traders, but the dynamics are quite different. Unlike traders, the poor are not indifferent to the differences between gaining and giving up. Their problem is that all their choices are between losses. Money that is spent on one good is the loss of another good that could have been purchased instead. For the poor, costs are losses.
Speaking of the Endowment Effect “She didn’t care which of the two offices she would get, but a day after the announcement was made, she was no longer willing to trade. Endowment effect!” “These negotiations are going nowhere because both sides find it difficult to make concessions, even when they can get something in return. Losses loom larger than gains.” “When they raised their prices, demand dried up.” “He just hates the idea of selling his house for less money than he paid for it. Loss aversion is at work.” “He is a miser, and treats any dollar he spends as a loss.”
There is no real threat, but the mere reminder of a bad event is treated in System 1 as threatening.
Loss aversion refers to the relative strength of two motives: we are driven more strongly to avoid losses than to achieve gains. A reference point is sometimes the status quo, but it can also be a goal in the future: not achieving a goal is a loss, exceeding the goal is a gain. As we might expect from negativity dominance, the two motives are not equally powerful. The aversion to the failure of not reaching the goal is much stronger than the desire to exceed it.
Loss aversion creates an asymmetry that makes agreements difficult to reach. The concessions you make to me are my gains, but they are your losses; they cause you much more pain than they give me pleasure. Inevitably, you will place a higher value on them than I do. The same is true, of course, of the very painful concessions you demand from me, which you do not appear to value sufficiently! Negotiations over a shrinking pie are especially difficult, because they require an allocation of losses. People tend to be much more easygoing when they bargain over an expanding pie.
Many of the messages that negotiators exchange in the course of bargaining are attempts to communicate a reference point and provide an anchor to the other side. The messages are not always sincere. Negotiators often pretend intense attachment to some good (perhaps missiles of a particular type in bargaining over arms reductions), although they actually view that good as a bargaining chip and intend ultimately to give it away in an exchange. Because negotiators are influenced by a norm of reciprocity, a concession that is presented as painful calls for an equally painful (and perhaps equally inauthentic) concession from the other side.
Remarkably, altruistic punishment is accompanied by increased activity in the “pleasure centers” of the brain. It appears that maintaining the social order and the rules of fairness in this fashion is its own reward. Altruistic punishment could well be the glue that holds societies together. However, our brains are not designed to reward generosity as reliably as they punish meanness. Here again, we find a marked asymmetry between losses and gains.
In a more recent discussion, Eyal Zamir makes the provocative point that the distinction drawn in the law between restoring losses and compensating for foregone gains may be justified by their asymmetrical effects on individual well-being. If people who lose suffer more than people who merely fail to gain, they may also deserve more protection from the law.
Speaking of Losses “This reform will not pass. Those who stand to lose will fight harder than those who stand to gain.” “Each of them thinks the other’s concessions are less painful. They are both wrong, of course. It’s just the asymmetry of losses.” “They would find it easier to renegotiate the agreement if they realized the pie was actually expanding. They’re not allocating losses; they are allocating gains.” “Rental prices around here have gone up recently, but our tenants don’t think it’s fair that we should raise their rent, too. They feel entitled to their current terms.” “My clients don’t resent the price hike because they know my costs have gone up, too. They accept my right to stay profitable.”
The assignment of weights is sometimes conscious and deliberate. Most often, however, you are just an observer to a global evaluation that your System 1 delivers.
The conclusion is straightforward: the decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle. Improbable outcomes are overweighted—this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology.
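A common one-parameter fit of that decision-weight curve (from Tversky & Kahneman's 1992 paper, not from this passage) shows both effects:

```python
def weight(p, gamma=0.61):
    # gamma < 1 overweights small probabilities and underweights large ones.
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

print(round(weight(0.01), 3))   # ~0.055: a 1% chance weighs ~5.5% (possibility effect)
print(round(weight(0.99), 3))   # ~0.912: a 99% chance weighs ~91% (certainty effect)
```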
When you pay attention to a threat, you worry—and the decision weights reflect how much you worry. Because of the possibility effect, the worry is not proportional to the probability of the threat. Reducing or mitigating the risk is not adequate; to eliminate the worry the probability must be brought down to zero.
people attach values to gains and losses rather than to wealth, and the decision weights that they assign to outcomes are different from probabilities.
Of course, what people acquire with a ticket is more than a chance to win; it is the right to dream pleasantly of winning.
This is where businesses that are losing ground to a superior technology waste their remaining assets in futile attempts to catch up. Because defeat is so difficult to accept, the losing side in wars often fights long past the point at which the victory of the other side is certain, and only a matter of time.
Consistent overweighting of improbable outcomes—a feature of intuitive decision making—eventually leads to inferior outcomes.
Speaking of the Fourfold Pattern “He is tempted to settle this frivolous claim to avoid a freak loss, however unlikely. That’s overweighting of small probabilities. Since he is likely to face many similar problems, he would be better off not yielding.” “We never let our vacations hang on a last-minute deal. We’re willing to pay a lot for certainty.” “They will not cut their losses so long as there is a chance of breaking even. This is risk-seeking in the losses.” “They know the risk of a gas explosion is minuscule, but they want it mitigated. It’s a possibility effect, and they want peace of mind.”
People overestimate the probabilities of unlikely events. People overweight unlikely events in their decisions.
Our mind has a useful capability to focus spontaneously on whatever is odd, different, or unusual.
The story, I believe, is that a rich and vivid representation of the outcome, whether or not it is emotional, reduces the role of probability in the evaluation of an uncertain prospect. This hypothesis suggests a prediction, in which I have reasonably high confidence: adding irrelevant but vivid details to a monetary outcome also disrupts calculation.
The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects. You read that “a vaccine that protects children from a fatal disease carries a 0.001% risk of permanent disability.” The risk appears small. Now consider another description of the same risk: “One of 100,000 vaccinated children will be permanently disabled.” The second statement does something to your mind that the first does not: it calls up the image of an individual child who is permanently disabled by a vaccine; the 99,999 safely vaccinated children have faded into the background. As predicted by denominator neglect, low-probability events are much more heavily weighted when described in terms of relative frequencies (how many) than when stated in more abstract terms of “chances,” “risk,” or “probability” (how likely). As we have seen, System 1 is much better at dealing with individuals than categories.
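The two formulations are numerically identical, which is the point; a one-line check:

```python
risk = 0.001 / 100            # "a 0.001% risk" as a probability
print(risk * 100_000)         # 1.0, i.e., "one of 100,000 vaccinated children"
```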
When it comes to rare probabilities, our mind is not designed to get things quite right. For the residents of a planet that may be exposed to events no one has yet experienced, this is not good news.
Speaking of Rare Events “Tsunamis are very rare even in Japan, but the image is so vivid and compelling that tourists are bound to overestimate their probability.” “It’s the familiar disaster cycle. Begin by exaggeration and overweighting, then neglect sets in.” “We shouldn’t focus on a single scenario, or we will overestimate its probability. Let’s set up specific alternatives and make the probabilities add up to 100%.” “They want people to be worried by the risk. That’s why they describe it as 1 death per 1,000. They’re counting on denominator neglect.”
This is generally true: every simple choice formulated in terms of gains and losses can be deconstructed in innumerable ways into a combination of choices, yielding preferences that are likely to be inconsistent.
The example also shows that it is costly to be risk averse for gains and risk seeking for losses. These attitudes make you willing to pay a premium to obtain a sure gain rather than face a gamble, and also willing to pay a premium (in expected value) to avoid a sure loss. Both payments come out of the same pocket, and when you face both kinds of problems at once, the discrepant attitudes are unlikely to be optimal.
I sympathize with your aversion to losing any gamble, but it is costing you a lot of money. Please consider this question: Are you on your deathbed? Is this the last offer of a small favorable gamble that you will ever consider? Of course, you are unlikely to be offered exactly this gamble again, but you will have many opportunities to consider attractive gambles with stakes that are very small relative to your wealth. You will do yourself a large financial favor if you are able to see each of these gambles as part of a bundle of small gambles and rehearse the mantra that will get you significantly closer to economic rationality: you win a few, you lose a few. The main purpose of the mantra is to control your emotional response when you do lose. If you can trust it to be effective, you should remind yourself of it when deciding whether or not to accept a small risk with positive expected value. Remember these qualifications when using the mantra:
- It works when the gambles are genuinely independent of each other; it does not apply to multiple investments in the same industry, which would all go bad together.
- It works only when the possible loss does not cause you to worry about your total wealth. If you would take the loss as significant bad news about your economic future, watch it!
- It should not be applied to long shots, where the probability of winning is very small for each bet.
If you have the emotional discipline that this rule requires, you will never consider a small gamble in isolation or be loss averse for a small gamble until you are actually on your deathbed—and not even then.
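A sketch of why the mantra works, using the lose-$100/win-$150 coin flip from earlier in the section; the bundle size and the simulation itself are illustrative.

```python
import random

random.seed(1)

def bundle(n=100):
    # n independent 50-50 gambles: win $150 or lose $100.
    return sum(150 if random.random() < 0.5 else -100 for _ in range(n))

results = [bundle() for _ in range(10_000)]
print(sum(results) / len(results))                  # average total: about +2,500
print(sum(r < 0 for r in results) / len(results))   # chance of a net loss: roughly 2-3%
```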
The combination of loss aversion and narrow framing is a costly curse. Individual investors can avoid that curse, achieving the emotional benefits of broad framing while also saving time and agony, by reducing the frequency with which they check how well their investments are doing. Closely following daily fluctuations is a losing proposition, because the pain of the frequent small losses exceeds the pleasure of the equally frequent small gains. Once a quarter is enough, and may be more than enough for individual investors. In addition to improving the emotional quality of life, the deliberate avoidance of exposure to short-term outcomes improves the quality of both decisions and outcomes. The typical short-term reaction to bad news is increased loss aversion. Investors who get aggregated feedback receive such news much less often and are likely to be less risk averse and to end up richer. You are also less prone to useless churning of your portfolio if you don’t know how every stock in it is doing every day (or every week or even every month). A commitment not to change one’s position for several periods (the equivalent of “locking in” an investment) improves financial performance.
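The arithmetic behind "once a quarter is enough" can be sketched with assumed round-number return parameters rather than anything from the text: for an asset with a positive expected return, the chance that any single reporting period shows a loss falls as the period lengthens, so infrequent checkers simply encounter less bad news:

```python
import math

# Hypothetical asset: 7% expected annual return, 18% annual volatility
# (assumed round numbers, not figures from the book). Under a normal
# approximation, P(loss over t years) = Phi(-mu*t / (sigma*sqrt(t))).
MU, SIGMA = 0.07, 0.18

def p_loss(t_years):
    z = -MU * t_years / (SIGMA * math.sqrt(t_years))
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))   # standard normal CDF

for label, t in [("daily", 1 / 252), ("monthly", 1 / 12),
                 ("quarterly", 1 / 4), ("yearly", 1), ("decade", 10)]:
    print(f"{label:>9}: P(period shows a loss) ≈ {p_loss(t):.2f}")
# daily ~0.49, monthly ~0.46, quarterly ~0.42, yearly ~0.35, decade ~0.11:
# the less often you look, the less often loss aversion gets triggered.
```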
The outside view and the risk policy are remedies against two distinct biases that affect many decisions: the exaggerated optimism of the planning fallacy and the exaggerated caution induced by loss aversion. The two biases oppose each other. Exaggerated optimism protects individuals and organizations from the paralyzing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism. The upshot is rather comfortable for the decision maker. Optimists believe that the decisions they make are more prudent than they really are, and loss-averse decision makers correctly reject marginal propositions that they might otherwise accept. There is no guarantee, of course, that the biases cancel out in every situation. An organization that could eliminate both excessive optimism and excessive loss aversion should do so. The combination of the outside view with a risk policy should be the goal.
Speaking of Risk Policies “Tell her to think like a trader! You win a few, you lose a few.” “I decided to evaluate my portfolio only once a quarter. I am too loss averse to make sensible decisions in the face of daily price fluctuations.” “They never buy extended warranties. That’s their risk policy.” “Each of our executives is loss averse in his or her domain. That’s perfectly natural, but the result is that the organization is not taking enough risk.”
The ultimate currency that rewards or punishes is often emotional, a form of mental self-dealing that inevitably creates conflicts of interest when the individual acts as an agent on behalf of an organization.
Boards of directors are well aware of these conflicts and often replace a CEO who is encumbered by prior decisions and reluctant to cut losses. The members of the board do not necessarily believe that the new CEO is more competent than the one she replaces. They do know that she does not carry the same mental accounts and is therefore better able to ignore the sunk costs of past investments in evaluating current opportunities.
people expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction.
It is the departure from the default that produces regret.
We spend much of our day anticipating, and trying to avoid, the emotional pains we inflict on ourselves. How seriously should we take these intangible outcomes, the self-administered punishments (and occasional rewards) that we experience as we score our lives?
Speaking of Keeping Score “He has separate mental accounts for cash and credit purchases. I constantly remind him that money is money.” “We are hanging on to that stock just to avoid closing our mental account at a loss. It’s the disposition effect.” “We discovered an excellent dish at that restaurant and we never try anything else, to avoid regret.” “The salesperson showed me the most expensive car seat and said it was the safest, and I could not bring myself to buy the cheaper model. It felt like a taboo tradeoff.”
Speaking of Reversals “The BTU units meant nothing to me until I saw how much air-conditioning units vary. Joint evaluation was essential.” “You say this was an outstanding speech because you compared it to her other speeches. Compared to others, she was still inferior.” “It is often the case that when you broaden the frame, you reach more reasonable decisions.” “When you see cases in isolation, you are likely to be guided by an emotional reaction of System 1.”
In terms of the associations they bring to mind—how System 1 reacts to them—the two sentences really “mean” different things. The fact that logically equivalent statements evoke different reactions makes it impossible for Humans to be as reliably rational as Econs.
We should not be surprised: losses evoke stronger negative feelings than costs. Choices are not reality-bound because System 1 is not reality-bound.
Their psychology was sound: people will more readily forgo a discount than pay a surcharge. The two may be economically equivalent, but they are not emotionally equivalent.
Decision makers tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good. They tend to reject the sure thing and accept the gamble (they are risk seeking) when both outcomes are negative.
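Prospect theory captures this pattern with a value function that is concave for gains and convex but steeper for losses. Below is a minimal sketch using the median parameter estimates from Tversky and Kahneman's 1992 paper (alpha = 0.88, lambda = 2.25), ignoring probability weighting for simplicity; the $500/$1,000 stakes are illustrative, not from the text:

```python
# Prospect-theory value function with the Tversky & Kahneman (1992)
# median estimates: concave for gains, convex and 2.25x steeper for losses.
ALPHA, LAMBDA = 0.88, 2.25

def v(x):
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

# Gains: a sure $500 vs. a 50% chance of $1,000 (illustrative stakes).
print(v(500), 0.5 * v(1000))    # ~237 vs. ~218: the sure thing wins (risk averse)

# Losses: a sure -$500 vs. a 50% chance of -$1,000.
print(v(-500), 0.5 * v(-1000))  # ~-533 vs. ~-491: the gamble wins (risk seeking)
```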
Your moral feelings are attached to frames, to descriptions of reality rather than to reality itself.
Our preferences are about framed problems, and our moral intuitions are about descriptions, not about substance.
Broader frames and inclusive accounts generally lead to more rational decisions.
Speaking of Frames and Reality “They will feel better about what happened if they manage to frame the outcome in terms of how much money they kept rather than how much they lost.” “Let’s reframe the problem by changing the reference point. Imagine we did not own it; how much would we think it is worth?” “Charge the loss to your mental account of ‘general revenue’—you will feel better!” “They ask you to check the box to opt out of their mailing list. Their list would shrink if they asked you to check a box to opt in!”
Peak-end rule: The global retrospective rating was well predicted by the average of the level of pain reported at the worst moment of the experience and at its end. Duration neglect: The duration of the procedure had no effect whatsoever on the ratings of total pain.
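The two findings amount to a scoring rule, which a short sketch can make concrete. The per-minute pain profiles below are hypothetical; only the scoring rules come from the findings above:

```python
# Two hypothetical pain records, one reading per minute on a 0-10 scale.
# B's procedure contains A's entirely, then adds three milder minutes.
patient_a = [7, 8, 8]             # 3 minutes, ends at its worst
patient_b = [7, 8, 8, 4, 2, 1]    # 6 minutes, tapers to a mild ending

def peak_end(record):
    """Remembered rating predicted by the average of peak and end pain."""
    return (max(record) + record[-1]) / 2

def total_pain(record):
    """Duration-weighted score: every moment counts, none is privileged."""
    return sum(record)

for name, rec in [("A", patient_a), ("B", patient_b)]:
    print(name, "peak-end:", peak_end(rec), " total:", total_pain(rec))
# A -> peak-end 8.0, total 23; B -> peak-end 4.5, total 30.
# The remembering self prefers B even though B contains strictly more pain.
```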
The experiencing self is the one that answers the question: “Does it hurt now?” The remembering self is the one that answers the question: “How was it, on the whole?” Memories are all we get to keep from our experience of living, and the only perspective that we can adopt as we think about our lives is therefore that of the remembering self.
Confusing experience with the memory of it is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience. This is the tyranny of the remembering self.
The cold-hand study showed that we cannot fully trust our preferences to reflect our interests, even if they are based on personal experience, and even if the memory of that experience was laid down within the last quarter of an hour! Tastes and decisions are shaped by memories, and the memories can be wrong.
A memory that neglects duration will not serve our preference for long pleasure and short pains.
Speaking of Two Selves “You are thinking of your failed marriage entirely from the perspective of the remembering self. A divorce is like a symphony with a screeching sound at the end—the fact that it ended badly does not mean it was all bad.” “This is a bad case of duration neglect. You are giving the good and the bad part of your experience equal weight, although the good part lasted ten times as long as the other.”
A story is about significant events and memorable moments, not about time passing.
Caring for people often takes the form of concern for the quality of their stories, not for their feelings.
Most important, of course, we all care intensely for the narrative of our own life and very much want it to be a good story, with a decent hero.
In intuitive evaluation of entire lives as well as brief episodes, peaks and ends matter but duration does not.
The photographer does not view the scene as a moment to be savored but as a future memory to be designed.
Odd as it may seem, I am my remembering self, and the experiencing self, who does my living, is like a stranger to me.
Speaking of Life as a Story “He is desperately trying to protect the narrative of a life of integrity, which is endangered by the latest episode.” “The length to which he was willing to go for a one-night encounter is a sign of total duration neglect.” “You seem to be devoting your entire vacation to the construction of memories. Perhaps you should put away the camera and enjoy the moment, even if it is not very memorable?” “She is an Alzheimer’s patient. She no longer maintains a narrative of her life, but her experiencing self is still sensitive to beauty and gentleness.”
Attention is key. Our emotional state is largely determined by what we attend to, and we are normally focused on our current activity and immediate environment.
These observations have implications for both individuals and society. The use of time is one of the areas of life over which people have some control. Few individuals can will themselves to have a sunnier disposition, but some may be able to arrange their lives to spend less of their day commuting, and more time doing things they enjoy with people they like.
Not surprisingly, a headache will make a person miserable, and the second best predictor of the feelings of a day is whether a person did or did not have contacts with friends or relatives. It is only a slight exaggeration to say that happiness is the experience of spending time with people you love and who love you.
There is a clear contrast between the effects of income on experienced well-being and on life satisfaction. Higher income brings with it higher satisfaction, well beyond the point at which it ceases to have any positive effect on experience. The general conclusion is as clear for well-being as it was for colonoscopies: people’s evaluations of their lives and their actual experience may be related, but they are also different. Life satisfaction is not a flawed measure of their experienced well-being, as I thought some years ago. It is something else entirely.
Speaking of Experienced Well-Being “The objective of policy should be to reduce human suffering. We aim for a lower U-index in society. Dealing with depression and extreme poverty should be a priority.” “The easiest way to increase happiness is to control your use of time. Can you find more time to do the things you enjoy doing?” “Beyond the satiation level of income, you can buy more pleasurable experiences, but you will lose some of your ability to enjoy the less expensive ones.”
The goals that people set for themselves are so important to what they do and how they feel about it that an exclusive focus on experienced well-being is not tenable. We cannot hold a concept of well-being that ignores what people want. On the other hand, it is also true that a concept of well-being that ignores how people feel as they live and focuses only on how they feel when they think about their life is also untenable. We must accept the complexities of a hybrid view, in which the well-being of both selves is considered.
Nothing in life is as important as you think it is when you are thinking about it.
Adaptation to a new situation, whether good or bad, consists in large part of thinking less and less about it. In that sense, most long-term circumstances of life, including paraplegia and marriage, are part-time states that one inhabits only when one attends to them.
The focusing illusion creates a bias in favor of goods and experiences that are initially exciting, even if they will eventually lose their appeal. Time is neglected, causing experiences that will retain their attention value in the long term to be appreciated less than they deserve to be.
The mistake that people make in the focusing illusion involves attention to selected moments and neglect of what happens at other times. The mind is good with stories, but it does not appear to be well designed for the processing of time.
Speaking of Thinking About Life “She thought that buying a fancy car would make her happier, but it turned out to be an error of affective forecasting.” “His car broke down on the way to work this morning and he’s in a foul mood. This is not a good day to ask him about his job satisfaction!” “She looks quite cheerful most of the time, but when she is asked she says she is very unhappy. The question must make her think of her recent divorce.” “Buying a larger house may not make us happier in the long term. We could be suffering from a focusing illusion.” “He has chosen to split his time between two cities. Probably a serious case of miswanting.”
The central fact of our existence is that time is the ultimate finite resource, but the remembering self ignores that reality. The neglect of duration combined with the peak-end rule causes a bias that favors a short period of intense joy over a long period of moderate happiness. The mirror image of the same bias makes us fear a short period of intense but tolerable suffering more than we fear a much longer period of moderate pain. Duration neglect also makes us prone to accept a long period of mild unpleasantness because the end will be better, and it favors giving up an opportunity for a long happy period if it is likely to have a poor ending.
In contrast, the duration-weighted conception of well-being treats all moments of life alike, memorable or not. Some moments end up weighted more than others, either because they are memorable or because they are important. The time that people spend dwelling on a memorable moment should be included in its duration, adding to its weight. A moment can also gain importance by altering the experience of subsequent moments.
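One way to make that weighting concrete (the scheme below is a reading of this passage, not a formula given in the book) is to score each episode by its feeling, weighted by its lived duration plus the time later spent dwelling on it:

```python
# Duration-weighted well-being with a dwell-time adjustment; the scheme is
# an interpretation of the passage, not a formula from the book.
# Episodes: (feeling on a -10..10 scale, hours lived, hours spent dwelling).
episodes = [
    ( 8,  2, 5),    # a brief, memorable high point, often revisited
    ( 1, 40, 0),    # a long stretch of mildly pleasant routine
    (-6,  1, 3),    # a short painful episode that keeps coming back
]

def duration_weighted(eps):
    total = sum(lived + dwell for _, lived, dwell in eps)
    return sum(feel * (lived + dwell) for feel, lived, dwell in eps) / total

print(round(duration_weighted(episodes), 2))   # 1.41
# The routine hours dominate, but dwelling stretches the weight of the
# memorable moments -- unlike the peak-end rule, no moment is dropped.
```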
A theory of well-being that ignores what people want cannot be sustained. On the other hand, a theory that ignores what actually happens in people’s lives and focuses exclusively on what they think about their life is not tenable either. The remembering self and the experiencing self must both be considered, because their interests do not always coincide. Philosophers could struggle with these questions for a long time.
Rationality is logical coherence—reasonable or not. Econs are rational by this definition, but there is overwhelming evidence that Humans cannot be. An Econ would not be susceptible to priming, WYSIATI, narrow framing, the inside view, or preference reversals, which Humans cannot consistently avoid.
Although Humans are not irrational, they often need help to make more accurate judgments and better decisions, and in some cases policies and institutions can provide that help.
when we observe people acting in ways that seem odd, we should first examine the possibility that they have a good reason to do what they do.
Humans, unlike Econs, need help to make good decisions, and there are informed and unintrusive ways to provide that help.
The attentive System 2 is who we think we are. System 2 articulates judgments and makes choices, but it often endorses or rationalizes ideas and feelings that were generated by System 1. You may not know that you are optimistic about a project because something about its leader reminds you of your beloved sister, or that you dislike a person who looks vaguely like your dentist. If asked for an explanation, however, you will search your memory for presentable reasons and will certainly find some. Moreover, you will believe the story you make up. But System 2 is not merely an apologist for System 1; it also prevents many foolish thoughts and inappropriate impulses from overt expression. The investment of attention improves performance in numerous activities—think of the risks of driving through a narrow space while your mind is wandering—and is essential to some tasks, including comparison, choice, and ordered reasoning. However, System 2 is not a paragon of rationality. Its abilities are limited and so is the knowledge to which it has access. We do not always think straight when we reason, and the errors are not always due to intrusive and incorrect intuitions. Often we make mistakes because we (our System 2) do not know any better.
Our thoughts and actions are routinely guided by System 1 and generally are on the mark. One of the marvels is the rich and detailed model of our world that is maintained in associative memory: it distinguishes surprising from normal events in a fraction of a second, immediately generates an idea of what was expected instead of a surprise, and automatically searches for some causal interpretation of surprises and of events as they take place.
The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions. When these conditions are fulfilled, skill eventually develops, and the intuitive judgments and choices that quickly come to mind will mostly be accurate. All this is the work of System 1, which means it occurs automatically and fast.
System 1 is not constrained by capacity limits and is profligate in its computations. When engaged in searching for an answer to one question, it simultaneously generates the answers to related questions, and it may substitute a response that more easily comes to mind for the one that was requested. In this conception of heuristics, the heuristic answer is not necessarily simpler or more frugal than the original question—it is only more accessible, computed more quickly and…
System 1 registers the cognitive ease with which it processes information, but it does not generate a warning signal when it becomes unreliable. Intuitive answers come to mind quickly and confidently, whether they originate from skills or from heuristics. There is no simple way for System 2 to distinguish between a skilled and a heuristic response. Its only recourse is to slow down and attempt to construct an answer on its own, which it is reluctant to do because it is indolent. Many suggestions of System 1 are casually endorsed with minimal checking, as in the bat-and-ball problem. This is how System 1 acquires its bad reputation as the source of errors and biases. Its operative features, which include WYSIATI, intensity…
The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. This is how you will proceed when you next encounter the Müller-Lyer illusion. When you see lines with fins pointing in different directions, you will recognize the situation as one in which you should not trust your impressions of length.
The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision. More doubt is the last thing you want when you are in trouble.
Observers are less cognitively busy and more open to information than actors.
Whatever else it produces, an organization is a factory that manufactures judgments and decisions. Every factory must have ways to ensure the quality of its products in the initial design, in fabrication, and in final inspections. The corresponding stages in the production of decisions are the framing of the problem that is to be solved, the collection of relevant information leading to a decision, and reflection and review.
There is a direct link from more precise gossip at the watercooler to better decisions. Decision makers are sometimes better able to imagine the voices of present gossipers and future critics than to hear the hesitant voice of their own doubts. They will make better choices when they trust their critics to be sophisticated and fair, and when they expect their decision to be judged by how it was made, not only by how it turned out.