A hundred cognitive biases
Ways of thinking fast and slow, with illustrations and converse strategies.
The frailties of human modes of thinking have been laid out in several recent works. The pathbreaking research of Nobel laureate Daniel Kahneman and his collaborator Amos Tversky introduced me to the science of cognitive biases, popularized in Kahneman’s book Thinking, Fast and Slow (2011). Nassim Nicholas Taleb applied their results and his own finance experience in such volumes as The Black Swan (2007) and Fooled by Randomness (2001). And the Swiss novelist Rolf Dobelli compiled in The Art of Thinking Clearly (2013) an exhaustive and exhausting list of nearly a hundred biases.
If you think, or have to deal with people who think, it is good to be aware of these biases. Note that, as with many biases, there may not be any problem with inclinations of thought. We humans were once prey and accordingly our habits of risk aversion and groupthink were survival skills—those humans who embraced risk and went lone wolf are not part of our gene pool. As Dobelli says, “Not all cognitive errors are toxic, and some are even necessary for living a good life.” But like our craving for fats and sweets, many of these understandable evolutionary quirks are no longer positive for us. “In the modern world, this intuitive behavior is disadvantageous. Today’s world rewards single-minded contemplation and independent action.”
These are large books and a bit repetitive and tedious to work through. Here is my more concise guide to the biases cited by Dobelli. I think it is useful to run through these from time to time. I even stop at many of these waypoints and consider when I or others have exhibited them. In any event, here we go.
1. “I’m a winner because I look like the winners”: Survivorship bias. We see the publicity about the surviving winners, not all the many disappeared losers. Very few books are written about the failures. The Dow Jones Industrial “Average” sheds the companies that fall behind and adds the ones that have recently risen. We see that we are like the winner in some senses, and therefore infer that we ourselves are a winner. But we may be just like the losers too, the losers we don’t see, maybe even more so. Don’t overestimate your chances of success based on your similarity to examples of success.
2. “If I swim I’ll get a swimmer's body”: Swimmer’s body illusion. Swimming doesn’t make your body a trapezoid—trapezoid-shaped bodies make good swimmers. Schools admit bright people, and then take credit for their success after graduation. People can’t necessarily make themselves optimists—those who see the glass half full might have just been born happy.
3. “That cloud looks like a horsie”: Clustering illusions. (Quoted from one of my favorite Peanuts strips, featured in my book about making stories about the Big Dipper stars.) We see patterns in data distributions that are actually the result of random processes.
4. When one wildebeest bolts, they all bolt: Social proof or herd instinct. The more people that follow a trend, the more join it, and the more we believe the trend. Compare W. Somerset Maugham: “If fifty million people say something foolish, it is still foolish.”
5. “We must keep going because we’ve spent so much so far”: Sunk costs. Our acquisition cost should not be relevant to a later hold/sell decision, but we hang on to that notion when making the further choice. Also called the "Concorde effect”: the British-French Concorde supersonic airliner project was heavily subsidized and continued to receive funding, despite clear indications of financial failure.
6. “Here is a gift, are you grateful?”: Reciprocity. People give you something expecting you to give back. A fundraising mailer encloses a dollar or postcards, increasing the likelihood you will contribute. Or a professional firm offers skybox seats to someone who will make procurement decisions tomorrow. Or on the negative side, people retaliate tit for tat. (The converse strategy is to be truly philanthropic, or to turn the other cheek and avoid retaliation.)
7. “I’ve already made my mind up, don’t confuse me with new facts”: Confirmation bias. When you strain to interpret or ignore new facts in order to be consistent with your existing belief. (The converse strategy is always to be on the lookout for counterexamples against any tentative belief you might hold.)
8. “Murder your darlings.” Quiller-Couch’s admonition applies when you become overly attached to “precious” words, expressions, plots, and the like, which may not be suitable for an audience broader than yourself.
9. “I was following orders”: Authority bias. We respect authority even when we know it is wrong or just plain nuts. In introductory psychology classes, we learned of Stanley Milgram’s obedience-shock experiments and the prison experiment of Philip Zimbardo (my professor). Co-pilots defer to the pilot even as the plane is plummeting. (The converse strategy is to question authority.)
10. Relative size of the prize: Contrast effect. We tend to judge the impact of an item by the size of the associated prize relative to the base value. We walk a long way to save $10 off a $15 purchase, but not to save $50 on a $10,000 purchase. A 30% discount from an [inflated] price seems like a bargain to us, even if an undiscounted non-inflated price is lower. It is hard for us to spot small gradual changes in value, like the heavy touch of a pickpocket that draws our attention away from the light-handed theft; and inflation saps value over the years.
11. “I decide based on what is in front of me”: Availability bias. We tend to rely on anecdotal data points like a recent plane crash, act of terrorism, or strange disease. Corporate boards discuss what is on the typed agenda, not something that is obvious out the window but not in a line item. According to Dobelli the Black-Scholes option valuation system hasn't worked for at least ten years, but since there is no emerging standard alternative, we still use it.
12. “Things will be worse before they get better” fallacy. If a plan starts off badly, we tend to stick with it expecting the tide to turn. The claim is hard to disprove: if things get worse you’re right, and if they get better you’re also right. It applies, for example, to International Monetary Fund (IMF)-style austerity programs.
13. Story bias. We try to fit facts into a particular story. “Versailles led to World War II.” Maybe there isn't a story, or isn't that story.
14. Hindsight bias. To ward it off, record all your predictions, not just the ones that come true. In early 2007, most experts predicted ongoing rosy economies around the globe. In 2008, those same experts pointed to imprudent monetary policy, flawed rating agency practices, low capitalization, and margin securities practices.
15. Overconfidence effect. Bidders on construction contracts tend to lowball. Owners like to hear the siren song of a low initial price; then they regret the change orders. The Sydney Opera House was estimated in 1957 at $7 million with completion by 1963; it actually cost $102 million and opened in 1973.
16. Chauffeur knowledge. People who look good or repeat things, like TV news anchors, or are extremely confident of what they say, like your Lyft driver, may not know what they are talking about. (The contrary strategy is to rely on the thoughts of folks acting in their “circle of competence.”)
17. Illusion of control. Throw dice really, really hard to get high number, softly to get low number. Is that “close door” elevator button even connected? On the other hand, the irrational idea that you can influence destiny keeps you going and may help you survive (Viktor Frankl).
18. Incentive super-response. The moral hazard of perverse incentives. Pricing a service using hourly rates. Don't ask a barber if you need a haircut. Banning a book makes it more attractive and publicized (also see the Streisand effect where trying to censor something enhances its circulation).
19. Regression to mean. Nuff said. The “hot hand” in basketball doesn’t test well. Everything over a long term tends to even out.
20. Outcome bias (akin to survivorship bias?). After the fact, Pearl Harbor, the 1973 Arab invasion of Israel, and 9/11 all seem obviously predictable, but at the time it was not clear anyone could put the clues together. Small sample sizes pose a similar risk. (The contrary strategy is to focus on process, not on results.)
21. Paradox of choice. Too many choices leave us less happy. Dating sites display so many differences in characteristics that they overwhelm us, and decisions reduce to some key single attribute like physical attractiveness or wealth.
22. Liking the salesperson. Attraction or similarity, or the salesperson shows they like us. Compliments by salespeople work wonders. The World Wildlife Fund advertises with photos of cute mammals, not endangered insects.
23. Endowment. We all want to sell at a price that is higher than the price at which we would buy the same thing. (The converse strategy is not to cling to possessions.)
24. Anti-coincidence. We have trouble believing that an important pair of events are just matters of coincidence. (The converse strategy is to consider probability of the inputs happening without the output, and of the output happening without the inputs.)
25. Groupthink. Speak your mind.
26. Neglect of probability. We respond to magnitude, not likelihood. We gamble on winning a $100 million lottery regardless of the odds, but we will do practically anything to avoid a $10 downside.
27. Scarcity error. Rara sunt cara (rare things are valuable.) Cookies in an almost empty box seem to taste better. Phrases like "Only while supplies last" and sales “today only” work. Red dots on most tags make us think we are getting a bargain. What is forbidden is desired. Severely drunk teenagers appear in US colleges, not European ones, because alcohol is an illegal American thrill.
28. Hoofbeats mean horses, not zebras. We sometimes imagine some exotic and shocking cause. (The converse strategy is to use the “base rate,” not superficially relevant factors with much lower probabilities. The medical-education expression is “When you hear hoofbeats, expect the horse, not the zebra”—in a diagnosis, think of the most likely illness, not some exotic one.)
29. Gambler's fallacy. Independent events don't harmonize. The coin doesn't know it has come up tails 100 times in a row—the next flip is still 50/50. (Though consider the possibility of a loaded coin!) In contrast, weather and stocks may or may not be independent.
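The independence claim is easy to check empirically. Here is a minimal Python sketch (the flip count and seed are arbitrary choices of mine): after a run of five tails, the chance of heads on the next flip still hovers around 50%.

```python
import random

random.seed(42)

# Simulate a long run of fair coin flips. After every streak of five tails,
# record whether the NEXT flip came up heads. If flips are independent,
# that rate stays near 50% regardless of the preceding streak.
flips = [random.random() < 0.5 for _ in range(500_000)]  # True = heads

after_streak = [flips[i] for i in range(5, len(flips))
                if not any(flips[i - 5:i])]              # prior five all tails

heads_rate = sum(after_streak) / len(after_streak)
print(f"P(heads | five tails in a row) ≈ {heads_rate:.3f}")  # near 0.500
```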
30. Anchor. Ask someone to recite the last two digits of their social security number, then ask them to bid on something—that random two-digit number actually affects their bid! Recommended prices work the same way. When a teacher grades an exam knowing it came from an A student, the same work gets better marks than it would from another student.
31. Induction. The goose thinks the farmer feeding it is looking out for it. Taleb suggests that stockbrokers mail out differing predictions, then keep writing only to those recipients whose predictions proved accurate, round after round; after several amazing recommendations, the remaining recipients will think the broker is a genius.
32. Loss aversion. In caveman subsistence times, one mistake was death. We carry that to modernity. People don't want to realize a loss even though further decline is likely.
33. Social loafing. When you can't measure individual effort, each has an incentive to loaf. We are rowers in a collective activity, not relay runners whose individual effort can be observed. (The converse strategy is to staff teams with specialists because we can see their individual effort.)
34. Exponential growth. True exponential growth for any period of time sneaks up on you; dividing 70 by the growth rate in percent gives the years to double. So after seven years of 10% inflation you have lost half your asset’s value. But be wary of thinking exponential growth continues forever. If something is too good to last, it will stop.
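The rule of 70 is only an approximation to compound growth. A quick sketch comparing it with the exact doubling time (the function names are mine):

```python
import math

def doubling_time_rule_of_70(growth_pct: float) -> float:
    """Rule-of-70 approximation: years to double at growth_pct percent per year."""
    return 70 / growth_pct

def doubling_time_exact(growth_pct: float) -> float:
    """Exact doubling time from compound growth: solve (1 + g)^t = 2."""
    return math.log(2) / math.log(1 + growth_pct / 100)

for g in (2, 7, 10):
    print(f"{g}% growth: rule of 70 -> {doubling_time_rule_of_70(g):.1f} yr, "
          f"exact -> {doubling_time_exact(g):.1f} yr")
```

At 10% growth the rule gives 7.0 years against an exact 7.3, close enough for mental arithmetic.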
35. Winner's curse. You probably paid too much. Warren Buffett’s advice on going to auctions is “don’t go.”
36. Attribution of events to a human. The president takes credit or blame for an economy dependent on hundreds of factors. Individual people are often a small part of the picture. In our dim past it was very useful to attribute events to human agency (someone must have offended the gods), but not so much today.
37. False causality. Do more women on boards lead to more profit, or does more profitability lead companies to recruit women or make it easier to recruit women? Was Alan Greenspan a genius, or did he just have the good fortune to be in office during a period of growth?
38. Halo effect. One input like stock price dazzles us into thinking the company has unique expertise on other inputs. Beauty, celebrity, prior success. Why do we trust Roger Federer’s advice on coffee machines?
39. Alternative paths. Run out all the paths, not just the paths followed by the winners.
40. Forecast illusion. Most forecasts are crap. Look at all of them, not just the random few that are correlated with actual events. There is no penalty on forecasters who are wrong.
41. Conjunction fallacy. Again, look at the base rate of each factor and the factors’ interdependence, not at combinations. Which is more common, a banker who knits or a knitter? The additional condition tells a story, but decreases the odds. “Seattle airport closed due to weather” gets more “more likely” votes than “Seattle airport closed.”
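The conjunction rule, that P(A and B) can never exceed P(A) alone, can be sanity-checked with made-up numbers. All probabilities below are hypothetical, chosen only for illustration:

```python
# Hypothetical, illustrative probabilities (not real survey data):
p_knitter = 0.03              # P(a random person knits)
p_banker = 0.01               # P(a random person is a banker)
p_knits_given_banker = 0.10   # P(knits | banker)

# The conjunction "banker who knits" can never beat either single condition:
p_banker_who_knits = p_banker * p_knits_given_banker
assert p_banker_who_knits <= p_banker
assert p_banker_who_knits <= p_knitter  # holds for any consistent numbers

print(f"{p_banker_who_knits:.3f}")  # 0.001
```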
42. Framing. Upside (a drug will save 200 out of 600 lives) vs. downside (a drug will still leave 400 dead) makes a difference to us. A “99% fat free” label gets more attention than a “1% fat” label.
43. Action bias. The goalie dives left or right even though a third of penalty kicks go straight down the middle. Police officers and physicians favor acting over sitting pat. It feels better to look active even if the action achieves nothing or is counterproductive. Pascal: “Humanity’s greatest problems arise from man’s inability to sit quietly in a room alone.”
44. Omission bias. Let some die through inaction? Or actively kill some? The second choice is considered worse (see the “trolley problem”). Issuing a DNR order is ok, but euthanasia is criminal.
45. Self-serving bias. Seen throughout corporate annual reports. Bad outcome is chalked up to exogenous factors, while a good outcome is due to management. (The converse strategy is to ask someone with an adverse interest to comment.)
46. Hedonic treadmill. Once you achieve a goal, you don’t feel happy, you have a new discomfort. (The converse strategy is to avoid the negatives you can avoid—noise, commutes. Expect happiness from material things to be limited. Instead, strive for free time, follow your passions.)
47. Self-selection bias. My line always seems to move slower. Campfire smoke always comes my way.
48. Association bias. Wear my lucky thong to keep my hitting streak alive. Shoot the messenger.
49. Beginners' luck. Early victories are a curse, not a blessing; just ask Napoleon and Hitler. Don’t let early, easy victories alter your attitude.
50. Cognitive dissonance. Sour grapes. “I didn’t want [what I didn’t get or couldn’t have] anyway.”
51. Hyperbolic present-value discounting. “Live as if it is your last day” is lousy advice. We overweight immediate rewards, implicitly accepting sky-high interest rates for them: witness credit card interest and the profit margin on next-day delivery charges.
52. Because. Merely supplying a reason, any reason, helps people absorb a psychological impact. “Stocks fell because of employment data.” The reason is often bogus.
53. Decision fatigue. Decisions drain willpower. IKEA has restaurant in middle of store. Early in day, CEO will do hard work on decisions; later in day, may just approve or veto things.
54. Contagion bias. The medieval display of saintly relics inspired oath-making and oath-keeping. We shun objects used by those we dislike.
55. Averages. Cross a river that is on average 4 feet deep? It could be a fatal 20 feet deep in the middle for a short stretch. Watch for extreme cases that can sink you. William Gibson: “The future is already here, it is just not very evenly distributed.”
56. Motivation crowding. Charging daycare parents for delay increases tardiness. Paying for blood donations reduces the incidence of volunteering. The monetary incentive drives out the peer pressure for what is otherwise a social activity.
57. Twaddle tendency. Miss South Carolina blathered because she was obligated to speak. Habermas and Derrida have no such excuse. The travesty of sports interviewers asking the winner “how it felt” is universal. If you talk a lot and use big words, people think you are smart. Mark Twain: “If you have nothing to say, say nothing.”
58. Will Rogers on IQ. His joke: those who migrated from Oklahoma to California raised the average IQ of both states. Earlier cancer diagnosis shifts more patients into the Stage 1 statistics, improving the measured survival of every stage even when no one lives longer; sometimes progress is illusory.
59. Informational bias. Borges map (1:1, everything at actual size) yields no new information for action. Daniel Boorstin: “Our principal obstacle is not ignorance, it is the illusion of knowledge.” Mark Twain: “It’s not what you don’t know that gets you, it’s what you think you know but don’t.”
60. Effort justification. Effort in must lead to value out, right? Early cake mixes with powdered egg left the homemaker with nothing to do. Manufacturers took the egg out of the mix so the home chef had to add one, and sales increased.
61. Law of small numbers. When the denominator is small, any fluctuation looms large in the rate. A small city will show larger swings in its results, up or down, than a larger city does.
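The small-denominator effect is easy to simulate. In this sketch (the population sizes, incidence, and seed are arbitrary choices of mine), a town of 50 and a city of 2,000 share the identical underlying 10% rate, yet the town's observed yearly rate swings far more widely:

```python
import random

random.seed(1)

def yearly_rates(population: int, p: float = 0.10, years: int = 500) -> list[float]:
    """Observed incidence rate per year, from the same underlying probability p."""
    return [sum(random.random() < p for _ in range(population)) / population
            for _ in range(years)]

def spread(rates: list[float]) -> float:
    return max(rates) - min(rates)

small_town = yearly_rates(50)
big_city = yearly_rates(2000)

print(f"small town spread: {spread(small_town):.3f}")  # wide swings
print(f"big city spread:   {spread(big_city):.3f}")    # much tighter
```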
62. Expectations. If profits are up but miss the analysts’ “number,” the stock price falls. The placebo effect works the same way.
63. Simple logic. Old tricks like the two objects costing $1.10 together, one of which costs $1 more than the other, or the lily pond whose cover doubles in size every day and is only half full the day before the end.
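Both old tricks yield to a line of arithmetic. A minimal check (the 48-day pond is my assumed figure, since the item doesn't fix one):

```python
# Bat-and-ball: two items cost $1.10 together and one costs $1.00 more
# than the other. Intuition shouts 10 cents; algebra says 5:
#   x + (x + 100) = 110  (in cents)  ->  x = 5
cheap_cents = (110 - 100) // 2
print(cheap_cents)             # 5, not the intuitive 10
assert cheap_cents + (cheap_cents + 100) == 110

# Lily pond: the cover doubles daily. Suppose (hypothetically) it fills
# the pond on day 48; then it was half full on day 47, one doubling
# before the end -- not on day 24.
day_full = 48
day_half_full = day_full - 1
print(day_half_full)           # 47
```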
64. Forer effect. Astrology advice somehow applies to one-twelfth of the human race. Ask "who doesn't have this outlook?"
65. Volunteer's folly. We volunteer in an area outside our expertise in lieu of working for greater compensation, even though the rational thing might be to work for the higher compensation and just donate the money. Rationality isn’t everything, though; this one is a good folly, it centers us. Also, celebrities (and others) get reputational benefits from being seen doing good work.
66. Affect heuristic. Here, “affect” means a snap momentary judgment to like or dislike something. We don't tote up pros and cons to make big decisions, or at least if we do make pro/con lists they are not authentic. If you already know you like something, you will list more pros. Often, it is your last impulse that rules.
67. Introspective illusion. We are confident of our own beliefs, even when new facts emerge that confront or even contradict them. We react that the source bringing the unpleasant conflicting facts must be ignorant, or idiotic, or malicious. (The converse strategy is to be a tougher critic of yourself and your beliefs, which should in many matters be tentative, not rigid.)
68. Inability to close doors. Too many choices and opt-out options paralyze us. Cortés scuttled his ships upon reaching Mexico. Keeping options open has its own real costs.
69. Neomania. Affection for the new shining thing over older, more tested items. We think the world 20 years from now will have flying cars, but most daily living will be the same.
70. Sleeper effect. Propaganda is initially ineffective, but over time, as we forget its dubious source, the message seeps into the consciousness of the audience. (The converse strategy is to remember the sources of your own beliefs.)
71. Alternative blindness. A business school touts the first salary success of its graduates. Did the school cause the success? What was the cost of going to school (tuition, foregone income)? What is the graduates’ long-term outlook? What about arguments that spending public money on a sports stadium will elevate a city’s economic health—compared to what? (The converse strategy is always to compare one alternative fully to the second best at that time—even if second best is “do nothing” or “status quo.”)
72. Social comparison. There is an inclination to hire people just like you. (The converse strategy is to hire people smarter than you, to hire people who provide a diverse team. When Isaac Newton’s professor Isaac Barrow saw his work, Barrow became Newton’s student!)
73. First and last impressions. We hear the first or latest items in a list most clearly, the ones in between most dimly. This applies to job interviewees, books, and judicial decisions over a long morning. We form judgments about the quality of professionals from seeing the model of auto they drive. (The converse strategy is to take notes evenly throughout the whole roster to make sure you don’t shortchange the ones in the middle.)
74. Not-invented-here. We tend to fall in love with our own ideas, our institutions, and our home team.
75. Black swan. Taleb’s famous example. “All swans are white” had been a typical example in logic; then in 1697 a black swan was seen in Australia. Unthinkable things happen: California gold rush, collapse of USSR, the transistor, the Internet. Donald Rumsfeld trio still applies: we have known knowns, known unknowns, and unknown unknowns. (The converse strategy is versatility—the artist, the inventor, the entrepreneur with scalable and flexible dreams will survive. In times of unrest, avoid debt, invest, get used to modest living. I am not sure it is worth following this advice.)
76. Domain dependence. An expert in one area is not an expert in another (physicist William Shockley opining on race and intelligence).
77. False consensus. If I like 1980s music, I am sure that others will too.
78. Falsify history. In Nineteen Eighty-Four (George Orwell, 1949), Winston Smith’s job was to correct historical writings to fit the present narrative. That resonates today. We fudge known historical facts to suit ourselves.
79. In-group affinity, out-group stereotypes. We “root, root, root for the home team.”
80. Risk vs. Uncertainty: Different types of ambiguity aversion. A “risk” is where the probability of outcomes is known, and over a medium term the outcomes can be calculated. Survival rates with various medical conditions, for example. “Uncertainty” is where the probabilities are unknown. Like the stock market. (The converse strategy is to distinguish between the two, and more generally to calculate risk and to tolerate uncertainty.)
81. Status Quo. We pick the “house red wine.” Default option is the nudge. Cass Sunstein and others urge changing the default (on health care, insurance and many other choices) to fit prudent social policy. That is a good thing, but it is also something to be on the watch for.
82. Last chance regret. Being offered a last option right before we decide tempts us to change. The perplexing Monty Hall problem doesn’t really replicate in practice.
83. Salience effect. We tend to feel that a notable event must have a notable cause (e.g., assassination conspiracy theories). A murder by an immigrant from one country makes us suspicious of everyone from that country, or of all immigrants.
84. House money. Found money is often put into risky investments, spent on luxuries, or simply squandered. Coupons and discounts induce you to buy even more, spending well beyond the amount “saved.”
85. Procrastination. We tend to delay work on things that are important but not urgent. Writer’s block. The cause is the delay between the drudgery of “sowing” and the rewards of “reaping.” (The converse strategy is to impose deadlines, especially from an external source like a buddy.)
86. Envy. Balzac called it "the most stupid of vices." Aristotle wrote, “Potters envy potters.” To combat it, allow envy only in the form of emulation: aspire to become the person you admire.
87. Personification. Statistics about the breadth of a famine do not grab us as much as a photo of a child or a coffin. Be human and humane, but also look at the statistics.
88. Illusion of Attention. The famous gorilla-dribbling-basketball video; people focused on the movement of the ball don’t see the primate strolling through the shot.
89. Strategic misrepresentation. The more that is at stake, the more we tend to exaggerate. This shows up in construction megaprojects and gigaprojects. Bent Flyvbjerg chronicles megaproject disasters.
90. Overthinking. Analysis paralysis. Buridan's ass. (The converse strategy is to trust your gut instincts especially when a decision is in your circle of competence.)
91. Planning fallacy. We tend to overestimate benefits, underestimate costs and risks. (The converse strategy is to cross-examine the upside and downside, compared with a concrete alternative such as “do nothing.” A good question from a CEO is to ask the eager proponents of a plan, "It's a year from now. The project was a failure. What happened?")
92. Professional oversight. A hammer sees the world as a series of nails to be hit. Surgeons think of surgical solutions. “Déformation professionnelle.” (The converse strategy is to be a Swiss Army knife.)
93. Zeigarnik effect. We tend to remember uncompleted tasks and discard our memories of the completed ones. “Memory in, memory out.” (The converse strategy is to break jobs down into minitasks that can be checked off.)
94. Illusion of skill. A CEO at the helm of a successful company may have a good managerial record that is a function of the company, not the person. If a specific observable skill is not present (as it is with, say, an athlete), consider that the context produced the result.
95. Feature positive. The presence of a thing is more noticeable and measurable than the absence of a thing. We don’t think about the missing asset, or the undisclosed liability. We don't celebrate proofs of impossibility.
96. Cherry-picking. Unachieved goals aren't mentioned. We adjust the “budget” to reflect the actual results. We shoot the arrow, then draw the bullseye around where it sticks.
97. Fallacy of single cause. Events have lots of causal factors or perhaps no clear cause at all. Free will, to the extent it exists, is constrained by many internal (genetic) and external conditions.
98. Intention-to-treat. Survivors show up in the right category, others in the wrong one (unlike survivorship bias, item 1, where they drop out of the picture altogether). An analyst might point out that companies with debt are more successful than companies without debt. But some of the debt-laden companies went bankrupt, and unprofitable companies can’t get bank loans. There is no proof here that incurring debt will make you successful.
99. Breaking News Illusion. We react disproportionately to news, to celebrity, and to shocking images.
100. Epilogue. It is easier to identify what makes us unhappy than to search directly for happiness; instead, follow the via negativa. Kahneman's “fast thinking” is intuitive. It got us through our evolutionary origins—those of us who trusted our guts are part of the gene pool; those who deliberated, or didn’t stay with the pack, are not. Kahneman’s “slow thinking” is more rational. The point is that both are aimed at saving energy.
We all make snap decisions, and for less consequential things, or things within our circle of competence, that is fine.
For things outside our competence and for large, consequential things, though, take a look at this list of cognitive biases. Make sure you are sure of yourself!