Survivorship Bias: I systematically overestimate my chances of success.
– The vast majority of would-be novelists, rock stars, actors, athletes, entrepreneurs, etc. fail.
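The distortion is easy to see in a toy simulation (all numbers here are hypothetical, chosen only to illustrate the effect): averaging over the visible "survivors" gives a far rosier picture than averaging over everyone who tried.

```python
import random

random.seed(1)

# Hypothetical venture sketch: 1,000 founders each draw an outcome.
# Most bust and vanish from view; a few succeed and get the press.
outcomes = []
for _ in range(1000):
    if random.random() < 0.05:           # assumed 5% "survive" visibly
        outcomes.append(random.uniform(1e6, 1e7))
    else:
        outcomes.append(0.0)             # the silent, invisible majority

survivors = [x for x in outcomes if x > 0]
print(f"mean over everyone:  {sum(outcomes) / len(outcomes):,.0f}")
print(f"mean over survivors: {sum(survivors) / len(survivors):,.0f}")
```

Judging prospects only by the survivors' mean overstates the expected payoff by roughly the inverse of the survival rate.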
Swimmer’s Body Illusion: I confuse cause and effect – attributing to group membership a characteristic that actually made membership in that group more likely.
– Swimming doesn’t give you a swimmer’s body; professional swimmers are selected because they already have that type of body. Likewise, cosmetics and clothes won’t make you model-beautiful – the people in the ads were already beautiful, which is why they were chosen.
Clustering Illusion: I am oversensitive when it comes to pattern recognition, mistaking noise for signal.
– Whether it’s Jesus in a tortilla, voices in white noise or animal shapes in the clouds, we impose patterns on randomness.
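A quick demonstration of how readily randomness produces apparent "patterns" (an illustrative sketch, not from the source): even a fair coin flipped 100 times almost always contains streaks long enough to look deliberate.

```python
import random

random.seed(0)

# 100 fair coin flips, then find the longest run of identical outcomes.
flips = [random.choice("HT") for _ in range(100)]
longest = cur = 1
for a, b in zip(flips, flips[1:]):
    cur = cur + 1 if a == b else 1
    longest = max(longest, cur)

print("".join(flips[:30]), "...")
print("longest streak:", longest)
```

A streak of five or six heads in a row feels like a signal, yet it is exactly what noise looks like.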
Social Proof: I tend to do or believe something simply because other people do or believe it.
– Social (peer) pressure is so strong it can convince a third of people that a line they know is shorter is actually longer than another.
Sunk Cost Fallacy: I tend to stay with failing efforts because of time or resources already expended on them, not chance of future success.
– Consistency becomes more important than rational assessment of an undertaking’s real chance of success.
Reciprocity: I tend to feel obligated to repay those who give me something (or have revenge on those who do me harm).
– Salespeople, NGOs, religious proselytizers and philanthropic organizations use it all the time – they give, then take.
Confirmation Bias (Part I) – The “Special Case”: I often dismiss contradictory evidence as an “exception” or “special case.”
– “What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.” – Warren Buffett.
Confirmation Bias (Part II) – “Murder Your Darlings”: I tend to seek out only evidence that supports my beliefs and assumptions and ignore or minimize evidence that contradicts them.
– The more vague and nebulous a claim, the stronger the confirmation bias. I must work to find disconfirming evidence.
Authority Bias: I would tend to painfully shock an innocent person if someone in a white coat told me to.
– Crowns, suits, white coats, faces on magazine covers, titles – all of it is taken way too seriously and should be challenged.
Contrast Bias: I tend to judge a thing as more than it is when something lesser is presented alongside it, or vice versa.
– I don’t notice gradual changes, am easily distracted by large contrasts and alter my perception based on the context.
Availability Bias: My model of the world is distorted by what comes easily to mind.
– We remember the “last dog that bit us,” the dramatic case, the easiest or first explanation, or consider only the options or agenda presented to us.
“It’ll-Get-Worse-Before-It-Gets-Better” Fallacy: If things do get worse, I believe the prediction was accurate; if they don’t, I’m happy and forget the prediction was made.
– Variation of the confirmation bias. Vague predictions of things getting worse are sometimes true, giving credibility; but if the prediction is wrong, credibility is not lost because everybody’s happy.
Story Bias: I believe oversimplified, understandable narratives of events and neglect the complexity and uncertainty of reality.
– Our lives are much more chance-based and complex than we believe; the same is true for history.
Hindsight Bias: I believe I am a much better predictor than I actually am.
– Everything seems “obvious” in retrospect; historical events (personal and global) seem more inevitable than they really are.
Overconfidence Effect: I tend to think I am better, more talented and more skilled than I really am.
– We all think we are above average; even pessimists overestimate their skills – only less so than optimists.
Chauffeur Knowledge: I tend to mistake superficial, limited knowledge for true knowledge in myself and others.
– We all have circles of competence – things we know truly well – then everything else outside that circle. The key is knowing where the boundary lies.
Illusion of Control: I tend to believe I control more about the world than I actually do.
– We believe we have control over things and situations over which, in fact, we have no influence. Still, some sense of control preserves hope; the key is knowing which few things in life we actually can control.
Incentive Super-Response Tendency: If you pay me by the hour, I’ll work slower.
– Where intentions and rewards are mismatched, incentives produce the opposite of the intended effect. People generally act in their own best interests: they take the easy route, minimize effort and maximize reward.
Regression to the Mean: I forget that pain subsides, extremes correct themselves and things average out.
– We take too much credit for our interventions when, in fact, any extremes generally correct themselves no matter what we do.
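A simulation makes the effect concrete (a minimal sketch with assumed normal "skill plus luck" scores, not from the source): select the worst performers on one test, do nothing at all, and they still improve on the re-test – purely because their bad luck doesn't repeat.

```python
import random

random.seed(42)

# Each person's test score = fixed ability + random luck on that day.
n = 10_000
ability = [random.gauss(0, 1) for _ in range(n)]
test1 = [a + random.gauss(0, 1) for a in ability]
test2 = [a + random.gauss(0, 1) for a in ability]  # no intervention between tests

# Pick the bottom 10% on the first test.
cutoff = sorted(test1)[n // 10]
worst = [i for i in range(n) if test1[i] <= cutoff]

mean1 = sum(test1[i] for i in worst) / len(worst)
mean2 = sum(test2[i] for i in worst) / len(worst)
print(f"worst group, test 1: {mean1:.2f}")
print(f"worst group, test 2: {mean2:.2f}")
```

The group's second score moves markedly back toward the average with zero intervention – which is exactly why any remedy applied to the extreme group looks like it "worked."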
Outcome Bias: I judge decisions by their results, rather than by the process by which they were reached.
– Bad results don’t necessarily mean a bad decision, nor good results a good one. What matters is whether the decision process was rational and sound.