I often think that if we all become more aware of how we’re acting and choiceful about how we want to be, we’ll live more in alignment with ourselves and with each other. Sure, the future won’t always turn out exactly as we intend, but at least we’ll be consciously trying in a direction we decide upon.
What I often forget, though, is that awareness is not just a consciousness of our intentions and actions, but also a consciousness of our patterns of thinking. We all have a collection of cognitive biases that run invisibly within our heads – biases that consistently guide us away from truthful, clear thinking. They influence us without us ever realizing it; they inform how we see the world and decide to act. All have been documented through extensive social science research. And that’s the craziest part – we know about them, we’ve documented them, and yet we all keep on getting things wrong in the same ways over and over again.
I recently came across a fantastic list of cognitive biases in Michael Shermer’s book The Believing Brain and it reminded me of all the ways I am consistently getting things wrong. I’m always trying to recommit to becoming more aware of not just what I want but how I’m thinking, and this was a good prompt in that direction. They say the solution isn’t getting rid of the bias (which, in my experience, is near impossible), but instead becoming aware of how it might impact you in everyday situations.
As for me, confirmation bias, halo effect and authority bias are always cropping up in my world. Which of these do you see in your life? Which are you actively combating?
With compassion for all our invisible shortcomings!
A Short List of Cognitive Biases
Attribution bias: Tendency to attribute different causes to your own beliefs and actions than to those of others. It comes in at least two forms:
- Situational/dispositional attribution bias: (“I succeed at work because I’m smart and hard-working, but he succeeds because he’s lucky and has the right sponsors.” “I screwed up this recipe because the kids were screaming, but he screwed up the recipe because he’s a horrible cook.”)
- Intellectual/emotional attribution bias: (“I have a well-reasoned ideology behind my conservatism, but you are just a bleeding heart liberal.”)
Authority bias: Tendency to value the opinions of an authority, especially in the evaluation of something we know little about (“Arnold told me that purgatory was never a concept in Catholicism; he majored in religion, so he would know.”)
Availability heuristic: Tendency to assign probabilities to potential outcomes based on examples that are immediately available to us, especially those that are vivid, unusual, or emotionally charged (“If you’re a woman, you’re very likely to get breast cancer. I know two other women who have been dealing with it this past year.”)
Believability bias: Tendency to evaluate the strength of an argument based on the believability of its conclusion (“It seems reasonable that genetically-modified foods cause cancer, so the science is likely right.”)
Confirmation bias: Tendency to seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirming evidence (“See, this USA Today article says that Obamacare is driving small enterprises out of business! But I really question the methodology of that NY Times article on why Obamacare is affordable for even mom-and-pop shops.”)
Consistency bias: Tendency to recall one’s past beliefs, attitudes and behaviors as resembling present beliefs, attitudes and behaviors more than they actually do (“I’ve basically always believed that there’s no way intelligent life could be out there.”)
False-consensus bias: Tendency for people to overestimate the degree to which others agree with their beliefs or will go along with their behavior (“Don’t you think that other people will want to protest owl habitat destruction too?”)
Generalization bias (stereotyping): Tendency to assume that a member of a group will have certain characteristics believed to represent that group without having actual information about that particular member (“She’s a banker; you know what that means.”)
Halo effect: Tendency for people to generalize one positive trait of a person to all the other traits of that person (“She’s beautiful – so she’s probably pretty smart and athletic too.”)
Hindsight bias: Tendency to reconstruct the past to fit with present knowledge. I love its nickname, “creeping determinism.” (“If you look at the indications before the Challenger launch, anyone could have seen the explosion coming.”)
Inattentional blindness bias: Tendency to miss something obvious and general while attending to something special and specific (“I don’t know why all those people were on the corner protesting, but did you see that woman’s blue pants?”) The classic demonstration of inattentional blindness is the “invisible gorilla” selective-attention test, in which viewers busy counting basketball passes fail to notice a person in a gorilla suit walking through the scene.
In-group bias: Tendency for people to value the beliefs and attitudes of those whom they perceive to be fellow members of their group and to discount the beliefs and attitudes of those whom they perceive to be members of a different group (“A friend at work told me that it’s dangerous to walk in the Southwest Corridor at night.”)
Just-world bias (victim-blaming): Tendency for people to search for things that the victim of an unfortunate event might have done to deserve it. (We’ve seen this in spades with recent news coverage on Ferguson, sexual assault on college campuses, bullying, and domestic abuse in the NFL.)
Negativity bias: Tendency to pay closer attention and give more weight to negative events, beliefs, and information than to positive (“It just seems like every time you turn on the news, it’s all bad things happening.”)
Not-invented-here bias: Tendency to discount the value of a belief or source of information that does not come from within (“Sure, but that’s what the consultants said, not what my research turned up.”)
Rosy retrospection bias: Tendency to remember past events as being more positive than they actually were (“Ahh, the good old days. So much better than life since everyone got a cell phone!”)
Self-justification bias: Tendency to rationalize decisions after the fact to convince ourselves that what we did was the best thing we could have done (“It was the middle of the highway during rush hour. If I pulled over to talk to the woman whose car I rear-ended, we both would have been in danger. That’s not worth it for a small fender bender.”)
Status quo bias: Tendency to opt for whatever it is we are used to (“I picked the health plan that I’ve had for years.”)
Sunk-cost bias (escalation of commitment): Tendency to believe in something because of the cost sunk into that belief (“I’ve supported gun control all my life; there is no way I’m going to change that opinion now.”)
Trait-ascription bias: Tendency for people to assess their own personality, behavior and beliefs as more variable and less dogmatic than those of others (“I would definitely change my mind given good data; I’m not so sure about my opponents, however.”)
The examples are my own, but the bias descriptions are sourced from The Believing Brain: From Ghosts and Gods to Politics and Conspiracies – How We Construct Beliefs and Reinforce Them as Truths by Michael Shermer.
And here’s the even crazier part: If you think that these biases don’t apply to you, then you’re succumbing to the ultimate meta-bias: the bias blind spot. That is our “tendency to recognize the power of cognitive biases on other people but to be blind to their influence on our own beliefs.” Annoying, isn’t it?