The Flawed Brain That Tricks The Mind

When you hear the word bias, what comes to mind? Racial prejudice? Or maybe you think of mainstream media or a particular political persuasion.  But I know… you’re not biased, right? Sorry, you are. We ALL are.

The whole idea of cognitive biases – the mental shortcuts we take to form judgments and make predictions – was introduced back in the 1970s by an unlikely science duo, Daniel Kahneman and Amos Tversky. Early in his career, Tversky was a “mathematical psychologist”; he used formal models to characterize human behavior. He felt metaphors were cover-ups – that they “replaced genuine uncertainty about the world with semantic ambiguity.” He was organized, meticulous, and highly disciplined. On any given day, the only things on his desk were a notepad, a mechanical pencil, and an eraser. He was also an optimist: “When you are a pessimist and the bad thing happens, you live it twice. Once when you worry about it, and the second time when it happens.”

Kahneman mastered the art of worrying and pessimism, often waking up early in the morning alarmed about something. He claims that by expecting the worst, he is never disappointed. This pessimism shaped his research over the years. In fact, he enjoyed finding his own mistakes. He once said, “I get an extraordinary sense of discovery whenever I find a flaw in my thinking.”

Kahneman and Tversky were brilliant, and they did most of their work together more than thirty years ago. This odd couple changed how we think about how we think. How did two such radically different personalities find common ground, much less become the best of friends? One reason was that Kahneman was always sure he was wrong, while Tversky was always sure he was right. But both were fascinated by the flaws in human thinking.

Daniel Kahneman popularized the idea of “two systems of the mind” to explain how these shortcuts derail rational thinking. System 1 operates quickly and is often completely involuntary. It’s the instant feeling of dread you experience when you see flashing red and blue lights in your rearview mirror, or the way you know someone is happy from their tone of voice.

System 2 is more thoughtful and complex. This type of thinking includes making decisions and focusing your attention on a particular task. System 1 generates suggestions, feelings, and intuitions for System 2. If System 2 endorses those suggestions, they turn into beliefs, and impulses turn into voluntary actions.

According to Kahneman, a lazy System 2 accepts what a faulty System 1 gives it without questioning, and this leads to cognitive biases. Even worse, cognitive overload and psychological discomfort tax System 2, making it even more willing to accept whatever System 1 serves up. This is why we’re more vulnerable to cognitive biases when we’re stressed or tired. The bad news is that because System 1 operates automatically and can’t be turned off, we can’t eliminate cognitive biases. The good news is that when we better understand them, we can recognize them when they happen and sometimes even use them to our advantage.


Perhaps the most significant discovery in Kahneman and Tversky’s work is the claim that we are not as rational as we’d like to think, and that our departures from rational thinking aren’t just possible; they are predictable.

For example, the gambler’s fallacy makes us absolutely certain that, if a coin has landed heads up 10 times in a row, it’s bound to land on tails the 11th time. In fact, the odds are 50–50 on every single flip. Perhaps the most famous example of the gambler's fallacy occurred in a game of roulette at the Monte Carlo Casino back in 1913. The ball fell on black 26 times in a row, and as the streak lengthened, gamblers lost millions betting on red, believing that the chances changed with the length of the run of blacks. Assuming the wheel was functioning properly, the probability of a sequence of either red or black occurring 26 times in a row is (18/37)^(26-1) – the first spin can land on either color, and the remaining 25 spins must match it – or around 1 in 66.6 million.
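
If you want to check that figure yourself, here is a minimal back-of-the-envelope sketch in Python (my own illustration, assuming a standard European wheel with 18 black pockets out of 37 and fully independent spins):

```python
# Rough check of the 1913 Monte Carlo streak probability, assuming a
# European wheel (18 black pockets out of 37) and independent spins.

p_black = 18 / 37          # probability of black on any single spin

# Probability that 26 consecutive spins all land the SAME color:
# the first spin can be either color, and the remaining 25 must match it.
p_streak = p_black ** (26 - 1)

print(f"P(26-spin single-color streak) = {p_streak:.3e}")   # ~1.5e-08
print(f"Roughly 1 in {1 / p_streak:,.0f}")                  # ~1 in 66.6 million

# The gambler's fallacy in one line: however long the streak, the next spin
# is an independent event with exactly the same probability as the first.
print(f"P(black on the next spin) = {p_black:.4f}")         # ~0.4865, always
```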

A related bias called the 'hot hand' fallacy is the belief that your luck comes in streaks. Win on roulette and your chances of winning again aren't any better or worse – they stay exactly the same. But something in human psychology resists this fact, and people often place money on the premise that streaks of luck will continue – the so-called 'hot hand'.
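
A quick simulation sketch (again my own illustration, not part of Kahneman and Tversky's work) makes the point concrete: across a million simulated spins, the win rate on spins that immediately follow a win is no better than the overall win rate.

```python
# Simulate a long run of independent roulette bets and compare the overall
# win rate with the win rate on spins that immediately follow a win.
import random

random.seed(42)
P_WIN = 18 / 37        # e.g., always betting on black on a European wheel
N_SPINS = 1_000_000

spins = [random.random() < P_WIN for _ in range(N_SPINS)]

overall_rate = sum(spins) / N_SPINS
after_win = [spins[i + 1] for i in range(N_SPINS - 1) if spins[i]]
after_win_rate = sum(after_win) / len(after_win)

print(f"Overall win rate:           {overall_rate:.4f}")    # ~0.4865
print(f"Win rate right after a win: {after_win_rate:.4f}")  # ~0.4865 as well
```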

Kahneman maintains that the most effective check on biases doesn’t come from within; it comes from others. Because of our bias blind spot, others can recognize our faulty thinking more easily than we can. He extends this idea by applying fast and slow thinking to organizations: organizations can learn about these invisible forces at work and methodically impose procedures that help avoid errors.

Many leaders take their teams through a postmortem after a project has finished as a way to identify what went right and what didn’t. A “premortem” may be a better strategy for countering optimism bias – the bias that causes us to underestimate the cost, effort, and time it will take to complete a project – by requiring team members to imagine that the project turned out to be a complete train wreck and describe how it happened. This exercise is a great way for people to anticipate a broader range of flaws and obstacles.

Organizations that bring biases to the table, talk about them, and identify them when they occur are much more likely to avoid the potential pitfalls so commonly associated with these unconscious mental shortcuts.

Download my free whitepaper to learn more about the biases that may be undermining your organizational success.

