Cognitive biases are expensive for organizations. These dangerous judgment errors happen all the time in business, often during critical decision-making. Research shows that about 80% of mergers and acquisitions fail, destroying value rather than creating it. Most product launches fail as well, often taking down the successful leaders who staked their reputations on the new product. Major projects choke under cost overruns: a 2014 study of IT projects found that only 16.2% met their original planned resource expenditure. Of the 83.8% that did not, some failed outright, and among those that limped to the end, the average cost overrun was 189%.
We all suffer from these cognitive biases, or mental blindspots, because of how our brains are wired. Under siege or simply overwhelmed, our autopilot system reverts to absolutes: yes/no, opportunity/threat, attraction/aversion. That binary perspective closes us off from data and other points of view at exactly the moment we most need to be open to them.
But recent research in cognitive neuroscience and behavioral economics is showing that such judgment errors can be overcome using careful, intentional approaches. One such effective strategy is probabilistic thinking — and it’s a powerful antidote to our over-reliance on trying to see everything in black and white.
Probabilistic thinking is also called Bayesian reasoning, after the Rev. Thomas Bayes, whose theorem, Bayes' theorem, says we can evaluate the probability of an event based on prior knowledge of conditions that might be related to it. As more information becomes available, we update our beliefs, working step by step toward a more rational assessment.
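As a rough illustration of the update the article describes (the function name and the numbers here are my own, not from the source), Bayes' theorem can be written as a one-line belief update:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem.

    prior               -- P(H): belief before seeing the evidence
    p_evidence_if_true  -- P(E | H): chance of the evidence if H holds
    p_evidence_if_false -- P(E | not H): chance of the evidence otherwise
    """
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Starting from a 50/50 prior, evidence that is four times likelier
# under the hypothesis pushes the belief to 80%.
posterior = bayes_update(0.5, 0.8, 0.2)
print(round(posterior, 2))  # 0.8
```

The point is not the arithmetic but the habit: beliefs are held as probabilities, and each new piece of evidence shifts them by a defensible amount rather than flipping them to yes or no.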
For instance, say your business partner said something that seemed hurtful. Your intuitive response might be to defend yourself immediately, confront her about what she meant, or say something mean in return. That could have all manner of consequences: for your organization, for your own position in the company, for your project. But with a probabilistic thinking approach, you could step back and weigh the likelihood that your business partner meant to hurt you against the likelihood that a miscommunication occurred. You could then seek further evidence to update your beliefs either way.
Imagine she is looking at the ledger for the month and says, "Wow, our electric bill is really high this month." Imagine, too, that you like the office to be warm in the winter, so you set the thermostat high. It would be easy to take the comment as an attack, feel slighted, and respond with something hurtful. You might point out she hasn't been bringing in much business lately: "Well, we wouldn't have to worry about the size of the bill if we had more money coming in." Of course, drama would follow.
By contrast, consider the probabilistic thinking approach. After assessing how likely it is that she would actually want to hurt you, look for more evidence before responding. That means asking: "Are you concerned about the electricity costs of my setting the thermostat high?" What if her response is, "Well, the electric bill is about twice as high as last month. It's true that you set the thermostat, but I wonder if the electric company just screwed up? Or estimated based on another bill? Why don't I call them tomorrow?" Conflict averted.
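The thermostat scenario can be sketched as two sequential Bayesian updates. All the probabilities below are invented for illustration; the source gives no numbers:

```python
def bayes_update(prior, p_if_hostile, p_if_innocent):
    # P(partner meant harm | evidence) via Bayes' theorem
    return (p_if_hostile * prior /
            (p_if_hostile * prior + p_if_innocent * (1 - prior)))

# Prior: you know your partner well; hostile intent is unlikely.
p_hostile = 0.05

# Evidence 1: she mentions the high bill. A hostile partner would
# almost surely bring it up (0.9), but so might anyone reviewing
# the ledger (0.5), so this moves the needle only a little.
p_hostile = bayes_update(p_hostile, 0.9, 0.5)

# Evidence 2: asked directly, she suggests the electric company may
# have erred and offers to call them -- very unlikely if hostile.
p_hostile = bayes_update(p_hostile, 0.1, 0.9)

print(round(p_hostile, 3))  # well below the already-low prior
```

Notice that the first piece of evidence is ambiguous and barely changes the estimate; it is the cheap follow-up question that does the real work, which is exactly why gathering more evidence before reacting pays off.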
Launching Small Experiments
Try launching experiments to gain additional information instead of going on gut reaction — since our instincts tend to cause us to be vastly overconfident about what reality actually looks like. Launching small experiments is a low-cost way to correct our evaluations of our business environment. Look for ways you can disprove your theories rather than confirm them, to best counter our tendencies to look only for information that supports our beliefs.
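One way to sketch this idea (a toy model with invented numbers, not from the article): hold your confidence about a success rate as a Beta distribution, run a small, cheap pilot, and let the data, especially disconfirming data, pull your estimate back toward reality:

```python
def run_experiment(prior_alpha, prior_beta, successes, failures):
    """Conjugate Bayesian update of a Beta belief about a success rate."""
    return prior_alpha + successes, prior_beta + failures

def expected_rate(alpha, beta):
    """Mean of the Beta(alpha, beta) belief."""
    return alpha / (alpha + beta)

# Gut feeling: "this pitch lands 60% of the time" -- Beta(6, 4).
alpha, beta = 6, 4
print(round(expected_rate(alpha, beta), 2))  # 0.6

# A small experiment: 10 pitches, only 2 land.
alpha, beta = run_experiment(alpha, beta, successes=2, failures=8)
print(round(expected_rate(alpha, beta), 2))  # 0.4
```

Because the prior is weak (equivalent to only ten prior observations), one small experiment meaningfully corrects the overconfident 60% estimate, at far lower cost than betting the whole launch on it.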
A key aspect of probabilistic thinking consists of using your existing knowledge about the likely shape of reality (called the base rate probability, also known as the prior probability) to evaluate new evidence. I saw this in action in a recent keynote I did for a group of bank managers. The topic was using de-biasing techniques to improve organizational performance, and how base rates can show where to invest time and energy most effectively.
In question was whether these leaders were mentoring their employees effectively. In a facilitated exercise, they considered how their past mentoring had affected their subordinates. Then they compared the qualities of their current subordinates to those of the subordinates they had mentored before. Finally, they considered whether their mentoring energy was being invested effectively relative to the impact they could have.
Here, base rates referred to their prior experience of investing energy into mentoring and the kinds of outcomes it achieved. The discussion revealed that the current behavior of the bank managers did not match their estimates of employee improvement. In fact, the managers were spending far too much time mentoring the worst performers: about 70% of their time on average.
Yet their prior experience showed that the biggest impact of mentoring came from improving the performance of their best performers, not their worst. Informed by this evaluation of prior probabilities, and by how those priors compared to their current actions, the managers determined that they needed to shift their mentoring energy toward the best performers and connect the worst performers with an outside coach. Even if doing so might strain their relationships with those employees, it would be a far more effective way to improve performance.
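The managers' reallocation decision boils down to an expected-value comparison. The per-hour payoff figures below are hypothetical, chosen only to match the article's qualitative finding that an hour with a top performer pays off far more:

```python
def total_gain(hours, gain_per_hour):
    """Expected performance gain from a weekly mentoring schedule."""
    return sum(hours[group] * gain_per_hour[group] for group in hours)

# Hypothetical payoff rates per mentoring hour, by performer group.
gain_per_hour = {"best": 3.0, "worst": 0.5}

# Current split: ~70% of ten weekly hours goes to the worst performers.
current = total_gain({"best": 3, "worst": 7}, gain_per_hour)

# Shifted split: all hours to the best; worst get an outside coach.
shifted = total_gain({"best": 10, "worst": 0}, gain_per_hour)

print(current, shifted)  # 12.5 30.0
```

Under these assumed rates, the shift more than doubles the expected gain, which is the kind of back-of-the-envelope check that base rates make possible before committing to the change.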
To overcome cognitive bias, we need to apply better approaches, gathering better information step by step toward a more positive outcome. Probabilistic thinking breaks us out of the rut of black-and-white thinking and can turn a potential problem into a rational win.