
If You're Not Aware of These Common Biases, Your Entire Leadership Strategy Is at Risk

Opinions expressed by Entrepreneur contributors are their own.

Seeing may be believing, but it’s not always reality. The other day, on my way to Boston, I was looking out the window at Newark Airport when the shadow of a near-perfect hexagon on the runway caught my eye. The form was so distinct that I assumed the object casting it had to be hexagonal. When I repositioned myself to see beyond the edge of the window frame, all I saw was a large rectangular truck. For a moment, easing from my off-base perception into that new reality, I was shocked at how wrong I had been. Did my eyes fail me? Was my mind playing a trick with shapes and sizes? In the world of Meta, is nothing real anymore?

People imagine that everything about the mind can be biased, but they rarely apply this skepticism to what they see with their own eyes. We trust our eyes so much that eyewitness testimony serves as legal evidence, yet every pair of eyes carries biases depending on how our brains take in and process information. People tend to find someone of the opposite sex more attractive when that person is wearing red. Music can affect the way we see the world and what we expect from it. Biases shape the way we approach everything; leaders, who face difficult and crucial choices every day, can make better decisions by recognizing and regulating some of the most common ones.

Related: 3 Ways to Eliminate Data Biases at Your Company

The rabbit hole of human biases

Biases are abundant and everyone has them, but being aware of our own can reduce the effect they have on our decision-making. Cognitive biases are mental processing errors that limit the mind’s attention and can drive a person’s thoughts or behavior. Sometimes these biases serve as mental shortcuts, called heuristics, that promote quick reactions when a situation demands speed, but leaders need full cognitive control to assess the spectrum of potential risks and threats. When cognitive biases impact their thinking, leaders often end up making illogical or irrational decisions.

Biases develop from an individual’s personal attributes, such as values, memories or socialization patterns, which is why everyone’s biases tend to differ. Without any meaningful confirmation, we may assign whole identities to people. We may pay attention only to certain information or assume everyone shares our perspective. Thanks to the internet, a common bias today is learning a little about a topic and assuming we know everything about it. Multiple biases often feed into our decisions, and the first step to keeping them in check is recognizing them. To control them, however, we need to know how they can change our thoughts.

Related: How Entrepreneurs Can Overcome Confirmation Bias

Know when enough is enough 

Biases can often be shockingly accurate, but as I saw with the hexagonal shadow, they can also be completely wrong; the key is counterbalancing them with enough valid information. Leaders need the patience to gather enough information to feel confident; as more information comes in, biases should hold increasingly less sway, but some information may never become available. In the vast complexity of the business world, no one can wait to consider every possible option, and action bias can drive us to act even when waiting is the better choice. Biases can fill in the blanks so we can act quickly, but they also challenge our ability to act with control. We tend to run on a 20:80 rule, focusing on the 20% of effort that delivers 80% of the impact, but attempting such a big impact with only 20% of the data and a healthy dose of bias can be dangerous.

Too much bias affects how you make decisions even when new information arrives. I assumed the hexagonal shadow had to have a hexagonal source because the availability heuristic led me to rely on the information that most readily came to mind. With more accurate information (the rectangular truck), the new reality was obviously correct. If I had been subject to an anchoring effect, however, I might have gotten stuck in my initial conclusion that the object must be hexagonal by relying too heavily on the first piece of information I received. Confirmation bias could then have driven me to seek a hexagonal object to support what I already believed and to discount the possibility that a rectangular truck could cast that shadow. With enough patience to gather at least 80% of the information in any situation before you act, bias can be helpful without being harmful.

Shut out the noise 

Biases can help leaders sort through a world of too much information, but leaders also need to distinguish important knowledge from noise. From the internet to politicians, biases cause us to heed the loudest voices. In an era when we consume “facts” offered by unknown sources on Instagram, TikTok or Snapchat in bite-sized increments, generalizations take up less brain space than specifics, and as technology advances, our minds increasingly fall back on preconceived biases to filter the flood of incoming data. Top-of-the-line computing can collect massive amounts of organizational information, yet every person on a team, even those in the same role, can let his or her biases evaluate that data differently.

To extract accurate signals and actionable insights from so much information, avoid the biases that can keep you from seeing it clearly. The ambiguity effect leads us to favor options that seem quicker, simpler or better documented over complex ones shrouded in uncertainty. The framing effect occurs when people are unduly swayed by how a piece of information or point of view is delivered and cling to that framing even as new information comes to light. Anyone who has seen a good conspiracy-theory movie knows the clustering illusion, in which people extract patterns and meaning from random data. Letting your thoughts get entangled in these biases can drag you into genuinely bad decisions.

Related: How Entrepreneurs Can Address Unconscious Bias

Necessary is not always popular 

Leaders need to be willing to make difficult and potentially unpopular decisions, but some biases let group preferences or harmony overshadow that need. It’s particularly unfortunate when people and groups refuse to consider or try anything that isn’t their own, whether ideas or products. The Cold War stand-off might have ended sooner if not for widespread reactive devaluation, which caused popular rejection of mutual arms-reduction proposals. This is related to Not-Invented-Here (NIH) syndrome, in which people or organizations avoid using things they did not create themselves. When a Kodak engineer invented the first digital camera, the company’s leadership caught a case of NIH syndrome: protective of a business model that depended on consumables (film, developer and prints) for the bulk of its profits, they killed their own innovation, and the company went bankrupt in 2012.

From Kennedy’s Bay of Pigs disaster to Nixon’s Watergate, entire groups of leaders have bought into a bad decision-making bias called groupthink. This hasty and irrational form of decision-making limits critical evaluation because dissent seems to jeopardize group cohesion. Group members believe in their own or their leader’s expertise to the point of feeling invulnerable and may even pressure others into conformity. Surrounded by nothing but a bunch of “yes men,” even a leader of 330 million citizens can easily get away with making terrible groupthink decisions.

To combat group biases, leaders need voices of dissent: reasonable people brave enough to counter flaws in a plan, propose alternative solutions or devise new ways to approach a problem. In today’s world of viral videos and social media, a common form of groupthink is the bandwagon effect: People believe something to be true only because so many others already believe it. During the pandemic, mass toilet paper hoarding ended up causing the very shortages people feared and left those who jumped on the bandwagon managing stockpiles. To avoid these popularity biases in a group, you must be confident enough to bring in fresh perspectives and dissenting voices that challenge your assumptions.

Everyone has biases, but to be more effective, leaders need to keep them from clouding their decisions. Science shows that humans go wrong in predictable ways, and relying on biases is one of them. Law enforcement officers, research scientists and medical professionals are all susceptible; we all need to discipline ourselves to avoid falling prey to bias. We may never be able to eliminate our reliance on cognitive and perceptual biases, but if we can identify them and the places where they commonly ensnare us, we can be better prepared to control how they impact our decisions.
