Cognitive Biases That Ruin Your Workplace & How to Beat Them

Understand common cognitive biases and learn how to combat them to build a healthy and productive workplace.


Recently I rewatched Danny Boyle’s brilliant film about Apple’s iconoclastic founder, Steve Jobs. The first act establishes Jobs' obsession with the 1983 TIME Magazine story about him. He blames Daniel Kottke for revealing the outlandish algorithm he used to argue that he isn’t Lisa’s father. Jobs believes that was the reason he wasn’t named Man of the Year, and why the cover was a drawing of a man and a PC instead of his photo.

In the third act, Joanna Hoffman (played by Kate Winslet) punctures Jobs' reality distortion field (RDF), revealing that he was never in the conversation for Man of the Year. Nor was it a PC on the 1983 TIME cover, but a sculpture of a man and a computer, which would have had to be commissioned months in advance, so the idea that Jobs was swapped out at the last minute was absurd.

The revelation left Jobs in shock, as if the truth had been staring at him the whole time and he had never noticed it.


Cognitive biases, explained

Steve Jobs’ RDF is a type of cognitive bias called optimism bias. The Cognitive Bias Codex categorizes and describes more than 180 cognitive biases that influence and distort our perceptions and reasoning.

Israeli psychologists Amos Tversky and Daniel Kahneman introduced the notion of cognitive bias in 1972. The pair explained that humans use simple strategies or mental shortcuts, known as heuristics, to process information swiftly and make decisions given limited resources.   

Cognitive biases explain why human brains, capable of magnificent intellectual feats, can equally trip us up and lead us to monstrous failures. They're the reason geniuses act foolishly and the smartest people can be spectacularly wrong about so many things.

In the workplace, cognitive biases impact how we make decisions, interact and collaborate with others, and recognize and reward people. Unless we're aware of cognitive biases, we'll keep lying to ourselves and falling into common traps that perpetuate false judgments and misconceptions. Understanding cognitive biases is crucial in building a healthy working culture and running a successful business. 

Below we’ll look at 8 common cognitive biases that can affect your team and projects, along with techniques to overcome them.

confirmation bias

1. Confirmation bias

Confirmation bias refers to the tendency to search for, favor, and focus on information that supports our preconceptions. This need to be right means we're likely to ignore or reject contradictory information. Confirmation bias results in flawed decision-making that can be costly to a business. 

Examples 

If a manager is happy with an employee's performance, they might easily let that employee get away with a mistake that would otherwise have required disciplinary action. Similarly, if a manager is displeased with an employee, they are prone to embrace evidence that confirms the employee's mistakes and weaknesses.


2. Dunning-Kruger effect

Psychologists David Dunning and Justin Kruger first described the eponymous Dunning-Kruger effect in 1999. This cognitive bias refers to our tendency to overestimate our skills and knowledge and underestimate our ignorance and how bad we are at certain things. The belief that we're smarter than we really are, combined with the inability to accurately recognize and correct our poor performance, can lead to grave mistakes and frustrating situations at work.

The Dunning-Kruger effect works both ways. Real experts tend to underrate their abilities. Since they know so much about their chosen subject, they can see the gaps in their knowledge and detect when they make mistakes (even if those mistakes are imperceptible to non-experts). 

Examples

According to a lesson by psychologist David Dunning, when software engineers at two companies were asked to rate their performance, 32% of the engineers at one company and 42% at the other put themselves in the top 5%.

Another example of the Dunning-Kruger effect is a colleague who thinks they're brilliant and confidently tells everyone so. But behind the curtain, other people have to pick up the slack and fix their mistakes. It's dangerous for an organization to promote people like this to leadership positions as this can quickly breed bad decision-making, resentment, and unfair situations, causing poor team collaboration, low employee engagement, and ultimately business failure.

3. Sunk cost fallacy     

Often referred to as throwing good money after bad, the sunk cost fallacy describes our tendency to increase investment in a decision or continue an endeavor based on previously invested time or resources, even if the current costs outweigh the benefits. 

Examples

An entrepreneur invests in a startup idea and becomes so attached to it that even after the business has been unprofitable for a long time, they keep pouring money and resources into it, convincing themselves that they can salvage the situation.

Another example is a company that spends millions of dollars on a marketing campaign that totally bombs. Instead of scrapping the campaign and trying something else, the company keeps running it in the hope of recouping some of its initial investment.


4. Optimism bias

Remember the story about Steve Jobs' reality distortion field earlier? It's an extreme form of optimism bias, which refers to our mistaken belief that we're less likely to experience negative events and more likely to attain positive outcomes. Nobel Laureate Daniel Kahneman describes this cognitive bias in his book Thinking, Fast and Slow: most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be.

Examples

While optimism is a great way to reduce stress and stay happy, seeing the world as far rosier than it is can be dangerous. Optimism bias can drive foolish decision-making and harmful behaviors, such as an unhealthy diet, smoking, overspending, and not wearing a seatbelt while driving. This cognitive bias also explains why 80% of New Year's resolutions fail by the second week of February.

Consider the world of banking and finance as another example. Financial analysts, investors, and government officials with unrealistic expectations of financial success tend to make extremely risky bets, despite contradictory evidence. Tali Sharot, neuroscientist and author of The Optimism Bias, points out that optimism bias has been cited as one of the core causes of the financial crisis of 2008.


5. Bandwagon effect

Also known as groupthink or herd mentality, the bandwagon effect is when we adopt a certain attitude, behavior, or style because other people are doing it, regardless of our personal beliefs. The pressure to conform, the desire to be on the winning side, and the fear of exclusion and need to belong are common factors that drive the bandwagon effect.

Examples

The impact of the bandwagon effect is prevalent in many aspects of our lives, from music, fashion, and diets to businesses and politics. 

This study by Northwestern University and the University of Maryland suggests that Oprah Winfrey's endorsement was responsible for approximately 1 million additional votes for Obama during the 2008 presidential election. The same study also mentions Winfrey's influence on the striking sales volumes for two books (Anna Karenina by Leo Tolstoy and Love in the Time of Cholera by Gabriel Garcia Marquez) after they were included in her book club.

In the workplace, if everyone keeps nodding their heads to whatever idea is presented and nobody is willing to voice a disagreement or alternative opinion, there will be no room for creativity and innovation to thrive. When people copy what their colleagues and leaders are doing, even if those behaviors are negative, the result is a workplace devoid of healthy conflict, productive discussions, diverse perspectives, and unconventional ideas.

6. Planning fallacy

Daniel Kahneman and Amos Tversky first proposed the concept of the planning fallacy in their 1977 paper Intuitive Prediction: Biases and Corrective Procedures. This cognitive bias describes our tendency to underestimate the amount of time it will take to complete a task, along with the costs and risks associated with it. In Kahneman's words: "The planning fallacy is that you make a plan, which is usually a best-case scenario. Then you assume that the outcome will follow your plan, even when you should know better."

Examples

The planning fallacy is the culprit behind many massively failed or delayed projects. 

The Boston Big Dig is a mega road infrastructure project that began construction in 1991 and was originally estimated to be completed by 1998 at a projected cost of $2.8 billion. Instead, the project fell nearly a decade behind schedule and the overall cost ballooned to $22 billion (with interest). Another famous example is the Eurofighter Typhoon joint defense project, which was delivered 54 months late for £20 billion instead of £7 billion.

You don’t have to work on colossal projects to fall prey to the planning fallacy. You predict you’ll finish a new project within a couple of weeks when the work actually requires a month. You estimate it takes 20 minutes to get to work, yet the actual time turns out to be 45 minutes. The planning fallacy can also explain why about 90% of professional writers miss their deadlines.

7. Anchoring bias

Also known as focalism, anchoring bias means we rely too heavily on the first piece of information we receive when making decisions. Once we become anchored to a specific idea or plan, we're less willing to evaluate new information objectively, which leads to skewed judgments and distorted perceptions.

Examples

Anchoring bias is visible in everyday life. For instance, when department stores and retail outlets offer sales and promotions, consumers compare the discounted price against the original or ‘anchor’ price and feel they’ve got a great deal. When Williams-Sonoma first introduced a bread maker to the market in the 1990s for $275, it was disappointed by tepid sales. The company then offered a slightly bigger and better bread maker at double the original price. Surprisingly, sales of the $275 model immediately skyrocketed.

Another example of anchoring bias is salary negotiation. According to this article, "the first salary an employer sees during a negotiation can have a significant influence on the ultimate offer: Seeing a low number right off the bat can lead employers to make lower salary offers than they would have otherwise."

Anchoring bias is also ubiquitous in brainstorming sessions, where each team member shares an idea. What tends to happen afterward is that the group moves forward with the first idea presented.

8. Self-serving bias

Self-serving bias is when we credit positive events to our character or actions, but blame outside factors for negative events. It's a defense mechanism we use to protect, maintain, or boost our self-esteem. It also occurs because of our desire to appear a certain way to other people. This cognitive bias can lead to a severe lack of critical analysis and resistance to criticism (even constructive criticism) and eventually damage our professional and personal lives.    

Examples

When a candidate is offered a new position, they believe it's because of their skills and achievements. When they don't receive a job offer, they think it's because the interviewer didn't like them.

When a meeting with a potential client goes south, a salesman attributes losing the account to a competitor's unethical business practices. 

A leader who is more interested in taking all the glory for themselves and avoiding accountability for failure can destroy trust and team morale, resulting in an unproductive and toxic work environment that suppresses real talents.

How to overcome cognitive biases

1. Build awareness 

Acknowledging that cognitive biases exist everywhere and affect everyone (remember the Steve Jobs story earlier?) is the first step toward avoiding them. Leaders should not only be mindful of their own psychology, but also provide resources to raise awareness and understanding of cognitive biases among their team members.

2. Challenge your beliefs

To escape old ways of thinking and expand our knowledge, we need to continuously examine and challenge our existing beliefs and assumptions. Here are some questions to help you get started:

  • What is the source of your belief? 
  • Is your source trustworthy?   
  • Is there any evidence? Does it prove that your belief is true?
  • Is it always true in every situation?
  • Is there any evidence that could change your mind?
  • What is a more useful way of thinking about it?
  • What do you need to do about it? 

3. Seek multiple perspectives

Invite honest advice or feedback from smart, trustworthy people who are not afraid to disagree with you, question your own views and decisions, and point out your blind spots. A progressive and healthy workplace embraces diversity, as well as fresh perspectives, heated debates, and difficult conversations that come with it.

4. Reflect on past decisions

Reflect on a similar situation you've been in previously to help guide you in the right direction. Think about how you made that decision, the challenges you faced, and whether you overcame them. Look at the outcome and the lessons you learned from it.       

5. Employ decision-making frameworks 

Applying well-designed frameworks can improve your decision-making process and reduce cognitive biases in the workplace. Below are a few decision-making tools you can try out:

  • The SPADE toolkit: Developed by Gokul Rajaram and successfully implemented at Square, SPADE stands for Setting, People, Alternatives, Decide, and Explain. It helps teams make difficult decisions while delegating ownership, coordinating execution, and being inclusive.
  • Circle of Competence: Warren Buffett and Charlie Munger initially developed this mental model to help investors focus on areas they know best. Knowing your circle of competence enables you to avoid problems, identify room for improvement, and learn from others. 
  • Six Thinking Hats: Edward de Bono introduced this method in the 1980s and wrote a book about it. This powerful technique incorporates parallel thinking, in which all participants focus on a specific direction at any given time. The Six Thinking Hats method gives people a more rounded view of the situation, inspires creativity, encourages cooperation, reduces conflict within a team, and improves decision making.

Humans are full of cognitive biases and flawed thinking patterns that can lead us on the wrong path. I hope this article has given you a solid understanding of how unconscious biases can affect your organization, from team collaboration and project management to performance reviews and recruitment. It's time to raise awareness of cognitive biases and combat them to help you make better decisions and create a healthy, productive work environment. 

If you have any questions or feedback, feel free to contact us at hello@screencloud.com.  

Thanks to Siora Photography for the cover photo on Unsplash!
