As project managers, a large part of our jobs is to assess risks. We typically assess risks during the pre-work stage of project management and then monitor those risks throughout the project life cycle.

Risks come in many different forms and categories, such as systemic risks, environmental risks, and programmatic risks, to name just a few. They can range from a bug in a project management system to an earthquake.

We develop a detailed risk management plan, complete with a risk register, to identify, track, monitor, and resolve risks as they arise during the project.

But here’s one risk we likely don’t account for in our risk management plans… our own brains.

How Our Brains Work

There is a lot of research out there on how our brains work: how we think, how we perceive what we hear and see, and how we process situations, issues, experiences, and events. In his book Thinking, Fast and Slow, Daniel Kahneman, the psychologist who won the Nobel Memorial Prize in Economic Sciences, describes the two systems in the human mind: System 1 and System 2.

Here are the characteristics of each one:

Characteristics of “System 1”

  • Generates first impressions, feelings, and inclinations
  • Operates automatically and quickly, with little or no effort, energy, or voluntary control
  • Creates “coherent” ideas and stories
  • Creates a sense of cognitive ease, which leads to illusions of truth and pleasant, comfortable feelings
  • Focuses on existing evidence and ignores absent evidence
  • Activates the “mental shotgun” – jumps to conclusions
  • Overweights low-probability events
  • … among many others

Characteristics of “System 2”

  • Allocates attention to mental activities that require substantial effort, such as performing complex computations, solving problems, and questioning conclusions and their validity
  • Associated with the subjective experiences of agency, choice, and concentration

Both types of “systems” are active while we are awake. However, they function at different speeds. When we process information, events, experiences, and even answer questions, our “System 1” is often the first to respond.

However, there is risk to this. As described above, our “System 1” is prone to reacting and responding too quickly. While our “System 1” reacts and responds, our “System 2” sits behind the scenes in comfortable “stand-by” mode. “System 2” is slower to question information and look a level deeper than what is in front of us. But it is ready to take charge when things become difficult.

As intelligent as we are, our brains are quite imperfect. Because of "System 1", our brains are naturally wired to fall victim to cognitive biases, which can lead to incorrect judgments about situations or people, fallacies, errors, and even poor decision-making, all of which can be detrimental to managing a project.

6 Types of Cognitive Biases and Illusions

What do we mean by a “cognitive bias”? Here are some common examples:

1. Uncertainty Bias

As project managers, we know that we have to account for a certain level of uncertainty when planning a project, from our "known unknowns" to our "unknown unknowns". As a result, we make decisions based on the probability that a certain (or uncertain) event will occur. Often, our own personal beliefs about whether or not an event will occur get in the way, forming a bias.

Why do we do this? Because estimating the probability that an event will occur is difficult, so we fall back on heuristics. Making decisions based on heuristics rather than probabilities often leads to systematic errors.

For example, as a project manager, you have likely engaged in expert judgment sessions (whether or not you call them that formally). Expert judgment sessions are often facilitated through risk workshops or "interviews" to learn what is involved in carrying out specific work activities, how long those activities will take, and what the probability and impact of project risks might be, along with gathering data on a number of other factors. This is often conducted during the initiation or planning phases of the project.

However, this is a common area for cognitive biases to appear. Expert views can be, and often are, biased. The interviewer (or project manager, in this case) should encourage honest and unbiased assessments, refer to base rates of historical events and other evidence, and use algorithms and formulas to aid in making final decisions (whenever possible, of course).
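
To make "algorithms and formulas" a little more concrete, here is a minimal sketch in Python. The numbers and the simple averaging rule are hypothetical illustrations, not a method prescribed by Kahneman; the point is that writing the arithmetic down forces the conversation back to base rates instead of first impressions:

    # Scoring a risk by expected monetary value (EMV = probability x impact),
    # and pulling a gut-feel probability toward a historical base rate.
    # All figures are hypothetical.

    def emv(probability: float, impact: float) -> float:
        """Expected monetary value of a single risk."""
        return probability * impact

    expert_probability = 0.40   # the expert's gut feel: "it seems likely"
    base_rate = 0.15            # observed frequency on similar past projects
    blended = (expert_probability + base_rate) / 2  # one crude blending rule

    risk_impact = 50_000        # estimated cost if the risk occurs, in dollars

    print(f"EMV from gut feel:  ${emv(expert_probability, risk_impact):,.0f}")  # $20,000
    print(f"EMV from base rate: ${emv(base_rate, risk_impact):,.0f}")           # $7,500
    print(f"EMV from the blend: ${emv(blended, risk_impact):,.0f}")             # $13,750

Even a crude blend like this surfaces the gap between an expert's first impression and the historical record, which is exactly where the interviewer should dig deeper.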

2. Optimism Bias

There are two different types of people in the world: optimists and pessimists. Optimists are super positive about life, regardless of the situation. They are always open to new opportunities, and say “yes” more than they say “no”.

Pessimists are more cautious. They expect the worst, turn down opportunities, both personal and professional, and settle into a state of unhappiness.

In the world of project management, optimism bias can lead to the planning fallacy: project managers who are naturally optimists tend to focus on the positive and embrace opportunities without fully assessing risks. In many cases, optimists misread risks or take on more risk because they are overly positive about the outcome, even when the probability of a positive outcome is low.

For example, when planning a project, an “optimist” project manager might be excited about the new project or opportunity and feel confident about the team he or she has assembled to work on the project. As a result, he or she is likely to misread or misjudge the risks about completing the project on time or within budget.

Although this might seem like a blessing (and in many scenarios, it is), optimism bias is arguably one of the most significant cognitive biases.

3. Substitution

As project managers, we ask a lot of questions. If we ask a difficult question of a project team member, stakeholder, or expert, the "System 1" part of their brain often takes over, swaps the question for an easier, related one, and answers that instead. The answer likely doesn't fully address the question we asked. This is called substitution.

On the other hand, the project manager might accept the answer at face value rather than realize that the initial question he or she asked wasn't exactly answered.

For example, think of a time when you asked someone a simple "yes or no" question and instead received a long-winded response. Although detailed, it still didn't answer the question. This is the responder's "System 1" at work. And the fact that the asker accepted the response is the asker's "System 1" at work, too.

Imagine how the conversation might be different if both the asker and the responder’s “System 2” were activated…

4. The Anchoring Effect

Project managers will occasionally play the role of "negotiator", particularly when performing make-or-buy analyses or procuring vendors or contractors for a project. The anchoring effect occurs when we consider a particular value for an unknown quantity before estimating that quantity.

The most common example of where the anchoring effect is present is in real estate. If you are looking at a house or property to purchase, you are influenced by the asking price. The higher the listing price, the more valuable the property appears.

Now let's consider an example in the world of project management. Let's say you want a number of contractors or vendors to submit bids to work on your construction project. So, you put out a Request for Proposal (RFP) to a handful of suppliers (contractors and vendors) that your organization has worked with before, or to a specific community. Some organizations put a budgetary range in their RFPs. However, by putting any monetary range in an RFP, you are planting an anchor before you receive any proposals.

Any specific number introduced into an estimation or negotiation becomes an anchor, pulling every subsequent bid toward it.

And, yes, you guessed it. Succumbing to the anchoring effect is yet another reaction of “System 1”.

5. Hindsight and Outcome Bias

"Hindsight is 20/20." We've all heard that phrase, usually after making a decision and realizing (too late) that it was the wrong one. Hindsight bias is our tendency to believe, after the fact, that an outcome was predictable all along. It can negatively impact quality decision-making because it shifts attention away from the process by which a decision was made and toward whether the outcome was favorable.

Learning from experiences and surprises is entirely reasonable, but it can have costly consequences. For example, we often blame decision-makers when otherwise good decisions turn out badly, and we rarely give them credit for sound decisions whose quality only becomes apparent after the fact.

To put it more clearly, when the outcome is unfavorable, stakeholders blame project managers for not seeing “the writing on the wall” when it only became legible after the fact. This is known as outcome bias.

All in all, hindsight bias impacts decision-making. Even experienced project managers fall victim to it, whether or not they realize it. Together, hindsight bias and outcome bias make it very difficult to evaluate a decision fairly on the information that was available at the time.

6. The Illusion of Validity

"System 1" jumps to conclusions from little evidence, or from only the evidence that is readily apparent (rather than considering what evidence is absent). After all, scant evidence makes for a very tidy story, doesn't it? Our "System 1" also loves to jump to conclusions based on coherence (the stories we form in our heads) and on our confidence (and sometimes overconfidence) in our opinions and subjective experiences. This translates into the illusion of validity, which is a cognitive illusion.

After all, confidence is a feeling. Project managers might feel confident managing a particular project (again, maybe they are optimists), already imagining how the project will play out under their leadership. (This is coherence at work.) However, the facts might show that the project manager is actually inexperienced with that particular kind of project (based on its industry or complexity).

In this case, the project manager made a decision and constructed a story in his or her head, even though the story might not necessarily be true.

Cognitive Bias Risk Mitigation Strategies

So what do we do? How do we plan for and manage the risk posed by our own minds? How do we avoid it altogether? Can we avoid it?

Here are some risk mitigation strategies:

1. Reference Class Forecasting

This strategy can be used when planning and estimating a new project. It involves reviewing and analyzing the historical outcomes of previous projects. To prevent bias, reference class forecasting has project managers assess projects with similar characteristics, including:

  • Industry
  • Category
  • Complexity
  • Budget
  • Constraints
  • Statistics related to project outcomes (such as cost overruns)

For example, some organizations use reference class forecasting to control and mitigate planning fallacies. This strategy requires having a proper project archive (which stems from a solid project closeout process), or even a Project Management Information or Knowledge System that tracks historical outcomes and statistics related to prior projects.

This is used to forecast future performance based on past results. These results can then be used to recommend preventive actions, if necessary.
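
As an illustration, here is a minimal sketch in Python of how a reference class of past cost overruns might translate into a budget uplift. The overrun figures and the 80% confidence level are hypothetical:

    # Reference class forecasting: choose the uplift needed to be ~80%
    # confident that the new project's budget will not be exceeded.
    # All figures are hypothetical.

    def percentile_uplift(overruns: list[float], confidence: float = 0.80) -> float:
        """Return the overrun at the given percentile of the reference class."""
        ranked = sorted(overruns)
        index = min(int(confidence * len(ranked)), len(ranked) - 1)
        return ranked[index]

    # Cost overruns (as fractions of budget) from ten comparable past projects.
    reference_class = [0.05, 0.10, 0.12, 0.15, 0.20, 0.22, 0.30, 0.35, 0.50, 0.80]

    inside_view_estimate = 1_000_000  # the team's own estimate, in dollars
    uplift = percentile_uplift(reference_class)
    adjusted_budget = inside_view_estimate * (1 + uplift)

    print(f"Uplift at the 80th percentile: {uplift:.0%}")   # 50%
    print(f"Adjusted budget: ${adjusted_budget:,.0f}")      # $1,500,000

The "outside view" from the archive, not the team's optimism, drives the adjusted number.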

However, in terms of cognitive bias, even this kind of trend analysis can be skewed. For example, a project manager might overweight a trend or the likelihood of a risk because of his or her own personal experiences managing similar projects or witnessing similar outcomes. In other words, the project manager may overweight or overestimate based on his or her own emotions and experiences.

2. Develop Risk Policies

Mental accounts are used to "keep score" and are often shaped by our own personal experiences. A project manager who has experienced project failure will likely not only deal with the emotional turmoil that results, but will also be more likely to overweight that failure and take specific courses of action to avoid it in the future, even if the failure or risk in question is a rare event. (Ahem. "System 1"...)

Project managers are decision-makers and, above all, human beings; they are therefore prone to the emotions that tie into decisions, including regret. And regret is closely tied to hindsight bias, which is purely a function of "System 1". A standing risk policy, applied consistently across projects rather than invented case by case, helps counteract these emotional, experience-driven reactions.

3. Develop a Decision-Making Process

We all know that decision-making is a huge part of project management. There are several decision-making methods project managers follow, including voting, autocratic decision-making, and multicriteria decision analysis, to name just a few.

When you break it down, decision-making really comes down to gains and losses, just like playing the lottery or trying your luck at a casino. In this context, we will use money as the example of a loss or gain, since money is something we can all relate to.

By definition, a "gain" means having more money than you had before, and a "loss" means losing money you had. However, I might argue that a gain can also mean earning or being given something of value that didn't cost you anything.

For example, someone might give you a gift or a hand-me-down, such as a winter jacket, a diamond necklace, or a kitchen table. This is also a gain: you received something of value, and the savings alone might contribute to a longer-term gain.
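
To see what weighing gains and losses looks like in numbers, here is a minimal sketch in Python. The gamble and its odds are hypothetical; the expected value is the neutral baseline that "System 1" tends to distort:

    # Expected value of a simple gamble: probability-weighted gain
    # minus probability-weighted loss. All figures are hypothetical.

    probability_of_gain = 0.10
    gain = 5_000    # dollars received if things go well
    loss = 400      # dollars forfeited otherwise

    expected_value = probability_of_gain * gain - (1 - probability_of_gain) * loss
    print(f"Expected value: ${expected_value:,.2f}")  # $140.00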

So, when weighing decisions, consider the potential losses and gains evenly. Use a decision matrix to help make decisions, either related to your project—or life.
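
Here is a minimal sketch in Python of one simple decision matrix, a basic form of multicriteria decision analysis. The criteria, weights, and scores are hypothetical; the value of the exercise is that the weighted totals make the trade-offs explicit instead of leaving them to "System 1":

    # Weighted decision matrix: each option is scored 1-5 per criterion
    # (higher is better, so a high "risk" score means LOW risk), and the
    # weights, which sum to 1, encode what matters most.

    criteria_weights = {"cost": 0.4, "schedule": 0.3, "risk": 0.3}

    options = {
        "Vendor A": {"cost": 4, "schedule": 3, "risk": 2},
        "Vendor B": {"cost": 2, "schedule": 5, "risk": 4},
        "Vendor C": {"cost": 3, "schedule": 4, "risk": 3},
    }

    def weighted_score(scores: dict[str, float]) -> float:
        """Weighted sum of one option's scores across all criteria."""
        return sum(criteria_weights[c] * s for c, s in scores.items())

    # Rank the options from best to worst weighted total.
    for name, scores in sorted(options.items(), key=lambda kv: -weighted_score(kv[1])):
        print(f"{name}: {weighted_score(scores):.2f}")
    # Vendor B: 3.50, Vendor C: 3.30, Vendor A: 3.10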

How to Activate Your System 2 to Reduce Risks

You have just read a ton of information about how our brains work, how cognitive biases and illusions form, and how we, as human beings first and foremost but also as project managers, are often our own biggest risk, making errors simply because of how our brains are naturally "wired" to work.

This raises the question: How do you not fall victim to System 1?

I asked myself the same question while reading through Thinking, Fast and Slow. While approaching the end of the book, I anticipated the “aha” moment—the “what do I do now?”—the answer to overcoming the risks of my very own System 1.

Daniel Kahneman writes:

“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort.”

Considerable effort… What does that mean for what I need to do each day?

Exercising and strengthening “System 2” is different for everyone, but the first step is to slow down and eliminate hurry, and free up mental space to think, question, and focus.

Here are some steps to try:

Step 1: Learn to recognize cognitive biases and cognitive illusions.

Although we can't avoid cognitive biases and cognitive illusions completely, we can learn to recognize them in conversations, situations, and decision-making.

Recognize when a cognitive bias might be present or when mistakes are likely to occur, and try hard to avoid such mistakes, especially when the stakes are high.

Step 2: Slow down, rationalize, and activate “System 2”.

Slow down, “System 1”, let’s see what “System 2” has to say…

Rather than jumping to conclusions, rushing to respond, making a snap decision, or blurting out an answer: stop. Slow down. Consider the facts and statistics. Look at the evidence that is present, and ask what evidence might be missing. Question everything.

Step 3: Reduce “System 2” overload.

Time pressure and task-switching overload "System 2", depleting your overall mental effort and energy. "System 2" becomes lazy. Then "System 1" takes over, which is risky business.

“System 1” often puts us in a position where we fall victim to cognitive illusions and cognitive biases, preventing us from really thinking through problems. We form “stories” in our brains about how we think situations are, how they ought to play out. As a result, we often jump to conclusions and make illogical errors.

As a fellow project manager, I personally shudder to think how many errors, misjudgments, and incorrect conclusions I have made over the course of my career simply due to my "lazy System 2", which is incredibly dangerous as a business consultant and project manager.

All in all, cognitive biases and cognitive illusions don't just pose risks and lead to errors in the world of project management and business; they also introduce risks into our personal lives and relationships.

Take some time to consider your strengths and weaknesses as a project manager, where you might fall victim to your “System 1”, and what you can do to prevent what could be costly errors going forward.

If you’re interested in learning more about project management, or supplementing your existing project management skills, check out my course “The Basics of Project Management”. Readers and fans get a 25% discount!
