A large part of a project manager’s job is risk management. We typically assess risks during the pre-work stage of project management and then monitor those risks throughout the project lifecycle.

Risks come in many different forms and categories—systemic risks, environmental risks, and programmatic risks, just to name a few. They can range from a technical bug in a project management system to an earthquake.

We develop a detailed risk management plan, complete with a risk register log to identify, track, monitor, and resolve risks as they arise during a project.
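To make that concrete, here is a minimal sketch of what one entry in such a risk register might look like in code. The fields, names, and probability-times-cost score (often called expected monetary value) are illustrative conventions, not a prescribed standard.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskEntry:
    """One row in a risk register: identify, score, assign, and track a risk."""
    risk_id: str
    description: str
    category: str         # e.g. "programmatic", "environmental", "systemic"
    probability: float    # 0.0-1.0, estimated likelihood the risk occurs
    impact_cost: float    # estimated cost if the risk does occur
    owner: str
    status: str = "open"  # open / mitigating / closed
    identified_on: date = field(default_factory=date.today)

    @property
    def exposure(self):
        """Expected monetary value: probability times impact."""
        return self.probability * self.impact_cost

register = [
    RiskEntry("R-001", "Key vendor delivery slips", "programmatic", 0.3, 50_000, "PM"),
    RiskEntry("R-002", "Site permit delayed", "environmental", 0.1, 120_000, "PM"),
]
# Review the highest-exposure risks first.
register.sort(key=lambda r: r.exposure, reverse=True)
```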

But here’s one risk we likely don’t account for in our risk management plans… our own brains.

How Our Brains Work

There is a lot of research on how our brains work: how we think and perceive what we hear and see, how we process situations, issues, experiences, and events, how we make decisions, and how our emotional brain and rational brain each play a role.

In his book Thinking, Fast and Slow, Daniel Kahneman, psychologist and winner of the Nobel Prize in economics, describes the two systems in the human mind: System 1 and System 2.

Here are the characteristics of each one:

Characteristics of “System 1”

  • Generates first impressions, feelings, and inclinations
  • Operates automatically and quickly, with little or no effort, energy, or voluntary control
  • Creates “coherent” ideas and stories
  • Creates a sense of cognitive ease, which produces illusions of truth and feelings of pleasure and comfort
  • Focuses on the evidence at hand and ignores evidence that is absent (“what you see is all there is”)
  • Activates the “mental shotgun” – jumps to conclusions
  • Overweights low-probability events
  • … among many others

Characteristics of “System 2”

  • Allocates attention to mental activities that require substantial effort, such as performing complex computations, solving problems, and questioning conclusions and their validity
  • Is associated with the subjective experience of agency, choice, and concentration

Both types of “systems” are active while we are awake. However, they function at different speeds. When we process information, events, experiences, and even answer questions, our “System 1” is often the first to respond.

However, there is a risk to this. As described above, our “System 1” is prone to reacting and responding too quickly. While our “System 1” reacts and responds, our “System 2” sits behind the scenes in a comfortable “standby” mode. System 2 is slower to question and respond to information and look deeper than what is in front of us, but it is ready to take charge when things become difficult.

As intelligent as we are, our brains are quite imperfect. Because of our “System 1s,” our brains are naturally wired to fall victim to cognitive biases, which can lead to incorrect judgments about situations or others, fallacies, errors, inaccurate memories, and even poor decision-making—all of which can be detrimental to project management.

The Role of Rules

Believe it or not, rules provide us with “cognitive shortcuts,” as author Paulo Savaget describes in his book The Four Workarounds. Whether we consciously realize it or not, we all conform to rules and social norms. Rules silently shape what we think is acceptable or desirable. They “normalize” our behavior and thinking patterns, which can speed up our decision-making process.

On the other hand, rules also inhibit us from finding innovative workarounds and solutions to problems. In a way, conforming to rules eventually becomes a “System 1” activity, one we rarely question or challenge. As a result, we subconsciously become blind to solutions to problems, creating a breeding ground for cognitive biases.

6 Types of Cognitive Biases and Illusions

What do we mean by a “cognitive bias”? Here are some examples:

1. Uncertainty Bias

As project managers, we must account for a certain amount of uncertainty when planning a project. These are our “unknown unknowns.” As a result, we make decisions based on the probability that an uncertain event will occur, and our personal beliefs about whether or not it will occur often get in the way, forming a bias.

Why do we do this? Estimating the probability that an event will occur is difficult, so we reach for mental shortcuts instead. Making decisions based on heuristics rather than probabilities often leads to systematic errors.

For example, as a project manager, you have likely engaged in expert judgment sessions. These sessions are often facilitated through risk workshops or “interviews” to learn what is involved in carrying out specific work activities, how long those activities will take, the probability and impact of project risks, and a number of other factors. They are typically held during the “initiation” or “planning” phases of the project.

However, this is a common area where cognitive biases occur. Expert views can be, and often are, biased. The interviewer (or project manager, in this case) should encourage honest and unbiased assessments, refer to base rates and evidence from historical events, and use algorithms and formulas to aid in making final decisions (whenever possible, of course).
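As one illustration of leaning on a formula rather than a single gut-feel answer, a three-point (PERT) estimate, a common project management technique, weights an expert’s most likely duration against optimistic and pessimistic bounds. The function and numbers below are a sketch for illustration only.

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Beta/PERT expected value: weights the most likely estimate 4x."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# An expert says a task "should take 10 days." Asking for optimistic and
# pessimistic bounds tempers that single, possibly biased, point estimate.
expected = pert_estimate(optimistic=8, most_likely=10, pessimistic=20)
print(f"Expected duration: {expected:.1f} days")  # 11.3 days
```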

2. Optimism Bias

There are two different types of people in the world: optimists and pessimists. Optimists are generally positive. They often see “the glass half full,” regardless of the situation. They are always open to new opportunities and say “yes” more often than “no.”

Pessimists are more cautious. They see the glass half empty, turn down opportunities, both personal and professional, and expect the worst from most situations.

In the world of project management, optimism bias can lead to the planning fallacy: project managers who are natural optimists tend to focus on the positive and embrace opportunities without thoroughly assessing risks. In many cases, optimists will misread risks or take on more risk because they are overly positive about the outcome, even when the probability of a positive outcome is low.

For example, when planning a project, an “optimist” project manager might be excited about the new project or opportunity and feel confident about the team he or she has assembled to work on the project. As a result, he or she is likely to misread or misjudge the risks of completing the project on time or within budget.

Although this might seem like a blessing—and in most scenarios, it is—optimism bias is arguably one of the most significant cognitive biases.

3. Substitution

As project managers, we ask many questions. If we ask a difficult question of a project team member, stakeholder, or expert, the “System 1” part of the responder’s brain often takes over. It swaps the difficult question for a related, easier one and answers that instead. The answer likely doesn’t fully address the question we actually asked. This is called substitution.

On the other side of the exchange, the project manager might accept the answer at face value rather than realize that the question he or she originally asked wasn’t actually answered.

For example, think of a time when you asked someone what should have been a simple “yes or no” question, and instead they provided a long-winded response. Although detailed, it still didn’t answer the question. This is the responder’s “System 1” at work. And the fact that the asker accepted that answer is the asker’s “System 1” at work, too.

Imagine how the conversation might differ if both the asker and the responder’s “System 2” were activated…

4. The Anchoring Effect

Project managers will occasionally play the role of “negotiator,” particularly when performing make-or-buy analyses or procuring vendors or contractors for a project. The anchoring effect occurs when we consider a particular value for an unknown quantity before estimating that quantity; the estimate then stays close to the value we considered.

The most common example of the anchoring effect is in real estate: when you look at a house or property to purchase, the asking price influences you, and the higher the listing price, the more valuable the property appears.

Now, let’s consider an example in the world of project management. Let’s say you want several contractors or vendors to submit bids for your construction project. You put out a Request for Proposal (RFP) to a handful of suppliers (contractors and vendors) that your organization has worked with before, or to a specific community. Some organizations put a budgetary range in their RFPs. By doing this, however, you are planting an anchor before you receive a single proposal.

Any specific number introduced into an estimation, negotiation, or similar situation acts as an anchor, pulling the bids you receive toward it.

And, yes, you guessed it. Succumbing to the anchoring effect is yet another reaction of “System 1”.

5. Hindsight and Outcome Bias

“Hindsight is 20/20.” We’ve all heard that phrase, usually after making a decision and realizing (too late) that it was the wrong one. Hindsight bias is our tendency to believe, after the fact, that an outcome was predictable all along. It degrades the quality of decision-making because we end up judging decisions not by the process that produced them, but by whether the outcome was favorable.

Learning from experiences and surprises is entirely reasonable, but it can have some costly consequences. For example, we blame decision-makers when otherwise good decisions turn out badly, and we give them too little credit for successful decisions that look obvious only after the fact.

To put it more clearly, when the outcome is unfavorable, stakeholders blame project managers for not seeing “the writing on the wall” when it only became legible after the fact. This is known as outcome bias.

All in all, hindsight bias impacts decision-making. Experienced project managers, whether or not they realize it, likely fall victim to it when making decisions. Together, hindsight bias and outcome bias make it nearly impossible to evaluate a decision properly.

6. The Illusion of Validity

“System 1” jumps to conclusions from little evidence, or from only the evidence that is apparent (again, rather than considering what evidence is absent). After all, scanty evidence often makes for a very coherent story, doesn’t it? Our “System 1” also loves to jump to conclusions based on coherence (the stories we form in our heads) and on our confidence (and sometimes overconfidence) in our opinions and subjective experiences. This translates into the illusion of validity, which is a cognitive illusion.

After all, confidence is a feeling. A project manager might feel confident about managing a particular project (again, maybe he or she is an optimist) and imagine how the project will play out under his or her leadership. (This is coherence at work.) However, the facts might show that the project manager is inexperienced with that particular type of project, whether by industry or complexity.

In this case, the project manager made a decision and constructed a story in his or her head, even though the story might not necessarily be true.

Cognitive Bias Risk Mitigation Strategies

So what do we do? How do we plan for and manage the risk posed by our own minds? How do we avoid it altogether? Can we avoid it?

Here are some risk mitigation strategies:

1. Reference Class Forecasting

This strategy can be used when planning and estimating a new project. It is an analogy-based strategy that involves reviewing and analyzing the historical outcomes of previous projects. To reduce bias, reference class forecasting ensures that project managers compare against projects with similar characteristics, including:

  • Industry
  • Category
  • Complexity
  • Budget
  • Constraints
  • Statistics related to project outcomes (such as cost overruns)

For example, some organizations use reference class forecasting to control and mitigate the planning fallacy. This strategy requires a proper project archive (which stems from a solid project closeout process) or a Project Management Information System (PMIS) that tracks historical outcomes and statistics from prior projects.

This is used to forecast future performance based on past results. These results can then be used to recommend preventive actions, if necessary.
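At its core, the calculation is simple enough to sketch in a few lines. A hedged illustration follows; the archive, project names, and figures are invented, and a real reference class would be drawn from your own PMIS.

```python
import statistics

# Hypothetical archive of comparable past projects (same industry,
# category, complexity). All figures are invented for illustration.
reference_class = [
    {"name": "Plant A", "budgeted": 1_000_000, "actual": 1_400_000},
    {"name": "Plant B", "budgeted": 1_000_000, "actual": 1_100_000},
    {"name": "Plant C", "budgeted": 1_000_000, "actual": 1_600_000},
    {"name": "Plant D", "budgeted": 1_000_000, "actual": 1_200_000},
]

# Cost-overrun ratio observed on each comparable project.
overruns = [p["actual"] / p["budgeted"] for p in reference_class]

# Apply the distribution of past outcomes, not gut feel, to the new estimate.
median_overrun = statistics.median(overruns)   # 1.3 for this sample
inside_view_estimate = 2_000_000               # the team's bottom-up estimate
outside_view_estimate = inside_view_estimate * median_overrun
print(f"Uplifted estimate: ${outside_view_estimate:,.0f}")  # $2,600,000
```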

However, even this trend analysis is vulnerable to cognitive bias. For example, a project manager might overweight a trend or risk because of his or her personal experiences managing similar projects or seeing similar outcomes, and therefore fall victim to overestimating or underestimating.

2. Develop Risk Policies

Mental accounts are used to “keep score” and are often shaped by our personal experiences. Project managers who have experienced project failure will likely not only deal with the emotional turmoil that follows; they are also more likely to overweight that failure and take a specific course of action to avoid failure in the future, even if the particular risk in question would be considered a rare event. (Ahem. “System 1”…)

Project managers as decision-makers—and human beings—are therefore prone to the emotions that tie into decisions, including regret. And regret feeds on hindsight bias, which is purely a function of System 1. A risk policy, a standing rule applied to a whole class of similar decisions (for example, applying the same contingency-reserve rule to every project), counters this by taking the case-by-case emotion out of each individual call.

3. Develop a Decision-Making Process

We all know that decision-making is a huge part of project management. Project managers use several decision-making methods, including voting, autocratic decision-making, and multicriteria decision analysis.

When you break it down, decision-making really comes down to gains and losses, yes, just like playing the lottery or taking your chances at a casino. In this context, we will use money as the example of a loss or gain, since it is something we can all relate to.

By definition, a “gain” means ending up with more money than you had before, and a “loss” means ending up with less. However, I might argue that a gain can also mean earning or being given something of value that didn’t cost you anything.

For example, if someone gives you a gift or a hand-me-down, such as a winter jacket, a diamond necklace, or a kitchen table, this is a gain because you still gained something of value, and the savings alone might contribute to a longer-term gain.

All in all, when weighing decisions, consider the potential losses and gains evenly. A decision matrix, like the sketch below, can help you make decisions related to your project or your life.
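Here is a minimal sketch of a weighted decision matrix. The criteria, weights, and vendor scores are invented for illustration; the point is that writing the weights down before scoring forces “System 2” to do the evaluating.

```python
# Weights reflect how much each criterion matters; they sum to 1.0.
criteria_weights = {"cost": 0.4, "schedule": 0.3, "quality": 0.3}

# Score each option per criterion (1-10). Invented numbers for illustration.
options = {
    "Vendor A": {"cost": 7, "schedule": 9, "quality": 6},
    "Vendor B": {"cost": 9, "schedule": 6, "quality": 8},
}

def weighted_score(scores):
    """Sum of each criterion's score times that criterion's weight."""
    return sum(scores[c] * w for c, w in criteria_weights.items())

ranked = sorted(options, key=lambda name: weighted_score(options[name]), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_score(options[name]):.1f}")
# Vendor B: 7.8  (9*0.4 + 6*0.3 + 8*0.3)
# Vendor A: 7.3  (7*0.4 + 9*0.3 + 6*0.3)
```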

4. Challenge the Status Quo

Sometimes, the way we interpret a problem or a risk is a problem in itself. By applying critical thinking, you can question the status quo and challenge the default.

How to Activate Your System 2 to Reduce Risks

Now that you’ve just read a ton of information about how our brains work and how cognitive biases and illusions form, you likely have a better understanding of how we, as human beings, are our largest risk and make errors simply due to how our brains are naturally wired.

This raises the question: How do you avoid falling victim to System 1?

I asked myself the same question while reading Thinking, Fast and Slow. As I neared the end of the book, I anticipated the “aha” moment—the “what do I do now?”—the answer to overcoming the risks of my very own System 1.

Daniel Kahneman, the author, writes—

“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort.”

Considerable effort… What does that mean for what I need to do each day?

Exercising and strengthening “System 2” is different for everyone. The first step is to slow down, eliminate hurry, and free up mental space to think, question, and focus.

Here are some steps to follow:

Step 1: Learn to recognize cognitive biases and cognitive illusions.

Although we can’t completely avoid cognitive biases and cognitive illusions, we can learn to recognize them in conversations, situations, and decision-making. Recognize when a cognitive bias might be present or when mistakes are likely to occur, and try hard to avoid such mistakes, especially when the stakes are high. Of course, this takes practice, but it’s possible.

Step 2: Slow down, rationalize, and activate “System 2”.

Slow down, “System 1”, let’s see what “System 2” has to say…

Rather than jumping to conclusions, rushing to respond, making a snap decision, or answering right away, stop. Slow down. Consider the facts and statistics. Look at the evidence that is present, and ask what evidence might be missing. Question everything.

Step 3: Reduce “System 2” overload.

Time pressure and task-switching overload “System 2,” depleting your overall mental effort and energy. Human beings have only a limited amount of mental energy and cognitive capacity within a given period of time.

There are three different types of cognitive load to be aware of: intrinsic, extraneous, and germane.

  1. intrinsic: the inherent size or difficulty of the task on your mind, your to-do list, or your desk. To reduce intrinsic load, a common strategy is to segment the task into smaller tasks.
  2. extraneous: how a task appears or is presented. To reduce extraneous load, work on the task without fixating on the end goal.
  3. germane: the effort of consolidating pieces of information into a larger concept

When our mental energy begins to deplete, “System 2” becomes lazy. Then, “System 1” takes over, which is risky business. “System 1” often puts us in a position where we fall victim to cognitive illusions and biases, preventing us from really thinking through problems. We form “stories” in our brains about how we think situations are and how they ought to play out. As a result, we often jump to conclusions and make illogical errors.

There are three ways to manage cognitive load:

  1. Intensity
  2. Frequency
  3. Duration

Depending on the difficulty or complexity of the material you are working on, reading, or studying, all three must be balanced in order to retain mental energy, absorb the material, and improve the quality of learning while avoiding mental exhaustion and burnout.

All in all, cognitive biases and cognitive illusions not only pose risks and lead to errors in the world of project management and business, but also introduce risks into our personal lives and relationships.

Take some time to consider your strengths and weaknesses as a project manager, where you might fall victim to your “System 1”, and what you can do to prevent potentially costly errors going forward.
