Column

Bias and Project Management

Cognitive bias is:

  a. Making decisions or judgments not supported by objective reality;
  b. The subject of Nobel Prize winner Daniel Kahneman’s book Thinking, Fast and Slow;
  c. A barrier to effective Legal Project Management;
  d. All of the above.

As a lawyer, you probably have some familiarity with cognitive biases. (If you do defense work, your clients are likely very familiar… from experience.) However, I want to focus on some ways cognitive biases can screw up projects.

(I’m deferring my promised column on project charter disasters to next time.)

Anchoring, or “First Liar”

We can’t help it. Someone tells us about a problem with our project, say a disagreement with another team member on how to move forward. When we hear from the second team member, we mentally frame what we’re now hearing in terms of what we first heard.

Or take the discussion with a client. If you say the matter should run about $10,000 but could go as high as $20,000, what number do you think they’ll remember a month from now when the invoice comes in at $12,000? Did you save them $8,000, or overrun their mental budget by $2,000?

By the way, if you want to see professionals in action when it comes to anchoring, go shop for a car. Consider who sets the price you negotiate from. (It’s actually a two-step process: the sticker price and then the salesperson’s “here’s what I can do for you” price.) Also, why are they so eager to have you sit in the car while you think about it?

Studies have repeatedly shown that even when we know anchoring is in play, we fall prey to it anyway. So even though you know as a lawyer that you need to hear all sides of an argument, you’re likely to put unwarranted (and unrecognized) trust in the first report you receive, using that information to frame further discussion. If someone tells you, “Task X is running late,” you’re more likely to seek the cause of the delay rather than confirm that there actually will be a delay… and in doing so, you may create a self-fulfilling prophecy.

Bottom line: We anchor even when we know anchoring is in play. So make a conscious effort to withhold judgment until you’ve heard from a variety of sources. This applies whether the “first liar” appears in person, sends an email, or even consists of the first set of data you review.

Finally, note that “first liar” can work in reverse. If you hear information first from someone you distrust or dislike, you’re likely to discount the validity of that information, trusting too much in the second report. Which leads us to…

Confirmation Bias

Confirmation bias means that we evaluate new information in light of what we already believe.

Think about how hard it was for so many educated people in the Middle Ages to accept that the world was roughly spherical. The earth’s size and shape had been known since the ancient Greeks performed some calculations I won’t go into here, but our common, everyday experience (intuition) tells us that the world is flat. And frankly, unless you’re sailing or flying somewhere distant enough that great-circle routes come into play, a “flat earth” mental model is quite sufficient to get us through our days. (It’s still counterintuitive that to fly from Ottawa to Singapore, you start out heading due north.)

Because of confirmation bias, we are resistant to new information on a project that doesn’t square with our current mental model. For example, if I believe that paralegals can’t perform certain work effectively, then a) I’m not going to assign that work to a paralegal and b) should they do the work, I’m going to spend almost as much time editing and reworking it as they spent doing it in the first place. Likewise, if I believe that lawyer X has great skills, my tendency will be to accept their work as terrific until it becomes undeniably clear that they’re in the wrong role on this particular project.

Confirmation bias comes in various flavors, including:

  • Search for information: If we believe a certain “fact” about our project, we’re more likely to seek out further information that confirms the fact, rather than going in search of “inconvenient” facts.
  • Receipt of information: We are more likely to ignore or discount information that doesn’t square with what we already know.
    • If we believe something and receive new information, we generally accept that new information if we can find one aspect that appears to be true.
    • If we do not believe something, we will reject new information if we can discern even one aspect that appears untrue.
  • Recall of information: We are more likely to remember information we agree with than information we don’t. So even if you’ve heard bad news from someone on the project, if you believe all is going well, a few days later you’ll struggle to recall that you ever heard the bad news.

“Hot-Hand” Fallacy

In basketball, if a given shooter has made two three-pointers in a row, most spectators – and players – believe that player has the hot hand, and teammates will keep feeding her the ball. Statistical evidence shows, however, that even in the course of a single evening, shooting percentage tends to revert to the mean. The fact that a fifty-percent shooter made two in a row does not alter the likelihood of the next shot going in. It’s still fifty percent.

We see this in reverse as well. When a coin has landed heads eight times in a row, many people suspect it is now “due” to come up tails. Nope, still fifty-fifty. Actually, eight heads in a row could suggest that the coin is not weighted evenly. Probably not the case, but either way, tails is still no more than a fifty-fifty proposition.
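If you’d rather see the arithmetic than take it on faith, here is a minimal, purely illustrative Python sketch (nothing but a simulated fair coin and the standard library; the function name and counts are my own, not from any study) that checks how often heads follows a run of eight heads:

```python
import random

def heads_rate_after_streak(trials=1_000_000, streak_len=8):
    """Flip a fair coin `trials` times; whenever the previous `streak_len`
    flips were all heads, record whether the next flip is also heads."""
    run = 0          # current run of consecutive heads
    follow_ups = 0   # flips that immediately followed a full streak
    heads_after = 0  # how many of those follow-up flips were heads
    for _ in range(trials):
        flip_is_heads = random.random() < 0.5
        if run >= streak_len:      # the previous `streak_len` flips were all heads
            follow_ups += 1
            heads_after += flip_is_heads
        run = run + 1 if flip_is_heads else 0
    return heads_after / follow_ups if follow_ups else float("nan")

print(f"Heads rate right after 8 heads in a row: {heads_rate_after_streak():.3f}")
```

Run it a few times; the number hovers around 0.5, streak or no streak.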

We see this play out on projects when people stick to methods that appear to have worked in the past, even though those methods had no demonstrable effect on the outcome. (I wore a blue shirt and my team won, so I’ll wear my “lucky” shirt this weekend.)

“Hey, my projects usually turn out okay” may mean you’re a pretty good intuitive project manager. However, it may mean that your boss thinks you’re a terrible project manager and assigns you only the most foolproof projects. Or that your team is terrific and keeps bailing you out. Or even that your project coin has simply landed heads a few times in a row.

Gaining skills in project management – actually, in anything you do – increases the likelihood of success truly being a repeatable outcome. Even great “intuitive” project managers – and I’ve known a few – benefit from examining what they do on projects and trying to understand where their specific actions have made a positive difference.

Loss Aversion

  1. Here’s $20. Hold it in your hand. Now, want to give me $10?
  2. I have $20. I’m willing to give you $8. Good deal?

Of course, I’m not actually handing out twenty-dollar bills, but scientists (with research budgets) have actually done this. The results are what you might expect from my two scenarios: once people have the goodies, they’re far more reluctant to give part of them up than they would be to simply accept a partial amount while having nothing in hand.

The economist’s version of this bias is the sunk-cost fallacy. (They’re not exactly equivalent, but their relationship is close enough for this general overview.) “We have spent so much money on this project that we can’t afford to stop now.”

I often see these two related biases when consulting on projects that have stalled. (I have occasionally counseled teams on stalled projects, including a couple with millions of dollars in sunk costs, that the fastest way to the finish is to start over, with a fresh team and fresh eyes.) The project team keeps trying to “force” a solution by working harder at what they’ve already done, rather than stepping back and reevaluating. They don’t want to give up and “lose” all that hard work.

But I also see a form of loss aversion in individual tasks. A person stuck on a task too often continues to believe success is right around the corner, and so fails to ask for help or to back off and try a new approach. In a project, asking for help isn’t – or shouldn’t be seen as – a sign of weakness. Projects are delivered by project teams, and the team’s success is also the individual’s success.

I also see loss aversion from new (and too often not-so-new) managers. They have spent their careers to date doing individual tasks – e.g., lawyering – at a very high level. To succeed as managers, they’ll have to devote time to helping others on their team and to doing various new tasks. They fear, correctly but often without admitting it, that they will be less successful at these novel tasks, and so they cling to their old responsibilities, afraid to give up the rewards that come with completing them successfully. The upshot is that they make poor managers, because they continue to focus not on leading but on the tasks that got them to the point in their careers where someone suggested they start managing people or projects.

Finally, I also see loss aversion at play in deciding whether to learn about project management. “We’re up against deadlines, and everyone is swamped.” Okay, but if you took a day now to learn some new techniques, maybe that investment would be paid back with interest a few weeks down the road.

Next Step

Go read a book.

In this case, though, I’m not going to recommend one of my books (but, hey, go read them anyway). Rather, Daniel Kahneman’s Thinking, Fast and Slow is an easy, worthwhile read. There’s no math necessary, for one thing. For another, he explains these biases, and more, in a clear, detailed manner, providing both real-life examples and great stories from various experiments.

And of course, the answer to the question I posed at the start is d) All of the above.

Comments

  1. Hi Steven, great article (and long time no talk – we should fix that!)

    I have to say that I have kind of an odd view of Daniel Kahneman’s book. I found it to be perhaps the single most important book that I have ever read. I had a mentor who insisted that I read this book if I wanted to understand anything about critical thinking and he was absolutely right.

    But at the same time I have to say that I utterly HATED this book. Even though the book SHOULD have been easy to read, I found that the way Kahneman came across made it very hard for me to read. Well before I was done, I was just sick and tired of the author: his self-obsession, his humble-bragging, his need to repeatedly score personal paybacks on so many grudges.

    I wanted to recommend it to people, but how could I do so for a book that, even though I fully understood how life-changing it is, I actually kind of loathe?

    Then, a few years later Michael Lewis’ The Undoing Project came out, a book about . . . Kahneman’s book. Unlike Daniel Kahneman, I adore Michael Lewis as an author. The Undoing Project explains why Kahneman comes across so negatively. It sets the context that helps to make Kahneman seem, well, if not a particularly likeable person, at least not so awful as to make his book so difficult to get through.

    I guess you could call Michael Lewis’ book a game-changer for . . . a game-changer (“game-changer squared?”). So now I recommend to people that they read Lewis’ book first and then Kahneman’s.

    Take care,
    – Mike