The Top Three Bias Pitfalls in Capital Planning

Sept. 19, 2011
There is an old saying: “There is a sucker at every poker table; if you can’t spot them, it’s likely you.” This, at its core, describes the bias blind spot that everyone shares. The question isn’t, “Are we biased?” We are. The better question may be, “How are our environment, and the system of positive and negative reinforcements within it, shaping our perspective?” These influences define our preferences and thus the way we think, feel and respond to information. It is difficult to gain objectivity about these influences on ourselves, and so bias becomes a term reserved for the distortion in how “others” process information.

When individual decision-makers within an organization or governing body come together to make choices, the interplay of biases can create factions and alliances that stalemate progress or hijack agendas. To break logjams or neutralize groupthink, the process can become increasingly autocratic. Those in agreement with the leader grow in confidence and satisfaction, while those opposed feel disempowered and grow passive-aggressive. No one wins.

Three common cognitive biases can have a significant impact on capital allocation decisions: the Framing Effect, Confirmation Bias, and the Planning Fallacy. Here’s how to spot them, and how to minimize their effect.

Framing Effect: The way a question is framed can have a major impact on our choices. Implied certainty and positive or negative language can sway opinion. For example: “Would you prefer option A, which has a one million dollar net present value (NPV)? Or option B, which has a 33 percent chance of a three million dollar NPV and a 67 percent chance of a zero NPV?” While both projects have roughly the same expected value, option B appears riskier. To counteract the framing effect, challenge assumptions and reframe options to explore the proposition and break through the anchoring that the frame imposes.
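The arithmetic behind the framing example can be checked directly. The figures below are the hypothetical NPVs from the text; the helper function is a sketch, not part of any particular planning tool.

```python
def expected_value(outcomes):
    """Sum of probability-weighted payoffs for a list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

option_a = [(1.0, 1_000_000)]              # certain $1M NPV
option_b = [(0.33, 3_000_000), (0.67, 0)]  # 33% chance of $3M NPV, 67% chance of $0

ev_a = expected_value(option_a)  # 1,000,000
ev_b = expected_value(option_b)  # about 990,000 -- essentially the same
```

Despite near-identical expected values, the explicit mention of a 67 percent chance of nothing makes option B feel like the gamble; the frame, not the math, drives the reaction.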
Confirmation Bias: Selectively choosing information that supports existing assumptions and beliefs is common; our minds seek patterns and similarities. In experiments, a group given the sequence “2, 4, 6” almost all converge on the rule “increase by two,” when in fact the rule being tested is simply “any ascending sequence.” Consider comments like, “Didn’t we get a positive result from that test on option A?” and, “We tried a similar approach to option B before and it was a disaster!” The parties to the decision have begun trying to substantiate the framing of the decision. To combat this effect, assign a devil’s advocate within the group to express counterpoints, introduce outsiders who are not vested in the outcome, or use a structured approach to surface different opinions and contrary positions.

Planning Fallacy: The planning fallacy is familiar to anyone who has read an article or heard complaints about projects running years behind schedule and millions (or even billions) of dollars over budget. The famous Sydney Opera House was expected to take five years from its start in 1959 and $7 million to finish. When the project was completed in 1973, 14 years later, it carried a price tag of $107 million, roughly 15 times the budget. Even the most conservative estimates are often optimistic. To manage the planning fallacy, make best-, expected- and worst-case estimates of project cost and timing. Then treat the worst case as the best case and see whether it would change the decision. If performing an NPV or schedule simulation, run it with a range of values for the cost and time variables; then run it again for comparison, with ranges approximately twice as wide as you initially believed them to be.

Biases can distort our perceptions and cultivate wishful thinking. The cascading effect of a problematic frame, talking ourselves into the wished-for benefits and planning optimistically can produce poor results.
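The range-doubling stress test suggested for the planning fallacy can be sketched as a small Monte Carlo simulation. The benefit figure, cost range and triangular distribution below are illustrative assumptions, not figures from the text; the point is only the mechanic of running the model once with your stated ranges and again with ranges twice as wide.

```python
import random

def simulate_npv(benefit, cost_range, n=10_000, seed=42):
    """Average NPV over n trials, drawing project cost from a
    triangular (best, expected, worst) range and subtracting it
    from a fixed benefit."""
    rng = random.Random(seed)
    low, mode, high = cost_range
    draws = (benefit - rng.triangular(low, high, mode) for _ in range(n))
    return sum(draws) / n

def widen(cost_range, factor=2.0):
    """Stretch a (best, expected, worst) range about its expected
    value by the given factor."""
    low, mode, high = cost_range
    return (mode - factor * (mode - low), mode, mode + factor * (high - mode))

benefit = 12_000_000                        # illustrative fixed project benefit
costs = (8_000_000, 9_000_000, 13_000_000)  # best, expected, worst case cost

base = simulate_npv(benefit, costs)
stress = simulate_npv(benefit, widen(costs))  # ranges twice as wide
```

If the decision that looked attractive under `base` flips sign under `stress`, it was resting on the optimism of the original estimates rather than on the project’s fundamentals.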
Watching for these common effects and being aware of their pitfalls allows decision-makers to implement countermeasures and helps ensure they are pursuing the best course of action.

Kevin Connor is Vice President of Decision Lens’ Solutions Group. He can be reached at [email protected]