Thursday 25 June 2015

Corporate Bias: When Projects Go Bad

Bypass Surgery

As I recently blogged in Mobile Bypass Surgery For Banks over on Tomorrow’s Transactions, where the more technical side of my financial blogging occurs these days, the recent system outages at Royal Bank of Scotland, where customers were once again unable to access their own money, are just the tip of an industry-wide problem. But that’s only half the story, because technology failures are always mediated by psychological ones.

As we know well enough – the Big List of Behavioral Biases is testament to the issue – individuals suffer from a vast range of behavioral biases, with predictably depressing financial consequences. But similar problems also afflict corporations, because managers suffer from the same weaknesses as investors, and the results can be catastrophic for shareholders.

Sunk

That corporations can fall victim to the collective equivalent of individual behavioral bias is something we’ve looked at before, in Investment Analysts, Sunk By Deepwater Horizon; Hersh Shefrin’s masterly analysis of the problems in the Gulf of Mexico that brought BP to its knees:
“BP [w]as an example of a company possessing virtually all of the psychological weaknesses of companies headed for disaster: excessive optimism, overconfidence, choosing high risk to avoid having to accept an unfavorable outcome and turning a blind eye to warning signals”.

Shefrin argues that the Macondo oil spill wasn’t an unforeseeable Black Swan but an entirely predictable White Swan, and one that analysts and investors should have seen coming. That they didn’t is evidence of how dependent these groups are on being spoon-fed information by companies, and of how little truly independent thought goes on.

Failure, Failure, Failure

We can see the impact of corporate behavioral bias more clearly in major project failures, many of which would be terribly amusing were they not so tragic. A study of the systematic errors seen across a range of major project cock-ups – the Airbus A380 (the wiring didn’t fit in the airframe), the US Coast Guard Maritime Awareness system (it thought waves were boats), the Columbia Shuttle (see Risk, Reality and Richard Feynman), the Denver Baggage Handling system (it couldn’t handle baggage), the Mars Climate Orbiter (it failed because different engineering groups muddled up imperial and metric units), Merck’s Vioxx arthritis drug (it had the side-effect of inducing heart attacks), Microsoft’s Xbox 360 (it chewed up discs and had a tendency to catch fire) and New York’s Subway Communications System (you couldn’t hear the person at the other end of the line) – revealed a small number of common problems.
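The Mars Climate Orbiter is a neat illustration of how a purely human miscommunication surfaces as a technical failure. Here’s a stylised sketch of the failure mode (the names and figures are invented for illustration, not taken from the actual flight software): one team’s code reports thruster impulse in pound-force seconds, another’s silently assumes newton-seconds, and nothing in the code complains.

LBF_S_TO_N_S = 4.448  # 1 pound-force second is roughly 4.448 newton-seconds

def reported_impulse():
    # Ground software: the value is in pound-force seconds, but nothing in
    # the name, the type or the interface document says so.
    return 100.0

def update_trajectory(impulse_n_s):
    # Flight software: assumes SI units throughout; a stand-in for the
    # real navigation update.
    return impulse_n_s

wrong = update_trajectory(reported_impulse())                 # off by a factor of ~4.45
right = update_trajectory(reported_impulse() * LBF_S_TO_N_S)  # what was intended
print(wrong, right)  # the same number, read in two different ways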

In Systematic Biases and Culture in Project Failures, Barry Shore identified the main recurring biases in these projects as conservatism, illusion of control, selective perception and sunk cost, with honorable mentions for groupthink and over-confidence. We’ve met most of these before, but selective perception is an interesting issue – it’s where different people perceive the same situation differently, but don’t realise it: they’re talking the same language, but they mean different things.

Checks and Balances

Andy Haldane, the Chief Economist of the Bank of England, also addressed this idea of corporate bias in a speech on Central Bank Psychology last year. He notes a range of potential problems – preference biases, myopia biases, hubris biases and groupthink – as key issues for institutions, and goes on to discuss how the Bank of England is actively seeking to protect itself against them in the light of the regulatory failures leading up to the crash of 2008.

Haldane points out that the creation of strong institutions is, in itself, a defence against human weaknesses:
“Historically at least, the antidote to these preference problems is typically found in institutions – property rights, the rule of law, democratic processes. Strong institutions have often laid the foundation on which nations have been built. Typically, these institutional structures comprise an ex-ante mandate (agreed by society) and ex-post accountability mechanisms (assessed by society). Colloquially, these are often called checks and balances”.
These checks and balances are intended to prevent individuals’ self-interest from leading to autocracy and corruption, and they tend to arise in societies that achieve some kind of uneasy balance between the various arms of government, with the intention of preventing any one individual or group assuming absolute power. In political terms we call the outcome of this “democracy”. That’s one reason that countries with strong institutions tend to be less corrupt and more likely to adhere to the rule of law; and, incidentally, tend to offer investors a better home for their money.

Anecdotal Evidence

However, as both Shore and Haldane suggest, even good institutions can fall victim to implicit biases, and each institution needs to create a culture that seeks to debias based on its specific circumstances. What’s interesting is that these authors identify significantly different causes. Haldane’s list of issues is based on the historical research into behavioral bias, but Shore’s is the first (as far as I know) to try to look at the empirical evidence for what actually goes on.

Shore’s study is really nothing more than a series of anecdotes, and is far from conclusive, but I’m fascinated by the conclusions, because although they’re not what you might expect coming from a research background, they accord precisely with my personal experience. As I continually caution people against relying on observational evidence, you’ll hopefully treat this statement with the scepticism it deserves, but unexpected outcomes are always worthy of investigation.

Sunk Again

Sunk cost is perhaps the most interesting cause of poor decision making. In The Psychology of Sunk Cost, Hal Arkes and Catherine Blumer:
“Found that those who had incurred a sunk cost inflated their estimate of how likely a project was to succeed compared to the estimates of the same project by those who had not incurred a sunk cost”.
Once you’ve invested a vast amount of time and money and mental effort in a specific project or system, then stopping is really difficult. Interestingly, Merck, following their fiasco with Vioxx, started rewarding researchers with share options for canning projects. The idea, of course, is to provide people with an incentive to avoid the sunk cost fallacy, and it’s the kind of approach that more institutions need to build into their processes.
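To see why the bias matters, here’s a stylised sketch of the continue-or-cancel decision (the figures are invented, not drawn from any real project): the rational rule ignores what has already been spent and compares only the expected future payoff against the cost still to come, which is exactly the comparison the sunk cost effect corrupts.

def should_continue(expected_payoff, p_success, remaining_cost):
    # Rational rule: continue only if the expected future payoff exceeds
    # the cost still to be incurred. Money already spent doesn't appear.
    return expected_payoff * p_success > remaining_cost

already_spent = 50   # sunk - deliberately unused in the decision
print(should_continue(expected_payoff=60, p_success=0.4, remaining_cost=30))
# False: 0.4 * 60 = 24, less than the 30 still required, so a clear-headed
# manager cans the project. The effect Arkes and Blumer describe is that,
# having spent the 50, decision makers quietly revise p_success upwards
# until the answer flips to True.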

Debias or Glitch

The trouble with this is that we need more data to decide what the major issues are, and how they can be debiased. The empirical evidence on this is scant, and without it all we can do is speculate. The Bank of England is building defences against the behavioral biases it expects to find – but it’s not obvious that these are the ones that are really affecting its decision making.

To come full circle, back to the technical problems at the banks: they aren’t going to solve their outages until they can find a way of addressing the underlying psychology. And if we’re going to invest in complex companies that are critically dependent on technology, we need to understand what they’re doing about these issues – otherwise we’re likely to suffer our own, financial, glitches.

Sunk Cost and Selective Perception added to The Big List of Behavioral Biases.
