
Tuesday 30 August 2011

Investment Analysts, Sunk By Deepwater Horizon

Flawed Analysis
“BP has a systemic problem with its culture that runs deep.”
There are unlikely to be any readers of this article unaware of the disaster that engulfed BP in 2010, when the Deepwater Horizon, the rig drilling the company’s Macondo well in the Gulf of Mexico, lived up to its name in dramatic and tragic circumstances. Since then many people have pointed fingers and made accusations, but the quote above, by Hersh Shefrin, comes from his book Ending the Management Illusion, which is based on an analysis of the behavioral flaws apparent in BP’s management and was written two years before the disaster.

Yet while Shefrin was able to identify the potential problems at BP ahead of the game, this neither alerted investors to the dangers of investing in the company nor jolted oil sector analysts into any kind of action at all. And, frankly, if analysts can’t tell the difference between an oil major taking excessive safety risks and one that isn’t, what the hell’s the point of them?

Warning Signs

After the disaster in the Gulf, Shefrin and his co-author Enrico Cervellati followed up with a paper entitled BP’s Failure to Debias: Underscoring the Importance of Behavioral Corporate Finance, in which they attempt to dissect BP’s decision-making processes. As Shefrin records:
“In my 2008 book, "Ending the Management Illusion," I profiled BP as an example of a company possessing virtually all of the psychological weaknesses of companies headed for disaster: excessive optimism, overconfidence, choosing high risk to avoid having to accept an unfavorable outcome and turning a blind eye to warning signals.”
Indeed, in the five years prior to the Gulf disaster BP had already been at the centre of two highly publicised safety failures: an explosion at its refinery in Texas City and a leak from an oil pipeline in Alaska. Both came with plenty of warnings which, for the reasons Shefrin identifies, BP’s management chose to ignore. Here’s Natalya Sverjensky:
“Between 1997 and 1998 alone, for example, BP was responsible for 104 oil spills in the Arctic. And in 2008, BP received the largest fine in the history of the U.S. Chemical Safety and Hazard Investigation Board: $87 million for failing to correct safety hazards revealed in the 2005 Texas City explosion.

As of June 2010, BP has had 760 such OSHA fines for "egregious, wilful" safety violations. Meanwhile Exxon Mobil has had just one.”

Analytic Oversight

So overall we have a company that seems to have a problem with safety, that’s beset by behavioral problems at multiple levels of its management, and that’s lagging behind its oil industry peers in these regards. It’ll come as no surprise to regular readers that this information was neither reflected in the market price of BP relative to its competitors nor identified in analysts’ reports. So much for efficient markets and information analysis.

In fact there are a range of known problems with analysts and the way they perform their roles. Ramnath and colleagues give a review of the literature on behavioral bias in analysts (see page 57 onwards, and settle in for a long read). Specifically, in James Randi and the Seersucker Illusion we noted the research indicating that their forecasts tend to cluster – they herd together: younger analysts in particular are punished for getting forecasts wrong, so they have strong career incentives to avoid standing out from the crowd. This McKinsey report, from 2010, suggests nothing much has changed in the past decade, despite lots of disconfirming evidence. Basically, analyst forecasts seem to be permanently biased in an upwards direction.

Analysing the Analysts

Shefrin and Cervellati’s analysis of the analysts is illuminating. They show that the degree of herding in forecasts increased after the disaster at Macondo and that the forecast share prices never fell below BP’s actual stock price. Prior to the explosion analysts were almost uniformly positive about the outlook for BP. As the authors state:
“In our view, availability bias and confirmation bias loom large. Analysts focus heavily on earnings trajectories and company narratives, as these are readily available and salient. It is well known in the behavioral finance literature that security analysts tend to rely on management’s stories. See Montier (2005).”
The paper by James Montier, The Seven Sins of Fund Management, is, as usual, wonderfully acute:
“The insistence of spending hours meeting company managements strikes us as bizarre from a psychological standpoint. We aren’t good at looking for information that will prove us to be wrong. So most of the time, these meetings are likely to be mutual love ins. Our ability to spot deception is also very poor, so we won’t even spot who is lying.”
The failure to seek disconfirmation of our pre-existing views – see Confirmation Bias, the Investor's Curse – seems to be a fundamental problem of the human condition, while the tendency to grab hold of salient information is ever-present, as we saw in It's Not Different This Time. These biases dog all of us in everyday life and are especially damaging for investors. However, you might hope that equity analysts would actually be better at dealing with them than the rest of us: otherwise, what’s the point of them?

Predictably Risky? 

What this research suggests is that the idea that Deepwater Horizon was an unforeseeable accident is almost certainly wrong. In investing there’s always an element of uncertainty: we can’t foresee that the Finance Director will abscond with the CEO’s husband and the company’s cheque book, that sanity will break out in Washington, or any of a million and one other events that might disrupt the smooth operation of our investments. However, Shefrin’s analysis points to the uncomfortable reality that BP’s safety record, and the behavioral problems with its management that it implied, made the company a less reliable and more risky investment than its competitors.

The disaster at Deepwater Horizon itself may have been the result of a myriad of interlocking events, none of which could individually have been foreseen to culminate in the eventual result. However, a behavioral analysis of BP prior to the event, looking seriously at the gap between the company’s rhetoric and its public record of safety violations, really should have provoked some cognitive dissonance, at least in the minds of professional analysts.

Yet the evidence seems to suggest that, with a few honourable exceptions, this didn’t happen because those analysts themselves were suffering from their own set of behavioral biases. Investors, meanwhile, were happy to piggyback on the research of others, blithely assuming that the risks associated with one oil explorer were much the same as with another.

A Personal Postscript

I can claim a personal involvement as I was, and still am, a shareholder in BP, bought at an apparent margin of safety in the depths of the crisis in 2008. Like many smaller investors, I would have found it wastefully expensive to carry out the analysis Shefrin has performed, but we all need to be acutely aware that every investment carries a tail-risk of behavioral management ineptitude that we can never properly assess. Still, adding "check the safety record compared to industry peers" to an investment checklist may save some investment capital in future.

The lessons here remain as clear as ever: don’t overpay and don’t under-diversify. And until we see some evidence that investment analysts are receiving training in behavioral analysis and being rated accordingly, we should assume that their inputs are about as useful as the stories emanating from company managements: unconsciously overoptimistic and oft-times inaccurate.

Related books:
- Ending the Management Illusion: How to Drive Business Results Using the Principles of Behavioral Finance
- A Hole at the Bottom of the Sea: The Race to Kill the BP Oil Gusher
- The Little Book of Behavioral Investing: How Not to Be Your Own Worst Enemy (Little Book, Big Profits)


