
Wednesday 3 November 2010

Risk, Reality and Richard Feynman

Rocks and Risk

One of the problems that lie behind many of the crises that have afflicted the financial sector over the past thirty years or so is that business managers seem to have difficulty relating the level of risk they’re taking to anything akin to reality. For various reasons, not entirely unrelated to the need to make business cases look reasonable, the potential risks in many transactions are downplayed to the point where they simply vanish.

This isn’t, however, a problem exclusive to the financial sector. Many other organisations face these problems as their managers are squeezed between rocks and hard places. As NASA, and its astronauts, have found to their cost.

Challenger

In 1986, in the wake of the destruction of the space shuttle Challenger shortly after take-off, the Nobel Prize-winning Richard Feynman, one of the minds behind quantum electrodynamics, was appointed to the Rogers Commission investigating the disaster. Typically, he mounted his own, one-man investigation of what really went wrong.

To non-scientists Feynman is justly famous for figuring out the problem that caused the destruction of Challenger – the inability of the rubber O-rings sealing the joints of the solid rocket boosters to respond to the flexing of those joints at low temperature. However, he did something equally important, if less publicised, during the investigation by showing how the real – and high – risks of a disaster were transmuted through organisational management into something more "acceptable".

Risk or Reality?

Here’s Feynman, in his Personal Observations on the Reliability of the Shuttle, on the critical point:
“They [NASA] therefore fly in a relatively unsafe condition, with a chance of failure of the order of a percent (it is difficult to be more accurate).

Official management, on the other hand, claims to believe the probability of failure is a thousand times less. One reason for this may be an attempt to assure the government of NASA perfection and success in order to ensure the supply of funds. The other may be that they sincerely believed it to be true, demonstrating an almost incredible lack of communication between themselves and their working engineers.”
Now, as anyone dealing in computer modelling knows, it's important to try to calibrate the outputs of the models against the real world, to get a general feeling for how accurate the results are. Putting Feynman's findings into perspective: according to NASA’s figures you could launch a space shuttle every day for three hundred years and expect only a single failure; according to Feynman you wouldn’t make a hundred launches before disaster struck. The difference isn’t so much striking as simply astonishing. In fact anyone with any understanding of complex systems – and the shuttle was an incredibly complex system – knows without even analysing the data that the official number was, literally, unbelievable.
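
To make the comparison concrete, here’s a quick back-of-the-envelope sketch in Python; it’s my own illustration, not a calculation from Feynman’s report, and uses only the two per-flight figures quoted above:

```python
# Rough comparison of what each per-flight failure estimate implies
# over a long programme of launches (figures as quoted in the text).

def programme_risk(p_fail, launches):
    """Chance of at least one loss, and expected losses, over a number of flights."""
    p_at_least_one = 1 - (1 - p_fail) ** launches
    expected_losses = p_fail * launches
    return p_at_least_one, expected_losses

daily_for_300_years = 300 * 365   # roughly 109,500 launches

for label, p in [("management, 1 in 100,000", 1e-5),
                 ("engineers, 1 in 100", 1e-2)]:
    _, expected = programme_risk(p, daily_for_300_years)
    print(f"{label}: ~{expected:,.0f} expected losses over 300 years of daily flights")

# Under the engineering estimate a loss is more likely than not
# well inside the first hundred launches.
p_loss_in_100, _ = programme_risk(1e-2, 100)
print(f"Chance of at least one loss in 100 launches at 1% per flight: {p_loss_in_100:.0%}")
```

On the official figure you’d expect roughly one loss in three centuries of daily flights; on the engineering figure the odds are better than even that a vehicle is lost before the hundredth launch.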

Financial Failure

Now you may have a sense of déjà vu about these numbers. Back in 2007 we apparently experienced a once-in-a-hundred-thousand-year occurrence of risk model failures. Indeed the CFO of Goldman Sachs explained that they were seeing “25 standard deviation moves, several days in a row”. Which, as the paper How Unlucky is 25-Sigma? explains, suggests a reason why Goldman may have been so unlucky – their CFO didn’t understand their risk models:
“On February 29 2008, the UK National Lottery is currently offering a prize of £2.5 million for a ticket costing £1. Assuming it is a fair bet the probability of winning the lottery on any given attempt is therefore 0.0000004 ... and the probability of a 25 sigma event is comparable to the probability of winning the lottery 21 or 22 times in a row”.
More succinctly, the authors add:
“We suspect … that the estimate of a 25-sigma event being on a par with Hell freezing over is just about right”.
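
The arithmetic behind that comparison is easy to check. The short Python sketch below is my own illustration rather than the paper’s workings: it takes the normal-distribution assumption baked into the risk models at face value and sets the resulting 25-sigma probability against the lottery odds quoted above:

```python
import math

# One-tailed probability of a move at least 25 standard deviations out,
# assuming returns are normally distributed (the assumption built into the models).
p_25_sigma = 0.5 * math.erfc(25 / math.sqrt(2))   # about 3e-138

# Lottery odds quoted in the paper: a fair £1 bet on a £2.5 million prize.
p_lottery = 1 / 2_500_000                          # 0.0000004

print(f"P(25-sigma move)           : {p_25_sigma:.2e}")
print(f"P(winning lottery 21 times): {p_lottery ** 21:.2e}")
print(f"P(winning lottery 22 times): {p_lottery ** 22:.2e}")
# The 25-sigma probability sits between the two, as the paper says;
# Goldman reported several such moves in a single week.
```

The tail probability comes out at around 3×10^-138, squarely between the odds of winning the lottery 21 and 22 times in a row.
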
Risk in the Real World

In any case, even if the risk models did predict a once-in-a-hundred-thousand-year failure, a mere glance at the history books would show that the models were severely underestimating the probability of a major problem. The inability of senior managers to grasp the real meaning of the risk figures they quote is seriously worrying: after all, if they can’t figure out that the numbers they’re being given by their underlings are at odds with the realities of time and space then who, exactly, is managing these risks? The answer to this conundrum, of course, is that the numbers they’re quoted are the ones they want to hear, because they’re the ones that make sense of their business models.

In the dim and dark world of finance it’s hard to trace how risk models evolved from tools to help managers manage risk into the basis for deciding whether excess risks were being taken. But with that transformation would have come management pressure to make the models’ hurdles easier to overcome, and with it a gradual degradation in the models’ connection to anything resembling reality.

Risk in NASA's World

Now, in the circumstances surrounding the Challenger enquiry, Richard Feynman had a pretty free hand to investigate and, unlike the more opaque recent financial disasters, was able to come to some pretty robust conclusions about how an actual 1 in 100 chance of failure got transmuted into a public statement that disaster was virtually impossible. What he found was that at the engineering level the staff were pretty much agreed that the probability of a failure was much, much higher than the figures used by management.

His conclusion on this was profound, and disturbing:
“If a reasonable launch schedule is to be maintained, engineering often cannot be done fast enough to keep up with the expectations of originally conservative certification criteria designed to guarantee a very safe vehicle. In these situations, subtly, and often with apparently logical arguments, the criteria are altered so that flights may still be certified in time.”
One logical step at a time the risk figures moved away from reality, away from anything genuinely grounded in analysis. Even worse, each successive flight in which nothing went wrong was then used as justification that there was nothing to go wrong. Every step to hell is paved with management’s intentions.
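
It’s worth seeing how little such a justification is actually worth. In the rough Python sketch below, which uses my own numbers purely for illustration, a clean run of flights like the two dozen that preceded Challenger’s final mission turns out to be entirely consistent with both the engineers’ 1-in-100 estimate and management’s 1-in-100,000 one:

```python
# How much does a run of successful flights tell us about the per-flight risk?
# Probability of observing n consecutive successes under each estimate.

def p_all_successes(p_fail, flights):
    return (1 - p_fail) ** flights

flights_before_loss = 24   # Challenger's final mission was the 25th shuttle flight

for label, p in [("engineers, 1 in 100", 1e-2),
                 ("management, 1 in 100,000", 1e-5)]:
    print(f"P({flights_before_loss} clean flights | {label}) = "
          f"{p_all_successes(p, flights_before_loss):.2f}")

# Roughly 0.79 versus essentially 1.00: a spotless record is what you'd
# expect under either estimate, so it carries almost no information
# about the underlying risk.
```

A spotless record, in other words, couldn’t distinguish between the two figures, let alone justify the safer one.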

Making the Numbers

The point of this anecdote – which I’m sure you’ve already taken – is that although the risks being run were well understood within the ranks of the staff, that understanding got modified as it moved up the management hierarchy in order to meet the overall needs of the organisation. In the end, it wasn’t the actual risks involved in flying the shuttle that were driving NASA, it was management’s requirement to make the launch schedule that was driving the risk figures.

This is an experience many workers in financial institutions may know well. When there’s a mismatch between the actual risks being taken and the organisational imperative to make earnings forecasts, the unspoken pressure to cut corners, take more risks and justify it all through manipulation of the numbers can become overwhelming. Worse, the move away from reality can happen in such small steps that it’s impossible to spot while it’s in progress.

Nature Cannot Be Fooled

Now it can be argued that to compare the deaths of the Challenger astronauts with the various financial collapses we’ve experienced is to examine chalk and declare it’s not cheese. This may be so, but the underlying principle is the same. Although many commentators have focused on the failure of the financial industry’s risk models, there’s little recognition of the path of transmission through which this failure occurred. Models have their place in proper risk management processes, but solely blaming them when trouble strikes is, at best, ignorance.

The lesson for investors is to avoid corporations that deal in moral hazard in this way, because they’re simply sitting on the beach telling the sea to stay out. Feynman ended his story with a pithy reminder of what we’re really dealing with when we deal with risk:
“For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”

Related articles: Risky Bankers Need Swiss Cheese Not VaR, The Lottery of Stockpicking, Mandelbrot's Mad Markets
