Thursday, 9 February 2012

Backfiring Investment Theories

Holy Theory

We’ve met the nasty nature of confirmation bias on several occasions. This is the twisted trait that sees us in thrall to the views of others around us, and helps us encourage them to maintain those views, regardless of anything resembling reality. But this is only the half of it.

No, we also have to cope with the impact of information that blows huge great holes in our world views and which undermines our most devoutly held beliefs. When faced with unequivocal evidence that shows that our favourite investment ideas are wrong we do what you really ought to expect by now: we don’t so much ignore it as twist it to our own predetermined ends. Folks, meet the wonderfully counter-intuitive backfire effect.

Conspiracy Theory

We’re all familiar with conspiracy theories. Many people know that man never set foot on the Moon, that the Fed is designed to impoverish the hard-working middle classes, that Pearl Harbor was known about in advance, that Princess Diana was assassinated on the orders of the British Royal Family and that President Obama is a non-American or a Muslim, or both.

What’s especially interesting about conspiracy theories is how hard they are to extinguish with actual evidence. Or, to be precise, how impossible it is to ever stop them being believed. There is simply no amount of detail that will ever satisfy a determined conspiracy theorist. After all, if the President won’t publish his birth certificate then this is evidence that he was born outside the USA. Although if he does, this doesn’t count since it’s obviously fake. As if the combined resources of the US President’s office couldn’t create a convincing fake birth certificate. But, of course, if it was perfect then that would be evidence it’s fake, because it couldn’t be. And so on, regressing ad infinitum into a Slough of Despond.

UFO Theory

In fact, generally speaking, the harder you try to quash a rumour of this kind the more you fuel it. Why, after all, would the US military conduct an investigation into UFOs if they didn’t exist? Although, of course, the same US military appears to have funded groups to try and kill goats by staring at them and attempted to develop the so-called "gay bomb" intended to make enemy troops irresistible to each other, so you can't really be sure...

The backfire effect predicts that if you present some people with evidence that contradicts their strongly held beliefs it will reinforce their existing opinions, rather than changing them. Research by Brendan Nyhan and Jason Reifler on the effect in politics has resulted in some quite startling results. They point out that mostly we don’t receive evidence that contradicts our pre-existing opinions in a vacuum:
“Authoritative statements of fact (such as those provided by a survey interviewer to a subject) are not reflective of how citizens typically receive information. Instead, people typically receive corrective information within “objective” news reports pitting two sides of an argument against each other, which is significantly more ambiguous than receiving a correct answer from an omniscient source. In such cases, citizens are likely to resist or reject arguments and evidence contradicting their opinions.”
WMD, Theory?

So the researchers went and performed a series of experiments that largely confirmed the backfire hypothesis. Presenting conservatives with evidence that Saddam Hussein did not have weapons of mass destruction before the second Iraq war made them even more convinced that he did. Similarly, supporters of President Bush’s tax cuts remained convinced that the administration’s tax cuts had boosted government revenue, even when shown evidence that they hadn’t. On the other hand, a similar experiment erroneously suggesting that President Bush had banned all stem cell research didn’t produce a backfire effect, leading the researchers to muse that this may be because conservatives are more dogmatic than liberals.

As the research points out, these experiments are embedded in the context of their time: in essence they’re virtually unrepeatable. So, it’s pretty unlikely that any significant group of people still believe Iraq had WMDs, although back in 2005, when the original experiment was conducted, many still did. This makes studying the effect difficult.

Testable Theory

The very difficulty in studying it makes the backfire effect suspect: after all, if the effect itself can’t be pinned down isn’t it possible that it really doesn’t exist? Well, the theory that lies behind this is known as motivated reasoning, and essentially argues that emotion drives reasoning, such that if we are emotionally disposed to believe something no pesky data that suggests the opposite is going to dissuade us. As David Redlawsky, who has been researching these issues for a while, states (in the context of President Obama, once more):
"The simple reality is people feel before they think. And when those feelings are strong enough, facts take a back seat.”
In fact, the way that people form beliefs isn’t really understood, but it seems to be the case that we need to believe something in order to understand it. Which means that in order to analyse two contradictory positions we need to believe them both: a generally difficult proposition. In You Can’t Not Believe Everything You Read Daniel Gilbert sets out the problem:
“For example, if a person is told that lead pencils are a health hazard, he or she must immediately believe that assertion and only then may take active measures to unbelieve it. These active measures require cognitive work (i.e., the search for or generation of contravening evidence), and if some event impairs the person's ability to perform such work, then the person should continue to believe in the danger of lead pencils until such time as the cognitive work can be done”.
Spinozian Investment Theory

This is the view of Baruch Spinoza, which leads us to the position that the backfire effect is a response to not wanting to believe: because if belief must precede understanding, then an idea abhorrent to your existing beliefs will at best be ignored and at worst traduced. For investors this is a dangerous position: if you have to believe everything in order to analyse it then you’ll spend your life chasing false propositions, but if you simply refuse to face potential issues because you don't want to believe them you’ll likely miss the big changes when they come. Unfortunately the investing business is a whirling mass of conspiracy theories, cunningly labelled “investment analysis”.

The lessons from Gilbert et al are that you need to take time and care to analyse your investment decisions and that you need to keep the number of decisions you need to make to a sensible level: or simply follow a strategy that doesn’t require you to make many decisions at all. Above all, though, you need to develop testable hypotheses, rather than uncritically accepting – or rejecting – ideas based on preconceived belief systems. Otherwise, you’ll just find yourself following every vague rumour and conspiracy theory around. Which, as an investment strategy, is simply bound to backfire.



1 comment:

  1. This seems to be approaching cognitive dissonance, where, on being forced to challenge a deeply held belief, we either reinforce our belief or convince ourselves that we never held it in the first place.