Consumers of environmentally friendly products are more likely to steal, and advisers who disclose their conflicts of interest are more likely to do the same – although they’ll call it something different, of course. Performing a morally good action often gives us spurious moral justification for doing something bad.
Disclosure is the lawmaker’s go-to action when they need to be seen to do something, because it costs little and is easy to mandate. Unfortunately, it comes with a slight problem: it doesn’t work. Conflicts of interest need to be avoided, not managed.
A study by Nina Mazar and Chen-Bo Zhong, Do Green Products Make Us Better People?, found that people who bought more environmentally friendly products were more likely to steal or cheat than people who bought conventional products. The hypothesis for this, in line with many other similar studies, is that having established (at least to themselves) their impeccable moral credentials, they then promptly give themselves license to behave immorally: an effect known as moral licensing. As in so many areas of psychology, there’s no give without take and no action without a corresponding unintended consequence.
From this point it’s a short jump to considering the effect of disclosure on a financial adviser. After all, disclosure of a conflict of interest, even when legally mandated, is an unambiguously positive moral act, isn’t it? And once you’ve done something good, as our green consumers show, it’s perfectly OK to do something morally dubious, isn’t it?
Now, boys and girls, what do you think the naughty financial adviser is likely to do with their license to misbehave?
And The Point Is?
In fact we’ve looked at the problem of disclosure before, in Disclosure Won’t Stop a Conflicted Adviser, where we saw that disclosure actually tends to make advisers behave worse than before. Which is a rather unfortunate result of an action intended to make advisees’ lives easier: instead, of course, it has the unintended consequence of making them worse.
The Cain, Loewenstein and Moore study we quoted in that previous article has now been backed up by a couple of empirical studies of what actually happens in the real world. And it appears that the theory is correct. Not only does biased advice remain biased, it gets more biased, because the advisers now have the moral high ground: after all, if the client knows they’re conflicted, it’s the client’s job to figure out the bias.
Which rather leaves you wondering what the point of advisers is.
In Biased Advice Christopher Robertson finds that:
“A primary finding of the present study is that a disclosure mandate improves layperson performance when unbiased advice is available too, as may be true in many market settings. A second opinion from an unbiased advisor is a much better remedy for biased advice than disclosure.”
Robertson’s study provides a whole host of interesting findings. One in particular is worth noting:
“Across all conditions of the study, there is no relationship between the average accuracy of laypersons’ judgments and their average expressed confidence in those judgments. The participants apparently had no idea as to whether they were doing well or poorly. In contrast, one would have hoped that those in the inaccurate conditions, such as those with no advisor or an advisor with a disclosed conflict of interest, would express low confidence, such that they might be willing to pay a premium to move to a more accurate condition. This was not the case.”
Basically, the advisees had no idea whether they needed advice or whether the advice, if they got it, was worth having. It almost makes you feel sorry for advisers.
The finding about the impact of disclosure is exactly in line with another study by Cain, Loewenstein and Moore – When Sunlight Fails To Disinfect – in which they suggest several reasons for the widespread popularity of disclosure as a method of dealing with bias. In particular they flag up the Chicago Theory of Regulation:
“Which posits that regulation typically exists not for the general benefit of society but for the benefit of the regulated groups. These entities might be aware of the ineffectiveness of disclosure but accept it because it benefits them.”
After all, it’s a lot easier to agree to accept a vaguely regulated disclosure requirement than it is to enforce regulations requiring that advisers be paid independently of their recommendations, or can only offer advice in a blinded situation. As the researchers go on to argue:
“Even if disclosure does no direct harm (e.g., if it does not morally or strategically license bias), it can have a pernicious effect if it substitutes for more-effective regulations, thereby morally licensing policy makers to not take more substantive measures to deal with conflicts.”
The idea that your financial adviser may owe you less of a duty than the man (or woman) who mends your faucet is ever so slightly worrying – an idea we explored in Mind the Chastity Belts: Fiduciary Duties and 900 Pound Lemmings. But even those who have fiduciary responsibilities – and who are therefore legally obliged to look after your best interests – are biased. They suffer from defensive decision making, and tend to look for safe options that they can justify if they get sued. All of which rather argues that we probably need to look askance at any adviser – which is a bit of a Catch-22, to be sure.
A couple of potential solutions suggest themselves. Firstly, advisers can be blinded – an idea Robertson has developed in a legal context – such that they have no idea whether they will benefit personally from the advice they offer. This isn’t as odd as it sounds: if you have a pool of advisers who are assigned to clients randomly, so that they can’t know whether they will earn fees from the advice they’re giving, then the conflict of interest is obscured.
The alternative is to get a second opinion. This, of course, is good practice anyway when dealing with experts, even in fields like medicine where the practitioners are (hopefully) properly qualified. If we can’t personally judge the quality of the advice given – and the evidence strongly suggests we can’t – then we can at least try to get multiple points of view and see if there’s any agreement. Of course, that probably won’t come cheap.
The complex interplay between advisers and conflicts of interest is the usual mess of confusion that arises wherever incentives are involved. Sadly there is, in the end, no alternative to doing our own heavy lifting, even if this involves using an adviser. The best advice, as ever, comes from Charlie Munger:
“You can hire your adviser and then just apply a windage factor, like I used to do when I was a rifle shooter. I’d just adjust for so many miles an hour wind. Or you can learn the basic elements of your advisor’s trade. You don’t have to learn very much, by the way, because if you learn just a little then you can make him explain why he’s right.”
Appointing an adviser doesn't mean you can stop thinking. It just changes what you need to think about.
Moral licensing added to The Big List of Behavioral Biases.