Aaron Pressman at Reuters had a piece last week that looked at how, to a man, the analysts covering BP (BP) in the aftermath of the Deepwater Horizon disaster could have gotten things so wrong.  As oil spilled into the Gulf and the company’s stock continued to plummet, the vast majority of analysts continued to rate the company a “Buy.”  It wasn’t until June that many analysts began downgrading the stock.  Pressman writes:

How could so many analysts have gotten the call so wrong? Of course, to err is human. And Wall Street is also prone to herd-like tendencies. But some experts say the unanimity of error around the BP blow-up also has exposed — yet again — the conflicts and weaknesses that still bedevil the sell-side analyst community, despite a decade of much-heralded reform.

Then again, we probably shouldn’t be all that surprised by this.  The sell-side has been a topic of derision ever since the bursting of the Internet bubble.  Felix Salmon, also at Reuters, writes:

Sell-side analysts live in mediocristan, and are prone to being blindsided by the unexpected; they almost never, for instance, recommend negative-carry trades. Investors, if they’re any good, know this. No one ever made money by blindly following sell-side advice, and so we should hardly be surprised that people whose position coincided with the sell-side consensus ended up losing a lot.

However, this case is a good illustration of something else:  the Dunning-Kruger effect.*  To wit:

When people are incompetent in the strategies they adopt to achieve success and satisfaction, they suffer a dual burden: Not only do they reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the ability to realize it.

Errol Morris, writing at the New York Times, has a piece well worth reading that explores this effect, including an interview with David Dunning.  The idea that there are “known unknowns” and “unknown unknowns” runs throughout the piece.  This is not entirely dissimilar from the distinction between risk and uncertainty.  There are ideas and information so far removed from our normal state of awareness that we remain completely unaware of them until they pop up.  Morris writes:

Put simply, people tend to do what they know and fail to do that which they have no conception of.  In that way, ignorance profoundly channels the course we take in life.  And unknown unknowns constitute a grand swath of everybody’s field of ignorance.

Coming back to how this relates to the topic at hand: deepwater drilling is a novel technology that heretofore had not experienced a catastrophic failure.  Therefore, trying to estimate the potential damages from this accident at that early date was at best an exercise in dart-throwing.  Joshua Brown at The Reformed Broker has it about right:

So now you take a scenario like BP where, in truth, no one has any clue what the damage could be, how much the disaster may cost, who is on the hook for the cleanup, etc.  It’s all unprecedented.  For a fundamental analyst to step up in the midst of all the uncertainty and pretend like their “models” have an answer is the height of slapstick-comedy-masquerading-as-research.

There is nothing wrong with saying you don’t know something.  After reading Dunning-Kruger, one could argue that acknowledging one’s lack of knowledge is in fact a sign of intelligence.  This is not just a BP issue; it pervades the modern world, including the world of finance.  There is simply too much out there that we don’t know to state with confidence what is at best conjecture.

*Hat tip to Jason Kottke for pointing us to the Errol Morris piece.