



Published: 04/04/2016
Phil used to be a very senior financial executive. When asked for a number, he would typically give a rough ballpark answer, such as, “It’s about 5 percent.” He’d then be peppered with questions about how he had arrived at that figure.
After a while, he got tired of this questioning and started to bring a stack of financials with him to every meeting. From then on, instead of giving an approximate but effectively accurate answer, he would turn to his printout, thumb through the pages, point at random to a specific line, and say, “It is 4.96 percent.” The questions stopped. The oracle had spoken.
Phil’s experience is not unique. Humans tend to dislike uncertainty. For example, many people are happy to play roulette despite its inherent risk and negative expected value, but few are willing to take a wager whose odds are not clearly defined, even if they can choose which side of the gamble to take.
Models, particularly those with a veneer of complexity and sophistication, cater to this aversion. Various academic studies suggest that seemingly more precise numbers act as more potent anchors, and that when given more information, people become even more confident in their judgments even though their actual performance does not improve. Data do not guarantee knowledge.
Looking at Phil’s 4.96-percent figure, we are all the more likely to anchor strongly on that view and to feel even greater overconfidence in the precision of the measure. In fact, many people don’t want to be bothered with details, particularly when data go against their prior beliefs. For example, studies are less likely to change people’s minds when they provide more information about how they were conducted. Knowledge can be a curse.
Naturally, this has implications for risk management. Risk modeling has made incredible strides, particularly in financial markets, and experts have far more sophisticated indicators of their risk positions than ever before. Along the way, however, businesspeople stopped being able to follow conversations built on esoteric mathematical concepts. If you can’t convince, confuse.
But even simpler things are being missed in the conversation. For example, one of the most common measures of risk in the financial sector is value at risk (VAR). VAR estimates how much a firm’s positions could lose over a given period at a given confidence level; the higher the VAR, the greater the risk. Recently, VAR measures have dropped. Unfortunately, the drop occurred because the indicator is typically computed from the previous five years of market data. As observations from the financial crisis rolled out of that window, the value of the indicator fell. Needless to say, the underlying risk has not been affected. The indicator itself is not the risk.
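To see the mechanics, here is a minimal sketch of a one-day historical VAR computed on a five-year rolling window, using simulated daily returns. The volatilities, window length, and 99-percent confidence level are illustrative assumptions, not figures from any actual firm:

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-day historical VAR: the loss exceeded on only
    (1 - confidence) of the observed days, reported as a positive number."""
    return -np.percentile(returns, (1 - confidence) * 100)

rng = np.random.default_rng(0)
# Hypothetical daily returns: one crisis year (high volatility)
# followed by five calm years (low volatility).
crisis = rng.normal(0, 0.03, 252)        # ~252 trading days per year
calm = rng.normal(0, 0.01, 252 * 5)
history = np.concatenate([crisis, calm])

window = 252 * 5  # five-year rolling window
var_with_crisis = historical_var(history[:window])      # crisis still in window
var_without_crisis = historical_var(history[-window:])  # crisis has rolled out

print(f"VAR with crisis in window:  {var_with_crisis:.2%}")
print(f"VAR after crisis rolls out: {var_without_crisis:.2%}")
```

The second figure comes out markedly lower simply because the crisis observations have aged out of the window, which is exactly the effect described above: the number falls while the exposure stays put.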
At the same time, the most notable problems have arisen from issues of uncertainty, as unpredictable surprises swamp businesses. For example, the various banks that failed or suffered during 2008 had wonderfully complex risk models, yet they failed to consider the possibility of a major increase in correlations among individual instruments and asset classes, a cascade that eventually expanded to include regulatory and structural changes to the markets. The surprise was that the models had not anticipated this. Reality is a stubborn thing.
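The damage that rising correlations do to diversification can be shown with a back-of-the-envelope calculation. The sketch below uses a hypothetical portfolio of 20 equally weighted positions, each with 20-percent volatility; all of the numbers are assumptions chosen for illustration:

```python
import numpy as np

def portfolio_vol(weights, vols, corr):
    """Portfolio volatility given asset vols and one common pairwise correlation."""
    n = len(weights)
    c = np.full((n, n), corr)       # correlation matrix with identical
    np.fill_diagonal(c, 1.0)        # pairwise correlation 'corr'
    cov = np.outer(vols, vols) * c  # covariance matrix
    return float(np.sqrt(weights @ cov @ weights))

w = np.full(20, 1 / 20)   # 20 equally weighted positions
v = np.full(20, 0.20)     # each with 20% volatility

for rho in (0.1, 0.5, 0.9):
    print(f"correlation {rho:.1f}: portfolio vol {portfolio_vol(w, v, rho):.1%}")
```

At a correlation of 0.1, the portfolio’s volatility is well under half that of any single position; at 0.9, the diversification benefit has all but vanished and the portfolio behaves like one concentrated bet. That collapse of diversification under stress is roughly what the models failed to anticipate.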
Nearly 20 years ago, a major hedge fund, Long-Term Capital Management (LTCM), was run by finance veterans: a small army of Ph.D.s and no fewer than two Nobel Prize winners. In 1998, it nearly caused a global financial meltdown when the same kind of spike in asset correlations occurred. We learn from history that we do not learn from history.
These uncertainties, by their very definition, can’t be captured by the typical risk model, but that doesn’t mean they should simply be ignored. Blaming failure on a black swan isn’t particularly useful either. Careful consideration of, and preparation for, the kinds of disruption that unknown events can cause is still prudent planning, and it can aid business operations in times of trouble. Plans may be useless, but planning is indispensable.
With the idea that it’s never too late, risk professionals can take steps to better frame their environment, including the formatting, vocabulary, and visual display of information. Such steps might seem basic compared with the advanced mathematics that goes into today’s risk models, but they are opportunities to bridge the gap between managing risk and managing uncertainty. Sometimes less is more.
Phil may save a lot of time by presenting overly precise answers, but he’ll get far more insightful discussion in return if he starts the conversation with a recognition of uncertainty and embraces the fuzziness inherent in any crystal ball.
This article is republished courtesy of INSEAD Knowledge. © INSEAD 2016.