Note: This post was originally published in the May 2018 issue of Value Investing Almanack (VIA). To read more such insights on behavioural finance and other deep thoughts on value investing and business analysis, click here to subscribe to VIA.
When anyone asks me how I can best describe my experience in nearly forty years at sea, I merely say, uneventful. Of course there have been winter gales, and storms and fog and the like. But in all my experience, I have never been in any accident…of any sort worth speaking about. I have seen but one vessel in distress in all my years at sea. I never saw a wreck and never have been wrecked nor was I ever in any predicament that threatened to end in disaster of any sort.
Those were the words of E.J. Smith, captain of RMS Titanic, the ship that sank in the North Atlantic Ocean on 15 April 1912 after colliding with an iceberg during its maiden voyage. Of the roughly 2,200 people on board, more than 1,500 drowned; only about 700 could be saved.
Since it has never happened in the past, it's unlikely to happen in the future. That's perhaps the most dangerous assumption to make when the stakes are high. Smith's mistake wasn't in failing to predict the disaster. No one could have. His blunder was in not preparing for it.
On the Titanic, there weren't enough lifeboats for 2,200 people. To add insult to injury, the lifeboats that were available could actually carry more than 700 people, but in the confusion of getting women and children off first, many were lowered only partially filled. The crew hadn't properly practised the emergency evacuation protocols.
The guys at LTCM (Long-Term Capital Management) repeated Captain Smith's mistake, albeit in a different field: the financial industry. The team at LTCM included several PhDs, experienced traders, and two Nobel laureates – Myron Scholes and Robert Merton.
Long-Term came up with complex mathematical models to predict the short-term movement of Russian bond prices. Their models were built on past data, and in that data the Russians had never defaulted on their bonds. And yet, on 17 August 1998, the Russian government defaulted on its debt and devalued its currency.
LTCM bet on the swing always returning to its neutral position, no matter how far it moved to either extreme. They ignored the possibility that the rope holding the swing could snap precisely when the swing was at one of those extremes.
Like Smith's, LTCM's mistake wasn't in ignoring the possibility of a rare event. It was in not being prepared for it. At one point, LTCM's debt-to-equity ratio was 100 to 1 – leverage so extreme that a mere 1 percent decline in the value of their assets would wipe out their entire equity. They had no lifeboats.
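To see how thin that margin is, here is a minimal sketch in Python – illustrative numbers only, not LTCM's actual balance sheet – showing how 100-to-1 leverage turns a small decline in asset values into a total wipeout:

```python
def remaining_equity(equity, debt_to_equity, asset_decline):
    """Equity left after the fund's assets fall by `asset_decline` (a fraction).

    Losses on the whole asset base fall entirely on the equity holders.
    """
    assets = equity * (1 + debt_to_equity)  # own capital plus borrowed money
    return equity - assets * asset_decline

# Hypothetical fund: $1 of equity levered 100-to-1, as in the article.
for decline in (0.001, 0.005, 0.01):
    print(f"{decline:.1%} asset decline -> equity left: "
          f"{remaining_equity(1.0, 100, decline):.2f}")

# A mere 1% decline in assets (1.0 * 101 * 0.01 = 1.01) exceeds the
# entire $1 of equity: the fund is insolvent.
```

At that level of leverage, survival depends on the rare event never happening – which is precisely the bet a turkey makes.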
Captain Smith and the folks at LTCM were suffering from what Nassim Nicholas Taleb calls the turkey illusion. He writes –
Consider a turkey that is fed every day. Every single feeding will firm up the bird’s belief that it is the general rule of life to be fed every day by friendly members of the human race “looking out for its best interests,” as a politician would say. On the afternoon of the Wednesday before Thanksgiving, something unexpected will happen to the turkey. It will incur a revision of belief.
You could replace the turkey with a chicken or a lamb. The analogy holds for any case where an animal is fed for a long time only to be slaughtered eventually.
Right before the butcher surprises the poor bird, the turkey's confidence that the butcher loves turkeys, and that life is quiet and soothingly predictable, is at its highest. Thanksgiving was an unknown to the turkey. Its past was devoid of Thanksgiving, so it failed to include the possibility in its calculations.
A turkey is a metaphor for anyone who is both surprised and harmed by black swan events, i.e., large-scale, unpredictable, and irregular events of massive consequence. The encounter with an iceberg and the Russian bond default were both black swan events.
Like the turkey, we can never know how much information the past truly contains. So no matter how much data we've crunched, we can never be a hundred percent sure of anything happening or not happening.
Ironically, the turkey illusion probably happens more often to humans than to turkeys. The 2008 economic crisis was largely a result of a widespread turkey illusion among financial institutions.
Gerd Gigerenzer, in his brilliant book Risk Savvy, writes –
There is a similarity between the turkey’s unexpected disaster and the experts’ inability to anticipate financial crises: Both use models that might work in the short run, but cannot foresee the disaster looming ahead. As in the case of the turkey, the risk estimates in the U.S. housing market were based on historical data and on models similar in spirit to the rule of succession. Because the housing prices kept rising, the risk seemed to decline. Confidence in stability was highest before the onset of the subprime crisis. As late as March 2008 Henry Paulson, the U.S. Secretary of the Treasury, declared: “Our financial institutions, banks and investment banks, are strong. Our capital markets are resilient. They’re efficient. They’re flexible.” Shortly thereafter, the entire economy was in turmoil. The risk models influencing Paulson’s belief did not anticipate the scale of the bubble, similar to the turkey not anticipating the concept of Thanksgiving. The only difference was that instead of being slaughtered, the banks were bailed out by taxpayers. By suggesting a false sense of certainty, models of known risk can promote rather than prevent disaster.
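A quick aside on the rule of succession Gigerenzer mentions: it is Laplace's formula, which estimates the probability of an event recurring, after it has been observed n times in a row, as (n+1)/(n+2). A minimal Python sketch (the feeding counts are just illustrative assumptions) shows why the turkey's confidence peaks on the eve of Thanksgiving:

```python
def rule_of_succession(successes, trials):
    """Laplace's rule of succession: estimated probability of another
    success after observing `successes` out of `trials`."""
    return (successes + 1) / (trials + 2)

# The turkey has been fed on every single day of its life so far.
for days_fed in (1, 10, 100, 1000):
    p = rule_of_succession(days_fed, days_fed)
    print(f"After {days_fed:4d} feedings, P(fed tomorrow) = {p:.4f}")

# The estimate climbs toward 1.0 with every uneventful day --
# confidence is highest right before the surprise.
```

Every uneventful day nudges the estimate closer to certainty, yet the formula can never tell the turkey anything about events absent from its past.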
It has been close to ten years since a major crash hit the stock market. As a result, a significant portion of the investor crowd in the stock market is made up of people who opened their demat accounts after 2008. These are people who have only heard stories about the crash but have never really experienced one. I am one of them.
And even those who witnessed the last financial crisis have a fading memory of it now.
Next time you see someone on a financial news channel claiming, “We’re not expecting any surprises,” imagine a turkey bobbing its head.
The biggest risk of big data is precisely this: the illusion of certainty that it will eventually bring, the illusion that crunching petabytes of data with sophisticated computers and AI-powered algorithms can solve every problem. There is no doubt that passing humanity's hard problems through the filters of machine learning, large historical data, and artificial intelligence will reveal groundbreaking insights. It surely will. It's already happening. The danger is that it also brings with it the possibility of getting lulled into complacency.
The more we believe we've tamed the risk, the more risk we tend to take. Distinguishing between actual risk and perceived risk is hard. For example, seatbelts reduce risk but encourage drivers to drive more aggressively, which in turn results in more accidents.
Legendary investor Howard Marks, in his 1998 memo titled Genius Isn’t Enough, wrote –
Inability to remember that you can’t know what the future holds is a common failing and the cause of some of the biggest financial difficulties. It’s one of the greatest contributors to hubris – the over-estimation of what you can know and do.
The confusion between risk and uncertainty has baffled investors for centuries. In a casino, the probabilities of different outcomes are calculable, and hence the risk can be controlled. There are no unknown unknowns in the sterile environment of a casino. But in the real world of business, not only are the probabilities of possible outcomes unknown, the range of possible outcomes is unknown too.
“When I was a kid,” writes Marks, “my dad used to joke about the habitual gambler who finally heard about a race with only one horse in it. He bet the rent money on it, but he lost when the horse jumped over the fence and ran away. There is no sure thing, only better and worse bets, and anyone who invests without expecting something to go wrong is playing the most dangerous game.”
Peter Bernstein, in his remarkable book Against the Gods, summarized it well –
The past seldom obliges by revealing to us when wildness will break out in the future. Wars, depressions, stock-market booms and crashes, and ethnic massacres come and go, but they always seem to arrive as surprises…If these events were unpredictable, how can we expect the elaborate quantitative devices of risk management to predict them? How can we program into the computer concepts that we cannot program into ourselves, that are even beyond our imagination?
We cannot enter data about the future into the computer because such data are inaccessible to us. So we pour in data from the past to fuel the decision-making mechanisms created by our models, be they linear or nonlinear. But therein lies the logician’s trap: past data from real life constitute a sequence of events rather than a set of independent observations, which is what the laws of probability demand. History provides us with only one sample of the economy and the capital markets, not with thousands of separate and randomly distributed numbers. Even though many economic and financial variables fall into distributions that approximate a bell curve, the picture is never perfect. Once again, resemblance to truth is not the same as truth. It is in those outliers and imperfections that the wildness lurks.
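Bernstein's point – that the wildness lurks in the outliers a bell curve misses – can be illustrated with a small simulation. This is a hedged sketch with assumed parameters (a Student-t distribution standing in for a fat-tailed world, not a calibrated market model):

```python
import math
import numpy as np

rng = np.random.default_rng(42)

# Pretend the "true" world has fat tails: Student-t with 3 degrees of
# freedom (an assumption for illustration only).
history = rng.standard_t(df=3, size=100_000)

# An analyst fits a bell curve to that same history.
mu, sigma = history.mean(), history.std()

# Probability of a 5-sigma down move under the fitted bell curve...
model_prob = 0.5 * math.erfc(5 / math.sqrt(2))
# ...versus how often such moves actually appear in the fat-tailed sample.
actual_freq = (history < mu - 5 * sigma).mean()

print(f"Bell-curve model probability: {model_prob:.1e}")   # ~2.9e-07
print(f"Observed frequency in sample: {actual_freq:.1e}")  # far larger
```

The fitted bell curve says a 5-sigma down move is a roughly one-in-millions event, yet the fat-tailed sample serves them up orders of magnitude more often – exactly the gap between model and reality that Bernstein describes.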
The obvious lessons one can derive from this discussion are that the future is uncertain and debt is dangerous. Those are useful lessons, but the most important idea behind the turkey illusion is what Warren Buffett packed into the fewest words –
Predicting rain doesn’t count; building arks does.