
Latticework Of Mental Models: Lucretius Problem - Anshul Khare

December 22, 2016 | Anshul Khare

It was Friday, March 11, 2011, when a massive earthquake of magnitude 9.0 struck off the coast of Japan at 2:46 pm local time. The epicenter of the quake was 70 kilometers east of the Oshika Peninsula of Tōhoku.

The earthquake triggered powerful tsunami waves that reached heights of up to 40 meters. It took 50 minutes for the largest wave of the tsunami to arrive at the shores of Fukushima. What followed was something totally unimaginable and unexpected for those who take pride in taming Mother Nature.

The Fukushima Daiichi nuclear power plant had six separate boiling water reactors, protected by a 10-meter-high seawall to prevent sea waves from entering the plant.

When the tsunami struck the Fukushima coastline, the gigantic waves easily overtopped the plant’s seawall. Within seconds, the basements of the turbine buildings were flooded and the emergency diesel generators disabled. Soon the backup generator building was flooded too. The result was explosions, leakage of radioactive material into the seawater, and a huge nuclear hazard.

Why would the engineers and designers of the Fukushima nuclear power plant build a wall only 10 meters high? What made them believe that the waves couldn’t breach the 10-meter mark?

The reason was that the engineers had never seen sea waves as high as 10 meters in that area in all of recorded history. History betrayed them, and that’s the premise for today’s discussion.

Clearly, the Fukushima disaster could have been averted if the designers hadn’t been blindsided by historical records.

While designing critical systems, it doesn’t suffice to build redundancy based on the historical worst-case scenario. You actually have to overcompensate. Nassim Taleb has named this inconsistency, i.e., over-reliance on historical worst-case scenarios, the Lucretius Problem.

Titus Lucretius Carus was a Roman poet and philosopher. Nassim Taleb, the author of Antifragile, writes in his book –

…risk management professionals look in the past for information on the so-called worst-case scenario and use it to estimate future risks – this method is called “stress testing.” They take the worst historical recession, the worst war, the worst historical move in interest rates, or the worst point in unemployment as an exact estimate for the worst future outcome. But they never notice the following inconsistency: this so-called worst-case event, when it happened, exceeded the worst case at the time.

I have called this mental defect the Lucretius problem, after the Latin poetic philosopher who wrote that the fool believes that the tallest mountain in the world will be equal to the tallest one he has observed. We consider the biggest object of any kind that we have seen in our lives or hear about as the largest item that can possibly exist. And we have been doing this for millennia. In Pharaonic Egypt, which happens to be the first complete top-down nation-state managed by bureaucrats, scribes tracked the high-water mark of the Nile and used it as an estimate for a future worst case scenario.
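To see this inconsistency in miniature, here is a small simulation sketch in Python (the Pareto distribution and every parameter are illustrative assumptions, not anything taken from Taleb): draw yearly "losses" from a heavy-tailed distribution and watch the recorded worst case keep getting broken.

```python
import random

random.seed(42)

# Illustrative assumption: yearly losses follow a heavy-tailed Pareto
# distribution with shape alpha = 1.5 (minimum value 1).
def yearly_loss(alpha=1.5):
    return random.paretovariate(alpha)

worst_so_far = 0.0
for year in range(1, 101):  # 100 simulated years of observations
    loss = yearly_loss()
    if loss > worst_so_far:
        # Each record, when it happened, exceeded the worst case
        # known at the time. That is the Lucretius problem.
        print(f"Year {year:3d}: new worst case {loss:8.2f} "
              f"(previous worst was {worst_so_far:8.2f})")
        worst_so_far = loss
```

Every record in the printout, by definition, exceeded the worst case known the day before it happened, which is exactly the flaw in naive stress testing.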

In December 2015, when excessive rains hit the city of Chennai, the situation was made worse by inadequate water drainage infrastructure, which caused flooding across the city. The officials and the government had designed the sewage system taking the worst historical case into consideration. This event forced them to rethink their definition of the worst case.

Shane Parrish writes in his blog –

…our documented history can blind us. All we know is what we have been able to record.

We think because we have sophisticated data collecting techniques that we can capture all the data necessary to make decisions. We think we can use our current statistical techniques to draw historical trends using historical data without acknowledging the fact that past data recorders had fewer tools to capture the dark figure of unreported data. We also overestimate the validity of what has been recorded before and thus the trends we draw might tell a different story if we had the dark figure of unreported data.

We take the wrong interpretation of history. We know that history repeats itself but we forget that repeating doesn’t necessarily mean repetition of the same patterns. Repetition can also mean that the surprise will be repeated.

Disraeli observed, “What we learn from history is that we don’t learn from history.”

Overcoming The Lucretius Problem

The answer to this problem is found in nature.

Our body is more imaginative about the future than we are. It sizes up probabilities in a very sophisticated manner and assesses risks much better than our intellect does.

When you exercise or lift heavy weights, you exert stress on your body. As a result, muscle fibres break down. But how does the body respond to this strain? The body’s response isn’t just to rebuild the lost fibres. It overcompensates for the trauma and comes out stronger than before.

In a discussion about the Fukushima nuclear disaster, Nassim Taleb writes –

Not seeing a tsunami or an economic event coming is excusable; building something fragile to them is not…It had been built to withstand the worst past historical earthquake, with the builders not imagining much worse – and not thinking that the worst past event had to be a surprise, as it had no precedent.

…In the wake of Fukushima disaster, instead of predicting failure and the probabilities of disaster, these intelligent nuclear firms are now aware that they should instead focus on exposure to failure – making the prediction or non-prediction of failure quite irrelevant. This approach leads to building small enough reactors and embedding them deep enough in the ground with enough layers of protection around them that a failure would not affect us much should it happen – costly, but still better than nothing.

The idea is to build in layers of redundancy and overcapacity, i.e., to create a buffer for ourselves.

The simplest example of a redundant feature is a personal emergency fund. It acts as an insurance policy against something catastrophic, such as a job loss, and allows you to survive and fight another day.
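As a back-of-the-envelope sketch of that overcompensation (all numbers below are hypothetical illustrations, not recommendations), you could size such a fund not on an average month but on the worst recorded month, and then build well beyond it:

```python
# Hypothetical monthly expenses (in rupees) over the recorded past
monthly_expenses = [45_000, 52_000, 48_000, 61_000, 50_000]

historical_worst = max(monthly_expenses)  # the "Lucretius" estimate
months_of_cover = 6                       # a common rule of thumb
overcompensation = 1.5                    # buffer beyond the recorded worst

emergency_fund = historical_worst * months_of_cover * overcompensation
print(f"Worst recorded month: Rs. {historical_worst:,}")
print(f"Target emergency fund: Rs. {emergency_fund:,.0f}")
```

The exact multiplier matters less than the direction: the buffer is anchored beyond, not at, the historical worst case.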

Warren Buffett famously said, “Predicting the rain doesn’t count, building an ark does.”

That’s what Taleb means by non-predictive decision making. You don’t have to be accurate in predicting whether the tsunami waves will be 10 meters or 15 meters high.

In Investing

The biggest intraday fall the BSE Sensex had ever experienced was 826 points, on May 18, 2006. That record held only until January 21, 2008, when the Sensex fell by 1,408 points. Very few people were prepared for a drop of that size, believing that the worst couldn’t go beyond 800-odd points in a day.

The Lucretius problem tells us that our portfolio and long-term investing performance should be immune to short-term market gyrations. It shouldn’t matter to you whether the market waves are 800 points or 1,400 points high.

So how do you deal with the Lucretius problem in investing?

Two things are commonly advised to protect yourself from this problem while investing.

First, protect your downside. That means staying away from debt and investing in stocks only that portion of your net worth which you are willing to lose (for the sake of adequate long-term returns).

And second, incorporate a sufficient margin of safety in your stock purchase decisions. Warren Buffett describes the margin of safety concept using this example –

When you build a bridge, you insist it can carry 30,000 pounds, but you only drive 10,000 pound trucks across it. And that same principle works in investing.

When engineers build a bridge, they design it not only to be strong enough to bear a much higher load but also strong enough to withstand much stronger winds.
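In the same spirit, the margin of safety can be written down as a simple rule: pay only a fraction of your estimate of intrinsic value. Here is a minimal sketch (the function, the numbers, and the 50% default are illustrative assumptions, not a formula from Buffett):

```python
def max_buy_price(intrinsic_value_estimate, margin_of_safety=0.5):
    """Highest price to pay while preserving the given margin of safety.

    A margin of 0.5 means paying at most half of estimated value; the
    bridge analogy (10,000-pound trucks on a 30,000-pound bridge) would
    correspond to a margin of about two-thirds.
    """
    return intrinsic_value_estimate * (1 - margin_of_safety)

# Hypothetical example: a stock you estimate to be worth Rs. 300 per share
print(f"Buy only below Rs. {max_buy_price(300):.0f}")
```

The margin absorbs both errors in your own estimate and waves larger than any you have seen.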

When it comes to investing your money in the stock market, the famous Murphy’s law shouldn’t be forgotten –

It is found that anything that can go wrong generally does go wrong at the worst possible time.

This is the reason that almost all accomplished value investors including Warren Buffett, Howard Marks, and Seth Klarman give so much importance to risk control and are always looking out for things that can go wrong.

Losing some money is an inevitable part of investing. In fact, there is nothing you can do to prevent it. But to be a sensible and intelligent investor, you must take responsibility for ensuring that you never lose most or all of your money.

Conclusion

Looking at historical patterns can give us important clues, but it doesn’t provide a complete picture of what’s possible in the future.

The past is a poor indicator of the future, and over-reliance on historical records and data can make us vulnerable to unexpected shocks. So when the next market tsunami hits, don’t be caught off guard like Fukushima was.

The Lucretius problem is a reminder for us to stay cognizant of what can throw us out of the game.

Chetan Parikh, an accomplished value investor and practitioner of Charlie Munger’s Latticework theory, recently delivered a talk titled Exploring The Latticework. I was fortunate to attend his lecture. Here’s an excerpt –

Charlie Munger in the 2016 AGM of Daily Journal Corporation said: “Synthesis is reality, because we live in a world of multiple models, and of course we’ve got to have synthesis to understand the situation.”

What does synthesis mean? In the context of this presentation, it would mean taking many relevant disciplinary perspectives and then transcending them. The result is, if one were to use the terminology of systems theory, an “emergent” perspective.

Let me use two more metaphors that I came across – the fruitbowl and the smoothie. The reason for the explicit mention of metaphors is that using metaphors is one of the main tools of lateral thinking, present in all discourse. Metaphors shape the way we think, interpret, and behave.

The bowl of fruit is basically a picture of multidisciplinarity. If one takes each fruit as representing a discipline, then the bowl represents many disciplines in close proximity to one another. Taking courses in two or more disciplines gives multidisciplinary knowledge.

The smoothie represents the blending and the integration of many disciplines. The distinctive flavor of each fruit is no longer identifiable, but what one tastes is an emergent flavor. The fruitbowl gives way to the smoothie. This is interdisciplinarity.

Multidisciplinary mental models help in the deconstruction (and I use the word “deconstruction” here from an engineering and not a philosophical perspective) of a complex problem. But knowledge does not mean wisdom.

Wisdom helps in mapping relationships accurately, in making the right connections. Wisdom is partly about understanding the implications of the connections as much as it is about the process of making connections.

He used the metaphors of fruitbowl and smoothie which, for me, added a totally new perspective to the importance of multidisciplinary thinking.

Time and again (call it confirmation bias), I have realized that one common trait among successful people is their ability to look at a problem and address it from multiple perspectives. It’s the most effective way to be as objective as possible in your thinking and decision-making process.

Take care and keep learning.

http://www.safalniveshak.com/latticework-mental-models-lucretius-problem/
