
Latticework of Mental Models: The Rashomon Effect

Tan KW
Publish date: Fri, 14 Jun 2019, 10:18 AM

The parable of the six blind men and an elephant goes like this —

When a group of blind men, who have never come across an elephant before, encounter the tusker for the first time, they try to conceptualize the animal by touching it. Each blind man feels a different part of the elephant’s body, but only one part, such as the tail or the trunk. Then they discuss their understanding of the elephant.

The man who touched the elephant’s side says, “It’s very much like a wall.”

The one who held the elephant’s tusk declares, “No! It’s like a smooth spear.”

“Not really. It’s like a python,” claims the man who grabbed the trunk.

“You’re all mistaken,” shouts the man who got the elephant’s tail. “It’s like a thick rope.”

“I know we’re all blind, but have you guys lost your minds as well?” says the fifth man, who touched the animal’s ears. “It’s like a big fan.”

“Come on, folks! What’s wrong with all of you?” argues the sixth man, who was leaning against the elephant’s knee. “It’s definitely like a tree.”

Image Source: John Godfrey Saxe, “The Blind Men and the Elephant”

Who was right? In a way, everyone was right about what they perceived. At the same time, everyone was wrong about the elephant, because their limited experience allowed each of them to figure out only a small chunk of reality in isolation.

The elephant in the fable is an apt metaphor for the complex problems we encounter in the real world. And who are those six blind men? They are us — handicapped by our tendency to claim absolute truth based on our limited, subjective experience.

The fable is a reminder of how our overconfidence divorces us from others’ viewpoints and makes us unwilling to accept that those who disagree with us are under the spell of the same bias — looking at the problem from a single dimension.

Annie Duke, in her book Thinking in Bets, writes —

We’ve all experienced situations where we get two accounts of the same event, but the versions are dramatically different because they’re informed by different facts and perspectives. This is known as the Rashomon Effect, named for the 1950 cinematic classic Rashomon, directed by Akira Kurosawa. The central element of the otherwise simple plot was how incompleteness is a tool for bias. In the film, four people give separate, drastically different accounts of a scene they all observed, the seduction (or rape) of a woman by a bandit, the bandit’s duel with her husband (if there was a duel), and the husband’s death (from losing the duel, murder, or suicide).

Akira Kurosawa deliberately used elements of perception and subjectivity to present conflicting versions of the same event through different characters in the storyline. This contradictory interpretation of the same event boggles the minds of the viewers because they are constantly trying to guess who is right and what actually happened.

The movie was a great commercial success, and Kurosawa’s insight — the relativity of truth and the unreliability and inevitable subjectivity of human memory — was recognized in the world outside cinema as well. Lawyers and judges commonly speak of the Rashomon Effect when first-hand witnesses give contradictory testimony.

The Bollywood film Talvar — based on the 2008 Noida double murder case — also used the Rashomon effect. It depicts the investigation of the case from three different perspectives — the police investigation, the first CBI probe, and a later investigation by a different CBI team — in which the victims’ parents appear either guilty or innocent of the murder charges.

So why does this happen? Why do different people have such dramatically different accounts of the same event? Maybe they’re lying. That’s plausible, but it’s an easy explanation and pretty much useless for solving the problem. However, there’s another possibility.

Humans interpret any incident based on their own perceptions. Like those six blind men.

So, even when the incident is an independent event, what’s observed is modified by the observer’s mindset, experiences, and expectations. And when it comes to recollecting the event, another layer of distortion is added by memory. This is why it becomes maddeningly hard to verify the truth based on narratives given by different people.

Morgan Housel, in his essay The Psychology of Money, writes —

Your personal experiences make up maybe 0.00000001% of what’s happened in the world but maybe 80% of how you think the world works. If you were born in 1970 the stock market went up 10-fold adjusted for inflation in your teens and 20s – your young impressionable years when you were learning baseline knowledge about how investing and the economy work. If you were born in 1950, the same market went exactly nowhere in your teens and 20s.

When everyone has experienced a fraction of what’s out there but uses those experiences to explain everything they expect to happen, a lot of people eventually become disappointed, confused, or dumbfounded at others’ decisions. Keep that quote in mind when debating people’s investing views. Or when you’re confused about their desire to hoard or blow money, their fear or greed in certain situations, or whenever else you can’t understand why people do what they do with money. Things will make more sense.

Brushing aside disagreements by assuming that others are misinformed or stupid doesn’t help the situation. When you become curious about why others believe what they believe, you open up the possibility of unearthing important information that might help you update your worldview and make better decisions.

“Everyone’s watching a different movie,” writes Housel. “Personal financial success is all relative, measured against the amount of effort you put into it and the expectations you set for yourself. Both are different for everyone. What seems trivial to you might be the most important thing in the world to me, especially if we’re at different stages in life – low interest rates are great for young borrowers, but disastrous for retirees needing fixed income. We’re all coming from a different place with different perspectives, which explains why so many equally smart people in finance and economics disagree with each other. When you find something crazy in finance and ask yourself ‘Why is this happening?’, the answer is usually ‘because someone with a different perspective thinks it should.’”

Duke writes —

Even without conflicting versions, the Rashomon Effect reminds us that we can’t assume one version of a story is accurate or complete. We can’t count on someone else to provide the other side of the story, or any individual’s version to provide a full and objective accounting of all the relevant information. When presenting a decision for discussion, we should be mindful of details we might be omitting and be extra-safe by adding anything that could possibly be relevant. On the evaluation side, we must query each other to extract those details when necessary.

The lesson here is that we should never be overconfident about one version of the truth, especially the one we believe in. Being adamant about our version of the truth makes it hard for us to share information that could give others a chance to find flaws in our decision-making. And that would eventually lead to us fooling ourselves. That’s why Richard Feynman observed that you are the easiest person for you to fool.

Commenting on scientific truth-seeking, Feynman said —

A kind of utter honesty — a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid — not only what you think is right about it: other causes that could possibly explain your results…

We have to realize that the elephant of reality hides within it a huge amount of information. Our cognitive abilities are limited and can never absorb all the details available at any given moment.

This reminds me of an intriguing quote from the famous mythologist Devdutt Pattanaik, who writes —

Within infinite myths lies the eternal truth
Who sees it all?
Varuna has but a thousand eyes,
Indra has a hundred,
You and I, only two.

This means we can never be sure of what we see as reality. However, being unsure doesn’t mean being indecisive. It means treating this lack of certainty as motivation to keep updating our hypotheses and acknowledging our fallibility, so that the downside is protected if a hypothesis turns out to be false.

Seeing reality as it is may not necessarily make your life more comfortable. Most probably, it won’t. But the idea of comfort itself is an illusion.

Duke writes —

In the movie, the matrix was built to be a more comfortable version of the world. Our brains, likewise, have evolved to make our version of the world more comfortable…Giving that up is not the easiest choice. By choosing to exit the matrix, we are asserting that striving for a more objective representation of the world, even if it is uncomfortable at times, will make us happier and more successful in the long run.

Conclusion

The world’s smartest problem-solvers and decision-makers rely on a set of frameworks and mental models that help them make decisions and separate good ideas from bad ones.

These mental models help you perceive reality in a manner closer to the truth. Once you learn them, it becomes easier to change your own actions and avoid common traps.

A latticework of mental models helps you interact with the world and get better results. Having these mental models in your head is like having a bag of Lego blocks that you can use to build your own decision-making framework and discover new insights into how the world really works.

 

https://www.safalniveshak.com/latticework-mental-models-rashomon-effec/
