The world has found a new interest in understanding antifragility. With the onset of a pandemic, global survival is at risk. Everyone is pondering how we could have built an antifragile health system, financial backbone and economy.
Antifragile is not a theory; it is a set of heuristics at best. Antifragility is a property of a system.
Everything can be thought of as a system: humans, humanity, organizations such as a village, an apartment association, a government, the stock market and more. Studying a system is not easy. Understanding the components of a system does not help in understanding the system itself. For example, knowing about a neuron does not help in understanding the brain. Similarly, knowing an ant's behavior is not useful in understanding the ant colony.
A system is called antifragile when shocks from outside make it stronger.
Antifragility does not fit into a nice scientific theory. An egg hardens in boiling water while a potato softens. A fire grows in the wind while a candle wick blows out. This is hard to grasp for a logical, linear-thinking mind: the same stressor makes some systems fragile and others antifragile.
To understand antifragility, one must first understand the 'Black Swan'. And to understand the 'Black Swan', you must understand 'Fooled by Randomness'.
And thereafter, to design antifragile systems, you must learn about 'Skin in the Game' as an important design tool.
Fooled by Randomness
Fooled by randomness is when people see patterns where none exist, confusing noise for signal or vice versa. Also known as attribution bias, it is mistaking luck for skill. People who become successful attribute their success to their skills, while they attribute the failures they face to luck. The safer bet is to assume that each is only partly true. The world only celebrates the stories of those who came out on top; we discount the role chance may have played in helping the winner get there. The lottery winner will be invited to give a keynote speech and will champion buying lottery tickets and never giving up. This is survivorship bias.
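Survivorship bias can be made concrete with a tiny simulation: give every "fund manager" zero skill (a fair coin decides each year), and after a few years a handful will still look like consistent winners purely by chance. The counts below are illustrative assumptions, not data from the text.

```python
# A minimal sketch of survivorship bias. Every manager has zero
# skill: a fair coin decides whether each year is a winning year.
import random

random.seed(1)  # fixed seed so the sketch is reproducible

managers, years = 10_000, 10
survivors = 0
for _ in range(managers):
    # A manager "survives" only by winning every single year.
    if all(random.random() < 0.5 for _ in range(years)):
        survivors += 1

# Expected survivors = 10_000 * 0.5**10, roughly 10. Each of them
# will look like a genius; none of them has any skill.
print(survivors)
```

If we only ever interview the survivors, skill looks self-evident; the 9,990 equally skilled (that is, equally unskilled) managers who busted are invisible.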
Decision making under randomness therefore requires a full rethink. One must not look at outcomes to evaluate the quality of decisions.
In the world of randomness, it is not the presence, absence or frequency of random events but their magnitude that is more critical in making decisions.
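A quick arithmetic sketch shows why magnitude beats frequency. The odds and payoffs below are made-up illustrative numbers: a bet that wins 99% of the time can still be a losing bet if the rare loss is large enough.

```python
# Frequency says "take the bet"; magnitude says "run away".
p_win, win = 0.99, 100        # frequent, small gain
p_loss, loss = 0.01, -20_000  # rare, huge loss

expected_value = p_win * win + p_loss * loss
# 0.99 * 100 + 0.01 * (-20_000) = 99 - 200 ≈ -101:
# negative expectation despite winning 99% of the time.
print(expected_value)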
Scientists, engineers and economists are the most prone to misunderstanding randomness. They clutch at the formulas of math without letting reality give context to the equations. The fundamental limitation of science that they forget is that science is right only until it is proven wrong. Folks working in extreme domains recognize this: entrepreneurs, small-business owners, actors, traders, cab drivers and so on. They know it through experience.
Investors and founders are therefore better off writing down their decision process, to learn how much of their success was skill and how much was luck.
The most actionable piece of this knowledge about randomness is the decision to never play Russian roulette. Russian roulette is a game where you put a gun with one bullet and five empty chambers to your head and shoot. If you live, you get $10m. The probability that you survive is a huge 5/6, or 83%, but the consequence of failure is death. You become a statistic.
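The arithmetic makes the trap visible. A single round has a handsome expected value, but expectation averages over a world in which, one time in six, you are dead, and repeated play drives survival toward zero. The prize and odds are from the text; the repeat counts are illustrative assumptions.

```python
# Why a positive expected value does not justify a ruin risk.
survive_once = 5 / 6          # odds from the text
prize = 10_000_000            # $10m prize from the text

# One round looks great in expectation: 5/6 * $10m ≈ $8.33m.
ev_one_round = survive_once * prize

# But keep playing and the survival probability collapses.
for rounds in (1, 6, 20, 50):
    p_alive = survive_once ** rounds
    print(f"{rounds:>2} rounds: P(alive) = {p_alive:.1%}")
```

After 20 rounds the chance of still being alive is under 3%; no per-round expected value compensates for a path that almost surely ends in ruin.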
Black Swan
As humans, we live life looking into the rear-view mirror of experiences and beliefs. We are confident that what we have not experienced or thought of before does not exist.
The Black Swan challenges this fallacy. It says that absence of evidence is not evidence of absence. Black swans are not always bad; one can be either a treasure or a landmine. The rule of thumb for navigating life should be to expose yourself to positive black swans and reduce exposure to negative ones. Black swans exist at different levels: at the level of an individual human and at the level of humanity.
This understanding of the black swan is something that must be taught to everyone who studies science. All those who think science can save us, and who believe in science like a god, do not understand the limits of science. Scientists are people who should be constantly looking for evidence that their theory is wrong. Normal folks let that burden be carried by the scientist and willfully believe that the new theory is universally true, i.e. across time.
A normal brain can't muster the emotional energy to go after things it has doubts about. That is why scientists must have emotional certainty but intellectual uncertainty.
How to Make Something Antifragile
We must design systems that will survive shocks; in fact, they must thrive under stress. Today's systems, such as the economy, banks and markets, are neither resilient nor antifragile. To make them antifragile, the following heuristics must be kept in mind.
Units must be fragile – One counterintuitive premise to start with is that for a system to become antifragile, its units must be fragile. Humanity is antifragile because humans are fragile. For the banking system to be antifragile, individual banks must be fragile and allowed to fail. Yet we do the exact opposite with sick banks.
Trial and error – Allow the system to learn through trial and error as opposed to being driven by theory. Give small shocks that may tear its individual units, just as muscles become strong through strength training. The key is that the stress given should not create ruin events, because after those it is game over.
Option, convex – Expose the system to options, convex ones. An option with more upside than downside is called convex: it provides unlimited upside while taking limited downside exposure. This is like an angel investor betting an affordable portion of wealth on startups. The downside is that all the money may be lost; the upside is that the startup could become the next Google.
Lindy effect – Old wisdom is better than recent insight. If something has survived for long, it is more likely to survive for longer.
Remove, not add – It is better to keep things simple than to make them complex. Before adding anything new, remove something existing. This is also called via negativa; via negativa simply means that for survival you eliminate all the things that would kill you. The greatest contribution in life is made by removing what we know is wrong. In life, antifragility is reached by not being a sucker.
Do no harm – In a complex system, the first instinct should be that of doing no harm.
Redundancy – Do the exact opposite of just-in-time efficiency: pad with redundancy. Always have a buffer. In investing it is called a margin of safety.
Skin in the game – Keep risk transferers away, i.e. folks who do not have skin in the game. Always look for skin in the game. In an interaction with others (family, work, market, life, etc.), when the other party does not have skin in the game, it does not lead to a convex option for you. Never ask anyone for their opinion; ask them what they have in their portfolio. Don't get on a plane whose pilot isn't on board, as that is a system without skin in the game. People who had religion survived; it brought cohesion: people who eat together hang together.
Avoid ruin risk – If you die playing, a billion-dollar prize is not worth it. In a strategy that entails ruin, benefits never offset the risk of ruin. The most rational thing to do, therefore, is to avoid systemic ruin.
These heuristics, when applied to a system, increase the odds of that system being antifragile.
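The convexity heuristic from the list above can be sketched numerically. The portfolio size, loss rate and size of the rare win below are illustrative assumptions for the angel-investor example, not data from the text: each bet risks a small, capped stake, while a rare win pays many multiples of it.

```python
# A minimal sketch of a convex option: bounded downside per bet,
# open-ended upside on the rare hit. All numbers are assumptions.
import random

random.seed(7)  # fixed seed so the sketch is reproducible

def angel_portfolio(n_bets=100, stake=10_000, p_win=0.02, win_multiple=500):
    """Total payoff of a portfolio of small, capped bets."""
    payoff = 0
    for _ in range(n_bets):
        if random.random() < p_win:
            payoff += stake * win_multiple   # rare, outsized upside
        else:
            payoff -= stake                  # frequent, capped loss
    return payoff

print(angel_portfolio())
```

The worst case is known in advance (lose every stake), while the best case is not capped; that asymmetry, not any forecasting skill, is what makes the exposure convex.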