What I've been reading (vol. 2)
The second volume of what I've been reading is all about, as I mentioned last time, the brilliant philosophy of Nassim Nicholas Taleb and a large part of his Incerto, a four-volume "philosophical and practical essay on uncertainty". It contains Fooled by Randomness, The Black Swan, Antifragile, and The Bed of Procrustes, with a fifth book, Skin in the Game, on the way. I read the first three, each of them brilliant, so I'll definitely find some time in the future to complete the Incerto. What is interesting about the three books I've read so far is the recurrence of certain topics in different settings. Already in his first book, Fooled by Randomness, Taleb introduces us to uncertainty, the role of luck, the Black Swan problem, his skin-in-the-game philosophy (having skin in the game means being able to put your money where your mouth is), and most importantly his underlying life philosophy of the empirical skeptic. Empirical skepticism and outright Popperianism take center stage in all of his books. Perhaps that's why I liked them so much. That's the thing with Taleb: you either love him or you hate him. I'm guessing he does that deliberately.
Taleb, Nassim Nicholas (2004) Fooled by Randomness. The Hidden Role of Chance in Life and in the Markets. Penguin Books.
How do we define success? Is it down to skills, ability, and strategy, or is it down to something much more unpredictable – pure luck!?
The book is about how we tend to misinterpret luck and randomness as something tangible, such as skill and knowledge. Taleb uses autobiographical elements from his career as a Wall Street trader to unveil the hoax that successful and rich traders are successful due to their inherent ability, competence, intelligence and/or skills. This kind of simplistic causal thinking is what it means to be fooled by randomness. People genuinely lack an understanding of probability and uncertainty, which makes them believe that events are non-random, whereas in most cases they are triggered purely by chance.
The biggest mistake we make in failing to apply probabilistic reasoning is not taking into consideration the whole set of alternative outcomes. What would have happened if an event had unfolded differently than it actually did? Many fallacies arise from this. For example, books on the successes of millionaires usually focus on a particular string of outcomes, as if there were a particular embedded reason why someone succeeded in life. So they inform their readers of the optimal formula, the necessary ingredients, to become successful: work long hours, read and educate yourself, get to know the 'right' people, defer your consumption decisions, don't live an extravagant lifestyle, apply a particular way of thinking about problems, etc. The basic problem with this kind of thinking is that it falls under selection bias (pick a specific, non-representative sample and draw general conclusions about the population), or as Taleb calls it, survivorship bias. When observing how the rich got rich, we tend to look at a very limited sample. More precisely, we only look at the winners. There is no mention of those who did exactly the same, who applied the exact same strategies (a lot of people work hard, come home late, know the 'right' people) but failed. There is no way to test which of these characteristics actually hold true – whether there is a "success gene" that makes the successful, in any field, successful. To do that we would need to assemble a huge sample of winners and losers and see what distinguished one from the other. Taleb has no doubt – it was luck!
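To make survivorship bias concrete, here's a minimal simulation in the spirit of Taleb's coin-flipping thought experiment (the exact numbers are mine, chosen purely for illustration): start with a large cohort of traders whose yearly performance is a pure coin flip, and count how many end up with flawless track records anyway.

```python
import random

random.seed(42)

# Start with 10,000 "traders" whose yearly performance is a pure coin flip.
traders = 10_000
years = 10
survivors = traders

for year in range(years):
    # Each trader "beats the market" with probability 0.5; losers drop out.
    survivors = sum(1 for _ in range(survivors) if random.random() < 0.5)

print(f"Traders with a perfect {years}-year record: {survivors}")
# Expected: 10_000 * 0.5**10, i.e. about 10 flawless records from luck alone.
```

Roughly ten of the ten thousand will look like certified geniuses, and a book could be written about each of them; the 9,990 coin-flippers who busted out are never interviewed.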
“Mild success can be explainable by skills and labor. Wild success is attributable to variance.”
One of the strongest points of the book is how randomness fools us into thinking that certain outcomes we tend to observe are down to skill. In the prologue there is a Table of Confusion where Taleb precisely identifies the central distinctions and confusions later analyzed in the book. These include confusing luck with skill, probability with certainty, belief with knowledge, theory with reality, anecdote (coincidence) with causality, forecast with prophecy, a lucky idiot with a skilled investor, survivorship bias with market outperformance, volatility with return, noise with signal (!), etc.
The problem with randomness (variance) is that no one attributes their successes to it, only their failures. Intelligence, know-how, and skill are credited for success; bad luck is blamed for failure. However, this depends on the type of profession. In many professions luck plays a minor role, and skill and knowledge dominate. Surgeons, dentists, car mechanics, plumbers, etc. all have a non-random element determining their performance – the level of their own practice, i.e. skill. This of course doesn't mean that business CEOs, university professors, or scientists have their fates purely in the hands of chance. On the contrary, each person's skills matter. But the extent to which they matter depends primarily on the volatility of the person's job or environment – in other words, on how exposed they are to randomness. The lower the exposure, the greater the role of skill, and the smaller the role of luck.
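A quick way to see this (my own sketch, not from the book): model each person's outcome as skill plus random noise, and watch how weakly outcomes track skill as the noise – the volatility of the environment – grows.

```python
import random
import statistics

def corr(xs, ys):
    # Pearson correlation, computed by hand to keep the sketch dependency-free.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
n = 10_000
skill = [random.gauss(0, 1) for _ in range(n)]

for noise_scale in (0.1, 1, 10):  # low to high exposure to randomness
    outcome = [s + random.gauss(0, noise_scale) for s in skill]
    print(f"noise={noise_scale:>4}: corr(skill, outcome) = {corr(skill, outcome):.2f}")
```

In the low-noise environment (the dentist) outcomes are almost perfectly explained by skill; in the high-noise one (the trader) the correlation nearly vanishes, even though skill is identical in both.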
The book is therefore a defense of science, and an attack on scientists straying from their course (lack of self-criticism, inability to deal with randomness). It battles against heuristics and warns of the dangers of simplification. It discusses the visible and invisible histories of rare events, probability biases, and what it takes to become an empirical skeptic. The point Taleb is making is that he too can be fooled by randomness – only he has the audacity to accept this fact. There are tricks to deal with this and to avoid being fooled by randomness too often. One is to constantly revisit your own opinions, i.e. update your prior beliefs (Bayesian updating). Avoid being "married to your position" (trading lingo). In light of new evidence, the least you can do is update your beliefs.
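Here is what Bayesian updating looks like in practice – a toy calculation with made-up numbers (the prior and the conditional probabilities below are hypothetical, just to show the mechanics): how confident should you be that a trader is genuinely skilled, given a run of winning years?

```python
# Bayes' rule: P(skilled | win) = P(win | skilled) * P(skilled) / P(win)
# All numbers are hypothetical, chosen for illustration only.

p_skilled = 0.10          # prior: 1 in 10 traders is genuinely skilled
p_win_if_skilled = 0.70   # a skilled trader beats the market 70% of years
p_win_if_lucky = 0.50     # an unskilled trader is a coin flip

for year in range(1, 6):  # update after each consecutive winning year
    p_win = p_win_if_skilled * p_skilled + p_win_if_lucky * (1 - p_skilled)
    p_skilled = p_win_if_skilled * p_skilled / p_win
    print(f"after {year} winning year(s): P(skilled) = {p_skilled:.2f}")
```

Even after five straight winning years the posterior only climbs to about 0.37 – a run of wins is far weaker evidence of skill than it feels.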
Taleb, Nassim Nicholas (2007) The Black Swan. The Impact of the Highly Improbable. Penguin Books.
In what is very much a sequel to Fooled by Randomness, Taleb further exposes and details his philosophy of the empirical skeptic. The emphasis is once again on the faultiness of human perception, but this time in relation to the interpretation and perception of really rare events, the so-called Black Swans.
A Black Swan is an event with three distinct characteristics: it is completely unpredictable; it carries long-lasting consequences that shape the world as it is (it has a high impact); and it is retrospectively explainable (after it happens people rationalize it and fool themselves into thinking it was, in fact, predictable – a typical example of hindsight bias). Taleb teaches us not to worry too much about predictions and how to be good at them, but to embrace uncertainty instead. The book is an excellent philosophical and psychological treatise on human rationality, the limits and overestimation of our knowledge, our primal heuristics driven by Platonicity, and our chronic lack of abstract thinking.
The Black Swan logic teaches us that the things we don't know are far more relevant than the things we do know. Black Swans are unexpected; if they weren't, we would have prevented them. Since we are generally blind to randomness and large deviations, we fail to recognize the importance of Black Swan events in shaping the way the world around us looks (just ask yourself how many things in your life came as planned, and how many were the result of abstract shocks – purely random events that steered you in one direction or another – and don't fool yourself into thinking you planned to have everything exactly as it is; if you belong to that category, you suffer from some serious hindsight bias).
In fact, no modern-day breakthrough (be it technological, scientific, economic, or political) came as the result of careful design and planning. It was all trial and error, where some realities turned out to be more (or less) likely than others and manifested themselves in the social order we have today (think of any great historical event – the world after WWII, for example – or any technological innovation that has affected how we live today). Markets also work thanks to trial and error. So does democracy, in my opinion (this is further developed in Taleb's next book, Antifragile; my own take was to attribute the success of democracies over autocracies to their trial-and-error process, or what Taleb calls their antifragility).
Whenever we observe a Black Swan, apart from fooling ourselves into thinking it was rather predictable, we also tend to learn from the specifics of the event, not from the general existence of such events. For example, after 9/11 people learned about the threat of terrorist attacks carried out with airplanes, which is why the reaction was to tighten airport security, in addition to creating a Big Brother society to make us all feel safe. It was a typical example of learning from a very specific case. Regulators react to financial crises in the same way. They react ex post, without realizing that the next financial crisis, when it happens (which it surely will), won't necessarily hit the system through the same mechanisms. Regulating the mortgage market and its financial derivatives (MBSs, CDOs) after the 2008/09 crisis is an expected response, but just as that crisis came from the supposedly super-safe mortgage market that "no one saw coming", the next big crisis is very likely to come from something completely different. The reason is that we tend to learn from history as it unfolded, without considering the alternative scenarios. Combined with hindsight bias – we should have seen it coming, it was so obvious (!) – we draw the wrong implications from historical events, and from that perspective we derive wrong conclusions and wrong solutions.
The bottom line is that these types of reactions do not strengthen the system – on the contrary, the more complex the system (of regulations, rules, laws, etc.), the more likely it is to combust. It remains vulnerable to Black Swans, since we never change our way of thinking about them, and we never change our way of thinking about uncertainty. We don't learn "rules" and the general idea of risk and uncertainty; we only learn the facts. We get preoccupied with anecdotes and with interesting, specific cases and stories. We run away from the abstract, since anything that cannot be explained is scary. This is undoubtedly an artifact of our primitive behavioral traits, our evolutionary fear of the unknown and the uncertain. We don't think as much as we should. We simply react. This too is deeply embedded in our gene pool.
In addition to all of that, the book features a brilliant part on our failures of prediction, and on the great scam that is the bell curve.
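To get a feel for why Taleb calls the bell curve a scam, here's a back-of-the-envelope calculation (mine, not the book's): under a Gaussian, compute how often a k-sigma daily move "should" happen, and compare that with the fact that markets have delivered double-digit-sigma days within living memory.

```python
import math

def gauss_tail(k):
    # P(X > k standard deviations) under the bell curve
    return 0.5 * math.erfc(k / math.sqrt(2))

for k in (3, 5, 10, 20):
    p = gauss_tail(k)
    # One observation per trading day, roughly 250 trading days a year:
    wait_years = 1 / (p * 250)
    print(f"{k:>2}-sigma event: p = {p:.2e}, expect one every {wait_years:.3g} years")
```

Under the bell curve a 10-sigma day should occur about once every 10^20 years of trading, yet the 1987 crash alone was, by the usual estimates, a move of roughly twenty standard deviations. Either markets are astronomically unlucky, or the bell curve is the wrong model for their tails.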
Taleb, Nassim Nicholas (2012) Antifragile. Things That Gain from Disorder. Penguin Books.
Just like The Black Swan was a sequel to Fooled by Randomness, Antifragile is a sequel to The Black Swan. Many ideas touched upon in the first two books are further developed in this one. Essentially, in Antifragile Taleb offers solutions to the problems raised in The Black Swan. It is, in his own words (and I agree), his best work!
The underlying reasoning is a split of the world and everything in it into three distinct categories: fragile, robust, and antifragile. Fragile "things" (which includes people as well as businesses, countries, and institutions) avoid disruption out of fear of change. Doing so offers a false sense of security and makes the system even more vulnerable to a shock – the longer it's postponed, the worse its consequences. Robust means one can bear the burden of shocks without having to change one's behavior. But the best of the three is to be antifragile. This means that shocks, adversities, and disruptions make you stronger (what doesn't kill you...); you adapt to them, change, improve. In other words, this is typical trial-and-error behavior: you make mistakes and get hit by disruptions, but you gain from them and become more resistant to future shocks. You become better by falling down and learning how to pick yourself up again. Only when we're antifragile can we actually avoid being surprised and crushed by the Black Swan. That's what the subtitle, "Things That Gain from Disorder", stands for. Uncertainty here is desirable. We should be well aware of it, embrace it and, more importantly, design our institutions to support uncertainty, to become antifragile to it. In fact, we should all strive to be antifragile.
The best way to illustrate this is with the package example used at the beginning of the book. If you pack glasses in a box, even very carefully (pseudostability), they will still be sensitive to being dropped and thus broken, since they are, in their essence, fragile. On the other hand, if you pack a steel cube in the same box you can toss it around all you want – the cube won't break. It is robust to disturbances. An even better alternative, however, is to pack something antifragile: an artifact that becomes stronger with every drop and every disturbance. In real life, Taleb recognizes evolution as the prime example of antifragility – whenever there was a stressful event, evolution made us stronger by adapting. A similar example is entrepreneurial ventures, which thrive on trial and error. Something defined as antifragile, the complete opposite of fragile, is something that benefits from shocks, from volatility, randomness and disorder, from risk and uncertainty. When faced with a shock, a fragile thing will break, but an antifragile thing gets better.
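One way to make the triad concrete (my own sketch, using the convexity framing Taleb develops more formally later in the book): let a fragile thing respond concavely to shocks (any deviation hurts), a robust one ignore them, and an antifragile one respond convexly (any deviation helps). Feed all three the same volatility and compare average outcomes.

```python
import random

random.seed(3)

def average_outcome(payoff, sigma, trials=100_000):
    # Mean outcome when random shocks of typical size sigma hit the payoff.
    return sum(payoff(random.gauss(0, sigma)) for _ in range(trials)) / trials

def fragile(x):     return -x * x   # concave: every shock, up or down, hurts
def robust(x):      return 0.0      # shocks pass through without effect
def antifragile(x): return x * x    # convex: every shock helps

for sigma in (0.5, 1.0, 2.0):
    print(f"sigma={sigma}: "
          f"fragile={average_outcome(fragile, sigma):+.2f}  "
          f"robust={average_outcome(robust, sigma):+.2f}  "
          f"antifragile={average_outcome(antifragile, sigma):+.2f}")
```

More volatility makes the fragile payoff worse and the antifragile payoff better, while the robust one doesn't care – which is the whole point of "gaining from disorder".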
What does a system that benefits from errors look like? Consider the airline industry: each plane crash makes the system safer, as we learn what took one plane down and make sure this error is never repeated. This is possible because the errors are uncorrelated – the fall of one plane does not cause all the others to go down as well. The economy is different: a major shock in one industry transfers very quickly to others (e.g. housing, finance, etc.), because in fragile systems errors compound and spread. So while every plane crash makes the next plane less likely to fall, every bank crash makes the next one more likely.
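A toy simulation of that contrast (entirely my own illustration; the failure probabilities are made up): in one system failures are independent, in the other each failure raises the odds of the next, and we count how often more than half the system collapses.

```python
import random

random.seed(7)

def meltdown_rate(n_units, p_fail, contagion, trials=10_000):
    """Fraction of trials in which more than half the units fail.
    contagion = extra failure probability added per already-failed unit."""
    meltdowns = 0
    for _ in range(trials):
        failed = 0
        for _ in range(n_units):
            p = min(1.0, p_fail + contagion * failed)
            if random.random() < p:
                failed += 1
        if failed > n_units // 2:
            meltdowns += 1
    return meltdowns / trials

# "Airline-like": failures independent. "Bank-like": each failure raises
# the odds of the next one, so errors compound and spread.
print("independent:", meltdown_rate(n_units=20, p_fail=0.05, contagion=0.0))
print("contagious: ", meltdown_rate(n_units=20, p_fail=0.05, contagion=0.08))
```

With independent failures a system-wide meltdown is essentially impossible; add even a modest contagion term and it becomes an everyday risk, despite identical baseline failure rates.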
A few helpful examples of the triad fragile/robust/antifragile: the mythical Hydra is antifragile, while the Phoenix is robust. The banking system is fragile, while Silicon Valley (characterized by entrepreneurship and innovation) is antifragile. Fragile systems hate mistakes; antifragile systems love them and learn through trial and error. Debt is fragile, venture capital is antifragile; centralized systems are fragile, decentralized ones are antifragile; etc.
Antifragility is the property of all systems that have survived, meaning that depriving them of stressors will hurt them and make them more exposed to hidden risks. By suppressing randomness and volatility we fragilize the system (for instance, by increasing moral hazard). Such changes usually come top-down and block antifragility, thereby constraining growth. Changes need to arrive bottom-up, through the process of trial and error and discovery.
Taleb is very sensitive to what he calls fragilistas: people who make us engage in artificial policies and naïve interventions that carry small benefits to society (and potentially large ones to themselves), but whose negative side effects on the system can be huge. This is how systemic risk is created: hidden risks build up because fragile systems are prevented from changing and blowing up before they become too big to fail. The role of the fragilista is to lend credibility to this systemic error through academic and policy-based advocacy. Systems need to be open to small but vital stressors – small errors we learn from and improve upon. A system exposed this way becomes antifragile and grows stronger with each shock. Fragilistas (policy-makers, economists, foreign policy hawks, politicians, bureaucrats, bankers, and many, many more) actively prevent that.
One of the most interesting concepts he develops is skin in the game as a way to understand ethics (I'm happy to see there's a new book carrying this exact title). Skin in the game means having direct exposure to one's own opinions/predictions/goals, so that if you're mistaken you lose money (or are otherwise exposed to the negative event). He stresses that the lack of skin in the game is one of the major causes of fragility. This is equally applicable to pundits/experts ("empty suits") unaffected by their predictions, managers (CEOs) unaffected by how their company performs or what impact it has on society (e.g. the environment), and politicians personally unaffected by the policies they implement. In other words, "learn from people who do what they teach! Ignore others" (this is from The Black Swan actually, but I feel it fits better here).
"To me, every opinion maker needs to have “skin in the game” in the event of harm caused by reliance on his information or opinion ... Further, anyone producing a forecast or making an economic analysis needs to have something to lose from it, given that others rely on those forecasts..."
In addition to all of this, the book offers some very deep and interesting thoughts on debt, innovation, health (especially), politics, foreign policy and wars, urban planning and city-states, personal finance, etc. And all of it done through the same framework of Popperianism and empirical skepticism. Like all his books, it is fascinating to read as Taleb jumps from topic to topic, from anecdotes to analyses, from order to disorder (sic!). In my opinion, this specific style is exactly what makes him such a compelling author. If you have to read just one of Taleb's books (even though they're all brilliant), read this one.