This will be the first in a series of recommended books, again dealing with new ideas in science. It might seem odd to start with a book like The Black Swan, since it may appear most relevant to people in finance and business, but Taleb would argue that his principles apply to all of us. He openly admits that he despises discussing finance with the media and would prefer to talk about his philosophies as they apply to our lives and society.
Nassim Taleb was at one time a Wall Street options trader (a so-called ‘quant’) who designed a successful investment strategy based on his philosophies. He popularized the idea of a ‘black swan’: a rare but highly consequential event that is generally unpredictable beforehand but seems entirely plausible afterwards. For example, if someone had told you on September 10, 2001, that radical extremists were going to hijack our airplanes and fly them into our buildings, you would have said that person was completely ridiculous, even insane. If that same person said it to you now, it would no longer seem silly. Such an event becomes plausible after the fact.
Similarly, there is no way to predict ahead of time many of the life-changing events that happen to us. This becomes especially true as our societies grow more interdependent, which at the same time makes us more vulnerable to particular events (see my Math vs. HIV post). Taleb argues that the probabilities of these rare ‘tail events’ are not computable. With respect to 9/11, unless you were working in government intelligence on the Middle East, there was no way for you or anyone else to predict the tragic events of that day. This is why Taleb gets extremely annoyed when people ask him what the next ‘black swan’ will be. The point is not to predict black swans; the point is to be robust to them.
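To make the point about incomputable tail probabilities a bit more concrete, here is a small illustrative sketch (my own, not from the book): even with large amounts of data, empirical estimates of a rare-event probability under a fat-tailed distribution swing wildly from one simulated ‘history’ to the next. The Pareto tail index and the threshold below are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

alpha = 1.5          # tail index; smaller alpha means a fatter tail
threshold = 100.0    # a 'rare event' cutoff (hypothetical)
true_p = threshold ** (-alpha)   # P(X > t) for a Pareto(alpha) with x_min = 1

for n in (1_000, 10_000, 100_000):
    # Each row is one 'history' of n observations; we simulate 20 histories.
    # rng.pareto samples a Lomax variate; adding 1 gives a classical Pareto.
    samples = 1.0 + rng.pareto(alpha, size=(20, n))
    estimates = (samples > threshold).mean(axis=1)
    print(f"n={n:>7}: true P={true_p:.5f}, "
          f"estimates range {estimates.min():.5f}..{estimates.max():.5f}")
```

Even at n = 100,000 observations per history, the estimated tail probability can differ by a large factor between histories, which is the intuition behind distrusting anyone who claims to have measured the odds of a black swan.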
This is where Taleb looks to nature as a model. He uses the example of shooting the largest animal in the world: while it would be a shame, it would not at all affect the survival of the species or of life in general. I would take it one step further. When the asteroid hit the Earth 65 million years ago, it was not the biggest and strongest species, i.e. the dinosaurs, that survived. It was the small mammals, which later allowed us to inherit the Earth. Robustness to black swans does not mean being the biggest or the strongest; it means being small and adaptable. This is why Taleb is so vehemently against companies that are ‘too big to fail’ and receive large bailouts when things go bad. Nature does not do bailouts.
Applying Taleb’s philosophies to our economy requires a fundamental shift in the way we structure our society. I believe there is a human psychological reason why we allow a few very large and very vulnerable companies to exist (under the guise of ‘optimization’ and ‘efficiency,’ to use business jargon). Taleb argues that we need to go the opposite way: we need to build in redundancies as a way to deal with the unexpected.
Overall, I thought The Black Swan was very enlightening. Will we change? I seriously doubt it. We can’t even manage simple changes (I always joke that if we can’t get ourselves to switch to the metric system, then there is no hope for any substantial change whatsoever). Only a massive crisis will force us to change. That arguably should have been the 2008 crisis, but instead our government did its best to smooth out the problem by going deeper into debt. Now it’s 2011, three years later, and here we are: debt crisis.
To see a lecture by Nassim Taleb and Daniel Kahneman on the recent financial crisis, which draws on studies of human psychology, click here.
Update 9.18.11: Taleb made a really interesting point about the recent earthquake in Japan. In 2003, the Japanese Nuclear Commission had the following goal: “The mean value of acute fatality risk by radiation exposure resultant from an accident of a nuclear installation to individuals of the public, who live in the vicinity of the site boundary of the nuclear installation, should not exceed the probability of about 1×10^-6 per year (that is, at most 1 per million years).” It took only 8 years for their ‘one in a million years’ accident to occur.
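As a back-of-the-envelope check (my own arithmetic, not Taleb’s): taking the stated 1×10^-6 per reactor-year figure at face value, the chance of seeing even one severe accident across Japan’s whole fleet within those 8 years would have been tiny. The reactor count below is an approximation I am assuming for illustration.

```python
# If each reactor independently had a 1-in-a-million chance of a severe
# accident per year, how likely is at least one accident somewhere in
# Japan over 8 years?
p_per_reactor_year = 1e-6
reactors = 50        # Japan's pre-2011 fleet, approximated for illustration
years = 8

p_no_accident = (1 - p_per_reactor_year) ** (reactors * years)
print(f"P(at least one accident in {years} years) = {1 - p_no_accident:.6f}")
# ~0.0004, i.e. roughly 1 in 2,500 -- so an accident arriving this soon
# suggests the stated 10^-6 per year figure badly underestimated the risk.
```

Under the commission’s own number, Fukushima was a roughly 1-in-2,500 outcome, which is exactly the kind of tail-probability overconfidence Taleb warns about.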