I finished reading Taleb’s second book, The Black Swan. He openly admits that it’s not really a new book, but a re-writing of his first book, Fooled by Randomness, which I loved. He’s been incredibly lucky with the timing of his book releases… just before 9/11 and just before the stock market laid a giant turd on the doorstep of all the happy talk from Wall Street. Especially lucky given that The Black Swan was at least 15 months late!
Taleb really has just one big idea, and in his own obnoxious way, he’s humble enough to admit it. His idea is that the world is less predictable than we think: “rare” events are both systematically more likely than we believe them to be, and their consequences are disproportionately large. He rails against the use of Gaussian distributions where they should not be used — against the mindless shoehorning of all kinds of processes into that bell-shaped box, where they do not belong and can do great damage.
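A quick way to see why this shoehorning matters: under a bell curve, large deviations become astronomically improbable, so any model that wrongly assumes normality will treat a “10-sigma” day as essentially impossible rather than as something markets actually produce. A minimal sketch of the tail arithmetic (standard library only; the sigma thresholds are just illustrative):

```python
from math import erfc, sqrt

def gaussian_tail(sigmas):
    """P(X > k*sigma) for a standard normal, via the complementary error function."""
    return 0.5 * erfc(sigmas / sqrt(2))

for k in (3, 5, 10):
    p = gaussian_tail(k)
    print(f"{k} sigma: P = {p:.3e}, i.e. once every {1 / p:,.0f} trials")
```

A 3-sigma event comes out at roughly once per 741 trials, but a 10-sigma event at once per ~10^23 — so a Gaussian model doesn’t just underweight extremes, it effectively rules them out, which is exactly Taleb’s complaint.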
I think the main difference between this book and his prior one is that here he provides a few short words on how he thinks we should live and plan, given that we live in an inherently, and increasingly, unpredictable world. That, and the fact that his prior book’s success apparently let him get away without having this one edited at all (which I think may have been a mistake… but oh well). Anyway, his advice in a nutshell:
- Learn to distinguish between those human undertakings in which unpredictability can be extremely beneficial, and those where the failure to understand the future causes harm. Participate in the former, avoid the latter if you can.
- Do not look for the precise and the local. Invest in preparedness, not prediction, à la Pasteur: “Chance favors the prepared mind.”
- Seize any opportunity, or anything that looks like opportunity. They are rarer than you think. Do your best to expose yourself to opportunities. Go to parties, and don’t put up with small talk. (He especially and specifically encourages scientists to go to parties.)
- Beware of precise plans by large bureaucratic entities. They tend to be organizationally incapable of not making precise plans and predictions, but they seldom if ever know what they’re talking about, and will generally turn out to have been wrong. These kinds of predictions are more a function of institutional social structure than of any kind of real knowledge.
- Do not waste your time trying to fight forecasters, stock analysts, economists, or anybody else who derives a living from making precise predictions. They won’t listen to you anyway, and the predisposition in humanity to believe stories is so strong that these people are probably always going to be around. At best, you can make them angry by ridiculing them, or profit from your knowledge that their predictions are almost certainly wrong on average.
Taleb claims to have exercised these rules in the creation of his “dumbbell” investment strategy, in which a small proportion of the overall portfolio is invested in extremely risky ventures that are likely to fail but, if they do succeed, may do so without any reasonable limit on the scale of their success (these are all essentially information-based ventures, either technology development or media; he prefers the former). The remaining portion of the portfolio is invested in the safest possible securities, e.g. inflation-protected Treasury bonds or T-bills. This poses an interesting problem, because the U.S. government is itself subject to currency-based Black Swans. Elsewhere in the book he actually uses the hyperinflation of the German mark between WWI and WWII as an example of a Black Swan (juxtaposing it with the portrait of Gauss, and his bell curve, on the last series of Deutschmarks issued before the Euro took over). So here at least I think he’s wrong. It would be nice to think that there was a safe store of value out there: gold, T-bills, canned food and ammunition… whatever. But I think the hard truth is that no such thing exists, certainly not on the timescale of centuries, which means the best we can do is maximize our exposure to scale-free successes, and avoid that which is known to be mediocre.
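The appeal of the strategy is that the downside is known in advance while the upside is open-ended. A toy simulation makes that asymmetry concrete — to be clear, every number here (the 90/10 split, the safe rate, the win probability and payoff multiple) is invented for illustration and does not come from the book:

```python
import random

def barbell_return(safe_frac=0.90, safe_rate=0.02,
                   n_bets=20, p_win=0.05, win_multiple=100.0, rng=random):
    """One-period return of a hypothetical barbell portfolio: safe_frac earns
    safe_rate; the rest is split evenly across n_bets long-shot ventures that
    each either go to zero or pay back win_multiple times the stake."""
    risky_frac = 1.0 - safe_frac
    stake = risky_frac / n_bets
    risky_value = sum(stake * win_multiple
                      for _ in range(n_bets) if rng.random() < p_win)
    return safe_frac * (1.0 + safe_rate) + risky_value - 1.0

# The floor is computable before placing a single bet: lose the whole
# risky sliver, keep the safe side.
worst = 0.90 * 1.02 - 1.0

random.seed(1)
outcomes = [barbell_return() for _ in range(10_000)]
print(f"floor: {worst:.1%}, observed min: {min(outcomes):.1%}, "
      f"observed max: {max(outcomes):.1%}")
```

The point of the exercise is structural, not the particular numbers: no draw can ever fall below the precomputed floor, while the best draws are limited only by how many long shots happen to pay off.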
I’m curious what Taleb would think of the synthetic biology debate, which is a case in which the very same technology will create exposure to both positive and negative Black Swans. I suspect he’d argue for Drew Endy’s point of view: that many small tinkerers are more likely to develop the technology beneficially than a few people who are going to be predisposed toward abuse. But technological intent and consequence are largely, if not completely, unrelated. That’s just not the nature of technology, or the nature of history. In retrospect we can always weave a narrative that ties what happened together into some coherent whole. We’re a storytelling species. That’s entirely different from being able to predict the future.
This book and Michael Pollan’s recent talk at the Long Now both peripherally touch on the inherent differences between endeavors of the material world, and those which are based on information. I really do think there’s something more general to be said about this division, but I don’t know who has said it already, or even if anybody has. Reading suggestions welcome.