The Black Swan

The Impact of the Highly Improbable

Black Swan events explain almost everything about our world, and yet we—especially the experts—are blind to them.

Video Summary by the Swedish Investor

Key Takeaways

1. The Black Swan Problem

The introduction of this video illustrates an example of a positive Black Swan. For an event to qualify as a Black Swan, it must fulfill the following:

1. It’s an outlier. Nothing that has happened before can convincingly point to even the possibility of the event.

2. It carries an extreme impact. 

3. It becomes explainable only after the fact. Human nature fools us into believing that we should have been able to know it would happen all along. 

Other examples of Black Swans are: 

  • the outbreak of WW1,
  • 9/11,
  • Black Monday,
  • the Indian Ocean earthquake and tsunami.

The Black Swan problem is also called “the problem of induction.” The basic principle is that there are great uncertainties when trying to forecast the future, given the knowledge of the past. How can we expect to be able to figure out the properties of something that is infinite and unknown, based on something that is finite and known? 

Another example: Imagine that you’re a turkey, living on a farm. From the day that you were born, some friendly creatures with two legs and two arms have been feeding you. Every day, they keep coming back with food! Also, they’ve built a nice fence which protected you from that hairy thing on four legs, which looked like it wanted to rip your guts out. They even bring female turkeys to you occasionally. What lovely creatures they seem to be! 

With every passing day, as you are fed, protected and stimulated, you become more and more certain that these must be the friendliest creatures on earth! Until that day called Thanksgiving, when they’re no longer so friendly anymore. Thanksgiving definitely came as a Black Swan for this turkey.

From the point of view of the turkey, it was totally unexpected, had an extreme impact, and if he could have reasoned around the event after its occurrence, he might even have found explanations for it. From this, we can conclude that a Black Swan is a sucker’s problem. It’s only a Black Swan if you’re not informed. 

This goes for randomness in general: it is nothing other than a lack of knowledge. For the turkey, ending up on the Thanksgiving table of some hungry human family was totally unexpected. But the family had probably known the exact date of the event for months.

There’s a cousin of the Black Swan called the Grey Swan. This concerns the “known unknowns” – things that we know that we don’t know. We don’t know how fast a man will ever run 100 meters, but we know that we don’t know that!

A Black Swan, on the other hand, is an “unknown unknown.” We don’t even know that we don’t know it. Imagine a book that you have never read – and whose summary you haven’t watched on my channel, of course. It contains a lot of unknown unknowns. If you decide to read the book one day, at some point you’ll exclaim: “Oh, I didn’t know that!” – but you can’t foresee today what content you’ll be exclaiming that about.

2. The Implications of Black Swan Blindness

Nassim Taleb talks about five issues related to the Black Swan that emerge from our blindness to it. You can learn more about several of them in Thinking, Fast and Slow.

1. The error of confirmation. We, as humans, are prone to draw conclusions from what we’ve seen to the unseen. 

For instance, I’ve heard a strange argument regarding whether graduating from college is a good idea. It goes something like this: “Well, Bill Gates, Thomas Edison, Richard Branson … and many other billionaires are school dropouts!” Let’s exaggerate a bit.

Even if we pretend that ALL billionaires are school dropouts, we can never, I repeat never, conclude the opposite from that statement – that all school dropouts are billionaires. That billionaires are school dropouts doesn’t confirm that it’s a good idea to drop out of school. Yet using it as an argument for leaving school is not uncommon, even though the argument is flawed at best.

2. The narrative fallacy. Last year, a friend of mine and I went to Australia for a few weeks of vacation. We went to a kinda remote place called Mission Beach, where basically the only things to do are skydiving and rafting. We had our eyes set on skydiving, but not for long … A friendly fella at our hostel told us about an accident that had happened the week before. In a tandem skydive, the parachute of the tandem jumpers and that of their cameraman had apparently twisted around each other during the jump, causing all three of them to fall to their deaths. My friend and I went rafting instead … It didn’t matter that when I asked my friend Siri, he told me that I was very unlikely to die in a skydive. Stories stick. Statistics do not. This is the narrative fallacy. Sorry for making you less likely to go skydiving.

3. We are not programmed for Black Swans. Humans are prone to believe in linear progression. We think that a certain input will gradually result in a desired output. Not so when Black Swans exist. 

Imagine the author who’s been spending many years writing books. After 10 years of intensely hard work, she finally has her first book published, and it becomes a blockbuster. That’s a typical Black Swan event. Imagine how demoralizing the first nine years must have been for this author, as all her friends expected her to have made progress long before.

4. The distortion of silent evidence. History has a tendency to hide Black Swans from us by filtering the reality that we are presented with.

Consider the sailors of the 15th century who came back from their voyages telling how they survived many storms by praying together. Does this imply that praying together makes it less likely for your ship to sink?

Well, maybe, but we must also consider the rest of the sailors – those that didn’t survive the voyages. Did they pray? Or well, did they not? We may never know, because they can’t really tell us from the bottom of the ocean! 

5. Tunneling. We tend to focus too much on what we know and shy away from what we don’t know. For instance, school teaches us many models for interpreting reality. Many of the inventors of these models were great thinkers, yet even they couldn’t think outside the box of their own models. And sometimes, being too narrow and relying too much on these models can be devastating for our interpretation of the real world, as we shall see in takeaway number four.

3. Mediocristan vs Extremistan

Mediocristan is the land of the average. In Mediocristan, the first 100 observations of a variable will give you a good expectation of what you might see for the, say, next 1,000 observations. The supreme law of this country is as follows: “When your sample size is large, no single instance will significantly change the aggregate or the total.”

  • For instance – weight, height, mortality rates, car accidents and the salary of a college graduate are all matters that belong to Mediocristan.
  • Consider the total height of the people in a sample of 100 Swedish males. Let’s say their combined height is 182 meters, at 182 centimeters per person, which is the height of the average Swedish male.
  • Now, let’s add the world’s tallest man in history to the sample – Robert Wadlow, who stood a staggering 272 centimeters tall.
  • The combined height is now about 184.7 meters, an increase of roughly 1.5 percent. Not too radical, right?
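The arithmetic in these bullets can be sketched in a few lines, using only the numbers from the text:

```python
# Mediocristan: adding the most extreme observation in history barely
# moves the total. All figures are taken from the bullets above.
AVG_HEIGHT_CM = 182   # average Swedish male
SAMPLE_SIZE = 100
WADLOW_CM = 272       # Robert Wadlow, tallest man in history

total_cm = AVG_HEIGHT_CM * SAMPLE_SIZE   # 18,200 cm = 182 m
new_total_cm = total_cm + WADLOW_CM      # 18,472 cm ≈ 184.7 m
increase_pct = (new_total_cm - total_cm) / total_cm * 100

print(f"Before: {total_cm / 100:.1f} m, after: {new_total_cm / 100:.1f} m")
print(f"Increase: {increase_pct:.1f}%")  # about 1.5%
```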

In Extremistan, we introduce Black Swans. Here, the first 100 observations might not give much information about the next 1,000 at all. Remember the life of the turkey? His first 100 days of being fed and protected by humans couldn’t really tell him that he was about to become Thanksgiving dinner on the 101st day.

Most human-made or social matters belong to Extremistan – such as wealth, income, book sales per author, number of subscribers per YouTuber, deaths in war, sizes of planets, and, of course, financial markets.

  • Consider the total wealth of the people in a sample of 100 Swedes. Let’s pretend that their total wealth is $19 million, using $190,000 per person, which is the wealth of the average Swede.
  • Now let’s add Warren Buffett to the sample, who currently has a net worth of $84.2 billion.
  • The total wealth is now $84.22 billion, which is an increase of approximately 443,100%. That’s quite the difference. 
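The same calculation for the Extremistan sample, again using only the figures quoted above, shows how a single observation dominates the total:

```python
# Extremistan: one extreme observation swamps everything else.
# Figures are the ones quoted in the bullets above.
AVG_WEALTH_USD = 190_000      # average Swede
SAMPLE_SIZE = 100
BUFFETT_USD = 84.2e9          # Warren Buffett's net worth per the text

total = AVG_WEALTH_USD * SAMPLE_SIZE   # $19 million
new_total = total + BUFFETT_USD        # ≈ $84.22 billion
increase_pct = (new_total - total) / total * 100

print(f"Increase: {increase_pct:,.0f}%")  # roughly 443,000%
```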

The implications are as follows: In Mediocristan, we can safely make some predictions. In Extremistan, it’s much more difficult, maybe even impossible.

Problems arise because we seem to think that we are living in Mediocristan, while almost all of the matters that we’re trying to forecast are matters of Extremistan. 

Sometimes we base important functions of our society on these predictions, such as the banking system, and that leads to economic crises, such as the financial crisis of 2007–2008.

4. Gaussian Schmaussian!

The bell curve – the normal distribution, as we refer to it in school, or the Gaussian curve, as we call it when we think that we’re honoring its original inventor, Carl Friedrich Gauss – is a very common tool for risk management among regulators and central bankers, among others.

Let’s talk about what this curve is first and foremost. Its fundamental property is that data from a given random variable, say height, will hover around the average. 

For instance, take the example of the Swedish males presented previously. The average height is 182 centimeters. If you look at the likelihood that someone is taller than, say, 189 centimeters, you’ll find that only 1 in 6.3 males is. Taller than 196 centimeters? 1 in 44. Taller than 203 centimeters? 1 in 740.

What’s important to notice here is not the actual numbers, but that the same incremental increase in our random variable (i.e., height) leads to an ever faster decline in the number of observations. You could say that the variable experiences an ever-increasing headwind whenever it tries to deviate from the mean.
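The ratios above can be reproduced from the normal distribution’s tail probability. The mean of 182 cm comes from the text; the standard deviation of 7 cm is an assumption chosen so that the three thresholds land exactly 1, 2, and 3 standard deviations above the mean:

```python
import math

def p_taller_than(height_cm, mean_cm=182.0, sd_cm=7.0):
    """Upper-tail probability P(X > height) for X ~ Normal(mean, sd)."""
    z = (height_cm - mean_cm) / sd_cm
    return 0.5 * math.erfc(z / math.sqrt(2))

for h in (189, 196, 203):
    print(f"Taller than {h} cm: 1 in {1 / p_taller_than(h):.1f}")
```

Running this reproduces the text’s figures: 1 in 6.3, 1 in 44, and 1 in 741 (the text rounds to 740).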

Now, the normal distribution is awesome when it’s applied under the right conditions, which is in Mediocristan. Then, our previous observations can tell a whole lot about potential future ones. It does not work with variables from Extremistan though. 

For example, if you would have modeled daily stock market returns as normally distributed, and you were faced with the Black Monday of 1987, where the market crashed by 22.6% in a single day, you would have to call this event an outlier. Because, according to the normal distribution, it should only happen once in several billion lifetimes. 
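To get a feel for how extreme such a day is under a Gaussian model, here is a rough sketch. The 1% daily volatility figure is an illustrative assumption, not a number from the book:

```python
import math

def upper_tail(z):
    """P(Z > z) for a standard normal variable."""
    return 0.5 * math.erfc(z / math.sqrt(2))

daily_vol = 0.01             # assumed 1% daily volatility
crash = 0.226                # Black Monday's one-day decline
sigmas = crash / daily_vol   # 22.6 standard deviations
p = upper_tail(sigmas)

print(f"A -22.6% day is a {sigmas:.1f}-sigma event, p ≈ {p:.1e}")
# The probability is so vanishingly small that, under this model, such
# a day should essentially never occur in the lifetime of the universe.
```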

Many investors went bankrupt because they didn’t even consider the possibility of something like this happening. It was a Black Swan to them. So, we see that a normal distribution can be limited, nay, dangerous to use for decision-making under circumstances when we deal with matters from Extremistan. 

And remember, this is pretty much all human-made social matters! Then the question becomes, what can we do instead? We can use something Nassim Taleb calls Mandelbrotian randomness instead of pretending that everything is normally distributed.

Mandelbrotian randomness does not assume that deviations from the mean become increasingly difficult. Instead, it suggests that, for instance, if we talk about single-day losses in the stock market, it’s just as rare to see a day with a return of -5% rather than -2.5% as it is to see -10% rather than -5%. Going from -1.25% to -2.5% to -5% to -10% to -20% are all equally likely steps – in terms of probability, I mean.
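A minimal sketch of this scale invariance, assuming a power-law tail P(loss > x) ∝ x^(-alpha); the exponent alpha = 1 is an illustrative choice, not a value from the book:

```python
# Power-law ("Mandelbrotian") tail: P(loss > x) = c * x**(-alpha).
# Doubling the loss always multiplies the tail probability by the same
# factor (0.5 here, since alpha = 1) - unlike the Gaussian, where each
# doubling becomes astronomically less likely.

def tail_prob(x, c=1.0, alpha=1.0):
    return c * x ** (-alpha)

for lo, hi in [(1.25, 2.5), (2.5, 5.0), (5.0, 10.0), (10.0, 20.0)]:
    ratio = tail_prob(hi) / tail_prob(lo)
    print(f"P(loss > {hi}%) / P(loss > {lo}%) = {ratio:.2f}")  # always 0.50
```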

Had we assumed this on the Friday before Black Monday, we would at least have considered a -22.6% decline a possibility, and we might have been able to protect ourselves from such an event.

By assuming Mandelbrotian randomness instead of the normal distribution, we can turn some Black Swans into Grey Swans. Grey Swans are known unknowns as we discussed in the first takeaway. They are better than Black Swans because at least we can adapt our decision-making to something that we know that we don’t know about.

5. How to Act as an Investor in an Environment of Black Swans

In a world dominated by Black Swans – where we fool ourselves with the error of confirmation, the narrative fallacy and tunneling, and where we must be very careful when using the Platonic normal distribution for anything useful – what should we do?

Nassim Taleb suggests two different approaches: 

1. The hyper-conservative and hyper-aggressive approach. Don’t put your money in some medium-risk investments, because let’s face it – how do we know that it’s medium risk anyway? Did some “expert” compute that using a normal distribution, perhaps?

Instead, put a majority of your money in something extremely safe, like Treasury bills. These aren’t hedged against Black Swans either, but if you lose your money in Treasury bills, you’ll have bigger problems than just losing your investment capital … 

The rest of the money should be put in something extremely speculative, like options or angel investments. With this type of portfolio, you are limited in your risk because of your hyper-conservative investments, but you are also exposed to the possibility of hitting a positive Black Swan with your hyper-aggressive ones. Nassim Taleb refers to this as a convex combination.
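A toy version of this barbell split, where the 90/10 allocation, the 2% safe return, and the 50x speculative payoff are all illustrative assumptions rather than numbers from the book:

```python
# Barbell portfolio sketch: downside bounded, upside exposed.
# The 90/10 split and the payoff scenarios are assumed for illustration.
capital = 100_000
safe = capital * 0.90          # e.g. Treasury bills
speculative = capital - safe   # e.g. options, angel investments

# Worst case: the speculative sleeve goes to zero; T-bills return ~2%.
worst_case = safe * 1.02
max_loss_pct = (capital - worst_case) / capital * 100
print(f"Maximum loss: {max_loss_pct:.1f}% of capital")  # bounded near 8%

# Positive Black Swan: the speculative sleeve returns 50x.
best_case = safe * 1.02 + speculative * 50
print(f"Upside in that scenario: {best_case / capital:.1f}x capital")
```

The convexity is visible in the numbers: the loss can never exceed the speculative sleeve (minus the safe yield), while the gain is unbounded above.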

2. The speculative, insured portfolio. The second option is to have a very speculative portfolio but to insure it against losses greater than, for example, 15%. This might not always be possible though, depending on what your portfolio consists of.

Instead of using an actual insurance company, you can create this effect yourself by putting up stop losses at minus 15% and taking multiple bets with small parts of your equity. This strategy is also convex. Your risk is limited, but your upside is exposed to positive Black Swans. 
