
The Black Swan

by Nassim Nicholas Taleb
The Impact of the Highly Improbable

Just because you haven't seen something doesn't mean it doesn't exist, right? Nassim Nicholas Taleb uses this exact logic to explain the Black Swans that happen in our society. A Black Swan is an improbable or highly unlikely event with three principal characteristics. The first two are that it is unpredictable and that it carries a massive impact. The third is that we construct explanations after the fact to make it appear less random and more predictable than it was. Think of events like 9/11 or the invention of Google. These Black Swans, while unpredictable and impactful, were easily explained in the moments following the event. Black Swans like these underlie almost everything about the world. But why can't we acknowledge them until after they occur? According to Taleb, humans are simply hardwired to focus on the details rather than see the big picture. We concentrate only on what we know and understand; therefore, we are unable to conceptualize the impossible. As you read, you'll learn a thing or two from turkeys, see how a casino's greatest threat isn't its high-rolling gamblers, and discover why focusing on what we don't know is critical for making informed decisions.
"The Black Swan" Summary
Summary by Lea Schullery. Audiobook narrated by Alex Smith
Introduction
Did you know that before the discovery of Australia, people in the Old World believed that all swans were white? While the first sighting of a black swan must have been quite surprising, the story illustrates more than how a swan's feathers can be black as well as white. It illustrates a severe limitation of learning from observation or experience, and the fragility of our knowledge. A single observation invalidated a belief derived from millions of sightings of white swans; all it took to derail the entire system was the sight of one black bird. Just because you haven't seen a black swan doesn't mean they don't exist. Taleb uses the black swan as a metaphor for the Black Swans of society: seemingly random, unlikely events that have had profound consequences, and that are easy to explain in hindsight yet far harder to predict in advance.
Examples of past Black Swans include the terrorist attacks of 9/11, the outbreak of World Wars I and II, the bursting of the dot-com bubble, and even the invention of the personal computer and the Internet. Black Swans can also be cultural fads, like the Harry Potter books, that have greatly impacted society. This combination of low predictability and large impact is what makes the Black Swan such a puzzle. Throughout The Black Swan, Taleb aims to point out the shortcomings of human thinking and to show how our world is dominated by the extreme, the unknown, and the improbable. And despite our progress in knowledge, the future is only becoming less predictable.
Chapter 1: Humans Are Notoriously Bad at Making Predictions
Author Nassim Nicholas Taleb grew up in Lebanon, a place he considered paradise. That was, of course, until the Christians and the Muslims, after thirteen centuries of coexistence, began a fierce civil war. How could anyone have known this would happen? Suddenly combat zones sprang up in the center of his town, and his high school was a mere few hundred feet from the fighting. A Black Swan came out of nowhere and changed the course of history for Taleb and his country.
Unfortunately, history is opaque. You can see what happens after the fact, but you can't see the script that produces the events. This is partly due to what Taleb calls the triplet of opacity. The first leg of the triplet is "the illusion of understanding," or how everyone seems to know exactly what's going on when in fact they have no idea. Taleb, for example, was constantly told by adults that the war would end in "only a matter of days." Little did they know that it would last close to seventeen years.
People seemed quite confident in their predictions; in fact, a number of them sat waiting in hotel rooms and other temporary housing in Cyprus, Greece, France, and elsewhere for the war to finish. This "duration blindness" is a widespread disease and can be seen throughout history. There are stories of Cuban refugees who came to Miami in the 1960s, suitcases still half-packed, expecting to stay for "a matter of a few days" after the installation of the Castro regime. There are also Iranian refugees who fled to Paris and London in 1978 thinking their absence would be a brief vacation. Some are still waiting, more than a quarter-century later, to return.
The dynamics of the Lebanese conflict were seemingly unpredictable. Every day, new events took place that lay completely outside the forecast and were deemed completely crazy. Yet those same events didn't seem so crazy once they had happened. Taleb concluded that the human mind is a wonderful explanation machine, capable of making sense of almost anything, yet generally incapable of accepting the idea of unpredictability.
Chapter 2: Lessons From a Turkey
As humans, we possess a major flaw. How can we logically go from specific instances to reach general conclusions? How do we know what we know? You see, humans have the habit of creating narratives based on what they observe and know. While we like to believe that this habit makes us intelligent beings, we often make mistakes because we fail to account for what we don’t know. To explain this further, we can take a lesson from the turkey.
Consider a turkey that is fed every day. Each feeding further confirms the turkey's belief that the human race is good and that the human feeding it each day is "looking out for its best interests." Then, on the Wednesday before Thanksgiving, something unexpected happens: the turkey is killed and plucked clean, filled with various herbs and spices, and roasted in the oven for the humans to enjoy on Thanksgiving Day. The turkey problem can be generalized to any situation where the same hand that feeds you can be the one that wrings your neck. Consider the increasingly integrated German Jews of the 1930s, who were lulled into a false sense of security before Hitler and the Nazis enacted their plan to exterminate them.
This story shows that we don't actually know what we think we know. We believe that because something has worked in the past, it will keep working, until it unexpectedly no longer does. What we have learned from the past can turn out to be irrelevant or false, or even worse, viciously misleading. Similarly, humans naturally tend to look only for corroboration. This vulnerability is what psychologists call confirmation bias: we seek out information that supports our own views and reject information that goes against them.
One experiment that illustrates this tendency was conducted by the psychologist P. C. Wason. Participants were presented with the three-number sequence 2, 4, 6 and asked to guess the rule generating it. Their method of guessing was to produce other three-number sequences, to which the experimenter would respond "yes" or "no" depending on whether the new sequence was consistent with the rule. The correct rule was simply "numbers in ascending order," and nothing more. Few subjects discovered it, because they were confident in the rules they had already formed in their minds: they consistently offered sequences that confirmed their own rules rather than sequences that could refute them.
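To make that asymmetry concrete, here is a minimal Python sketch of the 2-4-6 task (my own illustration, not from the book; the function names and the "add 2 each time" hypothesis are assumptions chosen for the example). The point is that guesses which merely confirm a private hypothesis never distinguish it from the experimenter's broader rule.

```python
# A minimal sketch (illustrative, not from the book) of Wason's 2-4-6 task.
# The experimenter's hidden rule is simply "numbers in ascending order."
# A participant who privately guesses "each number is 2 more than the last"
# and only tests sequences that FIT that guess will keep hearing "yes"
# and never learn that the guess is too narrow.

def hidden_rule(seq):
    """The experimenter's actual rule: strictly ascending numbers."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def private_hypothesis(seq):
    """A participant's guessed rule: each number is 2 more than the last."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

confirming_guesses = [(8, 10, 12), (100, 102, 104)]  # chosen to fit the guess
probing_guess = (1, 2, 3)                            # violates the guess on purpose

for seq in confirming_guesses:
    # Both print "yes" -- but they are consistent with the hidden rule and with
    # many other rules, so they teach the participant nothing new.
    print(seq, "->", "yes" if hidden_rule(seq) else "no")

# Only a sequence that the private hypothesis forbids is informative: it also
# gets a "yes," showing the hypothesis is narrower than the real rule.
print(probing_guess, "->", "yes" if hidden_rule(probing_guess) else "no")
```

Seeking out the sequence that could say "no" to your own rule is exactly the disconfirming move most of Wason's subjects never made.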
Similarly, let’s consider politics. In the United States, you have two parties: the Democratic Party and the Republican Party. Suppose you support the conservative Republican Party and stumble upon an article that paints the Republican candidate in a negative light. Naturally, you might become upset and angry. After this, you might go to the Internet and search for articles published by conservative news outlets that support your views rather than go against them.
Confirmation bias is the equivalent of believing that witnessing one more white swan confirms that there are no black swans. In other words, this type of thinking can be absurd and dangerous. Unfortunately, there isn't much we can do about it, since this hunger for corroboration is rooted in our intellectual habits; it is simply in our nature.
Chapter 3: Our Tendency to Create Stories Distorts Our View of the World
Humans like stories. We like to summarize and to simplify. Stories help us make sense of the past, and this tendency is what Taleb calls the narrative fallacy. We are vulnerable to overinterpretation and prefer compact stories over raw truths, which severely distorts our mental representation of the world. The narrative fallacy describes our limited ability to look at a sequence of facts without weaving an explanation into it, or forcing a logical link, an arrow of relationship, upon it. These explanations bind facts together; they make them more easily remembered and help them make more sense.
But why do we do this? We are faced with incredible amounts of information every day. We can't make sense of it all, so our brains select only the information they deem important. The more orderly, less random, and narratized a series of words or symbols is, the easier it is to store in the mind. Consider a collection of words glued together to create a 500-page book. If the words are purely random, you will not be able to summarize, transfer, or reduce the dimensions of that book without losing something significant. The more we simplify information, the less random the world appears. Unfortunately, we leave the Black Swan out of this simplification.
When we create narratives, we prevent ourselves from gaining any meaningful understanding of the world. Suppose I ask you to recall events from your past. You are more likely to remember the facts that fit into a narrative, while neglecting the ones that don't seem to play a causal role in it. This tendency to remember not the true sequence of events but a reconstructed one makes history appear, in hindsight, more explainable than it actually was. When we look into the past, we forget to take into account the countless explanations that are possible for a single event.
Look at the butterfly effect, for example. A single butterfly flapping its wings in New Delhi may be the cause of a hurricane in North Carolina, though the hurricane may take place a couple of years later. If we only look at the hurricane in North Carolina, it becomes overwhelming to think of all the possible causes; there are billions upon billions of such small things, wing-flapping butterflies in Timbuktu or sneezing wild dogs in Australia, that could have caused it. This simply illustrates how we fail to take into account every possible cause of major events.
Chapter 4: The Distinctions Between Scalable and Non-Scalable Information
Taleb remembers one of the most important pieces of advice he ever received. In retrospect, the advice was bad, but it pushed him deeper into the dynamics of the Black Swan. He was just 22 years old when a fellow Wharton student told him to get a profession that is "scalable," that is, one in which you are not paid by the hour and thus not subject to the limitations of the amount of your labor. You see, while we humans constantly try to make sense of the world around us, we struggle to distinguish between scalable and non-scalable information.
For example, some professions cannot be scaled: there is a cap on the number of patients or clients you can see in a given period. If you open a restaurant, you can steadily fill the room with hungry patrons, but there is a limit to how many you can serve at any given time. This kind of work is largely predictable: it will vary, but a single day's revenue won't have the potential to drastically change your life. In other words, these professions are not Black Swan driven.
Other professions allow you to produce more and make more money with little or no extra effort. These professions are occupied by "ideas" people rather than "labor" people. As an ideas person, you don't necessarily have to work hard, just think hard; the same work can produce a hundred units or a thousand. A writer expends the same effort to attract one reader as she would to capture several hundred million. J.K. Rowling, the author of the Harry Potter books, wrote each book only once; she doesn't have to rewrite it each time someone wants to read it. A baker, on the other hand, must bake a new loaf of bread for each additional customer. You can apply this distinction between scalable and non-scalable professions to other areas of the world as well. In fact, it allows us to make a clear-cut differentiation between two varieties of uncertainty, or two types of randomness.
Assume that you select a thousand people at random and have them stand next to one another in a stadium. The sample will vary in height and weight, right? You'll have some very tall people, perhaps over seven feet, and some very short people, perhaps around three feet. However, nature limits how tall humans can grow; your sample wouldn't include 50-foot giants, despite what Game of Thrones or Harry Potter characters might suggest. Even the tallest person there would represent no more than about 0.6 percent of the group's total. Things like height and weight are limited, meaning they are non-scalable. With non-scalable information, it is possible to make fairly accurate predictions.
Now take that same group of people and add the wealthiest person on the planet, Jeff Bezos, the founder of Amazon, whose net worth is about $111 billion. Meanwhile, the wealth of the other 999 people would cap out at around a few million each. In this case, his wealth would represent roughly 99.9% of the group's total wealth. A person's height or weight can never represent that kind of share; if it did, that person would need to weigh fifty million pounds! When you have such a radical skew in the distribution, you are in a place Taleb refers to as "Extremistan."
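Here is a minimal Python sketch of the same contrast (my own illustration, with made-up distribution parameters, not figures from the book). Heights are drawn from a thin-tailed normal distribution, a stand-in for the limited world of height and weight, while wealth is drawn from a heavy-tailed Pareto distribution, a stand-in for Extremistan; the share of the total claimed by the single largest observation behaves very differently in the two worlds.

```python
# A rough simulation of limited vs. scalable quantities (illustrative parameters only).
import random

random.seed(0)
N = 1000

# Limited (non-scalable): heights in centimeters, thin-tailed and capped by nature.
heights = [random.gauss(170, 10) for _ in range(N)]
tallest_share = max(heights) / sum(heights)

# Extremistan (scalable): wealth drawn from a heavy-tailed Pareto distribution,
# where a single draw can dwarf the sum of all the others.
wealth = [random.paretovariate(1.1) for _ in range(N)]
richest_share = max(wealth) / sum(wealth)

print(f"Tallest person's share of total height: {tallest_share:.2%}")   # a fraction of a percent
print(f"Richest person's share of total wealth: {richest_share:.2%}")   # can be a large chunk of the total
```

Rerunning the wealth simulation with different seeds gives wildly different answers, which is exactly the point: in Extremistan a single outlier can dominate the total, so averages drawn from past observations tell you little about what the next draw might do.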
The problem with Extremistan is that it is nearly impossible to predict when outliers will occur or what impact they might have. The legendary screenwriter William Goldman once exclaimed, "Nobody knows anything!" when discussing the prediction of movie sales. Life doesn't always fit the shape of a bell curve, yet we keep explaining away random events as "outliers." That is what many of us do every day. Furthermore, we follow the herd and look to experts for guidance, which brings us comfort and makes us feel as if we are in control. But then the stock market drops, or 9/11 happens, or something like the Internet or Game of Thrones comes along and throws off the curve.
Chapter 5: Casinos and the Ludic Fallacy
When it comes to taking risks, humans typically try to be cautious. That's how insurance companies make a living, right? It feels too risky to live without coverage, so we spend thousands of dollars or more a year paying for something in case we end up needing it. As humans, we try to measure risks as accurately as we can to ensure that we get the most "bang for our buck" and the most out of life.
Unfortunately, when we measure risks, we fall into the trap of being too confident: confident that we know all the possible risks we should protect ourselves against. This is what Taleb calls the ludic fallacy. Ludic comes from ludus, Latin for games. So it should come as no surprise that casinos often fall prey to the ludic fallacy in their approach to protecting themselves against possible threats. A casino's risk management, for instance, is geared toward reducing losses from cheaters and toward controlling the "whales," the high rollers who travel thousands of miles to bet several million dollars in a single gambling bout.
Yet despite surveillance systems worthy of a James Bond movie, the largest losses the casino incurred fell completely outside these sophisticated models; they were "outliers." First, the casino lost around $100 million when an irreplaceable performer in its main show was maimed by a tiger. The tiger had been raised by the performer and had even slept in his bedroom, and no one expected the wild, powerful animal to turn on its master. The casino had even prepared for the risk of the tiger jumping into the crowd, but nobody thought to insure against what actually happened.
The second loss involved a disgruntled contractor who had been injured during the construction of the hotel's annex. Offended by the settlement offer, he attempted to dynamite the casino by placing explosives around the pillars in the basement. His attempt was thwarted, of course, but the event still cost the casino more money than it expected. Another loss stemmed from a series of dangerous events, including the kidnapping of the casino owner's daughter, which drove him to secure ransom cash by dipping into the casino's coffers in violation of gambling laws.
Ultimately, the dollar value of these Black Swans overwhelmed the predicted risks by a factor of close to 1,000 to 1. In other words, the casino spends hundreds of millions of dollars on gambling theory and high-tech surveillance, while the bulk of its risk comes from outside its models.
Chapter 6: The Nerd Effect and How to Avoid It
As we go through life, we typically focus on what we do know and not on what we don't. Unfortunately, this type of thinking doesn't allow us to see all the possible outcomes, creating a breeding ground for Black Swan events. For instance, someone who studied the stock market between 1920 and 1928 might believe he has a good understanding of market trends and feel confident playing the market. Then the crash of 1929 happens, and suddenly everything he thought he knew is no longer true.
Taleb calls this focus on only what you know the nerd effect. You view the world through a model and think exceedingly inside the box; you become a nerd. Think of all the straight-A students who end up going nowhere in life, while someone who constantly fell behind in school and hated following the rules ends up successful. Someone who does well in school might do well on an IQ test and in an academic setting, but those who think outside the box generally perform better in real-life situations.
For instance, a nerd learning a new language might read a grammar book cover to cover and memorize the rules, then believe he understands the grammar and vocabulary well enough to speak the language. In reality, languages grow organically, and understanding grammar doesn't necessarily mean you can hold a conversation. A non-nerd, by contrast, might pick up a language by heading out to the bars, picking up chicks, and talking to cabdrivers, then fitting the grammatical rules (if needed) onto the knowledge he already possesses. He goes against the rules and skips the traditional route.
According to Taleb, we need to focus on what we don't know in order to reduce our risks. When gamblers know the rules of the game, they can determine the probability of an opponent beating them. But good gamblers also focus on what they don't know, like the strategy their opponent is employing or how much he is willing to wager. Considering these unknowns means they don't simply focus on the cards in hand; they weigh many different factors, which allows them to take informed risks and increases their odds of winning.
When it comes to Black Swans, our lives will be full of them. You can't predict randomness, but there are ways to embrace Black Swans and make better decisions. Simply knowing they exist means you can keep your eyes open for them. And by understanding where your ignorance lies, you gain an advantage: you can learn more and try to fill in those gaps. Randomness is simply a part of life, but we can at least take some control by learning about the vast complexity of our world and lessening the damage we create through our own ignorance.
Chapter 7: Final Summary
It is only human nature to try to make sense of the world around us. We make predictions, we explain away randomness, and we reduce the vast knowledge of the world into easy-to-read stories. As a result, we become far too confident in what we know and underestimate what we don't. Unfortunately, this pattern of behavior contributes to poor decision-making, which leaves us vulnerable to the Black Swan: an event we believe to be impossible until it happens, forcing us to rethink everything we once knew and understood.
