Enoch’s Thoughts

September 27, 2009

Uncertainty and hubris

Filed under: Uncategorized — etblog @ 4:05 pm

“Before I went to Tech, I couldn’t even spell ‘engineer’, now I are one.” Similarly, I only learned the word hubris a few years ago. I knew many examples of hubris; I just didn’t have a good fifty-cent word to describe them.

I’ve just finished reading The Black Swan – The Impact of the Highly Improbable, by Nassim Nicholas Taleb. It was, at turns, controversial and obvious, worrisome and strangely comforting, annoying and great fun. NNT, as he is often called, is a clever writer, well-read and well-educated, practical, and self-assured. He has apparently thought about, and worked with, uncertainty, probability, luck, and knowledge for most of his life.

Trying to summarize any book is risky, particularly one with such complex and nuanced subject matter. The term “Black Swan” refers to a significant event that no one has seen before. NNT says that Black Swans have three attributes – “unpredictability, consequences, and retrospective explainability.” Trying to explain the events leading to a Black Swan is similar to using a puddle of water on the kitchen counter to deduce the shape of the ice cube before it melted (or to prove, indeed, that there ever was an ice cube).

Black Swans may be positive or negative. They may occur suddenly, such as a financial disaster or world event, or they may unfold more slowly, such as the discovery of coherent light (the laser) and its many children, including optical computer networks, CDs, and DVDs.

Because the book is full of stories, illustrations and examples, it was fun to read. He includes an appropriate number of well-chosen graphs and pictures, and only a few “math” sections, which he warns about in advance so the “non-technical” reader may skip them. He seasons his arguments with quotations ranging from Yogi Berra to Benoit Mandelbrot.

He does not hesitate to say that people who use precise mathematical models to predict “risk,” such as financial analysts, are foolish. He thinks that the Nobel Prize in Economics is generally given to charlatans. Needless to say, this has not made him popular with a large segment of professional money managers and other people who get paid quite well to make profound predictions, nor with the bogus experts he calls “empty suits.”

I won’t claim to have already figured out the things NNT talks about, but some of the points of the book supported some simple personal observations that have bugged me for some time.

  1. People who base their beliefs on anecdotes instead of statistics. As a recent example, consider people who tell stories about long waiting lines for medical treatment in a country with universal health care (anecdotes), but ignore World Health Organization statistics which show otherwise.
  2. What I call “straight-line projections,” wherein someone bases decisions on a few data points that can be extended in a straight line to demonstrate a wildly successful outcome. As examples for this one, I offer several businesses (including a national furniture chain and a popular doughnut maker) that appeared to have recently borrowed money and expanded their stores without any consideration for possible uncertainties in their calculations.
  3. A lack of understanding of “percentage growth,” which is doomed to decrease as the base grows, even when the absolute growth stays the same. When I moved to the county I live in, it was experiencing “double-digit” growth. Now that growth has “slowed” to single digits, even though the number of people entering the county is larger every year. Perhaps that misunderstanding of the difference between absolute growth and percentage growth accounts for some of our budgetary problems.
  4. The smug tone of documents which predict market changes and project the popularity of proposed new service ideas. I don’t know what bugs me more: the overconfident perspective, or the fact that things never happen as predicted.
  5. The fact that most significant inventions are discovered accidentally, rather than by rational, linear thinking. This one reinforces my own particular research style, which is to just play around with interesting stuff until something useful emerges. (I’ll let you know how that turns out.)
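Point 3 above is easy to see with a little arithmetic. Here is a minimal Python sketch, using made-up numbers (a county of 80,000 people gaining a constant 10,000 newcomers a year), showing how the percentage growth rate falls every year even though the absolute growth never shrinks:

```python
# Hypothetical county: the same number of people arrive every year,
# but each year they are a smaller fraction of an ever-larger base.
population = 80_000
newcomers = 10_000  # constant absolute growth

for year in range(1, 6):
    rate = newcomers / population * 100  # growth as a percent of the current base
    print(f"Year {year}: population {population:,}, growth {rate:.1f}%")
    population += newcomers
# Year 1 starts at 12.5% ("double-digit" growth); by Year 5 the rate
# has "slowed" to 8.3%, even though 10,000 people still arrive each year.
```

The rate is just newcomers divided by the current base, so as long as the base keeps rising, the percentage must fall – no actual slowdown required.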

Perhaps the most dramatic thing I read in the book was a footnote in a section bemoaning the concentration of U.S. financial institutions into gigantic, interconnected, bureaucratic banks, a frightening “financial ecology” largely based on convincing but overly simplistic risk measurements. “[T]he government-sponsored institution Fannie Mae, when I look at their risks, seems to be sitting on a barrel of dynamite….” The book was published in 2007.

He also discusses other patterns of human behavior that contribute to our short-sightedness, including

  • the difference between absence of evidence and evidence of absence (just because I’ve never seen something happen doesn’t mean it never has or will),
  • confirmation bias (only noticing events that confirm what you already think),
  • the “narrative fallacy” (we prefer compact stories over raw truth),
  • our inability to resist explaining why things happened as they did, even though the events, players, and interactions are complex beyond anyone’s understanding (this is particularly prevalent in the political arena), and
  • the fallacy of silent evidence (history only records the parts that fit the understanding of the historian).

One more thought came to me as I read the book. Years ago (and I’ve forgotten where), I read a comment on the study of “non-linear equations.” It turns out that the primary reason we study linear equations is not because there are so many linear systems in the world (there aren’t), but because the mathematics are manageable. The comment was something to the effect that referring to non-linear equations is a little like referring to zoology as the study of “non-elephant animals.” That seems to be the only explanation for why we use overly simple mathematical models to do projections: because the math is manageable. And because they seem to work. That is, until they fail dramatically.

In the end, I was left feeling like the Christian who decided he would turn every one of his life-decisions over to God. He sat on the edge of his bed for four hours one morning waiting for divine guidance, until he finally realized he was going to have to pick out which shirt to wear all by himself. (There’s a book I read long ago called Decision-Making and the Will of God that, as I recall, addresses this poor fellow’s dilemma.)

While NNT has some modest suggestions for how we should then live, he claims that the main benefit of this analysis is a better understanding of how things work, a respect for the possibility of errors and worst-case scenarios, and an awareness of the potential impact of Black Swans on our lives.

Good luck!

