Although randomness is all around us, we often fail to recognize the profound role it plays in our lives. Evolutionary biology and faulty intuition lead us to make mistakes that can radically distort our perception of reality. We incorrectly attribute causes to random events, make flawed decisions in the face of uncertainty and consistently underestimate the role of luck.

In “The Drunkard’s Walk: How Randomness Rules Our Lives,” by Leonard Mlodinow, the author attempts to tame the chaos of human misunderstanding through a narrative describing the path humans took in discovering how the world really works.

Mlodinow’s message is that while sloppy thinking and our biology may drive us to think in illogical ways about randomness, we can use the science of probability to interpret more accurately what we observe in the world around us. We can also get better at making smart choices in the face of uncertainty through thinking, practice, and experience.

### What I like about the book

Mlodinow is smart, funny, and seriously well-educated. This advantageous combination allows him to present sometimes dense and abstract material in a highly enjoyable and accessible way. He makes it fun. I enjoyed how he presented statistical ideas in an integrated way, with one concept leading to the next. Mlodinow takes us through a history of probability, and examines big milestones in its evolution, going from early rules and proceeding to sophisticated modern concepts.

Probability had a late start compared with other branches of mathematics. The Greeks had considerable mathematical sophistication, but no formal system for thinking about probabilities, even though they gambled with astragali, an early form of dice. They also used astragali when consulting their oracles. If chance was an expression of divine will, there was little sense in trying to measure it.

The Romans also lacked an understanding of basic probabilities. Roman law, seeking to quantify uncertainty, used the concept of the half proof: Two half proofs combined to make a whole proof. But probabilities are not additive: No finite number of half proofs can result in certainty, since to compound probabilities you must multiply them.

Scholars struggled in relative ignorance for centuries. A famous French mathematician reasoned as follows: the number of heads that can come up when flipping two coins is either 0, 1, or 2. Since there are three possible outcomes, the chance of each is 1 in 3.

He was wrong.

It took Gerolamo Cardano, a 16th century Italian, to establish the law of the “sample space,” which is the set of all possible outcomes of a random process. For instance, if you flip two coins, there are not three, but four potential outcomes: (heads, heads), (heads, tails), (tails, heads), and (tails, tails). The odds of an event happening depend on the different ways outcomes can occur.
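Cardano’s sample-space rule is easy to verify by brute force. Here is a minimal Python sketch (my own illustration, not from the book) that enumerates the two-coin outcomes:

```python
from itertools import product

# Enumerate the full sample space for two coin flips.
outcomes = list(product("HT", repeat=2))
print(len(outcomes))  # 4 outcomes, not 3

# Probability of exactly two heads: 1 favorable outcome out of 4.
two_heads = sum(1 for o in outcomes if o.count("H") == 2)
print(two_heads / len(outcomes))  # 0.25, not 1/3
```

Counting equally likely outcomes in the sample space, rather than merely listing distinguishable results, is exactly what the French mathematician above failed to do.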

In 17th century France, the sample space insight was the basis for Blaise Pascal’s famous triangle, which is a shorthand way to evaluate probabilities involving larger numbers of outcomes.

Suspend your disbelief, and assume the Warriors and the Sixers are in the NBA finals, and each has a 50% chance of winning any game. The finals are a best-of-7 series, and the Sixers win the first 2 games. There are at most 5 games left to play, so how many ways can the Sixers win? To find out, we move across row 6 of Pascal’s Triangle, whose entries are 1, 5, 10, 10, 5, 1:

The Sixers win if the Warriors win 3 games (10 ways), if the Warriors win 2 games (10 ways), if the Warriors win 1 game (5 ways), or if they win no games (1 way). This makes 26 ways to win, and the sample space (the sum of the row) is 32, so the probability is now 26/32 = 81%. Nifty. Incidentally, this is the type of example interspersed throughout the book that makes it a fun read.
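The same count can be reproduced with binomial coefficients, which are exactly the entries of Pascal’s Triangle. A quick sketch (my own, keeping the 50/50-per-game assumption from the example):

```python
from math import comb

# Sixers lead 2-0; at most 5 games remain.
# The Sixers take the series unless the Warriors win at least 4 of those 5,
# so count the outcomes where the Warriors win 0, 1, 2, or 3 games.
ways_sixers_win = sum(comb(5, k) for k in range(4))  # 1 + 5 + 10 + 10
sample_space = 2 ** 5                                # sum of the row: 32
print(ways_sixers_win, sample_space, ways_sixers_win / sample_space)
# 26 32 0.8125
```

The `comb(5, k)` values are row 6 of the triangle; summing the favorable ones and dividing by the row total gives the 81% directly.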

But this assumes both teams are equal and each will win 50% of the time. How many games must they play before you can be almost certain this is true?

Jakob Bernoulli’s “golden theorem,” published posthumously in 1713, tells you that number for any level of near-certainty you choose. As the teams play more games, the average of the outcomes converges to the theoretical probability of 50%, and the more games they play, the more sure you can be. This result is better known today as the law of large numbers.
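The law of large numbers is easy to watch in action. A small simulation (my own sketch, not the book’s; the 50% win probability and the sample sizes are assumptions for illustration):

```python
import random

def average_win_rate(n_games, p=0.5, seed=42):
    """Simulate n_games independent games and return the observed win rate."""
    rng = random.Random(seed)
    wins = sum(rng.random() < p for _ in range(n_games))
    return wins / n_games

# The observed rate drifts toward the true 0.5 as the number of games grows.
for n in (10, 1_000, 100_000):
    print(n, average_win_rate(n))
```

Bernoulli’s theorem quantifies the other direction of this picture: given a tolerance and a desired confidence, it bounds how many games are enough.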

In 1733, Abraham de Moivre published what might be the first discussion of the bell curve. Subsequently, Carl Friedrich Gauss, and then Pierre-Simon Laplace, worked to formalize the normal distribution, which described how variations in sampling (errors) obeyed the distributions seen in Pascal’s Triangle. Instead of using rows of a triangle, they used a function, which formed the basis for statistical significance.
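De Moivre’s observation, that binomial probabilities are well approximated by the bell-shaped curve, can be checked directly. Here is a sketch of my own (the choice of 100 fair coin flips is an assumption for illustration):

```python
from math import comb, exp, pi, sqrt

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))  # mean 50, standard deviation 5

for k in (45, 50, 55):
    # Exact binomial probability: a scaled entry of row n of Pascal's Triangle.
    binomial = comb(n, k) * p**k * (1 - p)**(n - k)
    # De Moivre's normal (bell curve) approximation at the same point.
    normal = exp(-((k - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
    print(k, round(binomial, 4), round(normal, 4))
```

The two columns agree to about three decimal places: the smooth function has replaced the rows of the triangle, just as Gauss and Laplace formalized.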

While the evolution of the normal distribution might be a central theme of the book, Mlodinow also explores outgrowths of statistical progress, including false positives, mean reversion, conditional probabilities, and others, and provides helpful examples.

The final chapters cover how the human mind perceives randomness, with references to work in cognitive psychology and behavioral finance. We often don’t recognize randomness when we see it; instead we see patterns that don’t exist, because we crave a sense of control over things that are fundamentally random and out of our control.

### Constructive Criticism

While Mlodinow has a deep grasp of the math and statistics and their history, he didn’t seem as comfortable with the psychology and behavioral material, which I felt was presented in more of a patchwork manner. He tended to reel off the behavioral biases rather than treat them in an ordered way, as he did with the statistics. This material is a much newer science that hasn’t been unified in the same way statistics has, however, so perhaps it doesn’t lend itself as well to his approach.

Also, although I appreciate that he was trying to keep the book short and readable, I could have used meatier examples and more footnotes. A few times I was struggling to understand something and needed to be walked through another example.

### Summary

Misconceptions about how the laws of probability operate cause us a variety of problems. For instance, we often think people are skilled when they’re just lucky, and unskilled when they’re unlucky. Based on this, Mlodinow would say that we misjudge people, in both a positive and a negative sense. He cites the example of Bill Miller, the portfolio manager of the Legg Mason Value Trust Fund, which beat the S&P 500 for 15 years in a row. While one analyst put the odds of that happening at 372,521 to 1, Mlodinow puts the odds of some manager, over 4 decades, beating the S&P for a 15-year stretch at closer to 3 out of 4. He would be surprised if we did not see a Bill Miller.
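Mlodinow’s point can be checked with a rough Monte Carlo. The numbers here are my own assumptions for illustration (1,000 funds, each with an independent 50/50 chance of beating the index in any year, over a 40-year window), not the book’s exact model, but even this toy version makes a lucky 15-year streak far likelier than the analyst’s 1-in-372,521 intuition suggests:

```python
import random

def has_15_year_streak(rng, years=40, streak=15, p=0.5):
    """Does one fund beat the index 15 years in a row at some point?"""
    run = 0
    for _ in range(years):
        if rng.random() < p:
            run += 1
            if run >= streak:
                return True
        else:
            run = 0
    return False

def chance_some_fund_streaks(funds=1000, trials=200, seed=7):
    """Estimate the chance that at least one of many funds has such a streak."""
    rng = random.Random(seed)
    hits = sum(
        any(has_15_year_streak(rng) for _ in range(funds))
        for _ in range(trials)
    )
    return hits / trials

print(chance_some_fund_streaks())
```

With more funds, longer histories, or streaks allowed to start in any era, the estimate climbs further toward Mlodinow’s 3-out-of-4; the qualitative lesson survives any reasonable choice of parameters.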

The world, rather than fitting neatly into black and white categories, tends toward gray. The key is to recognize when you are at risk of misperceiving the role chance plays: the analyst above uses statistics superficially, while Mlodinow judges based on the full range of outcomes that could have occurred, rather than on a definition tailored to fit Miller.

It is Mlodinow’s rejection of the human tendency to accept at face value what we observe around us that was the real value of the book to me. In a sense, we are at war with ourselves: on the one hand, we seek understanding and control over our environment; on the other, we suffer from delusions about how much understanding and control we actually have.

Perhaps true knowledge begins with humility and self awareness. If we can recognize the nature of our delusion, and how radically reality may differ from our view of the world, only then are we on the path to a deeper understanding.

* Hat tip to Annie Duke, and David Lenowitz over at www.howidecide.org for sending us a free copy of this excellent book. How I Decide is a non-profit foundation focused on extending the scope of what under-served youth learn both in schools and through programs outside of school time, so they can develop core life skills that support better decision making: self-regulation, conscious habit formation, effective information processing, and management of uncertainty.