How Not to Be Wrong

“Math is like an atomic-powered prosthesis that you attach to your common sense, vastly multiplying its reach and strength.” – Jordan Ellenberg

“If you don’t get elementary probability in your repertoire, you’re like a one-legged man in an ass-kicking contest.”  – Charlie Munger

When your grade school teachers told you that you’d be doing math every day of your life, you scoffed. In adulthood, you likely took perverse pride in noting that their predictions were wrong. Enter Jordan Ellenberg (@JSEllenberg) to show you that math is nothing less than a method of thinking and reasoning, one that pervades your everyday decisions. It can be ignored or avoided, but you do so at your own peril.

Do you have an interest in decision making and critical thinking? Enjoy the works of Daniel Kahneman, Steven Levitt, and Dan Ariely? This book is right up your alley. Are you a physician? Have you encountered statistics, probability, and the p value (which, as the joke goes, stands for publication)? This book is for you. Do you hate math? Recall laboring over arcane formulas and plugging in variables? This book is still for you. “What is math?” is the overarching theme of the book, and it is an ambitious one.

Book Synopsis: How Not To Be Wrong

The opening vignette sets the stage: Abraham Wald, an Austrian Jew doing economics research in Colorado, stays in the United States when the Nazis annex Austria. He is tapped for the Statistical Research Group, a who’s-who of high-level mathematicians yoked to the war effort. The question brought to Wald is a fairly straightforward one: how much armor should we put on the planes? Too much weight and the plane is unmaneuverable; too little armor and you lose more planes. The officers approaching Wald saw an opportunity for efficiency. Since the bullet holes in returning planes weren’t uniformly spread, they asked for the optimum amount of armor for the most heavily bullet-riddled areas. Wald’s contribution was not the formula they expected, but a question in return: “Where are the missing holes?” This was an insight into missing data: some planes didn’t come back. Wald’s advice was to armor the areas with no holes. They were the most vulnerable; it was clear to him that hits to those areas were not survivable.
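The logic of the missing holes can be sketched in a few lines of code. The toy simulation below is my own construction, not Wald’s actual analysis, and the survival odds are invented for illustration: every plane takes exactly one hit, evenly split between two areas, yet engine holes are scarce among the planes we get to inspect.

```python
import random

random.seed(1)

# Hypothetical model (not Wald's method): each plane takes one hit,
# equally likely in either area. Engine hits are rarely survivable;
# fuselage hits usually are. These odds are made up for illustration.
survival_odds = {"engine": 0.2, "fuselage": 0.95}

returned_hits = {"engine": 0, "fuselage": 0}
for _ in range(10_000):
    area = random.choice(["engine", "fuselage"])
    if random.random() < survival_odds[area]:
        returned_hits[area] += 1

# Among survivors, engine holes are scarce -- the "missing holes"
# are on the planes that never came home.
print(returned_hits)
```

Even though hits were distributed evenly, fuselage holes vastly outnumber engine holes among the returners: surveying only the survivors inverts the true picture of vulnerability.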

This is what mathematicians do, Ellenberg is saying. They ask: Are you asking the right questions? What assumptions are you making? Are your conclusions accurate? They do not just seek the right answers; they attempt to see deeply into why they are the right answers.

“Dividing one number by another is mere computation; figuring out what you should divide by what is mathematics.” – Ellenberg

How Not to Be Wrong moves quickly through a wide range of topics and weaves in recurrent themes of analysis and pitfalls in reasoning. Familiar constructs appear throughout, including the Laffer curve and the 2 x 2 contingency table (aka sensitivity/specificity). From economic theory to scientific research to error-correcting codes to the decision trees of slime molds, some unifying themes emerge:

  1. Probability: Unlikely things happen, all the time. However, we often make the mistake of equating the very improbable with the impossible. Very improbable things (p < 0.001) are happening all around us. Impossible things, however, are not.
  2. Linearity: Lines are a kind of curve, but not all curves are straight lines, and in real life most relationships don’t stay perfectly linear. Basing your predictions on two data points (which define the slope of a line) amounts to taking a tiny segment of a large curve and proclaiming the entire curve a straight line. Linearity comes into play when discussing correlations and predictions, which makes extrapolation a difficult and inexact science. Easiest take-home point: more is not always better. Most things are a tradeoff. Whether it’s the Laffer curve or the Starling curve, many relationships have a sweet spot.
  3. Inference: On pattern finding, big data, and Bayesian thinking. Can I truly know if something happened by chance alone? This question, in one form or another, pervades prophecies, predictions, and data interpretation. With examples ranging from investment funds to the “hot hand” in sports, the answer comes back: not really. But math can give you a much clearer guess. Ellenberg also lays the groundwork for the importance of skepticism and of applying one’s a priori judgment when interpreting ‘statistically significant’ findings.
  4. Correlation: Is not causation, nor is it transitive. The implications of this concept are hard to overstate. When multiple variables are involved, any correlative statement is suspect, elucidated here in examples of stock portfolios and polling for electoral candidates. The now-infamous Women’s Health Initiative is a great example: while high estrogen levels correlate with low heart disease, and hormone replacement therapy (HRT) raises estrogen levels, HRT actually increases, not decreases, heart disease.
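The non-transitivity of correlation is easy to demonstrate numerically. In this minimal sketch (my own example, not one from the book), two quantities each correlate positively with a shared middle variable, yet correlate negatively with each other:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)             # a shared factor
y = rng.normal(scale=2.0, size=10_000)  # an independent, noisier factor

a = x - y  # contains x, so it tracks b
b = x
c = x + y  # also contains x, so it also tracks b

def corr(u, v):
    """Pearson correlation coefficient between two samples."""
    return float(np.corrcoef(u, v)[0, 1])

# a~b and b~c are positive, yet a~c is strongly negative.
print(corr(a, b), corr(b, c), corr(a, c))
```

This is the mathematical reason a surrogate chain like “HRT raises estrogen, high estrogen correlates with less heart disease” cannot be trusted to imply “HRT reduces heart disease.”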

While the author specifically vows not to shy away from numbers and equations, he makes an effort to keep the discussion grounded in real-world scenarios. Even when the math gets hard (and I admit to a few rounds of struggle), the underlying thought process is always made clear. In fact, one of the pervading themes is just that: mathematics is about the explanation more than the formula. How Not To Be Wrong is about thinking first and calculating next. It is also not always about giving a definitive answer; in some ways, it adds clarity to doubt, perhaps an essential insight for a practitioner of emergency medicine.

But mathematics is also a means by which we can reason about the uncertain. Math gives us a way of being unsure in a principled way: not just throwing up our hands and saying “huh,” but rather making a firm assertion:

“I’m not sure, this is why I’m not sure, and this is roughly how not-sure I am.” Or even more: “I’m unsure, and you should be too.”

Medical relevance

Missing Data
Abraham Wald’s insight into the planes that weren’t present is a clear example of survivorship bias, which saturates our understanding of pathology. How many pre-hospital fatalities listed as ‘Atherosclerotic Heart Disease’ on death certificates were actually caused by subarachnoid hemorrhage? Or by pulmonary embolus? The same concept applies to our journal publications. Ben Goldacre, a British physician and EBM pundit, has written and blogged on trial data that goes unpublished and hidden from public scrutiny, distorting our clinical judgments. The public battle over the release of clinical trial data on oseltamivir (Tamiflu) by Roche in 2013 is a recent, visible example.

Statistics
This book may contain the single greatest explanation of the p value, its uses, and its faults that I have ever read. Ellenberg adds further value with insights on sample size (n), representative samples, Bayesian thinking, the need for (and difficulty with) replication in science, and numerous examples of reputable medical journals that miss the boat on both conceptual and mathematical grounds.
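For readers who want to see a p value computed from scratch, here is a toy calculation of my own (not an example from the book): the two-sided p value for observing 60 heads in 100 flips of a coin assumed fair. The p value is the probability, computed under that null hypothesis, of a result at least as extreme as the one observed.

```python
from math import comb

n, k = 100, 60  # hypothetical observation: 60 heads in 100 flips

# Under the null of a fair coin, sum the exact binomial probabilities
# of every outcome at least as far from 50 as the observed count.
p = sum(comb(n, i) for i in range(n + 1)
        if abs(i - n / 2) >= abs(k - n / 2)) * 0.5 ** n
print(f"p = {p:.4f}")
```

The result lands just above 0.05, a nice reminder of how arbitrary the conventional significance threshold is: 60 heads feels suspicious, but by the usual rule we could not formally “reject” fairness.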

Surrogate endpoints
If correlation is not transitive, then all surrogate endpoints are forever suspect. Not just because we care about patient oriented outcomes, not just because hard endpoints like death and disability are easy to measure accurately and hard to miss, but because mathematics says it doesn’t work like that. Science is hard, and everything is always more complicated than it appears.

Linearity (or the lack thereof)
This theme recurs in the final segment of the book, ‘Existence,’ with tradeoffs in decisions about capital punishment and public policy-making. In medicine, we might discuss the benefits of screening and early diagnosis trading off against the iatrogenic harms of workup and treatment. Or, to beat another well-worn horse, the tradeoff between finding more PEs and prescribing more Coumadin. The Choosing Wisely campaign stands out as an example of advising physicians on what not to do. Modern medicine has made huge strides; in some areas it may have strode right past the “sweet spot” on the curve.

Discussion Questions

  1. Ellenberg states that when you have a really difficult question, one strategy is to answer a simpler question and see if the universe cares. He presents this as a problem-solving tool and a way to gain insights applicable to many different questions. Daniel Kahneman makes a similar observation, but in his view substituting a different, more easily answerable question is a clear cognitive error that leads to unwarranted confidence in one’s conclusions. When is this strategy useful, and when is it dangerous? Do heuristics in medicine fit the constructive model or the dangerous one?
  2. A low p value means the observed data would be unlikely under the null hypothesis, i.e., unlikely to be explained by random chance alone. Under what conditions are you likely to reject the null hypothesis and embrace the alternative theory?
  3. Ellenberg argues that in matters of public health there is a responsibility to act even in the face of uncertainty, and that over time, decisions based on expected value theory will save lives. He cites the Surgeon General’s push to label cigarettes a health hazard based on consensus in the absence of damning evidence. How do we reconcile this example with the Women’s Health Initiative study, which overturned a widely held consensus under which physicians had exposed their patients to iatrogenic health risks?

Further Readings and Resources

  1. Bartlett, RH. Alice in Intensiveland: Being an Essay on Nonsense and Common Sense in the ICU, After the Manner of Lewis Carroll. Chest. 1995 Oct;108(4):1129-39. PMID: 7555127.
  2. Goldacre, Ben. Bad Pharma: How Drug Companies Mislead Doctors and Harm Patients. Reprint Edition. London, UK: Faber and Faber, 2014.
  3. Dunham, William. Journey through Genius: The Great Theorems of Mathematics. Reprint Edition. USA: Penguin Books, 1991.
  4. Joshi N. ALiEM Bookclub: Risk Savvy by Gerd Gigerenzer. Posted Dec 12, 2014. Accessed June 8, 2015.
  5. Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.
  6. Padgett, Jason. Struck by Genius: How a Brain Injury Made Me a Mathematical Marvel. Boston, MA: Houghton Mifflin Harcourt, 2014.
  7. Silver, Nate. The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t. USA: Penguin Books. 2015.
  8. Jordan Ellenberg on Uncertainty at TEDxMadison. YouTube. Published Jul 14, 2014. Accessed Jun 10, 2015.

* In lieu of a Google Hangout on Air Discussion – there will be a LIVE discussion at SMACC Conference in Chicago June 2015.

* Disclaimer: We have no affiliations financial or otherwise with the authors, the books, or Amazon.

Pik Mukherji, MD

Program Director
Emergency Medicine
Long Island Jewish Medical Center