During World War II, military leaders had a challenge.
They wanted to add more armor to American planes to reduce the chances of their being shot down by enemy fighters.
The challenge was that armor is heavy.
Too much of it makes the planes go slower and use more fuel.
Too little of it doesn’t protect them.
So, to optimize its placement, the military studied the distribution of bullet holes on returning planes.
Some deduced that they should concentrate the armor where the bullet holes clustered, since that's where the planes were getting hit the most.
But Abraham Wald, a brilliant mathematician from the Statistical Research Group, looked at the problem differently.
He reasoned that the armor should be placed on the sections where the returning planes had no bullet holes (the engines).
Those holes were missing because the planes hit there never returned to be analyzed; they were the ones shot down over enemy territory.
This concept is called “survivorship bias,” and it’s a lesson from a book called How Not to Be Wrong by Jordan Ellenberg.
It’s a brilliant example of why learning from failures (the missing planes) is sometimes more valuable than learning from successes (the returning planes).
How to use this mental model
Survivorship bias is a form of sample-selection bias: you draw conclusions from a sample that is not representative of the whole population because the failures have been filtered out of it. One way to minimize its effects is to scrutinize your data sources and make sure you aren’t omitting information about the events that failed.
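To make the effect concrete, here is a minimal simulation of the bomber anecdote. Everything in it is an illustrative assumption: the section names, the per-hit lethality probabilities, and the number of sorties are invented, not historical data. The point is only that if you tally bullet holes on survivors, the most dangerous section (the engine) appears the least hit.

```python
# Minimal simulation of the bomber anecdote. All numbers here are
# invented for illustration: section names, lethality per hit, and
# the number of sorties are assumptions, not historical data.
import random
from collections import Counter

random.seed(42)

SECTIONS = ["fuselage", "wings", "tail", "engine"]
# Assumed probability that a single hit to each section downs the plane.
LETHALITY = {"fuselage": 0.10, "wings": 0.10, "tail": 0.15, "engine": 0.80}

all_hits = Counter()       # every hit on every plane (the true picture)
survivor_hits = Counter()  # hits visible on planes that made it back

for _ in range(10_000):
    hits = random.choices(SECTIONS, k=random.randint(1, 4))
    all_hits.update(hits)
    # The plane survives only if no single hit downs it.
    if all(random.random() > LETHALITY[h] for h in hits):
        survivor_hits.update(hits)  # only this data reaches the analysts

for s in SECTIONS:
    print(f"{s:>8}: {survivor_hits[s]:5d} holes on returners "
          f"vs {all_hits[s]:5d} hits overall")
```

Under these assumptions, engine hits make up a large share of all hits but barely show up on the returning planes, which is exactly the gap in the data that Wald's reasoning exploits.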
Note: This is based on an anecdote from Jordan Ellenberg’s book, How Not to Be Wrong, as well as some information from this article about survivorship bias.
This is an excerpt from “Mental Models for Effective Managers.”