In his book What Do You Care What Other People Think?, physicist Richard Feynman tells the story of his service on the commission investigating the causes of the Space Shuttle Challenger disaster in 1986. As the investigation proceeded, Feynman discovered that engineers had wildly different estimates of the Shuttle’s failure rate.
One guy named Ullian, who was the range safety officer at Kennedy Space Center, told Feynman that out of 127 rockets he’d looked at, five failed, which was a four percent failure rate. Since he assumed a manned rocket would be safer than an unmanned one, he decided that the failure rate was probably a quarter of that, or one percent. But even that estimate was overruled:
NASA told Mr. Ullian that the probability of failure was more like 1 in 10⁵.
I tried to make sense out of that number. “Did you say 1 in 10⁵?”
“That’s right; 1 in 100,000.”
“That means you could fly the shuttle every day for an average of 300 years between accidents—every day, one flight, for 300 years—which is obviously crazy!”
“Yes, I know,” said Mr. Ullian. “I moved my number up to 1 in 1000 in answer to all of NASA’s claims….”
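Feynman’s 300-year figure is simple back-of-the-envelope arithmetic. Here is a minimal sketch of that calculation in Python, purely for illustration; the only inputs are the numbers from the exchange above (1 in 100,000, one flight a day) and Ullian’s own figures (5 failures in 127 unmanned rockets, divided by four for a manned vehicle):

```python
# Back-of-the-envelope check of Feynman's objection: at NASA's claimed
# failure rate of 1 in 100,000, flying the Shuttle once a day, how many
# years would pass, on average, between accidents?
flights_per_accident = 100_000
flights_per_year = 365  # one flight every day

years_between_accidents = flights_per_accident / flights_per_year
print(f"~{years_between_accidents:.0f} years between accidents")  # ~274, roughly Feynman's "300 years"

# Ullian's estimate, for comparison: 5 failures in 127 unmanned rockets,
# divided by four on the assumption that a manned rocket is safer.
unmanned_failure_rate = 5 / 127               # about 4 percent
ullian_estimate = unmanned_failure_rate / 4   # about 1 percent
print(f"Ullian's estimate: about 1 in {1 / ullian_estimate:.0f} flights")
```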
But NASA continued to resist Ullian’s efforts to publicize a more reasonable estimate. So every time he talked to an engineer, Feynman would ask where they got the 1/100,000 number. At one meeting, Feynman handed out slips of paper to a group of NASA folks and asked them to write down what they thought was the probability that a flight would fail due to an engine problem.
They write down their answers and hand in their papers. One guy wrote “99-44/100% pure” (copying the Ivory soap slogan), meaning about 1 in 200. Another guy wrote something very technical and highly quantitative in the standard statistical way, carefully defining everything, that I had to translate, which also meant about 1 in 200. The third guy wrote, simply, “1 in 300.”
Another guy said it was impossible to quantify, but Feynman pressed him. Finally the man said,
“100 percent”—the engineers’ jaws drop, my jaw drops; I look at him, everybody looks at him—“uh, uh, minus epsilon!”
So I say, “Well, yes; that’s fine. Now, the only problem is, what’s epsilon?”
He says, “10⁻⁵.” It was the same number Mr. Ullian had told us about: 1 in 100,000.
But the bottom line was clear: the estimates were not just different, but enormously different. Different by a factor of more than 300. And when Feynman finally found the report that was the source of the 1/100,000 estimate, it turned out that the probability of failure in separate engine parts had basically been made up so that the total failure rate would come out to 1/100,000.
Ultimately, it turned out that the number had been invented to satisfy NASA leadership and Congress, which felt more comfortable with a number—any number at all—than with admitting that such a number was unattainable. Any number made them feel more confident than saying “I don’t know” would have. This was not dishonesty. It was the result of sincere people trying to do a good job: they were told to give a numerical estimate and they did their best at guessing—and then other people looked at that number and developed an unreasonable confidence in it. The number got repeated and repeated until it became The Number, and people forgot that it really meant nothing.
As we now know, the overall catastrophic failure rate of the Space Shuttle was actually closer to 1/50.*
The point is not just that numbers can lie, but that in many cases, having any number at all can lie, or can at least be terribly misunderstood, because it gives a degree of confidence that our knowledge does not warrant. I was reminded of this lesson not long ago when I was looking at a business document that purported to chart how many hours employees spent doing various tasks. The estimates were numbers like 12.34 and 56.78—that is, precise to the hundredth of an hour. A hundredth of an hour is 36 seconds. So this chart purported to tell you what employees were doing to within about half a minute. Something tells me their record keeping isn’t really that accurate.
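In case the arithmetic seems too quick, here is a minimal sketch of it in Python; the 12.34-hour figure is just the example from the chart above:

```python
# Reporting hours to two decimal places implies a resolution of 0.01 hour.
SECONDS_PER_HOUR = 60 * 60

resolution = 0.01 * SECONDS_PER_HOUR
print(resolution)  # 36.0 -- a hundredth of an hour is 36 seconds

# So a figure like 12.34 hours claims to pin the true value down to a
# window 36 seconds wide, i.e. to within about half a minute.
hours = 12.34
print(f"{hours} hours, plus or minus {resolution / 2:.0f} seconds")
```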
I’m just as susceptible as anyone else to being fooled by numbers. It’s important to know when the numbers are reliable, and when they’re being used as a substitute for “I don’t know.” Sometimes, people’s lives depend on it.
* This isn’t exactly fair, since the 1/100,000 number Feynman was given was a failure rate for the engines, and Columbia’s engines didn’t fail. But many people took the 1/100,000 rate as a safety rate for the Shuttle as a whole.