The article “Pad Processing Error Doomed Falcon 1” [April 10, page 7] reminded me of a story I heard many years ago about Wernher von Braun. During a launch — I don’t remember whether it was at White Sands or Peenemünde — a rocket blew up on the pad.
Later, a technician came forward and said that he had left a valve in the wrong position by mistake. Instead of firing the man, von Braun sincerely thanked him, saying that the technician had saved them months of investigation. Perhaps von Braun understood the nature of human error: it is better to accept that mistakes happen and to build a culture of trust and honesty that helps prevent such problems in the future. I imagine that the technician on the Falcon 1 team also came forward with much the same story; otherwise, the cause would not have been known so quickly.
I have studied mishaps and catastrophes in many industries, including the space industry, and I have found that they are almost always a combination of events and factors that were allowed to develop long before the mishap actually occurred. Many times they are combinations of events about which, at some point, someone asked, “What are the odds?”
Mr. Musk said, “If we had been looking at the right data … at the right time ….” I imagine that someone designing the telemetry monitoring system and the launch sequence said: “What are the odds that something will go wrong on this system just prior to launch? If something does go wrong, we’ll catch it much earlier, so we don’t need to monitor this system all the time.”
What are the odds? In fact, the odds are pretty high when maintenance- or test-related activities are being performed.
I hope that Mr. Musk thinks about the von Braun story and realizes that this mishap occurred not because an individual did something wrong, but because the omission was the result of many factors. These could include poor procedures, a design that had not foreseen the necessity of working with the fitting just prior to launch, or even a technician who carried too heavy a workload in the days and weeks leading up to the launch and missed too much sleep.
The von Braun story may be apocryphal, but it should be a lesson to us all. We should acknowledge that humans will always make mistakes and that unexpected events, no matter how unlikely, do happen. We need to work hard to understand all of the risks, and we should always be prepared for the unexpected.
David Fuller
Seabrook, Texas