As a top NASA mission manager, Jim Van Laak wrestled often with risk. Sometimes the challenges were technical, like how to prepare for the sheer number of spacewalks needed to construct the International Space Station.
During the predecessor shuttle-Mir program, Van Laak occasionally found dangers masked behind politics.
Now with the nonprofit National Institute of Aerospace (NIA) research center, Van Laak is working to share his insights and lessons learned with the fledgling commercial human spaceflight industry. He recently spoke to SpaceNews correspondent Irene Klotz.
What do you want to accomplish at the NIA?
I’m trying to use my experience and contacts to influence the new industry folks and get them started as quickly as possible on implementing the professional culture that it’s going to take to be successful.
As an example, I have tried to get the new industry folks to share information about close calls and near-misses. One of my initiatives right now is working with the Commercial Spaceflight Federation to create the industry’s own version of the mishap reporting system that’s been such an important part of aviation’s safety growth.
I was a military pilot in the late 1970s and early 1980s, and the difference between the mishap rate in military aviation from then to today is literally an order of magnitude. I flew during the Cold War, but there was no fighting war going on in those days, so the 50 or so pilots we lost a year in the tactical Air Force were lost in training accidents. Over time, the military and other operational organizations have learned that keeping an open dialogue and sharing with each other what happens in a near-miss is a critical part of safety for everybody.
A really good example of this is the Airbus A380 issue of cracks in the wing ribs. Airbus didn’t have to tell anybody about that except the regulators, but everybody knows about it. Boeing probably knows as much about it as anybody on the planet. The reason they share the information is because they will benefit from things other people will tell them about cracking, but they’ll also benefit when other people reciprocate that openness by telling them about other issues they’re discovering. That’s a big part of what it is to be a professional in this industry, and I’m trying to get it started.
What kind of chance do you have to bring people like Elon Musk of Space Exploration Technologies Corp. (SpaceX), who comes from the very competitive environment of Silicon Valley, to be open with information?
I don’t have a personal relationship with Elon. We’ve had a few email exchanges, but I do talk to people in the company. SpaceX President Gwynne Shotwell said as long as it’s a level playing field she would certainly be open to that kind of an initiative.
So how receptive has the industry been?
There is progress being made because there has to be. Another company — not SpaceX — has told me that they are not interested in supporting this information-sharing because “We’re the best in the business and we’re not going to learn anything from anybody else.” I’m not going to name who it is, but their colleagues are very concerned about the problem. Unfortunately the whole industry suffers from something like that. It’s a small industry and if there’s a mishap or even a very serious close call it will erode confidence in what’s going on.
One way to accomplish this could be to use the leverage of the government, not just NASA for when it buys commercial launches, but also the Federal Aviation Administration (FAA), which oversees the industry. Is that a tactic you’re pursuing, or do you want this to grow organically from the operators and manufacturers?
At this time, it’s not part of what I’m doing because those agencies already have formal reporting requirements. Part of what I am proposing for the industry is that it adopt a posture and request from the FAA the equivalent of something that’s called the Aviation Safety Reporting System, ASRS.
It allows a person involved in aviation to report an incident anonymously. You fill out a form, either paper or electronically, and it goes to a contractor for NASA who strips all the identification off of it and then just inputs the information into the system for learning purposes. The great thing about ASRS is that it gives limited immunity from penalty for inadvertent violations.
The FAA has already expressed a willingness to consider this — they haven’t said, “Yes, we’ll do it,” because the system doesn’t exist yet — but it’s a starting point for a conversation.
Are proprietary issues a concern?
Most of what needs to be done for the purpose of safety does not require revealing proprietary information. If you, for example, had a proprietary design for a battery and you had a problem that caused it to overheat or catch fire, you could describe the symptoms or even simply the fact that you’ve changed technology and suddenly exposed a new failure mode you didn’t consider. You can share that, and industry would say, “Maybe we’d better be doubly careful before we adopt a new technology in energy storage.” So I am not proposing to undermine the competitive position. The one thing that it does potentially do is admit to the world that you’re still learning, that you didn’t have it all figured out from the beginning. But anyone who’s surprised by that admission is not paying very close attention.
How do you go about implementing a safety culture like this?
Well, first I’d use the term “professional culture,” and safety is one of the key manifestations of it. The reason I say that is, first of all, there is a group of people who look down their noses at anything labeled “safety,” and whether or not you agree with them, it automatically starts a communication problem that we don’t really need to have.
Second, there are a lot of things that people don’t view as related to safety and success that actually are absolutely paramount. If you don’t have real experience in the business, there are a lot of things that look like bureaucratic baloney that are in fact terribly, terribly important. One example is the requirement for data on the hardware — who made it, when, who the inspectors were, where the material came from, how it was handled, how it was stored. That sounds like a terribly bureaucratic thing, but in truth it’s an incredibly valuable requirement.
Think about a situation where you have a launch vehicle sitting out on the pad and a turbopump for a subsequent vehicle fails some kind of an acceptance test. Well, how do you clear the one that’s sitting on the pad when you’ve just had a failure of the same unit that should not have failed? The answer is you have to dive into the details and find out who built it, when did they build it, was it subjected to any handling problems during shipment, and so on and so forth.
Or let’s say you’ve just launched a spaceship to Mars and it’s full of happy people who are going to do all kinds of great science when they get to Mars. They go into Mars orbit, they’re getting ready to land and they fail some part of the pre-descent check — one of the actuators for one of the thrusters gives a noisy input. Now you’ve got probably a $10 billion or $12 billion mission, and if you’re conservative you abort. Or you could say, “Aw, what the hell, let them go,” and there’s a big group of people who would say that’s just flat irresponsible.
So what’s the real answer? The real answer is you want to put yourself into the position to clear that anomaly, or at least to have high confidence for making a decision to go or not go. That can only be done if you have that kind of information. As a mission manager, I personally have exercised that kind of knowledge probably several thousand times.
Are you seeing that the commercial companies are doing that?
They are learning how to do it.
To those of us who covered shuttle it was impressive how NASA could know the pedigree of everything from the littlest screw to a wing panel. But I guess the idea with private industry is, can you do it without the standing army of 25,000 people that it took NASA to fly the shuttles?
That is an absolutely critical question to answer. Commercial belongs in areas where the risk-reward relationship and the technology are mature. Where commercial is inappropriate is where those are not mature or well understood.
Imagine your company has got to make payroll and the only way you’re going to make payroll is if you launch next week. That has the potential to have an enormous impact on the judgment that you exercise. Human judgment is affected by these pressures. It’s a hard problem.
But it seems like the financial risks to a company of flying and losing passengers would be far greater than delaying a week or a month if needed for more inspections or to deal with safety issues.
It’s easy for someone who is not part of the system to think, “Well, they were going to launch tomorrow, but they found an engine flaw.” Unfortunately, most of the time you don’t “find a flaw.” Usually what happens is you find potential evidence of a potential problem. Then you get people together and you talk about, for example, someone putting the engine together with a torque wrench that hadn’t been calibrated when it was supposed to be calibrated. Well, is that a bureaucratic requirement or is that a real requirement? What are the chances that that torque wrench was so far out of calibration that it did damage or didn’t meet minimum performance specs? Those kinds of discussions happen all the time. Any one of them could have serious consequences for the mission — not necessarily destroy the mission.
The performance has to come from every element of the system, including the human decision-makers. But the human being is still a human being, subject to emotional and other stresses.
What else falls under the umbrella of “professional culture” in commercial space?
I think the biggest thing right now is the whole attitude question. Hubris is all over this new space industry. The space industry is overflowing with smart people, but that’s the cost of admission. You can’t get in the door if you don’t know how to run the rocket equation and do Hohmann transfers and things like that. That’s the nuts and bolts of the industry. What’s not necessarily present is maturity and humility.
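[The “nuts and bolts” Van Laak cites — the rocket equation and Hohmann transfers — can be made concrete with a short sketch. This is an editorial illustration, not from the interview; the orbit radii and the 320-second specific impulse are assumed example values.]

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
G0 = 9.80665               # standard gravity, m/s^2

def hohmann_delta_v(r1, r2):
    """Total delta-v (m/s) for a two-impulse Hohmann transfer
    between circular, coplanar orbits of radii r1 and r2 (meters)."""
    a_t = (r1 + r2) / 2.0  # semi-major axis of the transfer ellipse
    v_circ1 = math.sqrt(MU_EARTH / r1)                    # circular speed at r1
    v_circ2 = math.sqrt(MU_EARTH / r2)                    # circular speed at r2
    v_peri = math.sqrt(MU_EARTH * (2.0 / r1 - 1.0 / a_t))  # transfer-orbit speed at r1
    v_apo = math.sqrt(MU_EARTH * (2.0 / r2 - 1.0 / a_t))   # transfer-orbit speed at r2
    return (v_peri - v_circ1) + (v_circ2 - v_apo)

def propellant_fraction(delta_v, isp):
    """Tsiolkovsky rocket equation, rearranged to give the fraction of
    initial vehicle mass that must be propellant for a given delta-v."""
    return 1.0 - math.exp(-delta_v / (isp * G0))

# Example: low Earth orbit (~300 km altitude) up to geostationary orbit.
r_leo = 6_678_000.0   # m
r_geo = 42_164_000.0  # m
dv = hohmann_delta_v(r_leo, r_geo)          # roughly 3.9 km/s
frac = propellant_fraction(dv, isp=320.0)   # assumed 320 s engine
print(f"delta-v: {dv:.0f} m/s, propellant fraction: {frac:.2f}")
```

The point mirrors Van Laak’s: this arithmetic is table stakes, and anyone in the industry can run it.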
The other thing I’m strongly recommending is more attention to lessons learned. That’s been a focus of mine my entire life. I’ve read well over 5,000 lessons learned and found that a half-dozen of them show up 100-plus times. There are plenty of examples where the road to failure is clearly marked and somebody came along and either was ignorant of it because they didn’t even ask the question, or they took a quick look at it and said, “Well, that guy was an idiot. I’m not an idiot, so it’s not going to be a problem.” That’s very distressing. It’s completely avoidable. Nobody expects you to go out and read all 5,000 lessons learned before you start working, but if you’re going to work in an area like propulsion, you ought to go out and study the propulsion lessons. There’s plenty of evidence to indicate that that’s not been true at certain companies.