On Dec. 23, 2004, U.S. President George W. Bush signed into law the Commercial Space Launch Amendments Act of 2004 (CSLAA). Meant to promote the development of the emerging commercial spaceflight industry, the CSLAA made the Department of Transportation and the Federal Aviation Administration responsible for regulating commercial human spaceflight. It gave the FAA authority to regulate commercial human spaceflight safety only insofar as it affects the safety of the uninvolved public, and forbade the FAA from issuing any regulation protecting the safety of crew and spaceflight participants on board for a period of eight years, unless an accident occurred first.

The CSLAA requires operators to provide prospective customers with written information about the risks of spaceflight and a statement of the fact that the U.S. government has not certified the vehicle as safe for carrying crew or spaceflight participants.

The rationale behind the CSLAA moratorium, and its subsequent extensions until October 2015, was to allow industry to acquire the experience on which future regulations could be based. The CSLAA moratorium is a gross mistake, not because it prevents the FAA from intervening, but because it does not require industry to develop its own initial safety program and rules instead.

Hardware and software can be designed to the best of our knowledge, but our knowledge is not perfect. We can apply the most rigorous quality control during manufacturing, yet perfect construction does not exist and some defective items will be built and escape inspection. A “safe” system is essentially one that through the selection of additional margins, redundancies, barriers and capabilities (escape, for example) will “tolerate” (to a certain extent) hardware failures, software faults and human errors; mitigate harmful consequences; and/or lower the probability of their occurrence.

By not requiring industry to establish safety rules up-front, in line with the experience gained through government space programs, the unintended effect of the CSLAA has been to encourage industry to set the clock of its safety practices back to the early 1960s. The CSLAA may have planted the seeds of the first suborbital flight accident, the Oct. 31, 2014, fatal crash of Virgin Galactic’s SpaceShipTwo. The CSLAA allows companies to apply whatever level of failure tolerance they like in the design, without even requiring independent verification of correct implementation.

In the early 1970s, I was a student of aeronautical engineering at the Politecnico di Torino in Italy, and I had the privilege of having as professor Giuseppe Gabrielli, one of the most prolific aircraft designers of all time, with more than 140 designs spanning a 50-year career. During one class, Gabrielli commented on the crash of his G91 fighter prototype back in 1961 by saying that “every good airplane is smeared with blood.” Gabrielli was not a cynical guy; he was just expressing with crude words the approach of the early times of aviation, still in use in the 1960s: “Fly-Fix-Fly.” You design an airplane, build a prototype and fly it. You discover flaws through accidents, incidents or close calls; fix them; and keep flying. At that time there was no other way: Safety programs and hazard analysis did not exist yet.

There was no safety program and no safety analyses were performed when the Atlas and Titan ICBMs were initially developed in the 1950s. Within 18 months after the fleet of 71 Atlas F missiles became operational, four blew up in their silos during operational testing. The worst accident occurred in Searcy, Arkansas, on Aug. 9, 1965, when a fire in a Titan 2 silo killed 53 people. As a response to those accidents, the U.S. Air Force developed a major safety standard, MIL-STD-882, establishing novel system safety engineering techniques and management concepts.

When North American Aviation developed the Apollo capsule in the 1960s, there was no safety program and no safety analysis was performed. A number of choices were made to optimize the mass of the vehicle. The early design used a pure oxygen atmosphere to lower the capsule internal pressure so that the shell could be made thinner and therefore lighter. A lighter inward-opening hatch was preferred to a heavier outward-opening design. Flammable thermoplastic materials were extensively used, and electrical wire bundles were made as light as possible by the choice of thinner, and therefore hotter, harnesses. The ingredients for a raging fire were all in place. On Jan. 27, 1967, during a ground test, a spark, probably caused by an electrical short circuit, triggered a fire. The internal capsule pressure rose, thus sealing the hatch and the three astronauts’ fate with it.

Modern safety analyses allow us to identify causes of potential accidents (called hazards) and to remove or mitigate them through the application of predefined best practices (e.g., safety rules like failure tolerance). Safety analyses and safety measures have been increasingly used over the last 40 years in government space programs, including in the development and operation of the international space station. Independent safety verification of correct implementation is a daily activity for NASA and its international ISS partners, performed by interdisciplinary groups of experts, so-called safety review panels. Because of massive use of safety analyses in the ISS program, seven safety review panels are in place: four at NASA and three at international partner agencies.

By contrast, the commercial human suborbital spaceflight industry has been arguing that no safety regulation should be imposed during the “learning period” and no independent verification performed, claiming that these would stifle progress and kill the business. More recently, the industry has asked to extend the learning period indefinitely. The media and the public seem to have generally supported this position, possibly as a legitimate defense against utopian rules and red tape. But the risk here is wasting money and human lives just to reinvent the wheel.

During a February hearing of the U.S. House Science space subcommittee, George Nield, FAA associate administrator for space transportation, said that industry’s plea for a longer learning period ignores government expertise about crewed space systems gathered by NASA’s long-running human exploration program. It would be “irresponsible” to ignore the lessons from those programs and force regulators to collect a new set of data, Nield said.

Another bold message often echoed is that of the technological novelty and superiority of commercial suborbital vehicles compared with government programs. This is just marketing.

Yes, space travel is a risky business and for that exact reason the most modern safety practices must be applied and continuously improved. Industry can propose rigorous self-regulation as an alternative to government regulation. The safety practices developed by NASA could be formally adopted as reference and further updated and improved as industry accumulates new experience and knowledge.

The International Association for the Advancement of Space Safety (IAASS) has published a collection of such heritage safety rules in a standard, available from our website (http://iaass.space-safety.org/publications/standards/). With the exception of two quantitative safety goals, a crashworthiness requirement and a data collection requirement, all safety rules in the IAASS standard are the same as or similar to those applied in past and current government space programs, and by commercial vehicles servicing the ISS.

Having no regulation at all is not a viable option. A return to the old-fashioned “Fly-Fix-Fly” approach can bring only the terrifying prospect of a stream of incidents and accidents, possibly exhausting any residual public faith in the future of human spaceflight. Believing that space travel risks are forever inevitable and that substantial improvements are almost impossible, while relying on public acceptance of high levels of risk in an increasingly risk-averse society, is a recipe for failure.

To paraphrase the finding of the U.S. Presidential Committee that investigated the Deepwater Horizon oil spill disaster of 2010 in the Gulf of Mexico: The commercial human spaceflight industry must move toward developing a notion of safety as a collective responsibility. Industry should establish a “Safety Institute … an industry-created, self-policing entity aimed at developing, adopting and enforcing standards of excellence to ensure continuous improvement in spaceflight safety.”

It is time to change the CSLAA of 2004.

“A life without adventure is likely to be unsatisfying, but a life in which adventure is allowed to take whatever form it will is sure to be short.”

— Bertrand Russell

Tommaso Sgobba is executive director of the IAASS. The views expressed are solely those of the author.