This computer-generated view depicts part of Mars at the boundary between darkness and daylight, with an area including Gale Crater beginning to catch morning light. Curiosity was delivered in 2012 to Gale Crater, a 155-kilometer-wide basin that contains a record of environmental changes in its sedimentary rock. Credit: NASA/JPL-CALTECH

This article originally appeared in the July 3, 2017 issue of SpaceNews magazine.

Mars 2020 is an ambitious mission. NASA plans to gather 20 rock cores and soil samples within 1.25 Mars years, or about 28 Earth months — a task that would be impossible without artificial intelligence because the rover would waste too much time waiting for instructions.

It currently takes the Mars Science Laboratory team at NASA’s Jet Propulsion Laboratory eight hours to plan daily activities for the Curiosity rover before sending instructions through NASA’s over-subscribed Deep Space Network. Program managers tell the rover when to wake up, how long to warm up its instruments and how to steer clear of rocks that damage its already beat-up wheels.

Mars 2020 will need far more autonomy. “Missions are paced by the number of times the ground is in the loop,” said Jennifer Trosper, Mars Science Laboratory mission manager. “The more the rover can do on its own, the more it can get done.”

The $2.4 billion Mars 2020 mission is just one example of NASA’s increasing reliance on artificial intelligence, although the term itself makes some people uneasy. Many NASA scientists and engineers prefer to talk about machine learning and autonomy rather than artificial intelligence, a broad term that in the space community sometimes evokes images of HAL 9000, the fictional computer introduced in Arthur C. Clarke’s 2001: A Space Odyssey.

NASA relies on artificial intelligence to identify promising exploration targets on Mars. Top-ranked targets are shaded green, second-ranked targets are orange. Credit: NASA/JPL-CALTECH

To be clear, NASA is not trying to create HAL. Instead, engineers are developing software and algorithms to meet the specific requirements of missions.

“Work we are doing today focuses not so much on general intelligence but on trying to allow systems to be more independent, more self-reliant, more autonomous,” said Terry Fong, the NASA Ames Research Center’s senior scientist for autonomous systems and director of the Intelligent Robotics Group.

For human spaceflight, that means giving astronauts software to help them respond to unexpected events ranging from equipment failure to medical emergencies. A medical support tool, for example, combines data mining with reasoning and learning algorithms to help astronauts on multi-month missions to Mars handle everything from routine care to ailments or injuries “without having to talk to a roomful of flight controllers shadowing them all the time,” Fong said.

Through robotic Mars missions, NASA is demonstrating increasingly capable rovers. NASA’s Mars Exploration Rovers, Spirit and Opportunity, could do very little on their own when they bounced onto the red planet in 2004, although they have gained some autonomy through software upgrades. Curiosity, by comparison, is far more capable.

Last year, Curiosity began using software called Autonomous Exploration for Gathering Increased Science, or AEGIS, that combines computer vision with machine learning to select rocks and soil samples to investigate based on criteria determined by scientists. The rover can zap targets with its ChemCam laser, analyze the light emitted by the vaporized rock, package the data with images and send them to Earth.
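
Conceptually, that selection step boils down to scoring each candidate target against the scientists' uplinked criteria and picking the winner. The Python sketch below illustrates the idea only; the features, weights and values are invented for this example, not drawn from AEGIS's actual flight software.

```python
# A minimal sketch of AEGIS-style target ranking (hypothetical criteria
# and weights; the real flight software is far more involved).

from dataclasses import dataclass

@dataclass
class Target:
    name: str
    brightness: float   # mean pixel intensity of the detected outline
    size_px: int        # area of the target in pixels
    distance_m: float   # range from the rover's mast

def score(t: Target, weights: dict) -> float:
    """Weighted sum of target features; scientists tune the weights on Earth."""
    return (weights["brightness"] * t.brightness
            + weights["size"] * t.size_px
            - weights["distance"] * t.distance_m)

# Scientists uplink criteria once; the rover then ranks new targets on its own.
weights = {"brightness": 1.0, "size": 0.5, "distance": 2.0}
targets = [Target("rock_a", 0.8, 140, 3.2), Target("rock_b", 0.6, 400, 2.1)]
best = max(targets, key=lambda t: score(t, weights))
print(f"Zap {best.name} with ChemCam")
```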

“Scientists on the mission have been excited about this because in the past they had to look at images, pick targets, send up commands and wait for data,” said Kiri Wagstaff, a researcher in JPL’s Machine Learning and Instrument Autonomy Group.

Although a signal can travel between Earth and Mars in roughly three to 22 minutes each way, mission controllers can only send and receive data during their allotted time on the Deep Space Network.

“Even if the rover could talk to us 24/7 we wouldn’t be listening,” Wagstaff said. “We only listen to it in a 10-minute window once or twice a day because the Deep Space Network is busy listening to Cassini, Voyager, Pioneer, New Horizons and every other mission out there.”

The Mars 2020 rover is designed to make better use of limited communications with mission managers by doing more on its own. It will wake itself up and heat instruments to their proper temperatures before working through a list of mandatory activities plus additional chores it can perform if it has enough battery power remaining.
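
That "mandatory activities first, extra chores if power allows" logic can be pictured as a simple battery-aware scheduler. The sketch below is a toy illustration with invented activities and energy figures; the real flight scheduler also models thermal states, timing windows and fault responses.

```python
# A toy sketch of battery-aware onboard scheduling (hypothetical
# activities and watt-hour costs, not the Mars 2020 flight scheduler).

mandatory = [("warm_up_instruments", 40.0), ("drive_to_waypoint", 120.0)]
optional  = [("extra_chemcam_raster", 30.0), ("extra_camera_mosaic", 15.0)]

def plan_sol(battery_wh: float, reserve_wh: float = 50.0):
    """Run every mandatory activity, then add chores while power remains."""
    plan, remaining = [], battery_wh
    for name, cost in mandatory:
        plan.append(name)
        remaining -= cost
    for name, cost in optional:
        if remaining - cost >= reserve_wh:   # always keep a safety margin
            plan.append(name)
            remaining -= cost
    return plan, remaining

plan, left = plan_sol(battery_wh=260.0)
print(plan, f"{left:.0f} Wh left")
```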

NASA’s Curiosity rover, shown here in a self-portrait taken in Gale Crater, uses software that combines computer vision with machine learning to select rocks and soil samples to investigate based on criteria determined by scientists back on Earth. Credit: NASA/JPL-CALTECH

“Ideally, we want to say, ‘This area is of interest to us. We want images of objects and context from the instruments. Call us when you’ve got all that and we will use the information to get a sample,’” Trosper said.

NASA isn’t there yet, but Mars 2020 takes the agency in that direction with software to enable the rover to drive from point to point through Martian terrain while avoiding obstacles. “It’s the kind of basic skill toddlers learn, not to run into things, but it’s a good skill,” Fong said. “That type of autonomy is increasingly being added to our space systems. Going forward, I see us adding more and more of these intelligent skills.”

Future missions like NASA’s Europa Clipper will need robust artificial intelligence to look for plumes rising from a subsurface ocean and cracks in the moon’s icy surface caused by hydrothermal vents. When scientists can’t predict when or where they will make discoveries, they need artificial intelligence to “watch for things, notice them, capture data and send it back to us,” Wagstaff said.

As the Europa Clipper’s instruments collect data, the spacecraft’s onboard processor will need to “assign priorities to the observations and downlink the most interesting ones to Earth,” Wagstaff said. “We always can collect more data than we can transmit.”
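
One way to picture that triage is a greedy queue that downlinks the highest-interest observations that fit within a pass's data budget. The sketch below is a hypothetical illustration of the general technique, not Clipper's onboard software; the scores, sizes and observation names are invented.

```python
# A hedged sketch of downlink triage: send the most interesting
# observations that fit the available data budget (all values invented).

import heapq

def select_for_downlink(observations, budget_mb):
    """Greedily pick the highest-interest observations that fit the pass."""
    # observations: list of (interest_score, size_mb, obs_id)
    heap = [(-score, size, oid) for score, size, oid in observations]
    heapq.heapify(heap)          # max-heap on interest via negated scores
    chosen, used = [], 0.0
    while heap:
        neg_score, size, oid = heapq.heappop(heap)
        if used + size <= budget_mb:
            chosen.append(oid)
            used += size
    return chosen

obs = [(0.9, 120.0, "plume_candidate"), (0.4, 80.0, "routine_scan"),
       (0.7, 60.0, "surface_crack")]
print(select_for_downlink(obs, budget_mb=200.0))
```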

That is particularly true of missions traveling beyond Mars, where there are no NASA orbiters in place to relay data. Missions to Europa or Saturn’s moon Enceladus also will experience communication delays because of the distance.

NASA has developed software for Earth-observation satellites that could be used in future missions to ocean worlds. The Intelligent Payload Experiment cubesat launched in 2013 relied on machine learning to analyze images and highlight anything that stood out from its surroundings.

“It has its eyes open to look for anything that doesn’t match what we expect or anything that stands out as being different,” Wagstaff said. “We can’t predict what we are going to find. We don’t want to miss something just because we haven’t trained instruments to look for it.”
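
The simplest version of "flag anything that doesn't match what we expect" is a statistical novelty test: characterize typical data, then mark whatever deviates strongly from it. The sketch below shows that idea with a basic z-score test on synthetic data; it is an illustration of the concept, not IPEX's actual onboard analysis.

```python
# A minimal sketch of onboard novelty detection: flag pixels that
# deviate sharply from the scene's typical statistics (synthetic data).

import numpy as np

def novelty_map(image: np.ndarray, threshold: float = 3.0) -> np.ndarray:
    """Mark pixels more than `threshold` standard deviations from the mean."""
    mu, sigma = image.mean(), image.std()
    return np.abs(image - mu) > threshold * sigma

scene = np.random.default_rng(0).normal(100.0, 5.0, (64, 64))
scene[30:34, 30:34] = 200.0               # an unexpected bright feature
print(novelty_map(scene).sum(), "novel pixels flagged")
```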

A proposed future mission to bore through Europa’s ice to investigate whether life exists in an ocean below would require even more onboard intelligence. NASA probably would design software to look for inconsistencies in chemical composition or temperature. “That would keep you from having to say what life would look like, what it would be eating and its energy source,” Wagstaff said.

Before engineers send hardware or software into space, they test it extensively in analogous environments on Earth. Engineers test Mars missions in the desert. The best analog for Europa missions may be glacial deposits in the Arctic.

“We are acutely aware of risk mitigation because we are dealing with spacecraft that cost hundreds of millions or even billions of dollars,” Wagstaff said. “Everything we do is thoroughly tested for years in some cases before it is ever put on the spacecraft.”

AI at the controls

The capsules SpaceX and Boeing are building to ferry astronauts between Earth and the International Space Station are designed to operate autonomously from the minute they launch, through the demanding task of docking and on their return trip.

NASA crews will spend far less time learning to operate the spacecraft than preparing to conduct microgravity research and maintain the orbiting outpost, said Chris Ferguson, the former space shuttle commander who directs crew and mission operations for Boeing’s CST-100 Starliner program.

“It provides a lot of relief in the training timeframe. They don’t have to learn everything. They just have to learn the important things and how to realize when the vehicle is not doing something it’s supposed to be doing,” Ferguson told SpaceNews.

Starliner flight crews will train to monitor the spacecraft’s progress. If something goes wrong, they will know how to take control manually and work with the ground crew to fix the problem, he added.

NASA insisted on that high degree of autonomy, in part, to ensure the crew capsules could serve as lifeboats in case of emergencies.

“If there’s a bad day up there and the crew needed to come home quickly, they could pop into the vehicle with very little preparation, close the hatch and set a sequence of events into play that would get them home very quickly,” Ferguson said.

In many ways, Starliner’s autonomy in flight is similar to an airplane’s. Whether on commercial airplanes or spacecraft, “everyone is beginning to realize pilots are turning into systems monitors more than active participants,” Ferguson said.

When Starliner docks with the space station, the crew will be monitoring sophisticated sensors and image processors. Boeing relies on cameras, infrared imagers and Laser Detection and Ranging sensors that create three-dimensional maps of the approach. A central processor will determine which sensor is most likely to be accurate and will weight the data accordingly to ensure that two vehicles that were previously traveling quickly relative to one another come into contact at about four centimeters per second.
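
Weighting multiple sensors by their expected accuracy is a classic fusion problem, and one standard approach is inverse-variance weighting, in which the most trustworthy sensor pulls the combined estimate hardest. The sketch below illustrates that general technique with invented readings; Boeing's actual processor and sensor models are not described in this article.

```python
# A sketch of confidence-weighted sensor fusion using inverse-variance
# weighting (readings and variances are hypothetical).

def fuse(estimates):
    """Combine (value, variance) estimates, trusting low-variance sensors more."""
    weights = [1.0 / var for _, var in estimates]
    return sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)

# Hypothetical closing-rate estimates (m/s) from camera, infrared and LADAR.
readings = [(0.045, 1e-4), (0.050, 4e-4), (0.041, 5e-5)]
print(f"fused closing rate: {fuse(readings):.3f} m/s")  # near 4 cm/s
```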

In spite of the complexity, astronauts will view displays that look similar to the ones airplane pilots see on instrument landing systems, Ferguson said.

Debra Werner is a correspondent for SpaceNews based in San Francisco. Debra earned a bachelor’s degree in communications from the University of California, Berkeley, and a master’s degree in Journalism from Northwestern University. She...