Rich Rieber, center, the lead mobility systems engineer for the Mars 2020 Perseverance rover, views a test rover at JPL’s Mars Yard with NASA Administrator Jim Bridenstine (right) and JPL Director Mike Watkins (left) in 2018. Credit: NASA

NASA is turning increasingly to autonomy and machine learning to make the most out of Mars exploration missions, but don’t count on self-driving rovers to zip across the Red Planet anytime soon.

“For space applications, we like to make sure that everything goes according to plan,” said Rich Rieber, lead mobility systems engineer for the Mars 2020 Perseverance rover at the Jet Propulsion Laboratory. “That goes doubly for surface assets because they are orders of magnitude more resource-constrained than satellites in space.”

Perseverance, like its predecessor Curiosity, relies primarily on two radiation-hardened processors called Rover Compute Elements that have roughly the same processing power as a state-of-the-art desktop computer from the mid-1990s. That is not enough to perform complex machine-learning operations.

Still, Perseverance will have more autonomy than Curiosity thanks to an additional flight computer programmed to help the rover land safely. Once Perseverance touches down on Mars, NASA plans to reprogram the flight computer to help the rover avoid rocks and craters as it travels around studying geological samples, searching for signs of ancient life and collecting soil samples.

“We have a finite window of time to do our entire science mission,” Rieber said. “The more time we spend driving, the less time we spend on science.”

NASA’s Jet Propulsion Laboratory developed artificial intelligence to spot craters in imagery gathered by the High-Resolution Imaging Science Experiment (HiRISE) camera aboard NASA’s Mars Reconnaissance Orbiter. Credit: NASA/JPL-CALTECH

For Perseverance, NASA upgraded AutoNav, the autonomous driving mode that helps the rover reach specific Martian coordinates.

“When we can micromanage a drive, we do because the human brain is much more capable of making complex decisions like planning a path,” Rieber said.

When mission operators don’t have enough imagery to chart the rover’s path, they turn to AutoNav to create a digital elevation model with imagery from Perseverance’s cameras. AutoNav then evaluates dozens of paths toward its desired destination by tracing where each of the rover’s six wheels will travel. It considers the terrain, looking at rocks and other obstacles as well as any height differences among the wheels that would make the rover tilt.

“We do that every 25 centimeters along the path,” Rieber said. “If it makes strategic progress toward our goal and it’s safe, we drive it.”

AutoNav, while helpful in evaluating terrain features, can’t spot the type of sand trap that doomed Spirit, the NASA rover that explored Mars from 2004 to 2009, and threatened Curiosity.

“It would look at a sand patch, see it was super flat and bomb right through it,” Rieber said. “What we do as rover planners is define sand patches as keep-out zones, red boxes on the map that the rover has to navigate around.”
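The hazard check Rieber describes can be sketched in a few lines. The toy version below (all grid sizes, thresholds, and function names are assumptions, not JPL's actual AutoNav code) traces each of the six wheels over a digital elevation model at every 25-centimeter step and rejects paths that tilt the rover, climb a large rock, or cross an operator-defined keep-out zone:

```python
MAX_TILT_M = 0.30    # max height spread across the six wheels (toy value)
MAX_ROCK_M = 0.35    # max terrain height under any wheel (toy value)

def path_is_safe(dem, path_cells, wheel_offsets, keep_out):
    """Trace every wheel along a candidate path over a digital elevation
    model (dem: a grid of terrain heights in meters, with cells spaced
    25 cm apart) and reject the path on tilt, rocks, or keep-out zones."""
    rows, cols = len(dem), len(dem[0])
    for cy, cx in path_cells:                    # one check per 25 cm step
        if (cy, cx) in keep_out:
            return False                         # flagged sand patch: go around
        heights = []
        for dy, dx in wheel_offsets:             # six wheel contact points
            y, x = cy + dy, cx + dx
            if not (0 <= y < rows and 0 <= x < cols):
                return False                     # off the mapped terrain
            heights.append(dem[y][x])
        if max(heights) - min(heights) > MAX_TILT_M:
            return False                         # rover would tilt too far
        if max(heights) > MAX_ROCK_M:
            return False                         # a wheel climbs a large rock
    return True
```

In this toy setup, the planner would call `path_is_safe` on each of the dozens of candidate paths and drive the safe one that makes the most progress toward the goal, with the keep-out cells standing in for the "red boxes" rover planners draw around sand patches.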

Future Mars missions are likely to rely more heavily on autonomy and machine learning both for surface operations and to analyze data transmitted to Earth.

NASA is finding Martian craters, for example, by applying machine learning to 14 years of orbital imagery.

“We trained a classifier to scan the full archive and give us a ranked list of possible locations to zoom in on,” said Kiri Wagstaff, a JPL computer scientist.

The classifier has proven it can identify small craters that human reviewers of the imagery previously missed.

“They are more often clusters of impacts instead of solitary impacts,” Wagstaff said.
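The workflow Wagstaff describes — score every location in the archive, hand scientists a ranked shortlist — can be outlined with a toy example. Here a simple darkness heuristic stands in for the trained classifier (fresh impacts tend to leave dark blast zones in orbital imagery); the function and site names are illustrative, not JPL's actual pipeline:

```python
def crater_score(patch):
    """Toy stand-in for a trained classifier: score an image patch by
    mean darkness (0 = bright, 1 = dark). A real pipeline would apply
    a model trained on labeled crater examples instead."""
    pixels = [p for row in patch for p in row]
    return sum(pixels) / len(pixels)

def rank_candidates(archive, top_k=3):
    """Scan every patch in the archive and return the most promising
    locations, highest score first, for scientists to zoom in on."""
    scored = sorted(archive.items(), key=lambda kv: crater_score(kv[1]),
                    reverse=True)
    return [name for name, patch in scored[:top_k]]
```

The payoff is the same as in the real system: instead of inspecting the full archive, scientists only review the top of the ranked list.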

NASA and the European Space Agency also plan to harness machine learning to help analyze data from the Mars Organic Molecule Analyser (MOMA), a suite of three instruments scheduled to launch in 2022 on the European Space Agency’s ExoMars rover to look for signs of past or present Martian life.

MOMA’s laser desorption experiment, in particular, gathers data that is difficult for scientists to analyze quickly. Nevertheless, quick decisions are necessary because scientists have only about 24 hours to decide how to tune the instrument to study specific rock samples.

“MOMA is a very customizable tunable instrument,” said Eric Lyness, software lead in the Planetary Environments Lab at the NASA Goddard Space Flight Center. “Based on what you see, there’s a ton of things you can do to further identify a sample.”

To speed up the decision process, NASA sends laser desorption data to Earth where a neural network algorithm helps point scientists in the right direction.

The algorithm can quickly evaluate new data and tell scientists the type of research they should consider for each sample.
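The idea — match a new measurement against what the model has seen before and suggest the next step — can be illustrated with a similarity search in place of the neural network. Every class name, reference spectrum, and follow-up suggestion below is invented for illustration:

```python
import math

REFERENCE = {  # toy labeled spectra: intensity at a few mass bins
    "organic-rich": [0.9, 0.1, 0.4, 0.0],
    "sulfate":      [0.1, 0.8, 0.0, 0.5],
    "clay":         [0.2, 0.3, 0.9, 0.1],
}
FOLLOW_UP = {  # illustrative mapping from sample class to next measurement
    "organic-rich": "retune the laser and re-examine the organic peaks",
    "sulfate":      "raise the desorption energy and rescan",
    "clay":         "target a fresh spot on the same sample",
}

def cosine(a, b):
    """Cosine similarity between two spectra."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def suggest(spectrum):
    """Find the closest known sample class and return it with the
    suggested follow-up (a stand-in for the neural network's output)."""
    best = max(REFERENCE, key=lambda k: cosine(spectrum, REFERENCE[k]))
    return best, FOLLOW_UP[best]
```

With only about 24 hours to retune the instrument, even a rough automated pointer like this narrows the options scientists have to weigh.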

“Ideally, in the future, AI will run on the spacecraft or on the surface of the planet,” Lyness said.

Someday, Mars missions could feature algorithms created through machine learning. For now, engineers rely on their knowledge and experience to come up with algorithms, which they then test in laboratories or in the field.

In the automotive industry, machine-learning and deep-learning systems are already suggesting new algorithms for engineers to test.

“If you get that right, it can open up a high level of accuracy and capability for those algorithms,” said Ossi Saarela, space segment manager at MathWorks, a computational software company.

For example, machines might be able to create an algorithm to identify and gather imagery of volcanic activity on the surface of a moon.

“An astronaut onboard a spacecraft could easily recognize these types of events and point a scientific instrument at it, but it’s difficult to train a computer to do those kinds of things,” Saarela said.

For now, NASA relies on autonomy and machine learning to perform tasks that pose no threat to Mars missions.

Curiosity relies on software, called Autonomous Exploration for Gathering Increased Science, to select rock and soil samples to investigate when the rover is not in contact with ground controllers. Based on criteria determined by scientists, the rover fires its ChemCam laser, analyzes gases that burn off, packages the data with images and sends them to Earth.

“That’s a win-win scenario,” Saarela said. “During times when the ground can’t communicate with the rover, there’s not much it could do.”
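A rough sketch of that kind of criteria-driven selection, with invented field names and thresholds standing in for the scientists' actual criteria:

```python
def pick_target(rocks, criteria):
    """Pick a laser target without ground controllers in the loop:
    keep detected rocks that satisfy the scientists' pre-set criteria,
    then take the largest. (The size-first priority rule and all field
    names here are illustrative, not the real AEGIS logic.)"""
    eligible = [
        r for r in rocks
        if criteria["min_size_cm"] <= r["size_cm"] <= criteria["max_size_cm"]
        and r["range_m"] <= criteria["max_range_m"]
    ]
    if not eligible:
        return None   # nothing qualifies; wait for the next ground contact
    return max(eligible, key=lambda r: r["size_cm"])["id"]
```

If no rock meets the criteria, the rover simply does nothing autonomous and waits for instructions — which is part of why this kind of task is low risk.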

Also, the algorithm doesn’t have to be perfect to provide value.

“For scientific applications in particular, deploying AI could be very beneficial in the near term,” Saarela said.

This article originally appeared in the Jan. 18, 2021 issue of SpaceNews magazine.

Debra Werner is a correspondent for SpaceNews based in San Francisco. Debra earned a bachelor’s degree in communications from the University of California, Berkeley, and a master’s degree in Journalism from Northwestern University. She...