Spotlight | NASA Artificial Intelligence Group
SAN FRANCISCO — The March 20 eruption of Iceland’s Eyjafjallajökull volcano set in motion an automated chain of events that helped NASA gather detailed data on the heat and volume of the volcano’s lava flows. It also demonstrated the merit of an autonomous system designed at NASA’s Jet Propulsion Laboratory.
In the past, people scanned satellite data for the thermal signatures that indicate volcanic activity and then worked out how best to use ground- and space-based instruments to obtain additional information and imagery. This time, the Iceland eruption was detected by a computer algorithm designed precisely for that purpose, and news of the eruption was quickly posted on the Internet. The algorithm, developed by researchers at the University of Hawaii, identifies volcanic hot spots in data from the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments flying on NASA’s Terra and Aqua Earth-observing satellites.
News of the Iceland volcano was picked up by NASA’s Volcano Sensor Web, which is constantly looking for this type of alert. The sensor web then sent an urgent request to the Earth Observing-1 (EO-1) satellite’s ground operations center to reorient the spacecraft and employ its scientific instruments to gather high-resolution imagery of the volcanic site.
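The detection-and-retasking flow described above can be sketched in a few lines of code. This is a hypothetical illustration, not NASA software: the site list, radiance threshold, and all names here are invented for the example, and the real Volcano Sensor Web operates on published MODIS hot-spot alerts rather than simple in-memory records.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HotSpotAlert:
    """A published hot-spot detection, as the sensor web might see it."""
    site: str        # e.g. "Eyjafjallajokull"
    lat: float
    lon: float
    radiance: float  # strength of the MODIS thermal signal (arbitrary units)

# Hypothetical stand-ins for the ~50 monitored sites and a trigger level.
MONITORED_SITES = {"Eyjafjallajokull", "Kilauea", "Erta Ale"}
TRIGGER_RADIANCE = 10.0

def process_alert(alert: HotSpotAlert) -> Optional[dict]:
    """Turn a qualifying alert into an urgent observation request
    for the EO-1 ground operations center; otherwise do nothing."""
    if alert.site in MONITORED_SITES and alert.radiance >= TRIGGER_RADIANCE:
        return {
            "spacecraft": "EO-1",
            "instruments": ["Hyperion", "Advanced Land Imager"],
            "target": (alert.lat, alert.lon),
            "priority": "urgent",
        }
    return None  # below threshold, or a site the web is not watching

request = process_alert(HotSpotAlert("Eyjafjallajokull", 63.63, -19.62, 42.0))
print(request["spacecraft"] if request else "no request")  # prints "EO-1"
```

The essential design point the article describes survives even in this toy version: no human sits between the detection and the retasking request.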
This automated process, which uses information obtained from one or more sensors to direct the work of other ground- and space-based instruments, is the work of the Jet Propulsion Laboratory’s Artificial Intelligence Group in Pasadena, Calif. With a budget of approximately $4 million and 12 employees, the group has developed the Volcano Sensor Web network to track volcanic activity at 50 of the most active sites around the world. In addition, the Artificial Intelligence Group is developing sensor webs to monitor global fires, floods and ice formations.
The goal of these sensor webs is to put data into the hands of researchers much more quickly, said Steve Chien, technical group supervisor for the Artificial Intelligence Group and senior research scientist for autonomous systems.
Before the sensor web was developed, it could take three to four weeks to assign satellite instruments to target a specific location, wait for the data to be transmitted to ground stations, turn the information into a format suitable for analysis and conduct that analysis, said Ashley Davies, JPL volcanologist and lead scientist for JPL’s Earth Observing Sensor Web. “We have now got that process down to just a few hours.”
Thanks to the artificial intelligence software, EO-1 was able to target the volcano during its next passes over Iceland on March 24, 29 and 30.
JPL’s artificial intelligence team began streamlining that process in 2004 when it uploaded new software to EO-1, a satellite launched in 2000 as part of NASA’s New Millennium technology demonstration program. The software manages the spacecraft’s onboard resources, enables it to identify significant scientific events and directs the satellite’s two instruments, the Hyperion hyperspectral imager and the Advanced Land Imager, to obtain data on those events, Davies said.
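The onboard behavior Davies describes — identify a significant event in fresh data, then direct the instruments to follow up — can be outlined as a simple detect-then-schedule loop. This is a minimal sketch under invented assumptions (the threshold, the schedule limit, and every function name are hypothetical), not the EO-1 flight software.

```python
THERMAL_THRESHOLD = 300.0  # hypothetical brightness temperature, kelvin

def detect_event(pixels):
    """Flag a scene as scientifically significant if any pixel is hot."""
    return any(p > THERMAL_THRESHOLD for p in pixels)

def plan_followup(schedule, target, next_pass):
    """Add a follow-up observation, with a crude stand-in for the
    onboard resource management the real software performs."""
    if len(schedule) < 10:  # pretend the spacecraft can hold 10 requests
        schedule.append({"target": target, "time": next_pass})
    return schedule

# One hot pixel in the scene triggers a self-tasked follow-up.
scene = [270.0, 285.0, 412.0]
schedule = []
if detect_event(scene):
    schedule = plan_followup(schedule, "Eyjafjallajokull", "next pass")
print(len(schedule))  # prints 1
```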
“What we really want to do is automate as much of this as possible so the scientists don’t have to be watching for activity 24/7 or babysitting the software to tweak the results,” Chien said. “They can focus on truly intellectual activities instead of being down in the mud with the data.”
While the Volcano Sensor Web is proving its utility in ongoing Earth observations, the project was conceived as a way to enable spacecraft to study volcanoes or other natural events occurring on distant planets in our solar system or even light-years away. “If a spacecraft is operating at a great distance from Earth and it encounters a transient event, it will not have time to send back data and wait for instructions,” Davies said. “If you can put the smarts on the spacecraft itself, it can recognize that something unusual is taking place and task itself to observe it.”
This type of artificial intelligence is likely to be needed if, as proposed, NASA and the European Space Agency send a submersible vehicle to Jupiter’s moon Europa or a robotic airship to Saturn’s moon Titan, said Chien. In the case of Europa, the submersible vehicle would have to land on the surface of the moon, melt through the ice cap and explore the liquid ocean that scientists believe lies underneath. “That’s a tremendous autonomy problem because we know so little about the environment,” Chien said.
The ultimate autonomy problem, however, will be encountered if space agencies eventually send missions to far-flung solar systems. If a spacecraft travels 4.5 light-years away to visit Alpha Centauri, the round-trip travel time for light will be nine years. “So you want the spacecraft to go there, get into orbit by itself and start looking for interesting things,” Chien said.
As one step in that direction, the Artificial Intelligence Group is designing a sensor web that eventually could work on Mars or Europa. The group also is working with the National Science Foundation on a project launched in 2009 to develop a sensor web to monitor coastal weather fronts, eddies and harmful algal blooms as part of the Ocean Observatories Initiative. The Ocean Observatories Initiative would employ autonomous underwater vehicles, coastal radars, space-based instruments and buoys in a network designed to continuously monitor a wide range of ocean processes.
“The infrastructure would allow multiple observatories to operate day and night autonomously,” Chien said. When sensors detected an unusual event, they could signal the network to dispatch autonomous submarines to investigate. No human beings would even know that an event had occurred until they arrived at the office the next day, Chien said.