Students at the University of Maryland are working with NASA to use small ground-based sensors that could potentially help Earth observation satellites respond quickly and automatically to natural phenomena like seismic events or volcanic activity. NASA also hopes the same technology could be used to help future planetary rovers collect data and avoid ground hazards.

In experiments conducted by the students, sensors were placed at predetermined locations and triggered to send two signals: one to NASA’s Earth Observing-1 (EO-1) satellite, instructing it to take a picture of the location where the sensor was triggered, and another to a miniature rover, which was dispatched to the same location to take a close-up picture.

Dan Mandl, a senior engineer and technologist at NASA’s Goddard Space Flight Center in Greenbelt, Md., teamed with University of Maryland, Baltimore County professor Mohamed Younis, who teaches a sensor networks class, to put 21 students on projects that demonstrated the effectiveness of the technology for interfacing with remote sensing satellites.

The idea behind the project is to use sensors that respond to events and communicate with satellites automatically, without an individual on the ground having to tell them to do so, Mandl explained.

“The software in the sensors is basically intelligent; you can give it a high-level task, so if something breaks, it can find out where it is, take a picture on the ground and take a picture from space. Normally an engineer would have to do that,” Mandl said.

The applications in space for this type of technology are numerous, according to Mandl.

Sensor webs could allow a rover to navigate through dangerous areas without a person on the ground watching its every move, he said. The technology also has applications for sensing volatile events, such as volcanic eruptions.

“There’s no GPS [Global Positioning System] on the Moon, so you can sprinkle these sensors around instead of using GPS to help you steer through hazardous terrain,” Mandl said.

“If you’re going to Mars and you have one rover, it usually cannot go very far. It usually has to come back for recharging,” said Younis. “With sensor networks, you can scatter nodes on the surface, use them to collect data and send that back to Earth.”

The students’ experiments responded to such events as a faulty sensor, a temperature fluctuation and an attempt by the rover to navigate through a simulated hazardous terrain. During each event, the technology sensed the irregularity and the appropriate photos were taken. Students first tested the technology at the University of Maryland, and later at NASA Goddard.

A variety of sensor types can be used in such applications, Mandl said: some pick up temperature changes, while others measure light or the presence of metal.

Six undergraduates and 15 graduate students participated in the project, which finished in December, according to Younis.

“We cannot get students more excited than by giving them projects on sensor web application for NASA,” Younis said.

Andrew Wilson, who took the class as an undergraduate, said he had not been introduced to the technology before taking Younis’ class.

“The most interesting thing I thought was how the sensor nodes can be distributed,” Wilson said. “You don’t need to have a hard link or a direct link from your main computer to each node. They’re very flexible and can be used for a lot of things, for covering a large area.”

Mandl said he and his colleague, Vuong Ly, are now looking at applying sensor webs to NASA’s Cosmic Hot Interstellar Plasma Spectrometer (CHIPS), which studies gases and dust in space. That project differs from the work with the EO-1 satellite in that it does not require the sensors to interface with a computer network; the sensors can “talk” to the satellite directly.

Mandl expects the trend of “smart” technology in space to continue, with other equipment being self-aware like the sensors are, allowing components to fix themselves when things go wrong.

“The key is transitioning from a centrally-controlled mission to missions in which all of the participating components manage themselves, negotiate their own resources, heal themselves, can plug themselves into a generalized wireless network and search for how they fit into that network, and so on,” Mandl said.