BY JADE BOYD, Rice News Staff

Everyone could use an extra set of hands every now and then, and if two Rice
researchers and a crack team of scientists at NASA’s Johnson Space Center (JSC)
have their way, astronauts aboard the International Space Station just might get
their extra set.

NASA’s Robonaut project aims to create a humanoid robot to function as a second
set of eyes, arms and hands on spacewalks.

Two Rice assistant professors — Marcia O’Malley of mechanical engineering and
materials science and Nancy Niedzielski of linguistics — are working on the
project, a collaborative effort between NASA and the Defense Advanced Research
Projects Agency (DARPA). The project is based at the Robot Systems Technology Branch at JSC.

NASA chose to build a humanoid robot because space flight hardware has been
designed for servicing by astronauts making spacewalks, known in the NASA
lexicon as extravehicular activities, or EVAs. Beyond repairs, numerous EVAs will
be required to assemble the components of the space station, so NASA will need
all the hands it can get on these spacewalks.

Which is not to say that Robonaut will replace any astronauts. The robot will
not think for itself. It will be attached to the robotic arm of the space
shuttle or space station and will be tele-operated by a trained astronaut inside
the spacecraft. Using a 3-D virtual-reality helmet and two joysticks, the
astronaut inside the craft will see what Robonaut sees, feel what it feels
outside the craft and will be able to control its movements accordingly.

It may sound like a simple concept, but making a robot that can duplicate even
the simplest of human tasks is extremely challenging.

For example, O’Malley’s work on the project involves the "haptic" interface used
in the robot’s control system. Haptic, a term that originated in psychology,
refers to the perception of touch.

Among the myriad sensors on Robonaut — up to 150 in each arm — several allow
the astronaut operator to sense where the robot is and the amount of force that
Robonaut is exerting on its environment. Writing the software that moves that
information from the robot’s arm to the operator — and lets the astronaut react
to it — is very complicated.
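
To make the idea concrete, the sketch below shows, in simplified Python, one
cycle of a generic force-feedback tele-operation loop. The robot and
operator_station objects and their methods are hypothetical stand-ins for
illustration; Robonaut’s actual interfaces are not described in this article.

```python
# Illustrative sketch only: one cycle of a generic bilateral tele-operation loop.
# The robot and operator_station objects are hypothetical stand-ins, not
# Robonaut's actual interfaces.

def teleoperation_cycle(robot, operator_station):
    # Read the forces and joint positions measured by the robot's arm sensors.
    forces = robot.read_force_sensors()
    joints = robot.read_joint_positions()

    # Play those forces back through the operator's haptic joysticks so the
    # astronaut feels what the robot is touching, and update the 3-D display.
    operator_station.render_forces(forces)
    operator_station.update_display(joints)

    # Read the operator's hand and arm motions from the joysticks ...
    commanded_motion = operator_station.read_operator_motion()

    # ... and send them back out to the robot's arms.
    robot.command_motion(commanded_motion)
```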

In a test this summer, for example, O’Malley said human operators using the two
Robonaut prototypes at JSC were asked to hold a soccer ball at arm’s length
between both hands and move it in a circle in front of their chests.
Surprisingly, this very simple act created serious problems for the robot’s
control system.

"They were dropping the ball — literally — more often than not, " said O’Malley.

The problem turned out to be a slight but significant delay in transferring
information between the robot’s arms and the operator.

"They would push with both arms, and they wouldn’t feel anything, so they would
push some more, just to make sure they had a tight grip on the ball," said
O’Malley. "Then, the original signal would finally get to them, and they would
feel the pressure. They would respond by relaxing, but they wouldn’t feel that
right away. Instead, they would feel the increased pressure they had applied
earlier, so they would relax even more, and then the whole cycle would start
over again."

O’Malley and her colleagues were eventually able to solve the problem by
inserting some additional programming that compensated for the delay.
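
The article does not spell out how that compensation works, but the sketch below
illustrates one common idea in delayed force-feedback loops: blend the stale
sensor reading with a local prediction of the force the operator’s most recent
commands should produce. The delay length, stiffness value and helper names are
assumptions made purely for illustration, not the team’s actual code.

```python
# Hypothetical illustration of compensating for delay in a force-feedback loop.
# This is one common approach, not necessarily the fix the Robonaut team used.

from collections import deque

DELAY_STEPS = 5      # assumed round-trip delay, in control cycles
STIFFNESS = 80.0     # assumed stiffness of the grasped object (force per unit of squeeze)

command_history = deque(maxlen=DELAY_STEPS)

def compensated_force(delayed_sensor_force, current_squeeze):
    """Estimate the force the operator should feel now, not DELAY_STEPS cycles ago."""
    command_history.append(current_squeeze)

    # The sensor reading reflects the oldest command still "in flight", so
    # predict the extra force produced by everything commanded since then.
    oldest_squeeze = command_history[0]
    predicted_extra = STIFFNESS * (current_squeeze - oldest_squeeze)

    # Feeding back measurement plus prediction keeps the operator from
    # squeezing harder and harder while waiting for stale readings.
    return delayed_sensor_force + predicted_extra
```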

Like O’Malley’s work, Niedzielski’s is also critical for the control of the
robot. Niedzielski is part of a six-member team that is creating a
voice-recognition system that the tele-operator will use for added control of
Robonaut.

"If your hand movements are being transmitted directly to the robot’s hands,
it’s not like you can reach over and touch a button on a control panel,"
Niedzielski said. "We’d like to give the astronaut the ability to do things like
freeze one arm after they get it positioned just right."
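
As a simplified illustration of what such voice control might look like, the
sketch below maps a few recognized phrases to tele-operation actions. The
phrases and the freeze and release methods are hypothetical examples, not
Robonaut’s actual command vocabulary.

```python
# Illustrative sketch only: dispatching recognized voice commands to
# tele-operation actions. The phrases and the robot.freeze()/robot.release()
# methods are hypothetical, not Robonaut's actual interface.

COMMANDS = {
    "freeze left arm":   ("freeze", "left_arm"),
    "freeze right arm":  ("freeze", "right_arm"),
    "release left arm":  ("release", "left_arm"),
    "release right arm": ("release", "right_arm"),
}

def handle_utterance(recognized_text, robot):
    """Turn the recognizer's output into a robot action; ignore unknown phrases."""
    key = recognized_text.strip().lower()
    if key not in COMMANDS:
        return  # misrecognized or unsupported phrase: do nothing

    action, arm = COMMANDS[key]
    if action == "freeze":
        robot.freeze(arm)    # hold the arm at its current position
    else:
        robot.release(arm)   # return the arm to direct hand control
```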

Niedzielski’s team hopes to design a voice-recognition system that is capable of
operating both on the ground and in space, and is thus sensitive to changes in
vocal quality that result from these very different conditions. Human listeners
can adjust to these changes effortlessly. But scientists don’t yet fully
understand how the human brain interprets language, so they’re hard-pressed to
teach a computer to do it.

For instance, "so much of the meaning we attach to words comes from context, and
it’s very difficult to teach a computer to rely on context," said Niedzielski.

Moreover, the system must be flexible enough to compensate for the physiological
changes that astronauts undergo in orbit. For instance, astronauts’ nasal
passages expand in microgravity, and as a result, the tone of their speech
changes. Niedzielski said the voice-recognition team faces a real challenge in
creating a system that can adapt to these kinds of changes.

O’Malley and Niedzielski both look forward to working on the Robonaut project
for several years. If all goes according to NASA’s timeline, their work and the
work of dozens of other specialists will pay off with Robonaut’s initial flight
in about five years.

[NOTE: An image supporting this article is available at
http://www.riceinfo.rice.edu/projects/reno/rn/20031204/robonaut.html ]