Researchers at Queen’s University are developing a new robotic system to service the more than 8,000 satellites now orbiting the Earth, far beyond the reach of ground-based repair operations. Currently, when these high-flying spacecraft malfunction – or simply run out of fuel – they become “space junk” cluttering the cosmos.

“These are mechanical systems, which means that eventually they will fail,” notes Electrical and Computer Engineering professor Michael Greenspan, who leads the Queen’s project. But because they orbit many thousands of kilometres away, the satellites are beyond the reach of an expensive, manned space flight, and the signal delay over such distances means Earth-based telerobotic repair isn’t possible in real time.

Dr. Greenspan’s solution to this problem is the development of tracking software that will enable an Autonomous Space Servicing Vehicle (ASSV) to grasp the ailing satellite from its orbit and draw it into the repair vehicle’s bay. Once there, remote control from the ground station can be used for the repair, he explains. “The repair itself doesn’t have to be done in real time, since everything is in a fixed position and a human can interact with it telerobotically to do whatever is required.”

The Queen’s team is now working to develop the ASSV with the aerospace company MDA (MacDonald, Dettwiler and Associates) Space Missions, which earlier built the Canadarm and has been responsible for all Canadian systems on the International Space Station.

Computer vision is the main technical challenge in grasping the satellites, Dr. Greenspan continues. These objects circle the globe in “geosynchronous” orbit, meaning their orbital period is synchronized with the Earth’s rotation. The robotic system must first recognize the satellite, then determine its motion and match that motion before grabbing it.

Due to the harsh illumination conditions in space, conventional video cameras are of limited use. The preferred sensor is LIDAR (light detection and ranging), a laser-based counterpart to radar that provides a set of 3D points accurately measuring the surface geometry of the satellite.
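To make the idea concrete, the following is a minimal sketch – an illustration, not the Queen’s team’s actual software – of how such range data might be used. The LIDAR scan is treated as an N x 3 array of 3D points and aligned to a stored surface model of the satellite with a few iterations of point-to-point ICP, a standard registration technique; the model, scan and iteration count are all assumptions made for the example (Python, using numpy and scipy).

import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    # Least-squares rigid transform (R, t) mapping the points in src onto dst.
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(scan, model, iterations=20):
    # Align a LIDAR scan (N x 3) to a model point cloud; returns rotation and translation.
    tree = cKDTree(model)
    R_total, t_total = np.eye(3), np.zeros(3)
    current = scan.copy()
    for _ in range(iterations):
        _, idx = tree.query(current)                 # closest model point for each scan point
        R, t = best_fit_transform(current, model[idx])
        current = current @ R.T + t                  # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

In practice a registration method like this converges only from a reasonable initial guess, which is why recognizing the satellite and estimating a coarse pose must come before fine tracking.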

The Queen’s team, which includes Electrical and Computer Engineering graduate students Limin Shang, Babak Taati and Michael Belshaw, has developed software that allows such a system to identify a satellite, determine its position and then track it in real time, using this specialized range data. They have recently received funding from the Natural Sciences and Engineering Research Council (NSERC) to continue investigating fundamental aspects of this new technology.
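The real-time aspect can be sketched in the same hedged way – again an illustration rather than the published system. Once the satellite’s pose is known in one LIDAR frame, a simple constant-velocity prediction can seed the alignment in the next frame, so each frame needs only a small refinement; the estimate_pose argument below is a stand-in for any range-data registration routine, such as the ICP sketch above.

import numpy as np

def predict_next_pose(prev_pose, curr_pose):
    # Constant-velocity prediction: apply the last frame-to-frame motion once more.
    # Poses are 4 x 4 homogeneous transforms.
    delta = curr_pose @ np.linalg.inv(prev_pose)     # motion between the last two frames
    return delta @ curr_pose

def track(frames, estimate_pose, initial_pose):
    # Run a per-frame pose estimator over a sequence of LIDAR scans,
    # seeding each frame with the predicted pose.
    poses = [initial_pose]
    prediction = initial_pose
    for scan in frames:
        pose = estimate_pose(scan, prediction)       # e.g. a few ICP iterations from the guess
        poses.append(pose)
        prediction = predict_next_pose(poses[-2], poses[-1])
    return poses[1:]

Reusing the previous frame’s result in this way is what keeps the per-frame computation small enough to keep pace with a moving target.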

Another potential terrestrial application of their findings is in the area of “flexible” manufacturing, says Dr. Greenspan. Using vision systems and tracking algorithms, objects can be recognized and followed as they move down a conveyor belt or assembly line. “Once you can do that, automated manufacturing systems can interface much more flexibly with the objects,” he notes. “The result will be a much easier and more cost-effective manufacturing process.”

PLEASE NOTE: A video from Dr. Greenspan’s lab, showing a target object in a real-time tracking sequence, may be viewed online.

Contacts:

Nancy Dorrance, Queen’s News & Media Services, 613.533.2869

Alissa Clark, Queen’s News & Media Services, 613.533.6000, ext. 77513

Attention broadcasters: Queen’s has facilities to provide broadcast-quality audio and video feeds. For television interviews, we can provide a live, real-time double ender from Kingston via fibre optic cable. Please call for details.