The goal of this national space project is to build a satellite equipped with a robot manipulator in order to repair failed satellites in geostationary orbit by human teleoperation from the ground. One of the key features we are developing is the autonomous capture of the target satellite supported by machine vision.
On this page you can see the improved setup of the ESS demonstrator. We simulate the visual-servoing phase in the lab with a two-robot system: one robot carries a satellite mockup with a full-scale apogee engine and performs the typical tumbling motion of a rigid body under zero gravity. The second robot tracks the satellite in order to capture it by inserting a specialized capture tool into the apogee motor.
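The tumbling motion of the mockup can be illustrated by integrating Euler's equations for a torque-free rigid body. This is only a minimal sketch: the inertia values, initial rates, and step size below are illustrative assumptions, not parameters of the actual demonstrator.

```python
import numpy as np

# Illustrative principal moments of inertia [kg m^2] and initial body rates [rad/s].
I = np.array([4.0, 6.0, 8.0])
w = np.array([0.05, 0.02, 0.10])

def euler_step(w, I, dt):
    """One explicit Euler step of Euler's equations without external torque:
    I * dw/dt = -(w x (I * w))."""
    dw = -np.cross(w, I * w) / I
    return w + dt * dw

dt = 0.001
for _ in range(10000):          # simulate 10 s of torque-free tumbling
    w = euler_step(w, I, dt)
```

A quick sanity check on such a simulation is that the magnitude of the angular momentum |I·w| stays (nearly) constant, since no external torque acts on the body.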
The approach of the target satellite is controlled by real-time, model-based machine vision. Once contact between the capture tool and the apogee motor has been established, force/torque sensing takes over. In both phases all six degrees of freedom are controlled. We use our teleoperation system to control the whole capturing sequence.
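The two-phase logic can be sketched as a simple switch between a vision-driven and a force-driven 6-DOF velocity command. The threshold and gains below are hypothetical placeholders, not the controller actually used in the demonstrator.

```python
import numpy as np

CONTACT_THRESHOLD = 2.0   # [N], assumed force level that signals contact

def capture_command(pose_error, wrench, kp=0.5, kf=0.01):
    """Return a 6-DOF velocity command for the current phase.
    pose_error: 6-vector from the model-based vision (position + orientation).
    wrench:     6-vector from the force/torque sensor (forces + torques)."""
    if np.linalg.norm(wrench[:3]) < CONTACT_THRESHOLD:
        # Phase 1: visual servoing drives the pose error to zero.
        return -kp * pose_error
    # Phase 2: force/torque control complies with the measured wrench.
    return -kf * wrench
```

Before contact the command is proportional to the vision-based pose error; after contact it nulls the contact wrench instead, so the tool seats gently in the nozzle.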
The satellite's motion is observed by a Kalman-filter based trajectory estimator.
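As a minimal sketch of such an estimator, the following constant-velocity Kalman filter tracks one translational coordinate of the target from position measurements. The frame rate, noise covariances, and motion model are illustrative assumptions; the real estimator covers the full rigid-body trajectory.

```python
import numpy as np

dt = 0.04                                 # assumed camera frame period [s]
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity transition (pos, vel)
H = np.array([[1.0, 0.0]])                # only position is measured
Q = 1e-4 * np.eye(2)                      # process noise (illustrative)
R = np.array([[1e-2]])                    # measurement noise (illustrative)

def kf_step(x, P, z):
    """One predict/update cycle of the linear Kalman filter."""
    x = F @ x                             # predict state
    P = F @ P @ F.T + Q                   # predict covariance
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
for k in range(100):                      # target drifting at 0.1 m/s
    z = np.array([0.1 * dt * k])
    x, P = kf_step(x, P, z)
```

Fed with a steadily drifting position, the filter's velocity estimate converges to the true 0.1 m/s, which is what lets the tracker predict where the tumbling satellite will be between camera frames.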
3.2 MB MPEG: One complete tracking and capturing cycle.
6.5 MB MPEG: Edited movie with the view from the hand camera and additional sequences of our simulation.
View from the hand camera. The result of the pose estimation is visualised as a green wire-frame model. Note the red regions of interest around the currently tracked features.
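Overlaying the wire-frame model amounts to projecting the model's 3-D points into the image with the estimated pose, and the regions of interest are simply windows placed around those projections. The pinhole intrinsics and ROI size below are assumed values for illustration only.

```python
import numpy as np

f = 800.0              # assumed focal length [px]
cx, cy = 320.0, 240.0  # assumed principal point [px]

def project(points_cam):
    """Pinhole projection of Nx3 camera-frame model points to Nx2 pixel coords."""
    z = points_cam[:, 2:3]
    return f * points_cam[:, :2] / z + np.array([cx, cy])

def roi(center_px, half=12):
    """Axis-aligned region of interest around a projected feature point."""
    u, v = center_px
    return (u - half, v - half, u + half, v + half)
```

Searching for edges only inside these small windows, rather than in the whole image, is what makes model-based tracking feasible at frame rate.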
The capturing has started. The tool is equipped with two rings of distance sensors which guide the tool to the center of the apogee engine. See the simulation below.
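The idea behind the sensor rings can be sketched as follows: when the tool is centred in the nozzle, opposing sensors on a ring read equal distances, so the distance imbalance around the ring yields a lateral correction. Sensor layout, sign convention, and gain are assumptions for illustration.

```python
import numpy as np

def centering_offset(ring, gain=0.5):
    """ring: distances from N evenly spaced radial sensors on one ring (N even).
    Returns an (x, y) correction that pushes the tool toward the nozzle axis."""
    n = len(ring)
    angles = 2 * np.pi * np.arange(n) / n    # each sensor's radial direction
    dev = ring - ring.mean()                 # shorter than average = closer wall
    x = gain * np.sum(dev * np.cos(angles)) / n
    y = gain * np.sum(dev * np.sin(angles)) / n
    return x, y
```

With equal readings the correction vanishes; a short reading on one side produces a command away from that wall, so the two rings together align both the position and the axis of the tool.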
The locking crown is closed; the satellite has been captured. Note the force-controlled movement of the two robots, now combined into one rigid system.
The simulation of the capturing sequence.