Older closed space robotics missions (in alphabetical order)

PSPE



Payload Support for Planetary Exploration

Autonomous planetary exploration will play an important role in future space missions. We have therefore studied the feasibility of robotised planetary exploration in several ESA contracts (ROBUST, PSPE, ROSA-M). Within the European Payload Support for Planetary Exploration (PSPE) project, we proposed a Lander spacecraft configuration and control concept that should allow geo-science operations to be performed on Mars with higher local autonomy than, e.g., the Pathfinder mission's Sojourner rover.

The proposed system consists of the following components:

Lander system with Imaging Head and Nanokhod Rover
  • The Lander system (see above) carries all of the supply systems and the complete control system for the Rover, the scientific instrumentation, and the communication link to the Ground Control Station.

  • The Imaging Head (see below), mounted on top of a vertical cantilever beam, is equipped with a space-applicable stereo camera and a two-degrees-of-freedom pan-and-tilt unit. The cameras are optimised both for taking stereoscopic panorama images of the landing site and for detecting interesting objects around the Lander.

  • The Nanokhod Rover (see below) is a rugged, simple and reliable yet effective track-driven carrier of scientific equipment, optimised to drive to interesting points in the vicinity of the Lander and to carry out in-situ measurements there. Tethered to the Lander via a power and data cable, the Nanokhod benefits from the Lander's supply systems.

Imaging head with pan-tilt unit
Our modified Nanokhod rover
Due to the very large time delay between the ground and space segments (transmission time greater than 20 minutes in the Mars scenario) and the typical limitations of communication bandwidth in space, on-line control is not feasible. This means that the Rover has to be commanded at a very abstract level: the ground control system provides an easy-to-use command interface, optimised for intuitive scientific experiment support without the need for specific robotics knowledge. It is based on a terrain model which is reconstructed on-ground from the stereoscopic panorama images acquired during site exploration. A mission specialist selects interesting sites or objects in the terrain model, to be explored by the Nanokhod – the “long arm” of the Lander. A sophisticated path planning algorithm determines the optimal path with respect to the given constraints (e.g., topography, estimated soil properties and the known driving characteristics of the Rover). In our approach a list of way-points, determined by the path planner on-ground, is uploaded to the space segment for autonomous execution on site.
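
As an illustration of this scheme, the following sketch shows a minimal way-point planner in the spirit described above: an A* search over an 8-connected terrain cost grid, in which impassable cells stand in for constraints such as steep topography or unsuitable soil. Function names, the grid representation and the cost values are illustrative assumptions, not the planner developed within PSPE.

import heapq
import math

def plan_waypoints(cost_map, start, goal):
    """A* search on an 8-connected grid of terrain traversal costs.

    cost_map[r][c] is the cost of entering cell (r, c); math.inf marks
    impassable cells (e.g. too steep or unknown soil). Cell costs are
    assumed to be >= 1 so that the Chebyshev heuristic stays admissible.
    Returns the way-point list from start to goal, or [] if unreachable.
    """
    rows, cols = len(cost_map), len(cost_map[0])
    steps = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

    def heuristic(a, b):
        return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

    g_cost = {start: 0.0}
    came_from = {start: None}
    open_set = [(heuristic(start, goal), start)]
    closed = set()

    while open_set:
        _, cell = heapq.heappop(open_set)
        if cell == goal:
            break
        if cell in closed:
            continue
        closed.add(cell)
        for dr, dc in steps:
            nxt = (cell[0] + dr, cell[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if cost_map[nxt[0]][nxt[1]] == math.inf:
                continue  # impassable cell
            step = cost_map[nxt[0]][nxt[1]] * (math.sqrt(2) if dr and dc else 1.0)
            new_g = g_cost[cell] + step
            if new_g < g_cost.get(nxt, math.inf):
                g_cost[nxt] = new_g
                came_from[nxt] = cell
                heapq.heappush(open_set, (new_g + heuristic(nxt, goal), nxt))

    if goal not in came_from:
        return []
    # walk back from the goal to reconstruct the way-point list
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]

In practice such a cell sequence would presumably be thinned out to a short list of intermediate way-points before upload to the space segment; the full-resolution path is only needed on-ground.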

For the feasibility study we proposed and realised a vision-based approach for precise Rover localisation and guidance. The image processing and 3D-localisation system of the Imaging Head detects significant features of the Rover in the stereo images, determines the current position of the Rover with respect to a reference frame, and controls the Rover motion to reach the desired goal position. To simplify feature detection, four LEDs are mounted on top of the Rover's payload cabin. Using a straightforward blob-finding algorithm applied to the difference images (LEDs switched off/on), the 2D coordinates of the LEDs in the image plane are calculated. To determine the pose of the Rover, a stereo reconstruction algorithm first generates 3D points corresponding to the given pairs of 2D image coordinates; a matching algorithm then aligns these “measured” 3D points with the “modelled” ones to obtain the transformation between the Rover and the camera system. Successful verification was demonstrated not only at ESA's planetary test bed, but also on a specially developed Mars surface model at the institute.
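
The localisation chain described above can be condensed into three steps: extracting the LED blob centroids from the off/on difference images, triangulating them into 3D points, and registering these against the modelled LED positions on the payload cabin. The sketch below illustrates this under simplifying assumptions: a rectified stereo pair, LED correspondences already given in a consistent order, and an SVD-based least-squares fit as a stand-in for the matching algorithm actually used in the study; all function names and parameters are illustrative.

import numpy as np
from scipy import ndimage

def led_image_points(img_off, img_on, threshold=40):
    """2D centroids (u, v) of the LED blobs in the off/on difference image."""
    diff = img_on.astype(np.int16) - img_off.astype(np.int16)
    mask = diff > threshold
    labels, n = ndimage.label(mask)
    centres = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    # in practice only the four most plausible blobs would be kept
    return np.array([(c, r) for r, c in centres])

def triangulate(uv_left, uv_right, f, cx, cy, baseline):
    """3D points in the left-camera frame from a rectified stereo pair."""
    disparity = uv_left[:, 0] - uv_right[:, 0]
    z = f * baseline / disparity
    x = (uv_left[:, 0] - cx) * z / f
    y = (uv_left[:, 1] - cy) * z / f
    return np.column_stack((x, y, z))

def rover_pose(measured, modelled):
    """Least-squares rigid transform (R, t) mapping the modelled LED
    positions (rover frame) onto the triangulated ones (camera frame)."""
    mu_p, mu_q = measured.mean(axis=0), modelled.mean(axis=0)
    H = (modelled - mu_q).T @ (measured - mu_p)          # 3x3 covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                   # proper rotation
    t = mu_p - R @ mu_q
    return R, t

The resulting pose is expressed in the camera frame; transforming it through the known pan-tilt and Lander geometry would then place the Rover in the reference frame of the terrain model.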

Originally developed for space missions, the system can be easily adapted to applications on Earth, e.g., for controlling mobile robots in harsh and dangerous environments.

Related article: A micro-rover navigation and control system for autonomous planetary exploration (see Downloads)


Downloads
A micro-rover navigation and control system for autonomous planetary exploration (0.89 MB)
Related Articles
ROKVISS project
Force-Feedback Joystick (2003–2010)
DEXHAND