The project MaiSHU (Multimodal Perception and Human-Machine Interfaces of Semi-Autonomous Intelligent Systems for Humanitarian Assistance in Uncertain and Unstructured Environments) aims to develop innovative technologies that enable assisted teleoperation of vehicles for humanitarian assistance and delivery of goods in challenging, unstructured environments.
Project details
The two projects KI4HE and MaiSHU continue the work of the AHEAD project and extend it with modern technologies for environment perception, localization, semantic scene analysis, human-machine interfaces, and teleoperation. The core of the MaiSHU project is the functional integration of a robust multimodal perception strategy, complemented by AI-based semantic reasoning about the environment, with a powerful human-machine interface to enable reliable remote vehicle control. Based on satellite imagery together with vehicle and perception data, a global control center provides high-level route plans to ensure safe passage over rough terrain. The technologies developed in this project complement current developments in on-road autonomous driving and enable high technology readiness levels (TRLs) for future commercialization across diverse mobility tasks in partially structured or unstructured outdoor terrain.
The work is distributed among the following work packages:
This collaborative project brings together DLR with the industrial partners Blickfeld GmbH and SENSODRIVE, as well as the Innovation Accelerator of the United Nations World Food Programme (WFP).