SMiLE-AI


Building on previous SMiLE projects, the aim is to further develop the autonomy and robustness of robot systems while also improving their learning and interaction capabilities, so that they can deal intelligently with novel, complex situations. The SMiLE-AI project focuses on these two goals, which are to be achieved using cutting-edge AI technology. Two demonstrators will showcase the long-term use of service robots and the adaptability of assistance robots in care.

  

Duration:

2025-01-07 until 2029-06-30

Fields of application:

  • Healthcare
  • Care robotics

Funding:

Bayerisches Staatsministerium für Wirtschaft, Landesentwicklung und Energie (StMWi)

Robotic assistance systems of the SMiLE2gether project: HUG, Rollin' Justin, EDAN (from left)

Project details

The core element of the research activities in assistive robotics for care is the SMiLE ecosystem for operating heterogeneous robot systems. It comprises different control modes that allow the robots to be commanded in a targeted way, adapted to the current operating environment. These control modes are complemented by remote operation (teleoperation), which is intended to allow autonomously operating robot systems to be deployed at an early stage: from a control centre, trained tele-care assistants can take over remotely whenever a robot's autonomous capabilities are insufficient in a given situation.

With regard to user-facing control of the robots, the SMiLE ecosystem distinguishes several levels of autonomy. Under direct control, the robot is operated via a joystick, for example, and "directly" executes the movement commands issued by the user. At the shared control level, user commands are adapted by AI-based software to make the robot easier to operate. Much like a lane-keeping assistant in a car, this partial autonomy helps keep the robot on the right track for the task at hand; it can, for example, provide the precision needed to position the spout of a water bottle over a glass when pouring a drink. Shared control also helps coordinate all of the robot's degrees of freedom, for instance driving a wheelchair through a door while the robot arm opens it. At the supervised autonomy level, the robot is operated via a graphical user interface, e.g. a tablet app, or via voice commands; individual actions, such as fetching an object or opening a refrigerator or a door, are carried out by the robot on its own. At the highest level of autonomy, a robot should be able to perform complex tasks independently and, if necessary, without a direct human command.

An adaptable assistance system for everyday care, or for the personal support of people with physical impairments, must also allow flexible and smooth switching between these autonomy levels. In addition, the autonomy functions must be extended further for long-term, robust use in real-world environments.
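How these autonomy levels relate to one another can be illustrated with a short code sketch. The following Python snippet is a minimal, hypothetical illustration and not the project's actual software: the class and function names, the proportional correction and the blending weight are assumptions made purely for illustration. It shows how the levels might be represented and how a shared-control layer could blend a raw joystick command with an AI-generated correction toward a task target, such as the glass in the pouring example above.

    # Minimal, hypothetical sketch of autonomy levels and shared-control blending.
    # Names, parameters, and the blending scheme are illustrative assumptions only,
    # not the actual SMiLE-AI software.
    from enum import Enum, auto


    class AutonomyLevel(Enum):
        DIRECT = auto()      # joystick commands executed as given
        SHARED = auto()      # user commands blended with AI-based assistance
        SUPERVISED = auto()  # robot executes discrete actions requested via GUI/voice
        FULL = auto()        # robot plans and acts without direct human commands


    def assistance_toward(target, position, gain=1.0):
        """Proportional correction pulling the end effector toward a task target."""
        return [gain * (t - p) for t, p in zip(target, position)]


    def blend(user_cmd, assist_cmd, alpha=0.5):
        """Shared control: weighted blend of user input and autonomous correction."""
        return [(1.0 - alpha) * u + alpha * a for u, a in zip(user_cmd, assist_cmd)]


    def command(level, user_cmd, position=None, target=None):
        """Velocity command actually sent to the robot at the given autonomy level."""
        if level is AutonomyLevel.DIRECT:
            return list(user_cmd)
        if level is AutonomyLevel.SHARED:
            return blend(user_cmd, assistance_toward(target, position))
        # SUPERVISED and FULL would be driven by a task planner, not joystick input.
        raise NotImplementedError("Higher levels require a task planner, not joystick input.")


    if __name__ == "__main__":
        # Example: keeping a bottle spout aligned over a glass while pouring.
        user_cmd = [0.10, 0.02, 0.00]                 # raw joystick velocity (m/s)
        spout, glass = [0.40, 0.05, 0.30], [0.42, 0.00, 0.30]
        print(command(AutonomyLevel.SHARED, user_cmd, position=spout, target=glass))

In a real system the blending weight would typically vary with task context and inferred user intent rather than being a fixed constant, and switching between levels would be mediated by the user interface rather than a hard-coded argument.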

SMiLE-AI's vision is to take the next big step towards an adaptable and reliable assistance robot ecosystem, making long-term use in realistic environments genuinely possible. Newly available AI methods in particular should help to substantially improve robot autonomy. At present, the long-term use of robot systems in real everyday environments fails primarily because of the limited ability to deal robustly with uncertainty and the limited adaptability of robot skills to novel situations. This applies to autonomous navigation as well as to the manipulation of objects in complex everyday environments. As a result, considerable technical support is required, which stands in the way of the real-world deployment of service robots. Speech recognition and natural language interaction are further important prerequisites for user acceptance and also need to be developed further.

SMiLE-AI's central goal is therefore to increase robot autonomy (i.e. towards supervised autonomy) and to enable longer-term deployment of the robotic ecosystem. To this end, the following essential system properties must be improved:

  • greater robustness (fault tolerance and safe handling of unknown objects or environments)
  • transferability of knowledge and skill representations
  • improved adaptability (learning and updating skills, flexible adjustment of the degree of autonomy)
  • intuitive human-robot interaction (natural language interaction and ergonomic human-machine interfaces) to optimise user-friendliness

An important overarching goal of SMiLE-AI is to develop these features and capabilities to be as platform-independent as possible, i.e. implementable on a wide range of service and assistance robots.
