Recent progress in physical Human-Robot Interaction (pHRI) research has shown in principle that humans and robots can actively and safely share a common workspace. The fundamental breakthrough that enabled these results was the human-centered design of robot mechanics and control, which made it possible to limit potential injuries from unintentional contacts. Previous projects, in particular the PHRIENDS project in which part of the consortium was involved, provided remarkable results in this direction and constitute the foundation for this proposal.
Inspired by these results, SAPHARI will carry out a fundamental paradigm shift in robot development by placing the human at the centre of the entire design. The project will take a major step further along the human-centered roadmap by addressing all essential aspects of safe, intuitive physical interaction between humans and complex, human-like robotic systems in a strongly interconnected manner.
PHRIENDS is about developing key components of the next generation of robots, including industrial robots and assist devices (robots for the emerging market of non-industrial applications, e.g. service, health care, and entertainment), designed to share the environment and to physically interact with people. Such machines must meet the strictest safety standards, yet also deliver useful performance: this poses new challenges to the design of all components of the robot, including mechanics, control, planning algorithms, and supervision systems. We propose an integrated approach to the co-design of robots for safe physical interaction with humans, which revolutionizes the classical approach to designing industrial robots (rigid design for accuracy, active control for safety) by creating a new paradigm: design robots that are intrinsically safe, and control them to deliver performance. Although the scope of this project cannot encompass the integration of complete robot systems, PHRIENDS will create and deliver new actuator concepts and prototypes, new dependable algorithms for supervision and planning, and new control algorithms for safe human-robot physical interaction and fault-tolerant behaviour, and will integrate these components into meaningful subsystems for experimental testing, quantitative evaluation, and optimization. The project will also contribute significantly to the ongoing effort of international bodies towards the establishment of new standards for collaborative human-robot operation.
Many industrial robots are much stronger than humans, but also far less compliant. Humans, for example, can throw objects much further and catch them much more gracefully by temporarily storing energy in elastic tendons and muscles. Such flexible actuators, however, require more sophisticated control algorithms than those used by traditional rigid robots.
The goal of the STIFF consortium is to equip a highly biomimetic robot hand-arm system with the agility, robustness and versatility that are the hallmarks of the human motor system, by understanding and mimicking the variable stiffness paradigms that are so effectively employed by the human central nervous system. A key component of our study will be the anatomically accurate musculoskeletal modelling of the human arm and hand.
The project will develop novel methodologies to understand how the human arm adapts its impedance, e.g. by changing the co-contraction level or by adapting reflex gains. The impedances of the arm and hand will be investigated using powerful robot manipulators capable of imposing force perturbations. Whereas stiffness and elasticity are currently exploited only in artificial laboratory tasks, we will investigate stiffness-dependent behaviour in natural tasks such as throwing a ball or inserting a peg in a hole. Existing closed-loop system identification techniques will be extended with non-linear, time-varying methods to identify behaviour during reaching and grasping tasks. Correlations between grasp force modulation and hand muscle activity will be captured with machine learning techniques and then transferred to the robotic system. Finally, optimization techniques developed and validated on the detailed biophysical model will be transferred to the variable impedance actuation of the novel biomorphic robot.
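The idea that co-contraction modulates the arm's response to force perturbations can be illustrated with a minimal sketch. The following is a hypothetical 1-DOF simulation, not part of the project's models: a point mass held at a target by a spring-damper control law is pushed by a constant external force, and raising the stiffness K (a crude analogue of muscle co-contraction) reduces the resulting deviation.

```python
# Hypothetical 1-DOF sketch of variable-impedance behaviour.
# A point mass tracks the target q = 0 under the law f = -K*q - D*qd,
# while a constant external force f_ext perturbs it. Higher K
# ("co-contraction") yields a smaller maximum deviation.
# All parameter values are illustrative.

def simulate(K, D, m=1.0, f_ext=5.0, dt=1e-3, steps=2000):
    """Euler-integrate m*qdd = -K*q - D*qd + f_ext and return max |q|."""
    q, qd = 0.0, 0.0
    max_dev = 0.0
    for _ in range(steps):
        qdd = (-K * q - D * qd + f_ext) / m
        qd += qdd * dt
        q += qd * dt
        max_dev = max(max_dev, abs(q))
    return max_dev

low = simulate(K=50.0, D=10.0)    # "relaxed" arm
high = simulate(K=500.0, D=30.0)  # "co-contracted" arm
print(low, high)  # the stiffer setting deviates far less from the target
```

The steady-state deviation is f_ext / K, so a tenfold stiffness increase shrinks it roughly tenfold; the same trade-off (stability and accuracy versus metabolic and interaction cost) is what makes impedance adaptation interesting to identify in humans.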
This project aims at developing and exploiting actuation technologies for a new generation of robots that can co-exist and co-operate with people and come much closer to human manipulation and locomotion performance than today's robots do. At the same time these robots are expected to be safe, in the sense that interacting with them should pose no higher injury risk to humans than interaction with another cautious human. This requires that robots of similar size and mass to humans also have comparable power, strength, velocity, and interaction compliance.
This ambitious goal cannot, however, be achieved with existing robot technology, in which robots are designed primarily as rigid position or torque sources and most interaction skills are imposed by control software. Moreover, conventional robots whose interaction is controlled by software alone cannot prevent an impact from damaging the robot, and possibly injuring a nearby human, because the controller reacts only after some delay. This project will develop both a solid theoretical understanding and the enabling technologies for the design of a new generation of robot actuator systems capable of embodying the physical principles that shape the robot's behaviour.
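Why software alone cannot solve the impact problem can be shown with a back-of-the-envelope model (a hypothetical illustration, not taken from the project): during the controller's reaction delay, an unexpected collision is governed purely by mechanics. For a link of reflected inertia m hitting a stiff obstacle at velocity v through a transmission of stiffness k, the undamped peak contact force is F = v * sqrt(k * m), so only lowering the mechanical stiffness k bounds the force regardless of control delay.

```python
import math

# Hypothetical impact model: energy balance 0.5*m*v**2 = 0.5*k*x**2 gives
# a peak spring deflection x = v*sqrt(m/k), hence a peak contact force
# F = k*x = v*sqrt(k*m) before any controller can intervene.
# The numbers below are illustrative, not project data.

def peak_impact_force(v, m, k):
    """Peak force of an undamped mass-spring collision with a rigid obstacle."""
    return v * math.sqrt(k * m)

rigid = peak_impact_force(v=1.0, m=10.0, k=1e6)    # stiff gearbox
elastic = peak_impact_force(v=1.0, m=10.0, k=1e4)  # intrinsically compliant joint
print(rigid, elastic)  # the compliant joint cuts the peak force tenfold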
The project hinges on two systems of such enabling constraints, or synergies, in the hand motor system and in the tactile and kinaesthetic sensory system respectively, and on their interaction. Motor and sensory synergies are also our two key ideas for advancing the state of the art in artificial system architectures for the “hand” as a cognitive organ: indeed, the ultimate goal of THE Hand Embodied project is to learn from human data and hypothesis-driven simulations how to better control, and even design, robot hands and haptic interfaces.