Almost 15 years after being paralysed by a stroke, a 58-year-old American woman was once again able to take a drink of coffee by herself. This was made possible by a state-of-the-art DLR robot arm and hand that she controlled with neural signals sent directly from her brain. It took her just a few moments to grasp the drinking bottle with the robot hand, bring it to her mouth and drink the coffee through a straw. To accomplish this, software decoded neural signals, recorded from a small array of electrodes, that reflected her intention to reach and grasp, and converted them into commands that directed the robot arm and hand. Researchers at the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt; DLR) present the results of their collaboration with Brown University, the United States Department of Veterans Affairs and Massachusetts General Hospital in the 17 May 2012 issue of the scientific journal Nature.
12 April 2011: the BrainGate trial participant attentively tracks the movements of the lightweight DLR robot with a look of concentration. A brainstem stroke had left her unable to speak and without useful movement of anything but her head and eyes. As she pictures herself moving her own arm, her brain sends the associated signals via a four-by-four-millimetre sensor to a computer. Surgeons had implanted the sensor more than five years earlier in the motor area of her cerebral cortex, the thin sheet of neurons on the surface of the brain that transforms multisensory information into motor commands. The computer decodes the signals, and the DLR robot arm and five-fingered hand carry out the decoded instructions, taking the place of her own paralysed arm and enabling her to drink on her own for the first time in almost 15 years. She couldn't help but smile as she brought the straw to her mouth.
"It was a very emotional moment for everyone involved," says Professor Patrick van der Smagt, project leader of the DLR assistive robotic technology that enabled thoughts to become the most complex and sophisticated actions ever performed by a human using direct brain control. He is also professor for machine learning at the Technische Universität München.
Converting nerve signals into movement
In 2005, surgeons affiliated with Brown University in the United States implanted the tiny array of 100 hair-thin microelectrodes into the trial participant's brain. "In 2006 we were able to demonstrate, for the first time, that people with severe paralysis could control the movement of a computer cursor using motor commands decoded from their own neural signal patterns," explains Professor John Donoghue, a neuroscientist at Brown University and the Veterans Affairs Medical Center in Providence, Rhode Island. Donoghue leads neural interface research, along with Dr Leigh Hochberg. "The goal of our research is to develop technology that can help people who have lost functional use of their limbs regain independence," Hochberg said. "Our participants in these pilot clinical trials are tremendously inspiring."
"I wanted to combine this capability with our understanding of robotics," adds DLR scientist Patrick van der Smagt. The DLR Institute of Robotics and Mechatronics already had an extremely manoeuvrable robotic arm and hand: DLR's robot 'Justin' has long been capable of gripping and opening objects, as well as interacting with humans. Not long afterwards, van der Smagt, Hochberg and Donoghue and their teams joined together in a ground-breaking collaborative effort to couple the human brain to a robotic arm and restore the ability to reach and grasp in people suffering from paralysis.
The bionics team at the DLR Institute of Robotics and Mechatronics initiated a feasibility study in 2006. They developed specialised software and used data sets from the BrainGate team to convert neural signals into commands capable of moving the DLR robotic arm. In the initial tests, the participant began by controlling an animation of a robotic arm on a computer screen. "After validating our approach with the robot simulation, we were eager to let the participant control the real robot, which we finally achieved in April 2010," remembers Jörn Vogel, a research engineer at DLR. To enable the scientists to interpret the neural signals correctly, the participant first pictured herself controlling the robot arm while observing pre-programmed movements; the corresponding brain activity was recorded and used to build a 'map' between patterns in her brain's activity and the robot's motion. This map was then used to move the DLR robot in real time in response to her command signals.
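The calibration procedure described above — recording neural activity while the participant imagines following pre-programmed movements, then fitting a map from activity patterns to intended motion — can be sketched as a simple linear decoder. The sketch below is illustrative only: the channel count, the synthetic data, and the choice of ridge regression are assumptions for the example, not the actual BrainGate/DLR implementation.

```python
import numpy as np

# --- Calibration (observation) phase --------------------------------
# Hypothetical data: firing rates from 96 electrode channels, sampled
# while the participant imagines tracking scripted 2-D velocities.
rng = np.random.default_rng(0)
n_samples, n_channels = 500, 96
true_tuning = rng.normal(size=(n_channels, 2))   # unknown neural tuning
velocity = rng.normal(size=(n_samples, 2))       # pre-programmed movements
rates = velocity @ true_tuning.T + 0.1 * rng.normal(size=(n_samples, n_channels))

# Fit the 'map' from neural activity to intended velocity with ridge
# regression (a common decoder choice; an assumption in this sketch).
lam = 1e-3
A = rates.T @ rates + lam * np.eye(n_channels)
W = np.linalg.solve(A, rates.T @ velocity)       # shape (n_channels, 2)

# --- Closed-loop phase ----------------------------------------------
def decode(firing_rates: np.ndarray) -> np.ndarray:
    """Convert a vector of firing rates into a velocity command."""
    return firing_rates @ W

# The decoded velocities should closely track the scripted ones.
err = np.linalg.norm(decode(rates) - velocity) / np.linalg.norm(velocity)
print(f"relative decoding error: {err:.3f}")
```

In closed-loop use, `decode` would be called once per control cycle on freshly binned firing rates, and its output fed to the robot's velocity controller.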
Interaction between humans and robots
"Of course, it is especially important for the robot not to pose any risk to the participant," states van der Smagt. To maximise safety, the robot uses sensors to continuously check for unintended contact with its surroundings. In case of unexpected contact, the integrated control system intervenes, relaxing the robot in a few thousands of a second and rendering it forceless so that it stops with a gentle touch. The system precisely controls the force of the robot hand's grip and the speed of the lightweight arm's movements while it is being positioned by brain signals. This is the first time scientists have coupled the brain to a complex robot able to reproduce a human arm's capabilities.
This research has demonstrated that, even in people who have been paralysed for years, neural signals remain functional at a level where they can be used, for example, to move robotic limbs. "The use of a robot, at least temporarily, enabled the participant to regain some of the functions she used to perform with her own arm and hand," says DLR scientist van der Smagt. The researchers now plan to conduct further tests to examine how cooperation between humans and robotic systems can be expanded so that people with disabilities are able to carry out daily activities successfully.
DLR researchers will be demonstrating the applications of the DLR lightweight robot arm and five-fingered hand at the Fachmesse Automatica trade fair in Munich, Germany, from 22 to 25 May 2012.