A Brain-Machine Interface and Minor Motions Allow Partially Paralyzed Patients to Feed Themselves
Designed to minimize mental strain, the two robotic arms operate under a shared control system that combines a BMI with gesture interfaces.
A team of researchers from the Johns Hopkins Applied Physics Laboratory (APL) and the Department of Physical Medicine and Rehabilitation (PMR) has showcased a new accessibility device: a pair of robotic arms that allows an individual with severely limited movement to feed themselves through a combination of thought and small gestures.
"In order for robots to perform human-like tasks for people with reduced functionality, they will require human-like dexterity. Human-like dexterity requires complex control of a complex robot skeleton," explains David Handelman, PhD, a senior roboticist in the APL's Intelligent Systems branch. "Our goal is to make it easy for the user to control the few things that matter most for specific tasks."
The control system developed by the team is based on a model of shared control, in which a brain-machine interface (BMI) supplies high-level intent while the robotic system handles the low-level details of execution, allowing useful work to be done with minimal mental input.
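The idea of shared control can be illustrated with a minimal sketch: a user's sparse, high-level input is blended with an autonomous policy that fills in the fine-grained motion. Everything below (the 1-D task, the blending rule, and all function names) is an illustrative assumption for explanation, not the controller described by the APL team.

```python
# A minimal sketch of shared control on a 1-D reaching task.
# The user contributes only an occasional coarse "nudge"; an autonomous
# policy supplies the rest of the command. The blending weight and task
# are hypothetical, chosen only to illustrate the concept.

def autonomy_step(position, target, gain=0.2):
    """Autonomous policy: move a fraction of the remaining distance."""
    return gain * (target - position)

def shared_control_step(position, target, user_nudge, alpha=0.3):
    """Blend the user's minimal input with the autonomous command.

    alpha weights the user's contribution; (1 - alpha) weights autonomy.
    """
    return alpha * user_nudge + (1 - alpha) * autonomy_step(position, target)

def run(position=0.0, target=10.0, steps=60):
    """Simulate a reach: the user intervenes only early in the motion."""
    for _ in range(steps):
        # Sparse input: the user nudges only while far from the target.
        user_nudge = 1.0 if position < target / 2 else 0.0
        position += shared_control_step(position, target, user_nudge)
    return position
```

Running `run()` drives the position close to the target even though the user's input is binary and intermittent, which is the point of the approach: most of the precision comes from the machine, not the operator.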
"This shared control approach is intended to leverage the intrinsic capabilities of the brain machine interface and the robotic system," claims Francesco Tenore, PhD, "creating a 'best of both worlds' environment where the user can personalize the behavior of a smart prosthesis. Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines."
Preliminary though the results may be, they are also undeniably impressive: in one demonstration, a participant who has been partially paralyzed for three decades instructs the robotic hands to cut a dessert and feed it to him, using a combination of the brain-machine interface and small motions of his fists.
"This research is a great example of this philosophy where we knew we had all the tools to demonstrate this complex bimanual activity of daily living that non-disabled people take for granted," says Tenore. "Many challenges still lie ahead, including improved task execution, in terms of both accuracy and timing, and closed-loop control without the constant need for visual feedback."
The team's work has been published in the journal Frontiers in Neurorobotics under open-access terms.