For the first time in around 30 years, a partially paralyzed man has once again been able to feed himself thanks to some robotic arms and his own mind. Harnessing the man’s brain signals to control two prosthetic arms, the technology made the once impossible task of cutting and eating cake look like a piece of, well, cake.
The extraordinary feat is described in a new paper published in the journal Frontiers in Neurorobotics and relies on a brain-machine interface to directly connect brain and computer.
The man, who has limited upper-body mobility and is unable to use his fingers, was able to move his fists in response to prompts. The neural signals underpinning these motions were recorded by electrodes implanted in his brain, decoded, and used to control robotic limbs.
Following instructions such as “cut food”, “scoop food”, and, most importantly, “eat food”, the man was able to feed himself with the prosthetic arms, controlled using his mind via the brain-machine interface. And after 30 years of being unable to do so, the achievement was rightfully celebrated with applause and cheers.
“This shared control approach is intended to leverage the intrinsic capabilities of the brain machine interface and the robotic system, creating a ‘best of both worlds’ environment where the user can personalize the behavior of a smart prosthesis,” Dr Francesco Tenore, the paper’s senior author, said in a statement.
“Although our results are preliminary, we are excited about giving users with limited capability a true sense of control over increasingly intelligent assistive machines,” he added.
The study has been a long time in the making, building on more than 15 years of research, but could now offer hope to people with sensorimotor impairments.
“Brain-machine interfaces have the potential to increase the independence of such individuals by providing control signals to prosthetic limbs and re-enabling activities of daily living,” the authors write in their paper.
And the new study demonstrates this is possible, with minimal human input necessary. The robot does most of the heavy lifting, and the user can tailor its behavior to suit them.
“In order for robots to perform human-like tasks for people with reduced functionality, they will require human-like dexterity. Human-like dexterity requires complex control of a complex robot skeleton,” Dr David Handelman, the paper’s first author, explained.
“Our goal is to make it easy for the user to control the few things that matter most for specific tasks.”
The technology is still being optimized and remains some way off clinical implementation. The accuracy and timing of the robot’s movements can still be improved, Tenore explained, and its current reliance on constant visual feedback reduced.
But with more research, the authors are hopeful these issues can be ironed out. The technology could then be used to help “even beyond basic activities of daily living,” Dr Pablo Celnik concluded.