A robotic hand that uses artificial intelligence (AI) to provide unprecedented control for amputee users is currently being developed at École polytechnique fédérale de Lausanne (EPFL). The scientists are coupling their expertise in neuroengineering and robotics to enable wearers to control individual fingers of the device and to grasp objects automatically upon contact. The technology performed successfully in a proof-of-concept study including three amputees and seven healthy participants. The findings were published on September 11 in Nature Machine Intelligence.
A key concept on the neurological side of this system is the decoding of signals from surviving motor nerves that are intended to move a finger. These signals are gathered from the remaining nerves in the amputee’s stump and translated into motor commands for the individual prosthetic fingers, a feat not previously accomplished. On the robotics side, the hand can assist the user by grasping objects and maintaining contact with them, allowing manipulation in a realistic manner.
Conscious Control of the Prosthetic Hand
The machine learning algorithm used in this technique first learns how to decode the user’s intended movements and convert these impulses into finger movements in the prosthetic hand. To train the algorithm, the amputee performs a series of hand movements while sensors on the residual limb detect neuromuscular activity. Through this process, the algorithm learns which muscular signals at the site of amputation correspond to specific hand motions.
Once this training process is complete, the user can convert conscious motor impulses into fine-tuned movements of the individual prosthetic fingers. The algorithm filters out extraneous signals in the limb and focuses only on those relevant to the intended movement.
“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” explained first author Katie Zhuang.
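In spirit, this decoding step is a pattern-recognition pipeline: extract features from short windows of noisy muscle activity, then classify each window as an intended finger movement. The sketch below illustrates the idea on simulated signals using a simple nearest-centroid decoder; the channel count, feature choice, and classifier are illustrative assumptions, not details of the EPFL system.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS = 8  # hypothetical number of sensors on the residual limb
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def make_emg_window(finger_idx, n_samples=200):
    """Simulate one window of rectified muscle activity: each intended
    finger excites a different mix of channels, plus background noise."""
    mixing = np.zeros(N_CHANNELS)
    mixing[finger_idx] = 1.0
    mixing[(finger_idx + 1) % N_CHANNELS] = 0.5   # crosstalk between channels
    signal = rng.random((n_samples, N_CHANNELS)) * mixing
    noise = 0.1 * rng.random((n_samples, N_CHANNELS))
    return signal + noise

def features(window):
    """Mean absolute value per channel -- a common EMG feature."""
    return np.abs(window).mean(axis=0)

# "Training": the wearer repeats each movement while we record windows,
# and we store the average feature vector per movement (nearest-centroid).
centroids = np.stack([
    np.mean([features(make_emg_window(i)) for _ in range(20)], axis=0)
    for i in range(len(FINGERS))
])

def decode(window):
    """Map a new window of muscle activity to the most likely finger."""
    distances = np.linalg.norm(centroids - features(window), axis=1)
    return FINGERS[int(np.argmin(distances))]
```

Averaging over a window is what filters out the noise Zhuang mentions: individual samples are unreliable, but the per-channel activity pattern is stable enough to separate the intended movements.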
Unconscious Assistance from Robotic Technology
The scientists then developed the algorithm to assist users in holding and manipulating objects. When the robotic hand’s sensors encounter an object, the AI algorithm closes the fingers around it. This automatic grasping feature stems from previous research in which robotic arms were built to sense an object’s shape and grasp it based solely on information from contact.
“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” said Aude Billard, leader of EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
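The reflex Billard describes can be pictured as a fast control loop: poll the pressure sensors along the fingers every few milliseconds and tighten the grip the moment the hold pressure drops. The simulation below is a hypothetical sketch of that loop; the slip threshold, grip gain, and tick period are invented for illustration and are not parameters of the EPFL hand.

```python
SLIP_DROP = 0.2   # fractional pressure drop that signals slip (assumed)
GRIP_STEP = 0.05  # how much to tighten the grip per control tick (assumed)

def reflex_controller(pressure_trace, grip=0.5):
    """Run the reflex loop over a sequence of pressure readings, one per
    control tick (imagine ~10 ms per tick). Returns the grip command at
    each tick and the tick at which slip was first detected (or None)."""
    grips, first_slip = [], None
    baseline = pressure_trace[0]
    for tick, p in enumerate(pressure_trace):
        if p < baseline * (1 - SLIP_DROP):
            grip = min(1.0, grip + GRIP_STEP)   # slip detected: tighten
            if first_slip is None:
                first_slip = tick
        else:
            baseline = p   # track the steady hold pressure
        grips.append(grip)
    return grips, first_slip
```

With a millisecond-scale loop period, slip is detected within one tick of onset and the grip ramps up immediately, which is how a device can respond inside the 400-millisecond window quoted above, long before the brain perceives the slip.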
Going Forward with this AI Approach
Despite the success reported in this Nature Machine Intelligence publication, the algorithm must undergo further research and development before it can be integrated into prosthetic devices for amputees. It is currently being evaluated on a robot provided to the EPFL researchers.
“Our shared approach to control robotic hands could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces, increasing the clinical impact and usability of these devices,” concluded Silvestro Micera, EPFL’s Bertarelli Foundation Chair in Translational Neuroengineering, and Professor of Bioelectronics at Scuola Superiore Sant’Anna.