New prosthetic limbs go beyond the functional to allow people to ‘feel’ again
Researchers around the world have been developing prosthetics that closely mimic the part of the human body they would replace. This goes beyond the cosmetic and even the functional; these are bionic body parts that can touch and feel, and even learn new things.
“Touch isn’t a single sense,” said Gregory Clark, associate professor of biomedical engineering at the University of Utah and lead researcher of a recent study. “When you first touch objects with a natural hand, there’s an extra burst of neural impulses.”
The brain then “translates” these into characteristics such as firmness, texture and temperature, all of which are crucial in deciding how to interact with the object, he said. In other words, by using the LUKE Arm (named after the “Star Wars” hero Luke Skywalker and manufactured by Deka), study participant Keven Walgamott, of West Valley City, Utah, was able to “feel” the fragility of a mechanical egg, just as he would have with a natural limb. He could pick it up and transfer it without damaging it.
As he performed everyday activities with the prosthetic — such as holding his wife’s hand, sending a text message and plucking grapes from a bunch — Walgamott told researchers that it felt like he had his arm back. Even his phantom pain was reduced.
“When the prosthetic hand starts to feel like the user’s real hand, the brain is tricked into thinking that it actually is real,” Clark said. “Hence, the phantom limb doesn’t have a place to live in the brain anymore. So it goes away — and with it, goes the phantom pain.”
Clark’s team was able to achieve these results by stimulating the sensory nerve fibers in a “biologically realistic” manner, he said. Using a computer algorithm as a go-between, they delivered digital pulses that closely resemble the signals the brain normally receives from a native arm.
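As a rough illustration of what such a go-between might compute (a sketch under assumed parameters, not the Utah team’s actual algorithm), the Python below maps a fingertip pressure reading to a stimulation rate with a sustained component plus a transient burst when contact begins, echoing the “extra burst of neural impulses” Clark describes:

```python
# Illustrative sketch only: not the Utah team's actual algorithm.
# Maps a fingertip pressure reading (0..1) to a nerve-stimulation rate.

def biomimetic_stim_rate(pressure, prev_pressure,
                         baseline_hz=20.0, gain_hz=180.0, onset_gain_hz=600.0):
    # Sustained component: steady firing scales with how hard the sensor presses.
    sustained = baseline_hz + gain_hz * pressure
    # Transient component: an extra burst whenever pressure rises,
    # mimicking the onset burst of impulses at first contact.
    transient = onset_gain_hz * max(pressure - prev_pressure, 0.0)
    return sustained + transient

# Toy contact event: the hand touches, squeezes, holds, then releases.
prev = 0.0
for p in [0.0, 0.3, 0.6, 0.6, 0.0]:
    print(f"pressure={p:.1f} -> {biomimetic_stim_rate(p, prev):6.1f} pulses/s")
    prev = p
```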
“Participants can feel over 100 different locations and types of sensation coming from their missing hand,” Clark said. “They can also feel the location and the contraction force of their muscles — even when muscles aren’t there. That’s because we can send electrical signals up the sensory fibers from the muscles, so the brain interprets them as real.”
The critical component of a prosthetic powered by thought is the communication channel between the brain and the robotic body part, known as the brain-computer interface (BCI).
The LUKE Arm uses a neural interface, but in other mind-controlled prosthetics, brain implants send instructions to a robotic limb, much as neurons transmit messages from the brain to a muscle. That approach means precision brain surgery, with all the attendant risks, not to mention the expense and recovery time.
This might be about to change.
Bin He, professor and head of biomedical engineering at Carnegie Mellon University, and his colleagues have been working on a noninvasive high-precision BCI, and reported a breakthrough in June: a “mind-controlled robotic arm . . . that demonstrates for the first time, to our knowledge, the capability for humans to continuously control a robotic device using noninvasive EEG signals.”
The term noninvasive is key. Noninvasive BCIs have shown promising results, but only in performing discrete actions, such as pushing a button. When it comes to a sustained, continuous action such as tracking a cursor on a computer screen, noninvasive BCIs have produced jerky, disjointed movements in the robotic prosthesis. In He and his team’s demonstration, the subject used their mind to control a robotic arm tracking a cursor on a computer screen, and the prosthetic finger was able to follow the cursor in a smooth, continuous path, just as a real finger would. More interesting still, although the subject wore a computer-wired EEG cap in the lab, He said the cap is not strictly necessary.
A smartphone app programmed with EEG recordings and wireless electrodes could streamline the process for everyday use, He said.
This could pave the way for thought-controlled robotic devices by decoding “intention signals” from the brain without needing invasive and risky brain surgery, He said.
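What continuous control means at the decoder level can be sketched in a few lines. The Python below is an assumed, minimal form, not He’s published decoder: it maps each window of EEG features to a two-dimensional velocity command and smooths the output, so the arm traces a continuous path instead of the jerky, discrete moves older noninvasive BCIs produced. The weights are random stand-ins for values a real system would fit during calibration.

```python
import numpy as np

class ContinuousEEGDecoder:
    """Toy continuous decoder: EEG band-power features -> smoothed 2-D velocity."""

    def __init__(self, n_features, smoothing=0.9, seed=0):
        rng = np.random.default_rng(seed)
        # Stand-in weights; a real BCI fits these to calibration recordings.
        self.W = rng.normal(scale=0.1, size=(2, n_features))
        self.velocity = np.zeros(2)
        self.smoothing = smoothing

    def step(self, features):
        raw = self.W @ features  # instantaneous decoded velocity
        # Exponential smoothing yields a continuous trajectory
        # rather than disjointed jumps between decoded targets.
        self.velocity = (self.smoothing * self.velocity
                         + (1.0 - self.smoothing) * raw)
        return self.velocity

rng = np.random.default_rng(1)
decoder = ContinuousEEGDecoder(n_features=64)
for _ in range(3):                       # one decode per incoming EEG window
    print(decoder.step(rng.random(64)))  # stand-in band-power features
```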
Just as our native limbs are trained to perform various actions, from basic ones such as grasping or walking to precision ones such as neurosurgery or ballet, prosthetics, too, have to be calibrated for specific uses. Engineers at the lab of Joseph Francis, associate professor of biomedical engineering at the University of Houston, have been working on a BCI that can autonomously update using implicit feedback from the user.
“We are moving toward an autonomous system that will learn to perform new actions as per the user’s intentions with the least supervision from outside, and enable the user to control the prosthetic more independently,” said Taruna Yadav, a PhD student who is part of Francis’s team.
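One way such minimally supervised learning could work, sketched here as an assumption rather than the Houston lab’s published method, is a reinforcement-style update: a scalar feedback signal decoded from the user’s own neural response (for example, an error-related potential) strengthens or weakens whatever mapping produced the last action, with no explicit training labels.

```python
import numpy as np

def update_decoder(W, features, action, implicit_feedback, lr=0.01):
    """Adapt decoding weights from implicit user feedback (assumed rule).

    implicit_feedback: scalar in [-1, 1] decoded from the user's own
    neural response, e.g. an error-related signal. Positive values
    reinforce the feature->action mapping that just fired; negative
    values weaken it. No hand-labeled training data is needed.
    """
    W += lr * implicit_feedback * np.outer(action, features)
    return W

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 64))  # 2-D action from 64 neural features
features = rng.random(64)
action = W @ features                    # the action the decoder just produced
W = update_decoder(W, features, action, implicit_feedback=0.8)
```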
In 2018, a bionic hand developed in a collaboration between Imperial College London and the University of Göttingen used a human-machine interface that interpreted the wearer’s intentions and sent commands to the artificial limb. The team used machine learning-based techniques to interpret neural signals from the brain to improve the performance of prosthetic hands.
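In practice, “machine learning-based techniques” for this usually mean training a classifier that maps windows of recorded signal features to the movement the wearer intends. The sketch below uses synthetic data and a generic scikit-learn classifier; it illustrates the shape of such a pipeline, not the Imperial/Göttingen group’s actual one.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic stand-in data: 200 signal windows, 16 features each (e.g.,
# per-channel amplitudes), labeled with the intended movement:
# 0 = open hand, 1 = pinch, 2 = power grip.
X = rng.normal(size=(200, 16))
y = rng.integers(0, 3, size=200)

clf = LogisticRegression(max_iter=1000).fit(X, y)

new_window = rng.normal(size=(1, 16))    # one fresh window of features
print("decoded intention:", clf.predict(new_window)[0])
```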
“Our main goal is to let patients control the prosthetic as though they were their biological limbs,” said Dario Farina, lead author of the paper about their findings. “This new technology takes us a step closer to achieving this.”
Would there, then, be a difference in the way that bionic body parts would work for an amputee vs. a paralyzed person? No, Yadav said.
“In both cases, it will still read the user’s neural activity and generate a command to control a prosthetic limb,” she said. “However, the time and effort required to learn to control the BCI output may differ.”
Although paralysis may present a higher likelihood of nerve or spinal cord damage, He said that “a noninvasive BCI should apply to both, as long as the subject’s cognitive function is intact. The details of the BCI system can be tailored to the particular needs of the situation.”
And the next frontier? If BCIs and other neural interfaces provide a means to connect our brains with external devices that extend the function of our body, and if they can make paralyzed patients walk again or restore a body part that has been lost to disease or injury, would it be theoretically possible to develop bionic add-ons that could bestow superhuman abilities?
“In a sense, yes,” Clark said. “Indeed, we already do. Glasses restore normal vision to the nearsighted. But telescopes and microscopes allow us to see what would be otherwise unseeable. Canes assist in walking after injury, but fiberglass vaulting poles allow us to clear superhuman heights.
“In clinical applications, exoskeletons provide important assistive technologies after spinal cord injury or stroke,” Clark said. “Yet they can be used to increase the power and endurance of intact individuals.”
But in other ways, bionic parts are no match for nature.
“For all its merits, the LUKE Arm contains only 19 sensors, and generates six different types of movements. Similarly, the neural interface we use can capture or convey hundreds of different electrical signals from or to the brain,” Clark said. “That’s a lot, but both are impoverished compared with the thousands of motor and sensory channels of the human body, or its natural functional capabilities.”
The field of biomedical engineering, Clark added, exists to improve nature when it goes awry. “But we also try to understand and use nature to improve engineering — and ourselves,” he said.