Prosthetic wired to the brain could help amputees feel touch

High five! PNAS 2013

Our ability to grasp and manipulate objects relies on feedback from our sense of touch. Without these signals from the hand, we would have trouble performing even the most basic activities of daily living, like tying our shoes or turning a doorknob. Touch is even critical for emotional communication. We touch the people we care about, and it makes our limbs feel like part of us.

Science has made tremendous advances in technology that taps into signals from the brain to allow patients to move prosthetic limbs. Incorporating real-time sensory feedback would not only increase the dexterity and usefulness of robotic prosthetic limbs, but also make them feel like natural extensions of the body.

In my lab at the University of Chicago, we’re working to better understand how the sensory nervous system captures information about the surface, shape and texture of objects and conveys it to the brain. Our latest research creates a blueprint for building touch-sensitive prosthetic limbs that one day could convey real-time sensory information to amputees and tetraplegics via a direct interface with the brain.

To restore sensorimotor function of an arm, you not only have to replace the motor signals that the brain sends to the arm to move it around, but you also have to replace the sensory signals that the arm sends back to the brain. We think the key is to apply what we know about how the brain processes sensory information, and then to try to reproduce these patterns of neural activity through stimulation of the brain.

The research is part of a project to create a modular, artificial upper limb that will restore natural motor control and sensation in amputees, and it has involved many collaborators from academia, government and industry. Our team is working specifically on the sensory aspects of these limbs.

In a series of experiments with monkeys, whose sensory systems closely resemble those of humans, we identified the patterns of neural activity that occur when the animals naturally hold or manipulate objects. We then successfully induced these patterns through artificial means.

In tasks that required the animals to report when and where the skin was touched and how much pressure was applied, they responded the same way to actual physical contact as they did to artificial stimulation of the sensory cortex of the brain.

These experiments yielded a set of instructions that can be incorporated into a robotic prosthetic arm to provide sensory feedback to the brain through a neural interface. Such feedback will bring these devices closer to being tested in human clinical trials.
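To give a flavor of what such a "set of instructions" might look like, here is a minimal sketch of one piece of it: translating a prosthetic pressure-sensor reading into a stimulation command. The function name, parameter ranges, and the simple linear mapping are all illustrative assumptions of mine, not the published encoding model.

```python
# Hypothetical sketch: map a fingertip pressure reading to a
# stimulation current amplitude for a neural interface.
# The linear mapping and all numeric ranges below are assumptions
# for illustration, not the actual published algorithm.

def pressure_to_stimulation(pressure_n, max_pressure_n=10.0,
                            min_amp_ua=20.0, max_amp_ua=100.0):
    """Linearly map contact pressure (newtons) to a stimulation
    amplitude (microamps), clamped to a safe range."""
    frac = max(0.0, min(pressure_n / max_pressure_n, 1.0))
    return min_amp_ua + frac * (max_amp_ua - min_amp_ua)

# A light touch produces a weak pulse; a firm grasp a stronger one.
light = pressure_to_stimulation(1.0)   # -> 28.0 µA
firm = pressure_to_stimulation(8.0)    # -> 84.0 µA
```

In a real system this mapping would be calibrated against the neural activity patterns recorded during natural touch, so that stimulation evokes a sensation matched in location and intensity.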

The algorithms used to decipher motor signals have come a long way: users can now control arms with seven degrees of freedom, the number of independent ways the human arm can move. It's very sophisticated.
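For readers curious what "seven degrees of freedom" means concretely, the conventional anatomical count for the human arm can be enumerated. This breakdown is standard anatomy used for illustration, not a description of any particular prosthetic's joint layout.

```python
# The conventional seven degrees of freedom of the human arm:
# three at the shoulder, one at the elbow, three at the wrist/forearm.
ARM_DOF = [
    "shoulder flexion/extension",
    "shoulder abduction/adduction",
    "shoulder internal/external rotation",
    "elbow flexion/extension",
    "forearm pronation/supination",
    "wrist flexion/extension",
    "wrist radial/ulnar deviation",
]

assert len(ARM_DOF) == 7
```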

But I think there’s a strong argument that prosthetics that seek to be extensions of ourselves will not be clinically viable until sensory feedback is also incorporated into the system. When it is, the functionality of these limbs will increase substantially.