Vikram Pandit looks to the right while reaching out to pick up a red plastic cup on the table to his left. He grasps the cup, and promptly crushes it. “Oops,” says Pandit, grinning sheepishly. The young, dark-haired college junior is not good at handling objects without looking. He was born without a left hand and relies on a prosthesis on that side of his body.

Pandit’s prosthetic hand—a motorized robotic gripper with a silicone glove matched to his skin tone—operates via electrodes attached to his forearm that pick up signals from his muscles. The hand enables Pandit to do many things, but handling fragile objects is not one of them. Once the hand receives a signal to close, it will continue to close unless Pandit consciously wills it to stop. This is because, unlike those of a natural hand, the fingers neither feel the pressure of the cup’s thin plastic walls nor sense that they should ease their grip.

In a small two-room office next to a roaring highway in downtown Los Angeles, mechanical engineer Jeremy Fishel helps Pandit unscrew his prosthetic and attach a different one. It is a shiny black robotic hand with only three fingers, each sporting a bright, seafoam-green fingertip. Pandit repeats the task with the new hand—picking up a cup without looking. This time, he deftly grasps the cup and lifts it into the air, unscathed. He smiles broadly. “I didn’t even have to think about what I was doing,” says Pandit. “It’s nice.”

If you can’t feel anything, your hand is pretty close to useless. That is basically the problem with prosthetic hands for the last century—no touch.
—Gerald Loeb, University of Southern California


The green fingertips, called BioTacs, are sensors that each detect multiple sensations: pressure, temperature, and vibration. (See first set of photographs below.) Using that information, the second prosthetic hand stopped on contact with the cup instead of crushing it—a reflex that today’s prosthetic hands do not have. In fact, there are currently no prosthetic hands available, commercially or in clinical trials, that provide a sense of touch.
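For readers curious what such a stop-on-contact reflex looks like in software, here is a minimal sketch of the idea under assumed interfaces: the sensor and motor functions (read_fingertip_pressure, set_grip_speed) and the threshold value are hypothetical placeholders, not SynTouch's or any prosthesis maker's actual API.

```python
# Minimal sketch of a stop-on-contact grip reflex (hypothetical interfaces).
import time

CONTACT_THRESHOLD = 0.15  # arbitrary pressure units; assumed calibration value
CLOSE_SPEED = 0.3         # fraction of maximum closing speed

def read_fingertip_pressure() -> float:
    """Placeholder: return current fingertip pressure from the tactile sensor."""
    raise NotImplementedError

def set_grip_speed(speed: float) -> None:
    """Placeholder: command the hand motor (0.0 = hold, positive = close)."""
    raise NotImplementedError

def close_until_contact(timeout_s: float = 3.0) -> bool:
    """Close the hand until the fingertip senses contact, then hold.

    Returns True if contact was detected before the timeout.
    """
    deadline = time.monotonic() + timeout_s
    set_grip_speed(CLOSE_SPEED)
    try:
        while time.monotonic() < deadline:
            if read_fingertip_pressure() >= CONTACT_THRESHOLD:
                return True          # contact felt: stop closing and hold
            time.sleep(0.005)        # roughly a 200 Hz control loop
        return False                 # no contact before the timeout
    finally:
        set_grip_speed(0.0)          # always stop the motor on exit
```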

The first cochlear implant, called a “bionic ear,” went on the market in 1972, restoring hearing to many patients with damage to sensory hair cells. Retinal implants for degenerative eye conditions have shown success in clinical trials, providing sight to the blind. But there are no implants or systems to restore touch. And that, says Gerald Loeb, co-inventor of the BioTac, is a tragedy.

“Anyone who has ever had their fingers go numb from the cold knows that no matter how well your muscles are working, if you can’t feel anything, your hand is pretty close to useless,” says Loeb, a biomedical engineer at the University of Southern California who has studied neural prosthetics for the last 40 years and co-invented the cochlear implant. “That is basically the problem with prosthetic hands for the last century—no touch.”

Because of a lack of touch, upper-limb prosthetic users like Pandit must look at their prosthetic hands the whole time they use them. “Using an upper-limb prosthesis is incredibly cognitively involved,” says Paul Marasco, a sensory neurophysiologist at the Advanced Platform Technology (APT) Center at the US Department of Veterans Affairs. “You’re essentially substituting vision for touch.”

Unfortunately, the prosthetics research community has put most of its efforts into making arms with wider ranges of motion and more powerful motors, not tactile feedback. It’s like building a car with an engine but no wheels, muses Loeb, and then declaring, “Let’s make a better engine!”

But there may be a good reason for that. Touch, researchers agree, is our most complicated sense. It involves the perception of three-dimensional shapes and sizes, textures, vibrations, temperature, and pressure. Touch provides a sense of where our limbs are in space (a poorly understood phenomenon called proprioception: see sidebar, “Sixth Sense,” at the end of the article) and detects a huge range of forces, from the lightest brush of a cat’s fur on one’s fingertips to the smack of a baseball into one’s palm.

Today, a small number of researchers pursue the incorporation of touch into prosthetics, with eager patients spurring them on. Pandit, for one, aspires to be a heart surgeon. “Yeah,” he says with a wry smile, “I’ll need touch for that.”

SUPER SENSOR: University of Southern California junior Vikram Pandit (top left) uses a prosthetic hand equipped with BioTac sensors to delicately pick up a foam peanut and a water bottle. BioTac creator Jeremy Fishel tests the sensors (top right). Single BioTac sensor with a clear silicone “skin” shows the bone-like electronic core.
Credit: Megan Scudellari

The research front lines


Colonel Geoffrey Ling, a critical-care physician and neurologist, was first deployed to Afghanistan in 2003. “There was not a single day that went by when I wasn’t taking care of a patient that was missing a limb,” says Ling. But when he returned to the United States, Ling realized that upper-limb prosthetic options for returning vets were limited. Leg prosthetics are far more advanced than their upper-limb counterparts, simply because artificial arms and hands make up a much smaller slice of the prosthetics market and designing them is “doggone hard,” says Ling.

Some patients reject the equipment because of the complete lack of sensory feedback.


Upper-limb prosthetics have long been a difficult medical and engineering challenge. “Body-powered” arms, often equipped with a hook in place of a hand, were invented after the Civil War and refined during World Wars I and II, with few changes since. A harness and cable around the shoulder of the intact arm give patients motor control of the prosthesis through movements of the intact limb and shoulder. But according to surveys, fewer than half of patients with this type of prosthesis wear it regularly because it is uncomfortable and “ugly as sin,” says Ling. The device does provide one thing patients like—some sensory feedback from the tug and pull of the cable. Other patients choose a more advanced, motorized prosthetic—an electrically operated arm like Pandit’s—but some of them later reject the equipment because of the complete lack of sensory feedback.

“We had a patient population in need,” says Ling. “We’re the Department of Defense [DOD], and we have a saying in the military that no one should be left behind. We’re serious about that.” After returning from his second deployment, this time to Iraq, in 2006, Ling initiated the Defense Advanced Research Projects Agency’s (DARPA) Revolutionizing Prosthetics program, a “Manhattan Project” for prosthesis innovation that pumped $153 million into prosthetic arm research over the next six years.

At the program’s kickoff meeting in 2006, Gerald Loeb sat in the audience. There to present modeling software for new prosthetic designs, Loeb stayed to hear other presenters discuss their ideas for incorporating touch (one of the goals of the program was to create prosthetic fingers so sensitive that users could read Braille). After the presentations, Loeb could only think one thing: “This project is doomed.” Teams were proposing über-sensitive and delicate devices spiked with gold wires, which were neither practical nor inexpensive enough for everyday use.

After the session, Loeb and touch expert Roland Johansson of Umeå University in Sweden went to a bar. As the alcohol began flowing, they commiserated about getting involved with a “foolish” program, one that was throwing tens of millions of dollars at a sophisticated arm with no chance of having a sense of touch. They began to brainstorm their own idea for a bionic finger. On a cocktail napkin, they sketched out what would become the BioTac: a bone-like electronic core dotted with electrodes and surrounded by an electrically conductive liquid, all encased in an inexpensive, elastic silicone skin. As the finger touches an object, they proposed, the “skin” would deform, press into the fluid and change the electrical resistance sensed by the electrodes on the “bone.” Voilà!—a touch sensor.
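The napkin sketch translates into a simple signal-processing idea: compare each electrode's impedance to its no-contact baseline and treat the pattern of changes as a map of where, and roughly how hard, the skin is being pressed. The sketch below illustrates that idea only; the electrode count, units, threshold, and sign conventions are assumptions, not the BioTac's actual specifications or calibration.

```python
# Illustrative sketch of the core-electrode-fluid sensing idea (assumed numbers).
import numpy as np

N_ELECTRODES = 19  # electrode count assumed for illustration

def contact_estimate(baseline: np.ndarray, reading: np.ndarray,
                     positions: np.ndarray, threshold: float = 0.05):
    """Estimate whether, and roughly where, the skin is being pressed.

    baseline  -- impedance values measured with nothing touching the finger
    reading   -- current impedance values, one per electrode
    positions -- (N_ELECTRODES, 3) electrode coordinates on the rigid core
    """
    delta = np.abs(reading - baseline)       # how far each electrode has shifted
    total = float(delta.sum())
    if total < threshold:
        return None                          # no meaningful deformation: no contact
    # Weight each electrode's position by how strongly it responded to get a
    # rough contact location; the summed shift serves as a crude force proxy.
    centroid = (delta[:, None] * positions).sum(axis=0) / total
    return {"force_proxy": total, "location": centroid}
```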

Loeb proposed the idea to DARPA, but the agency declined to fund it. “They were too smitten with the others,” he says. In the end, Loeb and three graduate students, including Fishel, founded a company called SynTouch and received a small business grant from the National Institutes of Health to create the sensor. Over the next several years, they developed the BioTac.

TOUCH THIS: A motorized tactor developed by Kinea (bottom right) mechanically stimulates an alternative body surface (bottom left) to “play back” the sensations picked up by fingertip sensors (top left) of a prosthetic hand. The Modular Prosthetic Limb (top right and featured image), developed by the Johns Hopkins Applied Physics Laboratory, uses the Kinea sensors in its fingertips.
Credit: HDT Global; Johns Hopkins University Applied Physics Laboratory

A call to arms


While Loeb pursued the BioTac, DARPA’s Revolutionizing Prosthetics program was busy with the development of two new prosthetic arms, both initially intended to include touch sensors. The first, named the “Luke” arm after Luke Skywalker, was designed to incorporate new software and hardware technologies while getting to the market as quickly as possible. It met that goal with flying colors: the Luke arm, made by DEKA Research & Development Corporation in Manchester, New Hampshire, went from idea to product in 5 years and is currently awaiting FDA approval. The second arm, the Modular Prosthetic Limb (MPL) developed by the Johns Hopkins University Applied Physics Laboratory, is being designed to function as much as possible like a human limb: controlled by conscious thought—via electrodes that pick up muscle impulses, as in Pandit’s myoelectric hand, or other techniques—able to curl 45 pounds at the elbow, and capable of recognizing and manipulating objects through touch.

What if there were a way to return a sense of touch to the right place—to make an amputee actually feel his or her hand again?


During the development of the Luke and MPL arms, a small Illinois-based engineering company called Kinea Design, started by three Northwestern University professors and recently acquired by the robotics division of military manufacturer HDT Global, built fingertip sensors for each arm. The Luke sensor, a small, hard device the size of a fingertip pad, detects pressure in two directions as well as contact and texture. The MPL sensor detects pressure in three directions, texture, and temperature. It also detects contact at four discrete points, motivated by the idea of being able to read Braille (though it is not able to do so). In the end, the Luke arm did not include Kinea’s fingertip sensors, according to DARPA, but the MPL arm, which is still being tested, incorporates the sensors designed for it.

The BioTac, on the other hand, mimics a natural finger, down to the smallest details. For instance, it has a white “fingernail” screwed onto its top. During construction of the sensor, Loeb’s team realized that fingernails play a critical role by holding skin in place: if you slide your finger side to side on a flat surface, for example, you will feel the skin stretch toward and away from the nail. Receptors around your nail pick up on that movement. The BioTac also has fingerprints, which turn out to be crucial for sensing texture and vibration.

Thanks to those components, BioTacs are actually more sensitive than your own fingers. This past June, Fishel and Loeb published results demonstrating that a BioTac can correctly identify 117 textures 95 percent of the time—more accurately than human subjects.[1. J.A. Fishel, G.E. Loeb, “Bayesian exploration for intelligent identification of textures,” Front Neurorobot, 6:4, 2012.] The sensors are also able to identify the softness or hardness of a material, similar to the way our fingers test the ripeness of a peach or gauge the amount of air in a bicycle tire.[2. Z. Su, et al. “Use of tactile feedback to control exploratory movements to characterize object compliance,” Front Neurorobot, 6:7, 2012.]
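The cited approach is Bayesian: each exploratory movement yields a vibration measurement, and each measurement updates a probability distribution over candidate textures until one stands out. The toy classifier below shows only that updating loop; the Gaussian feature models and the numbers in the example are invented for illustration and are not values from the paper.

```python
# Toy Bayesian texture identification: repeated measurements sharpen a posterior.
import numpy as np
from scipy.stats import norm

class TextureClassifier:
    def __init__(self, means, stds):
        """means, stds: per-texture mean and std of a scalar vibration feature."""
        self.means = np.asarray(means, dtype=float)
        self.stds = np.asarray(stds, dtype=float)
        self.posterior = np.full(len(self.means), 1.0 / len(self.means))

    def update(self, feature: float) -> np.ndarray:
        """Bayes update of the texture posterior given one new measurement."""
        likelihood = norm.pdf(feature, loc=self.means, scale=self.stds)
        self.posterior *= likelihood
        self.posterior /= self.posterior.sum()
        return self.posterior

    def best_guess(self):
        return int(np.argmax(self.posterior)), float(self.posterior.max())

# Example: three candidate textures; three strokes converge on the second one.
clf = TextureClassifier(means=[0.2, 0.5, 0.9], stds=[0.1, 0.1, 0.1])
for measurement in [0.48, 0.52, 0.47]:
    clf.update(measurement)
print(clf.best_guess())   # index of the most probable texture and its probability
```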

Returning touch


Sensors have clearly become sophisticated, but a prosthetic touch system requires a second critical component: a way to deliver that sensory information to the user’s nervous system. Currently, the most advanced technique for providing tactile feedback is the tactor: a small electronic device, the size of a bite-size candy bar, that buzzes, pushes, or otherwise mechanically stimulates an alternative body surface, typically the shoulder or chest, to “play back” the sensations picked up by the sensors. Pandit, for example, wears a black air cuff around his upper arm when using the BioTacs. It inflates and squeezes his arm to tell him how much pressure he is using when holding an object.

The air pressure cuff is a rudimentary tactor. For the Revolutionizing Prosthetics program, Kinea developed a small, motorized tactor with a rounded plastic tip that touches the skin to convey four sensations—tapping for contact, a push for pressure, vibration for texture, and side-to-side movement for friction.
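In software terms, a tactor driver is just a mapping from sensor channels to actuator commands. The sketch below pairs the four sensations named above with four hypothetical tactor actions; the class names, fields, and scaling are assumptions for illustration, not Kinea's or HDT's actual control code.

```python
# Hypothetical sensor-to-tactor mapping: each sensor channel drives one tactor mode.
from dataclasses import dataclass

@dataclass
class FingertipReading:
    contact: bool      # is the finger touching something?
    pressure: float    # normal force, normalized 0..1
    vibration: float   # texture-related vibration amplitude, 0..1
    shear: float       # lateral (friction) force, -1..1

class Tactor:
    """Hypothetical tactor driver exposing the four actions named in the text."""
    def tap(self): ...
    def push(self, depth: float): ...
    def vibrate(self, amplitude: float): ...
    def slide(self, offset: float): ...

def relay(reading: FingertipReading, tactor: Tactor, was_in_contact: bool) -> bool:
    """Translate one sensor reading into tactor commands; return new contact state."""
    if reading.contact and not was_in_contact:
        tactor.tap()                       # a tap signals first contact
    tactor.push(reading.pressure)          # push depth tracks grip pressure
    tactor.vibrate(reading.vibration)      # vibration conveys texture
    tactor.slide(reading.shear)            # side-to-side motion conveys friction
    return reading.contact
```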

Unfortunately, “putting mechanical senses on other parts of the skin is fundamentally limited,” says Loeb, because the hairy skin that covers much of our body, including the areas where tactors are placed, is far less sensitive than the hairless skin of our hands and fingertips, called glabrous skin. And tactors, though designed to convey multiple sensations, have difficulty providing more than one sensation at a time, such as pressure and texture simultaneously, or input from multiple fingertips.

Tactors may not give ideal results, but they do provide some benefit. Pandit’s air cuff, for example, allows him to shake someone’s hand without squeezing it dangerously hard.

But what if there were a way to return a sense of touch to the right place—to make an amputee actually feel his or her hand again? Purely by accident, that science fiction fantasy recently became reality.

Bionic woman


Claudia Mitchell first felt her hand again in the shower. As hot water poured over her head and splashed onto her upper chest above her heart, she felt as if it were splashing onto her left hand, a hand she had lost a year earlier in a motorcycle accident. Mitchell jumped out of the shower and dialed the Rehabilitation Institute of Chicago (RIC). “Guess what?” she squealed joyfully into the phone. “It’s working! I can feel it!”

TOUCH PIONEERS: Amputees Claudia Mitchell and Jesse Sullivan talk during a September 2006 press conference at the National Press Club. Both underwent a revolutionary surgery that allowed them to feel their missing hands by transplanting sensory nerves into their upper chests.
Credit: © Matthew Cavanaugh/epa/Corbis

Mitchell, a 31-year-old ex-Marine, was the first individual to undergo targeted sensory reinnervation, a revolutionary surgical procedure to transplant sensory nerves from the shoulder of a missing arm to the pectoral muscles and chest skin. Developed by clinician and biomedical engineer Todd Kuiken, director of RIC’s Center for Bionic Medicine, the procedure was first performed as targeted muscle reinnervation (TMR), designed solely to restore motor control.[3. T.A. Kuiken, et al., “Redirection of cutaneous sensation from the hand to the chest skin of human amputees with targeted reinnervation,” PNAS, 104:20061-66, 2007.]

When individuals lose a limb, no matter how high the point of amputation, they still retain the ends of the nerves that used to travel to that limb. But the electrical signals in those severed nerves are too weak for prosthetic electrodes to pick up on their own, so Kuiken designed a surgery to reroute the nerves to chest muscles, which contract and amplify the nerve signals.

But the first time Kuiken performed the surgery, on a double arm amputee named Jesse Sullivan, something strange happened. Six months after the surgery, as Sullivan was preparing to work on his motor control, an assistant wiped his chest with alcohol. Sullivan said it felt cold, but not on his chest. It felt cold on his missing hand. “That was the bingo moment,” says Kuiken. “We went to do that on purpose the next time.”

Mitchell had the surgery in 2005, this time to deliberately return a sense of touch. The surgical team cut a sensory nerve leading to a softball-size patch of Mitchell’s chest and in its place attached the sensory nerve end from her left shoulder, which used to travel down to her left hand. “They rewired me to let that part of my skin be the sensors for my hand,” says Mitchell. “When you put your finger on a certain place on my chest, it feels like my pinkie finger, or my index finger, or my palm or knuckle.”

The next step was to add a sensor and tactor so that she might feel what her prosthetic hand feels. Mitchell used a prosthetic arm built at the RIC with tactors developed by Kinea poised above her reinnervated chest area. At the first touch of a ridged surface with a prosthetic finger, Mitchell felt the corresponding vibration, via the tactor, in her own finger. “It was a cool thing to be able to feel,” she says. “You don’t realize how important it is to have feedback in the right place until you do.”

The sensory reinnervation has been performed on only five patients, says Kuiken, because there isn’t a commercially available tactor system to take advantage of it. Right now, the reinnervated chest area has only so much space, most of it taken up by motor control electrodes, so there is little room for tactors, whose motors can interfere with the electrode antennas anyway. “We can do it in the lab and it’s cool,” says Kuiken, “but it’s not yet useful.”

Mitchell looks forward to the day when it will be, and until then, she continues to visit the lab to help advance the technology. “I don’t want there to be another 23-year-old young woman who loses her arm, but there will be,” says Mitchell. “And when that happens, I want her to have a better experience than I had.”

Mind control


Sensory reinnervation returns a sense of touch to the right spot, but the simple push and buzz of a tactor conveys only a fraction of the information picked up by sensors like the BioTac. Today, a growing number of researchers believe that the best way to provide tactile feedback to prosthetic users isn’t via tactors and sensors, but by taking a cyborg route—inputting electrical signals directly into the nervous system.

NEXT-GEN ARM: The DEKA Arm, awaiting FDA approval, has torque sensors in the joints of the index finger and thumb to record force measurements, but it does not include fingertip sensors.
Credit: DARPA

In September 2011, Tim Hemmes, a quadriplegic as the result of a motorcycle accident that damaged his spinal cord, reached up to tenderly touch hands with his girlfriend. “I got to reach out and touch somebody for the first time in seven years,” said Hemmes in a video recorded shortly after the event. Hemmes did so by controlling the MPL arm via an electrode grid, about the size of a large postage stamp, surgically implanted beneath his skull and resting on the surface of his brain. The grid detects electrical signals in the brain and uses those signals to move the robotic arm.
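Decoders of this general kind are often linear: a calibration session pairs recorded neural features with intended movements, and a least-squares fit yields weights that translate new brain activity into velocity commands for the arm. The sketch below shows that generic recipe only; it is not the decoder used with the MPL, and the feature definitions are assumptions.

```python
# Generic linear decoder: neural features -> 3-D velocity command (illustrative).
import numpy as np

def train_decoder(neural_features: np.ndarray, hand_velocity: np.ndarray) -> np.ndarray:
    """Fit weights W so that neural_features @ W approximates recorded velocity.

    neural_features -- (T, n_channels), e.g. band power per electrode per time bin
    hand_velocity   -- (T, 3) velocity targets recorded during calibration
    """
    W, *_ = np.linalg.lstsq(neural_features, hand_velocity, rcond=None)
    return W   # shape (n_channels, 3)

def decode(features_now: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Map the current feature vector to a 3-D velocity command for the arm."""
    return features_now @ W
```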

It was a major advance for brain-controlled prostheses in paralyzed individuals—for whom body-powered and myoelectric prostheses do not work—but while Hemmes could move the arm, he could not feel his girlfriend’s hand. The MPL hand includes tactile feedback sensors; in its current design, however, the neural implant cannot deliver signals from those sensors back to the brain. Johns Hopkins bioengineers hope to add that feedback capability to the implant for testing in clinical trials next year, says Michael McLoughlin, program manager for the prosthetics project at Hopkins, but there is still a lot to learn about how touch information is organized in the cortex and how to deliver it there safely.

Nonhuman primates are providing some insight. In May, neuroscientist Miguel Nicolelis and colleagues at Duke University in Durham, North Carolina, announced that monkeys with a brain implant could use thought alone to move a virtual arm and distinguish textures.[4. J.E. O’Doherty, et al., “Active tactile exploration enabled by a brain-machine-brain interface,” Nature, 479:228–31, 2011.] The monkeys directed a virtual arm to move over identical objects, each with a different texture. Each texture was communicated to the brain as a unique pattern of electrical signals, and monkeys were able to identify a particular texture to receive a reward. “The cortex learned a new electrical language associated with textures,” says Nicolelis.
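In outline, the encoding works like a lookup table: each virtual texture is assigned its own temporal pattern of microstimulation pulses, and the animal learns which pattern means which texture. The sketch below illustrates the idea with arbitrary pulse rates; the actual stimulation parameters used in the study are not reproduced here.

```python
# Illustrative texture-to-stimulation mapping with made-up pulse rates.
TEXTURE_PATTERNS_HZ = {   # assumed pulse rates, not the study's parameters
    "smooth": 0,          # no stimulation
    "coarse": 100,        # pulses per second
    "fine":   200,
}

def stimulation_times(texture: str, duration_ms: float):
    """Yield pulse times (in ms) for the pattern assigned to a texture."""
    rate_hz = TEXTURE_PATTERNS_HZ[texture]
    if rate_hz == 0:
        return
    period_ms = 1000.0 / rate_hz
    t = 0.0
    while t < duration_ms:
        yield t           # one microstimulation pulse delivered at time t
        t += period_ms

# Example: "coarse" felt for 200 ms arrives as a 100 Hz pulse train.
print(list(stimulation_times("coarse", 200)))
```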

A brain-computer interface that restores a sense of touch still smacks of science fiction, but efforts to develop it are more advanced than attempts to tap into peripheral nerves, which is how cochlear and retinal implants function. With touch, researchers have tried and failed to stimulate peripheral nerves—nerves outside the brain and spinal cord, such as those that travel up and down the arm and hand—for more than 40 years.

“Selecting nerves and knowing what information those nerves are responsible for conveying is still a black art,” says J. Edward Colgate, cofounder of Kinea and a touch expert at Northwestern University. Scientists don’t understand which information—pressure, vibration, temperature, etc.—travels via which nerve or how those signals are processed in the brain. Most attempts to stimulate peripheral sensory nerves result in a general buzzing or prickling feeling. “It can be strong or weak, but none of it feels like normal touch,” says Loeb.

A sense of self


In addition to improving motor control, there is another, perhaps more important, reason to add a sense of touch to prosthetics. In a study published last year in Brain, Marasco, Colgate, and Kuiken found that sensory reinnervation patients felt a sense of ownership of their artificial limb when using fingertip sensors and tactors—a feeling that the limb was actually theirs.[5. P.D. Marasco, et al., “Robotic touch shifts perception of embodiment to a prosthesis in targeted reinnervation amputees,” Brain, 134:747-58, 2011.]

“Right now, amputees put a limb on and it is an insensate tool,” says Marasco. “They use it to do chores, and then take it off. But if we set it up so they have a sense of touch and it is embodied appropriately, then they might be able to retain their sense of wholeness.” At the APT Center’s labs in Cleveland, Ohio, Marasco is now designing a sensory feedback system for patients like Mitchell to take home and use with their prosthetic for a year, rather than periodically in the lab. He hopes the experiment will reveal whether long-term tactile feedback helps patients use a limb more effectively and with more ownership.

Back in SynTouch’s offices in Los Angeles, Pandit uses a prosthetic hand equipped with BioTacs to squeeze a bottle of water as a technician tracks the pressure on a computer screen. The company has yet to find a partner to incorporate the sensors into commercial prosthetic hands.

Loeb, sitting nearby, glances at the young man. “We’re at that place where there are lots of possibilities, some of which we thought we’d solve 30 years ago,” he says. “I have to think that with the range of money, technologies, and new knowledge that’s being thrown at this problem, a lot of good things are going to happen.”

The water suddenly spurts all over the floor, and Pandit laughs. He reaches out with his prosthetic hand, grasps the water bottle, and begins again.
