
In contrast, Hemmes’ chip sat on the surface of his motor cortex, a less invasive method that records from groups of cells. The size of two postage stamps, it’s based on a kind of electrical signal mapping used to track seizures in epilepsy patients.

Both approaches need study, says Daofen Chen of the National Institutes of Health, who oversees neurorehabilitation research. He compares the options to eavesdropping on a party by sending in individual microphones or setting up a recorder at the window.

Boninger adds that scar tissue can blunt the penetrating electrodes over time, and the surface chips may be easier to convert to a wireless system, which is important for commercial use.


Hemmes’ operation took two hours. He had practiced imagining arm movements inside brain scanners, to see where the electrical signals concentrated. That’s where neurosurgeon Elizabeth Tyler-Kabara cut, attaching the chip through an inch-wide opening on the left side of Hemmes’ skull.

Two days later, Hemmes was hooked to a computer, beginning simple cursor movements. The next week, it was time to test if he could trigger real-life movement using the DARPA arm.

Hemmes reclined in his wheelchair, the robot arm bolted to a steel rod nearby. The task: make the arm reach out to grasp a ball mounted on a board.

The arm whirs forward, then stops, then goes again, then suddenly pulls back.

“It’s doing the opposite of what I ask it to do,” Hemmes says in frustration. “When I think about reaching back, it goes forward.”

Dr. Wei Wang, a member of the research team, watches Hemmes’ brain patterns on a nearby computer screen, trying to match them to the robotic movements. Focus on your elbow, Wang advises.

Hemmes takes a deep breath and tries. The arm whirs forward this time, reaching the ball. The fingers clench around it.

“There’s no owner’s manual,” Hemmes says, thrilled that the back-and-forth pays off. “I’m training my brain to figure out how to do all this.”

Letting go is harder, the motor growling as the arm tugs backward before the fingers fully release. Hemmes starts imagining his hand relaxing before pulling backward, and the robot hand follows.


Sure, a robotic hand that one day mounts to a wheelchair could be useful. But no matter how well today’s prosthetics move, they’ve got a problem: They don’t sense what they touch. Normally, instant messages flash from the skin up to the brain to say “squeeze tighter” so we don’t drop that coffee cup, or “tight enough” so we don’t hug too hard.
