Monkeys grab and feel virtual objects with thoughts alone (and what this means for the World Cup)

By Ed Yong
October 05, 2011

This is where we are now: at Duke University, a monkey controls a virtual arm using only its thoughts. Miguel Nicolelis has fitted the animal with a headset of electrodes that translates its brain activity into movements. It can grab virtual objects without using its arms. It can also feel the objects without its hands, because the headset stimulates its brain to create the sense of different textures. Monkey think, monkey do, monkey feel – all without moving a muscle.

And this is where Nicolelis wants to be in three years: a young quadriplegic Brazilian man strolls confidently into a massive stadium. He controls his four prosthetic limbs with his thoughts, and they in turn send tactile information straight to his brain. The technology melds so fluidly with his mind that he runs up and delivers the opening kick of the 2014 World Cup.

This sounds like a far-fetched dream, but Nicolelis – a big soccer fan – is talking to the Brazilian government to make it a reality. He has created an international consortium called the Walk Again Project, consisting of non-profit research institutions in the United States, Brazil, Germany and Switzerland. Their goal is to create a “high performance brain-controlled prosthetic device that enables patients to finally leave the wheelchair behind.”

But for the moment, it’s still all about monkeys. Nicolelis has spent the last few years developing ever better “brain-machine interfaces” – devices that allow individuals to directly control a machine using their brains. In 2003, he showed that monkeys could steer a virtual arm using electrodes in their brains. Since then, other groups have shown that monkeys can use similar devices to feed themselves with robotic arms.

These machines all lacked something important – a sense of touch. “In the absence of feedback from the arm, the most mundane activities of daily living are slow and clumsy, and require herculean concentration and effort,” says Sliman Bensmaia, who works in the same area. Without touch, the prosthetic is just a glorified hook. Some devices can provide crude sensations (think about the rumble packs on video game controllers), but they really have to hook up to the brain itself to provide realistic feedback.

Two members of Nicolelis’ team, Joseph O’Doherty and Mikhail Lebedev, have started to solve this problem. They have developed a “brain-machine-brain interface” or BMBI, which sends signals to the brain as well as receiving them. Electrodes connect to a monkey’s motor cortex, the part of its brain that controls movements. They read the activity of the local neurons to decipher the monkey’s commands. The electrodes also stimulate the neighbouring somatosensory cortex, which governs the sense of touch. The brain talks to the machine, but the machine also talks to the brain.
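To make that two-way traffic concrete, here is a minimal sketch of one cycle of such a loop in Python. Everything named here is hypothetical: read_motor_cortex, stimulate_somatosensory, and the linear decoder weights are illustrative stand-ins, not the study’s actual recording or decoding pipeline.

```python
import numpy as np

# Hypothetical hardware hooks: stand-ins for a real recording/stimulation rig.
def read_motor_cortex():
    """Return spike counts from the motor-cortex electrodes (fake data here)."""
    return np.random.poisson(5.0, size=96)   # pretend 96 recording channels

def stimulate_somatosensory(pulse_hz):
    """Deliver a pulse train to somatosensory cortex (a no-op in this sketch)."""
    pass

# Toy linear decoder: real interfaces fit such weights from training data;
# these are random placeholders.
weights = np.random.randn(2, 96) * 0.01

# One cycle: brain -> machine (decode movement), machine -> brain (feed back touch).
arm_position = np.zeros(2)
spikes = read_motor_cortex()
arm_position += weights @ spikes          # decoded velocity moves the virtual arm
stimulate_somatosensory(pulse_hz=200)     # encode the touched object's "texture"
```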

Developing the BMBI was not easy. The biggest problem is that the incoming sensory signals interfere with the outgoing motor ones, like two people shouting at one another. “Nobody had been able to record and stimulate at the same time,” says Nicolelis. His solution was to exploit lulls in the conversation, by timing the incoming signals so they arrived between the spikes of the outgoing ones. “There’s a small window of time when you only block a very small portion of the signals from the brain,” says Nicolelis. “It worked as if there was no interference.”
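The trick is essentially time-division multiplexing: carve each cycle into a window for listening and a window for talking, and only decode from the clean samples. Here is a toy sketch of that scheduling idea; the 50/50 ms split and function names are invented for illustration, not taken from the paper.

```python
# Toy interleaving of recording and stimulation. The point is that
# stimulation pulses land in gaps between the samples used for decoding,
# so their artefacts never corrupt the motor signal being read out.
CYCLE_MS = 100   # invented cycle length
RECORD_MS = 50   # invented recording window within each cycle

def run_cycle(t_ms, record, stimulate):
    """Run one time slice: record during the first part of each cycle,
    stimulate during the rest, and discard anything sensed while stimulating."""
    if t_ms % CYCLE_MS < RECORD_MS:
        return record()      # clean, artefact-free samples for the decoder
    stimulate()              # tactile feedback pulses go out in the gap
    return None
```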

O’Doherty and Lebedev fitted the BMBIs to two monkeys, whose names – M and N – bear the ring of British intelligence services. Both animals learned to explore three objects with a virtual arm. At first, they controlled the arm with a joystick: M mastered the task within four training sessions, N took nine, and both improved with practice. Eventually, both learned to steer the arm by thought alone, without moving their hands.

Meanwhile, the electrodes fed their brains with signals that made the objects feel different. The monkeys used this textured information to pick the one that would earn them a tasty reward. This bit was easier than you might imagine. The somatosensory cortex doesn’t assign neurons to specific textures. Instead, it computes what we feel by analysing patterns of stimulation across large fields of neurons. “There are no specific nerves,” says Nicolelis. “You just give the signals to a general area of neurons and the brain figures it out.”
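In that spirit, a “texture” can be nothing more than a distinct temporal pattern of pulses delivered through the same electrodes, which the animal learns to tell apart. A sketch of that encoding follows; the pulse frequencies and packet structure are invented for illustration, and only the principle – different pulse-train patterns delivered to the same patch of cortex – comes from the study.

```python
# Toy texture encoding: same electrodes, different pulse-train patterns.
# All numbers here are made up for illustration.
TEXTURES = {
    "rewarded":   {"pulse_hz": 200, "burst_ms": 50, "gap_ms": 50},
    "unrewarded": {"pulse_hz": 400, "burst_ms": 25, "gap_ms": 75},
}

def pulse_times(texture, duration_ms=500):
    """Return pulse onset times (ms) for one stimulation epoch of a texture."""
    p = TEXTURES[texture]
    period = 1000.0 / p["pulse_hz"]   # interval between pulses within a burst
    times, t = [], 0.0
    while t < duration_ms:
        burst_end = t + p["burst_ms"]
        while t < burst_end and t < duration_ms:
            times.append(round(t, 2))
            t += period
        t = burst_end + p["gap_ms"]   # silent gap before the next packet
    return times
```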

In this case, O’Doherty and Lebedev weren’t trying to create any specific textures; they just wanted to simulate different ones. For a working prosthetic, Bensmaia thinks that they will need “more sophisticated algorithms for sensory feedback”. He says, “The trick is to give the brain information that it can use in an intuitive way.” Ideally, that information would closely match the sensations evoked by an actual limb. “Otherwise, patients will receive a barrage of signals from the arm which may serve to confuse rather than assist them.”

Now that the BMBI has passed its test with a virtual arm, the next step is surely to try it with a physical one, before moving on to human trials. The potential applications are vast. Amputees and paralysed people could gain full and intuitive control of artificial limbs. Everyone could control technology with a thought. As O’Doherty and Lebedev write, “We propose that BMBIs can effectively liberate a brain from the physical constraints of the body.” But for the moment, the team have their World Cup target firmly in mind. “I think it is plausible,” says Bensmaia. “Of course, a lot of things would have to fall in place for that to happen.”

Reference: O’Doherty, Lebedev, Ifft, Zhuang, Shokur, Bleuler & Nicolelis (2011). Active tactile exploration using a brain–machine–brain interface. Nature. http://dx.doi.org/10.1038/nature10489

Image: It’s by Katie Zhuang from Nicolelis’s lab. Isn’t it great? TRON monkey!
