A Blog by Ed Yong

Monkeys grab and feel virtual objects with thoughts alone (and what this means for the World Cup)

It's a ninja monkey that fires energy blasts... what could possibly go wrong?

This is where we are now: at Duke University, a monkey controls a virtual arm using only its thoughts. Miguel Nicolelis has fitted the animal with implanted electrodes that translate its brain activity into movements. It can grab virtual objects without using its arms. It can also feel the objects without its hands, because the electrodes stimulate its brain to create the sense of different textures. Monkey think, monkey do, monkey feel – all without moving a muscle.
And this is where Nicolelis wants to be in three years: a young quadriplegic Brazilian man strolls confidently into a massive stadium. He controls his four prosthetic limbs with his thoughts, and they in turn send tactile information straight to his brain. The technology melds so fluidly with his mind that he runs up and delivers the opening kick of the 2014 World Cup.

This sounds like a far-fetched dream, but Nicolelis – a big soccer fan – is talking to the Brazilian government to make it a reality. He has created an international consortium called the Walk Again Project, consisting of non-profit research institutions in the United States, Brazil, Germany and Switzerland. Their goal is to create a “high performance brain-controlled prosthetic device that enables patients to finally leave the wheelchair behind.”

But for the moment, it’s still all about monkeys. Nicolelis has spent the last few years developing ever better “brain-machine interfaces” – devices that allow individuals to directly control a machine using their brains. In 2003, he showed that monkeys could steer a virtual arm using electrodes in their brains. Since then, other groups have shown that monkeys can use similar devices to feed themselves with robotic arms.

These machines all lacked something important – a sense of touch. “In the absence of feedback from the arm, the most mundane activities of daily living are slow and clumsy, and require herculean concentration and effort,” says Sliman Bensmaia, who works in the same area. Without touch, the prosthetic is just a glorified hook. Some devices can provide crude sensations (think about the rumble packs on video game controllers), but they really have to hook up to the brain itself to provide realistic feedback.

Two members of Nicolelis’ team, Joseph O’Doherty and Mikhail Lebedev, have started to solve this problem. They have developed a “brain-machine-brain interface” or BMBI, which sends signals to the brain as well as receiving them. Electrodes connect to a monkey’s motor cortex, the part of its brain that controls movements. They read the activity of the local neurons to decipher the monkey’s commands. The electrodes also stimulate the neighbouring somatosensory cortex, which governs the sense of touch. The brain talks to the machine, but the machine also talks to the brain.
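The loop described above – decode the motor cortex's command, move the arm, and feed texture back through stimulation – can be sketched in a few lines. This is a purely illustrative toy, not the lab's software: the decoder, the texture encoder, and the virtual objects here are all invented stand-ins.

```python
# A toy sketch of one cycle of a brain-machine-brain interface (BMBI).
# Every name and number here is hypothetical, for illustration only.

class VirtualArm:
    """Toy virtual arm: moving onto an object's position 'touches' it."""
    def __init__(self, objects):
        self.objects = objects        # maps position -> texture label
        self.position = 0

    def move(self, delta):
        self.position += delta
        return self.objects.get(self.position)   # texture if touching, else None

def decode(motor_spikes):
    """Stand-in decoder: net spike count becomes a movement command."""
    return sum(motor_spikes)

def encode_texture(texture):
    """Stand-in encoder: each texture maps to a distinct stimulation pattern."""
    patterns = {"rough": [1, 0, 1, 0], "smooth": [1, 1, 1, 1]}
    return patterns[texture]

def bmbi_cycle(motor_spikes, arm, stimulate):
    """Brain -> machine (decode, move), then machine -> brain (stimulate)."""
    touched = arm.move(decode(motor_spikes))
    if touched is not None:
        stimulate(encode_texture(touched))

delivered = []
arm = VirtualArm({2: "rough"})
bmbi_cycle([1, 1], arm, delivered.append)   # two spikes -> move two steps -> touch
```

The point of the sketch is only the shape of the loop: signals flow out of the motor cortex and back into the somatosensory cortex within a single cycle.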

Developing the BMBI was not easy. The biggest problem is that the incoming sensory signals interfere with the outgoing motor ones, like two people shouting at one another. “Nobody had been able to record and stimulate at the same time,” says Nicolelis. His solution was to exploit lulls in the conversation, by timing the incoming signals so they arrived between the spikes of the outgoing ones. “There’s a small window of time when you only block a very small portion of the signals from the brain,” says Nicolelis. “It worked as if there was no interference.”

O’Doherty and Lebedev fitted the BMBIs to two monkeys, whose names – M and N – bear the ring of British intelligence services. At first, the monkeys controlled a virtual arm with a joystick, using it to explore three objects. Both eventually learned to steer the arm without their hands, just by thinking about it: M mastered the technique within four training sessions, N took nine, and both became better over time.

Meanwhile, the electrodes fed their brains with signals that made the objects feel different. The monkeys used this textured information to pick the one that would earn them a tasty reward. This bit was easier than you might imagine. The somatosensory cortex doesn’t assign neurons to specific textures. Instead, it computes what we feel by analysing patterns of stimulation across large fields of neurons. “There are no specific nerves,” says Nicolelis. “You just give the signals to a general area of neurons and the brain figures it out.”

In this case, O’Doherty and Lebedev weren’t trying to create any specific textures; they just wanted to simulate different ones. For a working prosthetic, Bensmaia thinks that they will need “more sophisticated algorithms for sensory feedback”. He says, “The trick is to give the brain information that it can use in an intuitive way.” Ideally, that information would closely match the sensations evoked by an actual limb. “Otherwise, patients will receive a barrage of signals from the arm which may serve to confuse rather than assist them.”

Having successfully tested the BMBIs with a virtual arm, the next step is surely to test it with a physical one, before moving on to human trials. The potential applications are vast. Amputees and paralysed people could gain full and intuitive control of artificial limbs. Everyone could control technology with a thought. As O’Doherty and Lebedev write, “We propose that BMBIs can effectively liberate a brain from the physical constraints of the body.” But for the moment, the team have their World Cup target firmly in mind. “I think it is plausible,” says Bensmaia. “Of course, a lot of things would have to fall in place for that to happen.”

Reference: O’Doherty, Lebedev, Ifft, Zhuang, Shokur, Bleuler & Nicolelis (2011). Active tactile exploration using a brain–machine–brain interface. Nature. http://dx.doi.org/10.1038/nature10489

Image: It’s by Katie Zhuang from Nicolelis’s lab. Isn’t it great? TRON monkey!


10 thoughts on “Monkeys grab and feel virtual objects with thoughts alone (and what this means for the World Cup)”

  1. That is literally the most exciting thing I’ve read in months.

    It seems wildly optimistic for them to hope to accomplish this by 2014, but so? It’s better than “twenty years away,” like every other future technology is.

    Seriously, wow.

  2. So let me address that 2014 angle, because someone else mentioned it on G+.

    Typically, I ask people about how long it’ll take before their discovery yields a practical application and they wave their hands and say 5-10 years. That’s the standard answer, and it’s a bit boring. The reason why the 2014 angle stood out for me as being different, and the reason why I focused on it at the start, is that it’s incredibly specific. They’re not just saying 5-10 years, they’re actually nailing a specific event as their deadline. That’s very rare, and also very easily measurable. They’ll either do it or not. Also, they are specifically working towards it; they’re talking to the Government, they’ve got a kid in mind, and so on.

    So yes, it might be wildly optimistic, but certainly Nicolelis is betting that he can do it, and I’ve got one independent source who thinks it’s plausible. Now, let’s see.

  3. If he’s confident he’ll do it, all he’ll need is the money to get him there. I’m skeptical at the least, but wishing him the best of luck.

  4. The benefit of a specific target, and a specific goal behavior, is that a lot of “what if we…?” questions can be treated as distractions. They won’t have to program for a wide range of behaviors (ok, they do, but there is a far wider range they will *not* have to take into account), and can focus on the task at hand. I really like the specific goal, and by attaching it to the World Cup, they have a great chance of attracting the financial support (public and private) needed. Already having a kid in mind means custom-building, rather than one-size-fits-all or a series of different test subjects. The kid, also, can have the benefit of being part of the training process from day one.

    This is very do-able.

  5. This all sounds good, and probably with the best intentions, but let’s face it: is it going to be affordable when complete? My opinion would be no – only those with lots of money at their disposal will be able to afford it. Instead of creating new devices, how about making those already available more affordable? As a person in a wheelchair, in a country where the government insists on making money off the poor, I find it rather disturbing that this type of research keeps on. I am going to be in the same chair for the rest of my life because I can’t afford what they want for a better version. When are we, the poorer people, going to get any benefit from this?

  6. This is exciting to me on a personal level, having done some BCI work with humans and monkeys, but always strictly output-based (motor signals). I’ve always heard that sensory feedback is a whole different ball game 😉 and much more challenging, but clearly that’s changing. What’s also fascinating about a haptic feedback study like this is it brings up questions about subjectivity of perception, since the feedback is not equivalent to normal sensation…

    It’s definitely cause for optimism about the pace of development in this field, though experience tells me we probably can’t expect good quality, lasting prosthetic control across all users for a while yet. But proof of concept is half the battle. Kudos to Nicolelis et al, and nice post.

  7. It’s worth noting that human trials of 2 types of brain machine interfaces to control a prosthetic limb have already started at the University of Pittsburgh, under the direction of Andrew Schwartz and Michael Boninger.


    No results yet as far as I’m aware, and of course these two systems don’t incorporate sensory feedback, but it is another example of how rapidly this field is progressing.
