National Geographic

Brain-training games get a D at brain-training tests

You don’t have to look very far to find a multi-million-pound industry supported by the scantiest of scientific evidence. Take “brain-training”, for example. This fledgling market purports to improve the brain’s abilities through the medium of number problems, Sudoku, anagrams and the like. The idea seems plausible and it has certainly made bestsellers out of games like Dr Kawashima’s Brain Training and Big Brain Academy. But a new study by Adrian Owen of the Medical Research Council’s Cognition and Brain Sciences Unit in Cambridge casts doubt on the claims that these games can boost general mental abilities.

Owen recruited 11,430 volunteers through a popular science programme on the BBC called “Bang Goes the Theory”. He asked them to play several online games, each intended to improve an individual skill, be it reasoning, memory, planning, attention or spatial awareness. After six weeks, with each player training on the games several times per week, Owen found that the games improved performance in the specific trained task, but not in any others.

That may seem like a victory but it’s a very shallow one. You would naturally expect people who repeatedly practise the same types of tests to eventually become whizzes at them. Indeed, previous studies have found that such improvements do happen. But becoming the Yoda of Sudoku doesn’t necessarily translate into better all-round mental agility, and that’s exactly the sort of boost that the brain-training industry purports to provide. According to Owen’s research, it fails.

All of his recruits sat through a quartet of “benchmarking” tests to assess their overall mental skills before the experiment began. The recruits were then split into three groups who spent the next six weeks doing different brain-training tests on the BBC Lab UK website, for at least 10 minutes a day, three times a week. For any UK readers, the results of this study will be shown on BBC One tomorrow night (21 April) on Can You Train Your Brain?

The first group faced tasks that taxed their reasoning, planning and problem-solving abilities. The second group’s tasks focused on short-term memory, attention, visual and spatial abilities and maths (a set that were closest in scope to those found in common brain-training games). Finally, the third group didn’t have any specific tasks; instead, their job was to search the internet for the answers to a set of obscure questions, a habit that should be all too familiar to readers of this blog. In each case, the tasks became more difficult as the volunteers improved, so that they presented a constantly shifting challenge.
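
The adaptive element is the one genuinely algorithmic detail here. As a rough illustration, difficulty can be held at the edge of a player’s ability with a simple one-up/one-down “staircase”. The sketch below shows that general approach; the post doesn’t describe the study’s actual rules, so the stepping scheme is an assumption and `solve_trial` is a hypothetical callback.

```python
# A minimal sketch of adaptive difficulty, assuming a simple
# one-up/one-down "staircase". The study's actual adaptive rules
# aren't described in this post, so treat this as an illustration
# of the general approach, not the researchers' implementation.

def run_adaptive_session(trials, solve_trial, level=1, min_level=1):
    """Adjust difficulty so the task stays at the edge of ability.

    `solve_trial(level)` is a hypothetical callback that returns
    True if the player answered a trial at `level` correctly.
    """
    history = []
    for _ in range(trials):
        correct = solve_trial(level)
        history.append((level, correct))
        # Step up after a success, down after a failure, keeping
        # the task pinned near the player's current limit.
        level = level + 1 if correct else max(min_level, level - 1)
    return history
```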

After their trials, all of the volunteers redid the four benchmarking tests. If their six weeks of training had improved their general mental abilities, their scores in these tests should have gone up. They did, but the rises were unspectacular to say the least. The effects were tiny and the third group who merely browsed for online information “improved” just as much as those who did the brain-training exercises (click here for raw data tables).

By contrast, all of the recruits showed far greater improvements on the tasks they were actually trained in. They could have just become better through repetition or they could have developed new strategies. Either way, their improvements didn’t transfer to the benchmarking tests, even when those were very similar to the training tasks. For example, the first group were well practised at reasoning tasks, but they didn’t do any better at the benchmarking test that involved reasoning skills. Instead, it was the second group, whose training regimen didn’t explicitly involve any reasoning practice, who ended up doing better in this area.

Owen chose the four benchmarking tests because they’ve been widely used in previous studies and they are very sensitive. People achieve noticeably different scores after even slight degrees of brain damage or low doses of brain-stimulating drugs. If the brain-training tests were improving the volunteers’ abilities, the tests should have reflected these improvements.

You could argue that the recruits weren’t trained enough to make much progress, but Owen didn’t find that the number of training sessions affected the benchmarking test scores (even though it did correlate with their training task scores). Consider this: one of the memory tasks was designed to train volunteers to remember longer strings of numbers. At the rate they were going, they would have taken four years of training to remember just one extra digit!
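
For the curious, that four-year figure is a straightforward extrapolation. Here is a minimal sketch of the arithmetic, using a hypothetical per-session gain chosen only to reproduce the number quoted above; the post doesn’t give the paper’s actual rate.

```python
# Back-of-envelope extrapolation of the digit-span result. The
# per-session gain below is hypothetical, picked to match the
# four-year figure; the post doesn't quote the paper's actual rate.

gain_per_session = 1 / 624   # digits gained per session (hypothetical)
sessions_per_week = 3        # the study's minimum training schedule

sessions_needed = 1 / gain_per_session            # sessions for +1 digit
years = sessions_needed / (sessions_per_week * 52)
print(f"{years:.1f} years to remember one extra digit")  # -> 4.0
```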

You could also argue that the third group who “trained” by searching the internet were also using a wide variety of skills. Comparing the others against this group might mask the effects of brain training. However, the first and second groups did show improvements in the specific skills they trained in; they just didn’t become generally sharper. And Owen says that the effects in all three groups were so small that even if the control group had sat around doing nothing, the brain-training effects still would have looked feeble by comparison.

These results are pretty damning for the brain-training industry. As Owen neatly puts it, “Six weeks of regular computerized brain training confers no greater benefit than simply answering general knowledge questions using the internet.”

Is this the death knell for brain training? Not quite. Last year, Susanne Jaeggi from the University of Michigan found that a training programme could improve overall fluid intelligence if it focused on improving working memory: our ability to hold and manipulate information in a mental notepad, such as adding up the prices on a bill. People who practised this task did better at tests that had nothing to do with the training task itself.
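
To give a flavour of what training working memory looks like in practice, here is a minimal sketch of an n-back trial loop, a classic working-memory task of the kind used in training studies like Jaeggi’s. Whether her programme used exactly this variant is my assumption, not something stated in the post.

```python
# A minimal sketch of a single n-back run, a classic working-memory
# task: the player must say whether the current item matches the one
# shown n steps earlier, forcing them to hold and constantly update
# items in memory. Whether Jaeggi's programme used exactly this
# variant is an assumption; this only illustrates the task family.

import random

def play_n_back(n=2, trials=10, letters="ABC"):
    stream, hits = [], 0
    for t in range(trials):
        item = random.choice(letters)
        stream.append(item)
        target = t >= n and stream[t - n] == item   # true n-back match?
        answer = input(f"{item}  match {n}-back? [y/n] ").strip() == "y"
        hits += answer == target                    # count correct calls
    print(f"{hits}/{trials} correct")

# play_n_back()  # uncomment to try a 2-back run in a terminal
```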

So some studies have certainly produced the across-the-board improvements that Owen failed to find. An obvious next step would be to identify the differences between the tasks used in the two studies, and to work out why one succeeded where the other failed.

Reference: Owen et al. (2010) Putting brain training to the test. Nature. http://dx.doi.org/10.1038/nature09042

There are 14 comments.

  1. Henry Mahncke
    April 20, 2010

    The interpretation of this study is astonishingly over-reaching. The BBC researchers designed their own cognitive stimulation program, applied it at a very low intensity in healthy young people, and then saw no effect on cognitive function. To then claim that their data shows that “computerized mental workouts don’t boost mental skills” is akin to saying “sugar doesn’t help with a headache, and sugar and aspirin are both molecules, so aspirin must not help with headaches either.” This is an elementary logical fallacy.

    There is a tremendous body of published evidence showing that certain specifically designed cognitive training programs drive real benefits:
    http://www.ncbi.nlm.nih.gov/pubmed/19220558
    http://www.ncbi.nlm.nih.gov/pubmed/17565162

    If the BBC researchers want to contest those results, they should use the programs and methods used in those studies. This is a fundamental principle of the scientific method, and it’s surprising to have to point it out to working scientists.

    The only conclusion from the BBC study is that very limited amounts of everyday cognitive stimulation do not improve cognitive function. This is an interesting conclusion, and the study should have reported it as such.

    Henry Mahncke
    http://www.positscience.com
    I am a researcher at Posit Science, where I design and test cognitive training programs.

  2. dearieme
    April 20, 2010

    “how did this research end up getting published in peer-reviewed Nature?”
    You’re kidding – it’s carried lots of Global Warming tripe, hasn’t it?

  3. Ed Yong
    April 21, 2010

    I’ve spammed the comments from Joe H and Damian W on the grounds that the identical IP addresses and virtually identical content of the comments suggest sockpuppetry. I’m not having any of that, although (as Henry’s comment shows) I’m more than happy for people to criticise the study.

  4. S Noel
    April 21, 2010

    The studies linked to by Henry Mahncke seem to suggest that brain training may be effective in older people with pre-existing cognitive deficits.

    That is certainly not how such things are marketed (and I can’t find it on the Posit site either), which I think is what this study shows.

  5. Jody Peake
    April 21, 2010

    I work with CogniFit, a brain fitness training company. Recently, a study was conducted that compared the cognitive improvement of people who used CogniFit’s proven methodology against people who just played challenging computer games. Only the people who used CogniFit brain training, which uses advanced technology to tailor training to individual needs, showed significant improvement in their cognitive skills. The BBC study reinforces those findings: one size does not fit all and just playing games does not help to significantly develop cognitive abilities. Here is a link to the result of the CogniFit study: http://ow.ly/1AYYZ

  6. Ed Yong
    April 21, 2010

    Sigh. Okay, here are a couple of tips for interacting with the blogosphere, dear PR person.

    Don’t come here using phrases like “proven methodology” and expect to be taken seriously.

    Don’t tell us that you’re linking to the results of the study and only link to a press release. That is completely useless to me and my readers. Has this work actually been published anywhere, or would you like us to take your word for it?

  7. Melody
    April 22, 2010

    Hi Ed,

    I just wanted to say thanks for posting this article, which I found well-balanced and full of information. Thanks also to Henry Mahncke for adding some more relevant links, concerning times when brain training does work.

    I’m a textile artist who is doing a residency with a neuroscientist from the Queensland Brain Institute in Australia. The residency’s theme is neurogenesis (cell birth) and apoptosis (cell death) in Alzheimer’s disease: my neuroscience colleague works on studies with mice that, I understand, show that learning new skills can improve outcomes in Alzheimer’s (note: not prevent or cure the disease). My artworks are based on Adam’s microscope images of mouse brains.

    I put a link to your post on our blog today: http://cultureatwork-hamlin-lord.blogspot.com/2010/04/using-your-brain.html

    We’d love it if you would pop over and take a look at our project!

  8. MW
    April 22, 2010

    A TV show getting published in Nature – wow.

    dearieme: without rising to the global warming bait, I’d point out that Nature and Science both have a reputation for caring more about whether the paper is exciting than whether it is correct.

    Personally, I wouldn’t consider using such software unless I was feeling at risk for dementia, so I’d be more interested in the results from an at-risk-of-dementia population.

    A story I’ve heard (i.e. an urban legend): there was a professional chess player who at his peak would look 8 moves ahead. As he aged, his performance deteriorated, and he was quite upset that he could now only look 4 moves ahead, but he kept playing chess frequently. Other than that, he seemed just fine mentally. When he died, an autopsy was performed, and they found his brain so riddled with Alzheimer’s that the doctor said he shouldn’t have been able to remember his own name.

    Can anyone confirm (or refute) this story?

    Another ‘I heard it somewhere but don’t remember where’: research shows that teaching kids music improves their maths ability. This supports the idea that training in one mental area can transfer to others, although I think there is a well-established (but mysterious to me) link between maths and music.

  9. Frank Norman
    April 22, 2010

    Some commenters are referring to the “BBC study”. This is incorrect. The research was led by Adrian Owen, at the Medical Research Council’s Cognition and Brain Sciences Unit.

  10. Rich
    April 22, 2010

    ’scuse me, but you provide a link to “raw data tables” and yet the tables linked are all derived statistics. Unless you meant the “raw data” behind the graphs, but it doesn’t say that.

    Just saying.

  11. Tiffany
    April 22, 2010

    Please make sure you have all the facts before you write brain training off as something that doesn’t work. There are different types of brain training, and certain requirements and environments are necessary for making it work; these studies were a bit rash and did not take all the information into account.

    LearningRx helps kids learn every day – using brain training. Our brain training really does work. To read a professional’s response to the recent Nature journal study go here: http://www.learningrxblog.com/nature-journal-brain-training-study/

    Thanks!

  12. Sebastian Dieguez
    April 25, 2010

    @8. MW

    Not an urban legend: the chess player case was published in Neurocase (2005, Vol. 11(1): 26–31). See http://www.informaworld.com/smpp/content~content=a713734399&db=all.

  13. Chris
    May 27, 2010

    There is a big difference between serious brain training programs based on peer-reviewed research and casual brain games that have no scientific validation. Frankly, the market hasn’t done a very good job of explaining what a real brain training program is, including the need for scheduled blocks of training time. Here is a site that helps to separate validated brain training programs from casual brain games: http://www.braingamereview.com

  14. Nicolas Tartaglione
    June 8, 2010

    I love your articles. I’m really into chess these days; I used to be, and I’m starting to get back into it.
