Brain Games Are Bogus
The answer, however, now appears to be a pretty firm no—at least, not through brain training. A pair of scientists in Europe recently gathered all of the best research—twenty-three investigations of memory training by teams around the world—and employed a standard statistical technique (called meta-analysis) to settle this controversial issue. The conclusion: the games may yield improvements in the narrow task being trained, but this does not transfer to broader skills like the ability to read or do arithmetic, or to other measures of intelligence. Playing the games makes you better at the games, in other words, but not at anything anyone might care about in real life.
Over at the Cogmed Web site, though, it looks like lives are being transformed. A beaming child sits at a desk, pencil in hand, next to a quote extolling the results at a private school in Jacksonville, Florida. Cogmed training is helpful for all ages, from “young children to senior adults,” but is of particular interest to people with “diagnosed attention deficits” or “brain injury,” or those who “feel the deteriorating effects of normal aging” or those who “find they’re not doing as well as they could, academically or professionally.” The training is a method to “effectively change the way the brain functions to perform at its maximum capacity.” Cogmed is operating in more than a thousand schools worldwide, more than a hundred of which are in the U.S. In January, Cogmed launched a major push into American schools, for which it charges up to three hundred dollars per child.
Cogmed and the other companies stake their claims on “working memory,” the ability to keep information in the focus of conscious attention, despite distractions—mental juggling, in other words. There is powerful, widely accepted evidence that working memory plays an important role in everything from reading ability and problem-solving to reasoning and learning new skills. (It also seems to help with musical sight-reading and proficiency at Texas hold ’em.) And problems with working memory play a role in A.D.H.D., which has become an American fixation. Working memory is also closely related to “executive function,” the brain’s ability to make a plan and stick with it, an active and fruitful area of psychology with broad social implications. Many psychologists consider working memory to be a core component of general intelligence. People who score highly on intelligence tests also tend to perform well on working-memory tests.
The experiments by Klingberg and others suggested that working memory could be markedly increased through training, the same way that sit-ups create stronger abs—and, more important, that the training could bring broad benefits, the way weight training can make a person a better all-around athlete. In Klingberg’s first experiment, published in 2002, he recruited students with A.D.H.D. and gave them Raven’s Progressive Matrices, a test of non-verbal reasoning that is used to measure intelligence. He then gave them regular working-memory workouts, increasing the difficulty of the games as they improved by giving them more to remember. After several weeks of training, he gave the kids the Raven’s again and, he reported, they performed significantly better. He then found the same results in young adults without A.D.H.D. The studies were small, but gradually other psychologists entered the field, and, in 2008, the psychologist Susanne Jaeggi reported an even more electric result: working-memory training definitively increased intelligence, with more training bringing larger gains. Her data implied that a person could boost his or her I.Q. by a full point per hour of training.
Over the last year, however, the idea that working-memory training has broad benefits has crumbled. One group of psychologists, led by a team at Georgia Tech, set out to replicate the Jaeggi findings, but with more careful controls and seventeen different cognitive-skills tests. Their subjects showed no evidence whatsoever of improvement in intelligence. They also identified a pattern of methodological problems with experiments showing positive results, like poor controls and a reliance on a single measure of cognitive improvement. This failed replication was recently published in one of psychology’s top journals, and another, by a group at Case Western Reserve University, has been published since.
The recent meta-analysis, led by Monica Melby-Lervåg, of the University of Oslo, and also published in a top journal, is even more damning. Some studies are more convincing than others because they include more subjects, and so estimate the size of an effect more precisely. Melby-Lervåg’s paper laboriously accounts for this, incorporating what Jaeggi, Klingberg, and everyone else had reported. The meta-analysis found that the training isn’t doing anyone much good. If anything, the scientific literature tends to overstate effects, because teams that find nothing tend not to publish their papers. (This is known as the “file-drawer” effect.) A null result from a meta-analysis, published in a top journal, sends a shudder through the spine of all but the truest of believers. In the meantime, a separate paper by some of the Georgia Tech scientists looked specifically at Cogmed’s training, which has been subjected to more scientific scrutiny than any other program. “The claims made by Cogmed,” they wrote, “are largely unsubstantiated.”
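The pooling step a meta-analysis performs can be sketched as an inverse-variance weighted average: each study’s effect size counts in proportion to its precision, so large, careful studies dominate small, noisy ones. The sketch below uses made-up numbers, not Melby-Lervåg’s data—just an illustration of how a few small studies with big effects can be outweighed by one large study finding almost nothing.

```python
# Fixed-effect meta-analysis sketch: pool standardized effect sizes
# (e.g., Cohen's d) by weighting each study by the inverse of its variance.
# All numbers here are hypothetical, for illustration only.

def pooled_effect(effects, variances):
    """Return the pooled effect size and its 95% confidence interval."""
    weights = [1.0 / v for v in variances]          # precise studies get big weights
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total
    se = (1.0 / total) ** 0.5                       # standard error of pooled estimate
    return mean, (mean - 1.96 * se, mean + 1.96 * se)

# Two small studies report sizable gains; one large study reports almost none.
effects   = [0.60, 0.45, 0.05]   # standardized mean differences
variances = [0.20, 0.15, 0.01]   # small variance = large, precise study

mean, (lo, hi) = pooled_effect(effects, variances)
print(f"pooled d = {mean:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Here the pooled estimate lands near zero and its confidence interval crosses zero—a null result, despite two apparently encouraging small studies.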
In a conference call, several Cogmed executives told me that they did not accept the conclusions, saying that the various scientists had unfairly overlooked good evidence in support of Cogmed’s regimen. They cited, as one example, Melby-Lervåg’s decision not to consider brain-imaging studies, which they believe offer additional evidence of neurological improvements that take effect after people play their games. “There is a lot of research excluded, almost to the point where it seems like the research is designed to reach a particular conclusion,” said Travis Millman, vice-president and general manager of Cogmed.
Yet to understand whether a student will be more effective in a classroom, it is only logical to rely on direct measures of the student’s capabilities rather than on neuroimaging studies showing which parts of the brain are active in a lab. Melby-Lervåg’s criteria—randomized trials with suitable controls and well-designed post-training tests—would strike most psychologists as entirely reasonable. Cogmed’s representatives also told me that they had seen first-hand how much of a difference it could make. Yet anecdotal clinical evidence is notoriously unreliable. (In a variation of the placebo effect, when people participate in training programs, they genuinely believe they are getting better, whether or not that is true.) Cogmed has also published two responses to the scientific criticism on its Web site, both containing similar kinds of sophistry. When I reached Klingberg in Sweden, he told me that the Melby-Lervåg paper used a “low scientific standard”—a rather stunning charge, given that the research appeared in one of the field’s best peer-reviewed journals. But then I read him parts of a note I had obtained, sent by Cogmed to school psychologists in the U.S., pitching its program’s benefits. “Working memory plays a key role in learning as it is crucial for reading comprehension, math, test-taking, following instructions, and understanding and retaining new information,” the note read. Klingberg, who is a paid scientific consultant at Cogmed, laughed uncomfortably and admitted it wasn’t fair to imply that Cogmed training would help with all those things. He has made suggestions to Cogmed about making the marketing more accurate, he told me, but, “I am not comfortable with everything that is said.”
Melby-Lervåg first became interested in working-memory training because she works with children who have learning disabilities, and she knew their parents were signing up for Cogmed. Pearson, the education giant that owns Cogmed, is a respected name that is strongly associated with education. “Since they work with children with learning disabilities, they have a responsibility to market programs that are evidence-based,” she said. “It’s unethical.”
The responsibility is so heavy because the needs are so great. Many people who have suffered brain trauma are haunted by a feeling of diminishment and a frustration that they can’t do more to help themselves. There are millions of children with learning disabilities who feel lost and ashamed. And then there are all the seniors who struggle with mental dissipation. These are the customers.
And, really, what’s the harm? Working-memory training doesn’t do any damage, one could argue. But that’s a dangerous and naïve view, argues Zach Hambrick, who was involved in the Georgia Tech study and is an associate professor of psychology at Michigan State University. “If you are doing brain training for ten hours a week, that is ten hours a week you are not doing something else, like exercising,” Hambrick said. “It also gives people false hope, especially older adults for whom this is a big concern. What if they do this and they don’t see any benefits? What do you think? You think, ‘There must be something wrong with me,’ or ‘I am a lost cause.’ ”