A Blog by Virginia Hughes

What Americans Don’t Get About the Brain’s Critical Period

On April 17, 1997, Bill and Hillary Clinton organized a one-day meeting with a long and lofty title: The White House Conference on Early Childhood Development and Learning: What New Research on the Brain Tells Us About Our Youngest Children.

The meeting featured eight-minute presentations from experts in public policy, education and child development, and one neuroscientist. They discussed, among other things, how 6-month-old infants learn to discriminate the sounds of their native language, and how, if a kitten’s eye is patched during early development — and therefore deprived of light inputs — it will go permanently blind in that eye, even after the patch comes off. The First Lady gave the gist of the meeting in her opening remarks: The first three years of life, she reportedly said, “can determine whether children will grow up to be peaceful or violent citizens, focused or undisciplined workers, attentive or detached parents themselves.”

Behind the hyperbole of that statement is an important idea based in solid science. The first few years of life are a “critical period” for brain development, during which experiences — strong parental attachments, exposure to written and spoken language, social interactions — sculpt brain circuits in a way that’s difficult to un-sculpt. When a developing brain isn’t adequately stimulated, as often happens to children living in poverty, for example, or in the foster care system, this deprivation can lead to problems in cognition, attention and social behaviors.

The conference spurred a media frenzy. “Suddenly, every magazine and newspaper is saying, ‘Oh my god, life ends at the age of 3 when the critical period ends,’” recalls Charles Nelson, a developmental neuroscientist at Harvard. You might think that all the attention on critical periods would have led to more research on disadvantaged children. To some extent, it did (more on that later), but the vast majority of the public discussion went toward the other end of the socioeconomic spectrum.

America’s new obsession with the critical period launched a cottage industry of educational materials — such as the well-known Baby Einstein/Baby Mozart brand — which were endorsed and distributed by several nonprofits and state governments. The governor of Georgia, Zell Miller, even convinced hospitals to give classical music cassette tapes to all new parents, with instructions to play them for their newborns. All of it was based on an untested premise: If depriving a baby during the critical period leads to terrible psychological outcomes, then giving her extra stimulation (or an “enriched environment”) should lead to a super-duper brain.

There was (and still is) little science to back this up, and Nelson and other researchers did the best they could to clear up misconceptions. A few years ago, after a study came out showing that children who watch Baby Einstein videos actually do worse at learning words, the company stopped claiming its materials have educational benefits*. Yet many parents are still being told that enriched environments — whether colorful mobiles, “sensorial materials,” car seat galleries, or horseback riding lessons — spur early brain development.

Infants in Romanian orphanages are adequately fed and housed, but receive little attention from their adult caretakers.

That well-intentioned parents may be wasting money on a lot of shiny toys doesn’t exactly keep me awake at night. What’s disappointing is that the enrichment meme seems to have overshadowed the real lesson of the research on critical periods: that poverty and child neglect often have devastating and long-lasting effects on the brain.

Two studies published in the past week, for example, have shown that children who experience severe neglect, abuse, or injury in childhood (even after age 3, by the way) have abnormal brain wiring when they hit adolescence.

The first report, published by Nelson and his colleagues in the Proceedings of the National Academy of Sciences, was part of a 12-year study tracking the fates of 136 Romanian orphans, some of whom were raised in state-run institutions and others in foster care families. Around age 8, children who grew up in institutions had less white matter, the tissue that links up different brain regions, than those raised in families, the study found.

Some may think of this as an unfortunate, though unsurprising reality of life in a post-Communist country — a tragic story for Romania, but not particularly relevant to us here in the U.S. Not true. Nelson points out that the defining element of institutional living, the absence of invested caregivers, is also what happens to many children in poverty. The work in Romania, he says, “is a wake-up call to the millions of children in the U.S. who are living in circumstances that are only marginally better than kids living in institutions.”

This idea is bolstered by the second new study, published yesterday in Neuropsychopharmacology. Researchers in Texas scanned the brains of adolescents who had experienced neglect or abuse before age 10. These kids had weaker white matter tracts in adolescence compared with peers who didn’t experience early adversity. What’s more, the adolescents with deficits in brain connectivity were more likely to be dealing with depression or substance abuse five years later.

When it comes to the reception of this research in yuppie circles, how much responsibility falls on science journalists? Whenever I pitch a story, an editor is bound to ask me about its relevance, or “take-home message,” for the publication’s readers. This is legitimate; of course I want my readers to be interested. But I suspect that sometimes, in framing a story for a targeted demographic, its message for those outside the bubble is lost. I can’t help but use the obvious metaphor. There seems to have been a critical period for reporting on critical period research, and the misinterpretations sculpted during that window are difficult to un-sculpt.

*Baby Einstein contests the claims of this study.

Photos courtesy of Charles Nelson

This post was originally published on The Last Word on Nothing