
The Little Boy Who Should’ve Vanished, but Didn’t

He was 12 years old. He was a slave. He’d had no schooling. He was too young, too unlettered, too un-European; he couldn’t have done this on his own. That’s what people said.

Picture of a drawing of an older man and a young boy facing a vanilla plant with their backs to the viewer
Drawing by Robert Krulwich

Edmond (he had no last name—slaves weren’t allowed them) had just solved a botanical mystery that had stumped the greatest botanists of his day. In the early 1800s he was a child on a remote island in the Indian Ocean, and yet, against overwhelming odds, Edmond would get credit for his discovery—and for the most surprising reasons. I want to tell you his story. So I’ll start here, with a plant.

Picture of a drawing of a vanilla plant
Drawing by Robert Krulwich

This is a vanilla plant (or my version of one). It’s a vine. It climbs, sometimes way high, and when it flowers and is visited by a pollinator, it produces a bunch of long, stringy beans. Properly treated, those beans give off the flavor we associate with vanilla.

Picture of a drawing of Anne of Austria holding a mug of hot chocolate
Drawing by Robert Krulwich

When Spanish explorers brought vanilla from Mexico, it was mixed with chocolate and became a classy sensation, fancied by kings, queens, and, pretty soon, everybody else. In his book Vanilla: Travels in Search of the Vanilla Orchid, journalist Tim Ecott reports that Anne of Austria, daughter of Philip III of Spain, drank it in hot chocolate. Madame de Pompadour, one of the great hostesses (and mistresses) of King Louis XV, flavored her soups with it.

Picture of Madame de Pompadour with a bowl of steaming soup in front of her
Drawing by Robert Krulwich

Francisco Hernandez, physician to King Philip II of Spain, called it a miracle drug that could soothe the stomach, cure the bite of a venomous snake, reduce flatulence, and cause “the urine to flow admirably.”

Picture of a drawing of a man peeing
Drawing by Robert Krulwich

And, best of all, it was a sexual picker-upper. Bezaar Zimmerman, a German physician, claimed in his treatise “On Experiences” (1762) that, “No fewer than 342 impotent men, by drinking vanilla decoctions, have changed into astonishing lovers of at least as many women.”

Picture of a drawing of a woman laying her head on the shoulder of a man standing next to a vanilla bottle
Drawing by Robert Krulwich

Demand, naturally, shot sky high. By the late 18th century, a ton of Mexican vanilla was worth, writes Ecott, “its weight in silver.”

With profit margins growing, a few plants were hustled out of Mexico to botanical gardens in Paris and London, then on to the East Indies to see if the plant would grow in Europe or Asia.

It grew, but it wouldn’t fruit, wouldn’t produce beans. Flowers would appear, bloom for a day, fold up, and fall off. With no beans, there could be no vanilla extract, and therefore nothing to sell. The plant needed a pollinator. In Mexico a little bee did the deed. Nobody knew how the bee did it.

Picture of a drawing of a bee saying 'Shhhhh'
Drawing by Robert Krulwich

What to do? In the 1790s people knew about plant sex. Bees, they knew, were pollinators.

If people could only figure out where vanilla’s sexual parts were hiding, they could become bee substitutes.

Enter the 12-Year-Old

They kept trying. One plantation owner, Ferréol Bellier-Beaumont, on the island of Réunion halfway between India and Africa, had received a bunch of vanilla plants from the government in Paris. He’d planted them, and one, only one, held on for 22 years. It never fruited.

The story goes that one morning in 1841, Bellier-Beaumont was walking with his young African slave Edmond when they came upon the surviving vine. Edmond pointed to a part of the plant, and there, in plain view, were two packs of vanilla beans hanging from the vine. Two! That was startling. But then Edmond dropped a little bomb: This wasn’t an accident. He’d produced those fruits himself, he said, by hand-pollination.

No Way

Bellier-Beaumont didn’t believe him—not at first. It’s true that months earlier the older man had shown Edmond how to hand-pollinate a watermelon plant “by marrying the male and female parts together,” but he’d had no success with vanilla. No one had.

But after his watermelon lesson, Edmond said he’d sat with the solitary vanilla vine and looked and probed and found the part of the flower that produced pollen. He’d also found the stigma, the part that needed to be dusted. And, most important, he’d discovered that the two parts were separated by a little lid, and he’d lifted the flap and held it open with a little tool so he could rub the pollen in.

Edmond had discovered the rostellum, the lid that many orchid plants (vanilla included) have, probably to keep the plant from fertilizing itself. Could you do it again, Bellier-Beaumont asked? And Edmond did.

This was news. Big news. Bellier-Beaumont wrote his fellow plantation owners to say Edmond had solved the mystery, then sent him from plantation to plantation to teach other slaves how to fertilize the vanilla vine.

And so the Indian Ocean vanilla industry was born.

In 1841, Réunion exported no vanilla. By 1848, it was exporting 50 kilograms (0.055 tons) to France; by 1858, two tons; by 1867, 20 tons; and by 1898, 200 tons. “By then,” Tim Ecott writes, “Réunion had outstripped Mexico to become the world’s largest producer of vanilla beans.”

Picture of a drawing of a graph showing vanilla exports from Reunion
Drawing by Robert Krulwich

The planters were getting rich. What, I wondered, happened to Edmond?

Well, he was rewarded. His owner gave him his freedom. He got a last name, Albius. Plus, his former owner wrote the governor, saying he should get a cash stipend “for his role in making the vanilla industry.”

The governor didn’t answer.

Edmond left his master and moved to town, and that’s when things went sour.

He fell in with a rough crowd, somehow got involved in a jewelry heist, and was arrested, convicted, and sentenced to five years in jail. His former owner again wrote the governor.

“I appeal to your compassion in the case of a young black boy condemned to hard labor … If anyone has a right to clemency and to recognition for his achievements, then it is Edmond … It is entirely due to him that this country owes [sic] a new branch of industry—for it is he who first discovered how to manually fertilize the vanilla plant.”

Picture of a drawing that says Entirely Due to Him
Drawing by Robert Krulwich

The appeal worked. Edmond was released. But what catches my eye here is Bellier-Beaumont’s choice of “entirely.” Our new vanilla business, he says, is “entirely” due to Edmond. He’s giving the former slave full credit for his discovery and retaining none for himself. That’s rare.

Then, all of a sudden, Edmond had a rival. A famous botanist from Paris—a scholar, a high official knighted for his achievements—announced in the 1860s that he, and not the slave boy, had discovered how to fertilize vanilla.

Picture of a drawing of a man with a beard holding a vanilla plant and looking suspicious
Drawing by Robert Krulwich

Jean Michel Claude Richard claimed to have hand-pollinated vanilla in Paris and then gone to Réunion in 1838 to show a small group of horticulturists how to do it. Little Edmond, he presumed, had been in the room, peeked, and then stolen the technique.

So here’s a prestigious scholar from the imperial capital asserting a claim against a 12-year-old slave from a remote foreign island. What chance did Edmond have?

Picture of a drawing of a young boy who was a slave facing off with an old scholarly French man
Drawing by Robert Krulwich

He was uneducated, without power, without a voice—but luckily, he had a friend. Once again, Edmond’s former master, Bellier-Beaumont, jumped into action, writing a letter to Réunion’s official historian declaring Edmond the true inventor. The great man from Paris, he said, was just, well, mis-remembering.

He went on to say that no one recalled Richard showing them how to fertilize orchids, but everybody remembered, four years later, Edmond teaching his technique to slaves around the island. Why would farmers invite Edmond to teach “if the process were already known?”

“I have been [Richard’s] friend for many years, and regret anything which causes him pain,” Bellier-Beaumont wrote, “but I also have my obligations to Edmond. Through old age, faulty memory, or some other cause, M. Richard now imagines that he himself discovered the secret of how to pollinate vanilla, and imagines that he taught the technique to the person who discovered it! Let us leave him to his fantasies.”

The letter was published. It’s now in the island’s official history. It survives.

Picture of an etching of Edmond Albius with the vanilla plant in his hands
Etching of an older Edmond Albius

And Yet, a Miserable End

Edmond himself never prospered from his discovery. He married, moved back to the country near Bellier-Beaumont’s plantation, and died in 1880 at age 51. A little notice appeared in the Moniteur, the local paper, a few weeks after he died. Dated Thursday, 26 August, 1880, it read: “The very man who at great profit to his colony, discovered how to pollinate vanilla flowers has died in the hospital at Sainte-Suzanne. It was a destitute and miserable end.” His long-standing request for an allowance, the obituary said, “never brought a response.”

The statue of Edmond in Réunion
Photograph courtesy of Yvon/Flickr

But a hundred years later, the mayor of a town on Réunion decided to make amends. In 1980 or so, a statue was built to honor Edmond. Writer Tim Ecott decided to take a look. He bought a bus ticket on the island’s “Vanilla Line,” rode to the stop marked “Albius,” got off, and there, standing by himself, was Edmond (in bronze or concrete? I can’t tell). He’s dressed, Ecott says, like a waiter, with a narrow bow tie and jacket. He’s not wearing shoes: Slaves weren’t allowed shoes or hats. But he’s got a street named after him, a school named after him. He has an entry on Wikipedia. He’s survived.

Picture of a drawing of a man with a beard holding a vanilla plant and looking sad
Drawing by Robert Krulwich

And the guy who tried to erase him from history, Richard? I looked him up. He also has a Wikipedia entry. It describes his life as “marred by controversy,” mentions his claim against Edmond, and concludes that “by the end of the 20th century,” historians considered the 12-year-old boy “the true discoverer.” So despite his age, poverty, race, and status, Edmond won.

This is such a rare tale. It shouldn’t be. But it is.

Editor’s Note: This post has been updated to correctly reflect the title of Tim Ecott’s book.

Two books recount Edmond’s story. Tim Ecott’s Vanilla: Travels in Search of the Vanilla Orchid is the most thorough and original, but How to Fly a Horse: The Secret History of Creation, Invention and Discovery by Kevin Ashton tells the same tale and marvels that a slave on the far side of the world, poor and non-white, could get credit for what he’d done. There is also Ken Cameron’s Vanilla Orchids: Natural History and Cultivation, a book that contains Thomas Jefferson’s handwritten recipe for vanilla ice cream.

Please Welcome Robert Krulwich to Phenomena!

Robert Krulwich is a host of the show Radiolab, but he’s also a blogger, having written many posts over the years for National Public Radio. I’m delighted to welcome Robert to Phenomena, which is host to his new blog, “Curiously Krulwich.”

(Full disclosure: I’ve known Robert for a long time. We first met to hunt for autumn leaves in my neighborhood. And we’ve carried on a long-running conversation on a variety of topics such as whether parasites are terrible or awesome. Spoiler alert: they are awesome.)

To celebrate Robert’s arrival, I asked him a few questions about his blogging experiences:

You started out in television, then headed into radio. How did blogs make their way into your creative stream?

Like anything in life, first you hear a strange word, blog, and you wonder “What could that mean?” It sounds like something you’d find on a tugboat. Then, knocking around the web, I bumped into a few, and the ones I bumped into six years ago were gorgeously written, dazzlingly illustrated (bldgblog by Geoff Manaugh, Jason Kottke’s daily roundup at Kottke.org, Information is Beautiful from David McCandless, LoverofBeauty from I don’t know who, he never tells), each one wildly different from the other, yet all of them classy, dangerous, totally new to me, and I thought, how do I get in on this? I have story ideas all the time. I like to draw. I like to write. The fact that NPR (where I was at the time) is a radio network, and isn’t exactly into eyeball products, being more into ears, was no problem. They have a website and they let me launch a science-based, sometimes meandering blog, Krulwich Wonders, where I wrote about history, animals, plants, puzzles, math, chemistry, music, art — and found a delightful audience of crabby, over-informed, sometimes charming, sometimes maddening readers who loved telling me how wrong I was or how right I was, while mailing me ideas that kept me going many times a week. It was so much fun. One time I even got a note from astronaut Neil Armstrong when I wondered out loud why he didn’t wander a larger patch of the moon when he visited up there. Here’s why, he barked back, sending me a long, fascinating letter that gave me goosebumps. So how’d I fall into blogging? I fell very, very happily, and when NPR downsized last year and let me go, I felt a little empty inside and wanted a new place to do it. This, I am happy to announce, is the place.

Does blogging feel the same as what you do on Radiolab, or does it feel like a different way to express yourself?

Well, the conversational tone is the same. I want to sound like myself. I don’t want what I do to be too studied, too formal, or too packaged. I want to sound like some guy who sits next to you on a train and turns out to be a good storyteller, and to your surprise (and, I’m hoping, delight) isn’t a bore. That’s how I try to be on the radio. That’s how I’ll try to be here. But, of course, there’s an obvious difference. On the radio (or the podcast) I’m playing with sound, and the thrill is to invent into your ear (which I do with my “genius” pal, Jad). On my blog, I’m playing with your eye. Every post I do is intentionally visual; it features something to look at; sometimes a video, sometimes a series of drawings, sometimes a photo, sometimes something I devise with friends that’s interactive and lets you play with an idea. The important thing is that both Radiolab and the blog are designed to spill something you didn’t know into your head as intelligibly and as joyously and as carefully as I know how. That’s the goal: to make you learn something you didn’t think you needed or cared to know, but whoosh! Now you know it. That’s what I like to do.

You work lots of artwork into your blog posts. What’s the process behind that? Do the artists come to you with ideas, or do you ask them to visualize something you want to write about?

I wish I had artists. When I started “Krulwich Wonders,” I did. I had a little budget and could hire people to help me. But those were the early days when managers were given play money to launch these adventures. The play period has long since ended and now I’m down to me, my box of colored pencils, my desktop scanner and an eraser. I can still call friends, and I do, and I will, but mostly I sit there thinking about, oh, I don’t know, “snail sex,” and I end up looking up pictures of snails, trying to find their genitalia (not the easiest thing to do if you’re not a snail) and sitting at my desk drawing one lopsided snail after another until, eventually, I get the thing plausibly snaily enough and anatomically correct enough to publish. Then if my wife happens by, sees the drawing, and says, “Oh, what a nice pineapple,” I start over.

The ideas, by the way, come from whatever it is I’m wondering about.

What’s your favorite blog post so far? (Disclosure: my favorite is the one you wrote about the giant insects that were rediscovered on a remote island.)

Thanks, I liked that one too. But if I had to choose, I’d nominate one I wrote about why bees love hexagons, which you can find here. Or another about bees being totally and mysteriously absent from a cornfield, or this short meditation on absolutely nothing. And, oh yes, a dance I posted that still makes me so happy I use it like alcohol whenever I’m gloomy. It’s here.

What’s your plan for blogging from here on out?

To try things I’ve never tried before. To scare myself. To experiment with newfangled gifs, loops, slo-mo photography, and, if I dare, watercolor. To go wherever my curiosity takes me, and to take you along (that’s right, I’m whispering in your ear, Carl Zimmer, you who know everything I know several weeks before I do); even you are coming with me.

My bags are packed.


Please Welcome Erika Engelhaupt to Phenomena!

Erika only has eyes for you, dino (shot at the Phil Fraley studio, while on assignment for the Philadelphia Inquirer).

Today, Phenomena gets a little spookier as we welcome Erika Engelhaupt to the salon. The name of her blog says it all: In Gory Details, she’ll be bringing you tales from the darker side of science — creepy thrills, macabre reality checks, and stuff to which the term “morbid fascination” aptly applies.

Maybe it has something to do with all the time she spent tromping around in swamps while studying environmental science, earning a couple of master’s degrees and – in her words – publishing “boring science papers.” After that, Erika ditched the science papers and began writing for newspapers, triggering a metamorphosis from scientist to science journalist. Now, in addition to being our newest Pheno-type, Erika is also the online science editor for National Geographic, and will help manage the Phenomena blog network.

I’ve known Erika for a while now (she was one of my editors at Science News), and she’s always seemed so…normal? To celebrate the launch of Gory Details, I asked Erika some questions about where she’s headed.


So what’s this obsession with gory stuff?

I suspect I may have read too many Stephen King novels at a young age. My mom and I would tear them in half down the spine so we could each read half at the same time. I’ve always enjoyed reading about creepy stuff (but no scary movies—I prefer my imagined horrors over Hollywood’s versions). Combine that with a love of science, and I guess you get Gory Details.

How did Gory Details come about?

I was an editor at Science News, and one day I was sitting in my office and looked at a shelf filled with books I had reviewed for the magazine. There were titles like Blood Work, The Killer of Little Shepherds (a fantastic forensic history), and That’s Disgusting. It had never really dawned on me until that point that maybe I had a morbid fascination. Suddenly it just popped into my head—I should write a column on the dark side of science and call it Gory Details.

At the time the magazine was soliciting ideas for news columns, but mine was initially considered too gross for a column. So I had to bide my time for two years until we were launching new blogs, and I got my chance. And it turned out that other people shared my curiosity.

What do you want this blog to be about?

So many things! First, I want to really delve into forensic science, because there’s so much going on right now. We’re at this strange point where there are really amazing high-tech methods being developed to analyze crime, but mostly what police have to work with is very old-school, and actually a lot of basic forensic analyses, like hair analysis, are being questioned — is this stuff even really science?

I’m fascinated by all manner of dead things, too. That includes archaeology, and pretty much anything involving old bones. I’m a sucker for a Neanderthal story, because I love to think about how close we are to our beetle-browed cousins.

Then there is, of course, the gross beat. I ended up writing a lot of stories about pee and poop in the blog previously, and every now and then I would announce a hiatus on bodily functions stories. But then someone would come along with some fascinating thing about fecal transplants or something, and I’d be off to the races with that.

I’d also like to branch out into some other areas that people might not immediately think of as gory, but that fall into the “huh, weird” category. So robots and artificial intelligence, perception (which, trust me, is full of really strange stuff), and the dark side of human nature.  And I’m an environmental scientist by training, so I’m going to claim that environmental nasties are something we need to examine in gory detail too.

Evidence of actual past scientific endeavor. Here's Erika in grad school, studying soil microbial activity in Costa Rica.

So…sort of the “eww” beat?

I’m happy to claim the “Eww” beat! When Maryn McKenna joined Phenomena recently with her scare-tastic blog Germination, someone on Twitter pointed out that she was claiming the “oops” beat (since she often covers how we humans have messed up the good thing we had going with antibiotics). And they noted that Ed’s on the “Wow” beat, and Nadia, I think we decided you were the “Boom” beat, right? Or maybe “Oooh”? And if Brian’s “Rock” and Carl’s “Life,” I guess that leaves me with “Eww”!

What are some of the spookier stories you’ve uncovered so far?

Some of my favorites have been ones that pose a “scary thought” kind of question. I really delved into what would happen if a nuclear bomb went off in Washington, D.C., where I live and where sometimes the threat of an attack feels quite real. I wrote about my own odds of survival less than a mile from the White House (not terrible, actually) and how to do the math on whether to seek better shelter or stay put.

Also along those lines are questions like “What lives on us after we die?” and “What drives ‘nice’ people to aggression?”

One I found chilling in a different way was a story I uncovered about police in Israel who have developed a way to get fingerprints off rocks. They want to use the technique to find and prosecute Palestinians, often kids, who throw stones at Israelis. It’s a sad reminder of all the people hurt by that conflict.

I also love finding really weird stuff in out-of-the-way corners of science. I had great fun with the story of a researcher who set up a re-enactment of da Vinci’s painting of the Mona Lisa using toy figures and posited that the original and a studio copy may have been made as the world’s first experiment in 3-D imaging. No one knew about this guy’s work, and after I broke the story it went nuts.

I imagine you’ve got a pretty thick skin since you’re used to diving into the world of weird. Is there anything that’s too creepy, scary or gross for you to deal with?

You know, it’s funny—there’s a psychology test for how easily disgusted a person is, and I tested out dead average. I’m not especially hard to gross out. Maybe that’s part of the fun; that I have a very normal response to this stuff.

But to answer your question, I have tried to be careful about writing about gory medical conditions, because I don’t want to come across as making light of people’s very real problems. And as for what I’m personally freaked out by, it’s gotta be crocodiles. They populate my nightmares.

“I took this in Costa Rica, at a bridge where people gather and occasionally throw meat to the crocodiles,” Erika says. “Nightmares for weeks.”




Introducing Germination: Diseases, Drugs, Farms, and Food

When I was a kid, my favorite part of school wasn’t class — even though I loved studying, and liked showing off what I knew. It wasn’t the uniforms, though my boarding school’s dresses and blazers, and shoes for indoor and outdoor games, were a puzzle that came together differently every time. And it certainly wasn’t the food: School dinner in England was a mystery of boiled sprouts and stewed rhubarb, even if the Texas high school lunches that came after taught me how to make Frito pie.

What I loved most about school, with a fierceness that bordered on devotion, were school supplies. The incense of a just-sharpened pencil. The order in a fresh box of pen cartridges. And more than anything, the promise in a new notebook, and the anticipation of filling its empty, perfect pages with everything I would discover and learn.

I’m feeling a similar thrill now, viewing this new space at Phenomena. Welcome to Germination, a blog that will explore public health, global health, and food production and policy—and ancient diseases, emerging infections, antibiotic resistance, agricultural planning, foodborne illness, and how we’ll feed and care for an increasingly crowded world.

If you followed me here from my previous blog Superbug at Wired, thanks, and get comfortable. If I’m a new discovery for you, here’s a capsule bio. I’m a freelance journalist working mostly for magazines (Wired, Scientific American, Nature, Slate, the Atlantic, the Guardian and Modern Farmer, along with an array of women’s magazines). I’ve written two books so far—Superbug, about the global rise of antibiotic resistance, and Beating Back the Devil, about the Epidemic Intelligence Service, the disease-detective corps of the US Centers for Disease Control and Prevention—and am working on a third, about how we came to use antibiotics in agriculture, and what a mistake that turned out to be.

Before I was a magazine writer, I was a newspaper reporter, doing mostly investigative work: on the causes of cancer clusters, the social effects of drug trafficking, and a mysterious illness in reservists that turned out to be the first cases of Gulf War Syndrome. In my last newspaper job, I covered the CDC, under orders from the editor who hired me to “get in there and tell us these people’s stories.” I spent a lot of time talking my way into investigations and onto planes in the middle of the night. It was enormous fun.

Me, at TED, on March 18, 2015.
Maryn McKenna speaks at TED2015 - Truth and Dare, Session 6, March 16-20, 2015, Vancouver Convention Center, Vancouver, Canada. Photo: Bret Hartman/TED

I’m also a Senior Fellow of the Schuster Institute for Investigative Journalism at Brandeis University, and just finished a fellowship at MIT. I do some video. And I just gave a TED talk, on imagining what the world will be like after we’ve used up antibiotics. (The video has not gone up yet, but I’ll let you know when it does.)

As a journalist, I’m drawn to complexity, inadvertence, and unintended consequences. (My Phenomena colleague Ed Yong jokes that he covers the “Wow” beat; I think of what I do as the “Oops” beat.) We got to widespread resistance because we wanted to cure infections quickly; we got to factory farming because we wanted to ensure affordable food. There isn’t (much) malfeasance in either of those endeavors, but there is a ton of good intentions—and good intentions gone bad are a rich, rewarding subject. We might be here a while.

Here’s what you can expect at Germination: reports on new scientific findings; inquiries into policy initiatives; profiles and interviews with researchers doing cool things; history; and, occasionally, whimsy. I have been writing for a year for National Geographic‘s food platform The Plate, and some posts that deal more purely with food will be loaned or cross-posted there. (About which: You make Frito pie by opening a serving-size bag of Fritos along the back seam and plopping in a ladle of chili and some shredded yellow cheese. It tastes best when served by a lunch lady in a hairnet and a Texas Longhorns jersey.) If you’d like to hear more about my plans, head over to The Loom, where my new colleague Carl Zimmer has kindly conducted a Q&A with me.

When I think back to being a kid at the start of a school year, the initial thrill might have been those pristine new notebooks—but the bigger thrill was filling them. Phenomena is the most exclusive science-writing club on the internet, and I’m excited to join it. Please come along.

(Much gratitude to Jonathan Eisen, PhD, for suggesting Germination as a blog name.)


The Power of a Press Release

In 2011, Petroc Sumner of Cardiff University and his colleagues published a brain imaging study with a provocative result: Healthy men who have low levels of a certain chemical in a specific area of their brains tend to get high scores on tests of impulsivity.

When the paper came out, thousands of people across England were rioting because a policeman had shot a young black man. “We never saw the connection, but of course the press immediately saw the connection,” Sumner recalls. Brain chemical lack ‘spurs rioting’, blared one headline. Rioters have ‘lower levels’ of brain chemical that keeps impulsive behaviour under control, said another.

“At the time, like most scientists, we kind of instinctively blamed the journalists for this,” Sumner says. His team called out these (shameful, really) exaggerations in The Guardian, and started engaging in debates about science and the media. “We quickly began to realize that everyone was arguing on the basis of anecdote and personal experience, but not evidence. So we decided to back off, stop arguing, and start collecting data.”

And the data, published today in BMJ, surprised Sumner. His team found that more than one-third of academic press releases contain exaggerated claims. What’s more, when a study is accompanied by an exaggerated press release, it’s more likely to be hyped in the press.

Because press releases are almost always approved by a study’s leaders before being distributed, Sumner’s findings suggest that scientists and their institutions play a bigger role in media hype than they might like to acknowledge.

“We’re all under pressure as scientists to have our work exposed,” Sumner says. “Certainly I think a lot of us would be quite happy not to take responsibility for that — just to say, ‘Well, we can’t do anything about it, if they’re going to misinterpret that’s up to them but it’s not our fault’. And I guess we’d like to say, it is really important and we have to do something more about it.”

Sumner and his colleagues looked at 462 health- and medicine-related press releases issued by 20 British universities in 2011. For each press release, the researchers also analyzed the scientific study it was based on, and news articles that described the same findings.

The researchers limited the analysis to health and medicine partly because (as I’ve written about before) these stories tend to influence people’s behavior more than, say, stories about dinosaurs or space. They focused on three specific ways that press releases can distort or exaggerate: by implying that a study in animals is applicable to people; by making causal claims from observational data; and by advising readers to change their behaviors (“these results suggest that aspirin is safe and effective for children,” say, or, “it’s dangerous to drink caffeine during pregnancy”).

More than one-third of the press releases did each of these things, and the misinformation showed up in the media, too. For example, among press releases that gave exaggerated health advice, 58 percent of subsequent news articles also contained exaggerated health advice. In contrast, among press releases that didn’t make exaggerated recommendations, only 17 percent of news articles did so. The researchers found similar trends for causal claims and for inferring that animal work applies to people.
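To make the comparison above concrete, here is a minimal sketch in Python of the conditional rates being contrasted. Only the 58 percent and 17 percent figures come from the study; the sample sizes and function name are invented for illustration, and this is not the researchers' actual analysis:

```python
# Hypothetical illustration of the comparison reported in the BMJ study:
# how often news articles exaggerated, given whether the press release did.
# The per-group totals (100 each) are invented; only the rates are from the article.

def exaggeration_rate(exaggerated_articles: int, total_articles: int) -> float:
    """Fraction of news articles containing exaggerated claims."""
    return exaggerated_articles / total_articles

rate_with_hype = exaggeration_rate(58, 100)     # release exaggerated
rate_without_hype = exaggeration_rate(17, 100)  # release did not

# How many times likelier exaggerated coverage is when the release
# itself exaggerates (an association, not proof of causation).
ratio = rate_with_hype / rate_without_hype
print(f"{rate_with_hype:.0%} vs {rate_without_hype:.0%}, about {ratio:.1f}x")
```

Run as-is, this prints a ratio of roughly 3.4, which is the gap the researchers found striking: hyped releases were followed by hyped coverage several times more often than sober ones.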

“We certainly don’t want to be blaming press officers for this,” Sumner says. “They’re part of the system. The academics probably don’t engage as much as they should.”

I called Matt Shipman, a science writer and public information officer at North Carolina State University, to ask what he thought of the findings. Shipman has been a press officer for seven years, and before that he was a journalist. “The numbers are very powerful,” he said, and they underscore the importance of press releases at a time when reporters often don’t have the time or resources for thorough reporting. (Shipman has just signed on with Health News Review to rigorously evaluate the quality of health-related press releases.)

Shipman also brought up an important caveat. Because this study is observational, it doesn’t prove that press releases are themselves the cause of hype. “If a researcher is prone to exaggeration, which leads to exaggerated claims in a news release, the researcher is likely to also be prone to exaggeration when conducting interviews with reporters,” Shipman says. “The news release may be a symptom of the problem, rather than the problem itself.”

When he writes press releases, Shipman says he almost always begins by meeting with the researcher in person and asking him or her to explain not only the findings, but what work led to them, why they’re interesting, and what other experiments they might lead to. Then Shipman writes a draft of the release and sends it back to the researcher for approval. He asks the scientist to check not only for factual inaccuracies, but for problems in emphasis, context, or tone. Press officers at other institutions, however, write releases using far less rigorous methods, as I’ve learned by swapping stories with them over the years. And some press officers are judged by the quantity of stories that come out in big outlets, which naturally creates an incentive to make research seem newsworthy, even when it might not be.

“What I think is probably the case is that all of the variables at play here — the researchers, the press officers, and the journalists — are all humans,” Shipman says. “And all of them are capable of making mistakes, intentionally or unintentionally.”

So. Is there any concrete way to reduce those mistakes?

In an editorial accompanying the BMJ study, author and doctor Ben Goldacre makes two suggestions. First, the authors of press releases and the researchers who approved them should put their names on the releases, he writes. “This would create professional reputational consequences for misrepresenting scientific findings in a press release, which would parallel the risks around misrepresenting science in an academic paper.” That seems reasonable to me.

Second, to boost transparency, press releases shouldn’t only be sent to a closed group of journalists, Goldacre writes. “Instead, press releases should be treated as a part of the scientific publication, linked to the paper, referenced directly from the academic paper being promoted, and presented through existing infrastructure as online data appendices, in full view of peers.”

That sounds good, but “would require a significant shift in the culture,” according to Shipman. Press officers would have to be brought into the process much earlier than they are now, he says. And scientists would have to be far more invested in press releases than many of them are now.

I think we journalists need to own our portion of the blame in this mess, too. Let’s go back to Sumner’s 2011 brain-imaging study, for example. His university’s press release didn’t have any wild exaggerations, and it certainly didn’t make a connection between the research and the riots. That came from the journalists (and/or their editors).

“But that actually doesn’t happen very often, it turns out,” Sumner says. “Most of the time, the media stories stay pretty close to what’s in the press release.”

Which isn’t exactly great news, either.


The Problems of Health Journalism (Storify-ed)

Yesterday I wrote a post pondering a perennial topic in health journalism: How do journalists capture what’s new about a study without hyping its relevance in the general scheme of things? How do we avoid the embarrassing flip-flopping of health headlines?

The response from readers, both on the blog and on Twitter, has been robust and helpful. I created a Storify of the Twitter conversation and posted it below; I’ll try to update if/when the discussion continues. I also want to recommend two related resources:

Ed Yong’s recent talk about his career in science journalism, in which he explains why he has veered away from reporting on biomedicine and psychology.

Gary Schwitzer’s analysis of health stories appearing in major U.S. publications in the past seven years (published in the same issue of JAMA Internal Medicine as the resveratrol study that I mention in my post). The upshot: Health reporting could be much better than it is.


On Science Journalism, Blogs, and The Wow Beat

Last Thursday, I had a great time being interviewed by the Wall Street Journal’s Lee Hotz about science journalism, my career, and my approach to writing. It’s a long, meandering conversation, but hopefully an interesting one. You can find past episodes from the series here, including several names that should be familiar to regular readers of this blog.

Inside Out: Ed Yong from NYU Journalism on Vimeo.


Resveratrol Redux, Or: Should I Just Stop Writing About Health?

(Update, 5/13: This post generated a lot of discussion on Twitter, which you can see Storify-ed here. I also talked about these issues on NPR’s On the Media, which you can listen to here.)

The science of health is so, so confusing, I almost wonder if it wouldn’t be better for journalists to stop writing about health altogether. Or at least to dramatically change the way we do it.

Take one of the biggest health stories of the last decade: resveratrol, a compound found in certain red wines that has been shown to extend lifespan and/or curb disease in yeast, fruit flies, fish, worms, and mice.

Searching the New York Times archives for “resveratrol” gets you 156 items. Here’s a sampling of the headlines:

August 2003: Life-Extending Chemical Is Found in Certain Red Wines

November 2006: Yes, Red Wine Holds Answer. Check Dosage.

January 2011: Doubt on Anti-Aging Molecule as Drug Trial Stops

August 2011: Longer Lives for Obese Mice, With Hope for Humans of All Sizes

January 2012: University Suspects Fraud by a Researcher Who Studied Red Wine

November 2012: Resveratrol Ineffective in Normal-Weight Women

March 2013: New Optimism on Resveratrol

Is your head spinning yet?

From what I can tell, there’s nothing overtly wrong with the journalism in any of these stories. Most are based on a new study (or studies), and include varied perspectives from scientists who had nothing to do with the research. The stories contradict each other because the studies contradict each other.

This happens in science all the time; it’s even supposed to happen. Think of all those models of the atom you learned in chemistry class: from Thomson’s plum pudding to Rutherford’s nucleus to Bohr’s energy orbits to Pauli’s electron spin. Two steps forward, one step back, science moves along.

But when it comes to writing health stories, it’s hard — really, really hard — to include that slow scientific progression in a way that a reader will absorb. And I think that’s because readers don’t seek out health stories to satisfy abstract intellectual curiosities. They want to glean some kind of practical knowledge. How can I avoid sickness / lose weight / feel better / live longer?

For some messy health issues — such as whether it’s dangerous to drink while pregnant, say, or whether to get screened for cancer — the stakes are high. Resveratrol is not as serious. For most people, drinking a glass of wine or taking a daily resveratrol supplement is not going to do any biological harm. But there are other kinds of harm. Searching amazon.com for “resveratrol” gets you 2,186 health and personal care items, including supplements costing dozens or even hundreds of dollars.

I got thinking about this because of a study on resveratrol that came out today in a solid medical journal, JAMA Internal Medicine. Fifteen years ago, researchers collected urine samples from 783 older people who live in the Chianti region of Italy, where drinking red wine is common. It turns out that the level of resveratrol in the participants’ urine could not predict anything about their health outcomes. Those with the highest levels were just as likely to have inflammatory markers in their blood, just as likely to develop heart disease or cancer, and just as likely to die.

So I read that study and thought, this is important: My readers who buy or are thinking of buying resveratrol might appreciate knowing that its benefits haven’t panned out in people, at least not yet. Sure, a future study in people might report some benefit of resveratrol, but for now all I can do is offer the current state of knowledge. And that’s better than nothing, right?

But then…maybe it’s not. Take a look at those headlines again. I suspect a general reader is not coming away from those saying, “Gee whiz, look at the long and bumpy road to scientific progress!” They’re more likely to be saying, “When will those scientists get their act together?” Or worse, “Why do we keep dumping money into this capricious discipline?”

I don’t have any grand solution to this. I’ll undoubtedly keep covering health stories, because I believe in the public’s right to accurate information. And I believe in the process of science, however slow, to ultimately figure things out.

Still, is there a way that journalists could do this better?  How should I have covered the latest resveratrol study? Should we switch to a more explanatory, wiki-like model, so that a single study’s results are more fully contextualized? Should we be writing stories about batches of studies — maybe the last 10 studies of resveratrol, as opposed to the single newest one? Are headlines the real problem?

If you have any preferences or suggestions I’d love to hear them. I’m not likely to change the Health Journalism machine, but I’m more than happy to experiment on this blog.


A Guide for Scientists on Giving Comments to Journalists

The daily business of science journalism includes getting independent comments on new studies, and (in my opinion) providing those comments is one of the most important ways in which scientists interact with the media.

But from talking to scientists on Twitter, I know that there’s a lot of nervousness about giving comments to journalists. And when I send papers out to people for comments, I often get replies that say, “Sure, but what do you want me to say?”

A straw poll, conducted this morning on Twitter, suggested that people would value a guide on doing this. So, here’s what I am looking for. (To clarify, this isn’t me asking about your own work. It’s me asking you to comment on someone else’s work.)

Let’s start with some assumptions.

These may not be true for all journalists, but they’ll be Standard Operating Procedure for the good ones.

A) I have read the paper that I sent you and understand it (or I’ve talked to the scientists in question and they’ve explained it to me).  So you don’t need to explain its contents back to me.

B) I’m coming to you because you are an expert in this field and you know lots of stuff about it that I don’t know. I want you to use that expertise to help me put the research into context for my readers, and to help me point out any flaws and strengths.

C) I genuinely want to know what you make of the paper. I am not just trying to fill my story with a random cutaway quote to make it look like I did my job and asked around.

D) I’m not here to present people with the totality of your views, so what you say will almost certainly end up getting cut and distilled. BUT, I won’t do that in a way that misquotes or misrepresents you. If you say, “I’m fascinated by this approach but I think it has serious flaws”, I won’t cut that to “I’m fascinated”. I’m a journalist; I’m not making a movie poster.

E) All the tips below apply to situations where I email you a paper and ask for comments. If we’re chatting on the phone, it’s my job to guide you through all of this, but it will obviously take less time for both of us if you know what I’m after. And I’m talking about written interviews. Some of these will apply to TV and radio too, but those have very different constraints.

So, here’s what I would find useful:

1)       Weaknesses. The most important things you can tell me about a study are its weaknesses. Are there inaccuracies in the paper? Statistical failings? Do you think the conclusions don’t hold water? The last thing I want to do is to credulously cover a weak study. But I don’t work in your field and my bullsh*t detector is probably less finely calibrated than yours. So I’m basically relying on you to help me not mislead my readers. Maybe your comments will persuade me to drop a story because it’s just that bad. Maybe your comments will help me to confront an editor and say: “We shouldn’t cover this story that you seem so insistent on. Look: all these scientists think it’s bunk.”

2)       Strengths. But hey, it’s not all doom and gloom! If you’re excited, I want to hear that! Go, science, etc.! But also, tell me the reasons for that excitement. Did they get an unprecedentedly big data set? Some cool new method? An unusual model organism? Innovative technique?

3)       Your reaction. When you read the paper, how did it make you feel? Were you excited? Impressed? Overwhelmed by a deep existential malaise?

4)       The past. The paper will probably have a paragraph that crushes decades of earlier work. You will know all of that; I won’t have had time to read all those earlier papers. So tell me: How does this new discovery fit with what has come before? Is this based on a radical new approach? A long slog? Something that people in the field have been anticipating? Is it just reinventing the wheel?

5)       The present. Have other people found similar things? Contradictory things? Is this one of many such studies, or something truly original? If this is, say, a new approach to fighting malaria, how does it compare to all the other approaches people are investigating?

6)       The future. So, new discovery. Great. But what does it mean? Does it change what we knew about X? Does it open up new avenues for investigating Y? Will it lead to treatments or diagnostic tools for Z?

7)       Detail. Opinions may differ on this, but I like detail and specifics. People sometimes send me quotes that are paragraphs long, with a note saying, “This is probably much more than you need”. That’s true, but I’d rather know all that stuff and have to condense it into something I can use than have only something boring, vanilla, and non-descriptive (see the list below).

8)       Simple language, in some cases. Look, I know I’m asking a lot here, and it’s a bit much to expect you to lay out all the strengths, weaknesses and context of a study for me and have to worry about jargon while you’re doing it. (Could you also rub my shoulders while you’re at it? Thanks ever so.) Just bear in mind that if something is riddled with jargon, I can paraphrase it but I can’t really quote it. That’s a little riskier for you, because I might inadvertently misinterpret something you say. It’s also less good for me. I want to put your words in quote marks because it can really brighten up a piece.

Note that a lot of this boils down to you telling me something interesting that I couldn’t have predicted. That’s why, when people ask me, “Do you have any specific questions?” the answer is often, “No.” What you have to tell me—what springs into your head—is probably going to be far more interesting than anything I’m expecting you to tell me. Hence, any questions I have will be really broad like, “What does this mean?” or “Do you buy it?” or “How does this fit with other stuff?” or “Science me up, nerd.”

Update: I love Tom Stafford’s extra tip of “Don’t be afraid to tell me what the real story is.” Note that this is different to simply summarising the paper.

Now, here’s what I don’t find useful:

1)       A summary of what the paper showed. Around half of comments start with this. I don’t need it. I already know what the paper showed, or will have talked to someone else who explained it.

2)       Boilerplate adjectives. Please don’t say “This study is interesting…” when you actually mean “dubious” or “boring”.

3)       And on that note, the world’s most banal quote is: “This research is interesting but more work needs to be done”. It’s everywhere. It has invaded science stories like some linguistic cane toad. Of course, more research needs to be done. Otherwise, y’know, science would stop. But what research? What needs to be done? If you were doing that research, what experiments would you do? And if by “More work needs to be done” you really mean, “…because this impossibly flawed study tells us nowt”, then say that. Other banal quotes include, “We welcome any research that takes us further down the road towards [hand-wavy goal]” or “This adds to our understanding of [thingy]”.

4)       Publication politics. “I don’t know why this paper was published in Nature/Science/FEBS Letters” and other such comments are (usually) not useful. My readers don’t really know, or care about, publication hierarchies. “This paper should never have been published” can be useful for indicating strength of opinion, but I’d always want to know specifics about why. Isolated outrage makes for fun quotes, but not informative quotes.

5)       Citation politics. “The authors should have cited this paper instead of that paper.” Again, if an entire body of relevant work has been ignored, then let’s talk about that. But I’m not that bothered about whether reference 55 is the wrong reference 55.

And finally, a note on going off the record.

Going off the record isn’t really a formal, enshrined, binding thing, but if you send me off-the-record comments, I won’t use them. However, my soul will ache when I see “This is off-the-record” followed by a long list of flaws and weaknesses, and then “And now on-the-record” followed by something banal.

I get it. If you criticise a study, you risk angering colleagues who work in your field—the same people who you meet at conferences and review your papers. I’m not unsympathetic to that. But as I said, critical comments are probably the most useful variety that we get. You’re in a better position to criticise than I am. And it will probably carry more weight for a reader to see those words coming from the mouth of an expert in the field, than from some journalist. Critical comments do carry personal risk, but they also help us to fight credulous and uncritical science reporting.

Fellow journalists may totally disagree with any and all of this, in which case, have your say in the comments.



Carl Zimmer Interviews Me on the Making of a Story

Earlier this year, Nature published Dynasty—my story about Bob Paine, who changed the face of ecology with his bold ideas, but also spawned an academic family tree that did the same. The piece was the culmination of a year of work and is one of my favourites. So, I was honoured when the good people at the Open Notebook asked to interview me about it.

The Open Notebook is an unparalleled resource, exposing the tips and tricks that make great science journalism great. Last year, I interviewed Adam Rogers for them, about the making of his award-winning piece on a whiskey fungus. And to interview me about Dynasty, they chose Carl Zimmer.

I’ve been reading Carl Zimmer’s books for around a decade now. I’ve blogged alongside him at three different networks for five years. I’ve been plugging his superb work on this blog for around the same time. And now I’ve been interviewed by him. We talked about coming up with the idea for Dynasty, doing the reporting, and crafting the story.

It was a gruelling battle, much like the time Neal Stephenson fought William Gibson. But as we crawled out of the wreckage, bloodied and battered, a transcript emerged. You can find it here.

The Walnut: True Measure of a Dinosaur’s Brain

Ampelosaurus had a surprisingly small brain. All fifty feet of the dinosaur – from its pencil-toothed muzzle to the tip of its long tail – was regulated by a mass of tissues about the size of a walnut and a half. That comparison isn’t sloppy shorthand. Ohio University paleontologist Lawrence Witmer actually went to the trouble of comparing the hulking sauropod’s brain to an English walnut to test an old axiom and correct a journalistic mistake.

Most everyone is familiar with the idea that huge dinosaurs had brains the size of a walnut. But where did this unflattering comparison come from? Witmer, who has tried to get at the origins of the idea, says the earliest example anyone has turned up so far is in Jennie Irene Mix’s 1912 book Mighty Animals, which says that the famous Jurassic sauropod Diplodocus “had a brain that was not much bigger than a walnut.” The meme didn’t really catch on until 1945, though, when paleontologist Edwin Colbert said that the armor-plated Stegosaurus also had a brain the size of the edible seed.

Witmer was reminded of the old saying when news source LiveScience covered a PLoS One study on the brain of Ampelosaurus that he had just published with Fabien Knoll, Ryan Ridgely, and coauthors. The 70-million-year-old sauropod, the report said, had a brain “which was not much bigger than a tennis ball.” That comparison is a bit too generous. (Neither Witmer nor his colleagues recall giving the news source such a comparison.) As Witmer explained in a Facebook post, “A tennis ball has a volume of about 140 cc, whereas the brain endocast of Ampelosaurus is just 39.5 cc.” A better measure of the dinosaur’s endocast, Witmer proposed, is a walnut. The side-by-side comparison of the pantry staple and the paleoneurological reconstruction shows just how startlingly small the dinosaur’s brain was.

In the post, Witmer wrote that he and his colleagues were “proposing the walnut as the new official unit of measure of dinosaur brain size, based on our microCT-scanned 26.2 cc walnut as the standard.” Of course, this was just some playful paleontology. “It’s just a joke, because it’s historically been such a player in dinosaur brain-size comparisons,” Witmer says. Still, the Cretaceous crack underscores the perplexing truth that some of the biggest animals of all time had ludicrously small brains.

“After studying maybe a couple dozen species of sauropods (which is a crap-ton of species for a CT-based study, “crap-ton” being another favorite unit of measure),” Witmer explains, “we have yet to find a clade that shows marked expansion of brain size.” In fact, Witmer points out, sauropod brains may have been even smaller than expected, since dinosaur brains didn’t fully fill the braincase endocasts that have been reconstructed from fossil remains. “Suffice it to say that if you thought a walnut or two was pretty small for an endocast volume, just imagine how much smaller the actual brain was,” Witmer says. Determining how much brain was in the braincase is the subject of further study by Witmer’s student Ashley Morhardt.

So what does all this mean for the biology and behavior of the animals? Are small brains really indicators of dullard dinosaurs? That’s difficult to say, Witmer says, since he and other paleo-brain specialists “can only talk in general and comparative terms.” Still, Witmer expects that sauropods “were pretty simple beasts” whose behavior was “largely governed by instinct and generally stereotyped responses.”

That doesn’t mean that sauropods were biological automatons, as some early 20th century paleontologists presumed. In addition to footprint evidence of sauropods walking together, Witmer points out, “we find sauropods with display structures” that imply some sort of social interaction and behavior. Ampelosaurus itself had such adornments – ornamental structures called osteoderms that gave the dinosaur’s flanks and back some imposing decoration. And “Somehow all those sympatric Morrison [Formation] sauropod species were able to sort out whom to mate with,” Witmer says. Just because sauropods were probably dim-witted doesn’t mean that they weren’t capable of socializing or other relatively complex behaviors related to interacting with their own kind.

Of course, there was so much Mesozoic diversity and disparity that it would be wrong to assume that all dinosaurs were only capable of reflexive, knee-jerk responses to the world around them. Sauropods are just one part of an emerging picture of dinosaur neuroanatomy and behavior. “The exciting thing about recent comparative cognitive studies is how surprisingly ‘smart’ are a wide range of birds and reptiles, not just corvids” such as ravens, Witmer says. A sauropod’s intellect might not have been very impressive, but “we might indeed have been inspired by some dinosaur species to utter ‘Clever girl.'”


Knoll, F., Ridgely, R., Ortega, F., Sanz, J., Witmer, L. 2013. Neurocranial Osteology and Neuroanatomy of a Late Cretaceous Titanosaurian Sauropod from Spain (Ampelosaurus sp.). PLoS ONE. 8, 1: e54991. doi:10.1371/journal.pone.0054991


The way of Paine – my story about a scientific dynasty

It started, as many writing tales do, with John McPhee. In late 2011, I was reading a piece by the legendary writer, in which he talked about penning a series of profiles that were linked by a central character. I started thinking about doing something similar with a scientific bent—a story that would start with a single important person and trace the line of their academic descendants. Science, after all, is full of legacies. People learn their trade in the labs of senior scientists and go on to train students of their own. I wanted to explore how attitudes, skills and ideas are passed down these familial lines.

Finding the right person was hard. I needed someone who had made important contributions and produced a rich tree of students, who had never (or rarely) been intensively profiled, and who was still alive. That’s not a huge list.  But thanks to a timely chat with Nancy Baron, I found my subject: an ecologist called Bob Paine.

Paine’s a 6-foot-6 bear of a man, forthright in his views, deeply in love with nature, and beloved by many. It was he who came up with the concept of a keystone species – one that is so disproportionately influential in its environment that removing it can remodel the entire community of life around it. The concept began with classic experiments in which Paine threw starfish off a stretch of American coastline and watched the shore become overrun by mussels. Keystones are now enshrined in ecological textbooks, and the concept greatly affects how we see, and choose to care for, the world around us.

But Paine is a keystone himself. He’s a man who has had a tremendous impact on science, not just through ideas, but through the people he mentored, who have become eminent scientists in their own right. Unlike some senior scientists, Paine treated his trainees as equals, and emphasised independence, freedom and creativity.

My story—simply called “Dynasty” and out today in Nature—tells the tale of Paine’s ideas and the reasons behind their success. I look at his descendants and how they expanded on his ideas and philosophies, while also rebelling against them.


I’ve been working on this for the better part of a year, and I’m very proud of it. I hope you enjoy it. Here’s a teaser:

Bob Paine is nearly 2 metres tall and has a powerful grip. The ochre sea star, however, has five sucker-lined arms and can span half a metre. So when Paine tried to prise the creatures off the rocks along the Pacific coast, he found that his brute strength simply wasn’t enough. In the end, he resorted to a crowbar. Then, once he had levered the animals up, he hurled them out to sea as hard as he could. “You get pretty good at throwing starfish into deeper water,” he says.

It was a ritual that began in 1963, on an 8-metre stretch of shore in Makah Bay, Washington. The bay’s rocky intertidal zone normally hosts a thriving community of mussels, barnacles, limpets, anemones and algae. But it changed completely after Paine banished the starfish. The barnacles that the sea star (Pisaster ochraceus) usually ate advanced through the predator-free zone, and were later replaced by mussels. These invaders crowded out the algae and limpets, which fled for less competitive pastures.

Within a year, the total number of species had halved: a diverse tidal wonderland became a black monoculture of mussels. By re-engineering the coastline in this way, Paine dealt a serious blow to the dominant view in ecology of the time: that ecosystems are stable dramas if they have a diverse cast of species. Instead, he showed that individual species such as Pisaster are prima donnas, whose absence can warp the entire production into something blander and unrecognizable. He described these crucial creatures, whose influence far exceeds their abundance, as keystone species, after the central stone that prevents an arch from crumbling. Their loss can initiate what Paine would later call trophic cascades — the rise and fall of connected species throughout the food web. The terms stuck, and ‘keystone’ would go on to be applied to species from sea otters to wolves, grey whales and spotted bass.

Today, ecology students take these concepts for granted — but they shook the field when Paine first articulated them in the 1960s. “He’s been one of the most influential ecologists in the last half century,” says Simon Levin, a mathematical ecologist at Princeton University in New Jersey, and one of Paine’s closest friends. The revelation that not all species are equal was as disruptive to ecology as the loss of Pisaster was to Makah Bay. So was Paine’s insistence on tinkering with nature — what some have called kick-it-and-see ecology — at a time when most ecologists simply observed it.

But Paine — an organism whose disproportionate influence equals that of any starfish or sea otter — has also changed the ecosystem of scientists. In his five-decade career, he has trained a thriving dynasty of around 40 students and postdocs, many of whom are now leading ecologists themselves and who consider their time with Paine formative.

Science hosts many such dynasties: successions of academic leaders related not by blood, but by mentorship. Each generation inherits attitudes, philosophies and technical skills from the one before. Some, like Paine’s, are particularly fertile, sprouting lush branches on the academic tree and driving a field in a new direction. But Paine’s dynasty is remarkable not just for its scientific influence, but for its dedicated, tight-knit nature. Thanks to Paine’s original — and widely applicable — ideas, his emphasis on independent thought by his protégés and his fun, irreverent nature, almost every member has stayed in science, and specifically in ecology or marine biology.

“It’s a surprising list of superstars — great mentors of graduate students, who have published interesting work,” says Paine, who retired in 1998 but is still active in the field. These days, Paine can be spotted at ecological meetings by the swarm of academic descendants milling around him. Perhaps in this rich family, there are lessons about why some scientific dynasties flourish and grow, whereas others never bud.


Deleting a “memory molecule” doesn’t affect memory in mice

Our memories feel stable and secure. They’re such a critical part of our identities that losing them can feel like losing ourselves. But how do our brains achieve such permanence, when the molecules within them are constantly being degraded and recreated? How exactly do we store memories?

It’s likely that many molecules are involved, but over the last decade, one has emerged as a possible star player—an enzyme called PKM-zeta.

In 2006, Todd Sacktor from SUNY Downstate Medical Center managed to erase established memories in mice by injecting their hippocampus—a region involved in memory—with ZIP, a chemical meant to block PKM-zeta. A year later, Sacktor collaborated with Yadin Dudai at the Weizmann Institute in Israel to show that injecting ZIP into a different part of a mouse’s brain could erase month-old memories of an unpleasant taste. Many similar experiments followed, from other labs, involving other animals and brain regions. And in 2011, Sacktor managed to boost an old faded memory by increasing levels of PKM-zeta in the brain, by means of a virus loaded with copies of the gene.

As I wrote in a news piece for Nature, “these fascinating studies suggested that long-term memory, rather than being static and stable, is surprisingly fragile, and depends on the continuous activity of a single enzyme.”

But two teams of scientists have cast some doubt upon this neat tale, and upon PKM-zeta’s role as a memory molecule. Working independently, they deleted the gene for PKM-zeta in embryonic mice, producing adults that lacked the enzyme from birth.

And the rodents’ memories were fine. One group, led by Robert Messing at the University of California, San Francisco, showed that their mice formed persistent memories of fears, objects, movements and more. The other, led by Richard Huganir from Johns Hopkins University, showed that their mice had normal levels of long-term potentiation—a process that underlies learning and memory, where the connections between neurons become stronger. Both groups also found that injections of ZIP could still disrupt memories in their mice, suggesting that whatever this chemical is doing, it’s not acting on PKM-zeta (or at least, not doing so exclusively).

In my write-up for Nature, I go into more detail about the studies, and discuss whether the results could be explained by back-up systems that compensate for the loss of PKM-zeta. Do head over there for the full story.

Some navel-gazing about science writing and complexity

On a more personal note, the PKM-zeta story serves as a good reminder to resist easy explanations or tidy fables in science writing.

I have covered this molecule on this blog for years, including posts about two of the big splashy Science papers on memory erasure and memory strengthening. I gave the molecule a catchy epithet (“memory engine”). I wrote a long piece about its history and what it does.

Looking back at the coverage, I’m happy with the way the concepts are explained, but the pieces are rather breathless (“These were amazing results”, and “the implications of this are staggering,” quoth me.) And, most disappointingly, they’re largely one-sided. They present a hypothesis—which may or may not eventually turn out to be right—as hard fact. This piece, in particular, is 1,800 words long without a single outside voice.

Those voices were out there. In response to tweets about the new story, I’ve seen many comments that are variants of: “Did anyone seriously think that a single molecule would explain the maintenance of long-term memory?” Clearly, there was scepticism about the idea; I just didn’t look hard enough.

And of course, I should know better. I worked in a cancer charity for many years and I know full well that any attempt to explain a complex thing, whether cancer or obesity, through a single simple route is almost certainly wrong. And I’ve spent the last year lambasting examples of science writing that favour false simplicity over real complexity (oxytocin, anyone?). If the PKM-zeta story ends up being more complicated, with multiple redundant back-ups and many molecular players, that’s not going to shock many neuroscientists. (As one said to me: “[The studies] show that the situation is complicated—surprise!”)

The problem is that simple explanations are seductive, and they make for nice stories. But the ultimate story of science is that things are regularly complicated, and often bafflingly so. The best science writers embrace that complexity, rather than sweeping it under the rug for the sake of a clean narrative.

To clarify, regarding PKM-zeta, I’m not taking “sides” (if sides even exist to be taken). This story will roll on. I know that both Huganir and Sacktor have more experiments planned, and other neuroscientists have contacted me with their take on the contrasting studies. My job, and my desire, is to chronicle this meandering work in a fair and appropriately critical way, and I look forward to doing that in the future.

For more on my views about communicating complexity in science writing, see the video below. I’m the second to speak.


Complex science and smart journalism

I had a delightful time at the SpotOn London 2012 conference this weekend, chatting to people about all things related to science, journalism and the internet. For reasons best known to them, the organisers decided to inflict me upon the attendees in three separate panels, videos of which are below.

This one’s on how to do smart journalism in the face of complex science. I lay out my thoughts on why people often criticise journalists for screwing up science reporting in boring ways, when there are more advanced forms of screw-up to consider and avoid.

This one’s about whether information from organisations like my former employers, Cancer Research UK, will replace traditional journalism. Spoiler: No.

And this one’s about safeguarding against fraud and dodgy practices in science. It touches on all the psychology-related material that I’ve been covering for the last year, and has a rather good discussion.