An Epidemic 14 Years Ago Shows How Zika Could Unfold in the US

An Aedes albopictus mosquito, which health authorities worry may begin to spread Zika.
Photograph by James Gathany, CDC.

If the Zika virus comes to the United States, we could face the threat of the same sort of virgin soil epidemic—an infection arriving in a population that has never been exposed to it before—that has caused more than 1 million known infections, and probably several million asymptomatic ones, in Central and South America. It’s nerve-wracking to wonder what that would be like: How many people would fall ill, how serious the effects would be in adults or in babies, and most important, how good a job we would do of protecting ourselves.

But, in fact, we can guess what it would be like. Because we have a good example, not that long ago, of a novel mosquito-borne threat that caused very serious illness arriving in the United States. And the data since its arrival shows that, despite catching on fairly quickly to what was happening, the U.S. didn’t do that good a job.

This possibility became more real Monday when the Pan American Health Organization released a statement that predicts Zika virus, the mosquito-borne disease that is exploding in South and Central America and seems likely to be causing an epidemic of birth defects especially in Brazil, will spread throughout the Americas. PAHO, which is a regional office of the World Health Organization, said:

There are two main reasons for the virus’s rapid spread (to 21 countries and territories): (1) the population of the Americas had not previously been exposed to Zika and therefore lacks immunity, and (2) Aedes mosquitoes—the main vector for Zika transmission—are present in all the region’s countries except Canada and continental Chile.

PAHO anticipates that Zika virus will continue to spread and will likely reach all countries and territories of the region where Aedes mosquitoes are found.

Those “countries and territories where Aedes mosquitoes are found” include a good portion of the United States, as these maps from the Centers for Disease Control and Prevention demonstrate:

CDC maps of the ranges of two mosquito species that could transmit Zika virus.
Graphic from CDC.gov, original here.


The recent history is this: In the summer of 1999, the New York City health department put together reports that had come in from several doctors in the city and realized that an outbreak of encephalitis was moving through the area. Eight people who lived in one neighborhood were ill, four of them so seriously that they had to be put on respirators; five had what their doctors described as “profound muscle weakness.”

Within a month, 37 people had been identified with the perplexing syndrome, which seemed to be caused by a virus, and four had died. At the same time, veterinarians at the Bronx Zoo discovered an unusual number of dead birds: exotics, like flamingos, and city birds, primarily crows. Their alertness provided the crucial piece for the CDC to realize that a novel disease had landed in the United States: West Nile virus, which was well known in Europe but had never been seen in this country before.

West Nile is transmitted by mosquitoes in a complex interplay with birds. It began moving with both birds and bugs down the East Coast and then across the Gulf Coast. As it went, the CDC realized that the neurologic illness that marked the disease’s first arrival had not been a one-time event, but its own looming epidemic within the larger one. “Neuroinvasive” West Nile, which in its worst manifestations caused not transient encephalitis but long-lasting floppy paralysis that resembled polio — and sometimes killed — bloomed in the summer of 2002 east of the Mississippi, and then moved west in the years afterward as the disease exhausted the pool of the vulnerable.

The CDC’s maps showing the emergence of “neuroinvasive” West Nile virus disease from 2001 to 2004; areas in black had the highest incidence.
Graphic by Maryn McKenna using maps by the CDC; originals available here.

So far, so normal, for a newly arrived disease. But here’s where the story gets complicated. By the beginning of this decade, West Nile had become endemic in the lower 48 states. It is not a mysterious new arrival; it is a known, life-altering threat. Its risk waxes and wanes with weather and insect populations, but it has one simple preventative: not allowing yourself to be bitten by a mosquito.

And yet: Here are the CDC’s most recent maps of neuroinvasive West Nile—showing that people are still succumbing to its most dire complication, 14 years after it was identified.

The CDC’s maps for 2011-2014 showing the incidence of “neuroinvasive” West Nile virus disease; areas in black had the highest incidence.
Graphic by Maryn McKenna using maps by the CDC; originals available here.

The point here is not that people are careless or unthinking; in the early years of West Nile, two of the victims were the husband of the CDC’s then director, and the chief of its mosquito-borne diseases division, who would have been well aware of the risks. (Both recovered fully.) The point is that always behaving in a manner that protects you from a mosquito bite—conscientiously, persistently, faultlessly emptying pots and puddles, putting on long sleeves and repellent, choosing when not to go outdoors—is very difficult to maintain.

Zika is not West Nile. Among other things, Zika is spread by far fewer species of mosquitoes — one or possibly two, compared with 65 for West Nile. And West Nile’s non-human hosts, birds, live in closer proximity to more of us than Zika’s, which appear to be non-human primates. But though the rare, deadly complications of West Nile virus infection are different from those of Zika, they are just as serious and life-altering — and yet we failed to protect ourselves from them. As Zika spreads, we can hope that is a lesson we learn in time.


Polio Eradication: Is 2016 The Year?

A polio victim crawls on a sidewalk in India.
Photograph by Wen-Yai King, Flickr (CC).

As Yogi Berra (or Niels Bohr or Samuel Goldwyn) is supposed to have said, it’s difficult to make predictions, especially about the future. It’s especially dangerous to try to predict the behavior of infectious diseases, when small unpredictabilities in climate or trade or the behavior of governments can bring a problem that we thought was handled roaring back to life.

But as 2016 opens, it is fair to say that the disease public health experts are pinning their hopes on, the one that might truly be handled this year, is polio. There were fewer cases last year than ever in history: 70 wild-type cases, and 26 cases caused by mutation in the weakened virus that makes up one of the vaccines, compared with 341 wild-type infections and 51 vaccine-derived ones the year before. Moreover, those wild-type infections were in just two countries, Afghanistan and Pakistan, and the vaccine-derived cases were in five. The noose is tightening.

The most that health authorities can hope for this year is to end transmission of polio. The ultimate goal is eradication, which has happened only twice—for one human disease, smallpox, and one animal one, rinderpest. To declare a disease eradicated requires that the entire world go three years without a case being recorded. If there are no polio cases in 2016, eradication might be achieved by the end of 2018.

Which would make for nice round numbers, because the polio eradication campaign began in 1988. It is safe to say that no one expected it would take anywhere near this long; the smallpox eradication campaign, which inspired the polio effort, reached its goal in 15 years.

Smallpox was declared eradicated in 1980, so long ago that most people have no knowledge of how devastating a disease it was, or even what a case of the disease looked like. (There are survivors left, but they are aging; the last person infected in the wild, Ali Maow Maalin of Somalia, died in 2013.) In the same way, we’ve forgotten how difficult it is to conduct an eradication campaign. Smallpox was the first campaign that succeeded, but it was the fifth one that global authorities attempted. In its success, it demonstrated what any future campaign would need: not just a vaccine that civilians could administer, but an easy-to-access lab network, granular surveillance, political support, huge numbers of volunteers, and lots and lots of money.

In its own trudge to the finish, the polio eradication campaign has stumbled over many of those, from local corruption to extremist opposition to the still almost unbelievable interference of the CIA (which I covered here and here), along with the virus’s own protean ability to cross borders (to China) and oceans (to Brazil).

But now, at last, the end does look in sight. I asked Carol Pandak, director of the Polio Plus program at Rotary International — which since 1988 has lent millions of volunteers and more than a billion dollars to the eradication campaign —  how she thinks the next 12 months will go.

“We are getting closer,” she told me. “We have only two endemic countries left. Of the three types of the virus, type 2 was certified eradicated in September, and there have been no type 3 cases globally for three years. And Pakistan and Afghanistan have goals to interrupt transmission internally in May 2016.”

The diminishment of wild polio paradoxically creates greater vulnerability to vaccine-derived polio, which happens when the weakened live virus used in the oral vaccine mutates back to the virulence of the wild type. The only means of defusing that threat is to deploy the killed-virus injectable vaccine, which is widely used in the West but until recently was considered too expensive and complex to deliver in the global south.

To begin the transition, Pandak said, countries that still use the oral vaccine have agreed to give one dose of the injectable as part of routine childhood immunizations for other diseases. That should strengthen children’s immune reactions to polio, so that the reversion to wild type — which occurs as the weakened virus replicates in the gut — does not take place.

In the smallpox campaign, when eradicators thought they were almost done, there was a freak weather event—the worst floods that Bangladesh had experienced in 50 years—that triggered an internal migration and redistributed the disease. Polio is just as vulnerable to last-minute disruptions, especially since the two remaining endemic countries are hotspots of unpredictability. Travelers from Pakistan actually carried polio into Afghanistan in August.

“In Pakistan, the army has committed to providing protection for vaccinators in conflict areas,” Pandak told me, “and another strategy that has been successful has been to set up border posts to immunize people as they are fleeing areas of conflict and military operations. I have seen Rotary volunteers staffing 24/7 kiosks in train stations and toll booths, so that we can get people wherever they happen to be.”

There is no question that hurdles remain. By the World Health Organization’s order, polio is still considered a “public health emergency of international concern,” which requires countries where the disease is extant to either ensure their citizens are vaccinated before leaving, or prevent their crossing the border. And polio still lives quiescently in lab freezers all over the world, and those will have to be searched and their contents eliminated lest a lab accident bring the disease alive again (a warning that was recently circulated for rinderpest as well). Plus, until now, the injectable vaccine has been made by starting with a virus that is not only live but virulent, posing the risk that a lab accident could release it; British scientists announced on New Year’s Eve that they may have found a way to weaken it while still yielding a potent vaccine.

When it goes, if it does, polio will gift the world not only with its absence, but also with the abundant health infrastructure that was set up to contain and eliminate it, and can be turned to other uses. When I talked to Pandak, she sounded excited at the possibility that countries and volunteers would be able to turn their attention away from a single disease and toward ensuring the overall health of children.

“We have been doing this for 30 years,” she said. “We’ll continue to fundraise, advocate and raise awareness to the last case. We are committed to seeing this to the end.”



MRSA In Sports: Long-Standing, Simple to Prevent, Still Happening

Big news in sports the past few days: Daniel Fells, tight end for the New York Giants, is battling a MRSA infection so severe that he has been hospitalized in isolation and had multiple surgeries. Some news stories have speculated doctors may amputate his foot in an attempt to corral the infection.

It’s a tragic situation for the player, and no doubt frightening for the team, which is reported to have sought medical advice and scrubbed down their locker rooms to prevent any additional cases.

What it’s not, unfortunately, is new. MRSA—the acronym for methicillin-resistant Staphylococcus aureus, staph bacteria that are resistant to multiple classes of antibiotics—has been dogging sports teams for more than 20 years. And for at least 10 of those years, we’ve known what to do to prevent it. But it’s not at all clear that teams treat that prevention as a routine thing they should be doing—and because of that, every athlete’s infection seems like a random tragedy, instead of an avoidable mistake.

Among the long litany of MRSA cases in athletes, some have been high-profile: Lawrence Tynes, who is suing the Tampa Bay Buccaneers over a career-ending infection (two of his teammates were infected as well); Brandon Noble of the Washington Redskins, who lost his pro career over a knee infection (six of his teammates developed infections too); Kenny George of the University of North Carolina-Asheville, who had part of his foot amputated.

But the list of those known to have been affected (and this is certainly not complete) is much longer. Some other names: Kellen Winslow (and five teammates) of the Cleveland Browns, Peyton Manning, Drew Gooden, Mike Gansey, Sammy Sosa, Alex Rios, Paul Pierce, Kenyon Martin, Braylon Edwards, and Grant Hill. And, in addition, the St. Louis Rams, the USC Trojans, and dozens of college and high school teams going back to 1993.

MRSA infections seem like they sweep in out of nowhere, especially the apocalyptically bad ones (such as MRSA pneumonia, which can kill a child in days). But in fact, some MRSA cases are very predictable. They are more likely to occur in what the Centers for Disease Control and Prevention call the “5 C’s”: places where there is crowding, skin-to-skin contact, compromised skin from cuts or abrasions, contaminated items and surfaces, and lack of cleanliness.

Add all those together, and you have a pretty good description of a football field, and a locker room after a game.

MRSA is simple to catch: The bacterium lives on the surface of our skin, and in our nostrils and other warm, damp body crevices, and causes an infection when the skin is breached and the bacteria slip into tissue or the bloodstream. In hospitals, where MRSA first became a problem in the 1960s, that breach could come from surgery, or an incision made to allow for a catheter or an IV. But in the everyday world, where MRSA has been a problem since the mid-1990s, the source is more likely to be a cut or a scrape—in the kitchen, in the outdoors, or, in sports, from a razor, training equipment, artificial turf, a wrestling mat, or pads or straps cutting into a shoulder or a shin. (And sometimes, nothing at all. Toxins manufactured by the bacterium can break down the skin, causing the hot pinpoint infections that people often mistake for spider bites.)

Fells is supposed to have been infected at some point in the past few weeks, after a toe and ankle injury and a cortisone shot to the ankle. I don’t have inside intel on his treatment, or on what the Giants do in their locker rooms. But I know what teams that had MRSA problems in the past did to shut their outbreaks down. It wasn’t complicated—but it required commitment and attention, and it took a while.

Between 2002 and 2006, the Trojans, the Rams, and the Redskins were all so spooked by epidemics among their players that they asked the CDC and local health departments for help. (The stories of the outbreaks are told in my last book, Superbug.) They learned that stopping the infections and protecting their players took many steps: requiring everyone to shower post-game; scouring the hydrotherapy tubs; disinfecting the training equipment and massage tables; discouraging body shaving, even though it makes taping up—and untaping—a lot less uncomfortable; raising the water temperature in the laundry machines; and making sure no one shared bars of soap in the shower or towels on the field.

After Noble’s injury, the Redskins ripped out their entire training facility and installed a new one, spraying germ-killing coatings on the lockers and discarding the shared benches for individual stools. The teams practiced these steps over and over, chastising and sometimes fining players who didn’t bother, and shut their outbreaks down.

MRSA is also a serious problem for school teams; in fact, it was school outbreaks—in a Vermont high school in 1993, a Pennsylvania college in 2000, a Connecticut university in 2003, and throughout Texas high schools for several years in a row—that first alerted researchers that athletes might be at special risk. When I was writing Superbug, I spent a lot of time with trainers and coaches, and it was striking how open they were about the problem. Whether because of affection for their students, responsibility to parents, or fear of lawsuits, athletic programs all over the US were educating kids and staffs about the danger, and teaching them how to protect themselves.

Pro teams, which clamp down on information about players’ injuries as competitive intelligence, mostly don’t talk about their MRSA plans. But it’s not clear they are training and protecting as comprehensively as schools do. A year ago, the Washington Post took a look back at Brandon Noble’s career-ending infection, and reported that MRSA prevention is not uniform across NFL teams. This season, Duke University’s Infection Control Outreach Network Program for Infection Prevention in the NFL, known for short as DICON, began working with the NFL Players Association to distribute a manual on infection prevention to all 32 teams and to train their personnel. That the teams agreed to participate is a big step—but that the program was needed suggests how vulnerable some players still are. Until MRSA prevention becomes routine in locker rooms, other players may end up as ill as Fells now is.


Noah (and his ark) Updated, Improved for Our Time

Instead of the Noah you know, the one who built the ark, sheltered all those animals, sailed for 40 days and 40 nights and got to see God’s rainbow, instead of him, I want you to meet a new one. An updated version.

This Noah shows up in a tough little essay written by Amy Leach, of Bozeman, Montana, who knows her science, knows there’s a flood coming—a flood of humans, seven billion and counting, already swamping the Earth, crowding the land, emptying the sea, and her more modern Noah—informed, practical, not inclined to miracles—has a different plan. He announces,

[Watercolor illustration with text reading: “Unfortunately, animals. We are not going to be able to bring all of you with us this time.”]
Illustration by Robert Krulwich

The old Noah, you may remember, squeezed eight humans (wife, kids, their spouses) and at least two of every critter, big and small, onto his crowded ship. But the new Noah, being more practical, feels he can winnow a little. “Everybody” is a lot of animals, more than you know. Back in the day, Amy Leach writes,

[Watercolor illustration of two frogs peeking over text about what it would be like to bring two of every creature onto Noah’s ark.]
Illustration by Robert Krulwich

And honestly (I’m thinking to myself), if the world lost a scorpion or two, would anyone notice? Or want them back? And blotchy toads, biting little flies—some animals are hard to keep going on a tight, crowded ship. On the last voyage, dormitory assignments were beyond difficult.

And all those supplies? Amy Leach writes how the first Noah would have had …

[Watercolor illustration with text about collecting food for the animals.]
Illustration by Robert Krulwich

This doesn’t mean we don’t care, new Noah says to the animals. We definitely, absolutely want to bring a bunch of you with us. But, we’ve got to be practical.

Even if our ark has grown to the size of a planet, carrying everybody through is not going to be logistically possible, which is why, he says,

[Watercolor illustration with text about being in charge of a future Noah’s ark where not all animals are included.]
Illustration by Robert Krulwich

And anyway, that first Noah? He lived in a different age, a time they call the Holocene, before humans began to dominate and crowd out the other species. Back then, there weren’t as many people. And there were more kinds of animals, closer by, hiding in the woods, clucking in the yard, so the world was more various then, more intimate, more riotous, and thinking about it (a little wistfully, if only for a moment), the new Noah quietly recalls that on that first ark …

[Watercolor illustration with text about how Noah’s ark would be different today than it was in the Old Testament.]
Illustration by Robert Krulwich

And now, animals, it’s time for many of you to step away. You’ve had your unruly eons. They were wild, unplanned, noisy, great fun. Natural selection ran the world. Crazy things happened. Those were good times, Amy’s essay concludes …

[Watercolor illustration with text reading: “But the future belongs to us.”]
Illustration by Robert Krulwich

Amy Leach is a writer living in Bozeman. Her very short pieces—about jellyfish, beavers, salmon, plants that go topsy-turvy and stand on their heads—are collected in a wonderful little book called “Things That Are.” In this column I do to Amy what the new Noah is doing to our planet: I edited her down, sliced, diced, slimmed (lovingly, I hope), trying to give you a taste of her fierce, crazy prose. But like the planet, she’s wilder in the original, so I hope you go there and sample the unedited version.

It Came From Basic Cable

On Sunday, Discovery kicked off Shark Week with a hoax. Not content to just play up sharks as serial killers in the annual extravaganza of blood, hokey reenactments, and menacing fins, the basic cable channel did its best to fool viewers into believing that the 50-foot-long, hypercarnivorous shark Carcharocles megalodon – known only from a fossil record that fizzles out in the 4-2 million year range – is still gobbling whales and prowling the modern seas. The stunt irritated viewers, who quickly saw through the bad CGI and manufactured drama of the show, and the response was sharp enough that Michael Sorensen, executive producer of Shark Week, defended the show with a feeble bite back at critics.

Discovery built its reputation with science programming. Shark Week was always a high point. There was no part of the summer I looked forward to more. As a kid, I was hooked by shows that gave shark experts such as Eugenie Clark, John McCosker, and Samuel Gruber full attention as the researchers rhapsodized about the selachian subjects of their scientific fascination. I was so addicted that if I was going to be away from the VCR while Shark Week was on, I’d beg my friends to tape as many programs as they could stand so I could catch up on the shark marathon when I got home. But now Discovery is a joke, and the megalodon fiasco is only a confirmation of what has been clear for some time.

Frustrating as the program was, the megalodon fauxumentary didn’t come as a surprise. Discovery Communications had previously netted huge ratings with similar chicanery. Discovery didn’t “sink its credibility” with the prehistoric shark. The network family’s credibility was already long gone.

Last year, Animal Planet – a channel owned by Discovery Communications – spent a mind-numbing two hours trying to convince viewers that ichthyosapiens is real with the fictional Mermaids: The Body Found. Animal Planet doubled down this year with a re-airing of the original and a noxious supplement titled Mermaids: The New Evidence. These shows carried only the barest of disclaimers that they were fiction rather than fact – I still get occasional emails and comments from those who believe mermaids are among us – and the shows broke audience records. Animal Planet representative Marjorie Kaplan said she and her associates are “thinking big” about how to follow the success. Megalodon: The Monster Shark Lives follows in the wake of Mermaids and may be a look at the future of “science” television. That is, if reality show nonsense doesn’t totally overwhelm formerly non-fiction channels first.

But documentary deceitfulness runs deeper than outright hoaxes, and is really nothing new. Most infamously, the creators of the 1958 Disney nature film White Wilderness used a turntable to launch lemmings off a cliff to make the rodents appear as if they were committing mass suicide. The program won an Academy Award for Documentary Feature. Nor are fraud and sleight of hand issues of the past.

A wolverine at the Kristiansand Zoo, Norway. Despite no authenticated case of a wolverine killing a human, the carnivore is presented as a bloodthirsty monster in ‘Yukon Men.’ Photo by Birgit Fostervold, distributed under a Creative Commons Attribution 2.0 Generic license.

Seeing is believing. It shouldn’t be. As passive viewers, we’re quite easy to fool, and different sorts of tricks are regularly used to present a facsimile of nature rather than the reality. In his book Shooting the Wild, for example, documentarian Chris Palmer explains that filmmakers Carol and Richard Foster capture animals that they then film on carefully constructed sets. To film vampire bats lapping human blood, Palmer recounts, the Fosters created an artificial cave for the bats and a mock-up camp nearby where a volunteer pretended to be asleep while the bats scurried over him and eventually tucked in for a liquid lunch. (Wisely, the actor had been given a rabies vaccine beforehand.) Palmer notes that the Fosters are honest about such methods, but the networks that show their programs may or may not make such artificial setups clear to viewers.

The techniques of Wild America host Marty Stouffer were not so transparent. In 1996, after a damning article in the Denver Post charged that Stouffer’s “wild” vignettes were really filmed in enclosures and sometimes used captive animals, his network, PBS, ran an investigation of the show. The investigation, Palmer notes in Shooting the Wild, found that “a significant percentage” of Wild America episodes had some staged component and even included unethical behavior, such as filming wolves chasing and killing a deer within an enclosure the deer could not escape. Stouffer denies these charges and has moved on to other projects, but his reputation among other professionals was badly tarnished.

Even David Attenborough, the most beloved natural history host of all time, has presented a faked scene that crossed the ethical line. For Life in Cold Blood, a captive cobra was placed upon a rock and agitated into spitting at the celebrity naturalist. Marine biologist Andrew Thaler has recounted similar deceptions on the Animal Planet shows Call of the Wildman and Gator Boys. Captive animals aren’t always available for a close-up, though, so, as Palmer also documents, some filmmakers are not above goading or harassing wild animals to get more dramatic, fierce reactions from their stars.

Interviewers aren’t above manipulation, either. When a crew from the History channel series MonsterQuest interviewed paleontologist Donald Prothero for a show about an Apatosaurus-like dinosaur that supposedly lives in the Congo Basin, they handed the scientist a plaster cast of what was meant to be – but clearly wasn’t – a footprint of the animal. Prothero rightly rejected the evidence, but the crew tried over and over again to film a “gotcha” moment of Prothero thrown off guard. That never happened. And when the show eventually aired, the program failed to mention that their star explorers were young earth creationists who were pursuing the dinosaur because they mistakenly believed such a discovery would discredit evolutionary theory. The utterly disreputable H2 show Ancient Aliens pulled a similar scam on viewers in their episode on dinosaurs, tapping religious fundamentalists as science experts.

Editing and presentation can create fiction from reality, too. Adam Welz has rightly observed that editors can still remix stock footage with scary music to spawn hyperbolic, unrealistic visions of animals. Shark Week itself is a perfect example of this ubiquitous trend. Discovery relies on blood in the water to bring in viewers, and any education the audience receives is an aside. One of this year’s new programs was called Great White Serial Killer, for crying out loud, and shark experts rightly scoffed at the overplayed bloodthirstiness of the show on Twitter.

Of course, there are reputable and ethical cinematographers who are disturbed by such trends. (See Shooting the Wild for a fuller account.) But even documentarians who operate ethically may create artificial sets or use computer generated imagery to show viewers something that would be impossible to capture otherwise. Whether this counts as deceit or not rests on how the behind-the-scenes process of filmmaking is accounted for and disclosed to audiences. The fact is, shows like Mermaids and Megalodon are extensions of trends that have been in place for years.

Documentary creators use sets, careful edits, and even computer-generated effects to get the scenes they want. What you’re seeing on screen may be a facsimile of nature rather than something captured without human interference. This should go without saying, but don’t believe everything you see on TV.

Megalodon: The Monster Shark Lives gave science communicators like me an easy target. You really can’t miss a 50-foot shark. “You may fire when you are ready, Gridley.” If the program had clearly been labeled as fiction, I wouldn’t have much reason to get in a snit, but the show was clearly presented to capitalize on Discovery’s reputation as a non-fiction network and thereby dupe viewers. But in all this outrage, we shouldn’t forget that hoaxes, fraud, deception, and salacious depiction have existed for as long as there have been wildlife films. I don’t expect that to change anytime soon, especially since the most popular shows are the ones that are unapologetic fakes. Audiences bear some responsibility for making these monsters, too.

I’ve been heartened to see that objections to Discovery’s hoax have fleetingly catapulted some of my fellow science communicators into the media spotlight. Blogs, Twitter, and other forms of media, as Steven Silberman once observed, can act as a rapid-response immune system to nonsense, and quickly-executed takedowns may even change the public narrative. But I have absolutely no doubt that we’re going to go through all this again by next Shark Week, if not before. That’s why we need to keep talking. We may never be able to stop irresponsible ratings bait from airing, but we can try to co-opt the popularity of hyperbolic shows to maybe, just maybe, speak some science among the sensationalism.

[Note: National Geographic Wild is running a competing marathon called Sharkfest this week. I have not seen any of the episodes, but some of the show titles and clips – “California Killer”, “Florida Frenzy”, “Panic in Paradise”, etc. – seem to capitalize on the same hyperbole that Shark Week has relied on for so long. Everyone is chumming for viewers.]

Evolution is Wonderful

I’ll never forget the first time I saw the Milky Way. On a warm late August night in 2009, my wife and I stretched out on a campground table at Dinosaur National Monument, Utah, to watch the cloudy stretch of our home galaxy arc across the night sky. I had never been in a place dark enough to see the stellar display. I had lived in central New Jersey my entire life, where light pollution blocked out all but the very brightest stars. But here, far from the suburban sprawl I was accustomed to, I could giddily gaze at a simple circumstance of the universe we live in and wonder about all that starlight.

I had come to the national park for the fossils. Dinosaur fanatic that I am, I couldn’t set foot in Utah without taking a direct route to one of the most glorious Jurassic bonebeds of all time, where a chaotic jumble of giant bones conjures up visions of life and death 150 million years ago. The quarry wall was closed for repairs, though, so I happily settled for watching a Brigham Young University excavation of a geologically younger long-necked herbivore that would later be named Abydosaurus.

Such magnificent, long-lost creatures kept stomping through my imagination as I stared at the Milky Way. I’ve never been drawn into astronomy or physics, but I recalled that even light takes time to travel. There was no way to be sure, but maybe some of the ancient light I was looking at originally left its incomprehensibly distant stars when Abydosaurus and the monument’s other dinosaurs still walked the Earth. Seeing the illuminated points scattered over the park’s gorgeously exposed geologic formations – the rocks little more than inky outlines in the dark – I felt like a time traveler standing between Earth and sky. There are few moments in my life when I have been as overtaken by sheer wonder and joy at the universe we live in.

The first time I visited Dinosaur National Monument, I woke up to see the sun hit this Permian formation across the Green River. Photo by Brian Switek.

Yet, despite how enraptured I felt by Deep Time, the horror novelist Stephen King thinks that I was missing out on the true wonder of existence. That’s because I’m an atheist, and, on NPR’s Fresh Air, King delivered this condescending quote about those who don’t see divinity in nature:

If you say, ‘Well, OK, I don’t believe in God. There’s no evidence of God,’ then you’re missing the stars in the sky and you’re missing the sunrises and sunsets and you’re missing the fact that bees pollinate all these crops and keep us alive and the way that everything seems to work together. Everything is sort of built in a way that to me suggests intelligent design.

I really don’t care about Stephen King’s views on the existence or non-existence of deities. That’s very, very far down on my list of issues worth worrying about. But King’s quote represents a snobbish and pervasive belief that those who see no evidence of gods are somehow impoverished in their lives. Creationists have been peddling this arrogant argument for quite some time – that without a god, the universe is purposeless and we are trapped in a nihilistic march towards oblivion.

I don’t feel that lack of hope or fascination. I’m not crippled by the sense of emptiness King and others presume I must feel.

We live in an indifferent universe. There is no destiny or plan, and Nature was not created for our benefit. Yet we’re still here. Our lineage goes back billions of years to the last common ancestor of all life on Earth, giving us traits in common with every single living organism, and our ancestors have been fortunate enough to persist through the five worst global catastrophes of all time. At so many points in the past – whether minor in scale or as devastating as an asteroid striking the planet – history could have turned out quite differently, creating circumstances that would have prevented our evolution. We’ll never know all those alternatives. All we know is what has actually transpired.

Bones in the Jurassic quarry wall at Dinosaur National Monument. Photo by Brian Switek.

To repeat a line from my book Written in Stone, we are creatures of time and chance. How wonderful is that? Out of all the innumerable possibilities in the history of life on Earth, a string of circumstances billions of years long transpired in such a way as to allow the origin of our species (and that also accounts for the loss of all our human relatives along the way). And this unintended state of nature makes a humble bee pollinating a flower, a sunrise, the division of a cell, the jagged outline of a mountain in twilight, the petrified record of the dinosaurs, and everything else in existence all the more spectacular. (Paleontology and natural history are what I love most; we all admire different aspects of nature.) None of that was ordained to exist, and yet evolution and other ongoing natural processes have nonetheless generated phenomena which are not only beautiful, but comprehensible to us.

There is no need for the supernatural to invoke or appreciate wonder. And rather than reducing nature to equations and graphs, I truly believe that science – our ability to actually understand why bees pollinate flowers, why mountains rise, and how remnants of ancient life became locked in stone – makes the world all the more exquisite by not only giving us clues, but new questions to ask.

The closing paragraphs of Charles Darwin’s On the Origin of Species by Means of Natural Selection are some of the most reprinted words in all of science. So much so that they’ve become a little worn and clichéd when plopped down into seemingly every book about evolution in existence. But no matter how many times you’ve read the lines, take a breath and really read Darwin’s conclusion over again:

It is interesting to contemplate an entangled bank, clothed with many plants of many kinds, with birds singing on the bushes, with various insects flitting about, and with worms crawling through the damp earth, and to reflect that these elaborately constructed forms, so different from each other, and dependent on each other in so complex a manner, have all been produced by laws acting around us. These laws, taken in the largest sense, being Growth with Reproduction; Inheritance which is almost implied by reproduction; Variability from the indirect and direct action of the external conditions of life, and from use and disuse; a Ratio of Increase so high as to lead to a Struggle for Life, and as a consequence to Natural Selection, entailing Divergence of Character and the Extinction of less-improved forms. Thus, from the war of nature, from famine and death, the most exalted object which we are capable of conceiving, namely, the production of the higher animals, directly follows. There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

Even as the specter of death hung over his “entangled bank”, Darwin was still in exuberant awe over such a simple natural process that could account for so much of what we find beautiful about life. Understanding the origin of such diverse and disparate organisms only makes our world feel more magnificent. I dare Stephen King to write a more beautiful tribute to nature.


Jonah Lehrer, Scientists, and the Nature of Truth

Last week the journalism world was buzzing about — guess who? — Jonah Lehrer. Yes, again. We knew about the science writer’s self-plagiarism and Bob-Dylan-quote fabrication. Last week a New York Magazine exposé by Boris Kachka claimed that Lehrer also deliberately misrepresented other people’s ideas.

Kachka’s piece led to some fascinating discussions about whether it’s possible to tell a science story that’s both riveting and fully accurate. Science journalist Carl Zimmer, for example, wrote a thoughtful, inspiring post about the messiness of science. All of the commentary left me wanting to hear more details from the scientists in Lehrer’s stories. Had they been misrepresented? If so, how? Were they upset? Did they complain?

Kachka and Zimmer zeroed in on a 2010 story about the scientific method that Lehrer wrote for the New Yorker. The story’s premise is clear from the title (“The Truth Wears Off”), the subtitle (“Is there something wrong with the scientific method?”), the nut graf (“It’s as if our facts were losing their truth: claims that have been enshrined in textbooks are suddenly unprovable.”), and the last few lines (“Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.”).


Book Review: The Time Cure

Most scientists are reluctant to talk about “curing” mental illness, and rightly so. The mountain is too steep: These disorders have a range of genetic and environmental causes, and symptoms vary widely from person to person. But for post-traumatic stress disorder (PTSD) — in which people are haunted for months or years by memories of a life-threatening event — that framework is all wrong.

So says The Time Cure, a book out later this month claiming that people with PTSD can find long-lasting relief by simply re-framing their concept of time. The authors outline a new clinical approach, dubbed Time Perspective Therapy or TPT, which they say is far more effective than any other treatment.

The book includes a lot of common-sense advice: Focus on good rather than unpleasant memories, find enjoyable hobbies, fraternize with a supportive community, set realistic goals. Following these simple directives would no doubt help many people, sick or not, improve their lives. Still, given the millions of people who suffer from PTSD, heralding a cure seems an act of hubris — especially when the evidence is limited to a small (and not peer-reviewed) clinical trial and more than 100 pages of poignant personal stories.


Top 3 Reasons to Stop Fretting About Being an Old Dad

You probably heard about last week’s Nature study on older dads and autism; it got a lot of attention. The basic findings were fascinating but, in my opinion, far less sensational than most of the news articles would have us believe.

The researchers, led by Kári Stefánsson of deCODE Genetics in Iceland, showed that the average 20-year-old man passes on about 25 new single-letter DNA mutations to his child. (These kinds of mutations happen spontaneously in sperm cells, so they don’t affect the DNA in the father’s other cells.) With each passing year of age, the man’s sperm acquires two more mutations. This makes sense, biologically. Primordial sperm cells divide over and over throughout a man’s life. To use an over-used metaphor: Each time the code gets copied, it creates an opportunity for a spelling mistake. Primordial eggs, in contrast, go through far fewer divisions. Women, no matter what their age, pass on about 14 mutations to each child, the study found.

The researchers also showed, using demographic data of Icelanders going back to 1650, that the average age of fathers has recently shot up, from 27.9 years in 1980 to 33 in 2011. Based on their calculations, that means the average number of mutations passed on to each kid (from mother and father combined) went from 59.7 to 69.9.
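The rounded figures quoted above are enough to sanity-check that jump: 5.1 extra years of average paternal age, at roughly two mutations per year, adds about 10.2 mutations, which matches 69.9 minus 59.7. Here is a minimal back-of-envelope sketch of that arithmetic (the constants are the rounded values quoted in this post, not the study’s exact estimates, so the absolute totals come out a bit lower than 59.7 and 69.9 even though the increase matches):

```python
# Back-of-envelope model built from the rounded figures quoted above:
# a 20-year-old father passes on ~25 new mutations, plus ~2 per
# additional year of age; a mother contributes ~14 regardless of age.

def paternal_mutations(age, base=25, rate=2, base_age=20):
    """Approximate new mutations inherited from a father of a given age."""
    return base + rate * (age - base_age)

def total_mutations(father_age, maternal=14):
    """Approximate combined new mutations from both parents."""
    return paternal_mutations(father_age) + maternal

# Increase implied by the shift in average paternal age, 1980 -> 2011:
increase = total_mutations(33) - total_mutations(27.9)
print(round(increase, 1))  # ~10.2, matching 69.9 - 59.7
```

The point of the sketch is only that the reported rise in mutation count is almost entirely a linear consequence of the 5.1-year rise in paternal age, since the maternal contribution is modeled as constant.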

Here’s the sensational part. Stefánsson says, given that these mutations have been linked to autism, the increase in older fathers could partially explain why autism rates have risen over approximately the same time period. This is a plausible idea, sure, in theory. But there’s actually not much data to back it up (more on that later). And yet the assertion — reported in the New York Times, Wall Street Journal, Washington Post, and more than 250 other outlets (Slate’s XXfactor blog even ran a piece titled, “Dude, Bank Your Sperm. It’ll Get You Laid.”) — was enough to scare some potential fathers. As one of my friends tweeted, “great, my 34yr-old gonads may be ticking neuro-disorder timebombs.”

It’s an unsettling feeling, I’m sure. I’ve felt a similar panic about being an older mother (though for different reasons). But honestly, men, of all the things to spend time worrying about, this study is not one of them. Here’s why.