The Unlikely Story of the Map That Helped Create Our Nation

American statesman John Jay used this map in negotiating the Treaty of Paris, which established the United States as an independent country.
Collection of the New-York Historical Society

It’s arguably the most important map in our country’s history. After the Revolutionary War, British and American representatives met in Paris to negotiate the boundaries of a new nation: the United States of America. Both sides had a version of the same map, marked up to indicate where they thought the lines should be drawn.

“The diplomats literally debated the boundaries of the future United States while pointing at this map,” says Matthew Edney, a historian of cartography at the University of Southern Maine. Although the 1783 Treaty of Paris contains no maps or illustrations, its written descriptions of boundaries are based on the marked-up maps of the negotiators, Edney says.

The map used as a starting point by both sides was created by a Virginia-born doctor named John Mitchell and published for the first time in 1755. Mitchell’s map is well-known among historians and map librarians, but less so among the general public. That’s too bad because it has a fascinating story.

"Mr. Oswald's Line" marked in red on John Jay's copy of Mitchell's map is the boundary proposed by the British negotiator Richard Oswald.
“Mr. Oswald’s Line” written in red (at top right) on John Jay’s copy of Mitchell’s map indicates the boundary proposed by the British negotiator Richard Oswald.
Collection of the New-York Historical Society

If Mitchell had lived long enough to see how his map came to be used, he might have been appalled. Although the map played a role in loosening Britain’s imperial grip on North America, its original purpose was just the opposite.

John Mitchell was born in 1711 to a family of relatively wealthy tobacco farmers. He studied medicine abroad, at the University of Edinburgh, then returned to his native Virginia to practice. He and his wife lived near the western shore of the Chesapeake Bay, which would have been a swampy, steamy, mosquito-ridden place in those days, Edney says. The couple fell into poor health, and in 1746 they retreated to the cooler climes of Britain to recuperate.

In London, Mitchell began mingling in high society, apparently aided by his knowledge of botany, a hot topic among learned men of his day. “He’s hobnobbing with aristocrats who were really into gardening,” Edney says. “And he comes into contact with a lot of politicians who were also demon gardeners.”

The red boundary line on John Jay’s map (above) claims all of Lakes Michigan, Huron, Erie, and Ontario for the United States.
Collection of the New-York Historical Society
This version of Mitchell’s map wasn’t used in Paris, but Richard Oswald’s boundary lines have been copied onto it. Oswald’s line through the Great Lakes is similar to the modern US-Canada border.
Osher Map Library

One introduction led to another, and eventually the Earl of Halifax took notice of Mitchell. Halifax presided over the Board of Trade and Plantations, which oversaw colonial affairs. At the time Halifax was trying to rally the British government to defend the North American territories against incursions by the French. Halifax saw Mitchell as a native expert on North America and commissioned him to make a map to help his cause.

Mitchell had no formal training in cartography or geography, and there’s nothing to suggest he had any previous interest in those topics, Edney says. Yet he created what may well have been the best map of North America available in the late 18th century, drawing upon the Board’s archives in London, as well as surveys and maps Halifax ordered from the colonial governors.

Mitchell’s map took a decidedly British view of who owned what on the continent. His boundary lines, and small notes he scattered across the map, favored British claims over those made by the Spanish and French.

In Florida, for example, Mitchell drew a southern boundary line well inside the territory claimed by Spain. In Alabama, there’s a small note that reads “A Spanish fort built in 1719 & said to be soon after abandoned,” an apparent effort to diminish any Spanish claims to the land.

Mitchell’s annotations, including this note about the site of an abandoned Spanish fort, were designed to support British territorial claims.
Osher Map Library

Others began making derivatives of Mitchell’s map that were even more politically pointed. “In the 1750s there was a whole series of what’s called Anti-Gallican societies basically saying ‘We need to boycott the French,’” Edney says. The map below was made by one of these groups. It cedes even less land to the French than Mitchell’s map does, and it highlights French forts built too close for comfort around British territory.

The title of that map is rife with pompous indignation:

A New and accurate Map of the English Empire in North America Representing their Rightful claim as confirmed by Charters and the formal Surrender of their Indian Friends, Likewise the Encroachments of the French with several Forts they have unjustly erected therein.

The Anti-Gallican maps based on Mitchell’s map helped stir up anti-French sentiment in Britain, Edney says. “In that sense, Mitchell’s map was really crucial,” he says. “It’s one of the very first maps we can actually document as having a political impact.”

That impact was huge: By swaying public (and political) opinion toward standing up to the French instead of appeasing them, the map helped precipitate the French and Indian War. That war, in turn, created the conditions that led to the Declaration of Independence 240 years ago today. Fighting the French blew Britain’s budget, prompting King George III to squeeze the colonies even harder to pay his debts. Taxation without representation, tea in the harbor, you know the rest.

This 1755 map from an Anti-Gallican society shows French encroachments (highlighted in white) near British territories.
Osher Map Library

Many copies of Mitchell’s map have survived, but only three copies marked up at the Treaty of Paris are known to exist. John Jay’s map (several details from which appear above) is held by the New-York Historical Society. Richard Oswald’s map was given to King George III and now belongs to the British Library. A French copy of Mitchell’s map was used by the Spanish ambassador at the treaty negotiations; it now resides at the National Historical Archive in Madrid. Unfortunately, none of these maps is freely available online.

The influence of Mitchell’s map didn’t stop with the Treaty of Paris. In the 1890s, it was used in negotiations between Canada and the United States over fishing rights in the Gulf of Maine, and it has come into play in legal disputes between eastern US states as recently as 1932.

In a detailed historical essay (the source for much of this post), Edney calls Mitchell’s map “an irony of empire.” Instead of helping to solidify British control of North America as intended, the map helped set in motion the events that led to the Revolutionary War, and later helped determine the boundaries of an independent United States, a devastating blow to British imperial aspirations on the continent.

–Greg Miller

Many thanks to Ed Redmond at the Library of Congress for suggesting this topic.

The 19th Century Doctor Who Mapped His Hallucinations

Hubert Airy’s 1870 diagram of his migraine aura looks familiar to many migraineurs today.
The Royal Society

Hubert Airy first became aware of his affliction in the fall of 1854, when he noticed a small blind spot interfering with his ability to read. “At first it looked just like the spot which you see after having looked at the sun or some bright object,” he later wrote. But the blind spot was growing, its edges taking on a zigzag shape that reminded Airy of the bastions of a fortified medieval town. Only, they were gorgeously colored. And they were moving.

“All the interior of the fortification, so to speak, was boiling and rolling about in a most wonderful manner as if it was some thick liquid all alive,” Airy wrote. What happened next was less wonderful: a splitting headache, what we now call a migraine.

Hubert Airy’s drawing, shown here in its entirety, illustrates how his migraine aura grew over the course of about 20 minutes (click the image to expand).
The Royal Society

Airy was a student when he suffered his first migraine, but he later became a physician. His description of his aura—the hallucinatory symptoms that can precede a migraine—was published in the Philosophical Transactions of the Royal Society in 1870, along with a drawing that showed how the hallucination grew to take over much of his visual field. “It’s an iconic illustration,” says Frederick Lepore, an ophthalmological neurologist at Rutgers Robert Wood Johnson Medical School in New Jersey. “It’s so precise, like a series of time-lapse photographs.”

Lepore showed Airy’s drawing to 100 of his migraine patients who experience a visual aura (only a minority do). Forty-eight of them recognized it instantly, he wrote in a historical note in the Journal of Neuro-Ophthalmology in 2014. He still shows the drawing to his patients today. “People are astonished,” he says. “They say, ‘Where did you get that?’”

What’s more remarkable, Lepore says, is that Airy’s drawing anticipates discoveries in neuroscience that were still decades in the future.

Airy correctly deduced that the source of his hallucinations was his brain, not his eyes. He wasn’t the first to do this, but it was still an open question at the time.

What’s most prescient about his drawing, though, is that it anticipates the discovery of an orderly map of the visual world in the primary visual cortex, a crucial brain region for processing what we see. When Airy published his paper, that discovery was still nearly half a century away.

This diagram by Gordon Holmes illustrates how different regions of the visual field (right) map onto different regions of the primary visual cortex (left).
The Royal Society

Most accounts credit the British neurologist Gordon Holmes with that later discovery. Holmes studied the visual deficits of hundreds of soldiers who’d suffered gunshot wounds to the back of the head in World War I. “The British helmet was seated high on the head,” Lepore wrote in a historical paper describing Holmes’s contributions. Unfortunately, this left the primary visual cortex largely unprotected, and provided Holmes many opportunities to study damage to this part of the brain.

By carefully mapping the soldiers’ blind spots and the locations of their wounds, Holmes discovered that damage to the most posterior part of visual cortex (that is, the part farthest back in the head) resulted in blindness at the center of the visual field, whereas wounds located closer to the front of the visual cortex resulted in blindness off to the side. Everything the eyes see maps neatly onto the visual cortex.

Holmes also discovered—and this is the part that relates to Airy’s drawing—that the visual map is magnified at its center. If the visual cortex is a road atlas, the part that represents the center of the visual field is like one of those inset city maps that show a smaller area in lots more detail.

This meshes nicely with Airy’s observation that the zigzags around his blind spot were packed tightly together in the center of his visual field and grew wider in the periphery. “Airy’s drawing fits beautifully with our modern conception of how the visual cortex is organized,” Lepore says.
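Lepore’s point about central magnification can be put in rough numbers. One widely cited estimate for human V1 (an approximation from Horton and Hoyt’s 1991 work on striate cortex, not a figure from Airy’s or Lepore’s papers, so treat it as a modern gloss) gives the linear cortical magnification at an eccentricity of $e$ degrees as:

$$M(e) \approx \frac{17.3}{e + 0.75} \;\text{mm of cortex per degree}$$

At the center of gaze ($e = 0$) that works out to roughly 23 millimeters of cortex per degree of visual field; at 20 degrees out, it drops below 1 millimeter per degree. If the zigzags of an aura are spaced more or less evenly on the cortical sheet itself, this steep falloff is exactly why Airy drew them tightly packed near the center of his visual field and increasingly stretched out toward the periphery.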

Hubert Airy’s father, George, also saw zigzag hallucinations, but they didn’t precede a headache for the elder Airy.
The Royal Society

There’s still much we don’t know about migraines and migraine auras. One hypothesis is that a sort of electrical wave sweeps across the visual cortex, causing hallucinations that spread across the corresponding parts of the visual field. In a loosely descriptive way, Airy’s time-series drawings—showing an ever-expanding shape—jibe with this too.

Even less is known about the neural mechanisms that might produce the vivid colors Airy drew and described. There are areas of the visual cortex, including one called V4, that contain neurons that respond to specific colors, as well as other neurons that respond to lines of specific orientations. Perhaps an electrical wave passing through such areas could produce colored zigzags, Lepore says. But no one really knows.

Airy wasn’t the first to draw his migraine aura. In fact, his father, George, who happened to be the Astronomer Royal, had published a sketch of his own zigzag hallucinations five years earlier (see above). A German neurologist published a fairly crude, looping sketch back in 1845. And others did so afterwards. The drawings made by the French neurologist Joseph Babinski (see below) are especially colorful, if lacking in detail.

But Hubert Airy’s drawing has stood the test of time better than most. His paper in the Philosophical Transactions, published at age 31, was his only contribution to the field. It’s written in the somewhat pompous, somewhat conversational style of a 19th-century polymath relating his observations to other learned men. One lengthy section recounts the observations of a Swiss doctor in the original French. Naturally, the readers of such a prestigious journal could translate for themselves.

That Airy got so much right at a time when so little was known about the brain is a testament to his powers of observation, Lepore says. He documented what he saw meticulously, even though it was visible to himself alone.

This detail from Joseph Babinski’s 1890 drawing of his migraine aura shows a zigzag pattern not unlike the one Hubert Airy saw.
Wellcome Library

–Greg Miller

This 1916 Guide Shows What the First Road Trips Were Like

In the early 20th century, the Official Automobile Blue Book reassured drivers that road trips were safe and gave them turn-by-turn directions.
Photograph by NatPar Collection, Alamy

A cross-country road trip is a quintessentially American experience. From Jack Kerouac to the Griswold family, millions have loaded up the car and hit the open road. It’s always an adventure, but in modern times it’s a relatively tame one: The roads are paved, signs point the way, and Siri always has your back.

But a hundred years ago, traveling cross-country by automobile was intimidating, and more than a little dangerous. Cars were unreliable. Roads were rough, and with the Interstate Highway System still decades away, a bewildering array of potential routes connected any pair of distant points.

A crucial aid in those days was a series of guides called the Official Automobile Blue Book. Each thick volume covered hundreds of routes, giving detailed turn-by-turn directions that put Google to shame, pointing out landmarks like cemeteries, factories, and places where the road crossed trolley tracks.

The Blue Book guides and others like them were the predecessors to road maps and atlases, says John Bauer, a geographer at the University of Nebraska at Kearney, who published what may be the only academic study of the series. Bauer suspects the guides may have even influenced some of the routes chosen for the state and federal highway networks built in the subsequent decades.

The leather cover and gilt lettering were designed to give the Blue Book guides an authoritative look.
Prelinger Library / Photo by Greg Miller

I recently spent some time flipping through a 1916 volume at the Prelinger Library in San Francisco. It covered a huge swath of the country, from the Mississippi River to the Pacific Coast. With a leather cover and gilt lettering, the guide had the look and heft of a Bible. It included 1,286 individual routes—actually more, because some routes had lettered side routes. Route 528, for example, takes you from Fort Morgan, Colorado, to Denver, via Greeley, while route 528A takes you from Greeley up to Estes Park.

What struck me, in addition to the sheer number of routes and their complexity (the 115-mile route from Fort Morgan to Denver had 40 steps), was the volume’s boosterish tone. The ads, with photos of well-dressed, apparently well-heeled people, make driving look très sophisticated. If cars had cupholders back then, these folks would be rolling with crystal goblets, not Big Gulps.

A section on Transcontinental Touring extols the “wonders of the western country” and urges readers not to be daunted by the journey, reminding them that at least 5,000 cars had made the trip from the Mississippi to the Pacific in the last two years. Five whole thousand! “True, the unexpected happens in those less developed and more sparsely settled sections of the country—but those unforeseen occurrences, seldom dangerous or serious, are the very thing that give romance and variety to a Western trip.”

Ads like this one in the 1916 Blue Book reflected the aspirations of the middle class, for whom automobile ownership had only recently come within reach.
Prelinger Library / Photo by Greg Miller

The Blue Book guides (which aren’t related to the Kelley Blue Book guides to car values still published today) were intended to look authoritative and to drum up enthusiasm for automobile touring, Bauer writes in his paper, published in Cartographic Perspectives in 2009. The guides also met a pressing need for navigational aids at the time. “They were uniquely suited for navigating the primitive network of local roads that existed prior to the 1920s,” he writes.

In the early 20th century, a trip from, say, Chicago to Denver would involve hundreds of turns on small local roads that wound their way through the countryside and zigzagged through towns. Road signs were virtually nonexistent, Bauer says, because until cars came along there was no need for them. Long trips were made by train, and virtually all short trips were made by local people who already had a mental map of the roads in their area.

Cars changed everything. By 1916, they were well on their way to becoming more than just a toy for the wealthy (all the same, those fancy people in the Blue Book ads reflected the aspirations of the newly automobiled middle class). The number of cars registered in the U.S. had doubled in the last two years, reaching 3.4 million (compared to 188 million in 2014). Lots more people were driving. And lots of them were getting lost.

The instructions for using the Blue Book guides (see below) seem complicated now, but they made sense in the context of the times. “You don’t drive but one or two or three miles before you have to turn,” Bauer told me. “It’s impossible to show all those turns at the scale of a typical map.”

This diagram explains how to use the reference maps to look up detailed driving directions between two places.
Prelinger Library / Photo by Greg Miller

The guides do contain maps, but most of them aren’t meant to be used directly in navigation. Rather, they serve as a visual index to the written turn-by-turn directions. Like railroad maps, these index maps depict straight-line connections between towns, even when the actual routes were far more convoluted.

And convoluted they were. Here are two steps in the directions from Fort Morgan to Denver:

3.4 1.7 End of road, turn right; curving left just beyond. Pass school on left 4.4. Turn right with road 5.0. Cross concrete bridge 5.2.
8.8 5.4 End of road; turn right with travel. Cross bridge over Platte River and RR. 9.2

The numbers refer to mileage. The first number for each step is the total distance traveled so far; the second is the intermediate distance you’d see on your odometer if you re-zeroed it at the beginning of each step. For example, at the start of the second step above, you’d be 8.8 miles into the entire trip, and you’d have just finished the 5.4-mile segment described in the previous step (you’d have passed the school at 4.4 miles, the concrete bridge at 5.2, and hit 5.4 at the end of the road, right where the second step begins). Easy, right?
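To make the bookkeeping concrete, here’s a minimal sketch in Python (my own illustration, not anything printed in the Blue Book) showing how the two printed numbers relate: given the cumulative mileage at each successive turn, the leg distances a driver would watch for on a re-zeroed odometer fall out by subtraction.

```python
# Cumulative mileage at each successive turn of a route, Blue Book style.
# The 1.7, 3.4, 8.8, and 9.2 figures echo the Fort Morgan-Denver steps
# quoted above; treat the list itself as a made-up example.
turns = [1.7, 3.4, 8.8, 9.2]

# Each printed step pairs the cumulative mileage with the length of the
# leg just completed (what a freshly zeroed odometer would read).
for prev, here in zip(turns, turns[1:]):
    leg = here - prev
    print(f"{here:4.1f} {leg:3.1f}")
# Output:
#  3.4 1.7
#  8.8 5.4
#  9.2 0.4
```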

Well, following all these twists and turns would have been easier back when cars rarely broke 30 miles per hour, Bauer says, and there was time for a driver (or better, whoever was riding shotgun) to look back and forth between the book and the road ahead.

This section of the 1916 guide shows driving directions between Greeley and Estes Park, Colorado.
Prelinger Library / Photo by Greg Miller

Even so, following these directions would require two things: a good odometer and a degree of diligence. If you got off track, all your mileage figures would be wrong, and you’d have to find your way back to the nearest point of reference.

The Blue Book guides weren’t purely for navigation. They also include introductions to towns and cities and flag points of historical or modern interest. The Fort Morgan-Denver route description, for example, includes this gem: “The intrepid Hollen Godfrey maintained a stage station at a point located near Merino. He was often attacked by Indians, but was never caught napping…”

Not much is known about how the Blue Book guides were made, Bauer says, but the publisher apparently paid professional “pathfinders” to drive the main roads each summer so the guides could be updated to reflect the quickly changing road conditions. Amateur pathfinders, often members of local automotive clubs, also contributed.

This index map shows the main roads leading in and out of Chicago.
Prelinger Library / Photo by Greg Miller

In a way, the success of the guides may have contributed to their demise. As more people felt emboldened to hit the road, pressure mounted on the government to build better roads. The Federal Aid Road Act of 1916 provided the first federal funding for building and improving roads. In 1926, the first network of numbered interstate highways was established. As signs went up along these routes, it became far easier to navigate without turn-by-turn directions.

Two years earlier, in 1924, Rand McNally had published its “Auto Chum,” the first edition of what would become its bestselling road atlas. Other companies soon jumped on the road-atlas bandwagon. After 1927, the Blue Book guides with turn-by-turn directions were no longer published.

It’s ironic that nearly a century later, after decades of relying on road maps and atlases, so many drivers have gone back to turn-by-turn directions as their preferred navigational aid. If only Siri could flag the landmarks and throw out some trivia along the way.

–Greg Miller

Bomb-Damage Maps Reveal London’s World War II Devastation

The German Luftwaffe dropped thousands of bombs on London from 1939 to 1945, killing almost 30,000 people. More than 70,000 buildings were completely demolished, and another 1.7 million were damaged. The extent of the damage to each and every one of these buildings was logged and mapped in near real-time by surveyors, architects, engineers, and construction workers.

The result is an incredible collection of maps, color-coded by hand, that reveal the extent of the destruction in painstaking detail. Today, the maps remain an invaluable resource for academics, family historians, and even builders trying to avoid touching off unexploded bombs.

Key to LCC Bomb Damage Maps.
© 2015 The City of London (London Metropolitan Archives)

Now these bomb census maps are available in a beautiful oversized book released earlier this year to commemorate the 75th anniversary of the end of the Blitz, a nine-month period during which London and other British cities were relentlessly attacked by the German air force. “The London County Council Bomb Damage Maps, 1939-1945” contains large, high-quality reproductions of all 110 maps of the London region along with commentary from Laurence Ward, principal archivist at London Metropolitan Archives, which holds the original maps.

“There are just so many stories which these maps provide the starting point for,” Ward says. “They’re a great source in the sense that they make you want to go on and find out more.”

As soon as the bombs fell, data collection for the maps began. The London County Council, the central administrative authority of what was known as the County of London (roughly the area known today as Inner London), tasked its Architect’s Department with responding to bomb damage as it occurred. Surveyors, who before the war mostly worked on building sites to make sure regulations were followed and buildings were up to code, suddenly found themselves in charge of rescue operations. They worked with local rescue services made up of people from the construction fields, like engineers and bricklayers.

“Their primary aim was to pull people out of rubble and destroyed buildings and try to save lives,” Ward says. “They were set up as the rescue service because they had an understanding of how buildings worked, so if a building was about to collapse, making a judgment on how much time you had to get into the building and try and save people.” In all, the rescue services responded to 16,396 incidents and saved 22,238 people. Fifty-four of them died during these efforts.

Once a rescue operation was finished, the surveyors and rescue workers would work together to classify the damage, building by building, into six categories ranging from “blast damage—minor in nature” to “total destruction.” Their reports were sent to the London County Council, where they were recorded onto 1916 Ordnance Survey maps. Each damage category was given a color (shown in the key, above right), and the status of every affected building was colored by hand on the maps.

A diary entry included in the book, from architect Andrew Butler on April 20, 1941, gives an idea of what the work was like:

For the block I have started on—eight floors high with two flats on each floor—has had its whole face ripped off … I found it possible to stand on part of the roof. So, clutching a broken chimney, I surveyed the damage there. My notebook became very messy. What with the dust and soot, wet filth and the perspiration of fluster on my hands, it was difficult to read what I wrote. The notes served their purpose however when, after drying the book, I had to transcribe them into a report.

Visually, the maps are quite striking. The apparent randomness of the colors stands in contrast to the more orderly pattern of streets and buildings. In some places, whole swaths containing several blocks and dozens of buildings are colored black (total destruction) and purple (damaged beyond repair). In other places, the severity of damage varies widely, with areas colored yellow (minor blast damage) peppered with black, purple and red (seriously damaged).
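For anyone who wants to tabulate or re-plot the damage data, the hand-applied key lends itself to a simple lookup. Here’s a minimal sketch (my own, covering only the four categories whose colors are named in this post; the book’s key has six, and the hex values are illustrative stand-ins, not sampled from the original sheets):

```python
# Damage categories from the LCC bomb damage maps, mapped to the colors
# described above. Hex values are approximations for illustration only.
DAMAGE_COLORS = {
    "total destruction": "#000000",              # black
    "damaged beyond repair": "#6a1b9a",          # purple
    "seriously damaged": "#c62828",              # red
    "blast damage, minor in nature": "#fdd835",  # yellow
}

def map_color(category: str) -> str:
    """Return a plotting color for a surveyed building's damage class."""
    return DAMAGE_COLORS.get(category.lower(), "#bdbdbd")  # grey = unlisted here

print(map_color("Total destruction"))  # -> #000000
```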

Circles on the map denote strikes from V-1 and V-2 rockets, late additions to the German arsenal that caused tremendous damage. Beginning in June of 1944, Germany added the V-1 flying bomb to its attacks, which up to that point had mostly consisted of aircraft dropping incendiary bombs. The V-1 was a pilotless aircraft carrying a 1,870-pound warhead that could navigate by autopilot and crash into a target. More than 2,000 landed in the London region, killing 2,329 people. In September, a V-2 rocket, the world’s first ballistic missile, hit London. By the end of the war, 517 had detonated in London, killing 2,511 people.

The damage from World War II transformed London into the architecturally diverse city it is today. The maps help explain how rows of grand old flats can be interspersed with more modern buildings.

“Looking at a very, very small area, you can have buildings dating from maybe five different centuries sitting in quite close proximity to each other,” Ward says. “As you go further out you might be walking along a very fine Victorian street full of these beautiful terraced houses with lots of Victorian detailing on them, and then suddenly right in the middle of this road, there’ll be this kind of 1960’s low-rise housing block, very functional, very square. But it’s often that was the result of bomb damage.”

A view from the Golden Gallery of St. Paul’s Cathedral, looking east, which shows extensive bomb damage following the demolition of unsafe buildings in the summer of 1942. On the right, looking up Cannon Street, are Distaff Lane, Friday Street, Bread Street, Queen Victoria Street, and Cannon Street Railway Station. In the center foreground can be seen the tower of St. Augustine’s Church and Watling Street, which crosses Friday Street and Bread Street. To the left is St. Mary-le-Bow Church (Cross and Tibbs Collection, Collage 366 I S).
Photograph by Arthur Cross and Fred Tibbs © 2015 The City of London (London Metropolitan Archives) and reproduced by kind permission of the Commissioner of the City of London Police

The book also contains a remarkable collection of photos of damage in the City of London, a square-mile section at the center of greater London, taken by two police officers who would photograph damaged areas in the wake of attacks. The combination of the maps showing how widespread the destruction was, and the photos, such as the one above, showing what the damage looked like up close, really brings home the scale of the devastation.

“I just find it staggering that they managed to just carry on. London just carried on working,” Ward says. “It must have been an extraordinary time.”

These 15th-Century Maps Show How the Apocalypse Will Go Down

In 15th-century Europe, the Apocalypse weighed heavily on the minds of the people. Plagues were rampant. Constantinople, the once-great capital of the Eastern Roman Empire, had fallen to the Turks. Surely, the end was nigh.

Dozens of printed works described the coming reckoning in gory detail, but one long-forgotten manuscript depicts the Apocalypse in a very different way—through maps. “It has this sequence of maps that illustrate each stage of what will happen,” says Chet Van Duzer, a historian of cartography who has written a book about the previously unstudied manuscript.

The geography is sketchy by modern standards, but the maps make one thing perfectly clear: If you’re a sinner, you’ve got nowhere to hide. The Antichrist is coming, and his four horns will reach the corners of the earth. And it just gets worse from there.

The manuscript is also the first known collection of thematic maps, or maps that depict something that’s not a physical feature of the environment (like rivers, roads, and cities). Thematic maps are ubiquitous today—from rainbow-colored weather maps to the red-and-blue maps of election results—but most historians date their origins to the 17th century. The apocalypse manuscript, which now belongs to the Huntington Library in San Marino, California, was written two centuries earlier, Van Duzer writes in his recently published book, Apocalyptic Cartography.

According to the manuscript, the four horns of the Antichrist will extend to the ends of the earth between 1600 and 1606. The horns represent the ways he will persuade people to follow him: deceit, cunning, cruelty, and imitation of the Deity.
The Huntington Library

The manuscript was made in Lübeck, Germany, between 1486 and 1488. It’s written in Latin, so it wasn’t meant for the masses. But it’s not as scholarly as other contemporary manuscripts, and the penmanship is fairly poor, Van Duzer says. “It’s aimed at the cultural elite, but not the pinnacle of the cultural elite.”

The author is unknown. Van Duzer suspects it may have been a well-traveled doctor named Baptista. If so, he was in some ways very much a product of his time, yet in other ways centuries ahead of it.

The cartographic account of the Apocalypse begins with a map that shows the condition of the world between 639 and 1514. The earth is a circle, and Asia, Africa, and Europe are depicted as pie wedges surrounded by water. The text describes the rise of Islam, which the author sees as a growing threat to the Christian world. “There’s no way to escape it, this work is very anti-Islamic,” Van Duzer says. “It’s unfortunate,” he adds, but it was a widespread bias in that place and time.

Subsequent maps, which you can see in the gallery above, depict the “Sword of Islam” conquering Europe, followed by the rise of the Antichrist, a massive triangle that extends from pole to pole. Another map depicts the gates of Hell opening up on Judgment Day, which the author predicts will occur in 1651. A small, featureless globe depicts the world after that.

Unlike the Huntington manuscript, many works published around the same time, such as this hand-colored German book published in 1470, used pictures to depict the impending horrors of the Apocalypse.
Library of Congress

All the maps in the manuscript are symbolic, but the post-apocalyptic map takes minimalism to the max. “There’s nothing on it, but it’s very clearly labeled as a map,” Van Duzer says. “It raises the question of what is a map, and it explores that boundary.”

The text is filled with idiosyncratic details. The author calculated the distance to Paradise: 777 German miles from Lübeck to Jerusalem, and thence another 1,000 German miles to the eastern end of the Earth (a German mile is an obsolete measurement with many variations, making it difficult to pin down the modern equivalent). He also calculated the circumferences of Earth and Hell (8,000 and 6,100 German miles, respectively, though his use of different numbers for pi suggests a shaky grasp of geometry).
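For a rough sense of scale, here’s a back-of-the-envelope conversion (my own; it assumes the roughly 7.4-kilometer “geographic” German mile, one of the many period variants mentioned above, so treat the results as order-of-magnitude estimates):

```python
# One common historical variant: the "geographic" German mile ~ 7.42 km.
# Other period variants ran from roughly 7 to 10 km.
GERMAN_MILE_KM = 7.42

lubeck_to_jerusalem = 777     # German miles, per the manuscript
jerusalem_to_paradise = 1000  # German miles, per the manuscript

total_km = (lubeck_to_jerusalem + jerusalem_to_paradise) * GERMAN_MILE_KM
print(f"Lübeck to Paradise: about {total_km:,.0f} km")  # ~13,185 km

# The manuscript's Earth, for comparison, comes out far too large:
earth_km = 8000 * GERMAN_MILE_KM
print(f"Manuscript's Earth circumference: about {earth_km:,.0f} km")
# ~59,360 km, versus the actual ~40,075 km
```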

In addition to the apocalyptic section, the manuscript includes a section on astrological medicine and a treatise on geography that’s remarkably ahead of its time. For example, the author writes about the need to adjust the size of text to prevent distortions on maps and make them easier to read, an issue cartographers still wrestle with today. (At the same time, he also chastises mapmakers for placing monsters on maps in places where they didn’t exist, an issue cartographers rarely wrestle with today.)

The geographical treatise ends with a short discussion of the purpose and function of world maps. It’s here, Van Duzer says, that the author outlines an essentially modern understanding of thematic maps as a means to illustrate characteristics of the people or political organization of different regions.

“For me this is one of the most amazing passages, to have someone from the 15th century telling you their ideas about what maps can do.”

–Greg Miller


Watch: Map of Hell

National Geographic Channel

Airing Sunday May 15 at 9 Eastern/Central

Seventy percent of Americans believe hell is a real place. Actor Danny Trejo has played plenty of bad guys in his time, so he’s on a mission to map out where the idea of hell came from. It’s a terrifying journey through 3,000 years of the afterlife. From ancient Greece to the birth of Christianity, to medieval Europe and modern America, visit real locations believed to be portals to the underworld and witness a hair-raising vision of hell come to life.

You (and Almost Everyone You Know) Owe Your Life to This Man.

Temperament matters.

Especially when nuclear weapons are involved and you don’t—you can’t—know what the enemy is up to, and you’re scared. Then it helps (it helps a lot) to be calm.

The world owes an enormous debt to a quiet, steady Russian naval officer who probably saved my life. And yours. And everyone you know. Even those of you who weren’t yet born. I want to tell his story…

It’s October 1962, the height of the Cuban missile crisis, and there’s a Soviet submarine in the Caribbean that’s been spotted by the American Navy. President Kennedy has blockaded Cuba. No sea traffic is permitted through.

Photograph by NY Daily News Archive, Getty
Photograph by NY Daily News Archive, Getty

The sub is hiding in the ocean, and the Americans are dropping depth charges left and right of the hull. Inside, the sub is rocking, shaking with each new explosion. What the Americans don’t know is that this sub has a tactical nuclear torpedo on board, available to launch, and that the Russian captain is asking himself, Shall I fire?

This actually happened.

The Russian in question, an exhausted, nervous submarine commander named Valentin Savitsky, decided to do it. He ordered the nuclear-tipped torpedo readied. His second in command approved the order. Moscow hadn’t communicated with its sub for days. Eleven U.S. Navy ships were nearby, all possible targets. The nuke on this torpedo had roughly the power of the bomb at Hiroshima.

“We’re gonna blast them now!”

Temperatures in the submarine had climbed above 100 degrees. The air-conditioning system was broken, and the ship couldn’t surface without being exposed. The captain felt doomed. Vadim Orlov, an intelligence officer who was there, remembers a particularly loud blast: “The Americans hit us with something stronger than the grenades—apparently with a practice depth bomb,” he wrote later. “We thought, That’s it, the end.” And that’s when, he says, the Soviet captain shouted, “Maybe the war has already started up there … We’re gonna blast them now! We will die, but we will sink them all—we will not become the shame of the fleet.”

Had Savitsky launched his torpedo, had he vaporized a U.S. destroyer or aircraft carrier, the U.S. would probably have responded with nuclear depth charges, “thus,” wrote Russian archivist Svetlana Savranskaya, understating wildly, “starting a chain of inadvertent developments, which could have led to catastrophic consequences.”

But it didn’t happen, because that’s when Vasili Alexandrovich Arkhipov steps into the story.

Photo courtesy of M. Yarovskaya and A. Labunskaya

He was 34 at the time. Good looking, with a full head of hair and something like a spit curl dangling over his forehead. He was Savitsky’s equal, the flotilla commander responsible for three Russian subs on this secret mission to Cuba—and he is maybe one of the quietest, most unsung heroes of modern times.

What he said to Savitsky we will never know, not exactly. But, says Thomas Blanton, the former director of the nongovernmental National Security Archive, simply put, this “guy called Vasili Arkhipov saved the world.”

Arkhipov, described by his wife as a modest, soft-spoken man, simply talked Savitsky down.

The exact details are controversial. The way it’s usually told is that each of the three Soviet submarine captains in the ocean around Cuba had the power to launch a nuclear torpedo if—and only if—he had the consent of all three senior officers on board. On his sub, Savitsky gave the order and got one supporting vote, but Arkhipov balked. He wouldn’t go along.

He argued that this was not an attack.

The official Soviet debriefs are still secret, but accounts from a Russian reporter, Alexander Mozgovoi, from an American writer, and from the eyewitness testimony of intelligence officer Orlov all suggest that Arkhipov told the captain that the ship was not in danger. It was being asked to surface. Dropping depth charges left then right, noisy but always off target—those are signals, Arkhipov argued. They say, We know you’re there. Identify yourselves. Come up and talk. We intend no harm.

What’s Happening?

The Russian crew couldn’t tell what was going on above them: They’d gone silent well before the crisis began. Their original orders were to go directly to Cuba, but then, without explanation, they’d been ordered to stop and wait in the Caribbean. Orlov, who had lived in America, heard from American radio stations that Russia had secretly brought missiles to the island, that Cuba had shot down a U.S. spy plane, and that President Kennedy had ordered the U.S. Navy to surround the island and let no one pass through. When Americans had spotted the sub, Savitsky had ordered it to drop deeper into the ocean, to get out of sight—but that had cut them off. They couldn’t hear (and didn’t trust) U.S. media. For all they knew, the war had already begun.

We don’t know how long they argued. We do know that the nuclear weapons the Russians carried (each ship had just one, with a special guard who stayed with it, day and night) were to be used only if Russia itself had been attacked. Or if attack was imminent. Savitsky felt he had the right to fire first. Official Russian accounts insist he needed a direct order from Moscow, but Arkhipov’s wife, Olga, says there was a confrontation.

She and Ryurik Ketov, the gold-toothed captain of a nearby Russian sub, both heard the story directly from Vasili. Both believe him and say so in this PBS documentary. Some scenes are dramatized, but listen to what they say …

As the drama unfolded, Kennedy worried that the Russians would mistake depth charges for an attack. When his defense secretary said the U.S. was dropping “grenade”-size signals over the subs, the president winced. His brother Robert Kennedy later recalled the depth charges as “the time of greatest worry to the President. His hand went up to his face [and] he closed his fist.”

Video still from the PBS documentary “Missile Crisis: The Man Who Saved the World.”

The Russian command, for its part, had no idea how tough it was inside those subs. Anatoly Andreev, a crew member on a different, nearby sub, kept a journal, a continuing letter to his wife, that described what it was like:

For the last four days, they didn’t even let us come up to the periscope depth … My head is bursting from the stuffy air. … Today three sailors fainted from overheating again … The regeneration of air works poorly, the carbon dioxide content [is] rising, and the electric power reserves are dropping. Those who are free from their shifts, are sitting immobile, staring at one spot. … Temperature in the sections is above 50 [122ºF].

The debate between the captain and Arkhipov took place in an old, diesel-powered submarine designed for Arctic travel but stuck in a climate that was close to unendurable. And yet, Arkhipov kept his cool. After their confrontation, the torpedo was never fired. Instead, the Russian sub rose to the surface, where it was met by a U.S. destroyer. The Americans didn’t board. There were no inspections, so the U.S. Navy had no idea that there were nuclear torpedoes on those subs—and wouldn’t know for around 50 years, until the former belligerents met at a 50th-anniversary reunion. Then the Russians turned away from Cuba and headed north, back to Russia.

Photograph courtesy of U.S. National Archives, Still Pictures Branch, Record Group 428, Item 428-N-711199

Looking back, it all came down to Arkhipov. Everyone agrees that he’s the guy who stopped the captain. He’s the one who stood in the way.

He was, as best as we can tell, not punished by the Soviets. He was later promoted. Reporter Alexander Mozgovoi describes how the Soviet Navy conducted a formal review and how the man in charge, Marshal Grechko, when told about conditions on those ships, “removed his glasses and hit them against the table in fury, breaking them into small pieces, and abruptly leaving the room after that.”

Photo courtesy of M. Yarovskaya and A. Labunskaya

How Arkhipov (that’s him up above) managed to keep his temper in all that heat, how he managed to persuade his frantic colleague, we can’t say, but it helps to know that Arkhipov was already a Soviet hero. A year earlier he’d been on another Soviet sub, the K-19, when the coolant system failed and the onboard nuclear reactor was in danger of meltdown. With no backup system, the captain ordered the crew to jerry-rig a repair, and Arkhipov, among others, got exposed to high levels of radiation. Twenty-two crew members died from radiation sickness over the next two years. Arkhipov wouldn’t die until 1998, but it would be from kidney cancer, brought on, it’s said, by exposure.

Nuclear weapons are inherently dangerous. Handling them, using them, not using them, requires caution, care. Living as we do now with North Korea, Pakistani generals, jihadists, and who knows who’ll be the next U.S. president, the world is very, very lucky that at one critical moment, someone calm enough, careful enough, and cool enough was there to say no.

Thanks to Alex Wellerstein, author of the spectacular blog Restricted Data, for his help guiding me to source material on this subject.

An Epidemic 14 Years Ago Shows How Zika Could Unfold in the US

An Aedes albopictus mosquito, which health authorities worry may begin to spread Zika.
Photograph by James Gathany, CDC.

If the Zika virus comes to the United States, we could face the threat of the same sort of virgin soil epidemic—an infection arriving in a population that has never been exposed to it before—that has caused more than 1 million known infections, and probably several million asymptomatic ones, in Central and South America. It’s nerve-wracking to wonder what that would be like: How many people would fall ill, how serious the effects would be in adults or in babies, and most important, how good a job we would do of protecting ourselves.

But, in fact, we can guess what it would be like. Because we have a good example, not that long ago, of a novel mosquito-borne threat that caused very serious illness arriving in the United States. And the data since its arrival shows that, despite catching on fairly quickly to what was happening, the U.S. didn’t do that good a job.

This possibility became more real Monday when the Pan American Health Organization released a statement that predicts Zika virus, the mosquito-borne disease that is exploding in South and Central America and seems likely to be causing an epidemic of birth defects especially in Brazil, will spread throughout the Americas. PAHO, which is a regional office of the World Health Organization, said:

There are two main reasons for the virus’s rapid spread (to 21 countries and territories): (1) the population of the Americas had not previously been exposed to Zika and therefore lacks immunity, and (2) Aedes mosquitoes—the main vector for Zika transmission—are present in all the region’s countries except Canada and continental Chile.

PAHO anticipates that Zika virus will continue to spread and will likely reach all countries and territories of the region where Aedes mosquitoes are found.

Those “countries and territories where Aedes mosquitoes are found” include a good portion of the United States, as these maps from the Centers for Disease Control and Prevention demonstrate:

CDC maps of the ranges of two mosquito species that could transmit Zika virus.
Graphic from CDC.gov, original here.


The recent history is this: In the summer of 1999, the New York City health department put together reports that had come in from several doctors in the city and realized that an outbreak of encephalitis was moving through the area. Eight people who lived in one neighborhood were ill, four of them so seriously that they had to be put on respirators; five had what their doctors described as “profound muscle weakness.”

Within a month, 37 people had been identified with the perplexing syndrome, which seemed to be caused by a virus, and four had died. At the same time, veterinarians at the Bronx Zoo discovered an unusual number of dead birds: exotics, like flamingos, and city birds, primarily crows. Their alertness provided the crucial piece for the CDC to realize that a novel disease had landed in the United States: West Nile virus, which was well known in Europe but had never been seen in this country before.

West Nile is transmitted by mosquitoes in a complex interplay with birds. It began moving with both birds and bugs down the East Coast and then across the Gulf Coast. As it went, the CDC realized that the neurologic illness that marked the disease’s first arrival had not been a one-time event, but its own looming epidemic within the larger one. “Neuroinvasive” West Nile, which in its worst manifestations caused not transient encephalitis but long-lasting flaccid paralysis that resembled polio — and sometimes killed — bloomed in the summer of 2002 east of the Mississippi, and then moved west in the years afterward as the disease exhausted the pool of the vulnerable.

The CDC’s maps showing the emergence of “neuroinvasive” West Nile virus disease from 2001 to 2004; areas in black had the highest incidence.
Graphic by Maryn McKenna using maps by the CDC; originals available here.

So far, so normal, for a newly arrived disease. But here’s where the story gets complicated. By the beginning of this decade, West Nile had become endemic in the lower 48 states. It is not a mysterious new arrival; it is a known, life-altering threat. Its risk waxes and wanes with weather and insect populations, but it has one simple preventative: not allowing yourself to be bitten by a mosquito.

And yet: Here are the CDC’s most recent maps of neuroinvasive West Nile—showing that people are still falling to its most dire complication, 14 years after it was identified.

The CDC’s maps for 2011-2014 showing the incidence of “neuroinvasive” West Nile virus disease; areas in black had the highest incidence.
Graphic by Maryn McKenna using maps by the CDC; originals available here.

The point here is not that people are careless or unthinking; in the early years of West Nile, two of the victims were the husband of the CDC’s then-director and the chief of its mosquito-borne diseases division, who would have been well aware of the risks. (Both recovered fully.) The point is that always behaving in a manner that protects you from a mosquito bite—conscientiously, persistently, faultlessly emptying pots and puddles, putting on long sleeves and repellent, choosing when not to go outdoors—is very difficult to maintain.

Zika is not West Nile. Among other things, Zika is spread by many fewer species of mosquitoes — one or possibly two, compared to 65 for West Nile. And West Nile’s non-human hosts, birds, live in closer proximity to more of us than Zika’s, which appear to be non-human primates. But though the rare, deadly complications of West Nile virus infection are different from those of Zika, they are just as serious and life-altering — and yet we failed to protect ourselves from them. As Zika spreads, we can hope that is a lesson we learn in time.

The Fantastically Strange Origin of Most Coal on Earth

This is a story about trees—very, very strange looking trees—and some microbes that failed to show up on time. Their non-appearance happened more than 300 million years ago, and what they didn’t do, or rather what happened because they weren’t there, shapes your life and mine.

All you have to do is walk the streets of Beijing or New Delhi or Mexico City: If there’s a smog-laden sky (and there usually is), all that dust blotting out the sun is there because of this story I’m going to tell.

It begins, appropriately enough, in an ancient forest …

Artist’s reconstruction of a forest during the Carboniferous period. From ‘Science for All’ by Robert Brown (London, c1880). Illustration by World History Archive, Alamy

… whose trees “would appear fantastic to us in their strangeness,” write Peter Ward and Joseph Kirschvink in their book A New History of Life.

Some of them were giants: 160 feet tall, with delicate fernlike leaves that sat on top of pencil-thin trunks. This was the age when plants were evolving, climbing higher and higher, using cellulose and a tough fiber called lignin to stay upright. Had you been there, you would have felt mouse-sized.

Drawing by Robert Krulwich

These trees weren’t just odd looking. “One of their strangest traits was their very shallow root system,” write Ward and Kirschvink. “They grew tall and fell over quite easily.”

Drawing by Robert Krulwich

So imagine, then, these stands of towering, fernlike plants mostly growing in swamps. The air is warm and moist, and the land (Europe, the Americas, and Africa were at the time one continuous mass) is covered by millions—no, billions—of trees that are sucking carbon from the air, growing, aging, dying, falling, and releasing oxygen. This is a world littered with dead trees piling on top of each other.

Carboniferous Forest Diorama. Photograph by John Weinstein, Field Museum Library, Getty

But when those trees died, the bacteria, fungi, and other microbes that today would have chewed the dead wood into smaller and smaller bits were missing, or as Ward and Kirschvink put it, they “were not yet present.”

Where Are They?

Bacteria existed, of course, but microbes that could ingest lignin and cellulose—the key wood-eaters—had yet to evolve. It’s a curious mismatch. Food to eat but no eaters to eat it. And so enormous loads of wood stayed whole. “Trees would fall and not decompose back,” write Ward and Kirschvink.

Instead, trunks and branches would fall on top of each other, and the weight of all that heavy wood would eventually compress those trees into peat and then, over time, into coal. Had those bacteria been around devouring wood, they’d have broken its carbon bonds and released the carbon back into the air as carbon dioxide, but instead the carbon stayed in the wood.

Artist’s engraving of a carboniferous forest circa 1754. From The Universe by FA Pouchet (London, 1874). Photograph by UniversalImagesGroup, Getty

We’re talking about a spectacular amount of carbon. Biochemist Nick Lane estimates that the rate of coal formation back then was 600 times the normal rate. Ward and Kirschvink say that 90 percent—yup, 90 percent!—of the coal we burn today (and the coal dust we see flying about Beijing and New Delhi) comes from that single geological period, the Carboniferous.

That’s why it’s called “carboniferous”—because it produced so much carbon. “The Carboniferous period was the time of forest burial on a spectacular scale,” the writers say.

Take Off Your Helmets and Say Thank You

And therefore, in a just (and biologically aware) world, coal miners everywhere would doff their helmets to salute the tardy arrival of those teeny earth creatures, the wood-eating bacteria. Because those bacteria weren’t there 350 million years ago, and didn’t arrive for another 60 million years, giant seams of black coal now warm us, light us, and muck up our atmosphere. An equal number of environmentalists might spend the day throwing darts at the little guys for showing up so late.

A coal miner plants explosives in a coal mine. Photograph by H. Mark Weidman Photography, Alamy

And Now … in Spectacular Magnification, Let Me Introduce …

But enough of me talking about them. It’s time for you to take a close—and I mean close—look at these amazing wood-eaters. They come in many forms, but I’m choosing microbes called Trichonympha because they’re so tiny, so squirmy, and so, well, crazily busy. They’re single-celled and can be found, yes, inside a termite gut. They look, says photographer Richard Howey (who studies them), like teardrops, or pears “wearing wigs.”

Here they are in this Nikon Small World award-winning video by Danielle Parsons and Wonder Science TV:

When I first saw this video, I was shocked by the commotion. I had thought wood-eaters would be mellow, sluggish, and, well, a little less clumped together. So I had questions. A web search brought me to Richard Howey in Wyoming, who has written about and photographed Trichonympha, and I asked him to take a look at the video so I could pepper him with questions. Which is what I did …

Me: Wow! This is crazy. So much motion!
Richard Howey: Yes, it looks almost like a game of bumper cars.
Me: So why are they so squished together?
RH: I’m not sure. I was really stunned [when you showed this to me]. It seems like Macy’s on Christmas Eve. [Pause.] I know they reproduce at an incredible rate.
Me: What do you mean? Are we watching them having sex?
RH: They might be [laughs]. Their reproductive process is incredibly complicated … [goes on to discuss mating types]
Me: But mostly they’re eating, right?
RH: Oh, definitely. You see those little white crystals jiggling around?
Me: Yeah, those shiny, stonelike things? What are those?
RH: Those are little cellulose bits; the termite has chewed and shredded the wood, and now these bits have reached its intestines. The microbes scoop them up …
Me: And once they get them inside?
RH: They produce a dissolving agent that’s going to reduce those bits to starches and sugars that the termite can eat.
Me: I like their little wiggly nose-like tops.
RH: Those aren’t noses.
Me: Well, heads then …
RH: Actually … They’re kind of like legs. They have little locomotive hairs, flagella, attached there, and that’s how they propel themselves.
Me: It’s weird. It looks like they know where they’re going …
RH: That’s an illusion. I think they just … go.
Me: Why don’t they stop? Do they ever rest?
RH: No, those flagella are very motile—they keep moving and moving and eating and eating …
Me: That’s it?
RH: That’s what they do. Always.

And we should be oh-so-thankful they do it. Because of them, dead trees get recycled. Soil gets replenished. Smaller organisms get fed. And miners can mine—which is only to say: Sometimes very little creatures make a very big difference.

Editor’s Note: The image of coal featured in this post was updated for accuracy.

The Time 19th Century Paleontologists Punched It Out

Cope vs Frazer. Art by Zander and Kevin Cannon, from Brinkman, 2015.

Edward Drinker Cope wasn’t exactly the most even-keeled of paleontologists. The great “Bone Wars” that sparked a race to uncover America’s prehistory required the enmity of two fossil fanatics, after all, and Cope certainly proved himself capable of throwing jabs and haymakers in print at his friend-turned-nemesis Othniel Charles Marsh. Even among friends Cope was known as “pugnacious” and “quarrelsome.” And as historian Paul Brinkman points out in a new paper, Cope didn’t always pull his punches in face-to-face confrontations, either. On a spring night in 1888, in the hall at Philadelphia’s American Philosophical Society, Cope brawled with his friend Persifor Frazer.

The blow-by-blow account, rediscovered by Brinkman in the University of Pennsylvania archives, was recorded by Frazer the day after the scuffle. The episode was especially strange, Brinkman writes, because Frazer was Cope’s closest friend. The naturalist often came to Cope’s defense as the cranky paleontologist’s reputation suffered in Philadelphia’s scientific community, with Frazer trying to quell “certain controversies” surrounding his friend. And while it ended up being Frazer who jumped to violence, Cope’s pig-headed attitude is what brought the fellows to blows.

The spark for the scuffle was a letter. Cope, Frazer wrote, had written that his friend held an opinion that was “false” and “untrue.” Frazer asked Cope to change the wording of his accusations to something softer, something less directly offensive, but Cope refused, adding that Frazer “had better let the personal part of this matter drop.”

But Frazer didn’t give up. He asked his friend, geologist N.H. Winchell, to intercede to change Cope’s mind at a meeting of the International Congress of Geologists in April of 1888. Again Cope refused, leaving Frazer to request a private meeting with Cope after the meeting’s proceedings were over. Frazer and Cope both sat there, attending to meeting business for six hours, but when it was all over Cope disappeared. He left the city before Frazer could catch him.

Frazer wouldn’t let it rest. We don’t know what the disagreement was about, but he felt so strongly that he found out Cope was to give a presentation at the American Philosophical Society on May 4th and asked another of his friends, Admiral McCauley, to speak to the irascible paleontologist. Cope didn’t answer any of Frazer’s pleas to meet. And still Frazer didn’t give up. He was so tied in knots over the problem that he cancelled his plans with his wife to go to the opera that night and instead went to the scientific meeting in the hope that Cope would be there.

And so he was. Frazer let McCauley try to talk to Cope first, and just as the meeting was about to start Frazer closed the meeting door behind him to confront his friend in the hall. This was Cope’s last chance. “Prof. Cope the simple question is do you characterize what I wrote you as falsehood?” Frazer asked. Cope didn’t bite. “I do not know whether it is or not,” he replied, and at that point Frazer slapped Cope twice across the face. Here, in Frazer’s own words, is what happened next:

He gathered himself and plunged at me into the middle of the passage not reaching me. I struck him several blows one a hard one after which he reeled and stepping backwards fell on the staircase heading into the Janitor’s rooms overhead.

As he rose he clinched with me and I backed him to the wall where first he and then I took one step upwards. He tried hard to throw me sideways and at one instant nearly succeeded. Meantime placing my elbow against his throat & forcing his neck against the wall I held him motionless for some time. He had both my hands fast so that I could not remove them without a struggle which would have brought us both to the floor with great noise & made a public scandal by bringing out the members <of the A. P. S.> en masse which I wished if possible to avoid.

Here McCauley interceded, calling the fight. The two scientists loosened their grip on each other, and this was still not enough for Frazer. He demanded a further meeting to talk about the upset, but McCauley told the obsessed naturalist to stop picking at his emotional wound. “No, there is nothing more to be said.”

Cope avoided Frazer for almost a year after the fight. In time, though, the two reconciled and Frazer remained one of Cope’s greatest supporters until the end. But if two such friends could so readily black each other’s eyes, I can only imagine the injuries Cope would have inflicted and sustained if his rival Marsh had cornered him. The Bone Wars could have been far worse.


Brinkman, P. 2015. Remarking on a blackened eye: Persifor Frazer’s blow-by-blow account of a fistfight with his dear friend Edward Drinker Cope. Endeavour. doi: 10.1016/j.endeavour.2015.06.001


231 Varieties of Rain: Frogdrops Keep Falling on My Head

Poor Rob McKenna. He drives a truck, so he’s constantly moving, never in the same place for long. And yet everywhere he goes—city, country, near, far, morning, afternoon—it doesn’t matter, wherever Rob is, it’s raining. He can turn, reverse, zigzag, it doesn’t matter. Clouds just follow him, and to prove it (because who would believe this?) he keeps a log and shares it with his friend Arthur Dent, who says, You should show this to scientists. He does, and the scientists tell him, Rob McKenna, we know what you are. You are a Quasi Supernormal Incremental Precipitation Inducer.

What’s that?

He’s a “Rain God.” That’s the gist. Clouds see him and can’t help themselves. They love him and want “to be near him, to cherish him, and to water him.” And the worst of it is, Rob (a totally fictional character in Douglas Adams’s Hitchhiker’s Guide to the Galaxy) hates rain. Can’t abide it. But the rain doesn’t care. So Rob tries to get along. He turns his curse into a part-time job: Hotels and vacation spots pay him not to go there. He becomes a regular at a pub called the Thundercloud Corner, where he sits, grimly staring out the window at … well … at scenes like this:

GIF by tkyle

But because he spends so much time staring at rain, Rob learns to see rainfall as no one has seen it before; he sees its many shapes, moods. He realizes, in the words of poet Conrad Aiken, that raindrops are “the syllables of water,” that rain can take hundreds of different forms.

There’s lashing rain, sheets of rain, rain pissing, bucketing, pouring. There are drizzles. There are mizzles. But Rob McKenna likes superspecific categories. He’s a taxonomist. And so he creates his own rain glossary; it’s described in So Long, and Thanks for All the Fish—the fourth book in the Hitchhiker’s Guide series—with its 231 different rain types:

There’s “light pricking drizzle which made the roads slippery” (type 33).

There’s “vertical light drizzle” (type 47).

There’s “heavy spotting” (type 39).

There’s “regular” cab-drumming and “syncopated cab-drumming” (types 126 and 127).

There’s “dirty blatter battering against his windscreen so hard that it didn’t make much odds whether he had his wipers on or off” (type 17).

I love parsing through Rob’s categories. Being a rain gazer myself (and rain, by the way, feels especially noticeable here in New York, where I live) …

The wonderful graphic designer T. Kyle MacMahon—known as “tkyle”—is especially good at capturing the joys of rain-gazing.
GIF by tkyle


GIF by tkyle

… I couldn’t help but notice that something is missing from Rob’s list. He limits himself to one kind of rain—the kind that rains water, what you might call “raindrop rains.” But, in fact, there are other kinds.

In her book Rain: A Natural and Cultural History, Cynthia Barnett mentions Jonathan Swift’s fanciful metaphor “raining cats and dogs” (a coinage from 1738), but she then goes on to describe actual, unfanciful, documented rains of—and I kid you not—golf balls, fish, and, though I’ve heard about this before, frogs. As in, raining frogs (or toads).
Frog rain is shockingly normal.

Drip, Drop, Thunk

Barnett writes that in June 1954, Sylvia Mowday and her kids were in a park in Sutton Coldfield, just north of Birmingham, England, when it began to rain. They opened their umbrellas and were heading for shelter when all of a sudden they felt “gentle thuds” on their umbrella tops, “too soft for hail.” When they looked, they saw tiny frogs, “wee bodies” falling from the sky.

Maybe that’s what you’re seeing in this video—posted from Knox County, Ohio, on June 11, 2012—which shows (after the filmmaker, “MrKoozzz,” focuses) teeny frogs, all facing the same way, after a rain. (Alternate explanation: Could they be migrating? Hopping from one pond to another? Nope, they arrived by rain, writes MrKoozzz. “That’s my story, and I’m sticking to it.”)

In 1873, Scientific American ran eyewitness accounts of a frog rain in Kansas City, Missouri. It happened again, Barnett writes, in 1901, in Minnesota. There are ancient accounts, medieval accounts, even battlefield stories. During a French/Austrian battle in 1794 …

“A hot afternoon was broken by such heavy showers that 150 soldiers had to abandon their trench as it filled with rainwater. In the middle of the storm, tiny toads began to pelt down and jump in all directions. When the rain let up, the soldiers discovered more toads in the folds of their three-cornered hats.”

Drawing by Robert Krulwich

Assuming that all these stories—or at least some of them—are true, how do hundreds of toads manage to get airborne? Little toads, teeny as they are, are much heavier than raindrops. “Modern meteorologists,” Barnett explains, believe that “tornadoes and waterspouts are the most likely culprits.” High winds, especially whirlwinds, pick up water, toads, frogs (fish, golf balls) and all, and whisk them across the sky for a little while, then lose speed and dump the contents on, for example, Sylvia Mowday and her kids.

(Though, Cynthia wonders, if a whirlwind can pull a frog up into the sky, where’s the algae, the other pond plants, the fish? Why didn’t the Mowdays get hit with pond scum? She doesn’t know.)

But frog rain happens. Maybe not as often as rain type 49 (“sharply slanting light drizzle”) or type 51 (“light to moderate drizzle freshening”) or a “dirty blatter battering,” but frogs have been falling from the skies often enough, long enough, that I think they’ve earned the right to be called precipitation.

It’s odd that Rain God Rob McKenna would leave them out. But he’s a lesser deity. The Big Guy, as you may recall, was more frog-friendly. Just ask Pharaoh …

For the best, craziest, most over-the-top frog rain ever (particularly the slack-jawed look on Philip Seymour Hoffman’s startled face when frogs begin falling from the sky into his brilliantly lit swimming pool), there is nothing better than Paul Thomas Anderson’s 1999 movie Magnolia. If you dare (and I suggest you do … but it’s pretty graphic …) take a look …


Bloodletting Is Still Happening, Despite Centuries of Harm

An illustration of a bloodletting, circa 1675.

In the shadow of India’s largest mosque, the gutters run red with blood.

It’s a bizarre scene, if you’ve never seen a modern-day bloodletting. First, men wrap patients’ arms and legs with straps as tourniquets, to control the blood flow. Then they use razor blades to make tiny pricks in the hands and feet, and blood trickles into a concrete trough stained red with the day’s work.

The bleeding people look pretty happy, though. After all, they’ve paid for the service. They come to be cured of everything from arthritis to cancer.

(Video: Meet the bloodletters of Delhi and their patients.) 

But why? How has the bloodletting business, which many doctors today would rank along with reading bumps on the head as olde timey quackery, managed not to dry up?

The appeal seems to be in its simple logic.

Muhammad Gayas runs his bloodletting business in the garden of the Jama Masjid mosque in Old Delhi. He says pain and illness happen “when the blood goes bad,” which is pretty much the same basic premise that bloodletters have sold the public since Hippocrates advocated balancing the four humors—blood, black bile, yellow bile, and phlegm—more than 2,000 years ago. 

Bloodletting has been practiced around the world even longer than that, tracing back at least 3,000 years, to the Egyptians. It remained an obsession among many Western doctors through the 19th century, and was still a recommended treatment for pneumonia in a 1942 medical textbook—lest you think it went out after the Middle Ages along with the laying on of leeches. (Oh, and leeches still get some play, too, mainly for drawing down pockets of blood after plastic surgery or vascular microsurgery.)

So Does Bloodletting Ever Work?

It may be helpful for people with a few particular blood abnormalities. Doctors still use bloodletting, for instance, in cases of polycythemia—an abnormally high red blood cell count—and in a hereditary disease called hemochromatosis, which leaves too much iron in the blood.

I also came across a preliminary study suggesting vascular benefits in some diabetics with high iron levels, but this is far from a general treatment for the disease. Another small study in BMC Medicine got a lot of press in 2012 for showing that 33 people who gave up to a pint of blood had improved cholesterol ratios and blood pressure six weeks later compared with people who didn’t give blood, which the doctors also attributed to a reduction of iron levels. (Note that the amount of blood removed in the study was fairly low—a pint is about as much as you’d give when donating blood, which for the record is a great thing for healthy people to do and is not the same thing as bloodletting.)

When George Washington developed a swollen sore throat in 1799, doctors drained nearly half his blood and created blisters in his throat. Within a day, he died.
Life of George Washington, Junius Brutus Stearns, 1851

But the design of that study doesn’t rule out a placebo effect—which has certainly contributed to bloodletting’s popularity in the past. What’s more, other studies suggest that too little iron is bad for cardiovascular health, so again, the potential benefit of removing blood is unclear.

Meanwhile, depleting the body’s blood supply can be risky. Not only is there the risk of losing too much blood, causing a dangerous drop in blood pressure and even cardiac arrest, but people who are already sick take their chances with infection or anemia. Not to mention that in most cases, bloodletting doesn’t cure what ails you.  

So no, we don’t need to revive the tradition of the neighborhood bloodletter. In a sense, though, their legacy is still around: Red-and-white barber poles represent blood, bandages, and the stick that patients would grip during barbers’ days as bloodletters.

How Bloodletting Bled Out

It took the great bloodletting wars of the 1800s to begin turning the tide against the practice. The prominent doctor Benjamin Rush (a signer of the Declaration of Independence) set off a fury when he began bleeding people dry during the 1793 yellow fever epidemic in Philadelphia. By all accounts, Rush was a bloodletting fanatic and in general a real piece of work: “unshakable in his convictions, as well as self-righteous, caustic, satirical, humorless, and polemical,” writes doctor Robert North in a biography.

Rush recommended that up to 80 percent of his patients’ blood be removed, and during the yellow fever outbreak, North recounts that “so much blood was spilled in the front yard that the site became malodorous and buzzed with flies.”

Bloodletting’s detractors grew in numbers after that, and eventually Pierre Louis, the founder of medical statistics, began convincing doctors to rely on statistical evidence over anecdotal “recoveries” of patients who had been bled. A particularly impressive analysis showed that bloodletting did not help pneumonia victims in Europe, and after bitter disputes among doctors in the 1850s, the practice began dying out.

In fact, one history of bloodletting refers to the stamping out of the practice—over the objections of the medical establishment, no less—as a triumph of reason and “one of the greatest stories of medical progress.”


Who’s the First Person in History Whose Name We Know?

Editor’s Note: This post has been updated to clarify a sentence about the gender of the ancient writer.

“It’s me!” they’d say, and they’d leave a sign. Leave it on the cave wall. Maybe as a prayer, maybe a graffito, we don’t know.

This was 30,000 years ago. Writing hadn’t been invented, so they couldn’t chalk their names on the rock. Instead, they’d flatten their hand, blow dust over it, and leave a silhouette like this:

Prototype facsimile of the cave Chauvet—Pont d’Arc: a negative hand, painted by blowing pigments. Photograph by Laurent CERINO, REA, Redux

And for 30, 40 millennia across Europe, Asia, the Americas, and Australia, this is how cavemen, cavewomen, cave kids, hunters, nomads, farmers, and soldiers left their mark.

Cave of the Hands, Patagonia, Province of Santa Cruz, Argentina. Photograph by Javier Etcheverry, VWPics, Redux

Every one of these handprints belonged to an individual, presumably with a name, a history, and stories to tell. But without writing, we can’t know those stories. We call them hunter-gatherers, cave people, Neolithic tribes. We think of them in groups, never alone. Thousands of generations come and go, and we can’t name a single person before 3200 B.C., not a one. Then, in Mesopotamia, writing appears, and after that people could record their words, sometimes in phonetic symbols so we could listen in, hear them talking and, for the first time, hear someone’s name—our first individual.

So who was it?

Who is the first person in the recorded history of the world whose name we know?

Just Guessing Here

Would it be a she or a he? (I’m figuring a he, because writing was a new thing, and males are usually the early adopters.) [*Please see note at bottom of post for more on this.]

Drawing of a man and a woman, the woman crossed out.
All drawings by Robert Krulwich

Would he be a king? Warrior? Poet? Merchant? Commoner? (I’m guessing not a commoner. To be mentioned in an ancient document, he’d need a reputation, tools, and maybe a scribe. He wouldn’t be poor.)

Drawing of a king, a warrior, a poet, a merchant, and a commoner, with the commoner crossed out

Would he be a person of great accomplishment or just an ordinary Joe? (The odds favor a well-regarded person, someone who is mentioned often. Regular Joes, I figured, would pop up irregularly, while a great king, a leading poet, or a victorious general would get thousands of mentions.)

Drawing of a king sitting in a chair with a trident-like stick, looking at writing in front of him

So I trawled the internet, read some books, and, to my great surprise, the first name in recorded history isn’t a king. Nor a warrior. Or a poet. He was, it turns out … an accountant. In his new book Sapiens: A Brief History of Humankind, Yuval Noah Harari goes back 33 centuries before Christ to a 5,000-year-old clay tablet found in Mesopotamia (modern Iraq). It has dots, brackets, and little drawings carved on it and appears to record a business deal.

Ancient tablet recording beer production at the Inanna Temple in Uruk. MS1717, © The Schøyen Collection, Oslo and London. http://www.schoyencollection.com/24-smaller-collections/wine-beer/ms-1717-beer-inanna-uruk

It’s a receipt for multiple shipments of barley. The tablet says, very simply:

29,086 measures barley 37 months Kushim

“The most probable reading of this sentence,” Harari writes, “is: ‘A total of 29,086 measures of barley were received over the course of 37 months. Signed, Kushim.’ ”

Drawing of a man facing the viewer with a speech bubble over his left shoulder that says “Oh, Kushim!”

So who was “Kushim”? The word might have been a job title, not a person (maybe kushim meant “barley assessor”), but check the video down below. It suggests that Kushim was indeed a guy, a record keeper who counted things for others—in short, an accountant. And if Kushim was his name, then with this tablet, Harari writes, “we are beginning to hear history through the ears of its protagonists. When Kushim’s neighbours called out to him, they might really have shouted, ‘Kushim!’”

It’s pretty clear Kushim was not famous, not hugely accomplished, certainly not a king. So all of my hunches were off.

But wait. The Kushim tablet is just one of tens of thousands of business records found in the deserts of Iraq. A single example is too random. We need more. So I kept looking and found what may be the second, third, and fourth oldest names we know of. They appear on a different Mesopotamian tablet.

Administrative tablet with cylinder seal impression of a male figure, hunting dogs, and boars. 3100-2900 B.C. Jamdat Nasr, Uruk III style, southern region, Mesopotamia. Clay, H. 2 in. (5.3 cm). Image copyright © The Metropolitan Museum of Art. Image source: Art Resource, NY

Once again, they are not A-list ancients. Dated to around 3100 B.C.—about a generation or two after Kushim—the tablet’s heading is, “Two slaves held by Gal-Sal.” Gal-Sal is the owner. Next come the slaves, “En-pap X and Sukkalgir.” So now we’ve got four names: an accountant, a slave owner, and two slaves. No kings. They don’t show up for another generation or so.

Drawing of four individuals: an accountant, a slave owner, and two slaves

The predominance of ordinary Sumerians doesn’t surprise Harari. Five thousand years ago, most humans on Earth were farmers, herders, and artisans who needed to keep track of what they owned and what they owed—and that’s how writing started. It was a technology for regular people, not a megaphone for the powerful.

“It is telling,” Harari writes, “that the first recorded name in history belongs to an accountant, rather than a prophet, a poet, or a great conqueror.” Most of what people did back then was business.

Kings come, kings go, but keeping track of your barley—your sheep, your money, your property—that’s the real story of the world.


*Note from Robert Krulwich: I see that this column has offended a whole bunch of you. Yes, as many of you point out, my viewpoint was white, male (and hung up on fame and power), and many of you have serious, totally legitimate arguments with my assumptions. Now that I’ve read your comments, I’m a little surprised, and a touch ashamed of myself. But the thing is—those were my assumptions. They were wrong. I say so.

This is a blog. So it’s designed to be personal, and confessional. So I want you to know who’s talking to you, and if you think I’m way off base, by all means, let me know. And in the end, if you read the totality, my column and your responses, the story I wrote gets deeper and richer. You call me out on my assumptions, you offer some of your own, and what actually happened, what it was really like to be alive 5,300 years ago becomes… well, an argument among moderns about ancients that we will never meet.

Scholars aren’t unanimous about whose name is oldest in the historical record. Yuval Noah Harari’s new book Sapiens: A Brief History of Humankind gives the crown to Kushim. The Oriental Institute at the University of Chicago goes for Gal-Sal and his slaves in its 2010-2011 annual report. Andrew Robinson, in his Writing and Script: A Very Short Introduction, also champions Gal-Sal, but his book came earlier, so maybe Harari has scooped him. Here’s the video that argues for Kushim:

If the name Gal-Sal strikes some of you as familiar, it appears in the title of a 1942 Rita Hayworth/Victor Mature movie, My Gal Sal, about a songwriter who falls crazily in love with a singer on the vaudeville circuit named Sal (short for Sally Elliot). I watched it. It’s terrible. Kushim, meanwhile, survives. According to the blog Namespedia, it turns out that lots of Russian families call themselves Kushim to this day, and in the U.S., it’s a relatively popular first name. They’ve even got Kushim bar graphs!


Noah (and his ark) Updated, Improved for Our Time

Instead of the Noah you know, the one who built the ark, sheltered all those animals, sailed for 40 days and 40 nights and got to see God’s rainbow, instead of him, I want you to meet a new one. An updated version.

This Noah shows up in a tough little essay written by Amy Leach, of Bozeman, Montana, who knows her science, knows there’s a flood coming—a flood of humans, seven billion and counting, already swamping the Earth, crowding the land, emptying the sea, and her more modern Noah—informed, practical, not inclined to miracles—has a different plan. He announces,

Watercolor painting with text reading “Unfortunately, animals. We are not going to be able to bring all of you with us this time.”
Illustration by Robert Krulwich

The old Noah, you may remember, squeezed eight humans (wife, kids, their spouses) and at least two of every critter, big and small, onto his crowded ship. But the new Noah, being more practical, feels he can winnow a little. “Everybody” is a lot of animals, more than you know. Back in the day, Amy Leach writes,

Pink watercolor background with two drawings of frogs peeking up over the text, which talks about what it would be like to bring two of every creature onto Noah’s ark
Illustration by Robert Krulwich

And, honestly, (I’m thinking to myself), if the world lost a scorpion or two, would anyone notice? Or want them back? And blotchy toads, biting little flies—some animals are hard to keep going on a tight, crowded ship. On the last voyage, dormitory assignments were beyond difficult.

And all those supplies? Amy Leach writes how the first Noah would have had …

a yellow watercolor background covered with text about collecting food for animals
Illustration by Robert Krulwich

This doesn’t mean we don’t care, new Noah says to the animals. We definitely, absolutely want to bring a bunch of you with us. But, we’ve got to be practical.

Even if our ark has grown to the size of a planet, carrying everybody through is not going to be logistically possible, which is why, he says,

Blue watercolor background with black text on it about being in charge of a future Noah’s ark where not all animals are included
Illustration by Robert Krulwich

And anyway, that first Noah? He lived in a different age, a time they call the Holocene, before humans began to dominate and crowd out the other species. Back then, there weren’t as many people. And there were more kinds of animals, closer by, hiding in the woods, clucking in the yard, so the world was more various then, more intimate, more riotous, and thinking about it (a little wistfully, if only for a moment), the new Noah quietly recalls that on that first ark …

Yellow watercolor background with text on top related to how Noah’s ark would be different today than it was in the Old Testament
Illustration by Robert Krulwich

And now, animals, it’s time for many of you to step away. You’ve had your unruly eons. They were wild, unplanned, noisy, great fun. Natural selection ran the world. Crazy things happened. Those were good times, Amy’s essay concludes …

Blue watercolor background with black text on top that reads “But the future belongs to us.”
Illustration by Robert Krulwich

Amy Leach is a writer living in Bozeman. Her very short pieces—about jellyfish, beavers, salmon, plants that go topsy-turvy and stand on their heads—are collected in a wonderful little book called “Things That Are.” In this column I’ve done to Amy what the new Noah is doing to our planet: I edited her down, sliced, diced, slimmed (lovingly, I hope), trying to give you a taste of her fierce, crazy prose. But like the planet, she’s wilder in the original, so I hope you go there and sample the unedited version.


The Little Boy Who Should’ve Vanished, but Didn’t

He was 12 years old. He was a slave. He’d had no schooling. He was too young, too unlettered, too un-European; he couldn’t have done this on his own. That’s what people said.

Picture of a drawing of an older man and a young boy facing  a vanilla plant with their backs to the viewer
Drawing by Robert Krulwich

Edmond (he had no last name—slaves weren’t allowed them) had just solved a botanical mystery that had stumped the greatest botanists of his day. In the early 1800s he was a child on a remote island in the Indian Ocean, and yet, against overwhelming odds, Edmond would get credit for his discovery—and for the most surprising reasons. I want to tell you his story. So I’ll start here, with a plant.

Picture of a drawing of a vanilla plant
Drawing by Robert Krulwich

This is a vanilla plant (or my version of one). It’s a vine. It climbs, sometimes way high, and when it flowers and is visited by a pollinator, it produces a bunch of long, stringy beans. Properly treated, those beans give off the flavor we associate with vanilla.

Picture of a drawing of Anne of Austria holding a mug of hot chocolate
Drawing by Robert Krulwich

When Spanish explorers brought vanilla from Mexico, it was mixed with chocolate and became a classy sensation, fancied by kings, queens, and, pretty soon, everybody else. In his book Vanilla: Travels in Search of the Vanilla Orchid, journalist Tim Ecott reports that Anne of Austria, daughter of Philip III of Spain, drank it in hot chocolate. Madame de Pompadour, one of the great hostesses (and mistresses) of King Louis XV, flavored her soups with it.

Picture of Madame de Pompadour with a bowl of steaming soup in front of her
Drawing by Robert Krulwich

Francisco Hernandez, physician to King Philip II of Spain, called it a miracle drug that could soothe the stomach, cure the bite of a venomous snake, reduce flatulence, and cause “the urine to flow admirably.”

Picture of a drawing of a man peeing
Drawing by Robert Krulwich

And, best of all, it was a sexual picker-upper. Bezaar Zimmerman, a German physician, claimed in his treatise “On Experiences” (1762) that, “No fewer than 342 impotent men, by drinking vanilla decoctions, have changed into astonishing lovers of at least as many women.”

Picture of a drawing of a woman laying her head on the shoulder of a man standing next to a vanilla bottle
Drawing by Robert Krulwich

Demand, naturally, shot sky-high. By the late 18th century, a ton of Mexican vanilla was worth, writes Ecott, “its weight in silver.”

With profit margins growing, a few plants were hustled out of Mexico to botanical gardens in Paris and London, then on to the East Indies to see if the plant would grow in Europe or Asia.

It grew, but it wouldn’t fruit, wouldn’t produce beans. Flowers would appear, bloom for a day, fold up, and fall off. With no beans, there could be no vanilla extract, and therefore nothing to sell. The plant needed a pollinator. In Mexico a little bee did the deed. Nobody knew how the bee did it.

Picture of a drawing of a bee saying 'Shhhhh'
Drawing by Robert Krulwich

What to do? In the 1790s people knew about plant sex. Bees, they knew, were pollinators.

If people could only figure out where vanilla’s sexual parts were hiding, they could become bee substitutes.

Enter the 12-Year-Old

They kept trying. One plantation owner, Ferréol Bellier-Beaumont, on the island of Réunion halfway between India and Africa, had received a bunch of vanilla plants from the government in Paris. He’d planted them, and one, only one, held on for 22 years. It never fruited.

The story goes that one morning in 1841, Bellier-Beaumont was walking with his young African slave Edmond when they came upon the surviving vine. Edmond pointed to a part of the plant, and there, in plain view, were two packs of vanilla beans hanging from the vine. Two! That was startling. But then Edmond dropped a little bomb: This wasn’t an accident. He’d produced those fruits himself, he said, by hand-pollination.

No Way

Bellier-Beaumont didn’t believe him—not at first. It’s true that months earlier the older man had shown Edmond how to hand-pollinate a watermelon plant “by marrying the male and female parts together,” but he’d had no success with vanilla. No one had.

But after his watermelon lesson, Edmond said he’d sat with the solitary vanilla vine and looked and probed and found the part of the flower that produced pollen. He’d also found the stigma, the part that needed to be dusted. And, most important, he’d discovered that the two parts were separated by a little lid, and he’d lifted the flap and held it open with a little tool so he could rub the pollen in. You can see what Edmond did in this video:

Edmond had discovered the rostellum, the lid that many orchid plants (vanilla included) have, probably to keep the plant from fertilizing itself. “Could you do it again?” Bellier-Beaumont asked. And Edmond did.

This was news. Big news. Bellier-Beaumont wrote his fellow plantation owners to say Edmond had solved the mystery, then sent him from plantation to plantation to teach other slaves how to fertilize the vanilla vine.

And so the Indian Ocean vanilla industry was born.

In 1841, Réunion exported no vanilla. By 1848, it was exporting 50 kilograms (about 0.055 tons) to France; by 1858, two tons; by 1867, 20 tons; and by 1898, 200 tons. “By then,” Tim Ecott writes, “Réunion had outstripped Mexico to become the world’s largest producer of vanilla beans.”

Picture of a drawing of a graph showing vanilla exports from Reunion
Drawing by Robert Krulwich
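Just for fun, here’s the arithmetic behind that curve, sketched in a few lines of Python. The export figures are Ecott’s, quoted above; the compound-growth calculation is my own back-of-the-envelope addition, and the names (exports_kg and so on) are simply made up for the sketch:

# Réunion's vanilla exports, as quoted from Tim Ecott (in kilograms)
exports_kg = {1848: 50, 1858: 2_000, 1867: 20_000, 1898: 200_000}

growth = exports_kg[1898] / exports_kg[1848]  # a 4,000-fold increase
years = 1898 - 1848                           # over 50 years

# Implied average compound growth rate: growth = (1 + r) ** years
rate = growth ** (1 / years) - 1

print(f"{growth:,.0f}x over {years} years")  # 4,000x over 50 years
print(f"about {rate:.0%} per year")          # about 18% per year

In other words, the island’s vanilla exports grew at something like 18 percent a year, compounded, for half a century.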

The planters were getting rich. What, I wondered, happened to Edmond?

Well, he was rewarded. His owner gave him his freedom. He got a last name, Albius. Plus, his former owner wrote the governor, saying he should get a cash stipend “for his role in making the vanilla industry.”

The governor didn’t answer.

Edmond left his master and moved to town, and that’s when things went sour.

He fell in with a rough crowd, somehow got involved in a jewelry heist, and was arrested, convicted, and sentenced to five years in jail. His former owner again wrote the governor.

“I appeal to your compassion in the case of a young black boy condemned to hard labor … If anyone has a right to clemency and to recognition for his achievements, then it is Edmond … It is entirely due to him that this country owes [sic] a new branch of industry—for it is he who first discovered how to manually fertilize the vanilla plant.”

Picture of a drawing that says Entirely Due to Him
Drawing by Robert Krulwich

The appeal worked. Edmond was released. But what catches my eye here is Bellier-Beaumont’s choice of “entirely.” Our new vanilla business, he says, is “entirely” due to Edmond. He’s giving the former slave full credit for his discovery and retaining none for himself. That’s rare.

Then, all of a sudden, Edmond had a rival. A famous botanist from Paris—a scholar, a high official knighted for his achievements—announced in the 1860s that he, and not the slave boy, had discovered how to fertilize vanilla.

Picture of a drawing of a man with a beard holding a vanilla plant and looking suspicious
Drawing by Robert Krulwich

Jean Michel Claude Richard claimed to have hand-pollinated vanilla in Paris and then gone to Réunion in 1838 to show a small group of horticulturists how to do it. Little Edmond, he presumed, had been in the room, peeked, and then stolen the technique.

So here’s a prestigious scholar from the imperial capital asserting a claim against a 12-year-old slave from a remote foreign island. What chance did Edmond have?

Picture of a drawing of a young boy who was a slave facing off with an old scholarly French man
Drawing by Robert Krulwich

He was uneducated, without power, without a voice—but luckily, he had a friend. Once again, Edmond’s former master, Bellier-Beaumont, jumped into action, writing a letter to Réunion’s official historian declaring Edmond the true inventor. The great man from Paris, he said, was just, well, misremembering.

He went on to say that no one recalled Richard showing them how to fertilize orchids, but everybody remembered, four years later, Edmond teaching his technique to slaves around the island. Why would farmers invite Edmond to teach “if the process were already known?”

“I have been [Richard’s] friend for many years, and regret anything which causes him pain,” Bellier-Beaumont wrote, “but I also have my obligations to Edmond. Through old age, faulty memory, or some other cause, M. Richard now imagines that he himself discovered the secret of how to pollinate vanilla, and imagines that he taught the technique to the person who discovered it! Let us leave him to his fantasies.”

The letter was published. It’s now in the island’s official history. It survives.

Etching of an adult Edmond Albius with a vanilla plant in his hands

And Yet, a Miserable End

Edmond himself never prospered from his discovery. He married, moved back to the country near Bellier-Beaumont’s plantation, and died in 1880 at age 51. A little notice appeared in the Moniteur, the local paper, a few weeks after he died. Dated Thursday, 26 August, 1880, it read: “The very man who at great profit to his colony, discovered how to pollinate vanilla flowers has died in the hospital at Sainte-Suzanne. It was a destitute and miserable end.” His long-standing request for an allowance, the obituary said, “never brought a response.”

Picture of the Edmond Albius Statue in
The statue of Edmond in Réunion
Photograph courtesy of Yvon/Flickr

But a hundred years later, the mayor of a town on Réunion decided to make amends. In 1980 or so, a statue was built to honor Edmond. Writer Tim Ecott decided to take a look. He bought a bus ticket on the island’s “Vanilla Line,” rode to the stop marked “Albius,” got off, and there, standing by himself, is Edmond (in bronze or concrete? I can’t tell). He’s dressed, Ecott says, like a waiter, with a narrow bow tie and jacket. He’s not wearing shoes: Slaves weren’t allowed shoes or hats. But he’s got a street named after him, a school named after him. He has an entry on Wikipedia. He’s survived.

Picutre of a drawing of a man with a beard holding a vanilla plant and looking sad
Drawing by Robert Krulwich

And the guy who tried to erase him from history, Richard? I looked him up. He also has a Wikipedia entry. It describes his life as “marred by controversy,” mentions his claim against Edmond, and concludes that “by the end of the 20th century,” historians considered the 12-year-old boy “the true discoverer.” So despite his age, poverty, race, and status, Edmond won.

This is such a rare tale. It shouldn’t be. But it is.

Editor’s Note: This post has been updated to correctly reflect the title of Tim Ecott’s book.

Several books recount Edmond’s story. Tim Ecott’s Vanilla: Travels in Search of the Vanilla Orchid is the most thorough and original, but How to Fly a Horse: The Secret History of Creation, Invention and Discovery by Kevin Ashton tells the same tale and marvels that a slave on the far side of the world, poor and non-white, could get credit for what he’d done. There is also Ken Cameron’s Vanilla Orchids: Natural History and Cultivation, a book that contains Thomas Jefferson’s handwritten recipe for vanilla ice cream.


George Washington’s Oh-So-Mysterious Hair

That hair you’ve seen so many times on the dollar bill? That hair he’s got crossing the Delaware, standing by a cannon, riding a horse in those paintings? His hair on the quarter? On all those statues? The hair we all thought was a wig? Well, it wasn’t a wig. “Contrary to a common belief,” writes biographer Ron Chernow in his Pulitzer Prize-winning Washington: A Life, George Washington “never wore a wig.”

I’m stunned.

Illustration of George Washington on a quarter
Illustration by Wendy MacNaughton

Turns out, that hair was his. All of it—the pigtail, the poofy part in the back, that roll of perfect curls near his neck. What’s more (though you probably already guessed this), he wasn’t white-haired. There’s a painting of him as a young man, with Martha and her two children, that shows his hair as reddish brown, which Chernow says was his true color.

Picture of a painting of George Washington with Martha Washington and her two children
The Courtship of Washington, John C. McRae, 1860. Image courtesy of the Mount Vernon Ladies’ Association

The whiteness was an effect. Washington’s hairstyle was carefully constructed to make an impression. It wasn’t a sissyish, high-society cut. It was, back in the 1770s and 1780s, a military look, something soldiers or want-to-be soldiers did to look manly. “However formal it looks to modern eyes,” Chernow writes, “the style was favored by military officers.”

Illustration of George Washington in profile, emphasizing his long hair, which is down in this illustration
Illustration by Wendy MacNaughton

Think of this as the 18th-century equivalent of a marine buzz cut. In Washington’s time, the toughest soldiers in Europe, officers in the Prussian Army, fixed their hair this way. It was called a queue. British officers did it too. So did British colonials in America.

Here’s how it worked. Washington grew his hair long, so that it flowed back toward his shoulders.

Illustration of George Washington in profile, showing his hair being gathered before putting it into a ponytail
Illustration by Wendy MacNaughton

Then he’d pull it firmly back, broadening the forehead to give him, Chernow writes in his biography, “an air of martial nobility.” The more forehead, the better. Nowadays we notice chins. But not then. Foreheads conveyed force, power.

The look was achieved with appropriate muscularity. In the British Army a tough hair yank was a rite of passage for young officers; it was common to yank really hard.

Illustration of George Washington in profile, showing his hair being pulled backwards before being put into a ponytail
Illustration by Wendy MacNaughton

A military journalist, Joachim Hayward Stocqueler, describes a British soldier from that time who says his hair and skin were pulled so fiercely that he didn’t think he’d be able to close his eyelids afterward.

Once gathered at the back, hair was braided or sometimes just tied at the neck by a strap or, on formal occasions, a ribbon. Washington would occasionally bunch his ponytail into a fine silk bag, where it would bob at the back of his head.

Illustration of George Washington in profile, showing his hair tied in a bow
Illustration by Wendy MacNaughton

Then he would turn to his side hairs, which he “fluffed out,” writes Chernow, “into twin projecting wings, furthering the appearance of a wig.” George Washington “fluffing out”? That’s such an odd image. Artist Wendy MacNaughton, my partner in crime, sees it this way:

Illustration of George Washington in profile, emphasizing his curled hair
Illustration by Wendy MacNaughton

You should close your eyes and see him fluffing in your own way.

Next question: How did those side curls stay curled? Betty Myers, master wigmaker at Colonial Williamsburg in Virginia, wrote to me that it was common to grease one’s hair with pomade. Oily hair helped. We don’t know how often Washington shampooed, but the less he showered, the firmer his fluffs.

And now, to the whiteness. Washington’s hair wasn’t splotchy. It was like a snow-covered mountain, evenly white. This was accomplished by sprinkling a fine powder on the head. There were lots of powders to choose from, writes Myers, including “talcum powder, starch, ground orris root, rice powder, chalk, [or] even plaster of paris …” Washington probably used a finely milled (expensive) product, which was applied, cloud-like, to his head. To keep from gagging in a powder fog, it was common to cover the face with a cone of coiled paper, like this:

Illustration of George Washington covering his face with a cone while he powders his hair
Illustration by Wendy MacNaughton

The powder was sometimes applied with a handheld bellows. An attendant would pump a cloud of powder from a small nozzle and let it settle on the hair. But Washington, says biographer Ron Chernow, would dip a puff, a snakelike bunch of silk striplings, into a powder bag, then do a quick shake over his bent head. Maybe a slave would do this for him. When being powdered, it was traditional to wear a “powdering robe,” basically a large towel tied around the neck, to keep from being doused.

Circa 1750: a political cartoon entitled ‘The English Lady in Paris, an Essay on Puffing by Louis le Grande’, showing a seated old lady having her wig powdered by a nasty-looking Frenchman. Photo by Hulton Archive/Getty Images

Which leaves one last puzzle. Washington was a careful, self-conscious dresser. When he appeared at the first Continental Congress, he was the only important delegate to wear a military costume, choosing, Chernow writes, the “blue uniforms with buff facings and white stockings” of the Virginia citizen militia while adding his own “silk sash, gorgets, [and] epaulettes.” Later, he’s described dancing at balls in black velvet. So if Washington liked dark clothes, how’d he keep the powder from showing? The man would have been covered in dandruff-like sprinkles. (Editor’s Note: One of our readers, Mike Whybark, shared a painting that makes me wonder … Maybe his shoulders did look a little snowed-on.) Myers, the wig scholar, says that’s why Washington bunched his ponytail into a silk bag, to keep from leaving a white windshield wiper splay of powder on his back when he was dancing with the ladies (which he liked to do). As for keeping the powder off one’s shoulders, how Washington did that—if he did do that—nobody could tell me. Probably every powder-wearing guy in the 1760s knew the secret, but after a couple of centuries, whatever Washington did to stay spotless is lost to us.

Illustration of George Washington, on the left with white powder on his shoulders, and on the right without
Illustration by Wendy MacNaughton

We can stare all we like at his shoulders and wonder, but the truth is, there are some things about our first president we may never, ever know.

Illustration of George Washington winking with his hair perfectly fixed
Illustration by Wendy MacNaughton

Wendy MacNaughton draws people, cats, bottles, scenes, faces, places. If, totally out of the blue, I call her and say, “Can you imagine Leonardo da Vinci’s personal notebook or George Washington getting his hair done?” she just giggles and draws. And a week later, I’m doing a happy dance. If you want to see what she’s up to right now, you’ll find more of her work here. And if you enjoy presidential hair stories, here’s the other Big Guy, Abe Lincoln, on a day in 1857 when he clearly lost his comb. Hairstylists shouldn’t look—it’s too scary.