What did the last common ancestor of living apes look like? That’s a difficult question to answer. Today’s apes – gibbons, orangutans, gorillas, chimpanzees, and ourselves – are varied and specialized primates with relatively sparse fossil records. Depending on which paleoanthropologist you ask, then, the last common ancestor of today’s apes was either small and gibbon-like or more like a great ape, with gibbons hanging from a dwarfed branch of the family tree.
Pliobates might help resolve the debate. Described by David Alba and colleagues, this 11.6-million-year-old ape was on the evolutionary “stem” leading to the last common ancestor of gibbons and great apes. Rather than being a large-bodied primate, though, Pliobates was relatively small and more gibbon-like in form, an adept climber with some ability to swing beneath the branches of the Miocene forest.
Not that Pliobates was one of our direct ancestors. Molecular evidence suggests that the split between gibbons and the rest of the apes occurred between 16 and 17 million years ago, long before this newly-named ape. Instead, Alba and coauthors write, Pliobates is more of a “persistent type” – an archaic remnant of the apes that led up to the major hominoid division. More fossils will help outline how the actual transition occurred, but, for now, Pliobates is an echo of what our forebears might have been like at the dawn of the apes.
Meaning: Pliobates is a reference to the primate’s intermediate place between Pliopithecus and gibbons (Hylobates), while the species name honors where the fossil was found.
Age: About 11.6 million years old.
Where in the world?: Catalonia, northeastern Spain.
What sort of critter?: A catarrhine – the primate group that includes Old World monkeys and apes – closely related to the last common ancestor of today’s apes.
Size: About 10 pounds.
How much of the creature’s body is known?: A partial skeleton including elements of the limbs and a skull.
Archaeologists have unearthed the oldest case of decapitation ever found in the New World. The skull belonged to a young man and was buried in Brazil about 9,000 years ago, with severed hands covering its face in a mysterious pose—left hand over the right side of the face, fingers pointing up, and right hand over left side, pointing down.
No one, it seems, has ever seen anything like it. Why was this guy decapitated? Why the weird posing of the hands 9,000 years before Madonna’s song “Vogue”? And where’s the rest of him?
André Strauss of the Max Planck Institute for Evolutionary Anthropology found the skull, but he still finds it a mystery. He was excavating the Lapa do Santo site in eastern Brazil when he struck upon the head buried under a rock. He kept sifting away the dirt around it, looking for the rest of the skeleton, but it never materialized. Instead, he slowly uncovered the disembodied skull and hands, partially crushed from being buried for thousands of years.
The last thing Strauss, or anyone else, expected to find at such an old site was a decapitated head; the next oldest decapitation in South America is only about 3,000 years old, and practically on the other side of the continent, in Peru. “I’m not a decapitologist,” he says. (That’s not a real title, but given the number of severed heads in human history, maybe it should be.)
The find raised many questions. First, how did these people, who were hunter-gatherers living in a simple society with few tools (certainly no machetes) get the head off? Strauss got a tip from Sue Black, a forensic anthropologist at the University of Dundee. (Note: I’m taking her online course in human identification now, and it’s fantastic. If you want to learn what CSI is really like, sign up.)
Black noticed a similarity to a modern-day case she’s working on, in which the skeleton of a woman was found decapitated. She saw the same kind of fractures in the neck, suggesting that after the head was partially cut off, it was manually pulled and twisted to finish the job. It would have been difficult, and gruesome, work.
Lapa do Santo, incidentally, is also where the oldest human skeleton in South America was found, named Luzia, and the oldest rock art, which turns out to be a carving of a man with a giant phallus, dubbed “Little Horny Man.”
So yes, our hunter-gatherer ancestors sound just as interested in skulls and penis art as your average teenage boy today. But before you snicker, remember that these fascinations pop up all over the world throughout human history: sex and fertility, obviously, but also skulls.
Even though many people consider skulls morbid or even sinister today, for most of our existence people have had a fairly cozy relationship with human heads. They’re still pretty popular, too. A John Varvatos skull scarf costs 250 bucks.
In fact, I’m sitting at my kitchen table with a bright purple skull grinning at me as I write. It’s a life-sized ceramic head decorated with turquoise swirls in a Mexican Day of the Dead style. My husband and mother-in-law looked a little concerned when I dashed into a San Antonio gift shop to snatch it from the display window.
But I love my ceramic skull, and it’s part of a long symbolic tradition. People have always cut off heads and kept them, or buried them, or used them for all manner of purposes. Skulls can be war trophies: The Inca emperor Atahualpa drank from the gold-encrusted skull of a rival, maybe his brother. In fact, more than one culture figured out that a cranium makes a great cup. Or they can be more peaceful reminders of our ancestors.
“There is often no link about these similar forms of behavior practiced in different part of the world,” says Silvia Bello, an anthropologist who studies death practices at the Natural History Museum in London. “The fascination of humans for heads and skulls seems to be the common ground.”
We don’t know why our mystery man in Brazil was decapitated, but it most likely wasn’t as a trophy. There are no holes or scrape marks that would be expected if the head was cleaned for display, and the cranium wasn’t opened to remove the brain (which you would definitely want to do if a head was sitting out on display decomposing).
Strauss also doubts that he was killed as a rival or outsider. He was a local, based on the signature of strontium isotopes in his bones. He may not have been executed at all; perhaps he died of natural causes or in a fight, and his head was removed and buried in a special way for symbolic reasons that we may never understand.
One hint, though, lies in the fact that the hands were arranged over the face as opposites in that ‘vogue’ pose. (For the sticklers: It’s really not quite like Madonna’s vogue, if you look up photos of her, but I don’t know what else to compare it to.)
“There is an argument for great symbolism in these two hands,” says anthropologist John Verano of Tulane University. “Left and right, that’s dualism.” Opposites were a big theme in Inca and other South American cultures, though it’s not clear whether this opposite pose would have represented something good or bad—maybe both.
Whatever the people of Lapa do Santo intended, this decapitation is an important glimpse into the ritual dismemberment of human remains, says Michelle Bonogofsky of the University of California, Berkeley, who wrote a book on decapitations. She has seen skulls plastered, painted, and decorated, but has never seen a skull posed with severed hands.
“I found a head that had two feet in front of it once,” says Verano. “It seemed to be a sign of disrespect. But never the hands.”
Reference: The Oldest Case of Decapitation in the New World (Lapa do Santo, East-Central Brazil). Andre Strauss et al. PLOS ONE, published online September 23, 2015. http://dx.plos.org/10.1371/journal.pone.0137456
Two weeks ago, I was staring at an unbelievably star-studded sky. We were in the Peruvian Amazon, far from city lights, and had set up camp along the Alto Madre de Dios, one of the region’s mighty rivers. I wasn’t there to stargaze – my assignment was to report on the jungle tribe that has been emerging along this river – but it was impossible not to spend some time with the evening sky.
Previous trips to the planet’s rainforests have taken me into the trees at night, where leafy canopies obscure all but a few tantalizing patches of pinpricked sky. I’d always known something wonderful lurked above the palm fronds and ironwoods, but it wasn’t until this last trip that I finally got to see it.
On the riverbank, the naked southern sky stretched from one edge of my vision to the other. There were few clouds, no trees overhead, and zero streetlamps to block the light that has been hurtling through space for thousands of years.
Instead, it was just me, snuggled into the sand, and a leaky ocean of twinkling stars.
For those who have never seen a truly dark sky, especially from south of the Equator, what a discovery awaits. Stars reveal their true colors (you’d be surprised by how many different hues they sparkle in!), incoming meteors regularly paint streaks across the sky, and the omnipresent disk of the Milky Way lords over everything in sight. The dark, dusty clouds that blot out light from the densest parts of our galaxy look like big, inky blotches dropped into a cosmic pool that’s just a smidge too far away to stir with a fingertip.
On these evenings in southern winter, the constellation Scorpius gleamed overhead, looking for all the world exactly like a scorpion, with the bright red supergiant Antares forming its beating heart. Next door was Sagittarius, the archer — and the constellation that dwarf planet Pluto is currently passing through. Alpha Centauri, the nearest stellar system to our own, shone brightly as the next port over, flanked by Beta Centauri and the jewel-like Southern Cross. I talked myself hoarse explaining as much as I could about how stars grow up and die, the supermassive black hole hiding behind those dusty cosmic clouds, the stories of the constellations and the different worlds we share our solar neighborhood with.
It’s one of the best experiences I’ve ever had.
But the next time I go to Perú, what I’d like to do is sit beneath this twinkling canvas and listen to the stellar stories that have their roots in the jungle – the tales that have been passed down among the native Matsigenka or Yine, people for whom this nightly spectacle isn’t so much a spectacle as a normal evening panorama. Which heroes and monsters have these cultures, living in the forests all around me, placed in the sky? How do they explain the periodic reddening of the moon or the occasional daytime disappearance of the sun?
For millennia, the nighttime sky has been a tablet upon which we’ve inscribed our histories. It contains a richness that transcends those visible points of light, with multiple narratives layered atop the same glittering framework. Now, as cities and their lights continue to creep inexorably outward, and indigenous cultures continually come under siege, I think it’s more important than ever to fight for the sky and the very human treasures it holds.
Editor’s Note: This post has been updated to clarify a sentence about the gender of the ancient writer.
“It’s me!” they’d say, and they’d leave a sign. Leave it on the cave wall. Maybe as a prayer, maybe a graffito, we don’t know.
This was 30,000 years ago. Writing hadn’t been invented, so they couldn’t chalk their names on the rock. Instead, they’d flatten their hand, blow dust over it, and leave a silhouette like this:
And for 30, 40 millennia across Europe, Asia, the Americas, and Australia, this is how cavemen, cavewomen, cave kids, hunters, nomads, farmers, and soldiers left their mark.
Every one of these handprints belonged to an individual, presumably with a name, a history, and stories to tell. But without writing, we can’t know those stories. We call them hunter-gatherers, cave people, Neolithic tribes. We think of them in groups, never alone. Tens of thousands of generations come and go, and we can’t name a single person before 3200 B.C., not a one. Then, in Mesopotamia, writing appears, and after that people could record their words, sometimes in phonetic symbols so we could listen in, hear them talking and, for the first time, hear someone’s name—our first individual.
So who was it?
Who is the first person in the recorded history of the world whose name we know?
Just Guessing Here
Would it be a she or a he? (I’m figuring a he, because writing was a new thing, and males are usually the early adopters.) [*Please see note at bottom of post for more on this.]
Would he be a king? Warrior? Poet? Merchant? Commoner? (I’m guessing not a commoner. To be mentioned in an ancient document, he’d need a reputation, tools, and maybe a scribe. He wouldn’t be poor.)
Would he be a person of great accomplishment or just an ordinary Joe? (The odds favor a well-regarded person, someone who is mentioned often. Regular Joes, I figured, would pop up irregularly, while a great king, a leading poet, or a victorious general would get thousands of mentions.)
So I trolled the internet, read some books, and to my great surprise—the first name in recorded history isn’t a king. Nor a warrior. Or a poet. He was, it turns out … an accountant. In his new book Sapiens: A Brief History of Humankind, Yuval Noah Harari goes back 33 centuries before Christ to a 5,000-year-old clay tablet found in Mesopotamia (modern Iraq). It has dots, brackets, and little drawings carved on it and appears to record a business deal.
It’s a receipt for multiple shipments of barley. The tablet says, very simply:
29,086 measures barley 37 months Kushim
“The most probable reading of this sentence,” Harari writes, “is: ‘A total of 29,086 measures of barley were received over the course of 37 months. Signed, Kushim.’ ”
So who was “Kushim”? The word might have been a job title, not a person (maybe kushim meant “barley assessor”) but check the video down below. It suggests that Kushim was indeed a guy, a record keeper who counted things for others—in short, an accountant. And if Kushim was his name, then with this tablet, Harari writes, “we are beginning to hear history through the ears of its protagonists. When Kushim’s neighbours called out to him, they might really have shouted, ‘Kushim!’”
It’s pretty clear Kushim was not famous, not hugely accomplished, certainly not a king. So all of my hunches were off.
But wait. The Kushim tablet is just one of tens of thousands of business records found in the deserts of Iraq. A single example is too random. We need more. So I keep looking and find what may be the second, third, and fourth oldest names we know of. They appear on a different Mesopotamian tablet.
Once again, they are not A-list ancients. Dated to around 3100 B.C.—about a generation or two after Kushim—the tablet’s heading is, “Two slaves held by Gal-Sal.” Gal-Sal is the owner. Next come the slaves, “En-pap X and Sukkalgir.” So now we’ve got four names: an accountant, a slave owner, and two slaves. No kings. They don’t show up for another generation or so.
The predominance of ordinary Sumerians doesn’t surprise Harari. Five thousand years ago, most humans on Earth were farmers, herders, and artisans who needed to keep track of what they owned and what they owed—and that’s how writing started. It was a technology for regular people, not a megaphone for the powerful.
“It is telling,” Harari writes, “that the first recorded name in history belongs to an accountant, rather than a prophet, a poet, or a great conqueror.” Most of what people did back then was business.
Kings come, kings go, but keeping track of your barley—your sheep, your money, your property—that’s the real story of the world.
*Note from Robert Krulwich: I see that this column has offended a whole bunch of you. Yes, as many of you point out, my viewpoint was white, male (and hung up on fame and power) and many of you have serious, and totally legitimate arguments with my assumptions. Now that I read your comments, I’m a little surprised, and a touch ashamed of myself. But the thing is—those were my assumptions. They were wrong. I say so.
This is a blog. So it’s designed to be personal, and confessional. So I want you to know who’s talking to you, and if you think I’m way off base, by all means, let me know. And in the end, if you read the totality, my column and your responses, the story I wrote gets deeper and richer. You call me out on my assumptions, you offer some of your own, and what actually happened, what it was really like to be alive 5,300 years ago becomes… well, an argument among moderns about ancients that we will never meet.
Scholars aren’t unanimous about whose name is oldest in the historical record. Yuval Noah Harari’s new book Sapiens: A Brief History of Humankind gives the crown to Kushim. The Oriental Institute at the University of Chicago goes for Gal-Sal and his slaves in their 2010–2011 annual report. Andrew Robinson, in his Writing and Script: A Very Short Introduction, also champions Gal-Sal, but his book came earlier, so maybe Harari has scooped him. Here’s the video that argues for Kushim:
If the name Gal-Sal strikes some of you as familiar, it appears in the title of a 1942 Rita Hayworth/Victor Mature movie, My Gal Sal, about a songwriter who falls crazily in love with a singer on the vaudeville circuit named Sal (short for Sally Elliot). I watched it. It’s terrible. Kushim, meanwhile, survives. According to the blog Namespedia, it turns out that lots of Russian families call themselves Kushim to this day, and in the U.S., it’s a relatively popular first name. They’ve even got Kushim bar graphs!
I suppose “Neanderthal delicacy” may sound like an oxymoron. Most people think of Neanderthals and other ancient people as cave men, brutes capable of little more than smashing and grunting. To the extent you’ve ever thought about what they ate, you probably assumed it was, well, whatever they could get their dirty hands on.
Or maybe you remember The Clan of the Cave Bear, the 1980 bestseller that helped shape Neanderthals in the popular imagination. In the book, a Homo sapiens girl named Ayla is adopted by Neanderthals who communicate mainly through hand signals and seem incapable of learning.
Yet the more we learn about our ancient cousins, the more sophisticated we find them to be. Amazing work on Neanderthal genetics by Svante Pääbo has found that they possessed a gene called FOXP2 that is key to speech in modern humans, raising the question of whether Neanderthals had language. They may even have been capable of abstract thinking and art.
Now, a new study suggests that the Paleolithic crowd had its own version of fine dining, unsettling as the choice of fare may be. It appears that baby elephants may have been a particular delicacy—basically, pachyderm veal.
Most studies of ancient diets have focused on simply figuring out what people ate, not what they liked. But Ran Barkai of Tel Aviv University and his graduate student Hagar Reshef wondered if there was any way to make a reasonable guess about the tastes of early hominins. They report their findings in an upcoming issue of Quaternary International.
“The direct investigation of taste preference in Paleolithic times is impossible,” says Reshef, but there’s “plenty of circumstantial evidence.”
First, the scientists point to recent evidence that Neanderthals did have a sense of taste. Work by Carles Lalueza-Fox found taste-related genes in Neanderthals, specifically for bitter tastes, that could have shaped their food preferences. The gene varied, as it does in modern humans. “What seems clear is that keeping a wide range of taste perception was key in hominin groups,” Lalueza-Fox says.
As for what they ate, the butchered bones of mammoths and ancient elephant species, and particularly young elephants, are fairly common in Paleolithic archaeological sites around the world. In some cases, such as the Middle Pleistocene sites Gesher Benot Ya’akov in Israel and Notarchirico in Italy, the skulls of young elephants appear to have been dismantled, perhaps to eat the brain.
Young elephants would presumably be easier to kill than large ones, which could explain why more young ones were eaten. But even young elephants aren’t exactly easy to capture and kill, leaving Reshef wondering whether they were also hunted as a preferred food—because they’re tasty.
That raises one obvious question: Are baby elephants tasty? Here, Reshef and Barkai looked at the historical record and modern-day hunter-gatherers. A 1967 study of the Liangula hunters in East Kenya reported that they preferred young elephants because they tasted better, and reports from other groups followed suit, with the general consensus being that elephants, and especially the young, taste sweet and fatty.
The team also checked out the nutritional value and quality of elephant meat. Studies of the biochemical composition of fat tissue reveal a high nutritional value for young elephants compared with adults.
We can’t wind back time to ask a Neanderthal what he liked, but it seems plausible that they put some effort into finding food they liked, and that baby elephant was on the list. “I would say that both the vulnerability and taste are relevant,” Reshef says.
Why would we care what Neanderthals or other hominins liked to nosh on? They sharpened their flints while dreaming of slicing into baby elephant; I wait in line for two hours to eat fancy ramen noodle soup. To each his own, right?
Perhaps. But it’s also part of understanding what makes us human.
“I believe that taste preference in ancient times was a motivating power in human evolution by pushing creative and technological abilities,” says Reshef.
Just think about that for a second. The quest for deliciousness: a motivating power in human evolution.
I could buy it. Given how much human time, creativity, and effort go into food today (Exhibit A: any Whole Foods store), it’s easy to believe that we are who we are, at least just a little bit, because we have been working for so long on new ways to perfect the snack. Thank you, sense of taste.
(A special thank you to my keen-eyed colleague Mark Strauss for pointing out the elephant study.)
That hair you’ve seen so many times on the dollar bill? That hair he’s got crossing the Delaware, standing by a cannon, riding a horse in those paintings? His hair on the quarter? On all those statues? The hair we all thought was a wig? Well, it wasn’t a wig. “Contrary to a common belief,” writes biographer Ron Chernow in his Pulitzer Prize-winning Washington: A Life, George Washington “never wore a wig.”
Turns out, that hair was his. All of it—the pigtail, the poofy part in the back, that roll of perfect curls near his neck. What’s more (though you probably already guessed this), he wasn’t white-haired. There’s a painting of him as a young man, with Martha and her two children, that shows his hair as reddish brown, which Chernow says was his true color.
The whiteness was an effect. Washington’s hairstyle was carefully constructed to make an impression. It wasn’t a sissyish, high-society cut. It was, back in the 1770s and 1780s, a military look, something soldiers or would-be soldiers did to look manly. “However formal it looks to modern eyes,” Chernow writes, “the style was favored by military officers.”
Think of this as the 18th-century equivalent of a marine buzz cut. In Washington’s time, the toughest soldiers in Europe, officers in the Prussian Army, fixed their hair this way. It was called a queue. British officers did it too. So did British colonials in America.
Here’s how it worked. Washington grew his hair long, so that it flowed back toward his shoulders.
Then he’d pull it firmly back, broadening the forehead to give him, Chernow writes in his biography, “an air of martial nobility.” The more forehead, the better. Nowadays we notice chins. But not then. Foreheads conveyed force, power.
The look was achieved with appropriate muscularity. In the British Army a tough hair yank was a rite of passage for young officers; it was common to yank really hard.
A military journalist, Joachim Hayward Stocqueler, describes a British soldier from that time who says his hair and skin were pulled so fiercely, he didn’t think he’d be able to close his eyelids afterward.
Once gathered at the back, hair was braided or sometimes just tied at the neck by a strap or, on formal occasions, a ribbon. Washington would occasionally bunch his ponytail into a fine silk bag, where it would bob at the back of his head.
Then he would turn to his side hairs, which he “fluffed out,” writes Chernow, “into twin projecting wings, furthering the appearance of a wig.” George Washington “fluffing out”? That’s such an odd image. Artist Wendy MacNaughton, my partner in crime, sees it this way:
You should close your eyes and see him fluffing in your own way.
Next question: How did those side curls stay curled? Betty Myers, master wigmaker at Colonial Williamsburg in Virginia, wrote to me that it was common to grease one’s hair with pomade. Oily hair helped. We don’t know how often Washington shampooed, but the less he showered, the firmer his fluffs.
And now, to the whiteness. Washington’s hair wasn’t splotchy. It was like a snow-covered mountain, evenly white. This was accomplished by sprinkling a fine powder on the head. There were lots of powders to choose from, writes Myers, including “talcum powder, starch, ground orris root, rice powder, chalk, [or] even plaster of paris …” Washington probably used a finely milled (expensive) product, which was applied, cloud-like, to his head. To keep from gagging in a powder fog, it was common to cover the face with a cone of coiled paper, like this:
The powder was sometimes applied with a handheld bellows. An attendant would pump a cloud of powder from a small nozzle and let it settle on the hair. But Washington, says biographer Ron Chernow, would dip a puff, a snakelike bunch of silk striplings, into a powder bag, then do a quick shake over his bent head. Maybe a slave would do this for him. When being powdered, it was traditional to wear a “powdering robe,” basically a large towel tied around the neck, to keep from being doused.
Which leaves one last puzzle. Washington was a careful, self-conscious dresser. When he appeared at the first Continental Congress, he was the only important delegate to wear a military costume, choosing, Chernow writes, the “blue uniforms with buff facings and white stockings” of the Virginia citizen militia while adding his own “silk sash, gorgets, [and] epaulettes.” Later, he’s described dancing at balls in black velvet. So if Washington liked dark clothes, how’d he keep the powder from showing? The man would have been covered in dandruff-like sprinkles. (Editor’s Note: One of our readers, Mike Whybark, shared a painting that makes me wonder … Maybe his shoulders did look a little snowed-on.) Myers, the wig scholar, says that’s why Washington bunched his ponytail into a silk bag, to keep from leaving a white windshield wiper splay of powder on his back when he was dancing with the ladies (which he liked to do). As for keeping the powder off one’s shoulders, how Washington did that—if he did do that—nobody could tell me. Probably every powder-wearing guy in the 1760s knew the secret, but after a couple of centuries, whatever Washington did to stay spotless is lost to us.
We can stare all we like at his shoulders and wonder, but the truth is, there are some things about our first president we may never, ever know.
Wendy MacNaughton draws people, cats, bottles, scenes, faces, places. If, totally out of the blue, I call her and say, “Can you imagine Leonardo da Vinci’s personal notebook or George Washington getting his hair done?” she just giggles and draws. And a week later, I’m doing a happy dance. If you want to see what she’s up to right now, you’ll find more of her work here. And if you enjoy presidential hair stories, here’s the other Big Guy, Abe Lincoln, on a day in 1857 when he clearly lost his comb. Hairstylists shouldn’t look—it’s too scary.
The first known murder was just as brutal as any other. The attacker smashed the victim twice in the head, leaving matching holes above the victim’s left eyebrow. The dead body was then dropped down a 43-foot shaft into a cave—where it lay for nearly half a million years.
Talk about your cold case.
Paleontologists pieced together the 430,000-year-old skull and reported their forensic analysis Wednesday in the journal PLOS ONE. Injuries to the skull represent the oldest direct evidence of homicide, the scientists say.
As for whether this was the first murder ever to occur, “for sure that’s not the case,” says Nohemi Sala, lead author of the study. The scientists can describe this victim as a young adult, but the exact age and even the sex are unknown.
“In the fossil record, there are many cases of traumatic injury, but not a lot of evidence of killing,” says Sala, a paleontologist at the Instituto de Salud Carlos III in Madrid.
That doesn’t mean killing was uncommon before modern times, of course, but fossilized remains of any kind are relatively rare so far back.
The last several tens of thousands of years, on the other hand, are littered with grisly scenes. Take the case of Shanidar-3, a Neanderthal who lived about 50,000 years ago. A cut on one of his left ribs shows that Shanidar-3 was probably killed by a spear, making him perhaps the oldest known murder victim prior to the new find.
The latest skull comes from the Sima de los Huesos, or “Pit of Bones,” site in Spain, where paleontologists have found the remains of at least 28 individuals. Who were these people? Well, they weren’t modern humans, and they weren’t really Neanderthals either.
Exactly what to call the Sima de los Huesos people has been debated, but Sala and her colleagues identify them as members of the species Homo heidelbergensis, an early human ancestor that gave rise to the Neanderthals.
Cause of death
To figure out whether the skull fractures resulted from blows or from the fall down the cave shaft, the team compared the injuries to those from modern cases of violence and falls. A face-to-face attack with a blunt instrument best fits the pattern of injury, the scientists say. The bones showed no evidence of healing, so the victim probably died immediately or soon after the attack.
What’s more, the two holes in the skull are the same shape and appear to have been made by the same weapon. It’s very unlikely that an accidental fall onto a rock would produce two nearly identical skull fractures, the team says.
Sala says the weapon was probably “something very hard,” but we’ll never know if it was made of wood or rock, or something else.
The scientists scoured the site, she says, but didn’t turn up any potential murder weapons. Only one stone tool was found at the site, and it wasn’t the right shape.
Another unsolved mystery: what drove an ancient person to kill. “Life was hard in the past,” Sala says, so there could have been conflicts over resources or any number of reasons for a fight.
Even with difficult lives, though, Sala describes the Sima de los Huesos people as caring for one another. “There were 28 individuals at the site of different ages,” she says. “We know that some of these people had health problems. One person had very serious pathology in the lower back and probably had trouble walking and moving.” Someone had to be caring for these people before their deaths, she says.
And while it might not sound like a lovely funeral today, the fact that people living at the site buried bodies by dropping them down the same shaft indicates some sense of ceremonial burial or ritual—the dead weren’t merely dragged away from the campsite to decay.
Overall, the site paints a picture of ancient people who lived, loved—and sometimes fought—together.
Sala’s take on life with Homo heidelbergensis: “They’re not so bad—at least they have also good points.”
When did the last of the ground sloths disappear? The standard answer is “about 10,000 years ago”. That’s the oft-repeated cutoff date for when much of the world’s Ice Age megafauna – from mastodons to Megatherium – faded away. It’s nice and neat, falling just after the close of the last Ice Age and during a time when humans were spreading to new continents. In fact, it’s too clean a cutoff. The shaggy, ground-dwelling sloths that inhabited almost the entire span of the New World didn’t all topple over at once. The very last of their kind, both protected and made vulnerable by life on islands, were still shuffling along 4,200 years ago.
Calling the time of death for any species or lineage is always complicated by definitions and details. Should a species be considered extinct when its very last member perishes, or when the population sinks below a level from which they can recover? And in these fading families, should the explanation for extinction be the cause of death of the last individual, or do we assemble a more complex picture that considers factors that made the population vulnerable in the first place? Both science and storytelling influence our answers to these questions, but one thing is abundantly clear. Extinction is a process, not a single fell swoop.
Consider the times when the giant ground sloths disappeared. They were one of the great success stories of the Ice Age – with 19 genera ranging through South, Central, and North America, as well as Caribbean islands at the end of the Pleistocene – but, as reported by paleontologist David Steadman and colleagues in a 2005 study, 90% of the existing Ice Age sloths disappeared within the last 11,000 years.
Megalonyx and other giants from North America were some of the first to go. While Steadman and colleagues stressed that the dates represent “last appearance dates” rather than actual time of species death, the youngest known sloth remains from North America date to about 11,000 years ago. South America’s ground sloths, such as the enormous Eremotherium, soon followed – the youngest dung and tissue samples found on the continent date between 10,600 and 10,200 years ago.
But for another 5,000 years, ground sloths survived. They weren’t on the continents, but scattered through the islands of the Caribbean. I hadn’t even heard of these sloths until paleogeneticist Ross Barnett mentioned them in a Twitter exchange long ago, and, as reviewed in the paper by Steadman and colleagues, there were at least five genera and thirteen species of large ground sloths unique to these islands.
The largest of all was Megalocnus. This sloth hasn’t received nearly as much attention as the other “mega”-prefixed sloths, but, as you can see from the bones on display at the American Museum of Natural History’s fossil mammal hall, this 200-pound sloth was still an impressive beast. Based on remains found in a limestone cave on Cuba, Steadman and colleagues determined that Megalocnus lived until at least 6,250 years ago.
Other, smaller sloths persisted even longer. Parocnus, also found on Cuba, lived until about 4,960 years ago, and the small ground sloth Neocnus trundled over Hispaniola until about 4,500 years ago. There’s no direct evidence that people were hunting or eating the sloths, but, based on tentative evidence for human occupation of Caribbean islands around 5,000 years ago, Steadman suggests that the arrival of Homo sapiens tipped the sloths into extinction.
Of course, last appearance dates are often revised with new finds and updated techniques. Two years after the Steadman study, Ross MacPhee and coauthors published a new, youngest date for Cuba’s Megalocnus. From a tooth found on the island, the researchers estimated that the ground sloth survived to at least 4,200 years ago.
Through the lens of geologic time – wherein millions of years are thrown around because the numbers are too big to truly comprehend – extending the lifetime of a ground sloth another 2,000 years might not sound like much. But MacPhee and colleagues underscore the importance of getting good dates for when Ice Age creatures vanished. If people really showed up on Cuba and other sloth-bearing islands around 5,500 years ago, then humans and ground sloths coexisted for over a thousand years and the “blitzkrieg” model of extinction starts to crumble. Humans may have still been responsible for the extinction of the sloths and other species, but the record doesn’t show the pattern of rapid die-off that has sometimes been used to pin our species as the chief cause of megafaunal extinctions.
In time, we may get a clearer picture of why such a diverse and widespread group of mammals disappeared. Blaming humans, climate change, or any of the other traditional suspects without more detailed evidence masks the complexity of how extinction happens. But even if paleontologists eventually puzzle together what happened to these great beasts, I’ll still be saddened by the fact that I just missed the ground sloths. Especially because there are habitats – such as vast stretches of desert in the basin and range I call home – that could still host them. Sometimes, when hours of rolling over the interstate start to addle my brain, I begin to imagine them out among the Joshua trees – reminders that we still live in the shadow of the Ice Age world.
Long ago, about 36 million years before today, a raft of monkeys found themselves adrift in the Atlantic. They’d been blown out to sea by an intense storm that had ripped up the African coast, and now a mat of floating vegetation was the closest thing to land for miles in all directions. But luck was with them. Thanks to a favorable current, they were thrown onto the beach of a new continent – South America.
I’ll admit that this scenario requires a little scientifically-informed imagination. No one has ever found a fossilized huddle of monkeys clinging to battered vegetation in ancient ocean sediments. But we know that such events must have happened in the past. Teeth tell the tale.
In the latest issue of Nature, paleontologist Mariano Bond and colleagues describe a handful of fossil teeth found in the rainforest of Peru. Some are mysteries, too incomplete to identify down to genus or species, but a set of three molars are clearly from a new species of early monkey.
Three teeth might not seem like much to name a new animal on, but, fortunately for paleontologists, mammals have always had very distinct teeth that tend to get fossilized even when the rest of the body decays. From the cusps and ridges, Bond and coauthors were able to narrow down the identity of this animal to a monkey about the size of a modern-day tamarin. They’ve named it Perupithecus ucayaliensis.
At about 36 million years old, Perupithecus pushes back the arrival of monkeys in South America 10 million years earlier than previously thought. And even better, the molars of Perupithecus closely resemble those of Talahpithecus – an early monkey that lived around the same time in northern Africa. This doesn’t mean that Perupithecus was directly descended from Talahpithecus. Rather, it’s another strong sign that the ancestors of New World monkeys were accidental migrants from Africa.
Perupithecus, or its immediate ancestors, probably arrived on rafts of storm-tossed vegetation. There wasn’t an overland route for the primates to make the same journey. Even though South America and Africa were once connected, they had drifted apart by 110 million years ago – long before the evolution of primates, much less monkeys. South America remained an island continent from then until its collision with Panama about 3 million years ago. There was no other way for monkeys to get from Africa to South America except by sea. The monkeys that thrive in the Americas today, from tamarins to muriquis, are the descendants of prehistoric primates fortunate enough to survive the journey.
Five years ago cognitive scientist Rafael Núñez found himself in the Upper Yupno Valley, a remote, mountainous region of Papua New Guinea. The area is home to some 5,000 indigenous people, and Núñez and his graduate student, Kensy Cooperrider, were studying their conceptions of time.
Most of you reading this post have a Western understanding of time, in which time has a spatial relationship with our own bodies. The past is behind us, the future ahead. I look forward to Christmas and reach back into my memories. But that particular cognitive framework is not universal. Núñez’s work has shown, for example, that the Aymara people of the Andes think about time in the opposite way; for them, the future is behind and the past lies ahead.
An anthropologist working in Papua New Guinea, Jürg Wassmann, suspected that the Yupno have yet another way of thinking about time, and invited Núñez and Cooperrider to come and investigate. The Yupno have no electricity and no roads; getting to a city involves a several-day hike. They live in small thatch huts surrounded by green mountains. This rolling landscape, the researchers discovered, is what centers the Yupno’s conception of time. For them, the past is downhill and the future uphill.
Núñez and Cooperrider figured this out by analyzing the way the Yupno point during natural speech. And in the midst of doing those experiments, the researchers stumbled onto something else unexpected: The Yupno don’t point like Westerners do.
We Westerners have a boring pointing repertoire. Most of the time, we just jut out our arm and index finger. If our hands are occupied — carrying a heavy load, say — then we might resort to a jerk of the head or elbow. But if the pointer finger’s free, we’ll point it.
Not so for the Yupno. Within a few days of their arrival in the valley, Núñez and Cooperrider noticed that the Yupno often point with a sharp, coordinated gesture of the nose and head that precedes them looking toward the point of interest. Here’s how the scientists described the nose part of the gesture, dubbed the ‘S-action’, in a 2012 paper:
“The kernel of the nose-pointing gesture is a distinctive facial form that is produced by a contraction of the muscles located bilaterally on both sides of the nose, which raise the upper lip and slightly broaden the wings of the nose,” they write. “Informally, the combined effect of pulling the nose upward and pulling the brow downward and inward may be characterized as an effortful scrunching together of the face.”
Last year Núñez and Cooperrider made a second trip to the Yupno Valley to get a better understanding of how often the Yupno use the S-action, and why.
For this study (which was funded by the National Geographic Society), the researchers designed a game in which two people must work together to put various colored blocks into a particular configuration. One person, the director, sees a photo of the target configuration and then instructs the other person, the matcher, on where to move the pieces to make them match the photo.
The game presents a tough communication challenge that players meet by using lots of demonstratives (“This one over here!”, “That one over there!”) and frequent pointing, Núñez says.
The Yupno tend to use nose pointing more than finger pointing, as Cooperrider reported at the Cognitive Science Society meeting in July. That sharply contrasts with what the researchers observed among college students playing the same game in Núñez’s lab at the University of California, San Diego. Westerners, in the researchers’ words, “stuck unwaveringly to index finger pointing.”
OK, so culture seems to affect pointing behavior. But there are lots of ways in which Westerners are different from the Yupno. Why, I asked Núñez, should we care about pointing?
Pointing, he answered, seems to be a fundamental building block of human communication. Great apes are never seen pointing in the wild. And in human babies, pointing develops even before the first word.
If we want to understand why people point, then it’s critical to look at how all people point, not just the WEIRD (Western, educated, industrialized, rich, democratic) ones. “If we want to understand human evolution and human minds, we need to really look at variety,” Núñez says. And whatever theories researchers come up with to explain the evolutionary or neural roots of pointing, “they would have to be able to explain all of these different forms.”
The Yupno aren’t the only ones who point with their face. Lip pointing — in which protruding lips precede an eye gaze toward the area of interest — has been observed in people from Panama and Laos, and in groups in Australia, Africa, and South America. Head pointing, according to one study, happens frequently among people speaking Arabic, Bulgarian, Korean, and African-American Vernacular English.
Núñez speculates that early human ancestors used a wide variety of pointing gestures, and these have been shaped and pruned over time depending on the needs of a particular culture.
He doesn’t know why the Yupno prefer nose pointing, but speculates that it could be related to their penchant for secrecy. On the second day of his first visit, Núñez was walking through the woods with about 25 children behind him. He was struck by their quiet: For the entire 30 minutes, the children were whispering. He soon noticed that Yupno adults did it, too. “The amount of whispering that we observed in this community is unbelievable,” he says.
So perhaps the S-action is a way to convey meaning in a less showy way than extending an arm that everyone can see. “In this community, it’s very important to know who’s saying what to whom and about what and at what time,” he says. “There are a lot of cases where you don’t want to be seen saying something to somebody.”
But that’s just a hypothesis. Also mysterious: Why did Western culture lose its pointing variety?
Actually, Núñez muses, we may still be evolving on that front. Consider someone at a conference presenting information to several hundred people. What do they use? A laser pointer.
“If you want to call attention to something 25 meters away, no body part could be used to achieve that goal,” he says. “In our digital era we’re finding new ways to achieve the same fundamental goal that our ancestors had: How can I drag your attention to this particular thing?”
Hands evolved to punch faces. Faces evolved to take punches. That’s the hypothesis being bandied about by University of Utah researchers Michael Morgan and David Carrier, the pair proposing that the apparent “protective buttressing” of our skulls and hands is a sign of violent prehistoric fights where fists of fury dictated who would mate and who would exit the gene pool. It’s a great example of a just-so story.
Morgan and Carrier’s new paper, published in Biological Reviews, is a sequel to an initial paper that suggested our hands evolved as cudgels. This was more than a bit of a stretch. “The goal of this study was to test the hypothesis that the proportions of the human hand make it an effective weapon,” Morgan and Carrier wrote in the first study, but they couldn’t provide any evidence that punching was a preferred or even common mode of fighting in the past. The hypothesis rested on a post hoc fallacy of the same sort used by “aquatic ape” devotees – because our hands can be effective weapons, then they must have evolved for that purpose. No surprise that the concept of a spandrel – a trait that wasn’t molded specifically by natural selection, but is an evolutionary byproduct later co-opted for a different use – never appears in Morgan and Carrier’s considerations of pummeling fists.
But the skull paper is even stranger. Although Morgan and Carrier focused on the bludgeoning qualities of modern human hands in their previous paper, their new review suggests that our ancient relatives and forebears – the australopithecines – had faces that were molded into punching bags by natural selection. No sooner did humans come out of the trees, Morgan and Carrier suggest, than they started whaling away on each other. The trouble is that they undercut their own hypothesis, leaving only a crumpled heap of speculation.
Citing crime statistics from western countries, Morgan and Carrier write that fistfights often result in broken noses, jaws, and other facial bones. Therefore, they reason circularly, prehistoric humans that punched each other in the face should have more robust facial bones to cope with such blows. Given that early humans Australopithecus and Paranthropus – the latter often called “robust australopithecines” – had broad faces with wide cheeks and thick brow ridges, they’re obviously perfect candidates for Morgan and Carrier’s favored interpretation.
Morgan and Carrier didn’t study whether or not the hands of the early australopithecines could form a fist. Their previous work was on our species, Homo sapiens. Nor did they look for signs of broken facial bones or blunt-force trauma on prehistoric skulls, or even try to model how early human skulls would have reacted to the stresses of an incoming fist. The entire argument is simply that australopithecine skulls look like they could take a punch.
In Morgan and Carrier’s view, the heavy brows, large jaws, and flaring cheeks of the australopithecines are not signals of the way primates grow or the different plant foods they dined on, as paleoanthropologists have discerned, but were adaptations for reducing damage doled out by males as they competed for mates. There’s no evidence that australopithecines fought like this. The entire conjecture is based on sports like mixed martial arts and modern crime stats. And females don’t even figure into Morgan and Carrier’s hypothesis. Female mate choice, and why sexual dimorphism has drastically decreased through time, is either ignored or overshadowed by the belief that we owe our most distinctive features to males walloping each other. This is bro science – dudes pummeling each other driving human evolution.
Those early humans couldn’t make the tight fists we do, though. Australopithecines – Lucy and her kin – were bipedal walkers that retained some signs of their arboreal ancestry, such as more ape-like arms and fingers. The hands and limbs of archaic hominins don’t match up with the supposedly “buttressed” skulls. More than that, our species doesn’t have the reinforced cheek bones, deep jaws, or prominent brow ridges that Morgan and Carrier cast as defensive structures. If our fists are so well-suited for punching, why have our faces lost their osteological protection? Morgan and Carrier suppose that we’re weaker than our ancestors, and therefore don’t need thick facial bones, but this runs counter to the heart of their hypothesis. If our hands evolved as weapons, then we should see a coevolution between striking hands and stout faces. Our prehistory shows no such pattern.
Saying that our hands are adapted to strike or that our skulls evolved to withstand those anatomical truncheons is fine as a hypothesis. But a hypothesis is just the initial fuel for the scientific engine. Morgan and Carrier haven’t let that experimental machinery run, instead looking to isolated tidbits of modern culture and projecting those behaviors onto our past. That’s not science. That’s storytelling.
Edit, 12/14, 10:59pm: This post has now been updated with responses from the new study’s lead author.
A few months ago I wrote a story for National Geographic News that seemed to pique a lot of readers’ imaginations, and understandably so. It was about a study by Dean Snow reporting that, contrary to decades of archaeological dogma, many of the first artists were women.
Neat, right? But now there’s a twist in the tale: Another group of researchers is claiming the study’s methods were unsound. Snow has his own critiques of the criticism (more on that later). I’m less interested in who’s right than in a fundamental question behind the controversy, one that is relevant to all archaeological investigations: What does the present have to do with the past?
Snow’s study, published in the journal American Antiquity last October, focused on the famous 12,000- to 40,000-year-old handprints found on cave walls in France and Spain. Because these hands generally appear near pictures of bison and other big game, scholars had long believed that the art was made by male hunters. Snow tested that notion by comparing the relative lengths of fingers in the handprints. Why? Because among modern people, women tend to have ring and index fingers of about the same length, whereas men’s ring fingers tend to be longer than their index fingers.
Snow first scanned the hands of 111 people of European descent who lived near Pennsylvania State University, where he is an emeritus professor of anthropology. By comparing male and female hands on specific measures — such as the length of the fingers, the length of the hand, the ratio of ring to index finger, and the ratio of index finger to little finger — Snow developed an algorithm that could predict the sex of a given handprint. He also validated the algorithm on a second set of modern hands (50 males and 50 females).
The algorithm was only weakly predictive — with an accuracy of just 60 percent — because there’s a lot of overlap between the hands of modern men and women. But the equations were far more accurate when used on a set of 32 ancient hand stencils. The various measurements of these hands fell at the extreme ends of the modern sample, making it easy for the algorithm to categorize them as male or female. Snow found that 24 of the 32 prints — 75 percent — were female.
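To make the method concrete, here is a toy sketch of how a two-step discriminant like the one described might look in code. Every coefficient and threshold below is invented for illustration — these are not Snow’s published equations — but the structure mirrors the article: a size cue (hand length) combined with the ratio of index to ring finger (the 2D:4D ratio), with women tending toward smaller hands and ratios near 1.

```python
def predict_sex(hand_length_cm, index_cm, ring_cm):
    """Classify a handprint as 'male' or 'female' from two cues.

    Hypothetical coefficients for illustration only:
    - size cue: hands longer than ~18 cm lean male
    - ratio cue: index/ring (2D:4D) below ~0.97 leans male,
      since men's ring fingers tend to be longer than their index fingers
    """
    size_score = hand_length_cm - 18.0           # positive = larger hand
    ratio_score = (index_cm / ring_cm) - 0.97    # negative = male-typical ratio
    # Weighted combination (weights made up): positive total leans male
    total = 0.5 * size_score - 10.0 * ratio_score
    return "male" if total > 0 else "female"

# A large hand with a male-typical low 2D:4D ratio
print(predict_sex(19.5, 7.0, 7.6))   # male
# A smaller hand with index and ring fingers of equal length
print(predict_sex(17.0, 7.0, 7.0))   # female
```

The weak 60 percent accuracy on modern hands reflects how much the male and female distributions overlap near these decision boundaries; the ancient stencils, falling at the extremes of the modern sample, would sit far from the boundary and classify unambiguously.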
The new study, published Monday in the Journal of Archaeological Science, challenges Snow’s reference sample. A team led by Patrik Galeta of the University of West Bohemia in Pilsen, Czech Republic, collected handprints from 100 contemporary people in southern France and then ran those measurements through Snow’s algorithm.
Galeta found that Snow’s algorithm predicted female hands fairly well, but was useless for males, making it overall a bad predictor of sex. The study showed, in other words, that sex differences in hands among modern people living in Pennsylvania are not the same as differences among modern people living in France. “Our understanding is that hands of French males are on average smaller than U.S. males,” Galeta notes. And that, he adds, “is why U.S. methods failed to correctly identify French males.”
The bottom line: if two modern populations don’t match, then how can we possibly say anything about handprints tens of thousands of years old?
“What this shows is that a basic assumption that everyone has been making is wrong, which is that we can take a contemporary human population and use it as a model across space and time,” says archaeologist David Whitley of ASM Affiliates, an archaeological consulting firm in Tehachapi, California. Whitley was not involved in either study.
This might explain, Whitley adds, why researchers studying these old handprints have often come to contradictory conclusions. Before Snow’s work, evolutionary biologist R. Dale Guthrie performed a similar analysis of the cave prints and reported that most of them came from adolescent boys.
Snow, however, doesn’t agree with the criticisms of the new study. “I would stand by my guns here,” he says.
He sees two possible reasons that his algorithm didn’t work on the new French sample. One is that the Czech researchers didn’t use his algorithm in the same way that he did. Snow did his analysis in two steps, running the data first through an equation related to the length of the hand, and then running those results through another equation based on the ratio between the index and ring finger. The Czech researchers, in contrast, looked at the two equations separately.
Alternatively, it could be that the Czech researchers didn’t measure hand length the same way Snow did, he says. Snow measured from the tip of the middle finger to the creases where the wrist meets the palm. “If you measured the length of the hand using some other terminus at the base, you might lose a centimeter or so of the overall length,” Snow says.
So who’s right, and how can this be resolved? “I would have to see their data, and they would have to see my data, and we would have to work it out,” Snow says.
So far neither group has made contact with the other, though both parties seem willing. (If and when they do, I’ll be sure to update this post.) The Czech group, for the record, rejects both of the explanations Snow proposed, saying that they used the algorithm and measured the hands exactly as Snow did.
Even if the Czech group is right, Snow says the main conclusion doesn’t change. “Even with their sample, they can show as well as I can that there were some women in them caves,” he says. “They might argue, well was it 50-50 or 70-30 or 80-20, but that part of it doesn’t concern me so much.”
Experts have been arguing over the identity of these handprints for decades, and that debate isn’t going away anytime soon. That’s part of good science. But I think this story also says something interesting about archaeology.
Archaeologists are constantly turning up objects from the distant past, and their job is to figure out what (or, in this case, who) they were. They begin, naturally, by making assumptions based on the objects and people we’re familiar with today. “It’s an issue we always confront — making ‘presentist’ projections onto the past,” Whitley says.
In the case of these handprints, the projection relates to our bodies. But it could be anything. “If you find a pot, then just calling it a pot assumes you have some understanding of what it was,” Snow says. “We all make inferences. You just have to be reasonably comfortable with your inferences.”
Many scientists have suggested that this cumulative culture depends on the size of our groups. The more of us there are, the faster our culture ratchets upwards in complexity. If our populations shrink, we lose skills and tech. We see this in theoretical models. We see it in past civilisations—Tasmania being the classic example.
And now, we can see it in two experiments. Working independently, two groups of scientists have shown that larger, more sociable groups are indeed better at maintaining complex cultural traditions, and even improving on them.
This might seem obvious, but people forget about it. As Joe Henrich, who led one of the new studies, told me, many scientists have assumed that Neanderthals were less intelligent than modern humans because they built less complex tools. The alternative is that they just lived in smaller, more scattered groups, and lacked the cultural ratchet that our ancestors had because of their big, connected societies.
As Henrich said: “For producing fancy tools and complexity, it’s better to be social than smart. And things that make us social are going to make us seem smarter.”
One hot early morning last July, archaeologist Jason De León and his team were collecting artifacts in an empty stretch of the Sonoran Desert in southern Arizona. The study area, about 55 miles south of Tucson and 40 miles north of the Mexican border, is traversed by hundreds of thousands of undocumented migrants every year. Since early 2009, De León has been cataloging the objects — water bottles, diapers, knock-off Nikes, rosaries — that the migrants leave behind on the brutal journey. But on this particular morning, his team stumbled on what he’d been dreading since day one: a dead body.
De León, 36, is one of National Geographic’s Emerging Explorers, and he described his Undocumented Migration Project yesterday to a packed auditorium at NG headquarters in Washington D.C. Most of what we hear about immigration comes from the perspective of law enforcement — think Border Wars — or pandering politicians. De León is using the migrants’ discarded possessions to tell their side of the story. “This is American history in the making,” he said, “and we can use the tools of archaeology to systematically record these steps.”
Since 2000, U.S. authorities in Tucson have made 4.5 million captures of undocumented migrants (a number that includes multiple captures of the same person). De León has interviewed hundreds of these hikers and become good friends with two: Miguel and Victor*.
He met the men in the summer of 2009 when visiting a migrant shelter in the border town of Nogales, Mexico. Miguel and Victor hadn’t known each other long. They met in a detention center in Tucson, caught after living illegally for 20-odd years in the United States. After being sent back to Nogales, they tried to trek back into the U.S. together, failed, and again wound up in Nogales. They were working in the shelter for a few weeks to pay for their stay; then they’d attempt to cross the border yet again.
De León spent a few weeks at the shelter, getting to know the two men over many hands of poker. Miguel and Victor talked optimistically about the future, promising De León that after they made it back to their home in Tucson, they’d invite him over to grill and catch up. They’d be drinking beers, but since De León was so young, they teased, he’d have apple juice.
De León went shopping with the men to get supplies for their trip. In towns like Nogales and nearby Altar, the local economy depends on migrants. Store shelves are lined with bottles of water and electrolyte juice, camouflage gear, hiking boots, first-aid kits. Altar’s baseball team is called the Coyotes, a nod to the Spanish euphemism for smugglers.
Victor bought a few garlic cloves for his backpack to ward off wild animals. De León wrote a good-bye message in marker on the inside flap: “Don’t forget you owe me an apple juice.”
De León walked with Victor and Miguel to the western edge of town, right up to an ominous, dark tunnel that ran underneath a highway overpass. On the other side of the tunnel was the desert and, if they were lucky, a way back home. De León cried as he watched the backs of his new friends disappear.
There was a good chance, he knew, that he’d never see them again. But about three weeks later, he got a phone call from Victor. “We’re in Tucson and we’ve got your apple juice,” he said.
For the first few years of his project, De León focused on cataloging the many objects left by migrants out in the desert. It’s a fascinating collection of things both banal (bottles, paper scraps) and unique (one of his favorites is an “illegal alien card” showing a green-faced alien and an Area 51 logo on it). Sometimes these objects are found alone — a t-shirt here, a backpack there. But De León has also found “migrant stations” with huge piles of clothes and trash. Unlike traditional archaeology that’s focused on the distant past, he calls his work “the archaeology of 10 minutes ago, literally”.
More recently he’s gotten into forensics. Since 1998, some 5,600 bodies have been found on the U.S.-Mexico border. That’s thought to be a wild underestimate of the migrants who die on the trip, but no one knows much about how a dead body fares in those conditions.
After consulting with forensic experts, De León learned that pig carcasses are often used as proxies of human flesh. So last summer his team dressed a dead pig in typical migrant clothes, placed it in the middle of the desert, and set up motion cameras to watch. For the first two weeks the animal decomposed naturally. Then the vultures came. Within 24 hours, most of the pig and its clothes had disappeared, including ID cards the researchers had stuffed in the pockets.
De León felt uncomfortable with the brutality of that experiment, but was reminded of its purpose just two weeks later, when he found the body of a middle-aged woman face down on the desert floor. He covered her in a colorful blanket he had found nearby. Then he waited with his team for seven hours until the Tucson sheriff came to pick her up.
After she was gone, De León’s students built her a shrine, as is the custom for deceased migrants. They decorated it with a sundry collection of religious objects bought at a local store.
De León, too, was moved by the experience. Over the next few weeks, he got in touch with various authorities and found out that a fingerprint analysis had identified her as Marisol, a 41-year-old mother of three from Ecuador. She had been on her way to meet family in New York.
Then De León did the hardest thing he’s ever had to do. He called her family.
He struggled, he told us, to find something positive to say.
“I picked up the phone and I said, ‘I’m the person who found Marisol. And I just wanted to let you know that we sat with her for a long time, that we waited. We sat with her before the birds could get to her.'”
The first draft of the Neanderthal genome, published in 2010, came with some titillating news. It showed that 50,000 years ago, these ancient hominids interbred with the ancestors of many modern humans. If you have European or Asian ancestry, an estimated 1 to 4 percent of your DNA came from Neanderthals.
On the off chance that your mind hasn’t gone there, allow me: Our ancestors, looking pretty much like we do today, had sex with the short, extremely muscular, big-nosed, big-browed, big-headed Neanderthals. Were the differences between the two species mostly physical, with shared intellectual and cultural pursuits the subjects of their pillow talk? Or were Neanderthals violent, mute, and stupid, as so often depicted in popular culture? Or something in between?
Neanderthals almost certainly weren’t as brutish as assumed a century ago. Anthropologists now know that they used tools, made art, and may have talked. Still, nobody fully knows how their brains worked, or how their thinking was different from ours. The uncertainty is understandable considering the evidence. All scientists have to go on are the fossilized skulls the Neanderthals left behind.
Using a new and somewhat controversial (more on that later) method of analyzing these ancient skulls, scientists in England have proposed a theory about the structure of the Neanderthal brain. Although the brains of our ancestors and Neanderthals were about the same size, Neanderthals had larger brain areas related to vision and body control, according to a study out today in Proceedings of the Royal Society B: Biological Sciences.
This implies, the researchers say, that compared with our ancestors, Neanderthals had less brain space for dealing with other skills and behaviors. For example, if the Neanderthals had less brain area devoted to social cognition, it might explain why they traveled shorter distances, left fewer symbolic artifacts, and lived in smaller communities.
“One of the implications of differing brain organization we propose is that Neanderthals had smaller social networks than modern humans because Neanderthals had smaller areas in their brains to deal with social complexity,” says investigator Eiluned Pearce, a graduate student working with experimental psychologist Robin Dunbar at the University of Oxford.
It’s an intriguing theory, no doubt. But some researchers wonder whether this isn’t paleo-phrenology*. Can crude anatomical relationships of the skull really reveal patterns of complex behavior?
Pearce’s team began with published data from a few dozen cranial ‘endocasts’, or rubber moldings made from the inside of skulls to show the shape of the outer brain. For this study, the researchers weren’t interested in the shape of the endocasts but rather their volume, to use as a proxy for brain size.
For each endocast, they also looked at the size of the eye sockets, or orbits. Studies of other primates have shown an interesting anatomical relationship: The bigger the eye, the bigger the visual cortex, the region at the back of the brain that interprets light signals from the retina to produce vision.
Comparing the endocasts made from 21 skulls of Neanderthals and 38 skulls of our ancestors, the researchers found that Neanderthals had larger orbits (after controlling for body size). That suggests that they also had larger eyes and visual cortices.
The findings agree with studies of endocast shape showing that Neanderthals had relatively larger occipital lobes (where the visual cortex resides) than our ancestors did, notes Emiliano Bruner, an anthropologist from the National Research Center on Human Evolution in Burgos, Spain. “We must seriously take into consideration that different human species may have had different cognitive capacities,” he says. “It is worth noting that ‘different’ does not mean worse or better, but just different.”
Why would the Neanderthals have larger eyes than our ancestors? The study suggests it’s because the Neanderthals evolved in Europe, at higher latitudes than hominids in Africa. At higher latitudes, they were exposed to lower light levels, requiring larger eyes for the same level of visual acuity. But other experts say this has nothing to do with vision. According to Bergmann’s Rule, species living in colder climates are larger than those living in warmer climates. “Humans at higher latitudes are bigger, and therefore have bigger orbits, than humans at lower latitudes,” says Trenton Holliday, an anthropologist at Tulane University.
Another problem, Holliday says, is that the researchers didn’t correct for the size of the face. Orbit size is known to increase with face size, and Neanderthals had larger faces than our ancestors did. “What I suspect is that if they correct for facial size, then the differences in relative size of the visual part of the brain will disappear,” he says.
The effect of face size “is definitely an avenue for further research,” Pearce says. But she doesn’t think it will make a difference. “Although overall body or face size might influence orbit size to some extent, a larger orbit still means a larger eye and therefore a larger visual cortex, which is our argument.”
But those are all technical concerns. The more interesting issue, to me, is the notion that the size of a brain area — the visual cortex, say — can say anything about how the Neanderthal brain worked. If there’s one thing that we’ve learned in the last century of neuroscience, it’s that the brain isn’t really modular. Yes, certain regions are specialized to process certain types of sensory inputs and are active during certain tasks. But they’re all part of distributed functional networks, and we’re nowhere near understanding how those networks lead to this or that behavior. Plus, we’ve learned from studies of injury that the brain is incredibly plastic, capable of finding several neural routes to carry out the same behavior.
So given all that, does it make sense to claim that Neanderthals didn’t have higher-order social cognition simply because their brains aren’t set up for it exactly like ours are?
*Franz Gall, the founder of phrenology, had some things to say about the occipital lobe of female Homo sapiens. According to the 2003 book Labeling People: “Gall also thought that, since women’s heads were larger in the back and their foreheads lower and smaller than those of men, they therefore sensed and judged differently, and their inferior organization made them superstitious.”
Postscript (11:52am EST):
The authors of the new study and one of my other sources have written me with responses to this post that I think are interesting and important, so I’ve copied them below.
The authors also asked me why I included the quote about phrenology at the end, and I think that’s a fair question. My intent was not to imply that this study is essentially phrenology, and I’m sorry if it came across that way. I guess my point is simply that we (in the media, but also scientists) must always be careful about how to interpret any particular finding. In this case, the study shows a contrast between the visual systems of Neanderthals and our ancestors. That could underlie a difference in their social processing, or it could very well not. The Gall example shows how these sorts of interpretations sometimes go too far. (I think today’s news coverage of this study makes my point pretty clearly.)
Now for what the real experts think:
Chris Stringer of the Natural History Museum (one of the study’s authors):
Re. larger face = larger eyes, these separately or together both point in the same direction of requiring more working space in the brain (somatic + visual)
“So given all that, does it make sense to claim that Neanderthals didn’t have higher-order social cognition simply because their brains aren’t set up for it exactly like ours are?”
No it doesn’t make sense to claim that, and I don’t think we claimed that – the implication instead is that, for example, Neas would not have been able to regulate such large social groups, and therefore would not have had the benefits of those larger social groups. A smaller size for the latter would have had implications for their level of social complexity and their ability to create, conserve and build on innovations.
I am surprised by the “debate” moving around on this paper. I mean, I agree we are dealing with inferences and speculation, but this is science, based on probability. Caution is recommended, as always. People find it “normal” and fascinating to make behavioural inferences from a single gene or molecule, but raise doubts about this complex analysis, which takes so many factors into consideration. Furthermore, this study seems extremely detailed and careful when compared with the usual standards in paleontology, which are often based on simple descriptions or basic statistics. Frequently, simple or even superficial approaches generate agreement or, at least, no criticism. In contrast, if an approach is more complicated, it generates diffidence.
This article is not about paleo-phrenology, but it is about correlations. Apart from the hypothesis on climate and so on, the evidence in this paper comes from correlations which, independently of the adequacy of the theories to explain the causes (which of course are the ultimate aim in science and must be debated), represents an actual discovery and an interesting proposal for further research.