In particular, Renfrew has been preoccupied by what he has dubbed the “sapient paradox”: the immense time lag between the emergence of anatomically modern human beings and the advent of the cultural behaviors that we take to define humanity.

Prehistory is defined as that period of human history during which people either hadn’t yet achieved literacy–our basic information storage technology–or left behind no written records. Thus, in Egypt, prehistory ended around 3000 B.C.E., in the Early Dynastic Period, when hieroglyph-inscribed monuments, clay tablets, and papyrus appeared; in Papua New Guinea, by contrast, it ended as recently as the end of the last century. Archaeologists and anthropologists accept this region-by-region definition of prehistory’s conclusion, but they agree less about its beginning. A few have seen prehistory as commencing as recently as around 40,000 B.C.E., with the emergence of Cro-Magnon man, who as Homo sapiens sapiens was almost indistinguishable from us (although Cro-Magnons, on average, had larger brains and more robust physiques). However, most experts would probably say that prehistory began in the Middle Pleistocene, as many as 200,000 years ago–when Homo neanderthalensis (sometimes classified as Homo sapiens neanderthalensis) and archaic Homo sapiens emerged. Either way, it’s assumed that the appearance of Homo sapiens sapiens triggered “a new pace of change … that set cultural development upon [an] … accelerating path of development,” as Renfrew writes in Prehistory. But Renfrew thinks that this acceleration must have been due to something else.

“The evidence that Homo sapiens’ arrival equates with full linguistic abilities, the human behavioral revolution, and so on is very limited,” Renfrew told me, adding that he sees nothing clearly separating the flint tools of the Neanderthals from those associated with Homo sapiens. As for the cave paintings at Altamira, Lascaux, and other Southern European sites, which are 15,000 to 17,000 years old: “They’re amazing, but stylistically singular and very restricted in their distribution. They mightn’t be characteristic of early Homo sapiens.” Overall, Renfrew thinks, if aliens from space had compared Homo sapiens hunter-gatherers with their earlier counterparts, they probably wouldn’t have seen much difference.

Two and a half million years ago, the first protohumans, Homo habilis, shaped stones to take the place of the claws and fangs they lacked, using them to kill small animals and scavenge the remains of larger ones. The payoff was immense: whereas metabolic needs like food processing constrain brain size for most mammals, eating meat enabled habilis to start evolving a smaller gut, freeing that metabolic energy for the brain’s use. After a few hundred thousand years, later hominids like erectus and ergaster had developed straightened finger bones, stronger thumbs, and longer legs. The expansion of hominid brains–they were twice as big within a million years, three times by the Middle Paleolithic–enabled symbolic communication and abstract thought. By 50,000 B.C.E., our ancestors had spread from Africa through Asia, Europe, and Australia.

Archaeogenetics Emerges

The paradox, or puzzle, is this: if archaic Homo sapiens emerged as long as 200,000 years ago, why did our species need so many millennia before its transition, 12,000 to 10,000 years ago, from the hunter-gatherer nomadism that characterized all previous hominids to permanent, year-round settlement, which then allowed the elaboration of humankind’s cultural efforts? To answer this question, Renfrew calls for a grand synthesis of three approaches: scientific archaeology, which collects hard data through radiocarbon dating and similar technologies; linguistic study aimed at constructing clear histories of the world’s languages; and molecular genetic analysis.
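For readers unfamiliar with how radiocarbon dating produces such hard data: the technique measures how much of a sample’s carbon-14, which decays with a half-life of about 5,730 years, remains since the organism died. Below is a minimal sketch of that arithmetic in Python; the function name and the 25 percent example are illustrative assumptions, not anything drawn from Renfrew’s work.

```python
import math

# Half-life of carbon-14, in years (approximately 5,730).
C14_HALF_LIFE = 5730.0

def radiocarbon_age(remaining_fraction: float) -> float:
    """Estimate a sample's age from the fraction of its original
    carbon-14 that remains, assuming simple exponential decay."""
    # N(t) = N0 * (1/2) ** (t / half_life), so t = half_life * log2(N0 / N).
    return C14_HALF_LIFE * math.log2(1.0 / remaining_fraction)

# Example: a sample retaining 25% of its carbon-14 is two half-lives old.
print(f"{radiocarbon_age(0.25):,.0f} years")  # -> 11,460 years
```

In practice, laboratories calibrate such raw ages against tree-ring and other independent records, since atmospheric carbon-14 levels have varied over time.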
