Image caption: Until this prehistoric hominid changed to a meat-centered diet, expanding its brain to enable complex tool- and weapon-making, it was easy prey for the saber-toothed cat.


How Homo sapiens Became the Ultimate Invasive Species

Many human species have inhabited Earth. But ours is the only one that colonized the entire planet. A new hypothesis explains why

By Curtis W. Marean | Jul 14, 2015, Scientific American

Sometime after 70,000 years ago our species, Homo sapiens, left Africa to begin its inexorable spread across the globe. Other human species had established themselves in Europe and Asia, but only our H. sapiens ancestors ultimately managed to push out into all the major continents and many island chains. Theirs was no ordinary dispersal. Everywhere H. sapiens went, massive ecological changes followed. The archaic humans they encountered went extinct, as did vast numbers of animal species. It was, without a doubt, the most consequential migration event in the history of our planet.

Paleoanthropologists have long debated how and why modern humans alone accomplished this astonishing feat of dissemination and dominion. Some experts argue that the evolution of a larger, more sophisticated brain allowed our ancestors to push into new lands and cope with the unfamiliar challenges they faced there.

Others contend that novel technology drove the expansion of our species out of Africa by allowing early modern humans to hunt prey—and dispatch enemies—with unprecedented efficiency. A third scenario holds that climate change weakened the populations of Neandertals and other archaic human species that were occupying the territories outside Africa, allowing modern humans to get the upper hand and take over their turf. Yet none of these hypotheses provides a comprehensive theory that can explain the full extent of H. sapiens’ reach. Indeed, these theories have mostly been proffered as explanations for records of H. sapiens activity in particular regions, such as western Europe. This piecemeal approach to studying H. sapiens’ colonization of the earth has misled scientists. The great human diaspora was one event with several phases and therefore needs to be investigated as a single research question.

Excavations I have led at Pinnacle Point on the southern coast of South Africa over the past 16 years, combined with theoretical advances in the biological and social sciences, have recently led me to an alternative scenario for how H. sapiens conquered the globe. I think the diaspora occurred when a new social behavior evolved in our species: a genetically encoded penchant for cooperation with unrelated individuals.

The joining of this unique proclivity to our ancestors’ advanced cognitive abilities enabled them to nimbly adapt to new environments. It also fostered innovation, giving rise to a game-changing technology: advanced projectile weapons. Thus equipped, our ancestors set forth out of Africa, ready to bend the whole world to their will.

A Desire to Expand

To appreciate just how extraordinary H. sapiens’ colonization of the planet was, we must page back some 200,000 years to the dawning of our species in Africa. For tens of thousands of years, these anatomically modern humans—people who looked like us—stayed within the confines of the mother continent. Around 100,000 years ago one group of them made a brief foray into the Middle East but was apparently unable to press onward. These humans needed an edge they did not yet have. Then, after 70,000 years ago, a small founder population broke out of Africa and began a more successful campaign into new lands. As these people expanded into Eurasia, they encountered other closely related human species: the Neandertals in western Europe and members of the recently discovered Denisovan lineage in Asia. Shortly after the moderns invaded, the archaics went extinct, although some of their DNA persists in people today as a result of occasional interbreeding between the groups.

Once modern humans made it to the shores of Southeast Asia, they faced a seemingly limitless and landless sea. Yet they pushed on, undaunted. Like us, these people could envision and desire new lands to explore and conquer, so they built ocean-worthy vessels and set out across the sea, reaching Australia’s shores by at least 45,000 years ago. The first human species to enter this part of the world, H. sapiens quickly filled the continent, sprinting across it with spear-throwers and fire. Many of the largest of the strange marsupials that had long ruled the land down under went extinct. By about 40,000 years ago the trailblazers found and crossed a land bridge to Tasmania, although the unforgiving waters of the southernmost oceans denied them passage to Antarctica.

On the other side of the equator, a population of H. sapiens traveling northeast penetrated Siberia and radiated across the lands encircling the North Pole. Land ice and sea ice stymied their entry into the Americas for a time. Exactly when they finally crossed into the New World is a matter of fierce scientific debate, but researchers agree that by around 14,000 years ago they broke these barriers and swept into a continent whose wildlife had never seen human hunters before. Within just a few thousand years they reached southernmost South America, leaving a mass extinction of the New World’s great Ice Age beasts, such as mastodons and giant sloths, in their wake.

Madagascar and many Pacific islands remained free of humans for another 10,000 years, but in a final push, mariners discovered and colonized nearly all these locales. Like the other places in which H. sapiens established itself, these islands suffered the hard hand of human occupation, with ecosystems burned, species exterminated and environments reshaped to our predecessors’ purposes. Human colonization of Antarctica, for its part, was left for the industrial age.

Team Players

So how did H. sapiens do it? How, after tens of thousands of years of confinement to the continent of their origin, did our ancestors finally break out and take over not just the regions that previous human species had colonized but the entire world? A useful theory for this diaspora must do two things: First, it must explain why the process commenced when it did and not before. Second, it must provide a mechanism for rapid dispersal across land and sea, which would have required the ability to adapt readily to new environments and to displace any archaic humans found in them. I propose that the emergence of traits that made us, on one hand, peerless collaborators and, on the other, ruthless competitors best explains H. sapiens’ sudden rise to world domination. Modern humans had this unstoppable attribute; the Neandertals and our other extinct cousins did not. I think it was the last major addition to the suite of characteristics that constitute what anthropologist Kim Hill of Arizona State University has called “human uniqueness.”

We modern humans cooperate to an extraordinary degree. We engage in highly complex coordinated group activities with people who are not kin to us and who may even be complete strangers. Imagine, in a scenario suggested by anthropologist Sarah Blaffer Hrdy of the University of California, Davis, in her 2009 book Mothers and Others, a couple of hundred chimps lining up, getting on a plane, sitting for hours extremely passively and then exiting like robots on cue. It would be unthinkable—they would battle one another nonstop. But our cooperative nature cuts both ways. The same species that leaps to the defense of a persecuted stranger will also team up with unrelated individuals to wage war on another group and show no mercy to the competition. Many of my colleagues and I think that this proclivity for collaboration—what I call hyperprosociality—is not a learned tendency but instead a genetically encoded trait found only in H. sapiens. Some other animals may show glimmers of it, but what modern humans possess is different in kind.

The question of how we came to have this genetic predisposition toward our extreme brand of cooperation is a tricky one. But mathematical modeling of social evolution has yielded some valuable clues. Sam Bowles, an economist at the Santa Fe Institute, has shown that an optimal condition under which genetically encoded hyperprosociality can propagate is, paradoxically, when groups are in conflict. Groups that have higher numbers of prosocial people will work together more effectively and thus outcompete others and pass their genes for this behavior to the next generation, resulting in the spread of hyperprosociality. Work by biologist Pete Richerson of U.C. Davis and anthropologist Rob Boyd of Arizona State additionally indicates that such behavior spreads best when it begins in a subpopulation, when competition between groups is intense, and when overall population sizes are small, like the original population of H. sapiens in Africa from which all modern-day people are descended. Hunter-gatherers tend to live in bands of about 25 individuals, marry outside the group and cluster into “tribes” tied together by mate exchange, gifting, and common language and traditions. They also sometimes fight other tribes. They take great risks in doing so, however, which raises the question of what triggers this willingness to engage in risky combat.

Insights into when it pays to fight have come from the classic “economic defendability” theory advanced in 1964 by Jerram Brown, now at the University at Albany, to explain variation in aggressiveness among birds. Brown argued that individuals act aggressively to attain certain goals that will maximize their survival and reproduction. Natural selection will favor fighting when it facilitates these goals. One major goal of all organisms is to secure a food supply, so if food can be defended, then it follows that aggressive behavior in its defense should be selected for. If the food cannot be defended or is too costly to patrol, then aggressive behavior is counterproductive.

In a classic paper published in 1978, Rada Dyson-Hudson and Eric Alden Smith, both then at Cornell University, applied economic defendability to humans living in small societies. Their work showed that resource defense makes sense when resources are dense and predictable. I would add that the resources in question must be crucial to the organism—no organism will defend a resource it does not need. This principle still holds today: ethnic groups and nation-states fight viciously over dense, predictable and valued resources such as oil, water and productive agricultural land. An implication of this territoriality theory is that the environments that would have fostered intergroup conflict, and thus the cooperative behaviors that would have enabled such fighting, were not universal in early H. sapiens’ world. They were restricted to those locales where high-quality resources were dense and predictable. In Africa, terrestrial resources are, for the most part, sparse and unpredictable, which explains why most of the hunter-gatherers there who have been studied invest little time and energy in defending boundaries. But there are exceptions to this rule. Certain coastal areas have very rich, dense and predictable foods in the form of shellfish beds.

And the ethnographic and archaeological records of hunter-gatherer warfare worldwide show that the highest levels of conflict have occurred among groups who used coastal resources, such as those in coastal Pacific North America.

When did humans first adopt dense and predictable resources as a cornerstone of their diet? For millions of years our ancient ancestors foraged for terrestrial plants and animals, as well as some inland aquatic foods on occasion. All these comestibles occur at low densities, and most are unpredictable. For this reason, our predecessors lived in highly dispersed groups that were constantly traveling in search of their next meal. But as human cognition grew increasingly complex, one population figured out how to make a living on the coast by eating shellfish. My team’s excavations at the Pinnacle Point sites indicate that this shift began by 160,000 years ago on the southern shores of Africa. There, for the first time in the history of humankind, people started targeting a dense, predictable and highly valued resource—a development that would lead to major social change.

Genetic and archaeological evidence suggests that H. sapiens underwent a population decline shortly after it originated, thanks to a global cooling phase that lasted from around 195,000 to 125,000 years ago. Seaside environments provided a dietary refuge for H. sapiens during the harsh glacial cycles that made edible plants and animals hard to find in inland ecosystems and were thus crucial to the survival of our species.

These marine coastal resources also provided a reason for war. Recent experiments on the southern coast of Africa, led by Jan De Vynck of Nelson Mandela Metropolitan University in South Africa, show that shellfish beds can be extremely productive, yielding up to 4,500 calories per hour of foraging. My hypothesis, in essence, is that coastal foods were a dense, predictable and valuable food resource. As such, they triggered high levels of territoriality among humans, and that territoriality led to intergroup conflict. This regular fighting between groups provided conditions that selected for prosocial behaviors within groups—working together to defend the shellfish beds and thereby maintain exclusive access to this precious resource—which subsequently spread throughout the population.
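Simple arithmetic shows why a yield of that size qualifies as the kind of dense, valuable resource the hypothesis requires. In the sketch below, the 4,500 kcal/hour figure comes from the De Vynck foraging experiments cited above, while the roughly 2,500 kcal daily requirement per adult is my own illustrative assumption, not a number from the study:

```python
# Back-of-envelope check on shellfish-bed productivity.
# Foraging yield is from the De Vynck experiments cited in the text;
# the daily energy requirement is an assumed round number.
FORAGING_YIELD_KCAL_PER_HOUR = 4500
DAILY_REQUIREMENT_KCAL = 2500  # assumed per adult per day

hours_per_person_per_day = DAILY_REQUIREMENT_KCAL / FORAGING_YIELD_KCAL_PER_HOUR
print(f"{hours_per_person_per_day:.2f} hours of foraging per person per day")
```

On these assumptions, a productive shellfish bed covers a full day's energy needs in roughly half an hour of gathering, which is precisely the dense, predictable payoff that economic defendability theory predicts groups will fight to control.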

Weapon of War

With the ability to operate in groups of unrelated individuals, H. sapiens was well on its way to becoming an unstoppable force. But, I surmise, it needed a new technology—projectile weaponry—to reach its full potential for conquest. This invention was a long time in the making. Technologies are additive: they build on prior experiments and knowledge and become increasingly complex. The development of projectile weapons would have followed the same trajectory, most likely evolving from stabbing stick, to hand-cast spear, to leverage-assisted casting spear (atlatl), to bow and arrow, and finally to all the wildly inventive ways contemporary humans have come up with to launch deadly objects.

With each new iteration, the technology became more lethal. Simple wood spears with shaved points tend to produce a puncture wound, but such an injury has limited impact because it does not bleed the animal quickly. Tipping the spear with a sharpened stone increases the trauma of the wound. This elaboration requires several connected technologies, however: one must be able to shape a tool into a point that will penetrate an animal and shape a base that can be attached to a spear. It also requires some type of connecting technology to secure the stone point to the wood shaft—either glue or a tying material, sometimes both. Jayne Wilkins, now at the University of Cape Town in South Africa, and her colleagues have shown that stone tools from a site in South Africa called Kathu Pan 1 were used as spearpoints some 500,000 years ago.

The antiquity of the Kathu Pan 1 find implies that it is the handiwork of the last common ancestor of Neandertals and modern humans, and later remains from 200,000 years ago show that, as one might expect, both descendant species made these kinds of tools, too. This shared technology means that, for a time, there was a balance of power between Neandertals and early H. sapiens. But that situation was about to change.

Experts agree that the appearance of miniaturized stone tools in the archaeological record signals the advent of true projectile technology, for which lightness and ballistics are crucial. Such tools are too small to wield by hand. Instead they must have been mounted in slots grooved into bone or wood to create weapons capable of being launched at high speed and long distance. The oldest known examples of this so-called microlithic technology come from none other than Pinnacle Point. There, in a rock shelter known simply as PP5-6, my team found a long record of human occupation.

Using a technique called optically stimulated luminescence dating, geochronologist Zenobia Jacobs of the University of Wollongong in Australia determined that the archaeological sequence in PP5-6 spans the time from 90,000 to 50,000 years ago. The oldest microlithic tools at the site date to around 71,000 years ago.

The timing hints that climate change may have precipitated the invention of this new technology. Before 71,000 years ago, the inhabitants of PP5-6 were making large stone points and blades from a type of rock called quartzite. Back then, as team member Erich Fisher of Arizona State has shown, the coastline was close to Pinnacle Point.

And reconstructions of the climate and environment by Mira Bar-Matthews of the Geological Survey of Israel and Kerstin Braun, now a postdoctoral researcher at Arizona State, indicate that conditions were similar to the ones that prevail in the area today, with strong winter rains and shrubby vegetation. But around 74,000 years ago the world’s climate began shifting to glacial conditions. The sea level dropped, exposing a coastal plain; summer rains increased, resulting in the spread of highly nutritious grasses and woodlands dominated by acacia trees. We think a large migration ecosystem in which grazing animals traveled east in the summer and west in the winter, tracking the rainfall and hence the fresh grass, developed on the formerly submerged coast.

Exactly why the denizens of PP5-6 began making small, light armaments after the climate shifted is unclear. But perhaps it was to pick off animals as they migrated across the new plain. Whatever the reason, the people there developed an ingenious means of making their tiny tools: turning to a new raw material—a rock called silcrete—they heated it with fire to make it easier to shape into small, sharp points. Only with the shift in climate that occurred could these early modern humans have had access to a sufficiently steady supply of firewood from the spreading acacia trees to make the manufacture of these heat-treated microlithic tools into an enduring tradition. We do not yet know what kind of projectile technology these microliths were used for.

My colleague Marlize Lombard of the University of Johannesburg in South Africa has studied somewhat later examples from other sites and argues that they represent the origin of the bow and arrow, given that damage patterns on them resemble those seen on known arrow tips. I am not totally convinced, because her study did not test the damage created by atlatls. Whether at Pinnacle Point or elsewhere, I think the simpler atlatl preceded the more complex bow and arrow.

I also suspect that like recent hunter-gatherers in Africa, whose lives were documented in ethnographic accounts, early H. sapiens would have discovered the effectiveness of poison and used it to increase the killing power of projectiles. The final killing moments of a spear hunt are chaos—pounding heart, heaving lungs, dust and blood, and the stink of sweat and urine. Danger abounds. An animal run to ground, fallen to its knees through exhaustion and blood loss, has one last trick: instinct screams for the beast to lurch to its feet one final time, close the gap and bury its horns in your guts. The short lives and broken bodies of Neandertals indicate that they suffered the consequences of hunting large animals at close range with handheld spears. Now consider the advantages of a projectile launched from afar and tipped with poison that paralyzes that animal, allowing the hunter to walk up and end the chase with little threat. This weapon was a breakthrough innovation.

Force of Nature

With the joining of projectile weapons to hyperprosocial behavior, a spectacular new kind of creature was born, one whose members formed teams that each operated as a single, indomitable predator. No prey—or human foe—was safe. Availed of this potent combination of traits, six men speaking six languages can put back to oar and pull in unison, riding 10-meter swells so the harpooner can rise to the prow at the headsman’s order and fling lethal iron into the heaving body of a leviathan, an animal that should see humans as nothing more than minnows. In the same way, a tribe of 500 people dispersed in 20 networked bands can field a small army to exact retribution on a neighboring tribe for a territorial incursion.

The emergence of this strange brew of killer and cooperator may well explain why, when glacial conditions returned between 74,000 and 60,000 years ago, once again rendering large swathes of Africa inhospitable, modern human populations did not contract as they had before. In fact, they expanded in South Africa, flourishing with a wide diversity of advanced tools. The difference was that this time modern humans were equipped to respond to any environmental crisis with flexible social connections and technology. They became the alpha predators on land and, eventually, sea. This ability to master any environment was the key that finally opened the door out of Africa and into the rest of the world.

Archaic human groups that could not join together and hurl weapons did not stand a chance against this new breed. Scientists have long debated why our cousins the Neandertals went extinct. I think the most disturbing explanation is also the most likely one: Neandertals were perceived as a competitor and threat, and invading modern humans exterminated them. It is what they evolved to do.

Sometimes I think about how that fateful encounter between modern humans and Neandertals played out. I imagine the boasting tales Neandertals might have told around their campfires of titanic battles against impossibly huge cave bears and mammoths, fought under the gray skies of glacial Europe, barefoot on ice slick with the blood of prey and brother. Then, one day, the tradition took a dark turn; the regaling turned fearful. Neandertal raconteurs spoke of new people coming into the land—fast, clever people who hurled their spears impossible distances, with dreadful accuracy. These strangers even came at night in large groups, slaughtering men and children and taking the women.

The sad story of those first victims of modern human ingenuity and cooperation, the Neandertals, helps to explain why horrific acts of genocide and xenocide crop up in the world today. When resources and land get sparse, we designate those who do not look or speak like us as “the others,” and then we use those differences to justify exterminating or expelling them to eliminate competition. Science has revealed the stimuli that trigger our hardwired proclivities to classify people as “other” and treat them horrifically. But just because H. sapiens evolved to react to scarcity in this ruthless way does not mean we are locked into this response. Culture can override even the strongest biological instincts. I hope that recognition of why we instinctively turn on one another in lean times will allow us to rise above our malevolent urges and heed one of our most important cultural directives: “Never again.”


Curtis W. Marean is a professor at the School of Human Evolution and Social Change at Arizona State University and associate director of the university’s Institute of Human Origins. Marean is also an honorary professor at Nelson Mandela Metropolitan University in South Africa. He is particularly interested in the origins of modern humans and the occupation of coastal ecosystems. His research is funded by the National Science Foundation and the Hyde Family Foundations.




Agriculture, population growth, and statistical analysis of the radiocarbon record

By H. Jabran Zahid, Erick Robinson, and Robert L. Kelly


We statistically analyze the radiocarbon record and show that early farming societies in Europe grew at the same rate as contemporaneous foraging societies in North America. Thus, our results challenge the commonly held view that the advent of agriculture was linked to accelerated growth of the human population. The same rates of prehistoric population growth measured worldwide suggest that the global climate and/or biological factors intrinsic to the species and not factors related to the regional environment or subsistence practices regulated the growth of the human population for most of the last 12,000 y. This study demonstrates that statistical analysis of the radiocarbon record is a robust quantitative approach for studying prehistoric human demography.


The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
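To make the reported 0.04% annual rate concrete, the following sketch works out what such growth implies; the 12,000-year Holocene span comes from the abstract, while holding the rate constant over the entire period is a simplifying assumption of mine:

```python
import math

# Long-term annual growth rate reported in the paper: 0.04% per year.
r = 0.0004
years = 12_000  # approximate span of the Holocene

# Doubling time under constant exponential growth.
doubling_time = math.log(2) / math.log(1 + r)

# Total multiplication of the population over the Holocene,
# assuming (simplistically) that this rate held throughout.
growth_factor = (1 + r) ** years

print(f"doubling time: {doubling_time:,.0f} years")
print(f"growth over {years:,} years: {growth_factor:.0f}x")
```

Under these assumptions the population doubles only about every 1,700 years, yet compounds to a roughly 120-fold increase across the Holocene—slow on a human timescale but dramatic on an archaeological one.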



Author contributions: H.J.Z., E.R., and R.L.K. designed research; H.J.Z., E.R., and R.L.K. performed research; H.J.Z. and E.R. contributed new reagents/analytic tools; H.J.Z., E.R., and R.L.K. analyzed data; and H.J.Z., E.R., and R.L.K. wrote the paper.

The authors declare no conflict of interest.

This article is a PNAS Direct Submission. S.J.S. is a guest editor invited by the Editorial Board.



(Humanity’s Evolutionary Prehistoric Diet and Ape Diets–continued, Part C)

Timeline of dietary shifts in the human line of evolution

Can you give us a timeline of dietary developments in the human line of evolution to show readers the overall picture from a bird’s-eye view, so we can set a context for further discussion here?

Sure. We need to start at the beginning of the primate line long before apes and humans ever evolved, though, to make sure we cover all the bases, including the objections often made by vegetarians (and fruitarians for that matter) that those looking into prehistory simply haven’t looked far enough back to find our “original” diet. Keep in mind some of these dates are approximate and subject to refinement as further scientific progress is made.

65,000,000 to 50,000,000 B.C.: The first primates, resembling today’s mouse lemurs, bush-babies, and tarsiers, weighing in at 2 lbs. or less, and eating a largely insectivorous diet.[8]

50,000,000 to 30,000,000 B.C.: A gradual shift in diet for these primates, becoming mostly frugivorous by the middle of this period and mostly herbivorous toward the end of it, but with considerable variance between specific primate species as to lesser items in the diet, such as insects, meat, and other plant foods.[9]

30,000,000 to 10,000,000 B.C.: Fairly stable persistence of above dietary pattern.[10]

Approx. 10,000,000 to 7,000,000 B.C.: Last common primate ancestor of both humans and the modern ape family.[11]

Approx. 7,000,000 to 5,000,000 B.C.: After the end of the previous period, a fork occurs branching into separate primate lines, including humans.[12] The most recent DNA evidence shows that humans are closely related to both gorillas and chimpanzees, but most closely to the chimp.[13] Most paleoanthropologists believe that after the split, flesh foods began to assume a greater role in the human side of the primate family at this time.[14]

Approx. 4,500,000 B.C.: First known hominid (proto-human) from fossil remains, known as Ardipithecus ramidus–literally translating as “root ape” for its position as the very first known hominid, which may not yet have been fully bipedal (walking upright on two legs). Anatomy and dentition (teeth) are very suggestive of a form similar to that of modern chimpanzees.[15]

Approx. 3,700,000 B.C.: First fully upright bipedal hominid, Australopithecus afarensis (the genus name means “southern ape,” from the first Australopithecus finds in South Africa; the species name refers to Ethiopia’s Afar region, where it was discovered), about 4 feet tall, known popularly from the famous “Lucy” skeleton.[16]

3,000,000 to 2,000,000 B.C.: Australopithecus line diverges into sub-lines,[17] one of which will eventually give rise to Homo sapiens (modern man). It appears that the environmental impetus for this “adaptive radiation” into different species was a changing global climate between 2.5 and 2 million years ago driven by glaciation in the polar regions.[18] The climatic repercussions in Africa resulted in a breakup of the formerly extensively forested habitat into a “mosaic” of forest interspersed with savanna (grassland). This put stress on many species to adapt to differing conditions and availability of foodstuffs.[19] The different Australopithecus lineages, thus, ate somewhat differing diets, ranging from more herbivorous (meaning high in plant matter) to more frugivorous (higher in soft and/or hard fruits than in other plant parts).

There is still some debate as to which Australopithecus lineage modern humans ultimately descended from, but recent evidence based on strontium/calcium ratios in bone, plus teeth microwear studies, show that whatever the lineage, some meat was eaten in addition to the plant foods and fruits which were the staples.[20]

2,300,000 to 1,500,000 B.C.: Appearance of the first “true humans” (signified by the genus Homo), known as Homo habilis (“handy man”)–so named because of the appearance of stone tools and cultures at this time. These gatherer-hunters were between 4 and 5 feet in height, weighed between 40 to 100 pounds, and still retained tree-climbing adaptations (such as curved finger bones)[21] while subsisting on wild plant foods and scavenging and/or hunting meat. (The evidence for flesh consumption based on cut-marks on animal bones, as well as use of hammerstones to smash them for the marrow inside, dates to this period.[22]) It is thought that they lived in small groups like modern hunter-gatherers but that the social structure would have been more like that of chimpanzees.[23]

The main controversy about this time period among paleoanthropologists is not whether Homo habilis consumed flesh (which is well established) but whether the flesh they consumed was primarily obtained by scavenging kills made by other predators or by hunting.[24] (The latter would indicate a more developed culture, the former a more primitive one.) While meat was becoming a more important part of the diet at this time, the diet of modern hunter-gatherers–with their considerably advanced tool set–has not been known to exceed 40% meat in tropical habitats like those habilis evolved in, so we can safely assume that the meat in habilis’ diet would have been substantially less than that.[25]

1,700,000 to 230,000 B.C.: Evolution of Homo habilis into the “erectines,” a range of human species often collectively referred to as Homo erectus, after the most well-known variant. They were similar in height to modern humans (5-6 feet) but stockier, with a smaller brain. Hunting activity increased over habilis, so that meat in the diet assumed greater importance. Teeth microwear studies of erectus specimens have indicated harsh wear patterns typical of meat-eating animals like the hyena.[26] No text I have yet read ventures any sort of percentage figure for this time period, but it is commonly acknowledged that plants still made up the largest portion of the subsistence. More typically human social structures made their appearance with the erectines as well.[27]

The erectines were the first human ancestors to control and use fire. It is thought that because of this, but more importantly because of other converging factors–such as increased hunting and greater technological sophistication with tools–the erectines were able to respond to another peak of glacial activity and global cooling about 900,000 years ago (which broke the tropical landscape up into an even patchier mosaic) by adapting to an increasingly varied savanna/forest environment: alternating opportunistically between vegetable and animal foods to survive, and/or moving around nomadically.[28]

For whatever reasons, it was also around this time (approx. 700,000 years ago) that large land animals–elephants, hoofed animals, hippopotamuses, and predators of the big-cat family–increased significantly in Europe as they spread from their African home. It is unlikely to be an accident that the erectines spread to the European and Asian continents during and after this timeframe: they probably followed the game.[29]

Because of the considerably harsher conditions and seasonal variation in food supply, hunting became more important for bridging the seasonal gaps, as did the ability to store nonperishable items such as nuts, bulbs, and tubers for the winter, when the edible plants withered in the autumn. All of these factors, along with clothing (and perhaps fire), helped enable colonization of the less hospitable environment. There were also physical changes in response to the colder and darker areas that were inhabited, such as the development of lighter skin color, which allowed more sunlight to penetrate the skin and produce vitamin D, as well as adaptation of the fat layer and sweat glands to the new climate.*[30]

Erectus finds from northern China dating to 400,000 years ago indicate an omnivorous diet of meat, wild fruit and berries (including hackberries), shoots and tubers, and various other animal foods such as birds and their eggs, insects, reptiles, rats, and large mammals.[31]

500,000 to 200,000 B.C.: Archaic Homo sapiens (our immediate predecessor) appears. These human species, of which there were a number of variants, did not last as long in evolutionary time as previous ones, apparently simply because of the increasingly rapid rate of evolution occurring in the human line at this time. They thus represent a transitional period between the erectines and modern humans, and the later forms are sometimes not treated separately from the earliest forms of true Homo sapiens.[32]

150,000 to 120,000 B.C.: Homo sapiens neanderthalensis–the Neanderthals–begin appearing in Europe, reaching their peak between 90,000 and 35,000 years ago before going extinct. It is now well accepted that the Neanderthals were an evolutionary offshoot that met an eventual dead end (in other words, they were not our ancestors), and that modern Homo sapiens and Neanderthals were more than likely sister species descended from a common archaic sapiens ancestor.[33]

140,000 to 110,000 B.C.: First appearance of anatomically modern humans (Homo sapiens).[34] The last Ice Age also dates from this period, stretching from 115,000 to 10,000 years ago. It was in this context of harsh and rapid climatic changes that our most recent ancestors had to adapt their eating and subsistence flexibly.[35] (Climatic shifts necessitating adaptations were also experienced in tropical regions, though to a lesser degree.[36]) It may therefore be significant that fire, though discovered earlier, came into widespread use around this same time,[37] corresponding to the advent of modern human beings. Its use may in fact be a defining characteristic of modern humans[38] and their mode of subsistence. (I’ll discuss the timescale of fire and cooking at more length later.)

130,000 to 120,000 B.C.: Some of the earliest evidence for seafood (primarily molluscs) in the diet of coastal dwellers appears at this time,[39] although at one isolated location discovered so far, the evidence goes back 300,000 years.[40] Common use of seafood by coastal aborigines becomes evident about 35,000 years ago,[41] but widespread global use does not appear in the fossil record until around 20,000 years ago.[42] For the most part, seafood should probably not be considered a major departure,* however: the composition of fish, shellfish, and poultry more closely resembles that of the wild land game many of these same ancestors ate than does any other source available today, except meat from commercial game farms that attempt to mimic ancient meat.[43]

40,000 to 35,000 B.C.: The first “behaviorally modern” human beings appear, as seen in a sudden explosion of new forms of stone and bone tools, cave paintings and other artwork, elaborate burials, and many other quintessentially modern human behaviors. The impetus for this watershed event is still a mystery.[44]

40,000 B.C. to 10-8,000 B.C.: Last period prior to the advent of agriculture in which human beings universally subsisted by hunting and gathering (also known as the “Late Paleolithic”–or “Stone Age”–period). Paleolithic peoples did process some of their foods, but these were simple methods that would have been confined to pounding, grinding, scraping, roasting, and baking.[45]

35,000 B.C. to 15-10,000 B.C.: The Cro-Magnons (fully modern pre-Europeans) thrive in the cold climate of Europe via big-game hunting, with meat consumption rising to as much as 50%* of the diet.[46]

25,000 to 15,000 B.C.: Coldest period of the last Ice Age, during which global temperatures averaged 14°F cooler than today[47] (with local variations of as much as 59°F lower[48]), bringing an increasingly arid environment and much more difficult conditions of survival to which plants, animals, and humans all had to adapt.[49] The Eurasian steppes just before and during this time had a maximum annual summer temperature of only 59°F.[50]
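For readers who think in Celsius, the Fahrenheit figures above convert as sketched below. Note that a temperature *difference* (e.g., “14°F cooler”) converts by the factor 5/9 alone, while an *absolute* temperature (the 59°F steppe maximum) uses the full formula; the function names are my own, not from any source.

```python
# Converting the article's Fahrenheit figures to Celsius.
# A difference in temperature has no +32 offset; an absolute reading does.

def f_diff_to_c(delta_f: float) -> float:
    """Convert a temperature DIFFERENCE from degrees F to degrees C."""
    return delta_f * 5.0 / 9.0

def f_abs_to_c(temp_f: float) -> float:
    """Convert an ABSOLUTE temperature from degrees F to degrees C."""
    return (temp_f - 32.0) * 5.0 / 9.0

print(round(f_diff_to_c(14), 1))  # global average 14°F cooler ≈ 7.8°C cooler
print(round(f_diff_to_c(59), 1))  # local dips of 59°F ≈ 32.8°C
print(f_abs_to_c(59))             # steppe summer maximum of 59°F = 15.0°C
```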

Humans in Europe and northern Asia, and later in North America, adapted by increasing their hunting of large mammals such as mammoths, horses, bison, and caribou, which flourished on the open grasslands, tundra, and steppes that spread during this period.[51] Storage of vegetable foods that could be consumed during the harsh winters was also practiced. Clothing methods were improved (including needles with eyes), and sturdier shelters were developed–the most common being animal hides wrapped around wooden posts, some with sunken floors and hearths.[52] In the tropics, large areas became arid. (In South Africa, for instance, the vegetation consisted mostly of shrubs and grass, with few fruits.[53])

20,000 B.C. to 9,000 B.C.: Transitional period known as the “Mesolithic,” during which the bow-and-arrow appeared,[54] gazelle, antelope, and deer were intensively hunted,[55] and precursor forms of wild plant and game management began to be more intensively practiced. Wild grains, including wheat and barley by 17,000 B.C.–before their domestication–were being gathered and ground into flour, as evidenced by the use of mortars and pestles in what is now modern-day Israel. By 13,000 B.C. the descendants of these peoples were harvesting wild grains intensively, and it was only a small step from there to the development of agriculture.[56] Game management through the burning-off of land to encourage grasslands and increase herds also became widely practiced during this time. In North America, for instance, the western high plains are the only area of the present-day United States that did not see intensive changes to the land through extensive use of fire.[57]

Also during this time, and probably for some millennia prior to the Mesolithic (perhaps as early as 45,000 B.C.), ritual and magico-religious sanctions protecting certain wild plants developed, initiating a new symbiotic relationship between people and their food sources that became encoded culturally and constituted the first phase of domestication, well prior to actual cultivation. Protections were accorded to certain wild food species (yams being a well-known example) to prevent disruption of their life cycle at periods critical to their growth, so that they could be profitably harvested later.[58] Digging sticks for yams have been found dating to at least 40,000 B.C.,[59] so these tubers considerably antedated grains in the diet.

Foods known to be gathered during the Mesolithic period in the Middle East were root vegetables, wild pulses (peas, beans, etc.), nuts such as almonds, pistachios, and hazelnuts, as well as fruits such as apples. Seafoods such as fish, crabs, molluscs, and snails also became common during this time.[60]

Approx. 10,000 B.C.: The beginning of the “Neolithic” period, or “Agricultural Revolution,” i.e., farming and animal husbandry. The transition to agriculture was made necessary by gradually increasing population pressures resulting from the success of Homo sapiens’ prior hunting-and-gathering way of life. (Hunting and gathering can support perhaps one person per 10 square miles; Neolithic agriculture can support 100 or more times that many.[61]) At about the time population pressures were increasing, the last Ice Age ended, and many species of large game became extinct, probably due to a combination of intensive hunting and the disappearance of their habitats when the Ice Age ended.[62] Wild grasses and cereals began flourishing,* making them prime candidates for domestication as staple foods, given our previous familiarity with them.[63] By 9,000 B.C. sheep and goats were being domesticated in the Near East, with cattle and pigs shortly after; wheat, barley, and legumes were being cultivated somewhat before 7,000 B.C., as were fruits and nuts, while meat consumption fell enormously.[64] By 5,000 B.C. agriculture had spread to all inhabited continents except Australia.[65] Since the beginning of the Neolithic, the ratio of plant-to-animal foods in the diet has sharply increased, from an average of probably 65%/35%* during Paleolithic times[66] to as high as 90%/10% since the advent of agriculture.[67]
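The carrying-capacity arithmetic above can be sketched as follows; the density figures come from the text, but the territory size is a purely illustrative assumption of mine.

```python
# Sketch of the carrying-capacity comparison in the text: hunting and
# gathering supports roughly one person per 10 square miles, while Neolithic
# agriculture supports 100 times that density or more.

FORAGER_DENSITY = 1 / 10   # persons per square mile (the text's figure)
FARMING_MULTIPLIER = 100   # "100 times or more that many"

def supported_population(area_sq_miles: float, density: float) -> float:
    """Number of people a territory can support at a given density."""
    return area_sq_miles * density

area = 1_000  # hypothetical territory of 1,000 square miles
foragers = supported_population(area, FORAGER_DENSITY)
farmers = supported_population(area, FORAGER_DENSITY * FARMING_MULTIPLIER)
print(f"foragers: ~{foragers:.0f}, farmers: ~{farmers:.0f}")
```

On this rough arithmetic, the same land that fed about a hundred foragers could feed on the order of ten thousand farmers, which is the population pressure the text describes.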

Fossil human remains indicate a decrease in health status after the Neolithic began. In most respects, the changes in diet from hunter-gatherer to agricultural times have been almost entirely detrimental, although there is some evidence, discussed later, that at least some genetic adaptation to the Neolithic has taken place in the approximately 10,000 years since it began. With the much heavier reliance on starchy foods as dietary staples, tooth decay, malnutrition, and rates of infectious disease increased dramatically over Paleolithic times, further exacerbated by crowding, which led to even higher rates of communicable infections.

Skeletal remains show that height decreased by four inches* from the Late Paleolithic to the early Neolithic, brought about by poorer nutrition, perhaps compounded by growth stress from increased infectious disease, and possibly by inbreeding in isolated communities. Signs of osteoporosis and anemia, which were almost non-existent in pre-Neolithic times, are frequently noted in skeletal pathologies of the Neolithic peoples of the Middle East. Certain kinds of osteoporosis found in these remains are known to be caused by anemia, and although the causes have not yet been determined exactly, the primary suspect is reduced iron levels, thought to result from the stress of infectious disease rather than dietary deficiency, although the latter remains a possibility.[68]

