Did Humans Evolve To Eat Meat? An Evolutionary Biologist Explains What Your Anatomy Actually Reveals

Forbes · Scott Travers · April 27, 2026

The evolutionary case for eating meat is etched into human anatomy, but so is the case against it. The science deserves more than a simple verdict.

Few questions produce more confident, contradictory answers than the one laid out in the headline of this story. Ask a carnivore diet enthusiast, and you’ll hear that humans are apex predators, biologically engineered for red meat and bone marrow. Ask a veganism advocate, and you’ll be directed toward our primate cousins, the chimpanzees, and their largely plant-based diets. Ask a paleo proponent, and you’ll get something in between, involving a lot of wild game and seasonal berries.

The problem with all of these answers is that each is partly right, and that’s precisely what makes this such a genuinely interesting scientific question. The debate over the “natural human diet” is not merely a nutritional squabble; it cuts to the heart of who we are as a species, where we came from and how our bodies came to be built the way they are. What does the evolutionary and biological evidence actually say? As it turns out, something more nuanced and more fascinating than any tribe in the diet wars tends to admit.

A Two Million-Year-Old History Of Meat-Eating

Let’s start with the fossils, because they are unambiguous on one point: our ancestors were eating meat a very long time ago. Stone tool marks on animal bones recovered from Gona, Ethiopia, date butchery activity to approximately 2.6 million years ago, well before the genus Homo had fully established itself. By 1.5 million years ago, the evidence from Tanzania’s Olduvai Gorge suggests that early humans were not merely scavenging scraps. They were hunting.

The most comprehensive synthesis of this evidence was published in a 2021 study in the American Journal of Physical Anthropology. Drawing on roughly 400 scientific papers spanning genetics, zooarchaeology, stable isotope analysis and comparative physiology, the authors argued that for the better part of two million years, Homo was a “hypercarnivore,” meaning it obtained more than 70% of its dietary energy from animal sources, much like large social predators such as wolves and hyenas today.

This is a striking claim, and it is not without its critics. But the anatomical evidence supporting it is harder to dismiss than the headline figure might suggest.

Consider the stomach. Human gastric acidity sits at a pH of approximately 1.5, comparable to that of a dedicated scavenger and far more acidic than that of most other omnivores. Maintaining that level of acidity is metabolically expensive. It serves two purposes well understood by digestive physiologists: breaking down dense animal proteins, and killing the bacteria that accumulate in aged meat. This is not the gut chemistry of a species that accidentally stumbled upon the occasional rabbit.

Then there is the brain. Humans have, by any biological measure, an absurdly large brain relative to body size. Building and running it requires a continuous supply of iron, zinc, vitamin B12 and long-chain omega-3 fatty acids, particularly DHA. With the exception of B12, these nutrients exist in plants, but at lower concentrations and in forms that are often difficult to absorb. They are, however, abundant in animal tissue.
The landmark “expensive tissue hypothesis,” published in Current Anthropology in 1995, proposed that the dramatic expansion of the human brain was made possible in part by a corresponding reduction in gut size, a trade enabled by switching to a higher-quality, more energy-dense diet rich in animal foods.

Perhaps the most arresting piece of paleopathological evidence for our meat dependence comes from Tanzania. Fragments of a child’s skull, dated to 1.5 million years ago, show deformities consistent with porotic hyperostosis, a condition linked to vitamin B12 deficiency. B12 is, for all practical dietary purposes, found only in animal-derived foods. If this interpretation is correct, it suggests that by that point in our evolution, meat had become not merely beneficial but physiologically essential. Our bodies, having offloaded the metabolic cost of producing certain nutrients, had begun to rely on obtaining them externally. That is the biological definition of dietary dependence.

Did We Prefer Meat Over Plants?

If the story ended with meat, humans would look like obligate carnivores. But as we know full well, we are not. The evidence for deep, consequential plant-food adaptation in human biology is equally real, just harder to find in the ground.

That last point is not a throwaway caveat. It’s a genuine methodological problem. Animal bones, stone tools and calcified tissue survive for millions of years. Tubers, seeds, leaves and fruit do not. The asymmetry in what the archaeological record preserves has almost certainly created a systematic overemphasis on meat in reconstructions of ancestral diets. Plant foods leave faint traces: microscopic starch granules on the surfaces of grinding stones, phytoliths in ancient sediment, the occasional charred seed. They require far more careful excavation to find. The important point is this: their near-absence from many fossil sites is not evidence of their absence from the diet.

The genetic evidence for plant-food adaptation is particularly compelling. A 2007 study in Nature Genetics found that humans carry significantly more copies of the salivary amylase gene (AMY1) than other primates do. Amylase is the enzyme responsible for beginning the digestion of dietary starch in the mouth. The more copies you carry, the more amylase you produce and the more efficiently you digest starchy carbohydrates.

This is not a recent adaptation. The duplication of AMY1 appears to predate modern Homo sapiens and correlates with a long evolutionary history of consuming starchy plant foods, most likely underground storage organs such as tubers, which were available year-round in ways that large game animals simply were not.

Isotope analysis of Australopithecus skeletons, our ancestors from roughly 3 to 4 million years ago, shows dietary signatures consistent with a heavily plant-based diet, with little to no meat consumption. The major shift toward animal foods appears to coincide with the emergence of Homo and the development of stone tools, not with the entirety of the hominin lineage. This matters because it means the full span of our evolutionary history includes a very long chapter in which plants dominated.

And then there is cooking, which complicates everything beautifully. Richard Wrangham’s “cooking hypothesis,” developed across decades of research and synthesized in his book Catching Fire, argues that the control of fire and the cooking of food, both plant and animal, was itself a primary driver of brain expansion.
Cooking dramatically increases the bioavailability of calories from starchy roots and tubers, suddenly making them a viable staple. It also softens meat, reducing the jaw musculature required to chew it raw, which in turn frees up cranial space for brain tissue. The key insight is that cooking was not just a way of making food safer. It was a biological lever that changed what counted as food in the first place.

Is Meat Non-Negotiable For Humans?

This is where intellectual honesty requires us to step back from the data and ask what we are actually trying to learn. “Did humans evolve to eat meat?” is, in a narrow sense, answerable: yes, substantially so, particularly over the last two million years. But the more important question, “What does that mean for how I should eat today?”, is one that evolutionary biology alone cannot answer.

Miki Ben-Dor, one of the most vocal proponents of the hypercarnivore hypothesis, is explicit about this in his own published work. Ancestral eating patterns are not dietary prescriptions. The Pleistocene megafauna our ancestors hunted are largely extinct. Our food supply, physical activity levels, disease environment, lifespan and population density are radically different from anything our Stone Age ancestors encountered. Evolution optimizes for reproductive success in a given environment, not for long life in a modern one.

The epidemiological literature adds another wrinkle. A 2015 evaluation by the World Health Organization’s International Agency for Research on Cancer, summarized in The Lancet Oncology, classified processed meat as a Group 1 carcinogen and red meat as Group 2A, “probably carcinogenic to humans.” These classifications are not without nuance (dose matters enormously), but they suggest that whatever our ancestors’ digestive systems were calibrated for, the industrially produced meat that constitutes most modern consumption is a rather different substance.

What the totality of evidence actually supports is this: humans are metabolically flexible omnivores with notable carnivorous adaptations that emerged over the past two million years, overlaid on an older primate legacy of plant consumption. We are neither obligate carnivores nor natural herbivores. We are something more ecologically interesting: a species whose evolutionary success was built precisely on dietary versatility, on the capacity to extract adequate nutrition from whatever combination of foods a given environment provided.

That flexibility is arguably the most important thing our anatomy reveals. Not what we must eat, but what we can. The question of what we should eat, given our health, our ecology, our ethics and our planet, is one we have to answer ourselves, with better tools than our ancestors had. The bones, genes and stomach acid are informative. They do not, however, have to decide our menu.

ARTICLE SOURCE: https://www.forbes.com/sites/scotttravers/2026/04/27/did-humans-evolve-to-eat-meat-an-evolutionary-biologist-explains-what-your-anatomy-actually-reveals/
How do the carnivore and keto diets fit into the broader landscape of dietary trends and fads, and what can we learn from their popularity?