Did Humans Evolve To Eat Meat? An Evolutionary Biologist Explains What Your Anatomy Actually Reveals
Forbes · Scott Travers · April 27, 2026

The evolutionary case for eating meat is etched into human anatomy — but so is the case against it. The science deserves more than a simple verdict.

Few questions produce more confident, contradictory answers than the one laid out in the headline of this story. Ask a carnivore diet enthusiast, and you'll hear that humans are apex predators, biologically engineered for red meat and bone marrow. Ask a veganism advocate, and you'll be directed toward our primate cousins, the chimpanzees, and their largely plant-based diets. Ask a paleo proponent, and you'll get something in between, involving a lot of wild game and seasonal berries.

The problem with all of these answers is that each is partly right, and that's precisely what makes this such a genuinely interesting scientific question. The debate over the "natural human diet" is not merely a nutritional squabble; it cuts to the heart of who we are as a species, where we came from and how our bodies came to be built the way they are. What does the evolutionary and biological evidence actually say? As it turns out, something more nuanced and more fascinating than any tribe in the diet wars tends to admit.

A Two Million-Year-Old History Of Meat-Eating

Let's start with the fossils, because they are unambiguous on one point: our ancestors were eating meat a very long time ago. Stone tool marks on animal bones recovered from Gona, Ethiopia, date butchery activity to approximately 2.6 million years ago — well before the genus Homo had fully established itself. By 1.5 million years ago, the evidence from Tanzania's Olduvai Gorge suggests that early humans were not merely scavenging scraps. They were hunting.

The most comprehensive synthesis of this evidence was published in a 2021 study in the American Journal of Physical Anthropology.
Drawing on roughly 400 scientific papers spanning genetics, zooarchaeology, stable isotope analysis and comparative physiology, the authors argued that for the better part of two million years, Homo was a "hypercarnivore" — that is, a species obtaining more than 70% of its dietary energy from animal sources, much like large social predators such as wolves and hyenas today.

This is a striking claim, and not without its critics. But the anatomical evidence supporting it is harder to dismiss than the headline figure might suggest.

Consider the stomach. Human gastric acidity sits at a pH of approximately 1.5, as acidic as that of a dedicated scavenger and far more acidic than that of other omnivores. Maintaining that level of acidity is metabolically expensive. It serves two purposes well understood by digestive physiologists: breaking down dense animal proteins, and killing the bacteria that accumulate in aged meat. This is not the gut chemistry of a species that accidentally stumbled upon the occasional rabbit.

Then there is the brain. Humans have, by any biological measure, an absurdly large brain relative to body size. Building and running it requires a continuous supply of iron, zinc, vitamin B12 and long-chain omega-3 fatty acids, particularly DHA. These nutrients exist in plants, but at lower concentrations and in forms that are often difficult to absorb. They are, however, abundant in animal tissue. The landmark "expensive tissue hypothesis," published in Current Anthropology in 1995, proposed that the dramatic expansion of the human brain was made possible in part by a corresponding reduction in gut size — a trade enabled by switching to a higher-quality, more energy-dense diet rich in animal foods.

Perhaps the most arresting piece of paleopathological evidence for our meat dependence comes from Tanzania.
Fragments of a child's skull, dated to 1.5 million years ago, show deformities consistent with porotic hyperostosis, a condition linked to vitamin B12 deficiency. B12 is found almost exclusively in animal-derived foods. If this interpretation is correct, it suggests that by that point in our evolution, meat had become not merely beneficial but physiologically essential. Our bodies, having offloaded the metabolic cost of producing certain nutrients, had begun to rely on obtaining them externally. That is the biological definition of dietary dependence.

Did We Prefer Meat Over Plants?

If the story ended with meat, humans would look like obligate carnivores. But as we know full well, we are not. The evidence for deep, consequential plant-food adaptation in human biology is equally real — just harder to find in the ground.

That last point is not a throwaway caveat. It's a genuine methodological problem. Animal bones, stone tools and calcified tissue survive for millions of years. Tubers, seeds, leaves and fruit do not. The asymmetry in what the archaeological record preserves has almost certainly created a systematic overemphasis on meat in reconstructions of ancestral diets. Plant foods leave faint traces: microscopic starch granules on the surfaces of grinding stones, phytoliths in ancient sediment, the occasional charred seed. They require far more careful excavation to find. The important point is this: their near-absence at many fossil sites is not evidence of absence from the diet.

The genetic evidence for plant-food adaptation is particularly compelling. A 2007 study in Nature Genetics found that humans carry significantly more copies of the salivary amylase gene (AMY1) than other primates. Amylase is the enzyme responsible for beginning the breakdown of dietary starch in the mouth. The more copies you carry, the more amylase you produce and the more efficiently you digest carbohydrates. This is not a recent adaptation.
The duplication of AMY1 appears to predate modern Homo sapiens and correlates with a long evolutionary history of consuming starchy plant foods, most likely underground storage organs like tubers, which were available year-round in ways that large game animals simply were not.

Isotope analysis of Australopithecus skeletons, our ancestors from roughly 3 to 4 million years ago, shows dietary signatures consistent with a heavily plant-based diet, with little to no meat consumption. The major shift toward animal foods appears to coincide with the emergence of Homo and the development of stone tools, not with the entirety of the hominin lineage. This matters because it means the full span of our evolutionary history includes a very long chapter in which plants dominated.

And then there is cooking, which complicates everything beautifully. Richard Wrangham's "cooking hypothesis," developed across decades of research and synthesized in his book Catching Fire, argues that the control of fire and the cooking of food, both plant and animal, was itself a primary driver of brain expansion. Cooking dramatically increases the bioavailability of calories from starchy roots and tubers, making them a suddenly viable staple. It also softens meat, reducing the jaw musculature required to chew it raw, which in turn frees up cranial space for brain tissue. The key insight is that cooking was not just a way of making food safer. It was a biological lever that changed what counted as food in the first place.

Is Meat Non-Negotiable For Humans?

This is where intellectual honesty requires us to step back from the data and ask what we are actually trying to learn. "Did humans evolve to eat meat?" is, in a narrow sense, answerable: yes, substantially so, particularly over the last two million years. But the more important question, "What does that mean for how I should eat today?", is one that evolutionary biology alone cannot answer.
Miki Ben-Dor, one of the most vocal proponents of the hypercarnivore hypothesis, is explicit about this in his own published work: ancestral eating patterns are not dietary prescriptions. The Pleistocene megafauna our ancestors hunted are largely extinct. Our food supply, physical activity levels, disease environment, lifespan and population density are radically different from anything our Stone Age ancestors encountered. Evolution optimizes for reproductive success in a given environment, not for long life in a modern one.

The epidemiological literature adds another wrinkle. A 2015 evaluation by the World Health Organization's International Agency for Research on Cancer, published in The Lancet Oncology, classified processed meat as a Group 1 carcinogen and red meat as Group 2A, "probably carcinogenic" to humans. These classifications are not without nuance (dose matters enormously), but they suggest that whatever our ancestors' digestive systems were calibrated for, the industrially produced meat that constitutes most modern consumption is a rather different substance.

What the totality of evidence actually supports is this: humans are metabolically flexible omnivores with notable carnivorous adaptations that emerged over the past two million years, overlaid on an older primate legacy of plant consumption. We are neither obligate carnivores nor natural herbivores. We are something more ecologically interesting: a species whose evolutionary success was built precisely on dietary versatility, on the capacity to extract adequate nutrition from whatever combination of foods a given environment provided.

That flexibility is arguably the most important thing our anatomy reveals. Not what we must eat, but what we can. The question of what we should eat — given our health, our ecology, our ethics and our planet — is one we have to answer ourselves, with better tools than our ancestors had. The bones, genes and stomach acid are informative.
They do not, however, have to decide our menu.

ARTICLE SOURCE: https://www.forbes.com/sites/scotttravers/2026/04/27/did-humans-evolve-to-eat-meat-an-evolutionary-biologist-explains-what-your-anatomy-actually-reveals/
Foods that Americans were told to avoid for decades are back under Trump's new nutrition rules
By Andrea Margolis
Published January 18, 2026 11:01am PST
Foods that Americans were advised to avoid for decades are back on shopping lists — following updated federal dietary guidance released under President Donald Trump's administration.
After years of being told to avoid full-fat dairy, red meat and saturated fats like butter and beef tallow, the White House said updated guidance no longer broadly discourages those foods when consumed in moderation.
The changes reflect revisions to federal nutrition recommendations developed through the Departments of Health and Human Services (HHS) and Agriculture (USDA), rather than a wholesale reversal of prior advice.
Image: Dietary Guidelines for Americans, Department of Health & Human Services
It doesn't mean that every fatty food is encouraged. For example, experts still caution against eating too many processed snacks that are high in saturated fat such as chips, cookies and ice cream.
HHS Secretary Robert F. Kennedy Jr. said he was "ending the war on saturated fats" — though the updated report continues to recommend limits on daily intake.
"Protein and healthy fats are essential and were wrongly discouraged in prior dietary guidelines," he said.
Here's a handy summary of which foods are back — and how federal guidance and nutrition experts say they should be consumed.
1. Full-fat milk and yogurt
The new 2025–2030 Dietary Guidelines for Americans describe full-fat milk and yogurt as "healthy fats."
"In general, saturated fat consumption should not exceed 10% of total daily calories," the report states.
"Significantly limiting highly processed foods will help meet this goal. More high-quality research is needed to determine which types of dietary fats best support long-term health."
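The 10%-of-calories cap translates into a concrete daily gram budget, since fat supplies roughly 9 kcal per gram. A minimal sketch of that arithmetic (the function name and the 2,000-kcal example are illustrative assumptions, not figures from the guidelines):

```python
# Convert the guidelines' "saturated fat should not exceed 10% of total
# daily calories" into a gram budget. Fat provides ~9 kcal per gram.

def saturated_fat_limit_g(daily_kcal: float, limit_fraction: float = 0.10) -> float:
    """Maximum grams of saturated fat per day under a percent-of-calories cap."""
    return daily_kcal * limit_fraction / 9.0

# On a 2,000-kcal diet, the cap works out to about 22 g per day.
print(round(saturated_fat_limit_g(2000), 1))  # -> 22.2
```

The same function scales to any intake level: at 2,500 kcal the budget rises to roughly 28 g.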
Full-fat dairy is packed with fat-soluble vitamins A, D, E and K, which "allows for better transportation and absorption," said Amy Goldsmith, a Maryland-based registered dietitian and owner of Kindred Nutrition.
"There can also be an increased satiation from the products, as the fat will decrease [the hormone] ghrelin," she told Fox News Digital.
Goldsmith noted saturated fat should still be portioned, even if not completely cut out.
"This is the nuance with the new dietary guidelines," she said. "As dietitians, we want to ensure this visualization [doesn't lead] to an increase in saturated fat, as it could contribute to an increase in chronic disease."
There's usually less added sugar in full-fat products, she noted, but they still need to be portioned out to avoid the consumption of too much saturated fat and total calories.
2. Butter
The Trump administration's new report lists butter as one of several fats that may be used in cooking, while prioritizing unsaturated oils.
"When cooking with or adding fats to meals, prioritize oils with essential fatty acids, such as olive oil," according to the guidelines. "Other options can include butter or beef tallow."
Goldsmith said butter is a great source of vitamin A, which is "essential for vision and immune health."
It also has vitamin E, an antioxidant, and vitamin K2 — which "ensures calcium is used to strengthen bones and teeth."
"It also is the best source of butyric acid, which serves as an anti-inflammatory," Goldsmith said.
"Most butters are 60–70% saturated fat… so it will be difficult to keep total saturated fat intake within recommended limits if portion and volume aren't taken into consideration," she added.
"In addition, if someone already has a high LDL cholesterol, butter would not be the best source of spread as it can continue to contribute to increasing LDL."
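Combining the dietitian's 60–70% figure with the 10% cap shows how quickly butter can exhaust a day's saturated-fat allowance. A rough illustrative calculation (the 2,000-kcal diet, the 65% midpoint and the ~14 g-per-tablespoon conversion are assumptions for the example, not dietary advice):

```python
# How many grams of butter would use up an entire daily saturated-fat budget?
# Assumes butter is ~65% saturated fat (midpoint of the quoted 60-70%)
# and a 2,000-kcal diet under the 10%-of-calories cap.

SAT_FAT_BUDGET_G = 2000 * 0.10 / 9   # ~22.2 g saturated fat per day
BUTTER_SAT_FRACTION = 0.65           # midpoint of 60-70% saturated fat

butter_g = SAT_FAT_BUDGET_G / BUTTER_SAT_FRACTION
tablespoons = butter_g / 14          # ~14 g of butter per tablespoon

print(round(butter_g), round(tablespoons, 1))  # -> 34 2.4
```

In other words, about two and a half tablespoons of butter alone would consume the whole budget, leaving no room for saturated fat from meat, dairy or processed foods.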
3. Beef tallow
Beef tallow is high in vitamins A, D, E and K, similar to other animal-based fats.
Goldsmith also noted that 40–50% of beef tallow is monounsaturated fat, a proportion comparable to some plant-based oils.
"The other 50% is saturated fat, however. So once again, serving size and volume need to be considered to keep saturated fat below 10% of total calories," she said.
"In addition, beef tallow can be more expensive than butter and difficult to get."
4. Red meat
Red meat contains essential amino acids that can't be produced by the human body, a dietitian said. (Getty Images)
The new report recommends "[consuming] a variety of protein foods from animal sources, including eggs, poultry, seafood and red meat, as well as a variety of plant-sourced protein foods, including beans, peas, lentils, legumes, nuts, seeds and soy products."
Goldsmith noted that red meat contains all nine of the essential amino acids that can't be produced by the human body alone.
"About 60% of the iron in meat is heme iron, and it's one of the best sources of zinc," she said.
"Heme iron is absorbed into the gut fast, which means it can rapidly restore ferritin, your iron stores."
Red meat's zinc is also crucial for immune cells and inflammation control.
"The new dietary guidelines stuck with the recommendation to keep saturated fat below 10% and, on average, red meat is 40–45% saturated fat," Goldsmith said.
"It will be important to vary animal protein to keep the saturated fat number down as high saturated fat diets contribute to heart disease and cancers."
ARTICLE SOURCE: https://www.ktvu.com/news/foods-americans-were-told-avoid-decades-back-under-trumps-new-nutrition-rules