Did Humans Evolve To Eat Meat? An Evolutionary Biologist Explains What Your Anatomy Actually Reveals
Forbes · Scott Travers · April 27, 2026

The evolutionary case for eating meat is etched into human anatomy — but so is the case against it. The science deserves more than a simple verdict.

Few questions produce more confident, contradictory answers than the one laid out in the headline of this story. Ask a carnivore diet enthusiast, and you’ll hear that humans are apex predators, biologically engineered for red meat and bone marrow. Ask a veganism advocate, and you’ll be directed toward our primate cousins, the chimpanzees and their largely plant-based diets. Ask a paleo proponent, and you’ll get something in between, involving a lot of wild game and seasonal berries.

The problem with all of these answers is that they are each partly right, and that’s precisely what makes this such a genuinely interesting scientific question. The debate over the “natural human diet” is not merely a nutritional squabble; it cuts to the heart of who we are as a species, where we came from and how our bodies came to be built the way they are. What does the evolutionary and biological evidence actually say? As it turns out, something more nuanced and more fascinating than any tribe in the diet wars tends to admit.

A Two Million-Year-Old History Of Meat-Eating

Let’s start with the fossils, because they are unambiguous on one point: our ancestors were eating meat a very long time ago. Stone tool marks on animal bones recovered from Gona, Ethiopia, date butchery activity to approximately 2.6 million years ago — well before the genus Homo had fully established itself. By 1.5 million years ago, the evidence from Tanzania’s Olduvai Gorge suggests that early humans were not merely scavenging scraps. They were hunting.

The most comprehensive synthesis of this evidence was published in a 2021 study in the American Journal of Physical Anthropology. Drawing on roughly 400 scientific papers spanning genetics, zooarchaeology, stable isotope analysis and comparative physiology, the authors argued that for the better part of two million years, Homo was a “hypercarnivore,” meaning that it obtained more than 70% of its dietary energy from animal sources, much like large social predators such as wolves and hyenas today.

This is a striking claim, and not without criticism. But the anatomical evidence supporting it is harder to dismiss than the headline figure might suggest.

Consider the stomach. Human gastric acidity sits at a pH of approximately 1.5, as acidic as that of a dedicated scavenger and far more acidic than that of other omnivores. Maintaining that level of acidity is metabolically expensive. It serves two purposes well understood by digestive physiologists: breaking down dense animal proteins, and killing the bacteria that accumulate in aged meat. This is not the gut chemistry of a species that accidentally stumbled upon the occasional rabbit.

Then there is the brain. Humans have, by any biological measure, an absurdly large brain relative to body size. Building and running it requires a continuous supply of iron, zinc, vitamin B12 and long-chain omega-3 fatty acids, particularly DHA. These nutrients exist in plants, but at lower concentrations and in forms that are often difficult to absorb. They are, however, abundant in animal tissue.
The landmark “expensive tissue hypothesis,” published in Current Anthropology in 1995, proposed that the dramatic expansion of the human brain was made possible in part by a corresponding reduction in gut size — a trade-off enabled by switching to a higher-quality, more energy-dense diet rich in animal foods.

Perhaps the most arresting piece of paleopathological evidence for our meat dependence comes from Tanzania. Fragments of a child’s skull, dated to 1.5 million years ago, show lesions consistent with porotic hyperostosis, a condition linked to vitamin B12 deficiency. B12 is found exclusively in animal-derived foods. If this interpretation is correct, it suggests that by that point in our evolution, meat had become not merely beneficial, but physiologically essential. Our bodies, having offloaded the metabolic cost of producing certain nutrients, had begun to rely on obtaining them externally. That is the biological definition of dietary dependence.

Did We Prefer Meat Over Plants?

If the story ended with meat, humans would look like obligate carnivores. But as we know full well, we are not. The evidence for deep, consequential plant-food adaptation in human biology is equally real — just harder to find in the ground.

That last point is not a throwaway caveat. It’s a genuine methodological problem. Animal bones, stone tools and calcified tissue survive for millions of years. Tubers, seeds, leaves and fruit do not. The asymmetry in what the archaeological record preserves has almost certainly created a systematic overemphasis on meat in reconstructions of ancestral diets. Plant foods leave faint traces: microscopic starch granules on the surface of grinding stones, phytoliths in ancient sediment, the occasional charred seed. They require far more careful excavation to find. The important point is this: their near-absence in many fossil sites is not evidence of absence in the diet.

The genetic evidence for plant-food adaptation is particularly compelling. A 2007 study in Nature Genetics found that humans carry significantly more copies of the salivary amylase gene (AMY1) than other primates. Amylase is the enzyme responsible for breaking down dietary starch in the mouth. The more copies you carry, the more amylase you produce and the more efficiently you digest carbohydrates. This is not a recent adaptation. The duplication of AMY1 appears to predate modern Homo sapiens and correlates with a long evolutionary history of consuming starchy plant foods, most likely underground storage organs like tubers, which were available year-round in ways that large game animals simply were not.

Isotope analysis of Australopithecus skeletons, our ancestors from roughly 3 to 4 million years ago, shows dietary signatures consistent with a heavily plant-based diet, with little to no meat consumption. The major shift toward animal foods appears to coincide with the emergence of Homo and the development of stone tools, not with the entirety of the hominin lineage. This matters because it means the full span of our evolutionary history includes a very long chapter in which plants dominated.

And then there is cooking, which complicates everything beautifully. Richard Wrangham’s “cooking hypothesis,” developed across decades of research and synthesized in his book Catching Fire, argues that the control of fire and the cooking of food, both plant and animal, was itself a primary driver of brain expansion.
Cooking dramatically increases the bioavailability of calories from starchy roots and tubers, making them a suddenly viable staple. It also softens meat, reducing the jaw musculature required to chew it raw, which in turn frees up cranial space for brain tissue. The key insight is that cooking was not just a way of making food safer. It was a biological lever that changed what counted as food in the first place.

Is Meat Non-Negotiable For Humans?

This is where intellectual honesty requires us to step back from the data and ask what we are actually trying to learn. “Did humans evolve to eat meat?” is, in a narrow sense, answerable: yes, substantially so, particularly over the last two million years. But the more important question, “What does that mean for how I should eat today?” is one that evolutionary biology alone cannot answer.

Miki Ben-Dor, one of the most vocal proponents of the hypercarnivore hypothesis, is explicit about this in his own published work. Ancestral eating patterns are not dietary prescriptions. The Pleistocene megafauna our ancestors hunted are largely extinct. Our food supply, physical activity levels, disease environment, lifespan and population density are radically different from anything our Stone Age ancestors encountered. Evolution optimizes for reproductive success in a given environment, not for long life in a modern one.

The epidemiological literature adds another wrinkle here. A 2015 evaluation by the World Health Organization’s International Agency for Research on Cancer, summarized in The Lancet Oncology, classified processed meat as a Group 1 carcinogen and red meat as Group 2A, “probably carcinogenic” to humans. These classifications are not without nuance (dose matters enormously), but they suggest that whatever our ancestors’ digestive systems were calibrated for, the industrially produced meat that constitutes most modern consumption is a rather different substance.

What the totality of evidence actually supports is this: humans are metabolically flexible omnivores with notable carnivorous adaptations that emerged over the past two million years, overlaid on an older primate legacy of plant consumption. We are neither obligate carnivores nor natural herbivores. We are something more ecologically interesting: a species whose evolutionary success was built precisely on dietary versatility, on the capacity to extract adequate nutrition from whatever combination of foods a given environment provided.

That flexibility is arguably the most important thing our anatomy reveals. Not what we must eat, but what we can. The question of what we should eat — given our health, our ecology, our ethics and our planet — is one we have to answer ourselves, with better tools than our ancestors had. The bones, genes and stomach acid are informative. They do not, however, have to decide our menu.

ARTICLE SOURCE: https://www.forbes.com/sites/scotttravers/2026/04/27/did-humans-evolve-to-eat-meat-an-evolutionary-biologist-explains-what-your-anatomy-actually-reveals/
New study connects Coca-Cola to fatty liver disease, type 2 diabetes: "It shouldn't be sold"
Story by Marie Calapano
Coca-Cola is one of the most familiar drinks in the world, often viewed as a simple refreshment. But a major new study is renewing scrutiny of what regular soda consumption may be doing to the body, linking sugar-sweetened beverages to higher rates of fatty liver disease and type 2 diabetes.
The findings have drawn strong reactions from health experts, with some questioning whether products like Coca-Cola should continue to be sold as everyday drinks. Specialist registered dietitian Nichola Ludlam-Raine said the drink’s sugar content is so concerning that “it shouldn’t be allowed to be sold,” while the company maintains its products are safe when consumed in moderation.
As metabolic disease rates continue to rise worldwide, the research raises new questions about how sugary drinks fit into modern diets and whether current consumer guidance goes far enough.
What the New Study Found
The research, published in the medical journal Nature Medicine, analyzed sugar-sweetened beverage consumption across 184 countries, using population-level health data rather than sales figures. Researchers examined how soda and similar drinks contributed to long-term disease outcomes.
The study found that in 2020 alone, sugar-sweetened beverages were linked to 2.2 million new cases of type 2 diabetes worldwide, accounting for 9.8% of all newly diagnosed cases. The same analysis connected these drinks to 1.2 million new cardiovascular disease cases, or 3.1% of the global total.
Researchers said the burden was highest in regions where soda consumption is widespread and increasing, and warned the estimates likely understate the true impact, since conditions like fatty liver disease often go undiagnosed. While the analysis covered sugary drinks broadly, experts note Coca-Cola’s scale and reach place it at the center of the health debate.
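For readers who want a sense of the scale behind those percentages, the figures quoted above imply a rough global denominator. The short back-of-the-envelope calculation below uses only the numbers reported in this article (2.2 million attributed type 2 diabetes cases at 9.8% of the total, 1.2 million cardiovascular cases at 3.1%); it is an illustrative sketch, not part of the study itself.

```python
# Back-of-the-envelope check using only the figures quoted in the article.

t2d_attributed = 2.2e6   # new type 2 diabetes cases linked to sugary drinks (2020)
t2d_share = 0.098        # reported as 9.8% of all newly diagnosed cases

cvd_attributed = 1.2e6   # new cardiovascular disease cases linked to sugary drinks
cvd_share = 0.031        # reported as 3.1% of all new cases

# Dividing the attributed cases by their share gives the implied worldwide totals.
implied_total_t2d = t2d_attributed / t2d_share   # roughly 22 million new diabetes cases
implied_total_cvd = cvd_attributed / cvd_share   # roughly 39 million new cardiovascular cases

print(f"Implied total new type 2 diabetes cases in 2020: {implied_total_t2d / 1e6:.1f} million")
print(f"Implied total new cardiovascular disease cases in 2020: {implied_total_cvd / 1e6:.1f} million")
```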
Why Coca-Cola Raises Health Concerns
Medical experts say the concern isn’t just sugar, but how it’s consumed. Sugary drinks deliver large amounts of rapidly absorbed sugar without creating a sense of fullness, making it easy to consume excess calories quickly.
Coca-Cola contains high levels of added sugar, much of it as fructose, which is processed almost entirely by the liver. With frequent intake, the liver converts that sugar into fat, increasing the risk of non-alcoholic fatty liver disease and insulin resistance.
Over time, this can disrupt blood sugar regulation and raise the risk of type 2 diabetes, even in people who don’t see themselves as unhealthy. Researchers note that daily or repeated consumption compounds the effect, and that liquid sugar places a greater metabolic strain on the body than solid foods, making sugary sodas especially concerning.
What the Findings Mean Going Forward
The study has intensified debate over whether sugary drinks should face stronger regulation, including warning labels, taxes, or marketing restrictions. Several countries have already adopted such measures, while others continue to rely on consumer choice and industry self-regulation.
For consumers, the findings reinforce a growing consensus among health professionals: regular soda consumption carries measurable risks that build quietly over time. Cutting back on sugar-sweetened beverages is increasingly viewed as one of the simplest ways to reduce long-term strain on the liver and lower diabetes risk.
While the research does not call for an outright ban, it challenges how drinks like Coca-Cola are positioned in everyday life. As evidence mounts, many experts believe the question is no longer whether sugary sodas affect health, but how much longer they should be treated as harmless staples rather than occasional indulgences.
ARTICLE SOURCE: https://www.msn.com/en-us/health/other/new-study-connects-coca-cola-to-fatty-liver-disease-type-2-diabetes-it-shouldn-t-be-sold/ar-AA1TvhSH
Subscribe to Carnivore Talk on YouTube | Be our guest on the channel | Leave me a voicemail, yo!