
Bob

Community Manager

  1. People may have already been doing it anyhow. This is, at the least, the first time God went on record saying he would find the practice acceptable moving forward. Either due to post-flood conditions on earth, post-flood conditions of imperfect man moving forward, or because animal sacrifice and the use of blood for atonement was going to be used ritually and as a teaching tool, setting the stage for Jesus' sacrifice a couple thousand years later.
  2. This is why I have a problem with labels. These seem to fit us the best, if you absolutely MUST use a label: 1) Omnivore 2) Facultative Carnivore 3) Hyper Carnivore. To me, these all mean the same thing. I believe we need to be meat-centric, not plant-based, but if you want to entertain a little fruit and vegetation here and there and you choose ones that you don't react negatively to, then you've set a foot in the right direction.
  5. That picture looks like it was cooked to perfection also!
  7. I mean, would you rather have the jar of smushed peas, or the steak?
  12. I agree completely with this. A GLP-1 might be very useful in getting one started on a successful health journey, especially if they are vastly overweight and have overeating/control issues. But it doesn't replace a proper human diet. It's not a get-out-of-jail-free card for ultra-processed junk food. The goal is to ultimately make your lifestyle one of healthy and wise dietary choices, and not depend on GLP-1s. This is legit. Keto and Carnivore diets increase the body's secretion of GLP-1. In short, a proper human diet, done properly, produces the same result.
  13. Greek and Roman statues are bad? And I am amused with the guys getting some sunshine where the sun doesn't normally shine, lol
  15. Did Humans Evolve To Eat Meat? An Evolutionary Biologist Explains What Your Anatomy Actually Reveals
Forbes · Scott Travers · April 27, 2026

The evolutionary case for eating meat is etched into human anatomy — but so is the case against it. The science deserves more than a simple verdict.

Few questions produce more confident, contradictory answers than the one laid out in the headline of this story. Ask a carnivore diet enthusiast, and you’ll hear that humans are apex predators, biologically engineered for red meat and bone marrow. Ask a veganism advocate, and you’ll be directed toward our primate cousins, the chimpanzees and their largely plant-based diets. Ask a paleo proponent, and you’ll get something in between, involving a lot of wild game and seasonal berries.

The problem with all of these answers is that they are each partly right, and that’s precisely what makes this such a genuinely interesting scientific question. The debate over the “natural human diet” is not merely a nutritional squabble; it cuts to the heart of who we are as a species, where we came from and how our bodies came to be built the way they are. What does the evolutionary and biological evidence actually say? As it turns out, something more nuanced and more fascinating than any tribe in the diet wars tends to admit.

A Two Million-Year-Old History Of Meat-Eating

Let’s start with the fossils, because they are unambiguous on one point: our ancestors were eating meat a very long time ago. Stone tool marks on animal bones recovered from Gona, Ethiopia, date butchery activity to approximately 2.6 million years ago — well before the genus Homo had fully established itself. By 1.5 million years ago, the evidence from Tanzania’s Olduvai Gorge suggests that early humans were not merely scavenging scraps. They were hunting. The most comprehensive synthesis of this evidence was published in a 2021 study in the American Journal of Physical Anthropology.
Drawing on roughly 400 scientific papers spanning genetics, zooarchaeology, stable isotope analysis and comparative physiology, the authors argued that for the better part of two million years, Homo was a “hypercarnivore.” This means that Homo obtained more than 70% of dietary energy from animal sources, much like large social predators such as wolves and hyenas today.

This is a striking claim, and not without its criticism. But the anatomical evidence supporting it is harder to dismiss than the headline figure might suggest. Consider the stomach. Human gastric acidity sits at a pH of approximately 1.5, which is as acidic as that of a dedicated scavenger, and far more acidic than that of other omnivores. Maintaining that level of acidity is metabolically expensive. It serves two purposes well understood by digestive physiologists: breaking down dense animal proteins, and sterilizing the bacteria that accumulate in aged meat. This is not the gut chemistry of a species that accidentally stumbled upon the occasional rabbit.

Then there is the brain. Humans have, by any biological measure, an absurdly large brain relative to body size. Building and running it requires a continuous supply of iron, zinc, vitamin B12 and long-chain omega-3 fatty acids, particularly DHA. These nutrients exist in plants, but at lower concentrations and in forms that are often difficult to absorb. They are, however, abundant in animal tissue. The landmark “expensive tissue hypothesis,” published in Current Anthropology in 1995, proposed that the dramatic expansion of the human brain was made possible in part by a corresponding reduction in gut size — a trade enabled by switching to a higher-quality, more energy-dense diet rich in animal foods.

Perhaps the most arresting piece of paleopathological evidence for our meat dependence comes from Tanzania.
Fragments of a child’s skull, dated to 1.5 million years ago, show deformities consistent with porotic hyperostosis, a condition linked to vitamin B12 deficiency. B12 is found exclusively in animal-derived foods. If this interpretation is correct, it suggests that by that point in our evolution, meat had become not merely beneficial, but physiologically essential. Our bodies, having offloaded the metabolic cost of producing certain nutrients, had begun to rely on obtaining them externally. That is the biological definition of dietary dependence.

Did We Prefer Meat Over Plants?

If the story ended with meat, humans would look like obligate carnivores. But as we know full well, we are not. The evidence for deep, consequential plant-food adaptation in human biology is equally real — just harder to find in the ground.

That last point is not a throwaway caveat. It’s a genuine methodological problem. Animal bones, stone tools and calcified tissue survive for millions of years. Tubers, seeds, leaves and fruit do not. The asymmetry in what the archaeological record preserves has almost certainly created a systematic overemphasis on meat in reconstructions of ancestral diets. Plant foods leave faint traces: microscopic starch granules on the surface of grinding stones, phytoliths in ancient sediment, the occasional charred seed. They require far more careful excavation to find. The important point is this: their near-absence in many fossil sites is not evidence of absence in the diet.

The genetic evidence for plant-food adaptation is particularly compelling. A 2007 study in Nature Genetics found that humans carry significantly more copies of the salivary amylase gene (AMY1) than other primates. Amylase is the enzyme responsible for breaking down dietary starch in the mouth. The more copies you carry, the more amylase you produce and the more efficiently you digest carbohydrates. This is not a recent adaptation.
The duplication of AMY1 appears to predate modern Homo sapiens and correlates with a long evolutionary history of consuming starchy plant foods, most likely underground storage organs like tubers, which were available year-round in ways that large game animals simply were not.

Isotope analysis of Australopithecus skeletons, our ancestors from roughly 3 to 4 million years ago, shows dietary signatures consistent with a heavily plant-based diet, with little to no meat consumption. The major shift toward animal foods appears to coincide with the emergence of Homo and the development of stone tools, not with the entirety of the hominin lineage. This matters because it means the full span of our evolutionary history includes a very long chapter in which plants dominated.

And then there is cooking, which complicates everything beautifully. Richard Wrangham’s “cooking hypothesis,” developed across decades of research and synthesized in his book Catching Fire, argues that the control of fire and the cooking of food, both plant and animal, was itself a primary driver of brain expansion. Cooking dramatically increases the bioavailability of calories from starchy roots and tubers, making them a suddenly viable staple. It also softens meat, reducing the jaw musculature required to chew it raw, which in turn frees up cranial space for brain tissue. The key insight is that cooking was not just a way of making food safer. It was a biological lever that changed what counted as food in the first place.

Is Meat Non-Negotiable For Humans?

This is where intellectual honesty requires us to step back from the data and ask what we are actually trying to learn. “Did humans evolve to eat meat?” is, in a narrow sense, answerable: yes, substantially so, particularly over the last two million years. But the more important question, “What does that mean for how I should eat today?” is one that evolutionary biology alone cannot answer.
Miki Ben-Dor, one of the most vocal proponents of the hypercarnivore hypothesis, is explicit about this in his own published work. Ancestral eating patterns are not dietary prescriptions. The Pleistocene megafauna our ancestors hunted are largely extinct. Our food supply, physical activity levels, disease environment, lifespan and population density are radically different from anything our Stone Age ancestors encountered. Evolution optimizes for reproductive success in a given environment, not for long life in a modern one.

The epidemiological literature adds another wrinkle here. A 2015 meta-analysis published in The Lancet Oncology and commissioned by the World Health Organization’s International Agency for Research on Cancer classified processed meat as a Group 1 carcinogen and red meat as Group 2A, “probably carcinogenic” to humans. These classifications are not without nuance (dose matters enormously), but they suggest that whatever our ancestors’ digestive systems were calibrated for, the industrially produced meat that constitutes most modern consumption is a rather different substance.

What the totality of evidence actually supports is this: humans are metabolically flexible omnivores with notable carnivorous adaptations that emerged over the past two million years, overlaid on an older primate legacy of plant consumption. We are neither obligate carnivores nor natural herbivores. We are something more ecologically interesting: a species whose evolutionary success was built precisely on dietary versatility, on the capacity to extract adequate nutrition from whatever combination of foods a given environment provided.

That flexibility is arguably the most important thing our anatomy reveals. Not what we must eat, but what we can. The question of what we should eat — given our health, our ecology, our ethics and our planet — is one we have to answer ourselves, with better tools than our ancestors had. The bones, genes and stomach acid are informative.
They do not, however, have to decide our menu. ARTICLE SOURCE: https://www.forbes.com/sites/scotttravers/2026/04/27/did-humans-evolve-to-eat-meat-an-evolutionary-biologist-explains-what-your-anatomy-actually-reveals/
  16. That was absolutely adorable! I loved when they pulled the steak away from him and his eyes got all big like "yo! where are you going?" as he reached for it again with both his hands, lol :)
  17. until

    In this episode of Carnivore Talk, we dive into the buzz around Health Secretary Robert F. Kennedy Jr.'s recent claims that the carnivore diet dramatically eliminated his dangerous visceral fat—the deep abdominal fat surrounding his organs linked to higher risks of heart disease, diabetes, and metabolic issues. Kennedy shared in a video that a full-body MRI showed his organs "covered with visceral fat." A doctor recommended the carnivore diet, claiming it could clear it in 90 days. After one month, he reported losing 40% of it—and soon dropped to the 1st percentile. He also noted that about half the cabinet is following a similar approach. We also explain what visceral fat really is and why it's more dangerous than regular "pinchable" fat. Whether you're a longtime carnivore, curious skeptic, or just following the MAHA movement, this episode explores the real-world results, the controversy, and how animal-based eating might help combat modern chronic disease drivers like processed foods and seed oils. Join the conversation in the comments: Have you seen changes in body composition or energy on carnivore? Share your story! Subscribe, like, and hit the bell for more unfiltered carnivore discussions. Eat meat, stay strong! 🥩 WATCH: https://www.youtube.com/watch?v=mNDt8DXqycA
  20. AI scans 400,000 Reddit posts to flag overlooked GLP-1 side effects
by University of Pennsylvania · edited by Sadie Harley, reviewed by Robert Egan

By using AI to analyze more than 400,000 Reddit posts, Penn researchers have identified patient-reported symptoms associated with GLP-1s, the popular weight-loss and diabetes drugs semaglutide and tirzepatide, that may not be fully captured in clinical trials or regulatory documents. The new study, published in Nature Health, covers more than half a decade of posts from nearly 70,000 Reddit users and highlights two main classes of symptoms that warrant further study: reproductive symptoms, including irregular menstrual cycles, and temperature-related complaints, such as chills and hot flashes.

"Some of the side effects we found, like nausea, are well known, and that shows that the method is picking up a real signal," says Sharath Chandra Guntuku, Research Associate Professor in Computer and Information Science (CIS) at Penn Engineering and the study's senior author. "The underreported symptoms are leads that came from patients themselves, unprompted, and clinicians could potentially pay attention to them."

"Clinical trials generally identify the most dangerous side effects of drugs," adds Lyle Ungar, Professor in CIS and a co-author on the study. "But they can fail to find what symptoms patients are most concerned about; even though social media is not necessarily representative, a large collection of posts may reflect additional concerns."

The researchers caution that their findings are not causal. "We can't say that GLP-1s are actually causing these symptoms," notes Neil Sehgal, the study's first author and a doctoral student in CIS advised by Guntuku and Ungar. "But nearly 4% of the Reddit users in our sample reported menstrual irregularities, which would be even higher in a female-only sample. We think that's a signal worth investigating."
Studying social media for health

In 2011, Ungar participated in one of the earliest efforts to mine online, user-created content for information about drugs' adverse effects. "Online patient communities work a lot like a neighborhood grapevine," says Ungar. "People who are living with these medications are swapping notes with each other in real time, sharing experiences that rarely make it into a doctor's office visit or an official report."

In the years since, social media use has only grown, making data from these platforms increasingly promising as a source of information about the side effects of medications, even as the platforms themselves have made accessing the data more difficult. (Guntuku has also published research on strategies for adapting to changes in platform access.)

"Clinical trials are the gold standard, but by design, they are slow," says Guntuku. "This is not a replacement for trials, but it can move much faster, and that speed matters when a drug goes from niche to mainstream almost overnight."

Leveraging AI to analyze social media

Until now, the most challenging part of this process, which Guntuku calls "computational social listening," has been scale. Because users vary in how they describe their symptoms, the effort required to map individual social media posts to language in the Medical Dictionary for Regulatory Activities (MedDRA), which clinicians use to describe symptoms, limited the amount of data this approach could handle. Now, large language models like GPT or Gemini have enabled the systematic analysis of social media posts at an unprecedented scale. "Large language models have made it possible to do this kind of analysis much faster with a level of standardization that could be difficult to achieve before," says Sehgal.
Unreported symptoms

While the population the researchers studied is admittedly not representative—Reddit users are younger, more likely to be male and disproportionately based in the United States—the symptoms described in their collective accounts largely match the known side effects of semaglutide and tirzepatide: about 44% of users in the study described at least one side effect, most commonly some form of gastrointestinal distress.

What stood out was the nontrivial percentage of users who reported symptoms that may not be fully reflected in current drug labeling or routine adverse-event reporting. Nearly 4% of users who reported side effects described reproductive symptoms, including menstrual changes such as intermenstrual bleeding, heavy bleeding, and irregular cycles. Others reported temperature-related complaints, such as chills, feeling cold, hot flashes, and fever-like symptoms. In addition, fatigue ranked as the second most common complaint among Reddit users, despite reaching reporting thresholds in relatively few clinical trials.

"These drugs are thought to work by engaging part of the brain called the hypothalamus, which helps regulate a wide variety of hormones," says Jena Shaw Tronieri, Senior Research Investigator at Penn's Center for Weight and Eating Disorders and a co-author of the study. "That doesn't mean the medications are necessarily causing these symptoms, but it could suggest that reports of menstrual changes and body temperature fluctuations are worth studying more systematically."

Future directions

In the near term, the researchers hope their findings will encourage clinicians and researchers to take a closer look at the side effects patients are discussing online. "They're clearly on patients' minds, and that's worth paying attention to," says Sehgal. The team also hopes to expand the work beyond Reddit and beyond English-language communities to test whether the same patterns appear across different platforms and populations.
"We don't really know yet whether what we're seeing on Reddit reflects the experience of GLP-1 users globally, or whether it's particular to the kind of person who posts on Reddit in the United States," Ungar says. Ultimately, the researchers believe this kind of rapid, AI-assisted social media analysis could become a useful way to spot early warning signs around emerging drugs and wellness trends. For substances that trend quickly online, especially those sold in loosely regulated or unregulated markets, like injectable peptides, patient discussions on platforms like Reddit and TikTok may offer one of the earliest clues to what users are actually experiencing. "The whole point of this kind of approach is that it can move quickly, and that's exactly when it's most valuable," says Guntuku. ARTICLE SOURCE: https://medicalxpress.com/news/2026-04-ai-scans-reddit-flag-overlooked.html
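The pipeline the article describes (normalize free-text symptom mentions to a controlled vocabulary, then count what share of users report each normalized term) can be sketched in a few lines. This is only an illustrative toy, not the Penn team's code: the small phrase-to-term dictionary below stands in for the MedDRA normalization that the study performed with large language models, and the sample posts are invented.

```python
from collections import defaultdict

# Toy stand-in for MedDRA normalization: map colloquial phrases to
# preferred terms. The actual study used LLMs to do this mapping at scale.
SYMPTOM_MAP = {
    "nausea": "Nausea",
    "nauseous": "Nausea",
    "threw up": "Vomiting",
    "chills": "Chills",
    "hot flashes": "Hot flush",
    "irregular period": "Menstruation irregular",
}

def tally_symptoms(posts):
    """posts: list of (user_id, text) pairs.
    Returns the share of users mentioning each normalized symptom term."""
    users_per_term = defaultdict(set)
    all_users = set()
    for user_id, text in posts:
        all_users.add(user_id)
        lowered = text.lower()
        for phrase, term in SYMPTOM_MAP.items():
            if phrase in lowered:
                users_per_term[term].add(user_id)  # count users, not mentions
    n = len(all_users)
    return {term: len(users) / n for term, users in users_per_term.items()}

posts = [
    ("u1", "Week 2 on semaglutide and the nausea is rough"),
    ("u2", "Anyone else getting chills at night?"),
    ("u3", "No side effects so far"),
    ("u4", "Felt nauseous after the dose increase"),
]
print(tally_symptoms(posts))  # → {'Nausea': 0.5, 'Chills': 0.25}
```

Counting distinct users rather than raw mentions mirrors the study's reporting style ("nearly 4% of users described reproductive symptoms"); a real system would also need negation handling and deduplication across subreddits, which this sketch omits.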
  21. until

    🚫 Fiber is NOT essential. Meat is. The mainstream just freaked out over the new 2026 federal food pyramid that puts protein and meat front and center for kids. Nutrition "experts" and even some MAHA groups are begging the USDA: "Don't make school lunches too carnivore—kids need more fiber!" In this episode, we dismantle the hysteria. Why fiber has zero enzymes in the human body to digest it. Why there's no fiber deficiency disease in human history. How traditional zero-fiber carnivore populations thrived without it. And why pushing more plants onto kids' trays might be doing more harm than good. The real nutrient kids need? High-quality, bioavailable meat protein for growth, brain development, and satiety—not indigestible plant roughage that often causes bloating and irritation. If you're raising kids carnivore, fighting outdated school lunch guidelines, or just tired of the fiber myth dominating nutrition advice, this one’s for you. Drop your thoughts below: Is fiber essential… or the biggest nutritional oxymoron of our time? 👇 Watch now and share if you’re Team Meat Over Fiber! https://www.youtube.com/live/uXLAqm_mDm0?si=7-TmKkpGnhI5ytMy
  22. Scientists Reveal Why Bread Can Cause Weight Gain Without Overeating
17 April 2026 · By David Nield

New research in mice shows how eating bread can cause body weight and fat mass to increase, even though caloric intake stays at a similar level. The research, led by a team from Osaka Metropolitan University in Japan, highlights how carbohydrates can contribute to weight gain as well as excessive fat intake – which is what dietary advice tends to focus on.

This isn't the first time nutritionists have talked about bread and carbohydrates and their contribution to weight gain, but there hasn't been much detailed research into the relationship – especially for wheat flour – or into what might be happening at a metabolic level. The team discovered that eating more wheat bread was associated with reduced energy expenditure, pushing the metabolism towards a state where fat storage is prioritized, even when the calories in a diet stay at a similar level.

[Figure: The researchers analyzed the effect that bread in the diets of mice had on their weight (A) and fat tissue (B, C). (Matsumura et al., Mol. Nutr. Food Res., 2026)]

"These findings suggest that weight gain may not be due to wheat-specific effects, but rather to a strong preference for carbohydrates and the associated metabolic changes," says nutritionist Shigenobu Matsumura of Osaka Metropolitan University.

The researchers set up experiments in which lab mice were given a choice between their normal, healthy cereal-based diet and either simple bread, baked wheat flour, or baked rice flour. The mice were then monitored to check their weight and how their bodies burned calories at rest and when active. Using blood samples, the study team also examined hormone, blood sugar, and metabolite levels in the animals, while post-experiment tissue analyses assessed gene expression in the liver.
The experiments showed that the mice strongly preferred to switch from their standard diet to carbohydrate-heavy snacks, which then led to weight gain and more fat tissue in the mice, particularly in the males. Further analysis and follow-up tests suggested that these two key changes were being driven not by overeating or a lack of exercise, but by the foods themselves. In the wheat flour diet, fewer calories were being burned overall, while genes responsible for turning carbohydrates into fat were activated. Another follow-up test focusing on the wheat flour group showed that when the chow diet was restored, the weight gain stopped, and the metabolic shifts were reversed.

"In the future, we hope this will serve as a scientific foundation for achieving a balance between 'taste' and 'health' in the fields of nutritional guidance, food education, and food development," says Matsumura.

The findings are more evidence of how what we eat can cause changes in how our body processes food and burns the calories it contains. In the case of bread, it seems to slow down the body's metabolic engine.

One limitation of the study is that it used mouse models, rather than human volunteers. While it's likely that similar processes are happening in people, it's not certain – so that's something future studies can pick up. The researchers also want to experiment with a broader selection of foods to identify what exactly it is about bread that causes this reaction.

No diet study like this exists in isolation, of course. We know that a variety of other factors can also impact how our metabolism reacts to food and drink, including age and hormone-related changes. Further research should help establish the role that wheat and bread can play in a diet and how the simple "calories in, calories out" rule isn't always straightforward.
"Going forward, we plan to shift our research focus to humans to verify the extent to which the metabolic changes identified in this study apply to actual dietary habits," says Matsumura. "We also intend to investigate how factors such as whole grains, unrefined grains, and foods rich in dietary fiber, as well as their combinations with proteins and fats, food processing methods, and timing of consumption, affect metabolic responses to carbohydrate intake." The research has been published in Molecular Nutrition & Food Research. ARTICLE SOURCE: https://www.sciencealert.com/scientists-reveal-why-bread-can-cause-weight-gain-without-overeating
