Human Dietary Evolution: Gaming the System
The dietary habits of human beings have changed significantly since the extinction of the Pleistocene megafauna and the subsequent advent of agriculture. These changes were driven by survival pressures: the ever-increasing effort required to obtain animal-based nutrition and the diminishing returns of a forced reliance on smaller prey. Compelled to depend more heavily on plants and fungi than before, we found a solution in domestication, which addressed not only food scarcity but also problems of digestibility and toxicity.
Yet despite contemporary dietary practices reflecting omnivorous patterns, human physiology is largely unchanged from that of our Upper Pleistocene ancestors, which means we are still fundamentally hypercarnivores by physiology and not true omnivores. This matters because understanding our actual nutritional requirements, and what our bodies are inherently adapted to handle as food, allows the individual to align their dietary choices with their physiology rather than against it – which yields far better health outcomes than typical contemporary dietary patterns.
The ‘elephant in the room’, then, is the existence of people who follow a vegetarian or even vegan dietary approach and yet remain in good health. Many may assume this disproves the notion that humans are still hypercarnivores by physiology, but a better understanding of this apparent contradiction comes from appreciating how our physiological requirements overlap with the ‘learned practices’ that enabled us to avoid starvation and oblivion at the end of the Pleistocene.
Through an examination of human ingenuity in cooking, processing, and the domestication of food sources, we can illustrate how these developments have fostered the pervasive misconception that humans are true omnivores. This examination also demonstrates that, in the absence of cooking, processing, and domestication, the plant and fungal options available to the human diet would be severely limited, reinforcing the argument for our hypercarnivorous nature. By analyzing these aspects, we can better understand the evolutionary pressures that shaped our dietary practices and how modern perceptions overlook these deep-seated biological realities.
Hypercarnivorous Nature of Humans:
The term "hypercarnivore" refers to species that derive more than 70% of their dietary intake from animal sources (Parker et al., 2020). Anatomically and physiologically, humans exhibit several traits indicative of hypercarnivory, including a relatively high requirement for fat and protein and certain micronutrients predominantly found in animal tissues, such as vitamin B12 and heme iron (Cohen et al., 2009). Unlike typical omnivores, whose digestive systems are adapted to process a wide range of fibrous plant materials, humans have evolved a digestive system that is significantly more efficient at processing animal matter. For instance, the human gastrointestinal tract is shorter and less complex compared to that of herbivores and true omnivores, indicating an evolutionary adaptation to a diet rich in animal proteins and fats rather than cellulose (Perry et al., 2015). Additionally, our savagely low stomach PH puts us on par with carrion animals such as vultures and like all carnivores we rely on auto enzyme-based digestion in our stomachs and small intestine rather than fermentation as is required for the breakdown of most plant materials.
The Extinction of Megafauna and Agricultural Shift:
The extinction of the megafauna at the end of the Pleistocene marked a critical turning point in human dietary practices. As large game became scarce due to climate change and overhunting, early humans were forced to adapt through the development of sophisticated hunting and foraging strategies (Haynes, 2002). This adaptation involved not only a reliance on smaller game but also a more diverse foraging repertoire, including fish, birds, and small mammals, which sustained human populations during this transitional period. However, the transition to agriculture around 10,000 – 12,000 years ago fundamentally altered food production and consumption patterns. The domestication of staple plants, such as wheat, rice, and maize, alongside livestock like cattle and pigs, allowed for a more stable and varied diet but simultaneously obscured the hypercarnivorous foundation of human dietary evolution.
The introduction of agricultural practices meant that we could selectively breed and domesticate plants to become increasingly suited to use as a staple food source rather than the supplementary source they had been before (i.e., used to avoid starvation if and when the hunt was unsuccessful). This allowed societies to support larger populations, but it also led to a decline in the consumption of wild game and a shift in dietary priorities. The agricultural revolution, while a significant achievement of human ingenuity, inadvertently fostered the misconception that humans are naturally omnivorous, as the diets of agricultural societies became heavily plant-based.
The Role of Cooking and Processing:
Cooking and processing represent pivotal innovations in human dietary history, with profound implications for nutrition and social structure. By making food safer and more digestible, they have enabled humans to extract greater nutritional value from a range of foods, including starchy plants and tougher cuts of meat (Wrangham, 2009). These innovations not only enhanced the energy and nutrient intake from existing sources but also allowed the consumption of foods that would otherwise be inedible or toxic in their raw forms. For example, potatoes must be cooked before they are properly digestible, and lima beans must be soaked and then cooked to neutralize their toxins and make them safe to eat. These technological advances have reinforced the belief in human omnivorism, as they allowed the integration of a diverse array of foods that would otherwise be indigestible or harmful. However, it is crucial to recognize that without cooking and processing, the variety and safety of the human diet would be severely constrained, emphasizing our reliance on animal sources for essential nutrients.
Wild Food Availability: A Return to Ancestral Diets:
To understand the limits of human dietary adaptability in the absence of cooking, processing, and domestication, it is essential to examine what can be safely consumed in the wild. In a natural setting, the edible plant foods available to humans are extremely limited and often nutritionally inadequate. Many wild plants require cooking to neutralize toxins or enhance digestibility, such as legumes (e.g., raw kidney beans contain toxic lectins) and certain tubers (e.g., cassava contains cyanogenic glycosides) (Pérez-Jiménez & Paiva, 2010). Even fruits and nuts, while edible, do not on their own provide sufficient essential nutrients compared with animal sources, and foraging for wild plant foods poses significant challenges in terms of both safety and nutrition. Moreover, while there is a risk from pathogens (bacteria and/or parasites) encountered when consuming raw meat, humans exhibit a stomach pH that is among the lowest in the animal kingdom, typically ranging from 1.5 to 3.5 (Höfer et al., 2006). This acidic environment is crucial for the effective breakdown of animal-based foods and plays a significant role in pathogen defence. Strong stomach acid acts as a barrier against harmful bacteria and parasites commonly found in raw meat, neutralizing ingested pathogens before they can cause harm (Klein et al., 2016). The evolutionary adaptation of a low stomach pH reflects the dietary patterns of early humans, who consumed a diet rich in animal protein and fat, further supporting their classification as hypercarnivores (Wrangham et al., 1999).
Gaming the System:
It can therefore be reasoned that modern consumption patterns, so heavily reliant on domesticated plants and fungi, most of which still require cooking and/or processing before they are safe to consume, are in fact an ingenuity-enabled bypass of our inherent digestive restrictions. We have learned to ‘game the system’ and work around our digestive limitations, but this does not make us ‘true’ omnivores: every other creature on this beautiful planet of ours is classified according to its digestive function in its natural habitat, and no other species has learned to cook, process, and domesticate its food as humans have.
However, this is not to suggest that we should abandon our ingenuity in cooking, processing, and domestication, and no one is seriously arguing for a return to raw-meat-only diets. Rather, knowing that our physiology remains that of a hypercarnivore, and understanding the ingenious ways we have made foods more accessible, we can consciously align our dietary choices with what is inherently better suited to our digestive system (i.e., animal sources) and use that same ingenuity to augment it, providing even greater yield and, in turn, superior nutrition.
Conclusion:
In conclusion, while the evolution of human dietary practices reflects significant ingenuity in cooking, processing, and domestication, these adaptations have fostered the misconception that humans are true omnivores. Our anatomical and physiological traits support a classification as hypercarnivores and underscore our reliance on animal-based nutrients throughout evolutionary history. The limitations of wild food availability further highlight how unsustainable a strictly wild plant-based diet would be for humans, emphasizing our deep biological roots in hypercarnivory. Understanding those roots provides crucial insight into our dietary needs in a modern world where the abundance of processed foods can obscure the nutritional deficiencies that arise from a disconnect with our evolutionary past. As we navigate contemporary dietary landscapes, recognizing our true dietary heritage can inform better health choices and dietary practices, promoting superior health outcomes, or at least helping us avoid the pitfalls of modern consumption patterns and the associated lifestyle diseases.
References:
· Cohen, A. S., et al. (2009). Human dietary needs and evolutionary history. *Evolutionary Anthropology*, 18(1), 20-33.
· Eaton, S. B., et al. (2002). The ancestral human diet: What was it and should it be a model for modern diets? *European Journal of Clinical Nutrition*, 56, S8-S28.
· Haynes, G. (2002). *Mammoths, Mastodonts, and More: The Story of the Ice Age Animals*. National Geographic Society.
· Hassan, F. A. (2007). Agricultural origins: A bio-cultural perspective. *World Archaeology*, 39(1), 1-17.
· Höfer, S., Markl, J., & Möller, A. (2006). Stomach pH in mammals. *Journal of Comparative Physiology B*, 176(5), 291-302.
· Klein, A., Decker, H., & Grunert, O. (2016). The role of gastric acid in pathogen defense: Implications for diet and health. *Microbial Pathogenesis*, 101, 30-36.
· Mead, P. S. (2004). Food-related illness and death in the United States. *Emerging Infectious Diseases*, 10(1), 7-15.
· Parker, C. A., et al. (2020). Defining hypercarnivory and its implications for human evolution. *Paleoanthropology*, 2020, 1-10.
· Perry, G. H., et al. (2015). The evolutionary biology of human dietary patterns. *Nature*, 527(7577), 56-62.
· Pérez-Jiménez, J., & Paiva, M. J. (2010). Nutritional and health benefits of legumes. *Nutrition Reviews*, 68(4), 186-200.
· Wrangham, R. (2009). *Catching Fire: How Cooking Made Us Human*. Basic Books.
· Wrangham, R. W., Jones, J. H., Laden, G., Pilbeam, D., & Conklin-Brittain, N. L. (1999). The raw and the stolen: Cooking and the evolution of Homo. *Current Anthropology*, 40(5), 567-594.