The following article is a re-blog from LIFEXTENSION, whose blog I have only just found and am loving. This article was too good to send out as a mere tweet pointing to the author's page, so I am reblogging it here!

Please note, again, that I am not the author of this material; I am merely sharing it. All credit goes to LIFEXTENSION.

Debunking and Deconstructing Some ‘Myths of Paleo’. Part One: Tubers

Glucose restriction is not only the most crucial component of ancestral diets but also by far the easiest element to emulate.

Introduction
The most profound consequence of the shift from foraging to farming was the introduction of plant foods and our species' subsequent dependence upon them. This dietary change has resulted in the greatest biological damage done to human health and longevity in the history of our evolution. It was brought about by a reduction in the consumption of animal-sourced fat, protein, and marine foods; by the consequent diminishment of critical micronutrients; but chiefly by the infiltration of novel, insulinogenic, and glycating foods to which our biology was not even minimally adapted.
Bioarchaeological study of skeletal remains from Neolithic transitional contexts reveals that the introduction of starches and cultigens, together with the accompanying narrowing of dietary focus, resulted in a severe deterioration of health and longevity for most human populations within the last 10,000 years. This decline is indicated by the first incidences of now-common chronic and degenerative diseases, along with significantly elevated prevalences of various skeletal and dental pathological conditions, and by altered growth patterns in prehistoric cultivators. Ultimately, the shift from a more than 2.5-million-year-long high-fat, hunting-based subsistence to a plant-based one occasioned significant and widespread biological changes in numerous human populations worldwide.
Ubiquitously, the starch-apologists of current paleo circles seem to have conveniently forgotten the definition of the term 'agriculture'. Agriculture broadly includes any type of plant cultivation practised for dietary purposes. With the exception of Australia, centres of plant domestication can be found on all inhabitable continents within the last 10,000 years. Major crops included wheat and barley in the Near East; sorghum, millet, yams, and dates in Africa; millet and rice in northern China; rice, taro, yams, and sugarcane in Southeast Asia; squash, maize, and beans in Central America; and potatoes, sweet potatoes, and manioc in South America. Even the inclusion of seed plants in the diets of human populations in North America instigated significant changes to human biology. Diachronic and comparative analysis of skeletal data from human hunters and cultivators across the globe has revealed that, prior to the onset of agriculture, carbohydrates must have comprised only a rare and occasional component of ancestral eating patterns. Furthermore, the deleterious impact of introducing carbohydrates into human diets was almost immediate.
Neolithic archaeological contexts reveal that the intensification of plant food consumption significantly increased incidences of tooth decay and other oral problems, as the acidic by-products of metabolising dietary carbohydrates demineralised enamel, dentin, and other tissues. Globally, the intensification of starch consumption amongst human populations saw an equivalent rise in the frequency of carious teeth. Similarly, the foraging-to-farming transition resulted in reductions in the size of the face, jaws, and teeth, and consequently in tooth crowding, malocclusions, diminished room for dentition, and poorer oral health overall.

The reduction in size and robusticity of the human skeleton is a clear temporal trend in newly agricultural communities. Diachronic skeletal comparisons reveal large-scale, significant reductions in growth rates. The average stature of men and women in the Palaeolithic has been approximated at 6 ft and 5 ft 6 in, respectively; however, by the late Neolithic, average height had fallen to 5 ft 3 in and 5 ft. Worldwide, numerous skeletal studies also reveal that agricultural populations suffered far more infections than communities still ensconced in hunting and foraging. Instances of porotic hyperostosis, brought on by iron-deficiency anaemia, increased dramatically in agricultural settings. A greater focus on domesticated plant foods also resulted in nutritional deficiencies, owing to the reduced availability of micronutrients most readily obtained from, or exclusive to, meat, such as iron, zinc, vitamin A, and B12.

Ultimately, these diseases and pathological conditions, common in incipient farming communities subsisting on plant foods, were by and large rare and exceptional amongst the preceding hunters of the Pleistocene. Moreover, even though many of these emerging agricultural groups continued to supplement their diet with hunting, meat and fat consumption alone was not sufficient to ameliorate the novel impact of glycation and the heightened insulin load from increased carbohydrate consumption. While the sources of cultivated starches varied worldwide – yams and dates (Africa); millet and rice (China); sugarcane, taro, and yams (Southeast Asia); maize, beans, and squash (Central America); potatoes, sweet potatoes, and manioc (South America) – they affected human consumers in biologically similar ways, and to their ultimate evolutionary detriment.
Potatoes and other plant foods are neither ‘primal’, nor do they promote health, leanness, or longevity.
For many reading this post, the application of an evolutionarily appropriate diet for the purpose of obtaining and sustaining health and longevity is axiomatic. The need to engage foundational evolutionary principles is rendered even more pertinent considering how poorly adapted we are to the suboptimal environments, novel foods, lifestyles, and states of metabolic and endocrinological derangement in which we subsist. Far too little evolutionary time has passed for us to be successfully acclimatised to the novel conditions of agricultural life. Consequently, modelling our current food choices and nutrient profile on food groups to which we are biologically attuned appears to be the most accessible and conceivable way of gaining and maintaining health in modern society.
Paradoxically, however, many proponents of a 'Paleo' (i.e. pre-agricultural) diet have promoted tubers and other starches not merely as benign, but as necessary health foods for the correction of metabolic and endocrine disorders. Potatoes, rice, and other oxymoronically labelled 'safe' starches are being promoted in spite of the fact that they are exclusively Neolithic foods. It is this conflation of 'starch', 'safe', and 'ancestral' that I now wish to address, and hopefully correct.
Since the dawn of the Pliocene–Quaternary glaciation 2.85 million years ago, the world and its inhabitants have been subject to the cycles of ice ages. Because most of our human history was spent in glacial conditions, our physiology has been moulded by the climatological record, with only brief, temperate periods of reprieve that could conceivably have allowed any significant amount of edible plant life to grow. Studies of human coprolites (fossilised faeces) dating from three hundred thousand to fifty thousand years ago reveal a distinct lack of plant material in the diets of the hominids concerned. Small, seasonal amounts of plants, seeds, nuts, and tart, low-sugar fruits certainly would have figured in the diets of early Homo; however, archaeology, anthropology, evolution, and genetics reveal such happenstances to have been extremely rare throughout the entirety of human prehistory. Consequently, our adaptation to some of these foods – especially to the insulinogenic properties of later cultivated varieties – would have been minimal at most.
In an attempt to reconstruct the diet of ice age hominids, a recent study analysed the macronutrient composition of the diets of existing hunter-gatherer groups within latitude intervals from 41° to greater than 60°. All were characterised by a very low carbohydrate content (<15% of total energy). Hunter-gatherers living in northern areas (tundra and northern coniferous forest) consumed very low quantities of carbohydrate, as did communities subsisting in temperate grassland and tropical rainforest environments. Only human groups from desert and tropical grassland zones consumed a moderate amount of carbohydrate. Independent of the local environment, however, the range of carbohydrate intake in the diets of contemporary hunter-gatherers was markedly lower than the amounts currently recommended for humans in Western cultures.
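To put that figure in concrete terms, a rough illustration (assuming a 2,000 kcal reference intake and the standard 4 kcal per gram of carbohydrate, neither of which is specified in the study): less than 15% of energy from carbohydrate works out to under 2,000 × 0.15 ÷ 4 = 75 g per day, whereas typical Western recommendations of roughly 45–65% of energy equate to approximately 225–325 g per day, three to four times the forager ceiling.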
Ultimately, the very low-carb diets of these northern indigenous communities correspond most plausibly to what our hominid ancestors would have consumed throughout our ice-age past. Further information on the evolution of our diet can be garnered from the genetic data of present populations, which demonstrate a historically late biological adaptation to only minimal quantities of starch, and to only a few specific starch compounds. This evidence is unsurprising. The evolution of early Homo cranial and gut features rendered the underground storage organs of contemporaneous plant species almost physiologically impossible to digest and metabolise. In fact, the large brain of our species developed as a direct result of a compensatory reduction of the human gut, which allowed for a significant increase in brain size while maintaining the necessary metabolic rate. As a result, the brain became the most energy-expensive organ in the human body, consuming 25% of the adult and 70–75% of the newborn metabolic budget. The biology of H. sapiens is, in effect, engineered against the consumption of starch and fibre, and in favour of fat and protein. As the large intestine diminished to offset increased brain volume, humans were no longer able to transmute fibre into fat through fermentation in the large intestine, as other primates can (which is why, in effect, those primates eat a high-fat diet). Thus, replacement with nutrient-dense, more bioavailable exogenous fat from animal sources would have been crucial to our success and evolution as a species.
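As a rough, illustrative calculation of that expense (assuming an adult resting energy expenditure of about 1,600 kcal per day and a brain of roughly 2% of body mass, figures not given in the original text): 25% of the budget is 1,600 × 0.25 = 400 kcal per day, a continuous draw of about 400 × 4,184 J ÷ 86,400 s ≈ 19 watts from one small organ.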
Furthermore, the energy that would have had to be expended in procuring edible plant species would easily have exceeded their potential caloric value. The consequences of the limited availability of edible Palaeolithic plant foods, and the time investment they demanded, have been analysed by Stiner, who compared food-class returns amongst contemporary hunter-gatherer groups. Stiner found the net energy yield of roots and tubers to range from 1,882 kJ/hour to 6,120 kJ/hour (not counting the additional time needed for preparation), compared with 63,398 kJ/hour for large game.
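Taking Stiner's figures at face value, the disparity is stark: 63,398 ÷ 6,120 ≈ 10 and 63,398 ÷ 1,882 ≈ 34, so an hour spent hunting large game returned roughly ten to thirty-four times the energy of an hour spent gathering roots and tubers, before any processing time is even counted.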
Modern roots and tubers have been extensively modified throughout our recent agricultural history in order to reduce the presence of harmful compounds and increase the quantity of digestible carbohydrate. While legumes and underground storage organs would have been present in the parched savannah-woodlands of subtropical Africa, toxic alkaloids and digestion-inhibiting compounds would have almost entirely restricted their use until the much later advent of fire. Even then, extremely careful selection and preparation through extensive cooking would have been vital for plant toxins to be sufficiently neutralised for safe consumption; and to this day, in spite of cooking, many starches remain immunoreactive for many people. Even at times and places in prehistory where the habitual use of fire is known to have existed, it does not seem to have led to a discernible consumption of plant foods. Even in the temperate climates of Upper Palaeolithic France, Italy, Romania, and Croatia, which boasted extensive vegetation, and at sites where the control of fire is clearly evidenced, nitrogen isotope studies reveal the consumption of a diet rich in animal content rather than in vegetal foods.
The archaeological evidence for plant consumption by early Homo is virtually non-existent. I will concede, however, that absence of evidence is not evidence of absence. Proponents of the 'starchivore' hypothesis argue that this absence is due to the poor preservation of organic plant material in the prehistoric record. Yet plant remains have been preserved from the Lower Palaeolithic onwards, and where they occur they served primarily functional and material, rather than nutritional, purposes. Plants were used as bedding at Tautavel (France, Lower Palaeolithic), at Franchthi (Greece, Upper Palaeolithic), and at the Mas d'Azil (France, Azilian). Similarly, at Lascaux, pollen was uncovered from material used as bedding throughout the Magdalenian. The famous Neanderthal grave at Shanidar, Iraq, contained pollen traces of eight different types of flowers: small, brightly-coloured varieties, possibly woven into the branches of a shrub. Pollen analysis of the particular flower types suggested to the researchers that they may have been chosen for their specific medicinal properties: Yarrow, Cornflower, Bachelor's Button, St Barnaby's Thistle, Ragwort, Grape Hyacinth, Joint Pine or Woody Horsetail, and Hollyhock were represented in the samples, all of which have extensive curative properties as diuretics, stimulants, astringents, and anti-inflammatories. Possible evidence of plant eating comes from Magdalenian/Azilian contexts in a number of Pyrenean caves, which yielded quantities of acorns, nuts, and perforated fruit-stones; however, it is widely believed that such finds are most likely attributable to the activity of rodents. Finds of legumes at Kebara Cave may have been utilised as fire-starting material on account of their morphological characteristics, and other plants found at the site likely served medicinal, rather than nutritional, purposes. Nonetheless, Kebara still presents the possibility that plant-foraging was practised there during the Middle Palaeolithic. This is unsurprising: that omnivorous hominids may have supplemented their diets with low-glycaemic plant foods when meat was scarce is neither a new nor a controversial notion. It sheds little light on the greater historical context of human nutrition, or on the best way to implement evolutionary principles for optimal health.
Similar arguments and hypotheses have been made that Neanderthals ate starch. Yet Neanderthals were highly carnivorous and physiologically ill-equipped to digest plant foods. This can be gauged from the megadontia quotient, which relates postcanine tooth area to body mass, and which indicates that H. neanderthalensis must have consumed a diet of greater than 80% animal foods. The phytoliths and grains recovered from Neanderthal remains at Shanidar Cave may therefore reveal the rare consumption of starches in this singular context, but they say nothing of the deleterious costs to the health of those that ate them.

The intake of plant foods by hominids was most plausibly minimal. This is due to their limited, seasonal availability; the physiological ceiling on fibre and toxin intake; the biological evolution of early Homo physiology; and the technological, spatial, and temporal limitations of the obligatory pre-consumption preparations. Consequently, evolutionary arguments for the consumption of what are quite blatantly Neolithic foods are rendered paradoxical and absurd. Starches are neither 'Paleo', nor does our evolutionary biology sanction them as 'safe'.
