
Our Designated Diet Beyond Breast Milk: Musings on Edenic Nutrition

Some of the most persistent misconceptions in nutrition congregate around the question, “what is the natural diet for humans (after breast milk)?”. The idea is relatively simple: identify which human diet is the “designated” one, the most “biological”, “Edenic”, “ancestral” or “instinctive”, and you have the blueprint for human nourishment in the palm of your hand, which would surely be amazing. The problem, of course, is that an eclectic (often contradictory) throng of these primordial dietary suppositions juts out across our modern dietary landscape, each claiming to be the one we are absolutely intended to eat, above all others. Almost all of them, however, are laced with fallacies, often born of a triad of historical, scientific and “intuitive” assumptions. Let’s explore a few examples.

Grass seeds, more commonly known as grains, are a commonly maligned food group within paleo/primal circles (indeed, a lot of the time, so are carbs in general). The proposition goes as follows: as grains did not exist in our diet pre-agriculture, that is to say, prior to around 10,000 years ago, we are not really “designed”, and are thus anatomically ill-equipped, to process them. This presumption has gone relatively viral. Even people who do not follow a paleo diet per se have adopted the advice. Nevertheless, it is not true. Our terrestrial predecessors consumed grains for tens of thousands (probably hundreds of thousands) of years prior to the supposed genesis of agriculture. In fact, one of the reasons why grains formed the bulk of the earliest agricultural vanguard was precisely their multi-generational favour amongst earlier hunter-gatherer populations. They were not planted ex nihilo. For example, peoples living along the shoreline of the Sea of Galilee in Israel consumed wheat and barley some 10,000 years before these grains were domesticated. At a Gravettian culture site in what is now Italy, remnants of oat flour have been found on pulverising tools dating back over 30,000 years, indicating use for flatbreads, gruel or porridge. In cave sites in Mozambique, wild sorghum granules have been found on grinding stones that may date to over 100,000 years ago. How much further back grain consumption goes is anyone’s guess.

Furthermore, bread and beer are not products of agriculture, as many assume. They are hunter-gatherer inventions. Wild strains of the ancient einkorn and emmer wheats, wild barley and others were used to make early beers, thus far dated to at least 13,000 years ago in Israel. Similarly, in Jordan, comprehensive brewing apparatus predates the advent of agriculture by over 1,000 years, leading many anthropologists to theorise that beer (and the need for more beer) was the most significant impetus for the development of agriculture in the first place. Either way, grain consumption is far from new. Grains are a paleo food. Just don’t conflate those ancient wild grains with the far-removed, heavily hybridised modern strains, and all the dubious ways in which they are prepared.

Many so-called instinctive/naturopathic diets propose the doctrine that our natural diet would entail only foods that we would be able to pleasingly eat on their own, as an uncooked mono meal, without any additional flavourful garnish or accompaniment needed to enliven, embellish or mask them. Under these dietary guidelines, overly bland foods, for instance grains, many roots and meat, would not have been selected, as they would be seen as insufficiently appetising in their raw, unsullied form. Likewise, overly strong, spicy, pungent or bitter foods like chilli, garlic, onion, neem or horseradish would not be considered suitable dietary candidates as they are too intense and “insulting” to our palate; often a bite or two is enough to make us cringe. Even cacao would not have made the selection due to its intense bitterness (though the sweetish pulp that naturally surrounds the beans would have been well received). Certainly, for an explorative hunter-gatherer, anything too strong in taste could be an indication of poison, so there is a natural sense in the cautionary approach. Instead, he or she would focus on sweet or fatty fruits, young tender leaves, and a few nuts. I am sure wild honey would (or at least could) opportunistically feature as well. Essentially, this “instinctive” approach is a fruit-centric diet. What nature provides for us, in its most tempting and enticing variations, is the human diet par excellence. What nature serves us, through the seasonal procession of different fruit and berry species coming into their glorious ripeness (in many cases splitting open a little to accommodate their pert fullness), is an exhibition of ultimate allure to the instincto (and, I would wager, to many of us too).

While I accept the positive value in a good number of these principles, and the central position that fruit consumption undoubtedly had for swathes of our terrestrial ancestry, I want to draw your attention to two assumptions that have unnecessarily limited many versions of these diets and turned congregations of adherents into overly restrictive dogmatists. Firstly, by avoiding anything overtly strong or “challenging” to the taste buds, you tend to avoid a phenomenal spectrum of nature’s potent medicinal cornucopia, 20 to 50 wild plants of which could easily add up to a broad-spectrum, multi-organ turbo charge, improving activity, functionality, resistance and protection across the entire body. During my childhood I had asthma; some days it was very severe. However, through a mix of intuition and noticing how food affected me, I discovered that eating raw onion stopped asthma attacks in their tracks. Raw onion was not only a lifesaver, it cured my asthma to the point where I never had an attack again, simply through regular consumption of it. Now, I admit it was not easy, chewing through the onion, tongue burning, eyes weeping, but when you can literally feel your bronchial tubes relaxing and opening up, oxygen rushing into your lungs as it should, trust me, you keep going. Correctly prescribed medicinal plants, leaves, bulbs, roots and barks are very powerful. They can make you sweat, burn, stimulate all of your juices, target ailing organs through direct interaction with them, enliven and tonify. Disregarding such a broad and diverse kingdom is simply not a smart move.


The second assumption concerns anthropocentrism. I am sure you are aware that numerous faith-based diets have a strong anthropocentric ethos to them, a philosophy which also pervades a lot of the naturopathic/instinctual eating literature. In this light, nature, especially the bounty that she freely provides, was made for us, is designated to us, exists to feed and nourish humanity. And yet, when we wind up being initiated into a culture that proffers belief in an anthropocentric reality, it can sometimes be difficult to disentangle food and good nutrition from the cultural biases of our food community. “And God said, Behold, I have given you every herb yielding seed, which is upon the face of all the earth, and every tree, in which is the fruit of a tree yielding seed; to you it shall be for food” (Genesis 1:29). If nature’s bounty of fruits (which all, by definition, bear seeds), nuts and seeds is appointed ours, it can easily be taken that these foods are ready to eat and good to go, intended for us just as nature provides them. Appointed or not, a central component of that far-reaching belief flies in the face of plant biology.

You see, in reality, most seeds, and for that matter plants in general, don’t want to be eaten. In fact, nature goes to considerable lengths to protect its wares from opportunistic herbivores, and it does this through a complex of different means, working over many different temporal cycles. The most immediate is the presence or production of secondary toxins, compounds like tannins, alkaloids and terpenoids which can be poisonous in larger doses. Tannins, for example, generally have a bitter, astringent taste and, when concentrations are high enough, act as an effective deterrent. This can also happen in real time in response to over-browsing of an individual plant or tree: the plant responds by increasing the secondary toxins in its leaves or seeds to deter whatever is consuming it. Plants can start sweet and become bitter in a few hours, and can even warn surrounding neighbours, who will immediately up their distasteful chemical defences too. Seeds also carry other secondary toxins, lectins and so-called anti-nutrients, which have the capacity to cause an “inability to thrive” syndrome: strongly impairing digestion, instigating adverse biological effects, interfering with hormone output and, with enough consumption, causing serious reproductive disorders and infertility. Let me say it again: most plants do not want to be eaten; seeds want to sprout and grow into baby plants and baby trees, plant leaves want to produce energy for growth, fruiting, flowering and reproduction. An anthropocentric view masks this reality. Ripe fruits are the predominant exception. They want to be plucked and gorged upon, they want their seeds to be scattered far and wide, to form new offspring, new groves. A good percentage of small fruit seeds typically pass through our digestive canal unscathed, and (as a bonus) in their own fertilising manure. The relationship, at least as it looks within traditional indigenous networks, is an incredible, under-appreciated example of cross-species mutualism.

The truth is, we cannot properly discuss the true nature of our designated diet without involving the concept of innovation, without bringing its import to the table. It was innovation that allowed our diet to grow, to extend beyond any initial parameters. I once laughed at an argument made against urine therapy and its naturalness within the human dietary framework. It was alleged that urine therapy couldn’t be a reasonable or natural practice because humans don’t naturally have a cup to pee into, and we probably weren’t a good enough shot without one. But by that same logic, we wouldn’t consume the water and flesh of coconuts because we don’t “naturally” have a machete, or be able to climb an açaí palm to harvest its fruits because we don’t “naturally” have a rope around our feet. And yet, even the most rudimentary use of innovation brings all of these things readily to hand. Clearly, our ability to innovate is part and parcel of who we are and how we have always developed. Homo innovatus, we could so easily say. Each time our terrestrial ancestry did innovate, the world was a little changed, and a new tier of palatable possibility was erected, either by allowing access to unprecedented foods or through the “digestion” of a new remit of foods outside of our own bodies, before we even consumed them. In either case, innovation, biology and food suitability are intrinsically linked, and we should not try to separate them. The key perhaps, the nourishing sweet spot, was to always have one bare foot in the fruit forests and the other in the experimental terrains. Because, after all, the most vitalising diets, the most nourishing formulations, may still be yet to come, lying undiscovered until our curiosity unearths them.

Kyle