Is “All Natural” Really “All Good”?
Updated: Feb 4, 2022
The presumed safety of natural substances compared to synthetic (or artificial) food ingredients has created a rush to “all natural” and “clean label” products that has become a tsunami over the last decade. This clamor for naturals is partly to avoid multi-syllabic chemical names on the label, but also, in a clever and indirect way, to assure consumers they are getting something safer than an artificial ingredient because it was presumably vetted by nature. Or there is the equally fallacious claim that “we evolved” consuming the natural substance, “so how could it be harmful?” But is this belief that natural is inherently safer than artificial actually true? Putting aside urban myths and unicorns for the moment, could the most realistic version of “vetted by nature” include eliminating (i.e., killing off) the non-tolerant members of the species that consumed the natural substance, leaving only the hardiest individuals? Is this food-based Darwinism?
I would tend to believe that, from the viewpoint of a hunter-gatherer (with a paucity of dining choices), anything that did not appear repugnant, tasted good (i.e., was not bitter with potentially dangerous alkaloids), sustained me (i.e., was nutritious), and was not immediately (acutely) toxic was probably a good thing to eat, at least most of the time. Our hunter-gatherer ancestor would likely not eat green apples or three-day-old fish more than a couple of times, and he learned to limit his intake of certain items that could have toxic effects if consumed in high amounts (e.g., licorice), to process certain foods that would be toxic in their native state (e.g., cassava root), and to eat only rhubarb stems and not the leaves. Hunter-gatherers with disaccharide, gluten or lactose intolerance, favism or phenylketonuria were simply out of luck, even though the basis of their intolerance was consumption of a natural substance. Nature regarded these hunter-gatherers with intolerances or metabolic anomalies as residing at the tail of the bell curve and dealt with them rather harshly: either they learned to be careful in their selection of foods or they did not live to pass on their genetically based challenges.
In the late 19th and early 20th centuries, the idea of naturals as inherently safe was never much of a front-seat issue with most consumers, compared to the outrage that synthetic chemicals generated once they were characterized as poisonous. Especially influential in shaping the concept of poisonous food additives were the pioneering efforts of Harvey Wiley, whose work on food additives at the turn of the 20th century eventually resulted in the Pure Food and Drug Act of 1906. Wiley fervently believed that all added non-natural substances posed a potential risk to public health and that none was wholesome (White, 1994). Wiley’s screeds during his government service, and later as chief of the laboratories of Good Housekeeping Magazine, were misdirected to some extent: his condemnation of “chemicals” was aimed at genuinely toxic entities such as coal tar-derived artificial colors and preservatives (such as formaldehyde and borax), but these admittedly bad actors were conflated with all chemicals used in food, and all were characterized as harmful. During this period, Wiley’s followers even wanted the regulations to refer to all chemicals added to foods as poisons. At this point in time, the degree of hysteria about artificial chemicals as poisons in food was at its all-time peak among the informed few, but not yet at its all-time high as a widespread public concern.