by Kenneth W. Krause.
Kenneth W. Krause is a contributing editor and “Science Watch” columnist for the Skeptical Inquirer. Formerly a contributing editor and books columnist for the Humanist, Kenneth contributes regularly to Skeptic as well. He may be contacted at email@example.com.
Americans currently spend less than ten percent of their disposable income on food, as opposed to more than twenty percent in 1950. Pasteurized milk has saved millions of us from outbreaks of campylobacter and E. coli. Our meats and vegetables last longer than ever and, when prepared with a modicum of skill and restraint, taste pretty good too. Why? Because natural is not always better and because science delivers marvelous outcomes.
Unadulterated science, that is. The equation gets a little messier, on the other hand, when incorrigible greed, governmental hypocrisy, and popular indifference (or blind faith) are entered into the calculation. So the new, real-world result for most Americans is, to say the least, less than appetizing: Seventy percent of American calories now come from industrially processed foods, an $850 billion per year venture.
Highly nutritious and generally affordable vegetables, eggs, fresh meats, fruits, beans, and nuts, for example, are commonly forsaken for their obscurely constructed, pre-packaged, and fast-food counterparts. In only the last century or so, says business journalist Melanie Warner, we have acceded to the “most dramatic nutritional shift in human history,” consuming twice the added fats, half the fiber, sixty percent more added sugar, three times the sodium, and immeasurably more corn and soybean product.
In Pandora’s Lunchbox, Warner exposes the “weird science” of food disassembly and reconstruction commonly applied by various food technologists and manufacturers including National Starch, Kraft, Tyson, General Mills, Sysco, and Pepsi. Subway’s Sweet Onion Teriyaki sandwich, for instance, contains 105 ingredients, more than half of which are “dry, dusty substances” added to the meat (13), bread (22), teriyaki glaze (12), and fat-free sweet onion sauce (8). “Eat fresh” indeed!
Yes, corporate food scientists have a lot on their plate. The end product must not only taste good and withstand the heat and physical wear and tear of processing; it must be consistent from package to package and possess an uncannily protracted shelf life. Perhaps most imperatively, however, processed foods need to be cheap, efficiently produced, and at the same time, marketable as “healthy.”
Consider breakfast cereals as one especially egregious example. There are good ones (Cheerios and Corn or Bran Flakes) and bad ones (Froot Loops and Cocoa Puffs), right? Not so much.
One-fifth of all Americans and a whopping one-third of their kids support a $10 billion per annum business nearly every morning. Boxed cereals are creatures of the 20th century and, after beer, wine, cheese, soda, milk, salty snacks, and bread, they are presently the most popular food item in U.S. grocery stores.
In 1905, Will Keith Kellogg first altered the original Corn Flakes recipe to make his product last. He sacrificed the grain's germ and bran because they caused corn and wheat to go rancid. Thus, only the starchy center was left for consumption and, as a result, most of the vitamins and minerals were eliminated as well. Although company scientists later discovered how to deactivate the specific enzymes causing the problem, Kellogg's never restored the more nutritious whole-grain formula.
By the 1960s, many packaged cereals were produced through extrusion machines that cooked any number of ingredients into whatever shapes manufacturers thought average consumers would find appealing. Exceptionally harsh and “nutritionally devastating,” as Warner describes, extruders literally rip food molecules apart and melt the remains under extreme temperatures and pressures in a process called “plasticization.” Vitamins A, B1, C, E, and folate, along with natural antioxidants, fare most dreadfully according to a Texas A&M study published in 2009.
Following extrusion, many cereals are pressure cooked, dried, and toasted at temperatures between 525 and 625 degrees Fahrenheit to ensure resistance to decomposition and an extended shelf life. As such, cereal boxes can line grocery store aisles for many consecutive months prior to purchase. There is a downside, of course: Whatever vitamins might have survived extrusion and cooking will tend to degrade as the products sit.
But the processed food industry devised a solution to that problem too—though not a particularly good one, according to Warner. To compensate for nutritional loss, manufacturers add synthetic vitamins, often two or more times the amount printed on the package. In other words, if the label says consumers get 30 percent of their recommended daily allowance of vitamin C, for instance, the processor may have actually added 75 percent.
But maybe the very definition of "vitamin" is flawed. Since as early as the 1970s, studies have suggested that added synthetic vitamins, as we currently conceive them, might provide little nutritional benefit absent certain phytochemicals that always accompany them in nature—carotenoids, flavonoids, and polyphenols, in particular. Plants use these chemicals to ward off pathogens, and they may benefit us as well by thwarting heart disease and cancer, for example, and even by slowing the aging process. As Warner reports, cereal companies have tried very hard, but so far failed, to conjoin this "complex web of nutrients" into their products.
The average consumer might be surprised to learn how synthetic vitamins are actually constructed. Most are concocted in Chinese factories few Americans would tolerate as neighbors, and few are produced through natural processes. Vitamin D, for instance, requires multiple industrial chemicals to transform sheep grease into the supplement commonly dumped into our milk.
Vitamin B1 starts with coal tar, and vitamin A comes from lemongrass oil and acetone. Vitamin B3 emanates from a waste product in the manufacture of nylon 6,6, a material used to make carpets and vehicle air bags. In fact, the synthetic vitamins closest to actual food sources are C, B2, and B12, produced through genetically modified bacteria and the fermentation of corn derivatives.
None of which is to necessarily imply toxicity, of course. But, again, synthetically derived vitamins may be of little nutritional value when split from their natural complements. Even if manufacturers one day discover how to recombine vitamins and phytochemicals, the effort might be all for naught. As Warner recounts, some scientists believe that only the complete biological environments inherent in fruits and vegetables will suffice. And the addition of sugars and nitrates could cause problems too. In other words, if Americans think they can continue to eat poorly and supplement their way to health, they very likely have another think coming.
So, despite a barrage of patently deceptive advertising to the contrary, breakfast cereals are not good for us. But the converse and more crucial question remains—are processed cereals demonstrably bad? As Warner notes, the industry has long relied on the less than inspired “better than a donut” defense. I suppose that depends on the donut, but, in general terms, certain metabolic facts tend to equate rather than distinguish the two foods.
Our ancestors crushed, milled, and cooked their grains for many centuries before cereal companies assumed control. But the old way’s objective was very different from that of the new. Our forebears labored over bowls and pots in order to gain access to the cereal grain’s well-concealed and highly-prized nutrients. Today, the industry extrudes and “gun puffs” our grains to the extreme point where both digestion and nutrition have become practically irrelevant.
Indeed, much processed food—packaged cereals, most notably—comes to us essentially "predigested." As such, we invite their starches to surge into our bloodstreams, triggering spiked insulin levels and, potentially, insulin resistance, metabolic syndrome, obesity, and type II diabetes. In this specific context, the conventional hypothesis that all calories are created equal is plainly flawed.
Perhaps it’s time to adjust our definition of what constitutes a “normal diet”—in particular, to distinguish it from an “average” or “typical diet.” A normal diet for any given species or population is one that the species or population evolved to consume. The evidence is clear, and I hope compelling: humans did not evolve to receive anything close to seventy percent of their calories from industrially processed foods.