How Wonder Bread Saved Civilization

Before it was the scorn of nutritionists, fortified white bread was a health food breakthrough that helped stop disease and win a war.

Welcome back, Wonder Bread. (Keri Wiginton/Chicago Tribune)

A fillip of news led me down an interesting historical rabbit hole yesterday: “Wonder Bread will soon be back on store shelves in the Chicago area.”

The persistence of Wonder Bread—an Indiana invention—and its pillowy goodness is impressive, even as whole grain has surpassed white bread in sales. The nutritional basis for whole-grain’s rise has been known for ages; over 70 years ago, white bread was considered not just a health issue, but a threat to the Republic. In response, millers met in Chicago and agreed, after a governmental push, to enrich their product, thus buying white bread another three-quarters of a century of dominance—after doing its part to win the War and lift the South out of poverty.

In the early 1900s, a new, now-forgotten epidemic began to surface in American mental hospitals: outbreaks of pellagra, a disease caused by a lack of vitamin B3 that leads to the four D’s of diarrhea, dermatitis, dementia, and death. In 1909, Peoria State Mental Hospital, the northernmost point of the epidemic, reported 45 deaths out of 135 cases. But the causes weren’t known, and the outbreak was followed by an outbreak of pellagraphobia, as the Tribune reported that corn was a potential cause. Or a lack of corn. Or flies; it would take decades to figure out.

Corn was one of the causes, but not in the way people originally thought. The degermination of corn, a Southern diet staple in bread and otherwise, began just before pellagra started to affect hundreds of thousands of Americans. It’s estimated that it killed 150,000 people in the first half of the 20th century; its origins in diet were not conclusively proven and accepted until the 1930s.

Poor Americans were, in essence, starving from a lack of vitamins, and a country that was facing the prospect of a World War could not afford the ill effects of malnutrition. In July 1940, a year and a half before Pearl Harbor, the Brits announced a plan—never realized, except among troops—to fortify their bread with thiamine (vitamin B1). Two months later, the Food and Drug Administration held the “flour hearings” of 1940, which laid out the problem and its potential solutions.

The next month, the Surgeon General set up a meeting in Chicago between the leaders of the country’s milling and baking organizations: a meeting “to find ways and means of restoring to American flour, American bread, the life-giving substances which the staff of life once contained—and more besides,” Reader’s Digest reported in “Supercharged Flour—An Epochal Advance.”

“This new machine-science transformed bread that had once been dark and coarse and strong to bread that was pure and white and dainty,” Paul de Kruif wrote. “The whiter it was the better man liked it. It became the staple food for one-third of all the people of the world.”

“But this super-milling scalped the life from the wheat grain,” de Kruif continues. “Man kept for himself wheat’s whiteness, its starch, its calories—and fed the husks, the germ, the life, to swine. But his body could not use those calories without the vitamins in the husk and the germ he had discarded. So today our pure white flour has lost all but a fraction of the thiamin, the riboflavin, the nicotinic acid that were the secret of the bread’s strength.”

When the FDA reconvened the flour hearings in November, the industry was ready to put the vitamins back. By May of the next year, the guidelines were set—including the word “enriched,” which was chosen over “restored,” so as not to give the game away. By 1943, 75 percent of bread in New York was enriched, up from virtually none a few years prior.

It was, in fact, a wonder. Pellagra essentially disappeared overnight.

Approximately 16,000 inmates of the House of Correction of the City of Chicago (56 per cent of whom were alcoholics) were screened in 1948-49 for classical nutritional deficiency syndromes; 451 newly admitted alcoholics were given careful nutritional examinations during the “pellagra season,” and detailed, serial, physiological, and biochemical observations were made on 24 selected alcoholics before and during therapy.

Among the 451 newly admitted inmates 23 per cent were grossly underweight…. Among the 24 selected alcoholics, 39 per cent were grossly underweight…. The admission fasting hour excretion of thiamine and riboflavin by these selected men ranged from 0.7 to 20 and 1.7 to 164 µg./hr., respectively. These levels of excretion of thiamine and riboflavin agreed closely with those reported for active healthy young men.

Among all these men were found only 2 with pellagra, 1 with possible beriberi, 3 with florid ariboflavinosis, 1 with Wernicke’s encephalopathy, and 7 with possible nutritional polyneuropathy. No cases of shipboard scurvy, xerophthalmia, or gross phrynoderma were detected. In all, 2.2 per cent of the men were judged to have clinical evidence of avitaminosis.

…Data from various agencies were analyzed and field observations were made in the “Skid Row” area of Chicago. The end results indicated no significant changes in eating habits, economic status, or alcoholic consumption of the Chicago alcoholic. Vitamin pills and nutrition education have passed him by. The science of fermentation and distillation of spiritous liquors is traditional and liquors are not fortified with vitamins.

The only innovation since 1938 which bears on the alcoholic’s nutritional status has been vitamin enrichment of bread, started in Chicago in 1940-41. Alcoholic pellagra virtually disappeared from Cook County Hospital in 1942-43 when niacin, for flour enrichment, was first made by the ton. The alcoholic eats mainly fortified bread, and we conclude that this food habit has been the most significant factor contributing to the present surprising lack of avitaminosis among alcoholics.

Was it good? No, absolutely not. It was so bad that, as Aaron Bobrow-Strain documents, the USDA actually set up a program in the mid-’50s, in that idea of heartland America, Rockford, to develop a less terrible white bread. But people ate it anyway: fortified white bread was the bread that saved civilization (and people who ate baguettes), after all.

Whether you looked at Better Homes and Gardens, Sunset, or Harper’s Bazaar, homemaker advice columns in small-town newspapers, or the more lofty New York Times food section, it would have been hard to find anything good said about the taste of industrial white bread. In a steady stream of newspaper articles, letters to the editor from housewives, and popular magazine features, industrial white bread was described as “cottony fluff,” “cotton batting,” “fake,” “inedible,” “limp,” and “hot air.”

[snip]

…health and vigor sold bread far more than taste or freshness did. Tellingly, 98 percent of Rockford housewives in the USDA study believed that, despite its many flaws, industrial white bread was highly nutritious—a strong and vigorous food.

In contrast to what we now associate with Wonder Bread, Rockfordians choked down that marshmallowy concoction because it was a health food. It wasn’t until the 1980s that what had long been known about milling, degermination, and nutrients expressed itself as a market good, something Bobrow-Strain traces to the buying power and health consciousness of the upper-middle class in the Reagan era. (At the end of Bright Lights, Big City, the Gatsby of its era, the protagonist literally purifies himself and begins his redemption with fresh-baked bread.)

In a generation, the wonder bread that had eliminated a third-world problem in the American South and gained popularity as a symbol of America’s post-war triumph turned into its opposite: a marker of poverty, ill health, and bad taste, a consensus broken over bread.
