By Attic Recipes

How Industrialization Changed What Is in Your Food

Between 1900 and 1980, the food supply changed more than in the previous thousand years. Here is what changed, how it happened, and what the evidence shows.

A Transformation Without a Single Cause

The food that most people in industrialized countries eat today would be largely unrecognizable to a cook working from an early twentieth-century recipe collection. Not in appearance — much of it is designed to look familiar — but in composition. The ingredients, the processes used to create them, and the proportion of the diet they represent have changed fundamentally over the course of the twentieth century.

This did not happen because of a single decision or a single technology. It happened as a series of overlapping developments — in chemistry, in agriculture, in refrigeration, in retail, in regulation — that accumulated into a transformation of the food supply that has no historical precedent in its speed or scale.

Understanding what changed, when, and why is not a nostalgic exercise. It is the necessary context for understanding why old recipes work the way they do, and why many of the ingredients they assume as a baseline are no longer the default.


The First Wave: Canning, Margarine, and Shelf Stability (1860s–1930s)

Industrial food processing began in the second half of the nineteenth century, driven primarily by military logistics and the need to feed growing urban populations that were increasingly disconnected from food production.

Canning, developed in the early 1800s and industrialized after the American Civil War, made it possible to preserve food for months or years without salt, smoke, or fermentation. It was a genuine technological achievement that also changed how food tasted and behaved: canned vegetables are softer, more uniform, and nutritionally different from fresh ones, with heat-sensitive vitamins reduced during processing.[1]

Margarine, invented in France in 1869 as a cheaper substitute for butter, was originally made from beef tallow and skimmed milk. It bore little resemblance to the partially hydrogenated vegetable oil margarines that became common in the mid-twentieth century. Its introduction established the principle — which would become central to food industrialization — that manufactured substitutes for whole foods could be sold on price and shelf stability grounds regardless of compositional differences.

By the 1930s, when the cookbook that underlies this site was written, these technologies existed but had not yet displaced traditional food at the household level in Central Europe. The cookbook assumes whole ingredients because those were still what most households worked with.


Hydrogenation: The Fat That Changed Everything (1900s–1960s)

The single chemical process with the largest impact on the twentieth-century food supply was hydrogenation — the addition of hydrogen to unsaturated vegetable oils under pressure in the presence of a metal catalyst, converting liquid oils into solid or semi-solid fats.
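
In schematic form, the reaction adds hydrogen across a carbon-carbon double bond. The sketch below is deliberately simplified — nickel is the typical catalyst, and the conditions are indicated only loosely, not as a process specification:

```latex
% Full hydrogenation: H2 adds across a C=C double bond,
% turning an unsaturated fatty-acid chain into a saturated one.
\mathrm{R{-}CH{=}CH{-}R'} \;+\; \mathrm{H_2}
  \;\xrightarrow{\ \text{Ni catalyst, heat, pressure}\ }\;
  \mathrm{R{-}CH_2{-}CH_2{-}R'}

% Partial hydrogenation stops short of saturating every bond;
% some surviving double bonds flip from the natural cis geometry
% to the trans geometry.
\textit{cis-}\mathrm{R{-}CH{=}CH{-}R'}
  \;\xrightarrow{\ \text{incomplete hydrogenation}\ }\;
  \textit{trans-}\mathrm{R{-}CH{=}CH{-}R'}
```

The second line is where the trans fats discussed below come from.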

Patented in 1902 and commercially applied from around 1911 (Crisco was the first major consumer product), hydrogenation solved two problems for food manufacturers: it made cheap vegetable oils — cottonseed oil, later soybean oil — behave like solid animal fats at room temperature, and it dramatically extended shelf life by reducing the oxidation that makes liquid oils go rancid.[2]

The health implications took decades to become clear. Partial hydrogenation produces trans fatty acids — geometric isomers of naturally occurring unsaturated fats — that are not found in significant quantities in natural foods. Research beginning in the 1990s established that industrial trans fats raise LDL cholesterol, lower HDL cholesterol, and increase cardiovascular disease risk more than any other dietary fat, including saturated fat.[3]

The regulatory response was slow. Denmark banned industrial trans fats in 2003. The United States FDA declared partially hydrogenated oils no longer Generally Recognized As Safe only in 2015, with full compliance required by 2020.[4] During the intervening decades — roughly 1950 to 2000 — partially hydrogenated vegetable oils were a dominant ingredient in processed foods across the industrialized world, present in margarine, baked goods, fried foods, crackers, and many other products, often replacing the animal fats that earlier generations had used without apparent harm.

The irony is precise: the replacement of traditional animal fats with industrially processed vegetable fats — promoted as a health improvement — produced a dietary fat with demonstrably worse health outcomes. The animal fats in old recipes were displaced by a technology whose risks were not understood at the time of its adoption.


Refined Seed Oils: A Different Problem (1950s–present)

Separate from hydrogenation, the widespread adoption of refined seed oils — soybean, corn, sunflower, cottonseed — as cooking and manufacturing fats represents a different change in the fat composition of the diet.

These oils are high in polyunsaturated fatty acids, particularly omega-6 linoleic acid. Omega-6 consumption increased dramatically in industrialized countries over the twentieth century, while omega-3 consumption remained relatively stable or declined. The result is a shift in the omega-6 to omega-3 ratio from an estimated historical range of roughly 1:1 to 4:1 to ratios of 15:1 to 20:1 in typical modern Western diets.[5]
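
The ratio itself is simple arithmetic: total omega-6 intake divided by total omega-3 intake. A minimal sketch, with intake figures that are purely illustrative rather than measured dietary data:

```python
# Hypothetical daily intakes in grams -- illustrative values only,
# not measured dietary data.
omega6_g = 17.0  # mostly linoleic acid from seed oils and processed foods
omega3_g = 1.1   # alpha-linolenic acid plus EPA/DHA

ratio = omega6_g / omega3_g
print(f"omega-6 : omega-3 = {ratio:.1f} : 1")  # 15.5 : 1
```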

Whether this shift has health consequences is an area of active research and genuine scientific debate. Some studies associate high omega-6 intake with increased inflammation markers; others do not find this effect at typical dietary levels. This is contested territory and should be understood as such — not as settled science in either direction.[6]

What is not contested is that the fat composition of the food supply changed substantially and rapidly in the twentieth century, in ways that have no precedent in prior human dietary history.


Sugar: Quantity and Form (1800s–present)

Sugar consumption in Europe and North America has increased continuously since the early 1800s, when the industrialization of sugar production from beet and cane made it cheap enough for general household use. Per capita sugar consumption in the United Kingdom increased approximately tenfold between 1815 and 1970.[7]

The more recent development, beginning in the 1970s, is the introduction of high-fructose corn syrup (called glucose-fructose syrup in Europe), a liquid sweetener produced from corn starch by enzymatic conversion. It is cheaper than sucrose, easier to handle in industrial food production because it is pumped and metered as a liquid, and comparable to sucrose in sweetness in its common formulations. It became the dominant sweetener in soft drinks and many processed foods in North America, and a significant ingredient in processed foods globally.[8]
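
In outline, the enzymatic route looks like this — a simplified schematic of the standard process, naming the usual enzymes rather than specifying any particular plant:

```latex
% Simplified schematic of high-fructose corn syrup production.
\mathrm{corn\ starch}
  \;\xrightarrow{\ \alpha\text{-amylase, glucoamylase}\ }\;
  \mathrm{glucose\ syrup}
  \;\xrightarrow{\ \text{glucose isomerase}\ }\;
  \mathrm{glucose/fructose\ syrup\ (HFCS\text{-}42)}
```

The 42%-fructose output can be enriched further; the 55%-fructose grade is the one commonly used in soft drinks.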

The health debate around high-fructose corn syrup specifically, versus sucrose, is ongoing and not fully resolved — the metabolic difference between the two at typical consumption levels remains contested. What is clearer is the trajectory: total added sugar in the diet increased substantially across the twentieth century, and the sources shifted from household sugar used in home cooking to sugar added invisibly during industrial food manufacturing, making it harder for individuals to track or control intake.

Old recipes use sugar as a deliberate ingredient in specific quantities for specific purposes. Ultra-processed foods contain added sugar as a manufacturing input in products where consumers do not expect it — bread, sauces, cured meats, condiments — at levels that accumulate across a day’s eating without being legible as “eating sugar.”
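
To make the accumulation concrete, here is a toy tally. The per-serving gram figures are hypothetical label values invented for illustration, not measurements of real products:

```python
# Hypothetical added sugar per serving, in grams -- illustrative only.
added_sugar_g = {
    "sliced sandwich bread (2 slices)": 4.0,
    "pasta sauce (1 serving)": 7.0,
    "cured ham (2 slices)": 2.0,
    "ketchup (2 tbsp)": 8.0,
}

total = sum(added_sugar_g.values())
print(f"added sugar from 'non-sweet' foods: {total:.0f} g")  # 21 g

# For scale: a level teaspoon of sugar is about 4 g.
print(f"about {total / 4:.0f} teaspoons, none of it eaten as 'sugar'")
```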


The NOVA Classification and Ultra-Processed Food (2009–present)

The most useful framework for understanding the transformation of the food supply is the NOVA classification, developed by Carlos Monteiro and colleagues at the University of São Paulo, first published in 2009 and refined subsequently.[9]

NOVA classifies foods into four groups based on the extent and purpose of processing, not nutrient content (a toy lookup sketch in code follows the list):

Group 1 — unprocessed or minimally processed foods: fresh, dried, or frozen meat, fish, vegetables, fruit, eggs, plain milk, plain yogurt, legumes, grains, nuts.

Group 2 — processed culinary ingredients: salt, sugar, oils, butter, flour, starch — substances extracted from Group 1 foods and used in cooking.

Group 3 — processed foods: products made by adding Group 2 ingredients to Group 1 foods — canned fish, cheese, cured meats, bread made with flour, salt, water and yeast.

Group 4 — ultra-processed foods: industrial formulations containing ingredients not found in home kitchens — hydrolyzed proteins, modified starches, emulsifiers, flavor enhancers, colorants, artificial sweeteners, interesterified fats — combined with Group 2 ingredients, with little or no Group 1 content.
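
The toy lookup below illustrates the scheme as a mapping from food to group. The assignments are simplified illustrations, not an official NOVA classification table:

```python
# Toy NOVA lookup -- group assignments are simplified illustrations,
# not an official classification table.
NOVA_GROUPS = {
    1: "unprocessed or minimally processed food",
    2: "processed culinary ingredient",
    3: "processed food",
    4: "ultra-processed food",
}

EXAMPLE_FOODS = {
    "fresh carrots": 1,
    "plain yogurt": 1,
    "butter": 2,
    "flour": 2,
    "bread (flour, water, salt, yeast)": 3,
    "canned fish": 3,
    "soft drink (HFCS, colorants, flavorings)": 4,
    "margarine (emulsifiers, interesterified fats)": 4,
}

def nova_group(food: str) -> str:
    """Return the NOVA group label for a food in the toy table."""
    group = EXAMPLE_FOODS.get(food)
    if group is None:
        raise KeyError(f"no NOVA assignment recorded for {food!r}")
    return f"Group {group}: {NOVA_GROUPS[group]}"

print(nova_group("butter"))  # Group 2: processed culinary ingredient
```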

The distinction matters because a growing body of epidemiological research associates ultra-processed food consumption with increased risk of obesity, type 2 diabetes, cardiovascular disease, depression, and all-cause mortality, independent of nutrient content.[10] The mechanism is not fully understood — it may involve the additives, the food matrix, the displacement of whole foods, or combinations of these — but the association across multiple large cohort studies is consistent enough to be taken seriously.

A 2019 study in The BMJ following 105,159 French adults found that a 10% absolute increase in the proportion of ultra-processed food in the diet was associated with significant increases in cardiovascular disease risk.[11] A 2024 umbrella review in The BMJ synthesizing evidence across 45 meta-analyses found consistent associations between ultra-processed food consumption and adverse health outcomes across multiple disease categories.[12]
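
For readers unused to this phrasing, "per 10% increase" estimates come from models that are log-linear in the exposure, so the hazard ratio compounds across increments. A sketch with a hypothetical hazard ratio — the value below is invented for illustration and is not taken from the cited studies:

```python
# How a per-increment hazard ratio compounds under a log-linear model.
# hr_per_10pts is HYPOTHETICAL -- see the cited studies for actual
# estimates and their confidence intervals.
hr_per_10pts = 1.10  # hazard ratio per 10-percentage-point increase

for k in (1, 2, 3):
    print(f"+{10 * k} points of diet share -> HR = {hr_per_10pts ** k:.2f}")
# +10 -> 1.10, +20 -> 1.21, +30 -> 1.33
```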

These are associations, not proven causal mechanisms. But the consistency across independent studies in different populations is notable.


What Old Recipes Assumed That No Longer Exists

A 1930s recipe assumes certain things about its ingredients that are no longer automatically true.

It assumes that butter is butter: cream, cultured or not, churned. Today, products labeled “butter blend” or sold alongside butter may contain added vegetable oils or emulsifiers. It assumes that bread is flour, water, salt, yeast, and possibly fat. Industrial bread typically contains a range of additives — emulsifiers, enzymes, preservatives, sometimes added sugar — that extend shelf life and modify texture in ways home baking does not require. It assumes that lard is rendered pork fat. Commercial lard has often been partially hydrogenated or otherwise treated for shelf stability.

This is not alarmism. It is a factual description of how ingredient categories have changed. When following old recipes with the intention of replicating what they produced, ingredient sourcing matters in ways that would not have occurred to the original cook, because the composition of basic ingredients could then be taken for granted in a way it no longer can.

The most direct path to cooking from old recipes as intended is to use ingredients from Group 1 and Group 2 of the NOVA classification — whole or minimally processed foods, and basic culinary ingredients. This is not a dietary ideology. It is an ingredient specification.


This post reflects current scientific understanding as of the publication date. The health effects of specific dietary changes remain an area of active research. Where findings are contested or preliminary, this is noted in the text. This post does not constitute medical or nutritional advice.


Sources

  1. Rickman, J.C., Barrett, D.M. & Bruhn, C.M. (2007). Nutritional comparison of fresh, frozen and canned fruits and vegetables. Journal of the Science of Food and Agriculture, 87(6), 930–944. https://doi.org/10.1002/jsfa.2824

  2. Eckel, R.H. et al. (2007). Americans’ awareness, knowledge, and behaviors regarding fats. Journal of the American Dietetic Association, 107(3), 415–422. https://doi.org/10.1016/j.jada.2006.12.008

  3. Mozaffarian, D., Katan, M.B., Ascherio, A., Stampfer, M.J. & Willett, W.C. (2006). Trans fatty acids and cardiovascular disease. New England Journal of Medicine, 354(15), 1601–1613. https://doi.org/10.1056/NEJMra054035

  4. U.S. Food and Drug Administration (2018). Final determination regarding partially hydrogenated oils. Federal Register, 83 FR 23358. https://www.federalregister.gov/documents/2018/05/21/2018-10714/final-determination-regarding-partially-hydrogenated-oils

  5. Simopoulos, A.P. (2002). The importance of the ratio of omega-6/omega-3 essential fatty acids. Biomedicine & Pharmacotherapy, 56(8), 365–379. https://doi.org/10.1016/S0753-3322(02)00253-6

  6. Ramsden, C.E. et al. (2013). Use of dietary linoleic acid for secondary prevention of coronary heart disease and death. BMJ, 346, e8707. https://doi.org/10.1136/bmj.e8707

  7. Mintz, S.W. (1985). Sweetness and Power: The Place of Sugar in Modern History. Viking Penguin, New York.

  8. White, J.S. (2008). Straight talk about high-fructose corn syrup: what it is and what it ain’t. American Journal of Clinical Nutrition, 88(6), 1716S–1721S. https://doi.org/10.3945/ajcn.2008.25825B

  9. Monteiro, C.A. et al. (2019). Ultra-processed foods: what they are and how to identify them. Public Health Nutrition, 22(5), 936–941. https://doi.org/10.1017/S1368980018003762

  10. Monteiro, C.A. et al. (2018). Household availability of ultra-processed foods and obesity in nineteen European countries. Public Health Nutrition, 21(1), 18–26. https://doi.org/10.1017/S1368980017001379

  11. Srour, B. et al. (2019). Ultra-processed food intake and risk of cardiovascular disease. BMJ, 365, l1451. https://doi.org/10.1136/bmj.l1451

  12. Lane, M.M. et al. (2024). Ultra-processed food exposure and adverse health outcomes: umbrella review of epidemiological meta-analyses. BMJ, 384, e077310. https://doi.org/10.1136/bmj-2023-077310
