Essays on the Development of Early Food Service Establishments

Looking for information on the origins and evolution of classic French menus?
We are often asked to provide the sequence of items comprising the "Classic 12 course French menu." The classic French style of menu making and courses is complicated. The number of courses, and the number of dishes served at each course, are period and meal dependent. Our research confirms "classic" meals generally offer 4 to 8 courses. Examples of 12 course menus are rare, perhaps suggesting they are not "standard" at all. Note: the number of covers does not mean the number of courses.

"Composition of the Classic Meal...formal meals consisted of several 'courses'--usually there or four but at times five or more--each composed of several dishes brought to the table at the same time. Here is how A.-B.-L. Grimod de La Reyniere describes such a meal in his 1805 Almanach des gourmands: 'An important dinner normally comprises four courses. The first consists of soups, hors d'oeuvres, releves, and entrees; the second, of roasts and salads; the third of cold pasties and various entremets; and lastly, the fourth, of desserts including fresh and stewed fruit, cookies, macaroons, cheeses, all sorts of sweetmeats, and petits fours typically presented as part of a meal, as well as preserves and ices.' In describing the different courses, Grimod de la Reyniere puts different types of dishes in the same category. Some are defined by aspect and mode of preparation...Others are defined by their position and function in the sequence..."
---Arranging the Meal: A History of Table Service in France, Jean-Louis Flandrin, translated by Julie E. Johnson [University of California Press:Berkeley] 2007 (p. 3-4)

"Many nineteenth-century authors suggested or justified a reduction in the number of courses and dishes. We have seen that between the sixteenth century and the seventeenth, fewer course came to be served at aristocratic tables. But their number was far from fixed in the seventeenth and eighteenth centuries. Important dinner... usually had five courses--soups, entrees, roasts, entremets, and dessert...Menon's Cuisiniere bougreoise, published in 1746, offers one three course menu and two four-course menus, which also differ in how the courses are distributed."
---Arranging the Meal (p. 95)

Details on the courses served in Grimod's period
"A contemporary grand dinner was composed of four services: the first was made up of soups, hors d'oeuvres, releves and entrees. There might also be a visit from some savoury flying saucer or assiette volante, i.e. things that must be eaten as soon as they are taken off the spit, out of the oven or off the hob; Grimod gives these examples: 'minute' cutlets, steaks, chicken croquettes, ortolans and other little birds on skewers, little pates, cheese ramequins or any form of souffle. The second service comprised of roasts and salads, with the obligatory groses pieces decorating the ends of the table. In general, these remained untouched, for they were more to please the eye than the appetite and could be anything from a vast mille-feuille to a Nerac terrine, a heap of crayfish or a blue carp. The third service involved cold pates and entremets, either sweet or savory... The final service was our modern dessert, with fruits, compotes, jams, biscuits, macaroons, cheeses, petits fours and sweets as well as ices. At a large, formal dinner, the first service could contain anything up to a hundred dishes. In general, a colour, either white or brown, predominated...This colour consideration became universal in nineteenth-century cooking."
---A Palate in Revolution: Grimod de La Reyniere and the Almanach des Gourmands, Giles Macdonogh [Robin Clark:London] 1987 (p. 114)

5 course 17th century French menu
"Under Louis XIV, the menus were magnificent. Doubtless, not all the dishes which figured in the five obligatory courses which made up the gala banquets were perfectly executed, nor were they as variet as they should have been. Nevertheless, there were many of them, if one may judge from the menu of the dinner offered by Mme. la Chanceliere to Louis XIV in 1656 at her Chateau of Pontchartrain...
First course: Eight potted meats and vegetables and sixteen hot hors-d'oeuvre.
Second course: Eight important intermediate dishes called broths. Sixteen entrees of fine meats.
Third course: Eight roast dishes and sixteen vegetable dishes cooked in meat stock.
Fourth course: Eight pates or cold meat and fish dishes and sixteen raw salads, with oil, cream and butter.
Fifth and last course: Twenty-four different kinds of pastries--twenty-four jars of raw fruit--twenty-four dishes of sweetmeats--preserves, dried and in syrup and jams.
There were, in all, 168 garnished dishes or plates, not counting the various foodstuffs served as dessert."
---Larousse Gastronomique, Prosper Montagne [Crown Publishers:New York] 1961 (p. 618)

7 course 19th century French menu
"Soup
The Remove (any combination of meat, game, fish and poultry is permissible)
The Entree (meat, sweetbreads, poultry, fish)
First Entremets (croque-en-bouche, small fish, pate, etc.)
The Roast (centerpiece of the meal)
Second Entremets (cooked vegetables, fruit)
Dessert (cakes, pastries, etc.)"
---Art of Eating in France: Manners and Menus in the Nineteenth Century, Jean-Paul Aron [Harper & Row:New York] 1973 (p. 111-114)

16 course American menu, 1883: I & II

9 course 19th century French menu served to Queen Victoria

12 course American menu, 1898
"The order of the dinner service is soup, fish, flesh, fowl. These may be supplemented to any extent with entrements and entrees. Mets are the principal dishes. Entremnents, the dishes served between the mets. Entrees, dishes which are served between any of the courses.
First Course: Canapes of caviare, with small bits of anchovy toast, or in their season muskmelons, are sometimes served as the first course, but ordinarily oysters or clams on the half shell is the first dish presented. The smallest-sized shell-fish are preferable to the large ones. One half dozen are served on each plate and placed symmetrically on or around a bed of cracked ice; a quarter of a lemon cut lengthwise is placed in the center. Cayenne pepper and grated horse-radish are passed with this sauce, also very thin slices of brown bread buttered and folded together, then cut into small squares or triangular-shaped pieces. The plates holding the shell-fish may be placed on the table before dinner is announced; but as there is no place to conveniently lay the folded napkin except on the plate, it is as well not to serve the mollusks until the guests are seated.
Second Course: Soup

Introduction

The world has progressed through hunter–gatherer, agricultural, and industrial stages to become a provider of goods and services. This progression has been catalyzed by the cultural and social evolution of mankind and the need to solve specific societal issues, such as the need for food preservation to free people from continual foraging, and the need for adequate nutrition via a consistent year-round food supply. These forces led to the development of the food industry, which has contributed immensely to the basis for a healthy human civilization and helped society prosper and flourish (Lund 1989).

Development of food science and technology

According to Harvard Univ. biological anthropologist Richard Wrangham, food processing was launched about 2 million years ago by a distant ancestor who discovered cooking, the original form of food processing (Wrangham 2009). Later, but still during prehistoric times, cooking was augmented by fermenting, drying, preserving with salt, and other primitive forms of food processing, which allowed groups and communities to form and survive. Humans thus first learned how to cook food, then how to transform, preserve, and store it safely. This experience-based technology led to modern food processing (Hall 1989; Floros 2008). Much later, the domestication of plants and land cultivation became widespread, and at the end of the last Ice Age, humans revolutionized eating meat by domesticating animals for food. Thus, plant and animal agriculture also contributed to improving the human condition.

Study of every ancient civilization clearly shows that throughout history humans overcame hunger and disease, not only by harvesting food from a cultivated land but also by processing it with sophisticated methods. For example, the 3 most important foods in Ancient Greece—bread, olive oil, and wine—were all products of complicated processing that transformed perishable, unpalatable, or hardly edible raw materials into safe, flavorful, nutritious, stable, and enjoyable foods (Floros 2004).

Today, our production-to-consumption food system is complex, and our food is largely safe, tasty, nutritious, abundant, diverse, convenient, and less costly and more readily accessible than ever before. This vast food system includes agricultural production and harvesting, holding and storing of raw materials, food manufacturing (formulation, food processing, and packaging), transportation and distribution, retailing, foodservice, and food preparation in the home. Contemporary food science and technology contributed greatly to the success of this modern food system by integrating biology, chemistry, physics, engineering, materials science, microbiology, nutrition, toxicology, biotechnology, genomics, computer science, and many other disciplines to solve difficult problems, such as resolving nutritional deficiencies and enhancing food safety.

The impact of modern food manufacturing methods is evident in today's food supply. Food quality can be maintained or even improved, and food safety can be enhanced. Sensitive nutrients can be preserved, important vitamins and minerals can be added, toxins and antinutrients (substances such as phytate that limit bioavailability of nutrients) can be removed, and foods can be designed to optimize health and reduce the risk of disease. Waste and product loss can be reduced, and distribution around the world can be facilitated to allow seasonal availability of many foods. Modern food manufacturing also often improves the quality of life for individuals with specific health conditions, offering modified foods to meet their needs (for example, sugar-free foods sweetened with an alternative sweetener for people with diabetes).

Examples of disciplines integrated in modern food science and technology, and their contributions to the food system:

Biology, Cell Biology: Understanding of postharvest plant physiology, food quality, plant disease control, and microbial physiology; food safety
Biotechnology: Rice with increased content of beta-carotene (vitamin A precursor); enzymes for cheesemaking, breadmaking, and fruit juice manufacture
Chemistry: Food analysis, essential for implementing many of the applications listed here; improved food quality; extended shelf life; development of functional foods (foods and food components providing health benefits beyond basic nutrition)
Computer Science: Food manufacturing process control, data analysis
Genomics: Understanding of plant and animal characteristics; improved control of desirable attributes; rapid detection and identification of pathogens
Materials Science: Effective packaging; understanding of how materials properties of foods provide structure for texture, flavor, and nutrient release
Microbiology: Understanding of the nature of bacteria (beneficial, spoilage, and disease-causing microorganisms), parasites, fungi, and viruses, and developments and advances in their detection, identification, quantification, and control (for example, safe thermal processes for commercial sterilization); hygiene; food safety
Nutrition: Foods fortified with vitamins and minerals for health maintenance; functional foods for addressing specific health needs of certain subpopulations; development of diets that match human nutrient requirements; enhanced health and wellness
Physics, Engineering: Efficient food manufacturing processes to preserve food attributes and ensure food safety; pollution control; environmental protection; waste reduction efforts
Sensory Science: Understanding of chemosenses (for example, taste and odor) to meet different flavor needs and preferences
Toxicology: Assessment of the safety of chemical and microbiological food components, food additives

Controversies about processed foods

Although today the public generally embraces and enjoys key benefits of the food supply—value, consistency, and convenience—some suggest that the cost to society of obtaining these benefits is too high. Negative perceptions about “processed foods” also exist, especially among consumers in the United States. A range of factors contributes to these perceptions: uneasiness with technology; a low level of science literacy; labeling and advertising that have at times taken advantage of food additive or ingredient controversies; the perception that involuntary risks are less acceptable than voluntary ones; and the high level of food availability (Slovic 1987; Clydesdale 1989; Hall 1989). Other factors contributing to negative public perceptions about processed foods include the increasing prevalence of obesity in many industrialized or developed countries, the use of chemicals in production or additives in foods, little personal contact between consumers and the agricultural and food manufacturing sectors, food safety issues, and concern that specific ingredients (particularly salt) may contribute to illnesses or impact childhood development (Schmidt 2009).

Some books on food in the popular press have implied that the food industry has incorrectly applied the knowledge of food science and technology to develop processed foods that result in poor dietary habits. The premise of some critics of processed foods is that knowledge of chemistry and the physical properties of food constituents allows the food industry to make processed foods that result in overeating and cause the general population to abandon whole foods. The argument is stretched further to suggest that the development of processed foods is responsible for promoting bad eating habits and is the cause of chronic disease. Such an argument is specious, because personal preferences, choice, willpower, and lifestyle factor into the decision of what and how much to eat. The challenge surrounding the connection between lifestyles and health (that is, diet and chronic disease) is discussed in the next section of this review.

The population challenge

During the 2009 World Summit on Food Security, it was recognized that by 2050 food production must increase by about 70% to feed the anticipated 9 billion people—a population roughly 34% larger than today's (FAO 2009a). This projected increase is expected to involve an additional annual consumption of nearly 1 billion metric tons of cereals for food and feed and 200 million metric tons of meat.
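Reading the two figures together: the 34% describes population growth, while the 70% describes the required growth in food production, implying rising per-capita demand. The arithmetic can be checked with a short sketch (the 6.8 and 9.1 billion figures are rounded assumptions for illustration, not FAO data points beyond those quoted above):

```python
# Quick arithmetic check of the projection discussed above.
# Population figures are rounded assumptions, not FAO data points.
current_population = 6.8e9    # approximate world population in 2009
projected_population = 9.1e9  # anticipated 2050 population

population_growth = projected_population / current_population - 1
print(f"population growth: {population_growth:.0%}")  # ~34%

# Production is projected to rise ~70%, about twice the population growth,
# reflecting rising per-capita demand (for example, more meat and dairy
# as incomes grow).
production_growth = 0.70
per_capita_change = (1 + production_growth) / (1 + population_growth) - 1
print(f"implied per-capita production change: {per_capita_change:.0%}")  # ~27%
```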

Another challenge is the large, growing food security gap in certain places around the world. As much as half of the food grown and harvested in underdeveloped and developing countries never gets consumed, partly because proper handling, processing, packaging, and distribution methods are lacking. Starvation and nutritional deficiencies in vitamins, minerals, protein, and calories are still prevalent in all regions of the world, including the United States. As a consequence, science-based improvements in agricultural production, food science and technology, and food distribution systems are critically important to decreasing this gap.

In addition, energy and resource conservation is becoming increasingly critical. To provide sufficient food for everyone in a sustainable and environmentally responsible manner, without compromising our precious natural resources, agricultural production must increase significantly from today's levels and food manufacturing systems must become more efficient, use less energy, generate less waste, and produce food with extended shelf life.

Although scientific and technological achievements in the 20th century made it possible to solve nutritional deficiencies, address food safety and quality, and feed nearly 7 billion people, further advancements are needed to resolve the challenges of sustainably feeding the growing future population in industrialized and developing nations alike. In fact, to meet the food needs of the future, it is critically important that scientific and technological advancements be accelerated and applied in both the agricultural and the food manufacturing sectors.

Achievements and promises

The next section of this review, “Evolution of the Production-to-Consumption Food System,” summarizes the parallel developments of agriculture and food manufacturing from the beginnings of modern society (the Neolithic revolution) to the present; it also addresses the current diet and chronic disease challenge. The subsequent section, “Food Processing: A Critical Element,” explains why food is processed and details the various types of food processing operations that are important for different food manufacturing purposes. Then the following section, “Looking to the Future,” outlines suggestions to improve our food supply for a healthier population, and briefly discusses the various roles that researchers, consumers, the food industry, and policy makers play in improving the food supply for better health; it also addresses the promises that further advancements and application of technologies in the food system hold for the future.

Evolution of the Production-to-Consumption Food System

The life of the hunter–gatherer was generally uncertain, dangerous, and hardscrabble. Thomas Hobbes, in his Leviathan (1651), described life in those times as “the life of man in a state of nature, that is, solitary, poor, nasty, brutish, and short.” Agriculture transformed that existence by making available a far larger and generally more reliable source of food, in large part through domestication and improvement of plants and animals.

Domestication leads to civilization

Domestication is the process of bringing a species under the control of humans and gradually changing it through careful selection, mating, and handling so that it is more useful to people. Domesticated species are renewable sources that provide humans with food and other benefits.

At the end of the last Ice Age, humans domesticated plants and animals, permitting the development of agriculture, producing food more efficiently than in hunter-gatherer societies, and improving the human condition. Domestication did not appear all at once, but rather over a substantial period of time, perhaps hundreds of years. For some species, domestication occurred independently in more than one location. For animals, the process may have begun almost accidentally, as by raising a captured young animal after its mother had been killed and observing its behavior and response to various treatments. Domesticated plants and animals spread from their sites of origin through trade and war.

The domestication of plants and animals occurred primarily on the Eurasian continent (Smith 1998). A prominent early site was in the Middle East, the so-called Fertile Crescent, stretching from Palestine to southern Turkey, and down the valleys of the Tigris and Euphrates Rivers, where barley, wheat, and lentils were domesticated as early as 10000 y ago and sheep, goats, cattle, and pigs were domesticated around 8000 y ago. Rice, millet, and soy were domesticated in East Asia; millet, sorghum, and African rice in sub-Saharan Africa; potato, sweet potato, corn (maize), squash, and beans in the Americas; Asiatic (water) buffaloes, chickens, ducks, cattle, and pigs in the Indian subcontinent and East Asia; pigs, rabbits, and geese in Europe; and llamas, alpacas, guinea pigs, and turkeys in the Americas.

The introduction of herding and farming was followed by attempts to improve the wild varieties of plants and animals that had just been domesticated. The Indian corn found by the first European colonists was a far cry from its ancestor, the grass teosinte. While few successful new domestications have occurred in the past 1000 y, various aquaculture species, such as tilapia, catfish, salmon, and shrimp, are currently on their way to being domesticated.

Although the primary goal of domestication (ensuring a more stable, reliable source of animal and plant foods) has not fundamentally changed, the specific goals have become highly specialized over time. For example, we now breed cattle for either beef or dairy production, and cattle and hogs for leaner meat. We breed chickens as either egg layers or broilers. In addition, selection for increased efficiency of producing meat, milk, and eggs is prominent in today's agriculture, as discussed later in this section.

Agriculture, built on the domestication of plants and animals, freed people from the all-consuming task of finding food and led to the establishment of permanent settlements. What we know as civilization—cities, governments, written languages, an expanding base of knowledge, improved health and life span, the arts—was only possible because of agriculture. Along with domestication of plants and animals, people began the journey of discovery of methods to extend the useful life of plant and animal food items so that nourishment could be sustained throughout the year. With a fixed (nonnomadic) population also came primitive food storage and, with that, improvements in food safety and quality.

In July 2009, an important discovery suggested that the paramount importance of food security was recognized very early. Kuijt and Finlayson (2009) reported what they believe to be several granaries in Jordan dating to about 11000 y ago, suggesting that populations understood the importance of a dependable food supply even before the domestication of plants. The authors further suggested that “Evidence for PPNA (Pre-Pottery Neolithic Age) food storage illustrates a major transition in the economic and social organization of human communities. The transition from economic systems based on collecting and foraging of wild food resources before this point to cultivation and foraging of mixed wild and managed resources in the PPNA illustrates a major intensification of human-plant relationships.” Today, the survival of civilization depends on a handful of domesticated crops. Of the roughly 400000 plant species existing today (Pitman and Jorgensen 2002), fewer than 500 are considered to be domesticated.

Selecting for desirable crop traits

The primary force in crop domestication and subsequent breeding is selection, both artificial and natural, as described below. Charles Darwin, in developing the theory of natural selection, relied heavily on the knowledge and experiences of plant and animal breeders (Darwin 1859). Crops were domesticated from wild ancestors’ gene pools that had been altered by selection imposed by early agriculturalists and by natural selection imposed by biotic and abiotic environmental factors (Harlan and others 1973; Purugganan and Fuller 2010). Selection changes gene pools by increasing the frequency of alleles (alternative forms of a gene at a given position in the genome, which may vary among individuals) that cause desirable traits and decreasing the frequency of alleles that cause undesirable traits. Modern crop varieties are still shaped by the same forces.
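To make the mechanism concrete, the sketch below iterates the standard haploid selection recursion, p' = p(1 + s)/(1 + ps), where p is the frequency of the favored allele and s its selective advantage. The starting frequency and selection coefficient are illustrative assumptions, not values from the works cited:

```python
def allele_frequency_trajectory(p, s, generations):
    """Iterate the standard haploid selection recursion
    p' = p * (1 + s) / (1 + p * s), where p is the frequency of
    the favored allele and s is its selective advantage."""
    history = [p]
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
        history.append(p)
    return history

# A rare variant (1% frequency) with a modest 5% advantage, such as a
# non-shattering seed head under early harvesting practices:
traj = allele_frequency_trajectory(p=0.01, s=0.05, generations=200)
print(f"start: {traj[0]:.2f}, after 200 generations: {traj[-1]:.2f}")
```

Even a modest advantage carries a rare allele most of the way to fixation within a few hundred generations, which is consistent with domestication traits becoming established over the time spans described above.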

The causes of the bursts of domestication activity have been the subject of much speculation (Smith 1998), but the changes symptomatic of domestication are well established for many species (Harlan and others 1973; Doebley and others 2006). Legumes and the large-seeded grasses collectively known as cereals (for example, maize, wheat, rice, and sorghum) contribute most of the calories and plant protein in the human diet. For these and other annual crops such as sunflower and squash, the initial changes during domestication involved ease of harvesting and the ability to compete with weeds. Initially, selection for these traits was most likely not planned but serendipitous, acting on random mutations.

The most significant problem confronting most agriculturalists, both early and modern, is weed competition. Early agriculturalists scattered seeds on ground that had been prepared, most likely by burning or some other disruption of the soil surface. Those seeds that passed their genes on to the next generation (natural selection) were those that best competed with weeds. Selection pressure due to weed competition results in a number of changes, including the reduction or elimination of seed dormancy and an increase in seed size (Harlan and others 1973; Smith 1998). Dormancy is very undesirable in annual crops, and most domesticated species germinate rapidly upon planting. Selection against dormancy has been so extreme, however, that under certain weather conditions, seeds of modern wheat varieties (Triticum aestivum) and barley (Hordeum vulgare) sprout while still in the seed head, destroying the value of the grain crop. Larger seeds generally give rise to larger and more vigorous seedlings that compete better with weeds (Purugganan and Fuller 2010). In the grasses, selection for larger seed size is associated with increased starch and decreased protein in the endosperm. For example, the protein content of teosinte (Zea mays parviglumis)—the wild ancestor of maize (Zea mays mays), which is referred to as corn in North America—is approximately 30%, while the protein content of modern maize is 11% (Flint-Garcia and others 2009).

Although the goal of selection is to alter a targeted trait (appearance and/or performance), and the genetic variation underlying the selected trait is reduced over time, unselected traits often change as well, and these changes may be negative (for example, the reduced endosperm protein in grasses that have been selected for larger seeds).

For example, in the United States, the major selection criterion for maize is increased grain yield (Tracy and others 2004), and strong selection pressure for increased grain yield leads to increased starch content and decreased protein content (Dudley and others 2007). Critics focus on such changes as evidence that the quality of our food supply has been “damaged” by modern plant breeding and agricultural practices. But has it? In United States agriculture, maize is grown for its prodigious ability to convert the sun's energy into chemical energy (carbohydrates), while we have abundant sources of plant and animal protein. In other parts of the world, maize is a staple crop, and diets of many people are deficient in protein. To improve the nutrition of the poor whose staple is maize, plant breeders at the Intl. Center for Maize and Wheat Improvement (Centro Internacional de Mejoramiento de Maíz y Trigo, CIMMYT) developed quality protein maize (QPM) that has an improved protein content and amino acid profile (Prasanna and others 2001). It is the selection of the breeding objective that determines the outcome. Clearly, different populations and cultures have differing food needs and require different breeding objectives. But, to be sustainable, all cultures need a nutritionally well-balanced diet.

Changes in food animal agriculture and fisheries

Animal food products are good sources of high-quality protein, minerals (for example, iron), and vitamins, particularly vitamin B12, which is not available in plant materials. Livestock production is a dynamic and integral part of the food system today, contributing 40% of the global value of agricultural output, 15% of total food energy, and 25% of dietary protein and supporting the livelihoods and food security of almost a billion people (FAO 2009b). Seafood, including products from a growing aquaculture segment, provides at least 15% of the average animal protein consumption to 2.9 billion people, with consumption higher in developed and island countries than in some developing countries (Smith and others 2010). Except for most of sub-Saharan Africa and parts of South Asia, production and consumption of meat, milk, and eggs is increasing around the world, driven by population and income growth and urbanization (FAO 2009b; Steinfeld and others 2010). The rapidly increasing demand for meat and dairy products has led during the past 50 y to an approximately 1.5-fold increase in the global numbers of cattle, sheep, and goats; 2.5-fold increase in pigs; and 4.5-fold increase in chickens (Godfray and others 2010). The nutritional impact of animal products varies tremendously around the world (FAO 2009b; Steinfeld and others 2010).

The structure of the livestock sector is complex, differs by location and species, and is being transformed by globalization of supply chains for feed, genetic stock, and other technologies (FAO 2009b). The current livestock sector has shifted from pasture-based ruminant species (cattle, sheep, goats, and others having a multichamber stomach, one of which is the rumen) to feed-dependent monogastric species (for example, poultry) and is marked by intensification and increasing globalization (Steinfeld and others 2010). A substantial proportion of livestock, however, is grass-fed (Godfray and others 2010) and small-holder farmers and herders feed 1 billion people living on less than $1 a day (Herrero and others 2010).

The rates of conversion of grains to meat, milk, and eggs from food animals have improved significantly in developed and developing countries (CAST 1999). Technological improvements have taken place most rapidly and effectively in poultry production, with broiler growth rates nearly doubled and feed conversion ratios halved since the early 1960s. In addition to these productivity gains, bird health and product quality and safety have improved through applications of breeding, feeding, disease control, housing, and processing technologies (FAO 2009b). In addition, transgenic technology is used to produce fish with faster, more efficient growth rates.

Meeting the needs of a growing population

As a result of improved public health measures and modern medicine, the population has mushroomed from an estimated 1 to 10 million in 10000 BC to an estimated 600 to 900 million in AD 1750 and an estimated 6.8 billion today. Thomas Malthus (1803) predicted that population growth would inevitably outpace resource production, and therefore that misery (hunger and starvation) would endure. Undoubtedly, the application of science and technology in agriculture and food and beverage manufacturing has negated these predictions and fed population growth.
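The scale of that change is easier to see as the average annual growth rates implied by these estimates. A small sketch, using midpoints of the ranges quoted above (the midpoints and the present-day endpoint are assumptions for illustration):

```python
import math

# (label, start year, end year, start population, end population)
# Midpoints of the estimates quoted above; negative years are BC.
eras = [
    ("10000 BC to AD 1750", -10000, 1750, 5.5e6, 7.5e8),
    ("AD 1750 to present", 1750, 2010, 7.5e8, 6.8e9),
]

for label, year0, year1, pop0, pop1 in eras:
    years = year1 - year0
    annual_rate = math.exp(math.log(pop1 / pop0) / years) - 1
    print(f"{label}: ~{annual_rate:.3%} per year")
```

Even with rough midpoints, the implied annual growth rate after 1750 is roughly twenty times the pre-industrial rate, an acceleration Malthus did not foresee being sustained.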

The application of science to agriculture has dramatically increased productivity, but until the Green Revolution of the 1960s and 1970s, productivity was not keeping pace with population growth. Large areas of the world, including the 2 most populous nations, China and India, were experiencing severe food shortages and anticipating worse. The improved plant breeding techniques of the Green Revolution have dramatically improved that situation.

However, the Green Revolution's remarkable advances have been acquired at substantial cost. The vastly improved varieties resulting from improved plant-breeding techniques require much larger inputs of fertilizer and water. Poor farmers often cannot afford the fertilizer, and adequate water supplies are becoming an increasing problem in many areas. Thus, the Green Revolution, for all its enormous benefits, has helped larger farmers much more than smaller, poorer ones. In addition, pesticide applications in the developing world are too often inappropriate or excessive—in some cases because the farmer is unable to read the label—and there is no structure (for example, a regulatory agency such as the Environmental Protection Agency) to regulate their use.

Problems are not, however, confined to the developing world. Nutrient runoff in the United States and other countries leads to algal blooms in lakes and estuaries and to oxygen-depleted “dead zones” in lakes and oceans. Soil erosion by wind and water continues to be a problem in many producing areas, and soil quality suffers as a result. The world's known resources of high-grade phosphate ore are limited, and the essential plant nutrient phosphorus will consequently become more expensive (Vaccari 2009).

These problems are certainly solvable through a number of practices. Beneficial options include “no-till” agriculture (which leaves the root systems of previous crops undisturbed, thereby retaining organic matter and greatly discouraging erosion), integrated pest management (IPM, which focuses pesticide use where needed, substantially decreasing the amount used), precision agriculture (which site-specifically targets production inputs such as seed, fertilizer, and pesticides where and when needed), drip irrigation (controlled trickling of water), and use of new technology for recovering nitrogen and phosphorus from processing wastewater for use as fertilizer (Bongiovanni and Lowenberg-Deboer 2004; Frog Capital 2009; Gebbers and Adamchuk 2010).

Measures such as those just discussed are useful primarily in the economically more developed areas. Developing countries require other steps adapted to their local areas and focused particularly on improvements for the many millions of small, poor farmers. Improved plant varieties, produced both by conventional breeding and through biotechnology, are necessary, as are improved varieties of fish and livestock. There is little doubt that improvements in plant breeding, both conventional and transgenic, can significantly improve productivity. Technological improvements, such as automated plant monitoring via robotics, are “helping plant breeders trim years off the process of developing crop varieties tailored to local conditions” (Pennisi 2010).

The list of such needs is far too long to explore here, but it also must include public health measures. A major problem yet to be addressed is the subsidization of agricultural products in developed nations. Products from small, unsubsidized farmers in developing nations cannot compete in the world market with subsidized products from advanced nations. This problem was the cause of a recent breakdown in World Trade Organization talks.

Some see organic agriculture as an answer to these problems. Organic farming has some clear merits, particularly those practices, such as crop rotation and the use of green manure, natural biocontrol agents, and animal manure, that have been used by farmers for millennia (King 1949). The use of degraded plant and animal residues increases the friability (tendency to crumble, as opposed to caking) and water-holding capacity of soil, and nutrients from decaying plants and animal manure are more slowly available than those from most commercial fertilizers. Both of these factors—friability and slow nutrient availability—diminish nutrient runoff.

While organic agriculture continues to grow in response to consumer preferences in the developed world, there are limitations to widespread use of organic practices. Organic agriculture requires substantially more land and labor than conventional practices to produce food, and the resulting yields are too low, and the products too expensive, to address the needs of the growing population. The supply of composted animal manure is limited and relatively expensive compared to commercial fertilizers. Organic agriculture excludes the use of synthetic pesticides, and the few “natural” ones that are permitted are seldom used (Lotter 2003). Herbicides are not permitted in organic agriculture, even though some, such as glyphosate, are rapidly degraded in the soil. These exclusions require more manual labor for weed and pest control. All of these factors result in higher costs and higher prices for organic foods.

Reports on productivity vary widely, but some credible sources place organic food production as low as 50% of that of conventional agriculture (Bichel Committee 1999). Yield differences may be attributable to a number of factors such as agro-ecological zone (for example, temperate and irrigated compared with humid and perhumid), crop type, high-input compared with low-input level of comparable conventional crop, and management experience (Zundel and Kilcher 2007). In addition, current organic methods exclude the use of the products of modern biotechnology—recombinant DNA technology—essential to future increases in agricultural productivity. Nevertheless, the more useful practices of organic agriculture must be part of the agriculture of the future.

Although poverty and malnutrition exist in all countries, by far the most severe problems in achieving availability, safety, and nutritive value of food and beverages occur in the developing world (IFPRI 2009). Water shortages and contaminated water, poor soil, destruction of forest for fuel, use of animal manure for fuel, the spread of plant and animal diseases, and the complete lack of a sound food safety infrastructure are among the most vexing problems. Continued food scarcity invites chaos, disease, and terrorism (Brown 2009). The gap between developing and developed nations is not only in economics but also in science, governance, and public information. Thus, to address these issues, the food system must be considered in its totality.

Eighty percent of agricultural land is used for grain fed to meat animals and yields only 15% of our calorie intake. Many have suggested that world food shortages could be greatly alleviated by consuming less meat and using the grain supplies now consumed by animals more directly. Reduction in meat intake, particularly red meats, would confer some health benefits, but the potential effects on world food supplies are less clear and quite possibly much less than many presume. If developed nations consume much less meat, the price of meat will fall and poorer nations will consume more. If more grain is consumed, grain prices will rise, to the detriment of populations that already rely heavily on grain. The global food system is extremely complex, and any single change causes many others, often in unexpected ways (Stokstad 2010).

Clearly, the solution to the challenge of meeting the food demands of our future world population lies in these principal thrusts:

  • Increased agricultural productivity everywhere, but particularly among poor farmers, of whom there are hundreds of millions.
  • Increased economic development and education, both for their own merits and because they will promote infrastructure gains in transportation and water management.
  • Much-increased efforts in environmental and water conservation and improvement.
  • Continued improvements in food and beverage processing and packaging to deliver safe, nutritious, and affordable food.
  • Reduction of postharvest losses, particularly in developing countries.

We must achieve all of these goals. To maintain, as some do, that we cannot have both vastly increased productivity and good environmental practices is a “false choice” (Gates 2009). Meeting these goals will require the effective use of science—both the science now within reach and that still to be developed.

Preserving the food supply

Postharvest losses occur between harvest and consumption as a result of spoilage of raw agricultural commodities, primarily during storage and transportation, before they can be stabilized for longer-term storage. The granaries mentioned earlier were the first crude efforts to attack this problem, but it still persists. Postharvest losses due to rodents, insects, and microbial spoilage in some areas amount to 30% or more of the harvested crop, wasting the seed, water, fertilizer, and labor invested in it. Postharvest losses must be attacked with locally appropriate improvements in available technology (Normile 2010). It is not enough merely to increase and conserve the supply of raw food; it must be conserved against further loss by processing, packaged, distributed to where it is needed, and guaranteed in its safety, nutritional value, and cultural relevance. That is the role of science, technology, and engineering applied to the processing of foods and beverages.

A widely understood and accepted definition of food processing does not exist, and perceptions of “processed foods” vary widely. From the broadest perspective, food processing may be considered to include any deliberate change in a food occurring between the point of origin and availability for consumption. The change could be as simple as rinsing and packaging by a food manufacturer to ensure that the food is not damaged before consumer accessibility, or as complex as formulating the product with specific additives for controlling microorganisms, maintaining desired quality attributes, or providing a specific health benefit, followed by packaging that may itself play a role in microbial control or quality preservation. Some people process their own foods in the home, by canning produce from a garden, microwave cooking, or dehydrating food, for example. Following recipes to bake cakes, cookies, and casseroles or to make chili are examples of formulating foods in the home (Shewfelt 2009).

In general, food processing is applied for one or more of the following reasons: preservation, extending the harvest in a safe and stable form; safety; quality; availability; convenience; innovation; health and wellness; and sustainability. Although the private sector carries out these processes and delivers the final product to the consumer, public investment in generating the science and engineering base necessary to continue the creativity and ultimate application of new technologies is clearly warranted.

Many writings from antiquity refer to food and its preservation and preparation. Major advances in food preservation accelerated with the development of canning, which proceeded from the investigations of Nicolas Appert in France and the subsequent activities of Peter Durand in England in the early 19th century. Appert used corked glass bottles to preserve food, and Durand introduced the concept of metal cans. This led to increased emphasis from scientists on the quantity and quality of food, although the reason for canning's effectiveness as a means of food preservation was not discovered until nearly 50 y later. Louis Pasteur reported to the French Academy of Sciences in 1864 on the lethal effect of heat on microorganisms. H. L. Russell of the Univ. of Wisconsin and Samuel Cate Prescott and William Lyman Underwood of the Massachusetts Inst. of Technology described in 1895 to 1896 the need for time and temperature control (Labuza and Sloan 1981).

 “Mr. Appert found the art of fixing seasons; he makes spring, summer and fall live in bottles similarly to the gardener protecting his tender plants in greenhouses against the perils of the seasons.” (From the Courrier de l’Europe of February 10, 1809; Szczesniak 1992).
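The time and temperature control that Prescott, Underwood, and Russell called for is still expressed today through first-order thermal death kinetics. A minimal sketch, assuming typical textbook D- and z-values for Clostridium botulinum spores (illustrative figures, not values from Labuza and Sloan 1981):

```python
def process_time(log_reductions, d_value_min):
    """Time at a constant lethal temperature to achieve the desired
    number of log reductions, assuming first-order thermal death:
    N = N0 * 10 ** (-t / D)."""
    return log_reductions * d_value_min

def d_at_temperature(d_ref, temp_ref_c, temp_c, z_c=10.0):
    """Shift a D-value to another temperature with the z-value model:
    D(T) = D_ref * 10 ** ((T_ref - T) / z)."""
    return d_ref * 10 ** ((temp_ref_c - temp_c) / z_c)

# Typical textbook value for C. botulinum spores at 121.1 °C (250 °F):
D_121 = 0.21  # minutes

# The classic "botulinum cook" targets a 12-log reduction:
print(f"12D process at 121.1 °C: {process_time(12, D_121):.1f} min")

# The same lethality takes ~10x longer for every z degrees cooler:
print(f"equivalent D at 111.1 °C: {d_at_temperature(D_121, 121.1, 111.1):.1f} min")
```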

No period has seen such rapid advances in food and beverage processing as the 20th century (Welch and Mitchell 2000). Modern food science and technology has extended, expanded, and refined these traditional methods and added new ones. Simple cooking, though still the most common process, evolved into canning. Dehydration, once restricted to less sanitary sun drying, is now usually a highly mechanized and sanitary process. Refrigeration has evolved from cool storage to sophisticated refrigerators and freezers, and the industrial techniques of blast freezing and individual quick freezing (IQF) are less detrimental to nutritional and sensory quality (for example, taste and texture). All of these developments contributed to increased nutritional quality, safety, variety, acceptability, and availability of foods and beverages. Many of these techniques are now combined into more effective preservation technologies through the concept of “hurdle technology,” which combines techniques to create conditions that bacteria cannot overcome, such as drying with chemical preservatives and packaging, or mild heat treatment followed by packaging and refrigerated storage (Leistner and Gould 2002).
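One common way to formalize the hurdle idea is the “gamma concept” from predictive microbiology, in which each environmental factor scales an organism's maximum growth rate by a term between 0 (full inhibition) and 1 (optimal), so several individually mild hurdles multiply into strong combined inhibition. A rough sketch; the growth limits and optima below are generic illustrations, not values from Leistner and Gould (2002):

```python
def gamma(value, growth_limit, optimum):
    """Simple linear gamma term: 0 at the growth limit, 1 at the optimum."""
    return max(0.0, min(1.0, (value - growth_limit) / (optimum - growth_limit)))

def relative_growth_rate(water_activity, pH, temp_c):
    """Fraction of maximum growth rate under combined hurdles
    (illustrative limits/optima for a hypothetical spoilage bacterium)."""
    return (gamma(water_activity, 0.91, 0.997)
            * gamma(pH, 4.6, 7.0)
            * gamma(temp_c, 5.0, 37.0))

# A near-optimal food supports fast growth...
print(f"{relative_growth_rate(0.995, 6.8, 35.0):.2f}")  # ~0.84
# ...but three individually mild hurdles cut growth to ~1% of maximum:
print(f"{relative_growth_rate(0.94, 5.2, 10.0):.3f}")   # ~0.013
```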

Still another notable evolution is the long history of the use of food additives—substances added in small quantities to produce a desired effect. Of the 32 “technical effects” (functional purposes) listed by the Food and Drug Administration in the Code of Federal Regulations, 24 can be recognized in the few cookbooks and recipe compilations that have survived from more than 150 y ago.

Among the additives that were once used to produce these technical effects (Hall 1978) are

  • Pearl ash (from wood ashes) and vinegar as leavening agents.
  • Sodium silicate (water glass) for dipping eggs to preserve them.
  • Lye for hulling corn.
  • Sulfur dioxide from burning sulfur as a fumigant and preservative.
  • Unlined copper utensils for making pickles greener.
  • Saltpeter and roach alum as curing and pickling agents.
  • Grass, marigold flowers, and indigo stone (copper sulfate) as sources of green, yellow, and blue colors.

Before the days of widespread industrial production of food and before the advent of modern chemistry and toxicology, these and many other crude additives were used confidently within the family without any knowledge of the risks they presented.

Regulatory oversight

In the 20th century, the development of the science of toxicology permitted the careful evaluation of the safety of substances added to food. The advent of modern chemistry permitted the detection of intentional adulteration of foods by purveyors using deceitful practices, and led to the passage and enforcement of modern food laws. Frederick Accum's “Treatise on the Adulteration of Food,” published in 1820, marked the beginning of this effort. In the United States, the Pure Food and Drugs Act of 1906 prohibited adulteration and misbranding of food, issues that continued to be addressed in the United States via federal statutes. Prior to 1958, the burden of proving that a substance posed an unacceptable risk rested with the government. In that year, the Food Additives Amendment to the 1938 Federal Food, Drug, and Cosmetic Act changed that by advancing the concept of “adulteration” and imposing on food manufacturers the task of proving prior to marketing that an additive is safe under the conditions of its intended use.

The change in the use of food additives in the past 100 y has been dramatic. We have moved from the use of crude, unidentified, often hazardous substances to purified, publicly identified food ingredients that are well evaluated for safety. Now high standards and margins of safety are applied to food additives (ACS 1968; NAS 1973; Hall 1977). Today, because of modern means of detection, intentional food adulteration in industrialized countries is considered uncommon, occurring more often in foods imported from countries without effective food safety infrastructure. Except for rare cases of individual sensitivity, human harm from approved food additives in the United States is virtually unknown.

Advances in food science and technology

Drying, canning, chemical preservation, refrigeration (including chilling and freezing), and nutrient conservation and fortification were the significant advances of the 19th and 20th centuries and permitted population growth in more developed countries. Such population growth could only occur if there was sufficient food. The industrial revolution could not have occurred without a food delivery system that allowed people to leave the farms, migrate to the cities, and engage in useful production of goods and services for society.

Among the important developments during the early part of the 20th century were the discovery of vitamins and the realization of the importance of other micronutrients such as iodine, iron, and calcium. Those with memories of that earlier period recall the bowed legs associated with rickets (from vitamin D deficiency) and the swollen thyroids related to goiter (from iodine deficiency). With the introduction of the draft just before World War II, the army discovered widespread malnutrition among young American males. This led to the foundation of the Food and Nutrition Board of the Inst. of Medicine of the Natl. Academies and also the development in 1941 of the Recommended Dietary Allowances (RDAs) for essential nutrients. The difficulty of achieving these RDAs from available foods, especially among the poor, led manufacturers to fortify common foods with vitamins and other micronutrients; fortification had begun with iodized salt in 1924. Today, fortified foods, defined by federal Standards of Identity, include such staples as pasta, milk, butter, salt, and flour.

Technological innovations in food preservation were dependent on advances in the sciences, especially chemistry and microbiology. How these sciences and technologies are applied within each society depends on the economic, biological, cultural, and political contexts for each society. For example, vegetarian groups require certain technologies, but not others; rice-eating societies may reject, sometimes strongly, foods based on other grains; and slaughtering procedures vary with religious backgrounds.
