Eating Locally — Still Good 

Jill Richardson on the five biggest myths people use to discredit the local-food movement ... and why they're wrong

It's become predictable. At regular intervals, someone, somewhere, will decide it's time to write another article "debunking" the local food movement. The latest installment is by Steve Sexton, posted on the Freakonomics blog (which also treated us to a previous post called "Do We Really Need a Few Billion Locavores?"). And we must not forget the leading anti-locavore, James McWilliams, who gave us "The Locavore Myth" and similar articles.

  But if you enjoy the flavor of organic heirloom tomatoes, fresh-picked from the farm, here's how to read these articles without feeling guilty that your love of local food is harming the planet and starving people in the Global South.

Myth 1: People who eat local eat the same diet as those who don't.

  A favorite anti-locavore argument is that eating local does not reduce oil usage or carbon emissions. Now, if locavores were munching on locally produced Big Macs and other highly processed foods, the way the rest of the mainstream food system eats, this argument might be correct. But that's not the case.

  James McWilliams likes to cite a study on lamb showing that eating New Zealand lamb in London actually has a smaller carbon footprint than eating lamb from the U.K. The New Zealand lamb is raised on pasture, and even when you factor in the carbon emissions from shipping, it is still friendlier to the environment than grain-fed, factory-farmed U.K. lamb. Well, sure. Only no self-respecting London locavore would dream of eating grain-fed, factory-farmed lamb. He or she would find a local farmer raising lamb on pasture instead. Now compare the carbon footprint of that lamb to the New Zealand lamb. With similar production methods and a correspondingly similar carbon footprint, the major difference between the two would be the oil required to ship the New Zealand lamb halfway across the world.

Myth 2: The only reason for eating local is reducing "food miles."

  Often anti-locavore arguments, such as the one above from McWilliams, are predicated on the notion that locavores eat local only to reduce food miles — the number of miles the food traveled from farm to fork — and that they do so only to reduce carbon emissions. Since modern shipping methods are relatively efficient, it is then easy to show that transporting a truckload or train car full of fresh peaches from California around the rest of the U.S. uses less oil, per pound of peach, than driving a relatively small quantity of peaches to and from a farmers market.

  But that assumes reducing food miles is the only benefit of eating local, and it isn't. For one thing, who picked the California peaches? Probably migrant laborers. How were they treated? How were they paid? Probably poorly. What was sprayed on those peaches? In 2004, more than 100 different pesticides were used on California peaches, including highly toxic ones like methyl bromide, paraquat, chlorpyrifos and carbaryl — 468,804 pounds of pesticides applied to peaches in California alone that year. How about water usage? What is the rationale for growing the majority of the nation's fruit in a state that cannot do so without heavy irrigation, and that lacks the water needed to sustain all that irrigation?

  Then consider your own enjoyment and nutrition. Wouldn't you rather eat a fresh fruit or vegetable that was just picked? And wouldn't it be nicer to eat a variety selected for flavor rather than for its ability to withstand shipping and storage? These are not merely hedonistic considerations, as nutrients can degrade over time once produce is harvested. What's more, nutrients that were never in the soil to begin with cannot possibly be present in the food; a farm with healthy soil will produce healthier — and more flavorful — food. Your body is wired to desire flavorful fruits and vegetables because they are better for you. And when you eat out at a restaurant that serves local food, often the chef works with local farms so the farmers plant the specific varieties of fruits and vegetables the chef wants to serve.

  One last reason for eating local is the relationship that one forms within one's community, and the economic multiplier effect that occurs within the community when one buys local. This extends beyond just food to other goods as well. When you spend your money locally, it enriches your community. When you buy from a large grocery chain, some of your money goes to pay the clerk who checked you out and the manager who oversees that clerk, but the rest goes to the grocery store's corporate headquarters, to the truckers, the warehouses, and to the farm that grew your food, far away from where you live. What's more, when you buy from the same local farmers each week, you build relationships with those who grew your food.

Myth 3: Growing food locally is inefficient.

  The latest tirade against eating local invokes "comparative advantage": the idea that it makes the most sense to grow Alabama's potatoes in Idaho, where potato yields far exceed those in Alabama. Alabama should grow whatever it grows best and ship that to Idaho, right?

  This depends on your idea of efficiency. Idaho is no doubt growing Russet Burbank potatoes, the kind used in French fries. These are large, high-yielding potatoes, especially when — as described by writer Michael Pollan — they "have been doused with so much pesticide that their leaves wear a dull white chemical bloom and the soil they're rooted in is a lifeless gray powder." In The Botany of Desire, Pollan describes how an Idaho farmer with 3,000 acres grows potatoes (and nothing but potatoes!). He begins with a soil fumigant, then herbicide. Then he plants his potatoes, using an insecticide as he does. Next, another herbicide — and so on. For "efficiency" he applies these pesticides by adding them to the water in his irrigation system, water that is then returned to its source, a local river. He also has a crop-dusting plane spray the plants every two weeks, and he applies 10 doses of chemical fertilizer. (With all of these chemicals, the farmer told Pollan he won't eat his own potatoes. He grows a special, chemical-free plot of spuds near his house for his own consumption.) Altogether, in a good year, an Idaho potato farmer will spend $1,950 per acre in order to net $2,000 per acre. Efficient?

  Perhaps the Alabama potato farmers who are achieving much lower yields than they might in Idaho are using the same business model. If so, they are bad businesspeople, as it would require a lot of costly inputs to produce less of a commodity that is sold by the pound, and they would make rather little money for their trouble. But if any of them subscribe to the locavore model of farming and eating, then this will not be the case.

  Hopefully, the Alabama farmers forgo the costly inputs, so that the money earned from the potatoes after their harvest is their own. Potatoes, after all, require little soil fertility. In the Andes, where potatoes were first domesticated, a farmer named Juan Cayo grows potatoes at the end of a four-year rotation on his fields near Lake Titicaca. First, he grows fava beans, which infuse the soil with nitrogen. For the next two years, he grows grains (barley and oats), which use up most of the nitrogen in the soil. Last, he grows potatoes, which are happy enough to grow on whatever fertility is left over.

  Crop rotation also serves to deal with the nematodes and insects that the Idaho farmer sprayed for. If any pests find the potato crop and begin to breed, they will have a bad surprise when, the next year, a different crop is sown in that field and they suddenly have no food. Some of them might find where the farmer is now growing this year's potatoes, but it will take them time to get there. For the bugs that do arrive, there are a number of organic strategies. Least efficient, but always an option, is picking the bugs off by hand. Better yet, a farmer can provide habitat for predatory insects or birds that prey on the pest, or spray Bacillus thuringiensis, a bacterium that naturally kills insects.

  Weeds can be suppressed by a heavy layer of mulch, removed by tilling or pulled by hand if necessary. A common strategy is to let the weed seeds in the field germinate, kill the seedlings by tilling, and only then plant the crop. You might even be able to convince your chickens to kill the weeds for you. Additionally, some weeds in your field are actually a good thing. Not so many that they choke out your crop, of course, but there are edible weeds, weeds that can be used as medicinal plants, and even weeds that, when allowed to grow sparingly in your field, can boost production of your crop. For example, garden guru John Jeavons recommends dandelion, purslane, stinging nettle and lamb's quarters as edible (and nutritious) weeds that will help your crops, too.

  Last, an organic farmer growing for a local market will plant a number of potato varieties, not just one. The reasons for growing several varieties are many. Some varieties may be best for baking, others for mashed potatoes, and still others for frying. One variety might taste the best or yield the most but lacks natural resistance to a fungal disease, whereas another variety with mediocre yields does naturally resist the disease. Some varieties mature faster than others, allowing the farmer to harvest and sell potatoes all season long. And if all of the potatoes fail, the farmer is also growing a number of other crops in addition to potatoes. In short, agrobiodiversity — growing diverse varieties and diverse species — provides insurance.

Myth 4: We can't feed a growing population on local (organic) food.

  This is the biggest whopper of all. The recent Freakonomics article states: "From roughly 1940 to 1990, the world's farmers doubled their output to accommodate a doubling of the world population. And they did it on a shrinking base of cropland. Agricultural productivity can continue to grow, but not by turning back the clock. Local foods may have a place in the market. But they should stand on their own, and local food consumers should understand that they aren't necessarily buying something that helps the planet, and it may hurt the poor."

  This argument ignores the vast expanses of land planted with crops that are entirely unnecessary for feeding the world: cotton, sugarcane, palm oil, soybeans, corn, rubber, tobacco and fast-growing trees like eucalyptus for paper production, to name a few. No doubt we need some cotton, sugar, corn and the like. But the amount of land under these crops, which are used to produce biofuels, processed foods, factory-farmed meat, paper, clothing and industrial inputs, is immense, wasteful and largely (although not entirely) unnecessary.

  Prior to the European conquest of the Americas, sugar was reserved for the very wealthy. By the height of the Industrial Revolution, sugar made up a significant percentage of the calories in the British diet. Palm oil, which is now found in 50 percent of processed foods and other items like cleaning products in the United States, was once a local, traditional West African food. Today, palm oil production is ravaging the rainforests of Indonesia and Malaysia and expanding to other areas like Papua New Guinea and Latin America.

  Both of these crops, as well as corn, soy and jatropha, are also grown for biofuels, which do not feed people. Neither does paper, which we in the United States treat as a cheap renewable resource, unaware that meeting our needs covers enormous areas in fast-growing trees that are often quite disruptive to the surrounding ecosystems. And grain-fed meat, as pointed out so many years ago by Frances Moore Lappé in Diet for a Small Planet, is a wasteful use of calories compared to feeding grain directly to people. If we care about feeding the world while using fewer resources, switching to pasture-raised meat — and less of it per person in the developed world — is a must. Doing so would likely improve our health as a nation at the same time.

Myth 5: Eating local (organic) food is elitist.

  In the United States, where processed food is artificially cheap and many people eat what they can afford to buy at the expense of their health, local food is a luxury. For those who do not grow their own food, and especially for those who want to eat in restaurants or buy from a grocery store, local and organic food is expensive. But let's reframe the issue. Instead of asking for cheaper (but unhealthy and environmentally destructive) food, let's ask for living wages so that anyone who works full time can support their family and feed them well. Let's ask for an expanded middle class instead of a growing gap between rich and poor.

  We must also note that outside of the United States and Europe, this equation is different. For peasants, local, organic food is cheap and low-risk. Going back to the example of potatoes, in the Andes, a farmer might grow 50 varieties of potatoes. Some varieties cannot be eaten directly because they are bitter or spicy; instead, they are freeze-dried using traditional methods and stored for years as a hedge against bad harvests. Likewise, the Andes are home to more than 3,000 varieties of quinoa. A few animals are kept to eat what humans cannot and to serve as a sort of insurance — a literal "piggy bank." When income is needed, the farmer can sell a pig or a cow. Farmers grow different varieties and different crops at different altitudes: llamas (for meat) and alpacas (for fiber) in the highest areas, then potatoes, quinoa and other tubers and grains a little lower, then corn, and citrus and vegetables at lower altitudes. They have done this since pre-Incan times. Farmers from the lowlands trade crops with farmers from the highlands.

  This sort of agriculture is not unique to the Andes (although the crops and the use of different altitudes are). Using these methods, farmers can avoid going into debt and can protect their families against bad years. Hopefully, if one variety of a crop fails, another does not. Rare is the year when every single crop fails — and should that happen, farmers have stores of preserved food from years past and can even subsist on weeds and wild plants and animals. This does not conform to the capitalist model of maximizing yield and profits, but it serves as a low-risk strategy to prevent hunger without exhausting the soil or other local resources. For the billions of peasants in the world, it is purchased, processed foods that are elitist, not local, organic foods.

  Next time you read a column that tells you your love of fresh, flavorful, healthy local foods is elitist, inefficient, or contributing to world hunger, feel free to shred that article and put it in your compost pile — then continue enjoying your delicious Green Zebra and Brandywine tomatoes with a little bit of extra virgin olive oil, homegrown basil and sea salt without the slightest bit of guilt.

Jill Richardson is the founder of the blog La Vida Locavore and a member of the Organic Consumers Association policy advisory board. She is the author of Recipe for America: Why Our Food System Is Broken and What We Can Do to Fix It.

© 2018 Gambit