Much of the way that agriculture has developed in the last fifty years seems inevitable — as if consolidation, ever-larger farms and ever-fewer farmers are just the natural evolution of the farm economy. This narrative also says that this is a small price to pay for “feeding the world.” But the truth of the matter is that all those changes and many more happened because of specific farm policy — and the consequences of these changes have very real financial, social and environmental costs.
Here we focus on some of the ways that policies have changed the farm economy to now benefit a few players and corporations — and how those changes disadvantage smaller farmers, the economic health of rural communities and food security.
Farming is a unique business. The volatility of weather, pests, global markets and more make it unpredictable in ways often unrelated to the farmers’ skill or management techniques. Most farmers borrow a great deal every year just for operating costs, as well as for repairs or new equipment, with the anticipation that their harvest will earn enough to pay back their loans.
Agriculture is also uniquely important to the security of any nation — people must eat. Like other industries, the farm economy has good and bad years; for all the reasons that agriculture is unique, there is a long history of the government providing a safety net to the sector, especially for bad years.
Following the Great Depression of the 1930s, the New Deal included big changes in farm policy intended to guarantee farmers a fair price for their goods. These policies skewed towards farmers who had more land, exacerbating existing inequality even then, but for the many farmers they helped, they were invaluable.
A key feature was a program that managed agricultural supply and kept farm prices from falling too low on commodity goods (e.g., grains, dairy, some other storable crops). Because harvest comes at roughly the same time for everyone, another unique feature of agriculture is that the price farmers get for their goods drops at the peak of harvest, because the market is suddenly flooded. Ironically, the better the harvest is, the lower the price is likely to go for the farmers who produce it.
The New Deal programs, called supply management, stabilized these swings by keeping the supply of commodity goods constant. Crop harvests vary from year to year, so keeping a steady amount on the market requires some structure. A floor price — essentially a minimum wage — ensured farmer prices would not drop too far below the cost of production; a grain reserve allowed the government to purchase surplus commodities to keep them off the market; and conservation incentives kept marginal land out of production. When the program was in full effect from 1941 to 1953, businesses who bought commodities paid those commodities’ full cost of production. The federal government, meanwhile, only had to buy the surplus, so the cost to the taxpayer was much lower than it has been since — in ways we will see below. 1
In the mid-1950s, business groups proposed ways to “modernize” farming, addressing what they saw as economic inefficiencies: too many farmers on diversified farms and a short supply of factory workers. These groups proposed plans to eliminate a third of farm families, replacing a network of millions of medium-sized family farms with fewer, much larger farms producing the same quantities more “efficiently,” while the displaced farmers went to work in factories. 2
Federal farm policy soon moved in this direction. Supply management provisions were weakened, and beginning in the mid-1950s, “get big or get out” became the instruction from US Secretaries of Agriculture. The “minimum wage” floor price was eliminated and farmer prices for commodities eventually became subject to the whims of the market. Without a minimum wage, farmers produced all they could to try to make back their costs — but as everyone planted every possible acre and supply increased, the prices farmers received dropped.
Low prices meant farmers couldn’t pay back their loans, and they “got out” in droves; the number of farms dropped from nearly 4.8 million in 1954 to two million in 2014. 3 This doesn’t mean that any less food is being produced; in fact, much of the same land is still being farmed — but farms have been significantly consolidated. In 1987, nearly 60 percent of cropland was operated by midsized farms (100 to 1,000 acres), while 15 percent was in farms over 2,000 acres. In the next 30 years, land shifted away from the smaller operations and to the larger: by 2015, both categories held 36 percent of cropland. Farms with at least $1 million in gross income rose from 31 percent of all production in 1991 to 51 percent in 2015. 4
Since 2008, the prices farmers received for commodities on the open market have hovered around 37 percent of the cost of production. 5 At that pay rate, it is a wonder we still have any farmers left. In fact, so as not to drive all farms out of business after dismantling the supply management programs, the federal government has tried out a variety of other support systems, including direct payments and insurance programs. These various schemes are known as farm subsidies. While vilified in public policy debates, subsidies are a flawed but vital part of today’s farm economy: they are an effort to fill the gap between what the farmer spent to grow the crop and what she gets paid for it. These government payments do not adequately fill the gap, but along with off-farm income, they are what keep the remaining commodity farmers on the land.
But if the floor price was a minimum wage, the subsidy system is more like food stamps. A minimum wage is mandated by the government and paid by the employer, while food stamps are paid by the taxpayer. Similarly, the floor price for farmers was set by the government and paid by the companies who bought the commodities. Today, instead, the big agribusinesses who buy the grain — which virtually all goes to confined animal operations, ethanol or export — pay less than it cost to produce, while taxpayers make up part of the difference for the farmers in the form of subsidy payments.
A focus on economic “efficiency” has driven farm policy for many years. The argument states that economies of scale allow larger producers to grow more food for less cost — critical to feed a growing world population. It is true that Americans now spend just 10 percent of their income on food 6, but the argument is flawed in two ways: it fails to account for the many externalized costs of this food system, and it ignores that industrialized agriculture – with all of its promises to eradicate hunger – is not currently feeding the world.
Many critically important assets do not have a formal economic value — think of a clean environment, good public health or a thriving community. What are they worth, in monetary terms? Because no price is attached to them, industry can dump waste into a river, pump too much groundwater, create noise or odor nuisances or market dangerous products without paying for the damage. These are called externalized costs: expenses paid by someone other than the businesses who create the problems.
Our cheap food system is built on externalized costs. Disposal of animal waste from a concentrated animal feeding operation (CAFO) is a good example: if a manure lagoon leaks and contaminates drinking water of a nearby town, the CAFO operator does not generally pay cleanup costs. Instead, the town may pay for a new water filtration system, or individual citizens may have to purchase bottled water. If noise and odor cause local property values to fall or if animal waste pollutes a river and tourism declines, the CAFO operator does not make up for the lost revenue. Instead, the whole region pays the price while the CAFO keeps doing business as usual. 7
The US food system externalizes the costs of: diet-related disease; air pollution from livestock operations, processing and shipping; soil contamination and loss; use of finite resources like petroleum and water; the economic decline of rural communities and much more.
Industrial agriculture would not be profitable for agricultural corporations and would not produce food that was so cheap for the consumer if corporations had to take these expenses into account. For instance, a 2005 study estimated that the US’s reliance on pesticides carries approximately $10 billion in environmental and societal costs. The agriculture industry would look quite different if pesticide companies and users were responsible for paying this sum. 8
Industrialized agriculture has left one in nine people undernourished worldwide. That is even though agricultural production today produces 2,800 daily calories for every person on earth 9 — enough to feed the global population of 10 billion we expect by 2050. The fact is that feeding the world is a problem of power, not of calories, and industrial agriculture has concentrated power in an increasingly small number of hands.
Additionally, research has shown that various kinds of sustainable agriculture do produce yields in the range of those produced by chemical-dependent methods. Depending on the circumstances and crop, sustainable yields have been shown to be equivalent, slightly greater (particularly in drought conditions, which is increasingly important as the climate changes) or 15 to 20 percent lower than those of industrial agriculture reliant on chemical inputs like synthetic fertilizers and pesticides. 10,11 Given how underfunded the research and development of sustainable agriculture techniques has been in comparison to conventional methods, the yield differences are relatively small, suggesting that further research investment has the potential to reveal dramatic productivity gains. 12
The policy changes that have led to fewer and ever-larger farms have done the same in agribusiness: companies that sell to farmers or buy and process their goods have merged and gotten bigger, too. Today, just 20 percent of farms control nearly 70 percent of US farmland 13, while four meatpackers slaughter 85 percent of beef, the top four companies control 66 percent of all hog slaughter 14 and nationally, four firms control 63 percent of the retail market – and in some local markets, that percentage is as high as 80 percent. 15
There are now so many steps in between the producer and the consumer – the farmer and the eater – that the farmer gets less than 15 cents of the consumer dollar. 16 In fact, one study showed that the food marketing sector – not the farmer – determines 80 percent or more of product value. 17
This enormous market power wielded by a small handful of companies has the effect of dramatically reducing competition in agricultural markets and reducing prices paid to farmers. Farmers used to have several buyers vying for their cattle, hogs, milk or grain, which meant each potential buyer had to offer an attractive price. Today, many farmers are lucky if they even have two options – many only have one possible buyer, which means they must take whatever price is offered. 18
The ever-growing companies that now dominate the market are a far cry from local businesses — they are virtually all multinational corporations. Smithfield, the top US pork producer, is owned by a Chinese company, while many US-based firms have been exporting practices like chemical-dependent agriculture and CAFOs around the globe for years. Foreign ownership is not inherently negative; as a nation of immigrants, the US has thrived on influences from abroad. But many global corporations, whether based in the US or elsewhere, are primarily concerned with their bottom line, setting up operations wherever they can get the best tax breaks, cheapest labor or most business-friendly regulations – conditions that are not good for the well-being of the local community, workforce or environment.
As farms have consolidated, they have also consolidated ownership of their supply chain in a process called vertical integration. This started in the 1940s, when companies like Tyson Foods began buying up the formerly independent parts of their supply chain — breeding facilities, feed mills, slaughterhouses — and integrating them under their umbrella. What would have formerly been a network of small businesses supporting and being supported by a local meat industry is now one self-contained corporate economy. 19
Owning all links in the supply chain gives the integrator control over price and quality throughout, and the process has helped to drive down consumer meat prices. But it also means that a company is essentially an island, no longer reliant on other businesses and keeping all of its own profits.
There is one piece of the operation that even the most vertically integrated companies do not own: the farms that raise the animals. Meatpacking companies have determined that farms are the least profitable part of the business, and instead contract with farmers to raise company-owned animals on the farmers’ land. 20
The farmer must take out loans, often starting at $1 million, and build barns to company specifications. Once the farmer is committed to the agreement, by way of significant debt, he or she often finds that the contracts are not as fair or profitable as the company salesmen promised. 21 In fact, many contract chicken farmers face bankruptcy and risk losing their home and farm if they try to get out of the business.
The model of vertically integrated meat companies using contract farm labor was developed by Tyson, and has been the norm in the chicken industry for many years — so much so that it is known as chickenization. Most hogs are now also raised under similar arrangements. As of 2015, almost 95 percent of livestock and poultry operations were managed under some kind of contract. 22 Contracts are also common in dairy, beef and even crop production, though the terms in those sectors currently tend to be less exploitative of the grower than chicken contracts.
One of the supposed economic benefits of efficiency measures like fewer and larger farms is reduced labor costs from fewer workers. But independent family farmers are an economic driver, spending money in their community and creating jobs for others. As they have been replaced by automation and a far smaller number of poorly paid farm workers, there is less economic activity in rural regions. 23
In addition, there is evidence that larger farms spend less at local businesses than small farms. A Minnesota study, for example, shows that farms with a gross income of $100,000 made nearly 95 percent of their expenditures locally, while farms with gross incomes above $900,000 spent less than 20 percent locally. 24 Vertically integrated operations, meanwhile, have no need of local businesses, as they produce everything under their own corporate umbrella.
Numerous studies in the last 50 years show that the consolidation and industrialization of agriculture operations in rural communities has resulted in lower incomes, greater income inequality and poverty, declining Main Streets and fewer stores. 25 A North Dakota metastudy found “detrimental effects of industrialized farming on many indicators of community quality of life, particularly those involving the social fabric of communities.” 26 As local businesses close, institutions like schools, churches and hospitals close as well, and people become more isolated.
Many of the businesses that do still seek to locate in rural areas, including CAFOs and meatpacking plants, do not deliver on promised jobs and further exploit already vulnerable communities. In North Carolina, for example, the hog industry has a long history of siting tremendously polluting CAFOs in poor communities of color. 27 Similarly, the meat industry relies on immigrants, many of them undocumented, to fill the dangerous and low-paid jobs in packing plants. 28 With few legal protections and isolated in mostly-white rural regions, these workers are at the mercy of their employer.
Both of these kinds of operations also produce terrible odors, cause tremendous wear on rural roads and can contaminate the water and air. These facilities often receive large tax breaks, which means that the town does not have the revenue stream to maintain the roads 29 or provide services for the new population of immigrant workers. 30 Adding insult to injury, these impacts can also lead to lowered property values and lost tourism revenue. 31
Today, 16 percent of people in rural areas live below the poverty line, compared to 13 percent in urban areas. Rural jobs have not returned following the Great Recession. Most of the counties with the highest participation in the Supplemental Nutrition Assistance Program (formerly called food stamps) are rural. 32 So are most of the so-called food deserts 33, as the number of rural grocery stores continues to decline. 34 More time and money are required for long drives to the supermarket – or to see a doctor, go to the bank or take care of any number of basic necessities that are no longer located “in town.” Rural areas, particularly those that are majority white, are also suffering from health crises, including high rates of obesity 35 and rising death rates. 36 The opioid epidemic has devastated rural areas; by just one measure, from 1999 to 2015, rural opioid death rates quadrupled among 18 to 25-year-olds. 37 There are many causes at play, but researchers point in part to the decline of jobs, economic stress and related anxieties of living in a decimated community. 38
https://www.ers.usda.gov/amber-waves/2017/march/large-family-farms-continue-to-dominate-us-agricultural-production/. Based on: Hoppe, Robert, and James MacDonald. (2016) America’s Diverse Family Farms, 2016 Edition. USDA ERS. Retrieved from https://www.ers.usda.gov/publications/pub-details/?pubid=81401