ERS Charts of Note

Since 2000, U.S. cotton producers have increasingly used genetically engineered (GE) seeds with stacked traits

Wednesday, December 16, 2020

Genetically engineered (GE) crops are broadly classified as herbicide-tolerant (HT), insect-resistant (Bt), or “stacked” varieties that combine HT and Bt traits. HT crops can tolerate one or more herbicides and provide farmers with a broad variety of options for effective weed control by targeting weeds without damaging crops. Bt crops contain genes from the soil bacterium Bacillus thuringiensis and provide effective control of insect pests, such as the tobacco budworm and pink bollworm. GE varieties of cotton were commercially introduced in the United States in 1995. GE seeds have accounted for the majority of cotton acres since 2000, expanding from 61 percent of acreage that year to 96 percent in 2020. During this time, the share of cotton acres planted with seeds that had the individual HT or Bt traits shrank as growers turned more often to stacked varieties that carried both traits. In 2000, about 26 percent of total cotton acres were HT only, 15 percent were Bt only, and 20 percent used stacked seeds. By 2020, 8 percent of acres were HT only, 5 percent were Bt only, and 83 percent used stacked seeds. This chart appears in the December 2020 Amber Waves article, “Use of Genetically Engineered Cotton Has Shifted Toward Stacked Seed Traits.”

Disabilities remain a strong risk factor for food insecurity

Monday, December 14, 2020

The 1990 Americans with Disabilities Act was intended to provide opportunities to thrive for those with disabilities. But for some people with disabilities, barriers still exist to being able to afford adequate food. In 2019, 10.5 percent of all U.S. households were food insecure—that is, they struggled to put enough food on the table for all their household members. Adults who report being unable to work because of a disability have a high prevalence of food insecurity. Among U.S. households with an adult who was not in the labor force because of disability, 31.6 percent were food insecure in 2019. Among U.S. households that included an adult who reported a disability but was not kept out of the labor force by it, 22.6 percent were food insecure. In contrast, 7.6 percent of households containing no adults with disabilities were food insecure in 2019. Households with adults out of the labor force because of disabilities were almost four times as likely as U.S. households overall to experience the more severe condition of very low food security. Households with very low food security report cutting or skipping meals and not eating enough because there was not enough money for food. Households with low food security report avoiding substantial reductions or disruptions in food intake, in some cases by relying on a few basic foods. This chart appears in the Economic Research Service’s Amber Waves article, “Thirty Years After Enactment of the Americans with Disabilities Act, Disabilities Remain a Risk Factor for Food Insecurity,” December 2020.

Average share of income spent on food at home in the U.S. has fallen over time, but less sharply over the last two decades

Friday, December 11, 2020

In 1960, U.S. consumers spent an average of 17.0 percent of disposable personal income (DPI) on food. By 2019, this share had shrunk to 9.5 percent. This decrease was driven by a decline in the share of income people spent on food at home. The share of DPI spent on food purchased at supermarkets, supercenters, convenience stores, and other retailers fell from 13.7 percent in 1960 to 5.7 percent in 2000. Over the same period, the share of DPI spent on food purchased from restaurants, fast-food places, schools, and other away-from-home eating places rose from 3.3 percent to 4.2 percent. The declining share of income spent on food at home reflects, in part, efficiencies in the U.S. food system (which kept inflation for food-at-home prices generally low) and rising disposable incomes. A slower decline in share of income spent on food at home after 2000 could reflect U.S. consumers opting to prepare more meals at home and purchasing more expensive grocery store options than they did in earlier decades. This chart appears in “Average Share of Income Spent on Food in the United States Remained Relatively Steady From 2000 to 2019,” in the Economic Research Service’s Amber Waves magazine, November 2020.
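
The shares above are simple ratios of food spending to disposable personal income. A minimal sketch (with DPI indexed to 100 so the illustrative spending values equal the 1960 shares cited above) of how the at-home and away-from-home components sum to the total:

```python
# Sketch: food spending as a share of disposable personal income (DPI).
# DPI is indexed to 100 here, so the spending values equal the 1960 shares
# cited in the text; the figures are illustrative, not ERS source data.

def share_of_dpi(spending: float, dpi: float) -> float:
    """Spending as a percentage of disposable personal income."""
    return 100 * spending / dpi

dpi = 100.0
food_at_home = 13.7   # share of DPI spent on food at home, 1960
food_away = 3.3       # share of DPI spent on food away from home, 1960

print(f"Total food share of DPI, 1960: {share_of_dpi(food_at_home + food_away, dpi):.1f}%")
# -> 17.0, matching the total share cited for 1960
```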

Smaller gatherings may have meant smaller turkeys on the Thanksgiving table

Wednesday, December 9, 2020

Errata: On December 11, 2020, text was updated to distinguish that preliminary data was used in the chart and that official November turkey weight data is forthcoming.

Expectations that people will gather in smaller groups for the holidays this year may have resulted in increased demand for smaller turkeys for Thanksgiving tables. However, the industry may not have been able to fulfill that change in consumer demand in time for the food-centered holiday. For turkeys to be grown in time for Thanksgiving, they must have been hatched in August or earlier. As a result, the reaction to changing demand can be delayed. Stocks of frozen turkeys build up all year long in preparation for the seasonal spike in demand, so many of the birds that were for sale in grocery stores around Thanksgiving were processed even earlier in the year. Even so, preliminary data in the weeks that led up to Thanksgiving suggested that the average live weight of turkey hens processed was below last year’s average. Birds classified as turkey hens are those that weigh less than 16 pounds raw or less than about 20 pounds live. Official data for November turkey weights will be released by USDA’s National Agricultural Statistics Service in late December. This chart is drawn from the Economic Research Service’s Livestock, Dairy, and Poultry Outlook, November 2020.

Disparities in educational attainment by race and ethnicity persist in rural America

Monday, December 7, 2020

Higher educational attainment is associated with higher median earnings, higher employment rates, and greater workforce opportunity. Among all rural residents who are 25 years old or older, the percentage of those who had completed a bachelor’s degree or higher rose from 15 percent in 2000 to 20 percent in 2018. In addition, the share of the rural population 25 or older without a high school degree or equivalent dropped from 24 percent in 2000 to 13 percent in 2018. Even so, ethnic and racial disparities persist in education. Rural Hispanics continued to have the highest share (35 percent) without a high school degree, despite significant gains in high school and higher educational attainment rates between 2000 and 2018. Over the same period, Blacks or African Americans had the largest decrease (20 percentage points) in the share of rural individuals without a high school degree. This change eliminated the gap between the shares of Blacks or African Americans and Whites who had graduated from high school but had not completed a bachelor’s degree. Nevertheless, the share of rural Blacks or African Americans without a high school degree remained nearly double that of Whites in 2018. This chart appears in the November 2020 Amber Waves finding, “Racial and Ethnic Disparities in Educational Attainment Persist in Rural America.”

Farm share of retail price for flour fairly stable since recovery from 2016-17 low point

Friday, December 4, 2020

The farm share of the retail price of all-purpose white flour—the ratio of the price farmers receive for wheat to the price consumers pay for flour in grocery stores—averaged 13 to 14 percent in 2016 and 2017 before reaching 19 and 20 percent in 2018 and 2019. Beginning in the latter half of 2017, farm prices rose in connection with lower-than-expected U.S. wheat production: in mid-2017, as the 2017/18 U.S. wheat crop matured, dry conditions in the Northern Plains trimmed wheat production prospects, and domestic wheat prices rose in response despite abundant global wheat supplies. Ultimately, U.S. wheat farmers received an average price of $4.72 per bushel for their 2017/18 crop, up from $3.89 in the previous year. The farm value of the amount of hard red winter wheat needed to make one pound of all-purpose white flour increased from 7 cents in 2017 to 9 cents in 2018 before falling back to 8 cents in 2019. The average retail price of flour fell from 51 cents per pound in 2017 to 46 cents in 2018 and 44 cents in 2019. Costs for milling, packaging, transporting, and retailing also affect what consumers pay for flour at grocery stores. The Economic Research Service (ERS) forecasts higher farm prices for wheat during the 2020/21 marketing season and moderately higher retail prices for cereal and bakery goods, a category of products that includes all-purpose white flour, through 2020 and 2021. This chart is based on the Price Spreads from Farm to Consumer data product on the ERS website.
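
As a rough illustration of the farm-share calculation described above, the sketch below divides the farm value of wheat per pound of flour by the retail flour price, using the rounded cents-per-pound figures cited in the text (so the results only approximate the published 13-14 and 19-20 percent shares):

```python
# Sketch: farm share of the retail flour price =
#   (farm value of wheat needed for 1 lb of flour) / (retail price of 1 lb of flour).
# Values are the rounded cents-per-pound figures cited in the text.

farm_value_cents = {2017: 7, 2018: 9, 2019: 8}        # farm value per pound of flour
retail_price_cents = {2017: 51, 2018: 46, 2019: 44}   # retail flour price per pound

for year in sorted(farm_value_cents):
    share = 100 * farm_value_cents[year] / retail_price_cents[year]
    print(f"{year}: farm share ~ {share:.0f} percent")
```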

Farm sector profits forecast to rise in 2020

Wednesday, December 2, 2020

Inflation-adjusted U.S. net cash farm income (NCFI)—gross cash income minus cash expenses—is forecast to increase $23.4 billion (21.1 percent) to $134.1 billion in 2020. U.S. net farm income (NFI), a broader measure of farm sector profitability that incorporates noncash items including changes in inventories, economic depreciation, and gross imputed rental income, is forecast to increase $35.0 billion (41.3 percent) from 2019 to $119.6 billion in 2020. While cash receipts for farm commodities are forecast to fall $7.8 billion (2.1 percent), direct Government payments are expected to rise to $46.5 billion, more than twice the 2019 amount, a result of supplemental and ad hoc disaster assistance payments for COVID-19 relief in 2020. Total production expenses, which are subtracted out in the calculation of net income, are projected to fall $9.5 billion (2.7 percent) in 2020, including a drop of $5.6 billion in interest expenses. If forecasts are realized, NFI in 2020 would be at its highest level since 2013 and 32.0 percent above its inflation-adjusted average calculated over the 2000-19 period. NCFI would be at its highest level since 2014 and 22.5 percent above its 2000-19 average. Find additional information and analysis on the USDA, Economic Research Service Farm Sector Income and Finances topic page, reflecting data released December 2, 2020.
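
A simplified sketch of the two profitability measures as defined above; the dollar amounts are hypothetical placeholders, and the actual ERS accounts include further adjustments beyond the noncash items listed here:

```python
# Sketch: net cash farm income (NCFI) vs. net farm income (NFI), following the
# definitions in the text. All amounts are hypothetical (billions of dollars);
# the real ERS accounts include additional adjustments.

def net_cash_farm_income(gross_cash_income: float, cash_expenses: float) -> float:
    """NCFI = gross cash income minus cash expenses."""
    return gross_cash_income - cash_expenses

def net_farm_income(ncfi: float, inventory_change: float,
                    economic_depreciation: float, imputed_rental_income: float) -> float:
    """NFI broadens NCFI with noncash items: inventory changes,
    economic depreciation (a cost), and gross imputed rental income."""
    return ncfi + inventory_change - economic_depreciation + imputed_rental_income

ncfi = net_cash_farm_income(gross_cash_income=420.0, cash_expenses=290.0)
nfi = net_farm_income(ncfi, inventory_change=2.0,
                      economic_depreciation=35.0, imputed_rental_income=15.0)
print(f"NCFI: ${ncfi:.1f} billion   NFI: ${nfi:.1f} billion")
```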

India, a major rice and wheat producer, sharply expands subsidized distribution of food grains in response to pandemic

Monday, November 30, 2020

As part of its response to the economic impacts of the COVID-19 pandemic, India has sharply increased its distribution of wheat and rice to the 800 million Indian citizens (about 58 percent of the population) eligible to receive subsidized rations. Facing major shocks to employment and incomes associated with nationwide measures to control the virus, India announced a relief program in March 2020 worth $22.3 billion. The program, now extended through November 2020, supplements the highly subsidized, standard monthly ration of 5 kilograms per person of wheat or rice with an additional free allocation of 5 kilograms. Implementation of the program led to a 75-percent increase in India’s total wheat and rice distribution from April to September compared with earlier years, with the average monthly distribution of rice more than doubling. India is a major global holder of food security stocks of both rice and wheat, as well as the world’s largest rice exporter—with 2021 exports forecast at 12.5 million tons. While India is currently forecast to maintain large surpluses of wheat and rice in government stocks during the October 2020-September 2021 marketing year, the sharp increase in subsidized domestic distribution has the potential to substantially reduce those food security stocks if the COVID-19 relief program is extended beyond November 2020 into the 2020-21 marketing year. This chart was drawn from the Economic Research Service’s Rice and Wheat Outlooks, November 2020.

Real wages for hired U.S. farmworkers rose between 1990 and 2019

Friday, November 27, 2020

Hired farmworkers make up less than 1 percent of all U.S. wage and salary workers, but they play an essential role in labor-intensive industries within U.S. agriculture, such as the production of fruits, vegetables, melons, dairy, and nursery and greenhouse crops. Farm wages have risen over time for nonsupervisory crop and livestock workers (excluding contract labor). According to data from the USDA’s Farm Labor Survey, real (inflation-adjusted) wages rose at an average annual rate of 1.1 percent between 1990 and 2019. In the past 5 years, real farm wages grew even faster at an average annual rate of 2.8 percent. This is consistent with growers’ reports that the longstanding supply of workers from Mexico has decreased, as growers may respond over time by raising wages to attract workers from other sources. The gap between farm and nonfarm wages has slowly shrunk but is still substantial. In 1990, the average wage for nonsupervisory farmworkers—$9.80 an hour in 2019 dollars—was about half the $19.40 wage of private-sector nonsupervisory workers in the nonfarm economy. By 2019, the $13.99 farm wage was 60 percent of the $23.51 nonfarm wage. This chart appears in the October 2020 Amber Waves data feature, “U.S. Farm Employers Respond to Labor Market Changes With Higher Wages, Use of Visa Program, and More Women Workers.”
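
The average annual rates cited above are compound growth rates. A small sketch of the calculation, using the 1990 and 2019 real wage figures from the text (the rounded inputs give roughly 1.2 percent per year, close to the published 1.1 percent):

```python
# Sketch: compound average annual growth rate of real farm wages, using the
# 1990 and 2019 hourly wages (in 2019 dollars) cited in the text.

def avg_annual_growth(start: float, end: float, years: int) -> float:
    """Compound average growth rate, in percent per year."""
    return 100 * ((end / start) ** (1 / years) - 1)

farm_1990, farm_2019 = 9.80, 13.99   # real farm wage, dollars per hour
nonfarm_2019 = 23.51                 # real nonfarm wage, dollars per hour

growth = avg_annual_growth(farm_1990, farm_2019, years=2019 - 1990)
print(f"Average annual real farm wage growth, 1990-2019: {growth:.1f} percent")
print(f"Farm wage as a share of nonfarm wage, 2019: {100 * farm_2019 / nonfarm_2019:.0f} percent")
```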

U.S. apple production is forecast to decline 3 percent in 2020

Wednesday, November 25, 2020

The holidays may look a little different this year, but apple pie is likely still on the table. The U.S. apple crop in the 2019/20 season (August to July) was the sixth largest on record, with total production reaching 11 billion pounds. Total utilized production, the portion of the crop that is actually harvested and sold, was 10.6 billion pounds. In June 2020, apples in storage hit a record high of 47.9 million bushels, up 24 percent from the same time last year. Domestic supplies were boosted further by a decrease in U.S. exports stemming from high tariffs imposed by India and China. This overall increase in supplies led to lower grower prices in 2019/20. For the 2020/21 season, total production is projected to be 3 percent lower than the prior season (total utilized production is estimated at 10.3 billion pounds), with production in Washington, the Nation’s leading apple producer, similarly down 3 percent. The quantity of apples harvested may be even lower because of labor supply uncertainties surrounding the pandemic. Apples for the processed market are also expected to decrease, notably in Pennsylvania (17 percent) and Virginia (16 percent). Bagged apples, however, are currently in demand and are likely to stay in demand for apple pie season. This chart is based on the ERS Fruit and Tree Nuts Outlook, released September 2020.

Popularity of sweet potatoes, a Thanksgiving staple, continues to grow

Tuesday, November 24, 2020

If sweet potatoes are on your Thanksgiving menu this year, you are not alone. According to the Economic Research Service’s (ERS) food availability data, supplies of sweet potatoes available for U.S. consumers to eat averaged 7.2 pounds per capita per year in 2017-19, up from an average 3.9 pounds in 1997-99. Availability is calculated by adding domestic production, initial inventories, and imports, then subtracting exports and end-of-year inventories. These national supplies are then divided by the U.S. population to estimate per capita availability. Consumer interest in nutrition, along with food companies expanding their sweet potato-based offerings, such as sweet potato fries, may have contributed to the rise in sweet potato availability. Sweet potatoes are high in vitamin A and vitamin C. A cup of boiled sweet potatoes without the skin (and without any added fats or marshmallow toppings) contains 249 calories and provides 287 percent of the daily recommended amount of vitamin A, 47 percent of vitamin C, and 29 percent of dietary fiber for a 2,000-calorie-per-day diet. This chart uses data from ERS’s Food Availability (Per Capita) Data System.
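
The availability calculation described above maps directly to a simple formula; a minimal sketch with hypothetical quantities (not ERS source data) chosen to land near the cited 7.2 pounds per person:

```python
# Sketch: per capita availability as described in the text:
#   (production + beginning inventories + imports - exports - ending inventories)
#   divided by the U.S. population.
# Quantities below are hypothetical placeholders, in millions of pounds.

def per_capita_availability(production, beginning_stocks, imports,
                            exports, ending_stocks, population_millions):
    """National supply divided by population, in pounds per person."""
    supply = production + beginning_stocks + imports - exports - ending_stocks
    return supply / population_millions

pounds = per_capita_availability(production=3_100, beginning_stocks=150, imports=60,
                                 exports=700, ending_stocks=160, population_millions=330)
print(f"Sweet potato availability: {pounds:.1f} pounds per person")
```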

Illinois, home of two processing plants, leads U.S. pumpkin production

Monday, November 23, 2020

Pumpkins are in high demand in America during the fall and winter holidays, whether used as decoration or as a key ingredient in various desserts. There are two broad categories of pumpkin to fit the two main uses: Halloween pumpkins (also known as ornamental pumpkins) and processing pumpkins (used for food). Illinois leads the country in pumpkin production overall, growing roughly three to four times more pumpkins than any other State, depending on the year. Much of this is driven by the State’s dominance of the processing pumpkin market. About three-fourths of the country’s processing pumpkin acres are grown in Illinois, where the crop is canned at two major canning facilities. Almost 80 percent of Illinois’ pumpkin production goes to processing, while no other top-producing State devotes more than 5 percent of its pumpkin production to processing. Further information on pumpkins can be found on the Economic Research Service Trending Topics page on Pumpkins: Background & Statistics.

Despite locust outbreak, grain production in most acutely affected African countries set to be highest on record

Friday, November 20, 2020

Several countries in the “Horn” region of Africa are facing the brunt of what the U.N. Food and Agriculture Organization (FAO) describes as the “worst desert locust crisis in 25 years.” Paradoxically, grain production in those countries is forecast to hit record volumes. The current desert locust outbreak originated in mid-2018 when successive rain events in the arid Arabian Desert spurred vegetation development. That vegetation, in turn, provided ample food for the burgeoning locust population. Trade winds blew the pests into Africa in early 2019, where the locusts settled into the low-elevation arid to semi-arid grasslands. Regionally abundant rainfall through the end of 2019 and into 2020 supported vegetation growth, which once again aided the expansion of locust swarms. However, the locusts primarily remained in low-elevation grasslands, largely avoiding the higher-elevation grain production zones. Further, the rainfall that expanded the locusts’ food supply also helped increase yields for agricultural crops, such as corn, barley, sorghum, and wheat. Ultimately—and despite a significant locust infestation—grain production in this region is forecast not only to exceed 2019 levels but also to reach the highest level on record. This situation mirrors that of the less severe locust infestation of 2003-05, when aggregate grain production rose even at the height of the outbreak. This chart is drawn from material included in the Economic Research Service’s Wheat and Feed Outlook reports from August 2020, and has been updated with November data.

One-third of U.S. counties in 2018 had one or more farmers markets that accepted SNAP benefits

Wednesday, November 18, 2020

Farmers markets are great sources of fresh fruits, vegetables, and other healthy foods. USDA has expressed a commitment to increasing access to these foods for low-income households participating in the Supplemental Nutrition Assistance Program (SNAP). As with retail food stores, farmers markets must be authorized by USDA to accept SNAP benefits. Data from USDA's Agricultural Marketing Service show that 72 percent of U.S. counties reported having at least one farmers market in 2018. Of those counties, 45 percent—32 percent of all 3,143 U.S. counties—reported having one or more farmers markets that accepted SNAP benefits. The number of farmers markets in a county that report accepting SNAP benefits is one of the updated statistics in the Economic Research Service’s (ERS) Food Environment Atlas. The Atlas assembles statistics on more than 280 food environment indicators at the county or State level that can influence food choices and diet quality. According to the Atlas, 1,015 counties had one or more farmers markets that accepted SNAP benefits as a form of payment, and 49 counties had 10 or more farmers markets that accepted SNAP benefits. The data for this map can be found in ERS’s Food Environment Atlas, updated September 2020.
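
A quick arithmetic check of the county shares cited above, using the counts from the text:

```python
# Sketch: reconciling the county shares cited in the text.
total_counties = 3143
counties_with_market = round(0.72 * total_counties)    # 72 percent had at least one market
counties_with_snap_market = 1015                        # counties with a SNAP-accepting market

print(f"{100 * counties_with_snap_market / counties_with_market:.0f} percent of counties with a market")
print(f"{100 * counties_with_snap_market / total_counties:.0f} percent of all U.S. counties")
# -> about 45 percent and 32 percent, matching the shares cited above
```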

Climatic trends dampened productivity growth on Wisconsin dairy farms

Monday, November 16, 2020

In 2019, Wisconsin’s production of fluid milk was second only to California’s. According to data from USDA’s National Agricultural Statistics Service, Wisconsin generated 30.6 billion pounds of milk that year, with milk sales totaling $5 billion. In recent years, Wisconsin dairy farms have been exposed to substantial weather volatility characterized by frequent droughts, storms, and temperature extremes (both hot and cold). This has resulted in considerable fluctuations in dairy productivity. Researchers from the Economic Research Service (ERS), among others, found that total factor productivity (TFP), which measures the rate of growth in total output (aggregate milk produced) relative to the rate of growth in total inputs (such as the number of cows, farm labor, feed, and machinery), increased at an average annual rate of 2.16 percent for Wisconsin dairy farms between 1996 and 2012. This increase was driven primarily by technological progress, such as improved herd genetics, advanced feed formulations, and improvements in milking and feed-handling equipment, which together contributed 1.91 percent per year. However, trends in rainfall and temperature variation were responsible for a 0.32 percent annual decline in the productivity of Wisconsin dairy farms during the same period. For example, an average increase in temperature of 1.5 degrees Fahrenheit reduced milk output for the average Wisconsin dairy farm by 20.1 metric tons per year. This is equivalent to reducing the herd size of the average farm by 1.6 cows every year. This chart appears in ERS’s October 2020 Amber Waves finding, “Climatic Trends Dampened Recent Productivity Growth on Wisconsin Dairy Farms.”
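
TFP growth, as defined above, is output growth minus input growth. The sketch below illustrates that definition with hypothetical growth rates and then lists the components cited above; the residual term is illustrative, included only so the pieces sum to the reported 2.16 percent:

```python
# Sketch: total factor productivity (TFP) growth as defined in the text --
# the growth rate of total output minus the growth rate of total inputs.

def tfp_growth(output_growth_pct: float, input_growth_pct: float) -> float:
    """TFP growth (percent per year) = output growth minus input growth."""
    return output_growth_pct - input_growth_pct

# Hypothetical growth rates chosen to reproduce the cited 2.16 percent per year.
print(f"TFP growth: {tfp_growth(output_growth_pct=4.00, input_growth_pct=1.84):.2f} percent per year")

# Components cited in the text; the residual stands in for all other factors.
technical_change = 1.91    # herd genetics, feed formulations, equipment
climate_effect = -0.32     # rainfall and temperature trends
residual = 2.16 - technical_change - climate_effect
print(f"Residual from other factors: {residual:.2f} percent per year")
```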

Larger corn and soybean farms used more futures, options, and marketing contracts in 2016

Friday, November 13, 2020

U.S. farmers can use a variety of market tools to manage risks. With a futures contract, the farmer can assure a certain price for a crop that has not yet been harvested. An option contract allows the farmer to protect against decreases in the futures price, while retaining the opportunity to take advantage of increases in the futures price. Futures and options usually do not result in actual delivery of the commodity, because most participants reach final financial settlements with each other when the contracts expire. In a marketing contract, by contrast, a farmer agrees to deliver a specified quantity of the commodity to a specified buyer during a specified time window. Corn and soybean farms account for most farm use of each of these contracts, and larger operations are more likely to use them than smaller ones. With more production, larger farms have more revenue at risk from price fluctuations, and therefore a greater incentive to learn about and manage price risks. Fewer than 5 percent of small corn and soybean producers used futures contracts, compared with 27 percent of large producers. Less than 1 percent of small corn and soybean producers used options, compared with 13 percent of large producers. And about 19 percent of small corn and soybean producers used marketing contracts, compared with 58 percent of large producers. This chart is based on data found in the Economic Research Service report, Farm Use of Futures, Options, and Marketing Contracts, published October 2020. It also appears in the November 2020 Amber Waves feature, “Corn and Soybean Farmers Combine Futures, Options, and Marketing Contracts to Manage Financial Risks.”
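
As a simplified illustration of how a futures contract can lock in a price, the sketch below computes revenue for a hypothetical short hedge; it ignores basis risk, margin, and transaction costs (assumptions noted in the comments) and is not an ERS calculation:

```python
# Sketch: revenue for a farmer who sells (shorts) futures against an unharvested
# crop. Simplifying assumptions: futures and cash prices converge at harvest, and
# basis, margin, and transaction costs are ignored. Figures are hypothetical.

def hedged_revenue(bushels: float, futures_price_at_hedge: float,
                   price_at_harvest: float) -> float:
    """Cash sale at harvest plus the gain or loss on the short futures position."""
    cash_sale = bushels * price_at_harvest
    futures_gain = bushels * (futures_price_at_hedge - price_at_harvest)
    return cash_sale + futures_gain   # equals bushels * futures_price_at_hedge

for harvest_price in (3.20, 4.00, 4.80):
    revenue = hedged_revenue(bushels=10_000, futures_price_at_hedge=4.00,
                             price_at_harvest=harvest_price)
    print(f"harvest price ${harvest_price:.2f}: revenue ${revenue:,.0f}")
# Revenue stays at $40,000 whether prices fall or rise -- the hedge locks in the price.
```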

European Union’s Farm to Fork initiative to reduce use of agricultural inputs may increase food prices and further global food insecurity

Thursday, November 12, 2020

Researchers at USDA’s Economic Research Service (ERS) recently evaluated the potential impacts of the European Commission’s (EC) Farm to Fork and Biodiversity Strategies initiative, which calls for restrictions in the use of agricultural inputs such as land, antimicrobials, fertilizers, and pesticides in European Union (EU) agricultural production. The proposal pledges to use EC trade policies and other international efforts to promote a vision of sustainability in agriculture, suggesting intentions to extend the reach of the policy beyond the EU. A mandated reduction in these inputs could affect food prices in three ways: production costs could increase as farmers substitute labor for other inputs; production could decrease as a result of fewer inputs being used; and prices on the international market could increase due to tightening of available supplies. Depending on how broadly these measures to reduce the use of agricultural inputs were adopted globally, U.S. food prices could rise by 1 to 62 percent, and worldwide food prices could grow by 9 to 89 percent. These rising costs could affect consumer budgets and ultimately reduce worldwide gross domestic product (GDP) by $94 billion to $1.1 trillion, and consequently increase the number of food-insecure people in the world’s most vulnerable regions by 22 million to 185 million. This chart is drawn from the ERS report, Economic and Food Security Impacts of Agricultural Input Reduction Under the European Union Green Deal’s Farm to Fork and Biodiversity Strategies.

U.S. households in the lowest income quintile spent an average of 36 percent of income on food in 2019

Tuesday, November 10, 2020

Households spend more money on food as their incomes rise, but the amount spent represents a smaller share of their overall budgets. In 2019, households in the lowest income quintile, with an average 2019 after-tax income of $12,236, spent an average of $4,400 on food (about $85 a week). Households in the highest income quintile, with an average 2019 after-tax income of $174,777, spent an average of $13,987 on food (about $269 a week). The three-fold increase in spending between the lowest and highest income quintiles is not the result of a three-fold increase in consumption, however. Rather, as people gain more disposable income, they often shift to more expensive food options, including dining out. Even with this shift, as income increases, the percent of income spent on food goes down. In 2019, food spending represented 36.0 percent of the lowest quintile’s income, 14.1 percent of income for the middle quintile, and 8.0 percent of income for the highest quintile. The statistics in this chart predate the coronavirus pandemic and its impacts. This chart appears in the Food Prices and Spending section of the Economic Research Service’s Ag and Food Statistics: Charting the Essentials data product.
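
A quick check of the shares and weekly amounts cited above, using the dollar figures from the text:

```python
# Sketch: food spending as a share of after-tax income, lowest vs. highest quintile,
# using the 2019 dollar amounts cited in the text.

quintiles = {
    "lowest":  {"income": 12_236,  "food": 4_400},
    "highest": {"income": 174_777, "food": 13_987},
}

for name, q in quintiles.items():
    share = 100 * q["food"] / q["income"]
    weekly = q["food"] / 52
    print(f"{name} quintile: {share:.1f} percent of income, about ${weekly:.0f} a week")
# -> roughly 36.0 percent / $85 and 8.0 percent / $269, as cited above
```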

Economic recovery, competition shape projections of U.S. farm prices to 2030

Monday, November 9, 2020

USDA projections for changes in nominal (not adjusted for inflation) U.S. farm prices between 2020 and 2030 indicate a mixed outlook shaped by the expected recovery in U.S. and global demand, continued export competition, and market conditions during 2020. For crops, the strongest gains are projected for wheat and cotton. Wheat prices are projected to rise as domestic and export demand begin to outpace domestic production, while higher cotton prices are driven by a projected recovery in export demand. Modest changes in prices for U.S. corn and soybeans from current levels reflect the relatively steady demand for these products during 2020, together with the moderating influences of productivity gains and continued export competition. Among livestock products, farm prices of hogs, broilers, and eggs are projected higher by 2030, as economic recovery restores growth in domestic and export demand. U.S. beef cattle prices are expected to rise during the early years of the 10-year projection period, before declining somewhat as the multi-year cattle cycle and a longer-term trend of sluggish demand growth turn prices downward. The projections are based on an assumed long-term macroeconomic outlook that includes a recovery in income growth—beginning in 2021—from the declines that have occurred in most economies during 2020. The outlook for the U.S. economy, and for many important U.S. agricultural markets and competitors, however, remains uncertain. This chart is based on projections prepared by the USDA Interagency Projections Committee using data available as of October 9, 2020, and released by the Office of the Chief Economist on November 6, 2020. Updates are shown in the Economic Research Service Agricultural Baseline Database.

WIC participation fell by 30 percent between fiscal years 2010 and 2019

Friday, November 6, 2020

USDA’s Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) helps to safeguard the health of low-income pregnant, breastfeeding, and postpartum women, as well as infants and children up to age 5 who are at nutritional risk, by providing supplemental foods, nutrition education, and health care referrals at no cost to participants. On average, 6.4 million people per month participated in the program in fiscal year 2019, 7 percent fewer than in the previous fiscal year and a 30-percent drop from the program’s historical high of 9.2 million participants in fiscal year 2010. The number of WIC participants in each category—women, infants, and children—fell by 6-7 percent between fiscal years 2018 and 2019. This marked the ninth consecutive year that participation fell for all three groups. Declining U.S. births and improving economic conditions have likely played a role in the falling WIC caseloads. In fiscal year 2019, children 1-4 years of age made up 51 percent of all participants, while infants constituted 25 percent and women constituted 24 percent. The data in this chart predate the COVID-19 pandemic and its impact on WIC participation. This chart appears in the Economic Research Service publication, The Food Assistance Landscape: Fiscal Year 2019 Annual Report, July 2020.
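
A quick check of the decline from the program's historical high, using the participation figures cited above:

```python
# Sketch: percent decline in average monthly WIC participation from the FY2010 peak,
# using the figures cited in the text (millions of participants per month).

fy2010_peak = 9.2
fy2019 = 6.4

decline = 100 * (fy2010_peak - fy2019) / fy2010_peak
print(f"Decline from FY2010 peak: {decline:.0f} percent")   # about 30 percent
```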
