
Looking at the scorecard at the end of the report.

Brad Haire, Executive Editor

April 19, 2018

3 Min Read

The USDA Prospective Planting report was released March 29. Did it seem about right to you?

The good folks at USDA give guidance on the report’s methodology, or how the agency collected, analyzed and reported the data, including a record of the report’s reliability over the years compared with the final estimate report, which USDA publishes in mid-summer.

Brief background: The estimates for this year’s prospective planting report were based on surveys conducted during the first two weeks of March from a sample of about 83,000 farm operators, drawn from a list designed to ensure all operations in the United States have a chance to be selected, the report says.

Each regional field office submits its analysis of the current situation to the Agricultural Statistics Board. The survey data is compiled to the national level and reviewed independently of each state's review, the report says. Acreage estimates were based on survey data and the historical relationship of official estimates to the survey data.

Okay. So, how do prospective planting reports typically compare to the final estimate reports?

The scorecard at the end of the report, which takes into account a 20-year record for some crops, shows the report has a pretty good batting average when swinging at large-acreage commodities.

On corn, for example, chances are 9 out of 10, or what the report refers to as a 90-percent confidence level, that the difference between the prospective report and the final acreage report will not exceed 2.4 percent, high or low.
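To put that percentage in acreage terms, here’s a minimal sketch of the arithmetic in Python. The prospective corn figure of 88 million acres is purely hypothetical, chosen for illustration and not taken from the report.

```python
# A rough sketch of what the 90-percent confidence level implies in acres.
# The prospective corn figure below is hypothetical and used only to
# illustrate the arithmetic; it is not taken from the USDA report.

def acreage_band(prospective_acres: float, pct_band: float) -> tuple[float, float]:
    """Return the low and high ends of the band around a prospective estimate."""
    swing = prospective_acres * pct_band / 100.0
    return prospective_acres - swing, prospective_acres + swing

# Suppose the prospective report pegged corn at 88 million acres (illustrative).
low, high = acreage_band(88_000_000, 2.4)
print(f"9 times out of 10, the final figure should land between {low:,.0f} and {high:,.0f} acres")
```

On a figure that size, a 2.4 percent swing works out to roughly 2 million acres either way.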

For corn, changes between the prospective planting report estimates and the final acreage report over the past 20 years have averaged 941,000 acres, ranging from a low 32,000-acre difference one year (the report doesn’t specify which year) to a high 3.07-million-acre difference another year. For corn, the prospective plantings estimates have been below the final estimate nine times and above it 11 times. On soybeans, the report bats about the same as it does on corn.

For upland cotton, a lower-acreage crop, the report’s batting average is a bit more erratic. The 90-percent confidence level for cotton is 11 percent. Changes between the prospective planting report estimates and the final acreage report over the past 20 years for cotton have averaged 605,000 acres, ranging from a low 6,000-acre difference one year (again, no year specified) to a high 2.1-million-acre difference another year. For cotton, the prospective planting estimates have been below the final estimate 13 times and above it seven times.

Acreage data and ag statistics are a good and needed service for U.S. agriculture. Though the agency has had a few missteps, including a rather large one a few years ago for peanuts, the USDA does this service well and to a level no other organization or agency can reach.

There are published academic studies that have delved into the reliability of the USDA planting reports deeper and better than I can here, and I admit the statistical mechanics of the planting report made my head hurt a bit.

I am reminded of a joke: There are three kinds of farm reporters — those who are good at statistics and those who are not.

Good luck. Take care, and thanks for reading.
