A participant at Purdue University’s recent digital roundtable asked a question during a break that no one could answer. Considering the discussion included well-respected Extension leaders and agricultural economists, that’s saying something.
The question seemed simple: How many farmers retrieve data they collect through their yield monitors and use it to make decisions?
The reason for asking was obvious. The digital roundtable was devoted to showing what new technology can do. Much of it centered on measuring things we couldn’t measure before and turning such things as aerial images from drones into more than pretty pictures.
Yield monitors have been around for over 25 years. Some believe yield monitors ushered in the age of precision agriculture. But that’s now transforming into the age of digital agriculture. How many farmers are taking advantage of the information yield monitors collect? If they’re not, what opportunities will they miss as digital agriculture and the internet of things evolve?
Back up the bus
Going back to the original question — how many farmers retrieve yield monitor data — there doesn’t appear to be very solid data pointing toward an answer. One ag economist recalled a survey showing that while most farmers have yield monitors, a much smaller percentage actually print maps after the season.
That means there are still some combines running without yield monitors. Even if most combines have them, how many are calibrated correctly? Bob Nielsen, Purdue’s Extension corn specialist, is a stickler for calibration, and why not? After all, garbage in, garbage out.
If you know your yield monitor is off by 8% to 10%, there may still be some value in knowing how one part of the field performs compared to another. But if the goal is using data 10 years from now to make very precise decisions on how much fertilizer to apply or how many seeds per acre to plant in a very small space, what are you really accomplishing if decisions aren’t based on sound data in the first place? Put differently, why invest in sophisticated tools to change rates, perhaps even hybrids or varieties, in a pass across the field if the information you’re basing those decisions on is faulty?
Of those who pull data out of the monitor, store it, maybe share it with their agronomists, then print maps, how many use the maps once they’re printed? How many filing cabinets around the state are filled with pretty maps, waiting for someone to figure out how to use them?
If you think the day is coming when data will be useful — and those at the digital roundtable certainly do — there’s value in making sure data is accurate, and in capturing and storing it properly now.
So, did the participant get his answer? No, but he certainly prompted lots more questions. And take heart. After all, the first tractor appeared in the early 1900s. Yet the horse vs. tractor debate lingered until the end of World War II.
There is always an adoption curve for new technology. The problem today is that technology is evolving much faster than it did a hundred years ago. How long can you afford to wait before stepping up your game?
Comments? Email firstname.lastname@example.org.