Farm Progress is part of the Informa Markets Division of Informa PLC



On-the-fly precision agriculture debuts

This happens to be weeding day for Joe Smith, and so he finds himself climbing into his Hi-Boy seat just as the sun begins poking shafts of new light through the tree line. Joe's father, a farmer too, was always one to get an early jump on a day, and Joe follows suit.

He follows suit in a lot of other things, too. But there are major differences in how the men farm, starting with the small plastic card — a data card — that Joe pulls from his overall pocket. With one hand he deposits the card in a slot. With the other, he turns the key and fires up the Hi-Boy. It takes some dexterity, but Joe is used to the drill. He's been doing it for years.

Joe lowers the boom, checks to make sure the computer is running, and then starts driving through his cotton plants. As he goes, the plants are given the once-over by sensors located on the front side of the boom. If sensors working in conjunction with on-board computers detect something wrong, an on-the-fly prescription is rendered, and in milliseconds, an appropriate amount of chemical is released just where it's needed from nozzles mounted on the rear of the boom.

Throughout the morning, while maintaining a 15-mph speed, Joe listens to Bob Dylan's impression of a strangling pussycat.

To Joe, this is standard stuff. But 10 years ago, Joe's father would have been turning cartwheels with the technology.

“I tell you, man. I have no idea how he made it back then. I got some of the old record books down and looked at what Daddy's input costs were. It'll blow your mind how much farmers around here in the Delta were spending. Nowadays,” Joe waves his hand toward the computer screen projected onto the Hi-Boy windshield, “we take all this for granted.”

Some non-fiction

Joe is fiction, of course. But, more importantly, the computer and machine (recently named “GreenSeeker”) described are not. And if Tim Sharp has his way, everything you read above will be possible very shortly.

Precision agriculture has been years in the making. Sharp, department chair of agriculture at Tennessee's Jackson State Community College, has seen and worked with many systems.

Looking back over the timeline of precision agriculture in the Mid-South, many early on viewed grid sampling as the ultimate answer, says Sharp. Intuitively, everyone was saying, “Boy, if we can just get all this fertilizer right, everything will be hunky-dory.”

From there, researchers started looking at other things. They looked at yield maps and found a tremendous amount of variability whose cause they couldn't identify. Even when all nutrition problems were fixed, the variability remained.

The problem with yield maps was resolution, says Sharp. The pixel sizes were just too big and hid much of what was going on in the field. A term for this was even coined: “data masking.”

Researchers hit another wall when computers proved to be too slow. To go to higher and higher data resolution, researchers needed faster computers.

“The new 1,000-plus gigabyte computers allowed us to do things we'd previously been unable to do. In time, we began getting access to satellite imagery. But the satellite imagery had the same problem the yield monitors did — pixel sizes were too big (around 4-meter pixels were standard),” says Sharp.

Flying cameras

Then Sharp and fellow researchers got some cameras flying. “We had some multi-spectral imagery shot out of planes. That imagery allowed us to deal with variability and to understand what we were dealing with.

“Truly, most of what we're dealing with is sheer soil productivity. It isn't like what you see on typical SCS soil maps with big zones. The reality is soil variability occurs in small, discrete areas. True, it's generally zonal in nature. But within those zones the variability is much more detail-intensive than we first thought.”

From that, researchers began matching up and correlating imagery to yield. “When we correlated NDVI (Normalized Difference Vegetation Index — computed from the ratio of the red and near-infrared light bands to highlight crop vigor) to yield, we found, as suspected, that the main thing we're dealing with in a field is soil productivity.”
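As an illustration of the index itself, here is a minimal sketch of the standard NDVI calculation for a single pixel, assuming red and near-infrared reflectance values on a 0-to-1 scale (the function name and the sample values are hypothetical, not from Sharp's data):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.

    Healthy vegetation reflects strongly in near-infrared and
    absorbs red light, pushing the index toward 1.0.
    """
    if nir + red == 0:
        return 0.0  # guard against division by zero on dark pixels
    return (nir - red) / (nir + red)

# A vigorous crop pixel: high NIR reflectance, low red reflectance.
print(ndvi(0.50, 0.08))  # roughly 0.72
```

Bare soil tends to score near zero on this index, which is what lets the imagery separate soil productivity patterns from crop vigor.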

Well, the truth is that's not fixable, says Sharp. And as a result, researchers no longer said, “Let's make the whole field yield 1,000 pounds of cotton.” There are areas of a field that will always yield that much and areas that never will. No matter what you do from a management standpoint that won't change.

Instead, researchers said, “Rather than trying to fix everything in the field, let's just manage inputs and scale them to be appropriate for the soil productivity of a given spot.”

Once you go to that concept, however, you must be able to manage the applications. In trying to accomplish this, problems were found with the equipment. The current variable rate equipment won't go down to single-meter application zones, says Sharp. About the best that can be done are 60-square-foot zones.

“That's not bad, it just isn't what we need. Because of the reaction time, the valves and everything that controls variable rate equipment, we couldn't get below the 60-foot data pixel. But even at that level, and using multi-spec images, we're able to cut $60 to $100 per acre off cotton production costs. That's really good.”

One of the problems with this is the difficulty in geo-referencing multi-spec images, says Sharp. “You can't get calibrated imagery on a large scale. To do so, you'd have to go out into every field, put panels on the ground, get reference points, and calibrate light and other things. It's just too labor-intensive to do things in that manner.”

But the big bottleneck in using these systems is geo-referencing, says Sharp. Users must take the image from the aircraft and match it up on computers through a process called “geo rectifying.” That's incredibly labor-intensive and slow.

“What we're finding is that to get below that 60-foot pixel, the geo-referencing process must be absolutely perfect. It's just too slow.”

Parallel to Mid-South research, researchers at Oklahoma State University have been working on how to variably apply nitrogen to wheat. As with their Delta research brethren, OSU engineers went through a whole trail of concepts themselves — “hit the wall with a problem, solve it, hit the wall with another problem, solve that, on and on,” says Sharp. Through that, OSU researchers decided that they needed to get machines to work on a 1-meter square.

The OSU researchers took a new tack in solving that problem. They decided to put an imaging system on the applicator as opposed to an aircraft. Using a clean sheet of paper, they engineered the new concept from the ground up. They created new imaging systems, new spraying sensors, new technologies for spraying, new everything. In the end, they came up with a design that puts an imaging sensor on every row.

All the data collected is digital. “What's acquired from satellites and what they're getting from their new ground rig is exactly the same. Since we're no longer dealing with film but digital, there's no difference in the data type. But when your sensor platform is 3 feet off the ground as opposed to on a plane, the field of view is small. That's the reason for an imaging sensor on every 38-inch row.”

There were other things to consider. For example, OSU researchers had to calibrate the sensors for ambient light. Why? If you're shooting at 30 percent humidity or 70 percent humidity it radically changes reflectance values. If you're at 6,000 feet pointing a camera at a field, you're actually shooting images through 6,000 feet of water. With the boom-mounted sensors, however, a much cleaner image is produced.

There were also light intensity levels to worry about. If you go out in the morning, midday or evening, light is very different. OSU's solution was not to work with ambient light but to do away with it entirely. Instead of relying on ambient light, the sensors emit their own light — the aforementioned pure red and near-infrared bands. Thus sunlight isn't needed.

The GreenSeeker system does have a sensor that “looks up” and reads ambient light. “It's asking, ‘What's the ambient light right now?’ Once that's determined, it calculates everything and removes ambient light from the final data. In other words, it corrects to a light standard that it alone is emitting,” says Sharp.
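A minimal sketch of what that correction might look like, assuming the detector's reading is the sum of reflected emitted light and an ambient contribution measured by the upward-looking sensor (all names and units here are hypothetical, not the actual GreenSeeker firmware):

```python
def corrected_reflectance(detector_reading, ambient_reading, emitted_intensity):
    # Subtract the ambient light measured by the upward-looking
    # sensor, then normalize to the sensor's own known emitted
    # intensity so readings are comparable at any time of day.
    reflected = detector_reading - ambient_reading
    return reflected / emitted_intensity

# Same crop, morning vs. midday: the corrected value stays stable
# even though the raw detector readings differ.
print(corrected_reflectance(0.9, 0.3, 1.2))
print(corrected_reflectance(1.1, 0.5, 1.2))
```

The design choice here is the key point: by controlling its own light source and measuring the ambient level separately, the system corrects every reading to a single internal standard.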

Researchers wanted the machine to move across the field at 15 mph, observe each square foot of field that passes under the sensor, and adjust rates according to the reading.

Visualize a boom with a sensor on the front side and a nozzle on the backside. As the machine drives across the field, for every 6 inches (or whatever the computer is programmed for) a signal goes to the GreenSeeker computer telling it what the crop looks like.

The programming might call for 4 ounces of Pix. The computer tells the controller to put Pix out for the next 3 milliseconds. The controller info then travels to the spray nozzle and tells it to get ready because 3 feet from now, it's to spray 4 ounces of Pix for 3 milliseconds. And this is going on at every row and every nozzle bank that will alternate from “off” to 40 gallons per acre 150 times per second.
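The arithmetic behind that look-ahead is straightforward. A sketch, assuming the 3-foot sensor-to-nozzle spacing and 15-mph speed given above (the function names are illustrative):

```python
FEET_PER_MILE = 5280

def spray_delay_ms(sensor_to_nozzle_ft, speed_mph):
    """Milliseconds between a sensor reading and the moment the
    trailing nozzle passes over the same spot of ground."""
    speed_fps = speed_mph * FEET_PER_MILE / 3600  # mph -> feet per second
    return sensor_to_nozzle_ft / speed_fps * 1000

# At 15 mph the rig covers 22 feet per second, so a nozzle 3 feet
# behind the sensor reaches the sensed spot about 136 ms later --
# plenty of time to queue up a 3-millisecond spray pulse.
print(round(spray_delay_ms(3, 15)))  # → 136
```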

“This is a leapfrog technology. We're leaping over everything that's being studied in satellite and aircraft imagery in the Mid-South,” says Sharp.

Hidden brilliance

What's funny is OSU researchers didn't know the full brilliance of what they'd developed.

“I'd been talking to John Solie (an OSU pioneer of the system) for a couple of years. I was aware of their sensor work. I kept hoping they'd get to where they could output an image. Last fall, I found out they could,” says Sharp.

Sharp knew the image output would be important for cotton and on Dec. 15, he excitedly took off for OSU. When he got there, they showed Sharp what the system could do in wheat.

“I told them what their machine could do for all crops. Up until recently, they were just worrying about nitrogen on wheat. But I told them, ‘Look guys, what you've got will also work in any other crop. We can do everything you're doing in wheat in cotton. We just need to create the cotton algorithms.’

“We've simply got to program the machine so it notices a cotton plant and knows what the crop requires. They recognized this and agreed to send us an eight-row setup to use for research. We're going to start imaging cotton fields this summer.”

Sharp will also be running COTMAN at the same time he's doing the imaging. What he wants to create is a season-long NDVI growth development curve.

“For example, we must create the baseline for what the right NDVI is for seven-node cotton on such-and-such a date. That means we must work in tremendous numbers of fields — probably 100. We're trying to get 2,000 acres. We're building a broad base of information that will be broadly applicable. But it takes a huge number of data points.”

Sharp and colleagues must do this work on farm fields and deal with actual farm variability. It won't do any good unless the entire scale of variability is available. As no field has the whole range, the system must be used in many fields.

“We've been using many of the farm fields for previous research. We've already been able to directly correlate yield and NDVI — the first step. Once that's accomplished, once NDVI equals yield, everything follows. For example, we know that if you have an NDVI of 0.45 that will equal 952 pounds of cotton. The images taken in the first week of August directly correlate to the yield in October. We've got it down to that.”
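As a toy illustration of that correlation, the single published calibration point (NDVI 0.45 corresponding to 952 pounds of cotton) can be turned into a simple proportional predictor. This is not Sharp's actual yield function — his comes from fitting the full field dataset — just a sketch of the idea:

```python
# One calibration point from the article: NDVI 0.45 -> 952 lb cotton.
LB_PER_NDVI = 952 / 0.45  # assumed proportionality, for illustration only

def predicted_yield_lb(ndvi):
    """Hypothetical linear yield estimate from an early-August NDVI."""
    return LB_PER_NDVI * ndvi

print(round(predicted_yield_lb(0.45)))  # → 952
```

The real payoff of the correlation is the timing: an image taken the first week of August predicts the October harvest, months before the pickers roll.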

In this new research, Sharp will study a mix of cotton varieties on 38-inch rows.

“This new system will work anywhere, on any growing system. But before it does, we have to develop the databases. It'll work on ultra-narrow-row cotton; it'll work on row spacing other than 38 inches. But first, we've got to have the data.”

Sharp's plan is to collect sensor data this summer and have a spray system in the field in the summer of 2003. This year, he'll have sensors and no spray nozzles. OSU has patents and, in conjunction with a California company, will soon begin producing the system commercially.

What will the GreenSeeker system look like in three years?

First off, this system will never eliminate some computer work, says Sharp. One of the key steps of using the system is knowing what to do with information it's offering up. Someone — probably a consultant — must make decisions about what needs to be done in the field. Once that information is entered, though, a farmer — just like Joe Smith — can simply hop on the tractor and go.

“That's it. From that point, it's operator-proof,” says Sharp. “As an agronomist, I look at this machine and say, ‘My God, OSU has some good engineers. Their work is stunning.’ It makes you wonder what else has been invented and can be used in ways that have never been considered. What's really great is (GreenSeeker) isn't at a beginning point. They've been working on this for years, and it's already functional in wheat. It's already cranking. We're not at the outset here. This system is already proven.”

