Steel girders rose out of the ground last summer behind the Indiana Corn and Soybean Innovation Center at the Purdue University Agronomy Center for Research and Education, also known as the Purdue Agronomy Farm. By mid-September, cameras hung high above the crops, attached to the framework the girders support. Why build a steel frame with a trolley-like system, technically called a gantry, in a cornfield?
Jian Jin, a professor in Purdue’s Agricultural and Biological Engineering Department, leads the project. Much of the current work involves unmanned aerial vehicles, or UAVs, carrying different types of cameras to capture images of crops in fields and plots from the air. Some of the latest work at the center goes beyond producing pictures and plant health images from these UAV flights to determining how to turn the information within those images into data points.
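As a rough illustration of what “turning images into data points” can mean, here is a minimal sketch, not the Purdue team’s actual pipeline: a common approach with multispectral crop imagery is to reduce each image or plot to a vegetation index such as NDVI. The band arrays below are simulated stand-ins for real camera data.

```python
import numpy as np

# Hypothetical example: reducing a crop image to a single data point.
# `red` and `nir` stand in for red and near-infrared reflectance bands
# from a multispectral camera; here they are simulated values in [0, 1].
rng = np.random.default_rng(42)
red = rng.uniform(0.05, 0.3, size=(100, 100))  # red-band reflectance
nir = rng.uniform(0.4, 0.9, size=(100, 100))   # near-infrared reflectance

# NDVI (normalized difference vegetation index): a standard way to
# summarize plant vigor per pixel from these two bands.
ndvi = (nir - red) / (nir + red)

# One data point for the whole plot: the mean NDVI across all pixels.
plot_ndvi = float(ndvi.mean())
print(round(plot_ndvi, 3))
```

A number like this, computed per plot per flight, is the kind of value that machine learning models can ingest, where a raw picture cannot be used directly.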
Purdue Extension agronomists Bob Nielsen and Jim Camberato demonstrated this concept at a digital roundtable field day at the center in September. Once you have data points and not just images, it’s a shorter leap to machine learning and artificial intelligence, which could increase the value of the information for those trying to make crop management decisions or attempting to improve or breed better crops, Nielsen says.
Cameras in the sky
That’s where Jin’s gantry approach could prove invaluable, experts say. Yang Yang, Purdue’s director of digital phenomics and the Controlled Environment Phenotyping Facility on Purdue’s campus, says the challenge is to understand what those UAV images actually mean.
“When you get an image with a UAV camera, it’s like a snapshot in time,” he explains. “You know what was happening with the plants at that moment in that environment.
“But what if the environment changes? The change could be a shift or change in wind speed or direction. It could be a rain event or a shift in temperature. The environment which plants face in the field constantly changes.”
In the CEPF, plants are raised singly in pots inside a growth chamber whose environment is precisely controlled and can be varied by computer instruction. Plants can then be sent to one of three scanners to determine what’s happening within the plant and its roots. But each scan is still a moment in time.
“What Jian Jin is doing is capturing images constantly on a set of plants growing in the ground as environmental changes happen,” Yang says. “He can’t control the environment, but he can almost constantly capture images of what happens as the environment changes naturally with varying weather conditions.”
The goal is that by capturing images similar to UAV images, but of the same plants and far more often, researchers can get a feel for how environment affects plants and what shows up in UAV images, Yang says. In fact, they hope to do better than a “feel” for what happens. They want to eventually make correlations based on what they learn as conditions change, so they can better interpret information gathered in a single UAV image.
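To make the correlation idea concrete, here is an illustrative sketch, again not the researchers’ actual method: with near-continuous gantry imaging, each plant yields a time series of an image-derived trait alongside weather logs, and correlating the two suggests how a lone UAV snapshot should be read given the conditions when it was taken. All values below are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = 240  # ten days of hourly gantry captures (hypothetical)

# Simulated air temperature with a daily cycle plus noise.
temperature = (20 + 8 * np.sin(np.arange(hours) * 2 * np.pi / 24)
               + rng.normal(0, 1, hours))

# Simulated image-derived canopy trait that responds, noisily,
# to temperature (e.g., midday canopy stress lowering the index).
canopy_trait = 0.6 - 0.005 * (temperature - 20) + rng.normal(0, 0.01, hours)

# Pearson correlation between the environment and the imaged trait.
r = np.corrcoef(temperature, canopy_trait)[0, 1]
print(round(r, 2))
```

A strong correlation like this, learned from frequent imagery, is the kind of relationship that could let a single UAV image be adjusted for the weather at the moment it was shot.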
Steel girders rose from the soil, but it’s not about the steel. It’s about the technology they make possible, Yang says. Hopefully, the gantry system will be a means to an end: better understanding of how UAV images can be used.