I’d always heard of McGill University, but not until last year did I get a chance to see it. Okay, so maybe it was just a drive-by in a cab on my way to the 2014 ASABE International conference held in Montreal. But even so.
The university consistently ranks among the top 50 global universities, alongside the likes of Yale, Harvard, Stanford, Princeton, UC Berkeley and other big names.
At the conference, I met with two of McGill’s finest, graduate students Bharath Sudarsan and Trevor Stanhope. Both specialize in machine vision and robotics. In fact, they were the minds behind this year’s robot competition challenge (see our gallery, A robot that taps trees).
I asked them about the research they were working on and their visions for the future in farming.
Stanhope was presenting a paper at 2 p.m. that day on machine vision, specifically on research he did over the past two summers working with an organic producer in Quebec. The producer was using a mechanical guidance system that relied on tactile rods, or sensors, to guide the cultivator between the crop rows to kill weeds.
“With organic systems, mechanical weed control is the only way to control weeds without the use of chemicals,” Stanhope says. “The mechanical guidance system he had been using used rods to sense the plant, but it didn’t work well in the early stages of plant growth and would just knock over plants. And using RTK correction would have been cost prohibitive for him. So I built a computer vision system that would address that issue.”
The system, which mounts on a cultivator, converts the detected rows into signals that are fed back to a hydraulic control system used for mechanical guidance. It can be used at the earlier stages of crop growth and, during later stages, either switched out for the mechanical rods or left installed alongside them.
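To give a rough sense of how such a system works (this is a minimal illustrative sketch, not Stanhope’s actual code), a camera-based row follower can threshold each pixel for “greenness,” locate the crop row’s horizontal position in the frame, and convert that into a steering offset for the hydraulic controller:

```python
# Hypothetical sketch of camera-guided row following: find the crop row's
# horizontal position in an RGB frame and turn it into a normalized
# steering offset a hydraulic controller could act on.

def row_offset(frame, threshold=20):
    """frame: list of pixel rows, each pixel an (r, g, b) tuple.
    Returns an offset in [-1, 1]; negative means the row is left of center."""
    width = len(frame[0])
    green_counts = [0] * width
    for row in frame:
        for x, (r, g, b) in enumerate(row):
            # "Excess green" index separates plants from soil background
            if 2 * g - r - b > threshold:
                green_counts[x] += 1
    total = sum(green_counts)
    if total == 0:
        return 0.0  # no plants detected; hold course
    # Weighted centroid of green pixels across the image columns
    centroid = sum(x * c for x, c in enumerate(green_counts)) / total
    center = (width - 1) / 2
    return (centroid - center) / center

# Toy 3x5 frame: plants (bright green) only in the rightmost column
soil = (120, 100, 80)
plant = (40, 180, 40)
frame = [[soil, soil, soil, soil, plant] for _ in range(3)]
print(row_offset(frame))  # 1.0: row is at the far right, steer right
```

The appeal of this approach, as Stanhope describes, is that it works before plants are sturdy enough for tactile rods: even seedlings register as green pixels.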
“So far, it has worked really well,” Stanhope says. “Last year was proof of concept. This year, we came back at it with a much more robust system.”
Sudarsan was also doing work in machine vision, except in his case he was using it to read the texture and composition of the soil for the purposes of adapting tillage decisions. He says there are a few systems available that can do texture analysis based on machine vision, namely one by Garford, based in the U.K. But theirs is designed for light tillage as opposed to heavy tillage.
His will be a “sensor kit” loaded with different types of sensors that could be used for, in this case, soil sampling. “It is possible for machine vision to measure every particle of soil,” Sudarsan says. “Today, it is very expensive to do this by hand.”
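As a hedged illustration of what “measuring every particle” could feed into: once a vision pipeline has estimated individual particle diameters (the image segmentation step is assumed here, not shown), binning them into the standard USDA particle-size classes gives a rough texture estimate. The function below is a hypothetical sketch, not Sudarsan’s system:

```python
# Hypothetical downstream step for a soil-vision pipeline: bin measured
# particle diameters (in mm) into USDA size classes to estimate texture.
# USDA definitions: clay < 0.002 mm, silt 0.002-0.05 mm, sand 0.05-2 mm.

def texture_fractions(diameters_mm):
    """Return (sand, silt, clay) fractions by particle count."""
    sand = silt = clay = 0
    for d in diameters_mm:
        if d < 0.002:      # clay
            clay += 1
        elif d < 0.05:     # silt
            silt += 1
        elif d <= 2.0:     # sand
            sand += 1
        # anything larger is gravel and excluded from texture
    n = sand + silt + clay
    if n == 0:
        return (0.0, 0.0, 0.0)
    return (sand / n, silt / n, clay / n)

# Example: eight measured diameters, one of which is gravel (3.0 mm)
sample = [0.001, 0.0015, 0.01, 0.03, 0.1, 0.5, 1.2, 3.0]
print(texture_fractions(sample))  # sand, silt, clay fractions
```

In practice texture is defined by mass fraction rather than particle count, so a real pipeline would weight each particle by its estimated volume; the binning logic stays the same.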
Sudarsan defines machine vision as “using a camera to get information to make robots intelligent.” He says a major advantage of it is that all you need is a camera to do the processing. And, now that you have cameras on handheld mobile phones, all you need is to write an app to process the information and automate the process.
“To put it simply, using cameras is much less expensive than buying fancy equipment or doing grid soil sampling,” he says. “So that is where I see machine vision being incorporated more. It is a very economical solution.”
In the future, both students say, UAV systems will play a big role in machine vision, detecting such things as nutrient deficiencies stemming from drainage problems. UAVs photograph fields from the sky and can be much more accurate than low-resolution satellites, they say.
“People need to realize that drones are not just used for destruction,” Sudarsan says. “They actually have a lot of potential in farming.”
They explained that drones are just another form of machine vision, only in this case the camera is mounted on a fixed-wing or multi-rotor aircraft instead of a ground machine. But the same advantages apply.
“Machine vision from an aerial perspective is like bird’s eye view,” Sudarsan says. “It gives you a lot of information, and all we have to do is write the proper algorithms to process the information and give instructions.”
Looking ahead, Stanhope says it will likely take both approaches, ground and aerial, to get the information needed to make better farm management decisions.
“There are benefits to both,” Stanhope says, and cites colleagues who are currently working on a machine vision system that integrates both ground and aerial into an autonomous unit that can do tillage based on the soil information.
I asked both students their vision for machine vision and robotics in farming.
“I see it the way my professor explained it,” Sudarsan says. “There was a farmer in a pub, and someone asked him why he wasn’t in the field weeding his crops. The farmer replied, ‘I am.’”
Stanhope’s vision? “I’d like to see the job of farming get less labor intensive. It is a hard profession with a large amount of labor relative to the returns. I think if more could be done remotely, more young people would go into it. It shouldn’t have to be so laborious.”
At the end of the interview, I thanked them for their time and said I was impressed by their research and ability to explain such complex topics.
Sudarsan smiled, held up his hands, and humbly replied, “We’re just kids.”