OK, now that I’ve got your attention, don’t even think of it. Only an idiot would try doing that.
While autonomous (self-driving) vehicle technology has tremendous potential on farms where fences are few and far between, a real live neural center (a brain) is still required behind the wheel or remote control. And that's one flaw of artificial intelligence systems.
Yet, major auto and artificial intelligence manufacturers — Google-owned Waymo, Tesla, General Motors, Toyota and others — are road-testing AV technologies to run taxi and courier services in the big cities. Some of the initial tests have had unfortunate results.
During one of the most recent tests in March, an autonomous Uber SUV in Arizona killed a bicyclist. The vehicle’s cameras, LIDAR sensors and radar worked properly. The backup driver in the car didn’t react in time to avoid the accident.
That same month, a Tesla SUV running on autopilot slammed into a concrete highway divider, killing its driver. State-of-the-art tech systems still have trouble anticipating all that human brains are trained to do.
You’re still needed at the wheel
In early May, Harvard University’s Chan School of Public Health conducted a forum on the challenges of autonomous driverless vehicles. Expert panelists from the National Safety Council, Toyota Research Institute and the autonomous vehicle industry pointed out much of what you might surmise.
Of course, they concurred that much of the technology offers great safety potential — automatic braking, lane departure warnings and blind spot monitors. But as Peter Sweatman, cofounder of CAVita, a connected and automated vehicle consulting group, pointed out: "Less than 50% of people really want to ride in a driverless vehicle."
"Could an automated vehicle navigate road work?" he wondered. And, could an autonomous vehicle anticipate whether a larger vehicle or bus might be less likely to yield?
Ag risks remain
Here are a couple other forum take-aways pertaining to running auto-steer and autonomous farm equipment:
• Autopilot systems have known weaknesses. Tesla’s manual, for instance, warns that it may not see stationary objects, a shortcoming highlighted when a Tesla slammed into a stopped fire truck near Los Angeles in January. The systems are designed to discard radar data about things that aren’t moving to prevent false alarms.
• Human error (backup drivers relying too heavily on the technology) and delayed response time are conditions no system can correct. Drivers who rely too much on autonomous features may zone out at the wheel, worried Jay Winsten, director of Harvard's Center for Health Communication.
That, reportedly, was the case with one of the above-noted fatalities. "The more highly automated driver assistance systems are integrated, the more complacent the driver becomes," Winsten added.
Autopilot systems can lull on-board backup drivers into thinking the systems are more capable than they are. That, unfortunately, allows distractions to steer eyes and minds away from roads and field rows — a risk for tractors and combines as well.
So, before you entrust your $100,000-plus auto-steering machines to artificial intelligence, don't unplug your real intelligence. Yes, I'm so "old school." Your liability insurance carrier might be as well.