Ryan Kunde is a winemaker whose family’s picture-perfect vineyard nestles in the Sonoma Valley north of San Francisco. But Kunde is not your average farmer. He’s also a drone operator, and he’s not alone. He’s part of a vanguard of farmers using what was once military aviation technology to grow better grapes with pictures taken from the air. It’s one piece of a broader trend: using sensors and robotics to bring big data to precision agriculture.
What “drones” means to Kunde and the growing number of farmers like him is simply a low-cost aerial camera platform: either miniature fixed-wing airplanes or, more commonly, quadcopters and other small multirotor helicopters. These aircraft are equipped with a GPS-guided autopilot and a standard point-and-shoot camera that the autopilot controls; software on the ground can stitch aerial shots into a high-resolution mosaic map. Whereas a traditional radio-controlled aircraft needs to be flown by a pilot on the ground, in Kunde’s drone the autopilot (made by my company, 3D Robotics) does all the flying, from auto takeoff to landing. Its software plans the flight path, aiming for maximum coverage of the vineyards, and controls the camera to optimize the images for later analysis.
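The coverage-planning idea is simple: fly parallel back-and-forth passes (a "lawnmower" pattern) spaced so that adjacent photos overlap enough for the stitching software to align them. The sketch below illustrates that pattern only; the function name and parameters are hypothetical, not 3D Robotics' actual planner.

```python
# Minimal sketch of "lawnmower" survey planning: generate parallel passes
# across a rectangular field so photos overlap enough for mosaic stitching.
# All names and numbers are illustrative assumptions.

def survey_waypoints(width_m, height_m, altitude_m, footprint_m, overlap=0.7):
    """Return (x, y, z) waypoints covering a width_m x height_m field.

    footprint_m: ground width covered by one photo at this altitude.
    overlap: fraction of sideways overlap between adjacent passes.
    """
    spacing = footprint_m * (1 - overlap)    # distance between passes
    waypoints = []
    x, direction = 0.0, 1                    # sweep back and forth
    while x <= width_m:
        y_start, y_end = (0.0, height_m) if direction > 0 else (height_m, 0.0)
        waypoints.append((x, y_start, altitude_m))
        waypoints.append((x, y_end, altitude_m))
        x += spacing
        direction = -direction               # reverse for the next pass
    return waypoints

# A 100 m x 60 m block at 50 m altitude, 40 m photo footprint:
path = survey_waypoints(width_m=100, height_m=60, altitude_m=50, footprint_m=40)
```

With 70 percent overlap, the passes sit 12 meters apart, so this example produces nine passes (18 waypoints), all at the requested altitude.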
Breakthrough: Easy-to-use agricultural drones equipped with cameras, for less than $1,000.
Why it matters: Close monitoring of crops could improve water use and pest management.
Key players: 3D Robotics; Yamaha; PrecisionHawk
This low-altitude view (from a few meters above the plants to around 120 meters, which is the regulatory ceiling in the United States for unmanned aircraft operating without special clearance from the Federal Aviation Administration) gives a perspective that farmers have rarely had before. Compared with satellite imagery, it’s much cheaper and offers higher resolution. Because it’s taken under the clouds, it’s unobstructed and available anytime. It’s also much cheaper than crop imaging with a manned aircraft, which can run $1,000 an hour. Farmers can buy the drones outright for less than $1,000 each.
The advent of drones this small, cheap, and easy to use is due largely to remarkable advances in technology: tiny MEMS sensors (accelerometers, gyros, magnetometers, and often pressure sensors), small GPS modules, incredibly powerful processors, and a range of digital radios. All those components are now getting better and cheaper at an unprecedented rate, thanks to their use in smartphones and the extraordinary economies of scale of that industry. At the heart of a drone, the autopilot runs specialized software—often open-source programs created by communities such as DIY Drones, which I founded, rather than costly code from the aerospace industry.
Drones can provide farmers with three types of detailed views. First, seeing a crop from the air can reveal patterns that expose everything from irrigation problems to soil variation and even pest and fungal infestations that aren’t apparent at eye level. Second, airborne cameras can take multispectral images, capturing data from the infrared as well as the visual spectrum, which can be combined to create a view of the crop that highlights differences between healthy and distressed plants in a way that can’t be seen with the naked eye. Finally, a drone can survey a crop every week, every day, or even every hour. Combined to create a time-series animation, that imagery can show changes in the crop, revealing trouble spots or opportunities for better crop management.
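The second capability, combining infrared and visible bands to highlight plant health, is commonly done with a vegetation index such as NDVI, which exploits the fact that healthy vegetation reflects near-infrared light strongly and absorbs red light. A minimal sketch, with made-up reflectance values for illustration:

```python
# Sketch of the multispectral idea: the normalized difference vegetation
# index (NDVI) combines near-infrared (NIR) and red reflectance to
# separate healthy plants from distressed ones. Sample values are
# illustrative, not real crop data.

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1."""
    total = nir + red
    return (nir - red) / total if total else 0.0

healthy  = ndvi(nir=0.50, red=0.08)   # vigorous canopy: strong NIR, low red
stressed = ndvi(nir=0.30, red=0.20)   # distressed plants: weaker contrast
```

Applied per pixel across a stitched mosaic, the same arithmetic turns a pair of aerial images into a crop-health map, with higher NDVI marking healthier vegetation.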
It’s part of a trend toward increasingly data-driven agriculture. Farms today are bursting with engineering marvels, the result of years of automation and other innovations designed to grow more food with less labor. Tractors autonomously plant seeds within a few centimeters of their target locations, and GPS-guided harvesters reap the crops with equal accuracy. Extensive wireless networks backhaul data on soil hydration and environmental factors to faraway servers for analysis. But what if we could add the ability to assess the water content of soil more comprehensively, spot irrigation and pest problems more rigorously, and get a general sense of the state of the farm every day, or even every hour? The implications cannot be stressed enough. We expect 9.6 billion people to call Earth home by 2050, and all of them need to be fed. Farming is an input-output problem. If we can reduce the inputs (water and pesticides) while maintaining the same output, we will be overcoming a central challenge.
Agricultural drones are becoming a tool like any other consumer device, and we’re starting to talk about what we can do with them. Ryan Kunde wants to irrigate less, use less pesticide, and ultimately produce better wine. More and better data can reduce water use and lower the chemical load in our environment and our food. Seen this way, what started as a military technology may end up better known as a green-tech tool, and our kids will grow up used to flying robots buzzing over farms like tiny crop dusters.