|A normalized difference vegetation index (NDVI) image generated from aerial photos taken over Iowa. Image courtesy of AgPixel|
Next to the dogfighting of WWI, it may be aerial application—formerly known as “crop dusting”—that’s most apt to evoke that classic, leather-clad, goggled image of the world’s earliest aviators. In fact, the first seed-sowing hot air balloons nearly predated the Wright Flyer. Helicopters too have a long history in the profession, driven aboard truck-mounted helipads to rural locations too far from an airfield for fixed-wing applicators to make sense. Those pilots willing to brave the long hours, towers and power lines of agricultural work can enjoy some of the most austere flying locations in the business.
But these days, nobody’s job escapes the buzzwords of progress. The corporate-cultured among us might even say that “Big Data” has come to aerial application. Yuck.
For those of us who hate all corporate jargon that starts with “Big”, there’s precision agriculture—a more descriptive term for how modern geospatial technology has advanced farming techniques. Starting with the detailed mapping of farmland in aerial or satellite images linked to GPS coordinates, precision agriculture has allowed farmers to detect subtle discrepancies that might indicate the presence of disease, infestation or other issues. Hints of these might come from such things as heat signatures or erratic vegetation levels.
The analysis of these images has been used to develop plans of action, including where and how much to apply a pesticide. Agricultural pilots benefit from these images, since the information they reveal can be fed back into an aircraft’s avionics to provide meticulous, zigzagging flight paths or “swaths” over a field. In short, the barnstormers of yesteryear are still around today; they just have better toys.
Many of these innovations have, in fact, been around since the 1990s. But the 21st century has brought with it a host of new developments in precision agriculture. Researchers are developing software to analyze aerial images automatically, rather than having to wait for an after-landing analysis by crop scientists. They are also designing application hardware for safer, more efficient and even more automated ways to spray product. Perhaps the most obvious and politically charged development, though, is the commercialization of unmanned aerial systems (UASs). Depending on your perspective, the UAS is an economical tool for photographing crops, a dangerous new obstacle for low-flying helicopter pilots or even a disruptive technology that could threaten the aerial application field itself.
|USDA’s MD600 conducting spray trials for public health sprays (i.e. mosquito control). Photo courtesy of USDA-ARS|
Scott Bretthauer is an application technology extension specialist at the University of Illinois. He studies how existing equipment maximizes on-target applications and mitigates drift. He educates aerial applicators through the Professional Aerial Applicators’ Support System (PAASS) program, the industry’s primary mechanism for recurrent training. He also helps calibrate equipment through Operation S.A.F.E. (Self-regulating Application and Flight Efficiency) clinics run by the National Agricultural Aviation Research and Education Foundation. In short, he’s well versed in the tech.
As Bretthauer explained, the technology used by aerial applicators hasn’t changed much in the past decade. What has improved, though, is the industry’s understanding of that technology. Scientists know much more about how the physical and chemical properties of a product—even the size of the droplets coming from the spray nozzle—can determine its success. For example, smaller droplets exhibit greater efficacy, depositing on more of a plant’s surface area and sticking to the plant more easily. But they also exhibit a greater potential for drift, blowing more easily off course and for greater distances.
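The droplet-size tradeoff has a simple physical basis. A minimal sketch, using Stokes’ law for the terminal settling velocity of a small droplet in still air, shows why halving droplet diameter roughly quadruples drift distance. The release height and wind speed below are illustrative assumptions, not figures from the article, and Stokes’ law is only an approximation for the larger droplets.

```python
# Why smaller droplets drift farther: a back-of-envelope estimate.
# Stokes' law gives a droplet's terminal settling velocity in air:
#   v = g * d^2 * (rho_droplet - rho_air) / (18 * mu_air)
# A droplet released at height h in a crosswind of speed w travels
# roughly w * (h / v) downwind before reaching the canopy.

G = 9.81          # gravity, m/s^2
RHO_WATER = 1000  # droplet density, kg/m^3
RHO_AIR = 1.2     # air density, kg/m^3
MU_AIR = 1.8e-5   # dynamic viscosity of air, Pa*s

def settling_velocity(diameter_m):
    """Stokes terminal velocity of a spherical droplet in air (m/s)."""
    return G * diameter_m**2 * (RHO_WATER - RHO_AIR) / (18 * MU_AIR)

def drift_distance(diameter_um, release_height_m=3.0, wind_mps=3.0):
    """Approximate downwind drift (m) before the droplet settles out."""
    v = settling_velocity(diameter_um * 1e-6)
    return wind_mps * release_height_m / v

for d in (100, 200, 400):  # droplet diameters in micrometers
    print(f"{d} um droplet drifts ~{drift_distance(d):.1f} m")
```

Because velocity scales with the square of diameter, a 100-micron droplet in this sketch stays aloft long enough to drift tens of meters, while a 400-micron droplet settles within a couple of meters of release.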
Currently, application research laboratories use a device called a laser diffraction system to measure the droplet size produced by a given spray nozzle. “Essentially, you spray through your laser, the laser is diffracted by each droplet, and then the system measures how much that laser is diffracted. Then it calculates the droplet size,” explained Bretthauer.
In the past few years, new laser diffraction systems have been created that more accurately measure droplet size. “We just got a brand new set of nozzle models that show us the droplet size for the more commonly used types of nozzles, and we’re going to be working on expanding that list,” Bretthauer said.
|Starboard view of the spray boom on USDA’s MD600 spraying herbicides for weed control in College Station, Texas. Photo courtesy of USDA-ARS|
|A Sept. 8, 2014 Landsat 8 image shows actively growing crops along Nebraska’s Platte River before harvesting. Photo courtesy of NASA Goddard Space Flight Center|
Another discovery provided by the newer laser diffraction system concerns the effect of chemical additives, called adjuvants, on the application process. Scientists are only just learning how these additives affect product properties such as droplet size.
“So the technology itself isn’t necessarily new, but the data and what we know about it is definitely brand new and continues to improve, and we’re getting a better and better picture,” said Bretthauer.
Asked about spray systems under development that turn themselves on and off automatically as the pilot navigates the field, Bretthauer admitted they have potential. But he said existing systems suffer from drawbacks and therefore haven’t been widely adopted.
He said that, at least on fixed-wing application platforms, when the pilot uses a spray valve to deactivate the system, the fan-driven spray pump exerts a negative pressure on the boom called “suck-back”, while a spring-loaded check valve slams shut inside the boom. Both features help prevent the product from continuing to drip after spray system deactivation. But “the systems that are totally automatic entry and exit basically just stop the pump, and when you stop the pump, you don’t have suck-back anymore.”
Even those pilots who use a pump brake to deactivate the system (which doesn’t provide suck-back) still prefer to do so under manual control rather than trusting the automation.
|Helicopters are still ideal platforms for aerial application flights in remote areas. Photo courtesy of the NAAA|
Even in aerial application, not all automation earns the distrust of pilots. GPS systems have long been critical in helping aerial applicators navigate over a crop. These systems prescribe the swaths that the pilot flies to efficiently apply product without wasting chemicals or allowing them to drift.
A common setup onboard the helicopter is a screen with light bars that shift left or right to let the pilot know when he or she is straying from the course, not entirely unlike the course deviation indicator (CDI) needle on a VOR receiver. Also like the CDI, the light bars reverse when a pilot reaches the end of a swath until a 180-degree turn to a parallel course back the other way is completed. Tracking a swath while flying low over a field at a high airspeed, all the while keeping alert for wind shifts and the occasional hard-to-see power lines, can be difficult work.
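The math behind such a light bar is essentially a cross-track deviation calculation. A minimal sketch, assuming flat-earth coordinates in meters (a real system would first project GPS lat/lon) and an illustrative light spacing:

```python
# Cross-track logic behind a swath light bar, sketched.
# Given the endpoints of the current swath line (A -> B) and the
# aircraft position P, the signed perpendicular distance tells the
# pilot to shift left or right -- much like a CDI needle.

def cross_track(a, b, p):
    """Signed distance (m) of p from line a->b; positive = right of track."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    dx, dy = bx - ax, by - ay
    length = (dx * dx + dy * dy) ** 0.5
    # 2-D cross product of the A->P vector with the track vector
    return ((px - ax) * dy - (py - ay) * dx) / length

def light_bar(deviation_m, light_spacing_m=1.0):
    """Which light to show: positive = steer left, negative = steer right."""
    lights = round(deviation_m / light_spacing_m)
    return max(-5, min(5, lights))  # clamp to a 11-light bar

# Aircraft 2 m right of a north-running swath: two lights tell
# the pilot to correct back to the left.
print(light_bar(cross_track((0, 0), (0, 100), (2, 50))))
```

At the end of each swath, the guidance computer simply swaps in the next parallel line’s endpoints, which is what makes the lights appear to “reverse” during the turnaround.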
Fittingly, some of the latest research aims to improve the existing (if not widely adopted) autonomous spray system technology, so the pilot can focus more on flying the aircraft. At the forefront is the U.S. Department of Agriculture’s Agricultural Research Service. In its Aerial Application Technology Research Unit in College Station, Texas, Dr. Clint Hoffman and his colleagues are tasked with developing new technology to make aerial application more efficient. One of their specific goals is to advance two technologies: automated spray systems that vary their work based on their location, and the “prescription maps” that inform them.
“What we’re doing is collecting an image, in this case off of one of our ag airplanes; we have a camera system that we can put on there,” said Hoffman. “So we fly over the field, take the picture, and then we’ve put together a system that we released in December.” Similar to software already used in commercial equipment but never before in aerial application, the system constructs a prescription map that tells the spray system where product should be applied. For example, if an aerial image indicated that 40 percent of a field had weeds and needed to be sprayed with an herbicide, those areas of the image would be designated as “spray zones”. Feeding the prescription map into a computer onboard the aircraft then allows the spray system to act independently of the pilot.
As Hoffman explains, “the system actually just turns on and off as you fly across the field; and that’s not really something that a pilot can do because their reactions aren’t so fast. We’ve got it down to half a second, on or off as you’re moving through the field.”
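At its core, a controller like the one Hoffman describes is a fast position lookup against the prescription map. A minimal sketch, assuming the map is a simple boolean grid (cell size, grid contents, and names here are all illustrative; the USDA system’s internals are not described in that detail):

```python
# On/off spray control driven by a prescription map, sketched.
# The map is a grid of flags (1 = spray zone); the controller looks
# up the cell under the aircraft's position and toggles the valve
# as zone boundaries are crossed.

CELL_M = 10.0  # each grid cell covers 10 m x 10 m (assumed)

# 1 = weeds detected (spray), 0 = healthy (skip)
PRESCRIPTION = [
    [0, 0, 1, 1],
    [0, 1, 1, 0],
    [1, 1, 0, 0],
]

def should_spray(x_m, y_m, grid=PRESCRIPTION, cell=CELL_M):
    """Look up the prescription cell under the aircraft's position."""
    col, row = int(x_m // cell), int(y_m // cell)
    if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
        return bool(grid[row][col])
    return False  # outside the mapped field: never spray

# Fly east across the middle row of the field, sampling every 5 m;
# the valve opens over the two spray-zone cells and closes again.
track = [(x, 15.0) for x in range(0, 40, 5)]
print([should_spray(x, y) for x, y in track])
```

The half-second switching Hoffman cites is what makes this worthwhile: at typical application speeds an aircraft covers a full grid cell in well under a second, faster than a pilot could react by hand.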
As good as the spray and navigation systems get, it’s really the aerial imagery of precision agriculture that grants its precision. Many different types of images are collected, often in separate light spectra, depending on the analysis sought by crop scientists. The most basic is a standard aerial photograph, providing a farmer with a bird’s-eye view of the farmland in the visible spectrum.
When taken in the near infrared (NIR) spectrum, an image shows vegetation in greater detail due to the tendency of plants to reflect near-IR light. These two types can be combined in what’s called a normalized difference vegetation index (NDVI), depicting a comparison of the near-IR and visible light reflected from the crops. This may sound excessively technical, but it has a very important purpose: sick or sparse vegetation reflects more visible light and less near-IR light than healthy vegetation, so an NDVI can detect not only the presence, but also the vitality of a crop.
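The index itself is a one-line formula applied per pixel: NDVI = (NIR − Red) / (NIR + Red). A minimal sketch, with illustrative reflectance values (the thresholds separating crop from soil vary in practice):

```python
# NDVI per pixel: (NIR - Red) / (NIR + Red), ranging from -1 to +1.
# Healthy vegetation reflects strongly in the near-IR and absorbs
# visible red, so it scores close to +1; bare soil or stressed
# plants score much lower.

def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel."""
    if nir + red == 0:
        return 0.0  # guard against empty/no-signal pixels
    return (nir - red) / (nir + red)

# Reflectances (0-1) for three hypothetical pixels
healthy = ndvi(nir=0.50, red=0.08)   # vigorous crop
stressed = ndvi(nir=0.30, red=0.15)  # sparse or sick vegetation
soil = ndvi(nir=0.25, red=0.20)      # bare ground

print(f"healthy {healthy:.2f}, stressed {stressed:.2f}, soil {soil:.2f}")
```

The normalization by (NIR + Red) is what makes the index robust: it compares the two bands as a ratio, so uniform changes in illumination across the scene largely cancel out.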
But no matter what type of image is used, all suffer from the same limitations. First, even the most detailed images merely detect variability in a spectrum rather than determining the cause of that variability. It takes careful analysis by an agronomist or other agricultural expert to determine what the shifting color patterns mean in each case; often, they actually have to walk through the crops comparing what they see on the image to the actual state of the vegetation in a time-consuming process known as “ground truthing”.
|Orthomosaic images such as the Synthesized Natural Color (bottom), Color-Infrared (center) and NDVI (top) each tell a different story about the health of a crop. Image courtesy of AgPixel|
Only then can the image be properly interpreted. A second limitation follows from the first: because interpretation requires that expert review, images have to be analyzed after landing. Some in the industry believe that the next advancement in precision agriculture will come only when a camera can photograph and analyze the health of crops in a single step—or in other words, one flight.
At AgriThority, a science consultancy based in Kansas City, Mo., UAS expert David Lincoln has been petitioning industry stakeholders to fund his company’s research into developing software that can both detect and diagnose crop health.
“We actually have the science available to design a test to get basically some calibration of lightweight sensors for diagnostic, not just detection,” he said. Many large agricultural companies already own vast test plots, on which they carefully control the treatment of crops and measure the results. Though these plots are meant to develop better crop strains, AgriThority could use them to develop diagnostic software.
“The idea is to embed that expertise in those algorithms,” he said. “The type of research project that I want to do is where we put a small lightweight sensor a couple of hundred feet up on test plots, which can be looked at by the dozens.”
AgriThority would collect images in different spectra of those crops as they experience varying states of health, while scientists do chemical and visual analyses at each photographed moment. Once they determined how crops of varying health levels appear in multispectral images, the researchers would feed that data into their software. Eventually, the software would become smart enough that it no longer requires expert analysis—instead producing a diagnosis instantly for the farmer.
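The training loop described above amounts to supervised classification: spectral readings labeled by ground truthing, then new readings matched against them. A minimal sketch using a nearest-centroid rule for illustration; AgriThority’s actual algorithms are not public, and every band value and label below is made up:

```python
# Diagnosing crop condition from labeled multispectral samples,
# sketched as a nearest-centroid classifier.

# (green, red, near-IR) reflectances labeled by scientists'
# ground-truth analysis of the test plots (hypothetical values)
TRAINING = {
    "healthy":  [(0.10, 0.06, 0.52), (0.12, 0.08, 0.48)],
    "diseased": [(0.14, 0.16, 0.30), (0.15, 0.18, 0.28)],
}

def centroid(samples):
    """Per-band mean of a list of spectral readings."""
    n = len(samples)
    return tuple(sum(band) / n for band in zip(*samples))

CENTROIDS = {label: centroid(s) for label, s in TRAINING.items()}

def diagnose(pixel):
    """Label a new (green, red, NIR) reading by its nearest centroid."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(pixel, c))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label]))

print(diagnose((0.11, 0.07, 0.50)))   # falls in the healthy cluster
print(diagnose((0.15, 0.17, 0.29)))   # falls in the diseased cluster
```

The “embedded expertise” Lincoln describes is exactly this labeling step: the scientists’ chemical and visual analyses become the ground truth the software learns from, so that later diagnoses need no expert in the loop.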
Though he admitted to being platform-agnostic, Lincoln said he believes that the best application of this software might be in small, inexpensive UAS operated by the farmers themselves. Doing so would allow crop scientists to study much larger areas in less time, while giving farmers the potential to diagnose their own crops without hiring out to a service at all.
Lincoln also said that arming UASs with diagnostic imagery could have potential cost savings, even to aerial applicators who make their livings piloting fixed-wing and rotary-wing aircraft. There might be a profit loss for applicators who would rather be paid to treat an entire field than on an as-needed basis, but “I just feel like adding the service to the application costs can offset any loss of product fails and airtime,” he said, suggesting that enterprising aviators could add a UAS as a service addition to their businesses.
When asked whether this diagnostic software could be integrated into a manned platform like a helicopter for aerial photography, he said, “That would really be a tertiary level of development, but certainly feasible. Early on, the industry needs to go to a light, cheap, high-school operated drone to do the scouting, and if a flight by heavy aircraft is necessary, then leave that for application.”
For more than 40 years, the U.S.’s Landsat satellites have provided the longest continuous global record of the Earth’s surface, and they are expected to do so well into the next decade. The newest “bird,” Landsat 8, was launched in 2013.
First placed into orbit in 1972, the Landsat series collects spectral images at moderate spatial resolutions that are coarse enough for global coverage but detailed enough to capture human-scale processes. You can see large man-made objects like highways in a Landsat image, but not individual houses, NASA says. But Landsat helped launch an evolution in unclassified space-based Earth imagery that, over four decades, has produced increasingly fine resolution images. Together with the U.S. Global Positioning System navigation satellites, these developments spurred the development of precision agriculture.
Sensors on today’s satellites can depict objects of 2.5 feet or less and produce images across a range of wavelengths, from ultraviolet through visible light to IR. These generate vast amounts of raw data on soil, substrate, moisture and vegetation conditions. Scientists are striving to transform that data into information useful for crop management.
Not even a month has passed since the comments period for the FAA’s Notice of Proposed Rulemaking (NPRM) regarding small unmanned aerial systems (sUAS) closed. But already, observers are making aggressive predictions about where UASs will see the heaviest commercial use, and many believe that agriculture will be at the top of the list. For the most part, UASs are being suggested as a means to collect aerial imagery. Providers such as AgPixel are already offering UASs as an image collection platform along with satellites and fixed-wing aircraft.
But models such as the Yamaha RMAX are blurring the distinction in roles. The RMAX has seen years of extensive use spraying product on the smaller farm plots of Japan. Now, thanks to a recent Section 333 exemption granted by the FAA, Yamaha can offer the RMAX to customers in the U.S. for precision application. Even though current laws handicap drones with altitude and line-of-sight restrictions, a recent announcement at AUVSI’s Unmanned Systems 2015 international tradeshow revealed the FAA’s plans to investigate the safety of beyond-line-of-sight operations through a series of research projects, potentially granting models like the RMAX a far wider operational range. In fact, one of the projects tasks fixed-wing UAS provider PrecisionHawk with studying precision agriculture.
While Hoffman from the USDA-ARS acknowledged the real concern UASs pose to manned aviators in all industries, he said he doesn’t believe that they will replace manned aerial applicators, at least not in the near future. “To me, they’ll eventually be another tool in the toolbox, so to speak, depending on how things go,” he said, explaining that the smaller payload capacity of UASs such as the RMAX makes them ideal for applications on smaller farms and vineyards, but unsuitable for the large-scale applications necessary on American farmland, even if they are successful overseas.
Of Japanese farms, he said, “A lot of them are around a hectare or two. They’re up in the slope of a mountain, which is not the best area to make an aerial application in.” He added that while China has been embracing UASs for spraying in the short term, they have also been buying more and more manned aircraft for aerial application as their farms grow larger.
His views resonate among agricultural pilots, whose biggest concern seems to be safety, not job security. The National Agricultural Aviation Association (NAAA) published a long list of recommendations to the FAA before the NPRM period closed, which focused completely on how UASs could be made safer for their manned counterparts. Among the list of recommendations were things like painting UASs in aviation orange and white, equipping them with strobe lights and ADS-B Out transceivers and requiring licensure and medical clearance for ground operators and spotters.
As AgriThority’s resident UAS expert, Lincoln has a more optimistic view of their future in the industry. He said that even the safety concerns brought up by the NAAA can be overcome. “I don’t think there has to be a dangerous transition period.”
All of this would seem to imply one thing—that aerial applicators can and should consider adding a UAS component to their services, at least for collecting images. According to the FAA’s current drone operator certification standards, trained pilots are still the best qualified to operate them in the NAS. For the time being, aerial application is still done best with manned aircraft. Perhaps the future of agriculture will see a single operator providing the whole package to farmers: deploying a drone to diagnose the health of crops, feeding that information into a helicopter GPS and then getting back behind the controls.