THE EARLIEST UNMANNED AERIAL SYSTEMS were balloons used by European armies in the mid-1800s to deliver bombs. In the last two centuries, as military units across the globe have advanced in their use of the technology, civilian and commercial uses of unmanned systems have increased too, with drones now an easily accessible tool to collect data.
And while the technology is readily available, the know-how of understanding and using the data for informed decisions is lacking, particularly in agricultural systems.
A team of scientists in the Mississippi Agricultural and Forestry Experiment Station, collaborating with colleagues at Texas A&M and the University of Illinois, is hoping to develop new systems to make information more readily available for producers.
Dr. Joby Czarnecki, an assistant research professor in the Geosystems Research Institute and a MAFES scientist, leads the research team with the goal of placing information in the hands of producers as soon as possible.
Czarnecki has spent 22 years working with precision-agriculture tools. She has seen the revolution in drone technology, from data that took three days to process to results that arrive immediately. She has also seen mounds of data gathered only to sit idle, never used to inform decisions.
"Unmanned aerial systems can provide continuous streams of data, but once that data is produced, what happens next?" Czarnecki asked. "We need a practical way to readily and reliably produce timely, actionable information from the data so that end users, like farmers, do not need to spend their time parsing through the mosaic image to find problems."
Several obstacles stand in the way of producing actionable information from precision-agriculture data. One hurdle is the lack of internet connectivity, which makes it difficult for an unmanned aerial system to upload information to cloud-based processing systems.
Mississippi ranks 42nd in the nation for broadband access, placing it among the ten worst states. Given that limited access and the size of the data being generated, building an unmanned aerial system that provides immediate feedback can be problematic, to say the least.
There is also the issue of image quality produced by the unmanned system. Not all drones are the same: each system carries different cameras and sensors, and the more sophisticated the camera, the higher the price of the UAS. Cameras with multispectral technology can detect electromagnetic wavelengths outside the visible spectrum, capturing more than the three channels, or primary colors, found in images from typical cameras, namely red, green, and blue (RGB).
"We are interested in knowing if machine learning, via something like a deep-learning neural network, would compensate for lower quality images enough that the end user can make the same decision, even from poor-quality images, as would be made with data from a more expensive multispectral camera," Czarnecki said.
The scientists have loaded RGB image datasets into the network to determine if artificial intelligence is a viable way to accurately predict crop health when paired with low-cost sensors, which may lack near-infrared (NIR) spectroscopic functionality. They are also trying to determine if artificial intelligence can guide users toward the correct interpretation of what is going on in the field.
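The trade-off the team is probing can be illustrated with two standard vegetation indices. NDVI, the common measure of crop vigor, requires a NIR band that low-cost RGB cameras lack; indices such as Excess Green (ExG) are one widely used RGB-only proxy. The sketch below is illustrative only and is not the team's actual pipeline; the toy pixel values are invented for the example.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index; requires a NIR band."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-9)

def excess_green(r, g, b):
    """Excess Green index (ExG), computable from an ordinary RGB camera.

    Uses chromatic coordinates so the index is robust to overall brightness.
    """
    r, g, b = (x.astype(float) for x in (r, g, b))
    total = r + g + b + 1e-9
    rn, gn, bn = r / total, g / total, b / total
    return 2.0 * gn - rn - bn

# Toy 2x2 "field": left column dense canopy, right column bare soil.
red = np.array([[40, 120], [45, 130]], dtype=float)
green = np.array([[110, 100], [115, 105]], dtype=float)
blue = np.array([[30, 90], [35, 95]], dtype=float)
nir = np.array([[200, 140], [210, 150]], dtype=float)

print(ndvi(nir, red))                   # high where vegetation is dense
print(excess_green(red, green, blue))   # ExG also separates canopy from soil
```

Both indices separate canopy from soil in this toy scene; the research question is whether a learning system fed only the RGB side can consistently match decisions made from the NIR side.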
To ensure image quality is usable, regardless of the technology used, scientists must confirm the radiometric and geographic calibration of the data. Radiometric calibration compensates for varying illumination in the scene, while geographic calibration ties each pixel to its exact location in the field. The team set out to automate these calibrations so the images presented for processing would be accurate.
"In aerial images, distortions can occur due to the angle of the aircraft and the lighting conditions imposed by the sky," Czarnecki said. "These distortions require correction before analysis."
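One standard way to correct for lighting distortions is the empirical line method: panels of known reflectance are placed in the scene, a linear map from raw digital numbers (DN) to reflectance is fitted from them, and that map is applied to every pixel. The sketch below illustrates the idea only; the panel reflectances and DN values are invented, and this is not necessarily the exact correction the team uses.

```python
import numpy as np

# Known reflectance of two calibration panels (e.g. a dark and a light tarp)
# and the raw digital numbers the camera recorded for them on this flight.
# All numbers here are illustrative.
panel_reflectance = np.array([0.05, 0.56])
panel_dn = np.array([410.0, 3100.0])

# Empirical line method: fit reflectance = gain * DN + offset.
gain, offset = np.polyfit(panel_dn, panel_reflectance, 1)

def dn_to_reflectance(dn):
    """Convert raw digital numbers to surface reflectance for this flight."""
    return gain * np.asarray(dn, dtype=float) + offset

# A raw image band can now be corrected pixel by pixel.
raw_band = np.array([[410.0, 1755.0], [3100.0, 900.0]])
print(dn_to_reflectance(raw_band))
```

Because illumination changes from flight to flight, the gain and offset must be refitted each time, which is why in-scene reference targets, like the smart ground control points described below, matter so much.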
A team of scientists at Texas A&M tackled this obstacle. Dr. Alex Thomasson, department head in MSU's Department of Agricultural and Biological Engineering, led the team while employed at Texas A&M. The researchers invented, designed, constructed, and tested an autonomous mobile ground control point that provides reference data on the location, height, temperature, and reflectance of crops in the field.
"The smart ground control points system was successful in communicating with the UAV during flight, recording positioning onboard the UAV in real-time," Thomasson said. "Our field studies showed that using the ground control points in processing of image mosaics reduced reflectance error by roughly 50 percent and reduced the error of plant-height measurements from digital surface models by about 20 percent."
With image quality, color correction, and location issues resolved, the scientists set out to produce useful, reliable, and important information for specific key production decisions in real time. A corn production system was used in this study to evaluate early season re-planting, rescue nitrogen fertilizer amendments, and harvest readiness.
The team developed a decision tree to mimic the mental process used by agronomists. The team then determined which questions could be answered with automation, which questions could be answered through artificial intelligence, and which decisions would require the human factor.
"Many of the details regarding soil temperature, weather, and proximity to forest edges can be automated, allowing a user to quickly determine if these were likely causing the crop stress exhibited," Czarnecki said. "We also have a damage library that is being used to train the machine to readily identify certain types of abiotic crop stress."
However, the decision tools only narrow down the likely culprits in a production system. If there are 50 different items that may have caused crop failure, the decision tools may be able to narrow the list to 10 that then require human intervention to deduce further, Czarnecki said.
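The narrowing Czarnecki describes can be sketched as a simple rule-based filter: automated context rules out causes that are implausible, and whatever remains goes to the farmer. The candidate causes, thresholds, and field values below are all hypothetical, chosen only to show the shape of the logic.

```python
# Hypothetical candidate causes for a detected crop anomaly.
CANDIDATE_CAUSES = {
    "cold-soil germination failure",
    "deer browsing near forest edge",
    "nitrogen deficiency",
    "herbicide drift",
    "drought stress",
}

def narrow_causes(soil_temp_c, days_since_rain, forest_edge_m):
    """Rule out implausible causes using automated field context."""
    remaining = set(CANDIDATE_CAUSES)
    if soil_temp_c >= 15:      # soil warm enough for normal germination
        remaining.discard("cold-soil germination failure")
    if forest_edge_m > 100:    # anomaly is far from any tree line
        remaining.discard("deer browsing near forest edge")
    if days_since_rain < 5:    # recent rain makes drought unlikely
        remaining.discard("drought stress")
    return sorted(remaining)   # shortlist for the human to inspect

print(narrow_causes(soil_temp_c=18, days_since_rain=2, forest_edge_m=250))
# → ['herbicide drift', 'nitrogen deficiency']
```

The real decision tree is far larger and mixes automated rules with machine-learned classifiers, but the end product is the same: a shorter list for the human to reason over, not a final verdict.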
And while the research is ongoing, the results are promising for farmers interested in using precision agriculture in their production system.
"The decision tools will not make the actual decision on what a farmer needs to do," Czarnecki said. "I've yet to meet a farmer who wasn't keenly aware of which areas in his field historically experience stress. This tool just identifies the areas it detects as anomalies and narrows down the list of potential causes to prune down the decision tree for the user."
While data can be overwhelming, the end goal is to give farmers information upon which they may make informed decisions, Czarnecki added.
This study was funded by the USDA National Institute of Food and Agriculture. In addition to Thomasson, collaborators include Dr. Brian Smith, associate professor in MSU's Department of Industrial and Systems Engineering; Dr. Brien Henry, professor in MSU's Department of Plant and Soil Sciences and associate dean of the Graduate School; and Dr. Girish Chowdhary, assistant professor in the University of Illinois Urbana-Champaign's Department of Electrical and Computer Engineering.