Source: MISSISSIPPI STATE UNIV submitted to NRP
ENHANCING ACCESSIBILITY, RELIABILITY, AND VALIDATION OF ACTIONABLE INFORMATION FROM UNMANNED AERIAL VEHICLE IMAGE DATA
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1015118
Grant No.
2018-67021-27668
Cumulative Award Amt.
$473,430.00
Proposal No.
2017-06662
Multistate No.
(N/A)
Project Start Date
Apr 15, 2018
Project End Date
Apr 14, 2021
Grant Year
2018
Program Code
[A1521]- Agricultural Engineering
Recipient Organization
MISSISSIPPI STATE UNIV
(N/A)
MISSISSIPPI STATE,MS 39762
Performing Department
Geosystems Research Institute
Non Technical Summary
This research represents a critical step in USDA NIFA's efforts to develop tools and precision technologies for monitoring, measurement, and detection in agricultural systems. Despite significant ($700M) investments in largely data-centric agricultural technologies, unmanned aerial system (UAS)-based remote sensing is generally not enhancing farm profitability and productivity. The critical gap is the inability to leverage the data for improved decision making. UAS will not add greatly to farm productivity and profitability until there is a practical way to readily and reliably produce timely, actionable information from the data they offer. Our goal is to improve farm profitability and productivity on a broad scale through UAS-image-data-enhanced decision making. Our research objectives are to (1) develop methods that enable UAS-image data to be converted into actionable information by the end of a 20-minute flight, (2) develop methods that provide for automated radiometric and geographic calibration of the data so they are consistently accurate, and (3) evaluate broadly applicable use cases in corn production that will make clear the value of the data in common farm-management decisions. The novelty of the work is that it focuses on overcoming the bottlenecks that limit potential gains in productivity and profitability. The significant positive impact of this effort will be the development and convincing application of engineered tools (software, processes, and protocols) that produce useful, reliable, and important information for specific key production decisions. Accomplishment of these objectives is essential to the wider uptake of UAS-image data.
Animal Health Component
50%
Research Effort Categories
Basic
(N/A)
Applied
50%
Developmental
50%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
404 | 1510 | 2020 | 100%
Knowledge Area
404 - Instrumentation and Control Systems;

Subject Of Investigation
1510 - Corn;

Field Of Science
2020 - Engineering;
Goals / Objectives
Our goal is to improve farm profitability and productivity on a broad scale through UAS-image-data-enhanced decision making, and we have three specific research objectives. (1) We will develop methods that enable UAS-image data to be converted into actionable information by the end of a 20-minute flight. (2) We will develop methods that provide for automated radiometric and geographic calibration of the data so they are consistently accurate. (3) We will evaluate broadly applicable use cases in corn production that will make clear the value of the data in common farm-management decisions. The three use cases we will evaluate involve the utility of UAS-image data in improving decision making for (a) early-season re-planting, (b) rescue nitrogen fertilizer amendments, and (c) late-season harvest scheduling. We focus on corn because it is one of the most widely grown crops in the country and the world, and one that is grown extensively in the states involved in this project.
Project Methods
Objective 1. Readily usable data. Currently, the intelligent scouting application developed by Dr. Chowdhary does not perform adaptive mission planning, which will be the major effort in this Objective. Once developed, the machine vision algorithm will utilize linear-programming-based approaches to design a UAS path that captures higher-resolution images of anomalous areas. We will also extend the application to handle multispectral imagery. Our approach focuses on identifying minute variations in the HSV values of groups of pixels that are difficult to spot with the naked eye. Stage 1 involves implementing and evaluating computationally streamlined implementations of two classes of unsupervised machine learning (clustering) algorithms: a new Deviance Detection Algorithm (DDA) that leverages techniques in pixel-value change detection, and Dirichlet Process K-means clustering (DP-Means); a rough sketch of this clustering step appears below, following the Objective 2 methods. Stage 2 involves linear programming approaches to optimize the flight mission plan such that a maximum number of interesting images is obtained within the flight endurance of the UAS. We will develop an application on the Android kernel (with iOS applications developed in follow-on work) that can run on modern smartphones and tablets for fast information turnaround. The image processing algorithms will be integrated into the application so they run on the tablet and do not require cloud services.

Objective 2. Reliable data. We will conduct research on a ground-truth system for calibrating on-farm UAV-based image mosaics. The physical system will involve smart ground control points (GCPs) that include elevated reflectance targets as well as built-in solar-powered GPS and wireless communications. Based on the known GCP locations, the algorithm will locate the smart GCPs in the images, geometrically correct the mosaic, and radiometrically calibrate the mosaic, all automatically. Four experiments will be conducted in both years of the project. A set of large calibration tarps will be laid out for the piloted-aircraft flights. A fixed-wing UAV with a multispectral camera will be flown at 100 m above ground level (AGL) to create a mosaic of the field, and a manned aircraft with a multispectral camera will be flown at 3000 m AGL (or lower) to create a single image of the field. Pixel values from the tarps will be used to radiometrically calibrate the single piloted-aircraft image, and the automated software will use the GCPs with reflectance targets to radiometrically correct the UAV-based image mosaic. Several readily identifiable zones will be selected in the field area, and pixel values in these zones will be extracted from the radiometrically calibrated manned-aircraft image, which will be treated as ground truth because it represents the state of the art in aerial remote sensing. Pixel values in these zones will also be extracted from the radiometrically calibrated UAV-based mosaic, and the average of these pixels will be compared to the average of the ground-truth pixels for each selected zone. Reliability of the automatic calibration method will be evaluated by comparing its resulting mosaic to UAS-image mosaics created by other means. Comparisons will involve the quality of the reflectance mosaic in each band as well as the NDVI mosaic. After each experiment and evaluation of the data, modifications will be made to improve the accuracy and practicality of the overall system.
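As flagged under Objective 1, here is a minimal sketch of the Stage 1 clustering idea. DP-Means behaves like k-means but opens a new cluster whenever a point lies farther than a threshold from every existing centroid, so the number of clusters is driven by the data. The block-averaged HSV features and the threshold value below are our own illustrative assumptions, not the project's implementation.

```python
# Illustrative DP-Means clustering over per-block HSV statistics (hypothetical
# features and threshold; the project's DDA/DP-Means code is not shown here).
import numpy as np

def dp_means(X, lam, n_iter=20):
    """Cluster rows of X; a point farther than lam from every centroid
    spawns a new cluster, so k is chosen by the data rather than fixed."""
    centroids = [X.mean(axis=0)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        for i, x in enumerate(X):
            d = [np.linalg.norm(x - c) for c in centroids]
            j = int(np.argmin(d))
            if d[j] > lam:                  # too far from all clusters:
                centroids.append(x.copy())  # open a new one on this point
                j = len(centroids) - 1
            labels[i] = j
        for j in range(len(centroids)):     # standard k-means update step
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, np.array(centroids)

# Example: mean HSV of image blocks as features; small clusters would be
# candidate anomalies to revisit on the inspection flight.
blocks = np.random.rand(500, 3)             # stand-in for real HSV block means
labels, cents = dp_means(blocks, lam=0.35)
```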
Objective 3. Broadly important data. This objective is focused on developing specific methodologies for using UAS-image data in important use cases (i.e., specific applications) for corn production. The three specific use cases we will focus on are early-season re-planting, rescue nitrogen fertilizer amendments, and harvest readiness. For each, we will collect RGB and multispectral imagery at resolutions specified within each case. Sensors and platforms to be used include the integrated 1-inch 20MP RGB camera on a DJI Phantom 4 Pro and the Micasense RedEdge multispectral camera on a SkyHero X8 multi-rotor. Our use cases will compare the decision made under the traditional producer model (specified under each use case) with a UAS-data-driven decision. The UAS-based decision will focus on bringing incremental improvements to the objectivity of the factors that influence the decision in the areas where UAS offers a strength. We will expand the applicability of our Mississippi-centric data to other regions by incorporating big data from regional and national scales (e.g., yield averages, commodity price trends). We will then conduct further analysis to discover which techniques enable identification of hidden patterns in the large data sets, recommend dimensionality reductions that cut the number of attributes without losing relevant patterns and trends, and construct models based on discovered attribute relationships that can be used to make data-driven predictions. The first step will be to use dimension reduction to determine the variables that account for the majority of the variation in the data. A second step will be to use cluster analysis to group the observations into similar clusters to identify patterns in the data. The relationships discovered through dimension reduction and clustering will enable the development of a multi-objective optimization framework to determine the near-optimal combination of control variables for a given crop circumstance.
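A minimal sketch of this two-step analysis (dimension reduction, then clustering), assuming scikit-learn as the toolset; the attribute matrix, variance cutoff, and cluster count below are placeholders, not values from the project's data.

```python
# Illustrative dimension reduction followed by cluster analysis (sketch only;
# the project's variable set and cluster counts come from the real data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

X = np.random.rand(200, 12)             # stand-in for plot-level attributes

Xs = StandardScaler().fit_transform(X)  # put attributes on a common scale
pca = PCA(n_components=0.90)            # keep components explaining 90% of variance
Z = pca.fit_transform(Xs)
print(f"{pca.n_components_} components retain 90% of the variance")

# Group observations in the reduced space to surface patterns.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
```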

Progress 04/15/18 to 04/09/21

Outputs
Target Audience: With the health crisis, efforts were constrained by a lack of events and opportunities to interact with people. Most of the documentable effort consisted of formal classroom instruction to students, in which the engineering aspects and scientific hurdles were presented. Other audiences included scientists, engineers, and researchers in a professional setting. Changes/Problems: The final result is certainly a departure from the product as originally envisioned. Initially, the problem that emerged was an inability to tap into a live data stream from our selected sensor. We changed hardware and continued down the original route. However, with the rapidly evolving pace of UAV technology, we were quickly working on a problem that in many ways had just been solved by industry. In one of our early attempts to overcome the issues with the multispectral sensor, Dr. Chowdhary's team had started a machine learning approach that seemed promising, and effort was shifted to fully mature that portion of the work. Consequently, the final product, while addressing many of the issues we set out to overcome, is not the complete turn-key solution we had originally expected at the end of the project. It does, however, address many of the individual components that interact to limit the usefulness of UAVs for decision making. What opportunities for training and professional development has the project provided? Nothing Reported How have the results been disseminated to communities of interest? Dr. Czarnecki taught a data-enabled agriculture course for freshmen in the Fall 2020 semester. Co-PIs Chowdhary, Smith, and Thomasson were all invited to provide guest lectures and share their experiences related to this project with the students, in addition to the information Dr. Czarnecki shared. Students were exposed to the current issues with the technology, the innovations undertaken to address those issues, and what the individual researchers were doing in this arena. This was mirrored by efforts at UIUC, as Dr. Chowdhary also presented several seminars across the country and globally highlighting challenges and opportunities in digital agriculture. Dr. Chowdhary's Master's student, Corey Davidson, has drafted both his thesis and a related journal article; neither has been officially submitted yet, but submission is expected soon. Another of his staff worked with other researchers and students to collect data and incorporate UAV data into their individual research programs. What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? Impact: The utility of UAV data is limited on many fronts, and we have addressed several of these issues with this project. The first is reducing the cost barrier by taking an image from an inexpensive sensor and generating an index value that simplifies the information being communicated. Our testing indicated the result was comparable in practice to more expensive alternatives, although room for improvement was noted. We also created methods to provide background information with UAV data that supplies context. If we think about the mental process a subject matter expert would use to identify the cause of a production stress, we know that they are able to consider external factors that may be important to understanding why the stress is present. We incorporated the ability to have weather data matched to the UAV imagery. Weather is explanatory in many issues of crop stress and also speaks to image quality concerns. Together with our prior work on improving the radiometric quality of UAV imagery, we have collectively improved several aspects of data quality and usability. Objective 1. We evaluated the Pix2Pix conditional GAN (cGAN) model as a tool to take RGB composite imagery and generate a pseudo-NDVI-style crop health index. A visual rating test with five subject matter experts was conducted to determine whether Pix2Pix output of a crop health index could serve as a proxy for NDVI and NDRE. The concern was that the model-generated output might introduce aspects to the image that would inappropriately guide the user to a decision (e.g., if all the output images were brighter, they might give a false sense of crop health). Quantitative metrics were also used to compare input to output images: structural similarity (SSIM), peak signal-to-noise ratio (PSNR), and mean squared error (MSE). SSIM evaluated the extent to which input and output images were structurally comparable, PSNR measured the quality of the output image, and MSE measured the error between input and output. We used archival UAV data from corn, cotton, and soybean as inputs for the model. These data were collected with three different UAV sensors; RGB and multispectral imagery were collected during the same flight for consistent flight and illumination conditions. The visual observers were presented with a blinded sample set of 30 image trios, where a trio consisted of the original image, the real multispectral-derived index, and a Pix2Pix-generated crop health image based on the RGB composite. Our visual raters were university faculty with doctorates in agricultural fields, all of whom routinely utilize UAVs and imagery in their research and teaching responsibilities. They were asked to select, between the real and Pix2Pix-generated images, which was the best representation of the original RGB image and which indicated better crop health. Overall, 50% of the responses correctly identified the real image. However, from the comments provided, although observers could at times correctly identify the true image, it was apparent that the perceived differences between true and Pix2Pix-generated images would make a negligible difference in the management decision, as no trends were identified in the selection of which image indicated better crop health. It was therefore concluded that processing with Pix2Pix had no detrimental impact on the end user's ability to make decisions. Regarding the quantitative measurements, the model performed consistently when presented with multiple crops and multiple sensors.
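For reference, the three metrics named above can be computed as sketched below, with scikit-image assumed as the toolset and synthetic stand-in arrays in place of the archival imagery.

```python
# Sketch of the quantitative comparison between a real multispectral-derived
# index image and its Pix2Pix-generated counterpart (single-band arrays
# assumed; values here are synthetic stand-ins, not project data).
import numpy as np
from skimage.metrics import (structural_similarity,
                             peak_signal_noise_ratio,
                             mean_squared_error)

rng = np.random.default_rng(0)
real = rng.random((256, 256))                                  # ground-truth index
fake = np.clip(real + rng.normal(0, 0.05, real.shape), 0, 1)   # model output

ssim = structural_similarity(real, fake, data_range=1.0)   # 1.0 = identical structure
psnr = peak_signal_noise_ratio(real, fake, data_range=1.0) # higher = better quality
mse = mean_squared_error(real, fake)                       # lower = smaller error
print(f"SSIM={ssim:.3f}  PSNR={psnr:.1f} dB  MSE={mse:.5f}")
```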
Some highlights from the analyses relate the visual ratings to the quantitative metrics, as there was an observable relationship between the metrics and the ability of raters to correctly identify the true image. This suggests that a model such as this could be fine-tuned to provide the highest-quality output for end users. Objective 2. Nothing to report; activity concluded in the prior year. Objective 3. To better support decision making, the original intent was to incorporate incoming data streams that could help filter through potential causal factors for crop health stresses, per our three focus areas. Early in the project, Dr. Smith's lab was tasked with finding and/or developing a source of time-appropriate weather data to feed into the model of crop health. However, as the project evolved to focus less on an app and more on addressing the individual components of quality, this line of work was sidelined. Following the results from Objective 1, the task was re-engaged and completed by developing code to extract weather data from the NOAA Land-Based Station datasets. Functionally, the time stamp from the images is aligned to the instantaneous weather. The result is code that can be incorporated into the model of crop health, allowing users to consider the role weather conditions played in the crop health index produced by the Pix2Pix model. The user-friendliness of the NOAA Land-Based Station data allows the code to be readily modified for application to other models.
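A minimal illustration of this timestamp alignment, assuming the station observations and image capture log are available as tables; the column names and matching tolerance are our own assumptions, not the project's actual code.

```python
# Sketch: attach the nearest-in-time weather observation to each UAV image
# (column names and the 30-minute tolerance are illustrative assumptions).
import pandas as pd

wx = pd.DataFrame({
    "obs_time": pd.to_datetime(["2020-07-01 10:00", "2020-07-01 11:00"]),
    "air_temp_c": [28.4, 30.1],
    "wind_ms": [2.1, 3.4],
}).sort_values("obs_time")

imgs = pd.DataFrame({
    "image": ["IMG_0001.tif", "IMG_0002.tif"],
    "capture_time": pd.to_datetime(["2020-07-01 10:12", "2020-07-01 10:58"]),
}).sort_values("capture_time")

# merge_asof joins each image to its closest observation in time, within a
# tolerance so stale weather is never matched to a flight.
matched = pd.merge_asof(imgs, wx, left_on="capture_time", right_on="obs_time",
                        direction="nearest", tolerance=pd.Timedelta("30min"))
print(matched)
```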

Publications


    Progress 04/15/19 to 04/14/20

    Outputs
Target Audience: Dr. Czarnecki (MSU) was invited to speak with two influential groups during the reporting period. In both instances the request was to address how automation and technology are contributing to agriculture. The first was Mississippi State University's Research and Technology Advisory Council. The Council provides strategic advice and assistance to the MSU VP for Research and Economic Development on a wide variety of topics that directly affect the overall research productivity of the institution and its economic development role. It is comprised of 11 appointed members, most of whom represent private industry in the State of Mississippi. The second was the Research Center Administrators Society at their winter meeting. The Society is a national non-profit organization whose membership comprises agricultural research station managers, directors, and university, college, and USDA administrators. In both presentations, USDA NIFA was acknowledged as the funding agent for this effort. Dr. Chowdhary (UIUC) has interacted with colleagues working in the related research areas of deep learning, robotics, and agricultural automation. He has additionally been interacting with students at both the university and K-12 levels to excite them about these fields. Finally, he has been engaging with our producer stakeholders. The bulk of his engagement has been with his undergraduate students, as he has incorporated elements of UAS-based data collection and analysis for agriculture into his undergraduate teaching. Dr. Thomasson (TAMU) participated in a recorded interview for a podcast by Precision Farming Dealer entitled "Outlook and Obstructions to Advancing Automation in Ag." He was also interviewed for an article in Southwest Farm Press entitled "AI Will Improve On-Farm Decision Making" and for an article in The Western Producer entitled "Precision Ag Struggling for Full Acceptance." He also presented talks at the Texas Plant Protection Conference in December 2018 and 2019. All of these efforts were aimed at disseminating this and related research to the farming community. Furthermore, Dr. Thomasson and his research team presented research related to this effort to agricultural engineers at the annual meeting of ASABE, to cotton producers and researchers at the Beltwide Cotton Conference, to plant biologists at the Phenome conference, and to optical engineers and physicists at SPIE's conference on "Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping," of which he is co-organizer. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? MSU - Mississippi State University apprenticed one undergraduate student as a UAV pilot. The student was a graduating senior in the Agricultural and Biological Engineering Department. He obtained his FAA remote pilot certificate following classroom instruction. Thereafter, following a probationary period of close supervision and interaction with full-time University staff engaged in UAV operations, the student was tasked with operating the dedicated aircraft for this project, collecting image data, and preparing it for delivery to UIUC. During the summer months the student was fully engaged in UAV operations and logged an average of 10 hours per week as pilot in command. Following his graduation in May 2020, the student began work as an AGRIntelligence Technician with Helena Chemical Co. This student was covered with University funds.
TAMU - Texas A&M University had one agricultural engineering postdoc who developed his expertise on the hardware and software associated with this project; he is now a faculty member at a university in South Korea. Texas A&M also had one Ph.D. student who developed his skills in experimental design and execution as well as data analysis. Texas A&M expanded its team of UAV pilots to include four Ph.D. students, one undergraduate student, and one postdoc. This group developed marketable capabilities in the advancing area of agricultural autonomy and precision agriculture. UIUC - Three project-affiliated students received training on conducting research and executed elements of the research both independently and in collaboration. Lessons learned from this research project pertaining to data collection and analysis were incorporated into lab activities for undergraduate courses. How have the results been disseminated to communities of interest? Refereed journal articles and presentation papers (see the products section for details). Interviews for popular press articles (see the target audience section for details). Presentations given at farmer-oriented and scientist-oriented conferences. Incorporation of lessons learned into undergraduate curricula. Engagement with the end user community. What do you plan to do during the next reporting period to accomplish the goals? The critical tasks remaining in this project relate to the deep learning network. Immediate plans are to conclude the quality assessment of the deep learning outputs. The primary goal of the statistical analyses will be to estimate and ensure high accuracy between the image produced by the network and the ground-truth image and data used for comparison. Additional statistics will include the likelihood of a qualitative observer being able to accurately discriminate between real and fake images, and what impact both have on decision making (one way to formalize this is sketched at the end of this section). Once testing is complete we will begin authoring a manuscript for publication on this portion of the project. In the meantime, we must re-integrate some functions that were split into separate thrusts. If the deep learning pipeline can be confirmed to be working successfully, it needs to be augmented to include extraneous data streams such as climate to improve the autonomous decision-making abilities of the network. It also needs to be tested using sub-par versions of current datasets to determine the effects of data quality on the outcome, and thus what improvements are gained when data quality is improved.
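As flagged above, one way to formalize the observer-discrimination statistic is a simple binomial test against chance; the counts below are placeholders, not project results.

```python
# Sketch: can raters tell real from Pix2Pix-generated images better than
# chance? (hypothetical counts, not the project's rating data)
from scipy.stats import binomtest

correct = 75   # hypothetical correct identifications across all raters
total = 150    # hypothetical total forced-choice trials
result = binomtest(correct, total, p=0.5, alternative="greater")
print(f"p-value = {result.pvalue:.3f}")  # large p => indistinguishable from guessing
```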

    Impacts
What was accomplished under these goals? Impact: The utility of unmanned aerial systems in agriculture is hampered by data quality and a lack of guidance for end users on how to interpret data. As with any aerial image, distortions exist due to the angle of the aircraft and the lighting conditions imposed by the sky, and these distortions require correction before analysis. With this project we have successfully developed an autonomous platform for the field that corrects for these distortions by improving the geometric and radiometric quality of the data. The mobile calibration stand communicates with the aircraft in flight to provide the information necessary to apply correction factors that remove distortion from the image. However, even quality data cannot interpret itself. Therefore, we are utilizing deep learning neural networks to assist users with interpretation of the data. Deep learning networks, much like humans, learn from sample data and experience. Thus we have been teaching our network to identify crop production problems related to our case studies using our collected field and image data. With proper training and development, the network will become capable enough to function like an assistant to the end user. Objective 1. Methods for actionable information. Because of the difficulties noted in the previous report related to accessing the data card on the multispectral sensor while in flight, a different tactic was adopted to obtain actionable information. The basic idea is that an end user does not necessarily care how the data are obtained, so long as decision-making ability is not compromised and the same management action is prescribed. In line with this shift, we are attempting to pair data streams we can access with neural networks to improve decision-making ability. We set up a deep learning network based on the Pix2Pix architecture. The model was trained and tested using standard RGB image datasets collected with UAVs by MSU and UIUC. Pre-processing work was done to get flight imagery into the proper arrangement to run through the network. The network then generates fake versions of the original images. The fake versions as well as the real versions are converted to color heat maps that represent crop health (a pseudo-NDVI-style vegetation index). The intent is to determine whether end users can distinguish real images from fake images, and whether or not their management decision is biased when using one or the other. We are using our field data to determine whether color heat maps generated from real images are accurate. Structural similarity (SSIM) and other quantitative metrics are also being used to estimate the accuracy of the "fake" images in comparison with the "real" images for our specific case studies in corn. The key outcome expected from these methods is to determine whether deep learning is a viable way to accurately predict crop health when paired with low-cost sensors that may lack NIR functionality, and whether deep learning can guide users toward the correct interpretation of imagery in our case studies. The outcome represents a change in condition, as currently most UAV processing ends with the creation of a classified image; this system would go beyond anomaly detection and indicate potential causal agents.
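The heat-map rendering step described above can be sketched roughly as follows; the colormap, value range, and synthetic index array are illustrative assumptions rather than the project's actual settings.

```python
# Sketch: render a pseudo-NDVI crop health index as a color heat map for end
# users (colormap and scaling are illustrative choices, not project settings).
import numpy as np
import matplotlib.pyplot as plt

index = np.clip(np.random.randn(256, 256) * 0.2 + 0.5, -1.0, 1.0)  # stand-in index

fig, ax = plt.subplots()
im = ax.imshow(index, cmap="RdYlGn", vmin=-1.0, vmax=1.0)  # red = stress, green = healthy
fig.colorbar(im, ax=ax, label="crop health index")
ax.set_axis_off()
fig.savefig("health_heatmap.png", dpi=150)
```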
Objective 2. Methods for calibrated data. Dr. Thomasson and his team have invented, designed, constructed, and tested an autonomous mobile ground control point that provides reference data on position for georectification as well as height, temperature, and reflectance of crops in the field. The device navigates farm fields in collaboration with a UAS, providing multiple instances of references in the imagery of the field during UAS operation. Data from the image references can be used in an automated fashion to calibrate the imagery and thereby provide very accurate data for decision making (a minimal sketch of the reflectance-calibration idea appears at the end of this section). This product follows field experimentation in which an autonomous mobile ground control point performed as designed. Image data were collected by the UAS in collaboration with the mobile ground control point. Data on georeferencing as well as object height, reflectance, and temperature were collected in order to validate the utility of the system in improving accuracy. The accuracy levels achieved were 10 cm for georeferencing, 4 cm for height, 1% for reflectance, and 2 °C for temperature. The key outcome is a functional system that enables high-accuracy image data to be collected with UAS, enabling high-confidence decision making for farm operations and agricultural research. The system provides a change in condition, as it will allow a service provider to bring minimal equipment to the field and, with minimal effort, use the autonomous system to provide improved-quality data to the client in less time. Objective 3. Case studies. Data from field trials as well as coordinated UAV imagery were delivered to UIUC to feed into the image processing pipeline. These data included (1) small-plot studies on corn with varying levels of nitrogen, documented through spectrophotometer readings as well as tissue sampling; (2) data from research fields where nitrogen was purposely withheld, adjacent to adequately fertilized corn; and (3) data collected simultaneously with multispectral and RGB sensors to evaluate the predictive ability of these sensors under the same flight, sky, and field conditions. We are awaiting the training datasets generated by the Pix2Pix network so that our agronomic experts can evaluate the utility of the deep learning product. At this point, all MSU effort is going to support UIUC on the deep learning and interpretation tasks for the project.
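As noted under Objective 2 above, the reflectance calibration that the GCP targets enable is, in essence, an empirical-line fit. A minimal sketch under assumed target reflectances and digital numbers follows; the project's automated software is not reproduced here.

```python
# Sketch of an empirical-line radiometric correction: fit digital numbers from
# reference targets of known reflectance, then apply the line to a whole band
# (target reflectances and DN values below are placeholders).
import numpy as np

known_reflectance = np.array([0.05, 0.20, 0.40])   # lab-measured target panels
observed_dn = np.array([2100.0, 7900.0, 15600.0])  # mean DN over each panel

gain, offset = np.polyfit(observed_dn, known_reflectance, 1)  # linear fit

band_dn = np.random.randint(0, 2**16, size=(100, 100)).astype(float)  # stand-in band
reflectance = gain * band_dn + offset   # calibrated surface-reflectance estimate
```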

    Publications

    • Type: Journal Articles Status: Published Year Published: 2019 Citation: Han, X., Thomasson, J. A., Xiang, Y., Gharakhani, H., Yadav, P. K., & Rooney, W. L. (2019). Multifunctional Ground Control Points with a Wireless Network for Communication with a UAV. Sensors, 19(13), 2852.
    • Type: Journal Articles Status: Published Year Published: 2020 Citation: Han, X., Thomasson, J. A., Wang, T., & Swaminathan, V. (2020). Autonomous mobile ground control point improves accuracy of agricultural remote sensing through collaboration with UAV. Inventions, 5(1), 12.
    • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Han, X., Thomasson, A., Siegfried, J., Raman, R., Rajan, N., & Neely, H. (2019). Calibrating UAV-Based Thermal Remote-Sensing Images of Crops with Temperature Controlled References. In 2019 ASABE Annual International Meeting (p. 1). American Society of Agricultural and Biological Engineers. Boston, MA.
    • Type: Conference Papers and Presentations Status: Published Year Published: 2020 Citation: Thomasson, A. (2020). Cooperative Air and Ground Robots for Field Phenotyping. Phenome 2020. Tucson, AZ.


    Progress 04/15/18 to 04/14/19

    Outputs
Target Audience: The flight control app and decision support are targeted at end user groups. The need to make the app time-efficient while maximizing decision-making value is most relevant to this audience, and that goal is driving development of the product. Researchers, by contrast, often have the luxury (and necessity) of waiting for optimum flight conditions, whereas field scouts have limited time to spend in each field. For now, the smart ground control point technology is instead geared toward the research community, because (for example) crop breeders are likely more willing to add cost to their data-acquisition systems to make them much more accurate. In the long run, however, the technology should be applicable on-farm. Changes/Problems: We have had difficulties in accessing the Micasense data stream during flight. The way the data are stored within the sensor does not lend itself easily to real-time access to image data. RGB systems, such as the DJI Phantom, have streaming capabilities inherent in their function. However, DJI frequently changes its internal coding, making these platforms challenging to interface with the app. We are working on workarounds, but it is not a straightforward path. Additionally, with planting behind schedule in most of the US, year 2 field data collection efforts are not proceeding as planned; however, we will likely see a lot of fields with issues if the farmers can manage to get a stand of corn in the field. What opportunities for training and professional development has the project provided? Mississippi State University supports one master's-level graduate student on this project, under the direction of Dr. Smith. The student is earning a degree in Industrial and Systems Engineering. Involvement with this project has broadened the student's skills in data analysis and increased the student's understanding of agronomy. The student supports the project by identifying data streams and authoring code for inclusion of those data streams in processes within the application developed by the University of Illinois. The University of Illinois supported one graduate student and a host of undergraduate student workers with project funds. The graduate student has primarily been responsible for leading the app development and managing the efforts of the undergraduate student workers. Texas A&M University has had one agricultural engineering postdoc who is developing his expertise on the hardware and software associated with this project. They have also had one Ph.D. student who is developing his skills in experimental design, execution, and data analysis. They have also now expanded their team of UAV pilots to include four Ph.D. students, one undergraduate student, and one postdoc. This group is developing marketable capabilities in the advancing area of agricultural autonomy and precision agriculture. How have the results been disseminated to communities of interest? Texas A&M has been active in publishing and presenting their portion of the effort at relevant conferences. What do you plan to do during the next reporting period to accomplish the goals? Objective 1. The University of Illinois will continue to develop the app, and has begun working with Dr. Smith to mesh program coding for integration of the external data streams that empower decision support. Machine learning efforts will continue to increase the automated decision-making capacity of the app for end users.
Additionally, effort will be devoted to engaging with end user communities through publications and presentations. Objective 2. Texas A&M University will continue to develop the mobile GCPs and will work with the other collaborating universities to implement these improvements in data accuracy in practical research and agronomic situations. Objective 3. Mississippi State University purchased the same UAS being used for app development. A variety of corn fields, many under Dr. Henry's purview, have been selected for data collection during the season, and data are already being collected according to flight protocols. Dr. Czarnecki will visit Dr. Chowdhary's lab in June to learn to use the app and will bring that technology back for testing during the growing season. At the same time, a student from Dr. Chowdhary's lab will be in Starkville, MS, during the summer and will work with Dr. Smith and Dr. Czarnecki to label archived UAV data to feed into the data pipeline that will inform the machine learning for the app.

    Impacts
What was accomplished under these goals? Objective 1. We will develop methods that enable UAS-image data to be converted into actionable information by the end of a 20-minute flight. In the first year, progress was made toward development of an app for mobile tablets that conducts a preliminary flight, called the scout flight, which provides a high-level overview of the entire area of interest. Anomalies detected during the scout flight are logged as points of interest for a second flight. Heuristics are used to rank the priority of anomalies, and a second flight plan is generated to re-visit them. The second flight plan is a point-to-point flight that covers only these identified areas. The second flight, also called the inspection flight, provides higher-resolution imagery because the data are collected at significantly lower altitudes. So far it has been difficult to automate the upload of the second flight plan to the aircraft without landing. We are also pushing toward using data from other sensors, but have found difficulties in gaining access to the data while the aircraft is in flight. Objective 2. We will develop methods that provide for automated radiometric and geographic calibration of the data so they are consistently accurate. Smart ground control points (GCPs) have been developed and shown to successfully communicate with a UAV during flight, recording GCP position onboard the UAV in real time. Field studies have shown that using the GCPs in processing of image mosaics reduces reflectance error by roughly 50%, and it also reduces the error of plant-height measurements from digital surface models by about 20%. We are currently transitioning the GCP hardware and software to an autonomous mobile system that will enhance practicality for on-farm use. Objective 3. We will evaluate broadly applicable use cases in corn production that will make clear the value of the data in common farm-management decisions. Because the app was not available for use immediately at the start of the growing season, data were collected according to the same protocol for post-processing. The first iteration of protocols produced a mismatch between resolution and crop size, meaning that plants were not visible in many instances during the emergence flights. This points to a need for flexible altitudes for different stages of growth and for the key concerns being scouted. This was not an issue with later flights because the corn was sufficiently large by the time rescue nitrogen and harvest were concerns. However, as it has become more apparent that access to the data from the Micasense RedEdge sensor will be limited during flight, most of the data collected do not necessarily advance the development of the app. Adjustments have been made for year 2. On the data side, a decision tree was developed to mimic the mental process followed by crop scouts to diagnose issues in corn; for each decision, the necessary input data or actions were identified, and data streams were found that can be matched to provide the external information necessary for decisions that require data.
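Purely as an illustration of such a decision tree's shape, keyed to the project's three use cases, the skeleton below encodes one branch per use case; the thresholds and input names are invented placeholders, not the criteria the project developed.

```python
# Illustrative decision-tree skeleton mirroring the three corn use cases
# (thresholds and inputs are placeholders, not the project's actual tree).
def corn_scout_decision(stage, stand_pct=None, n_index=None, moisture_pct=None):
    if stage == "emergence":
        # Early-season re-plant call needs a stand count from imagery.
        return "consider re-plant" if stand_pct is not None and stand_pct < 75 else "keep stand"
    if stage == "vegetative":
        # Rescue-nitrogen call needs a nitrogen-sensitive index plus weather context.
        return "rescue nitrogen" if n_index is not None and n_index < 0.4 else "no amendment"
    if stage == "maturity":
        # Harvest scheduling keys on grain moisture.
        return "schedule harvest" if moisture_pct is not None and moisture_pct <= 20 else "wait"
    return "collect more data"

print(corn_scout_decision("vegetative", n_index=0.32))
```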

    Publications

    • Type: Journal Articles Status: Published Year Published: 2018 Citation: Han, X., J.A. Thomasson, G.C. Bagnall, N.A. Pugh, D. Horne, W. L. Rooney, J. Jung, A. Chang, L. Malambo, S. Popescu, I. Gates, and D.A. Cope. 2018. Measurement and calibration of plant-height from fixed-wing UAV images. Sensors 18:4092; DOI:10.3390/s18124092
    • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Han, X., J.A. Thomasson. 2019. Coordination and control for automatic mobile ground control points in agricultural remote sensing. In Proc. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV. Bellingham, Wash.:SPIE.
    • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Han, X., J.A. Thomasson, and Y. Xiang. Multifunctional Ground Control Points based on Wireless System Network for UAS Application. ASABE Paper No. 1800308. St. Joseph, Mich.: ASABE.
    • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Han, X. April 2019. Coordination and control for automatic mobile ground control points in agricultural remote sensing. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV conference, SPIE Defense and Commercial Sensing symposium.
    • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Han, X. July 2018. Multifunctional Ground Control Points based on Wireless System Network for UAS Application. ASABE Annual International Meeting