Source: UNIVERSITY OF ARKANSAS submitted to NRP
PREDICTING RICE YIELD WITH UAV IMAGERY AND DEEP LEARNING
Sponsoring Institution
State Agricultural Experiment Station
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1027724
Grant No.
(N/A)
Cumulative Award Amt.
(N/A)
Proposal No.
(N/A)
Multistate No.
(N/A)
Project Start Date
Sep 17, 2021
Project End Date
Sep 30, 2025
Grant Year
(N/A)
Program Code
[(N/A)]- (N/A)
Recipient Organization
UNIVERSITY OF ARKANSAS
(N/A)
FAYETTEVILLE, AR 72703
Performing Department
ASU-College of Engineering
Non Technical Summary
The ability to predict crop yield throughout the growing season is an important goal for precision agriculture. UAV imagery combined with advanced computer vision techniques offers many potential benefits for rapid detection of plant stress and timely management interventions, but guidance for this integration is still limited. The proposed work will develop a novel unsupervised machine learning approach to characterize crop developmental trajectories from spectral data. This is an important step towards the broader next goal of our project: improving the generalizability of deep learning models for rice yield prediction by training on a diverse dataset of 15 fields representing multiple cultivars, management conditions, and environments.
Animal Health Component
0%
Research Effort Categories
Basic
0%
Applied
0%
Developmental
100%
Classification

Knowledge Area (KA): 203
Subject of Investigation (SOI): 1530
Field of Science (FOS): 2080
Percent: 100%
Goals / Objectives
A key goal for modern agriculture is to increase the efficiency of crop production while conserving natural resources. Precision agriculture seeks to address this challenge by using advanced technology to monitor plant growth throughout the growing season, enabling the application of inputs only when and where they are needed. Unmanned aerial vehicles (UAVs) equipped with multiple sensors offer tremendous advantages for precision agriculture and for planning management interventions with high spatial and temporal resolution (Maes & Steppe 2019). State-of-the-art computer vision methods are also now being integrated with UAV imagery for many specific applications, including early yield prediction (Nevavuori et al. 2019; Duan et al. 2019), disease and pest detection (Liu & Wang 2021), and planting density estimation (Liu et al. 2020).

A major challenge for the development of deep learning models, in agriculture and other domains, is the generalization of models to new contexts and datasets. For the task of yield prediction, for example, it can be difficult to acquire training datasets encompassing many years, management conditions, and crop cultivars. Consequently, training sets often comprise UAV imagery from a few fields and years, split into many smaller sub-images to reach the tens of thousands of samples needed to train deep learning models efficiently (Nevavuori et al. 2019). While these sub-images capture extensive within-field heterogeneity in soil nutrient status and moisture availability, imagery can also vary extensively among cultivars and environments, complicating generalization and deployment of trained models for use in precision agriculture. A particular challenge is accounting for differences in developmental trajectories across sites, years, and cultivars (Duan et al. 2021). Even within the same field on a given calendar date, individual plants vary extensively in developmental stage, blurring the relationship between remotely sensed spectral signatures and plant stress, since spectral characteristics also change across development.

As part of our initial work on this problem, our interdisciplinary network of collaborators developed several deep learning architectures for predicting rice yield from multitemporal UAV imagery of a single 16-hectare field in Lonoke County, Arkansas (Bellis et al., in review). Deep learning models showed improved predictive performance compared to traditional statistical learning approaches, with root mean square error as low as 7.4% of the mean yield. While this study produced important recommendations for suitable model architectures for deep learning prediction of rice yield, because the models were trained on data from a single field, they are not expected to generalize well to imagery from other fields and cultivars.

The overarching goal of the proposed research is to improve the generalizability of deep learning-based yield prediction models to new cultivars, environmental contexts, and management conditions. For FY22, we will work towards this goal through the following project aim: develop and test new approaches to correct for spectral variation associated with developmental progression rather than plant stress. A novel unsupervised learning method to improve the temporal resolution of UAV imagery via developmental phasing will be evaluated. The results of this work will inform a strategy for generalizing deep learning-based yield prediction models to new environmental contexts, cultivars, and management conditions. The utility of a 'phenophase detection' subnetwork, which selects stage-specific models prior to yield prediction, will also be evaluated for its potential to improve the deep learning models of Bellis et al. (in review).
Project Methods
UAV imagery of plants observed in different fields on the same calendar date, and even in different areas of the same field, comprises individuals at different developmental stages. This presents a challenge for analysis, since spectral signatures of plant canopies change throughout plant development, in addition to being affected by water and nutrient availability (Xue & Su 2017). To correct for developmental variation in spectral indices, we will evaluate a new approach for developmental staging of individual 'pixels' from UAV imagery. The method is inspired by an unsupervised learning algorithm previously developed to increase the temporal resolution of single-cell RNA transcriptomic datasets (Trapnell et al. 2014). Trapnell et al. (2014) addressed the problem of averaging across a population of cells measured at a single timepoint; because such a population includes many cells at different intermediate stages of differentiation, averaging the gene expression signal within a single timepoint obscures patterns in the data by hiding individual cell trajectories. We will extend this method to increase the resolution of individual 'pixel' trajectories from UAV images.

To evaluate this method, we will leverage several large datasets collected by collaborator Dr. Ahmed Hashem (A-State College of Agriculture, University of Arkansas System Division of Agriculture). These include UAV imagery from 15 rice fields in Arkansas, each imaged at multiple timepoints throughout the growing season. The images comprise a diversity of cultivars and management conditions and represent growing seasons from two years (2019 and 2020). In addition to rice, we will evaluate the utility of our unsupervised developmental staging method using additional datasets for soybean and corn collected by Dr. Hashem in collaboration with Dr. Steven Green (A-State College of Agriculture, University of Arkansas System Division of Agriculture). A benefit of these additional datasets is that they were collected in the context of an experimental nitrogen rate study, and so will allow careful evaluation of the ability of our modeling framework to distinguish changes in spectral signatures associated with development versus nutrient status.

For each included dataset, we will consider pixels at two resolutions (5 cm and 50 cm) after masking soil and background pixels. For 2019 and 2020, orthomosaic images with more than 25 different vegetation indices (VIs) each have already been generated by Dr. Hashem and his undergraduate mentee. We will reconstruct the ordering of pixels through development using the Monocle algorithm (Trapnell et al. 2014), adapted to UAV images in which each pixel within a field is associated with measurements of 25 VI features at 10 timepoints. In brief, this involves 1) dimension reduction of the VI data using independent component analysis, 2) constructing a minimum spanning tree and identifying the longest path through this lower-dimensional VI space, and 3) assigning each pixel a value of 'pseudotime', which captures the pixel's progression along the developmental trajectory learned in (2). Variation in pseudotime assignments of pixels will be compared to collected metadata describing the developmental stage observed in the field at each flyover date (one observation per field), estimated growing degree days, and growth stage predictions (from https://dd50.uaex.edu).
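The three-step ordering procedure above can be sketched in a few lines; this is an illustrative simplification of the Monocle-style trajectory reconstruction, not the project's implementation, and all function and variable names are placeholders. It assumes the VI data for one field arrive as an (n_pixels, n_features) matrix.

```python
# Sketch of Monocle-style pseudotime ordering adapted to UAV pixels.
# Assumes a (n_pixels, n_features) matrix of vegetation-index values;
# names and parameters here are illustrative, not the project's code.
import numpy as np
from sklearn.decomposition import FastICA
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree, shortest_path

def pseudotime_order(vi_matrix, n_components=2, random_state=0):
    """Assign each pixel a pseudotime value along a learned trajectory."""
    # 1) Dimension reduction of the VI data via independent component analysis.
    ica = FastICA(n_components=n_components, random_state=random_state)
    embedded = ica.fit_transform(vi_matrix)

    # 2) Minimum spanning tree over pairwise distances in the low-dimensional
    #    space; the tree's longest path ("diameter") is the main trajectory.
    dists = squareform(pdist(embedded))
    mst = minimum_spanning_tree(dists)
    path_len = shortest_path(mst, directed=False)
    start, _end = np.unravel_index(np.argmax(path_len), path_len.shape)

    # 3) Pseudotime = tree distance from the start of the longest path;
    #    pixels on side branches are placed by their distance to that start.
    return path_len[start]
```

For real orthomosaics (millions of pixels), the dense distance matrix would be replaced by a k-nearest-neighbor graph, but the logic is the same.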
These comparisons will determine which estimate of developmental stage (pixel-specific pseudotime value, field-observed growth stage, growing degree days, or DD50) is most efficient at phasing VI developmental trajectories across different cultivars, environmental conditions, and years. We hypothesize that pseudotime values will be highly effective for developmental phasing of VI trajectories. If this hypothesis is supported, it would suggest that increasing the developmental resolution of pixels in UAV images could substantially improve training and performance of deep learning models based on developmentally phased data. With a deeper understanding of patterns of variation in spectral signatures over time and across cultivars and conditions, we will be well equipped to train a full-scale model for rice yield prediction using UAV data from 15 fields flown in 2019 and 2020 by Dr. Hashem's group. A separate model will be trained for each developmental stage, given previous results suggesting limited benefit from more complicated convolutional neural networks (CNNs) that included data from multiple timepoints (Bellis et al., in review). If the proposed unsupervised learning method succeeds at automatic fine-scale detection of phenological stage, we will also explore adding a phenophase detection subnetwork. Its purpose will be to automatically select the appropriate stage-specific model among all models trained, allowing yield prediction from spectral data alone; this is important because developmental trajectories vary substantially across cultivars, climatic regions, and years (Duan et al. 2021).
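As a point of comparison for the pseudotime estimates, the growing degree day baseline can be computed from daily temperature extremes. The sketch below uses the common averaging formula with a 50 °F base (the origin of the "DD50" name); note that the UAEX DD50 program adds cultivar-specific tables, which this simplified version does not reproduce.

```python
# Simplified growing degree day (GDD) computation with a 50 degF base,
# as commonly used for rice ("DD50"). Cultivar-specific adjustments made
# by the UAEX DD50 program are not reproduced here.

def gdd_f(t_max_f, t_min_f, t_base_f=50.0):
    """Daily growing degree days from max/min air temperatures (degF)."""
    return max(0.0, (t_max_f + t_min_f) / 2.0 - t_base_f)

def cumulative_gdd(daily_max, daily_min, t_base_f=50.0):
    """Accumulated GDD over a season, e.g. from planting to a flyover date."""
    return sum(gdd_f(hi, lo, t_base_f) for hi, lo in zip(daily_max, daily_min))
```

For example, a day with a high of 90 °F and a low of 70 °F contributes (90 + 70) / 2 − 50 = 30 degree days.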
Automated detection of developmental stage from spectral characteristics would facilitate large-scale deployment of trained models and eliminate the need for manual scoring of developmental stage, or for separate calculation of growing degree days or related metrics for each flyover (which are typically captured only at very coarse resolution). The ability to 'hybridize' stage-specific model predictions for different areas of the same field may also improve accuracy.
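The proposed two-stage inference (a phenophase detector gating a set of stage-specific yield models) amounts to a simple routing step at prediction time. The sketch below is purely illustrative: the stage names, detector, and models are placeholders, not the trained networks from this project.

```python
# Illustrative routing of an image tile to a stage-specific yield model,
# as gated by a phenophase detector. All names here are hypothetical.
import numpy as np

STAGES = ["vegetative", "reproductive", "ripening"]

def predict_yield(tile, stage_models, phenophase_detector):
    """Apply the stage-specific model chosen by the phenophase detector.

    stage_models: dict mapping stage name -> yield model (callable).
    phenophase_detector: callable returning one probability per stage.
    """
    probs = phenophase_detector(tile)        # e.g. softmax over stages
    stage = STAGES[int(np.argmax(probs))]    # hard selection of one stage
    return stage_models[stage](tile)
```

The 'hybridization' idea mentioned above would replace the hard argmax with a probability-weighted average of the stage-specific predictions.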

Progress 10/01/23 to 09/30/24

Outputs
Target Audience: Nothing Reported Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? Nothing Reported How have the results been disseminated to communities of interest? Nothing Reported What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? Nothing to report

Publications


    Progress 10/01/22 to 09/30/23

    Outputs
    Target Audience: Nothing Reported Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? Nothing Reported How have the results been disseminated to communities of interest? Nothing Reported What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

    Impacts
    What was accomplished under these goals? Nothing to report

    Publications


      Progress 10/01/21 to 09/30/22

      Outputs
Target Audience: An oral presentation, "Rice Yield Prediction using Unmanned Aircraft Systems," was co-led by PI Bellis and collaborator Dr. Ahmed Hashem at the 2022 National Conservation System Cotton & Rice Conference in Jonesboro, AR, on Jan. 31, 2022. Target audience: rice growers, industry representatives, and researchers. One open access peer-reviewed manuscript was published in March 2022. Target audience: researchers and the public. Training and professional development were provided for one undergraduate student, who continued to a thesis-based M.S. focused on this project starting in June 2022. Three invited seminars by PI Bellis presented results from this work: the Boyce Thompson Institute Postgraduate Society Symposium "Breaking Limitations in Plant Science" (Sep. 16, 2022; Ithaca, NY), the Idaho State University Biology Department Seminar Series (Mar. 17, 2022; Pocatello, ID), and the California State University, San Bernardino Biology Department Seminar Series (Nov. 19, 2021). Target audience: faculty, students, and postdocs in NY, ID, and CA. PI Bellis received the Arkansas Biosciences Institute New Investigator of the Year award for 2022; an article describing her research on the funded project was published on the University of Arkansas System Division of Agriculture website on Oct. 19, 2022 (https://aaes.uada.edu/news/emily-bellis-abi-researcher/). Target audience: Arkansas growers and the scientific community. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? One undergraduate computer science student at Arkansas State University led data analysis and model training and began pursuing an M.S. in Computer Science at A-State in June 2022 to continue work on this project for his Master's thesis. How have the results been disseminated to communities of interest? 
Results of this work were disseminated through oral presentations to stakeholders and researchers at the 25th Annual Cotton & Rice Conference in Jonesboro, AR (Jan. 31-Feb. 2, 2022); the 2022 ASA, CSSA, SSSA International Annual Meeting in Baltimore, MD (Nov. 6-9); and three invited seminar talks. We also published a peer-reviewed manuscript in March 2022. A press release for a New Investigator Award received by Bellis in 2022 also highlighted this work (see https://aaes.uada.edu/news/emily-bellis-abi-researcher/) and was shared through social media platforms of the Arkansas Agricultural Experiment Station. What do you plan to do during the next reporting period to accomplish the goals? Now that the complete dataset for the 2021 and 2022 growing seasons has been acquired, we will focus this year on developing and testing a yield prediction model that uses manifold learning for automated detection of phenophase. The M.S. student working on this project will present his work at the North American Plant Phenotyping Network meeting in St. Louis in Feb. 2023, and PI Bellis will present at the Rice Technical Working Group meeting in Feb. 2023 in Hot Springs, AR. We are preparing a publication summarizing this work, with an anticipated submission date near the end of the next reporting period.

      Impacts
What was accomplished under these goals? A second year of time-series UAS data for a large number of small research plots grown at the USDA Dale Bumpers National Rice Research Center in Stuttgart, AR was collected for the 2022 growing season. The 2022 growing season will serve as a held-out test set for our phenotype prediction models. All image and yield data for both growing seasons have been cleaned and processed, and preliminary analyses were undertaken. Now that we have the complete dataset, we have begun training and testing new unsupervised and supervised machine learning methods to improve our phenotype prediction models.

      Publications

• Type: Journal Articles Status: Published Year Published: 2022 Citation: E.S. Bellis*, A.A. Hashem*, J.L. Causey, B.R.K. Runkle, B. Moreno-García, B. Burns, V.S. Green, T.N. Burcham, M.L. Reba and X. Huang. Detecting intra-field variation in rice yield with UAV imagery and deep learning. Frontiers in Plant Science 13: 716506. *co-first authors.


      Progress 09/17/21 to 09/30/21

      Outputs
Target Audience: One oral presentation, "Rice Yield Prediction using Unmanned Aircraft Systems," was co-led by PI Bellis and collaborator Dr. Ahmed Hashem at the 2022 National Conservation System Cotton & Rice Conference in Jonesboro, AR. Target audience: rice growers and researchers. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? One undergraduate computer science student at Arkansas State University is contributing to data analysis and model training and has applied for an M.S. in Computer Science at A-State to continue work on this project for his Master's thesis. How have the results been disseminated to communities of interest? We will present the results of this work through two oral presentations to stakeholders and researchers at the 25th Annual Cotton & Rice Conference, Jan. 31-Feb. 2, 2022. What do you plan to do during the next reporting period to accomplish the goals? We will continue data analysis for the Stuttgart dataset. Priorities are to 1) evaluate and compare the performance of additional deep learning architectures, 2) implement automatic classification of developmental stage for the 'phenophase detection' module, and 3) evaluate the accuracy of models trained on small research plots when projected to the field scale.

      Impacts
What was accomplished under these goals? As part of a new collaboration in the 2021 growing season, we collected time-series UAS data for a large number of small plots capturing diversity among ~20 different rice cultivars grown at the USDA Dale Bumpers National Rice Research Center in Stuttgart, AR, including three different nitrogen levels and three different seeding rates. This contributed a new dataset capturing more variation in rice growth stage and spectral characteristics than used previously. We are currently training models using this more diverse dataset and determining the extent to which they are scalable to the field level. To improve the performance of the 'phenophase detection' module, we are now including additional features derived from Grey Level Co-occurrence Matrices to capture canopy texture variation, which we hypothesize to be highly informative for growth stage prediction.

      Publications

• Type: Journal Articles Status: Under Review Year Published: 2022 Citation: E.S. Bellis*, A.A. Hashem*, J.L. Causey, B.R.K. Runkle, B. Moreno-García, B. Burns, V.S. Green, T.N. Burcham, M.L. Reba and X. Huang. Detecting intra-field variation in rice yield with UAV imagery and deep learning. *co-first authors.