Source: PENNSYLVANIA STATE UNIVERSITY submitted to NRP
HARNESSING DRONES FOR PLANT DISEASE DIAGNOSIS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1015499
Grant No.
2018-67012-28076
Cumulative Award Amt.
$165,000.00
Proposal No.
2017-07142
Multistate No.
(N/A)
Project Start Date
Mar 15, 2018
Project End Date
Mar 14, 2020
Grant Year
2018
Program Code
[A7201]- AFRI Post Doctoral Fellowships
Recipient Organization
PENNSYLVANIA STATE UNIVERSITY
408 Old Main
UNIVERSITY PARK, PA 16802-1505
Performing Department
Agricultural & Biological Engineering
Non Technical Summary
The rapid application of science and technology on the farm has made the US the most food secure nation the world has ever seen. To continue leading the development of science and technology on the farm, new tools must continually be developed to enhance human performance and productivity in agriculture. This is especially important as climate change leads to more frequent extreme weather events and globalization introduces an increasing number of novel pathogens and pests. Given constraints on expanding the human expertise workforce (e.g., extension officers), cheap and efficient technology is required to monitor agricultural land, assess potential threats, and enable timely management while improving land productivity. Unmanned aerial vehicles (UAVs) coupled with deep learning models offer the potential to extend the capacity of farmers in the US to monitor, assess, and manage their land. My preliminary research has shown that deep learning models can detect healthy and diseased plants with over 90% accuracy using in-field images of plant leaves collected with digital cameras. In this project, I propose to investigate the performance of UAVs in collecting image datasets of agricultural land to support automated plant disease diagnoses with deep learning models. I plan to 1) collect image datasets of diverse crops and stressors at the Russell E. Larson Agricultural Research Station from different heights/image resolutions for a range of spectral bands to train deep learning models to automatically diagnose plant diseases, 2) compare the improvement in diagnosing plant health by collecting image datasets with UAVs to the 3 m resolution satellite data available from Planet.com, and 3) build a workflow to train agricultural extension staff to use UAVs and deep learning models for automated plant disease detection.
Animal Health Component
100%
Research Effort Categories
Basic
(N/A)
Applied
100%
Developmental
(N/A)
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
212 | 7210 | 2080 | 100%
Goals / Objectives
The goal of this project is to investigate the performance of UAVs in collecting image datasets of agricultural land to support automated plant disease diagnoses with deep learning models. If unmanned aerial vehicles (UAVs) can efficiently collect images of crops and these can be diagnosed with very high accuracy, they will extend the human capacity to monitor and assess crop health, leading to increased land productivity in the US. To investigate this new tool I will address three key objectives:
1. Investigate the ideal UAV image datasets to detect incidents of plant diseases.
2. Examine the potential improvements in detecting incidents of plant diseases earlier in the field with vegetation indices (VI) datasets compared to RGB datasets.
3. Build a workflow to teach and train agricultural extension staff to use this new technology.
Project Methods
Objective 1: Investigate the ideal UAV image datasets to diagnose plant diseases.
UAVs can be flown within a range of heights less than 400 ft above ground level in open airspace (Class G) and can carry cameras that capture spectral reflectance measurements in the visible, red edge, and near infrared. There are inherent trade-offs to monitoring fields with UAV technology, including flight time limitations due to battery power, the flight time needed to collect data (dependent on weather conditions such as wind speed and on flight height), camera weight (a 5-band camera versus the standard RGB camera), and the size of image datasets (requiring larger computer systems for analysis and data storage). To analyze the effects of UAV flight parameters on the performance of UAV datasets in diagnosing plant diseases, I will build deep learning models with a range of UAV datasets. The UAV parameter I will test for this objective is flight height (200 ft, 100 ft, 50 ft, 25 ft, 15 ft), and I will record flight time, battery usage, and dataset size (dependent on the percent sidelap and frontlap of the images taken). These parameters will be tested with an RGB sensor camera (e.g. 1/2.3" CMOS, effective pixels: 12.4 M (total pixels: 12.76 M), lens FOV 94°, 20 mm (35 mm format equivalent), f/2.8, focus at ∞). Flights will be done twice per week. The image datasets will be used to build deep learning models in the TensorFlow platform and model results will be evaluated for the different datasets.

Objective 2: Examine the potential improvements in diagnosing plant diseases earlier in the field with vegetation indices (VI) datasets compared to RGB datasets.
For this objective I will repeat the flight heights of Objective 1 with a MicaSense RedEdge camera (blue, green, red, red edge, and near IR bands (global shutter, narrowband), 8.1 cm per pixel) attached to a UAV. I will record the flight time, battery usage, and dataset size (dependent on the percent sidelap and frontlap of the images taken) for the collection of this new dataset. The image datasets will be used to build deep learning models in the TensorFlow platform and model results will be evaluated for the different datasets. Because NIR reflectance is more strongly related to plant stress than RGB reflectance, NDVI calculated with NIR has the potential to identify problems days before they might appear with VARI. This suggests that UAV image datasets with NIR data can be used to diagnose plant diseases earlier in the season than VARI-based (RGB) datasets.
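As an illustrative sketch of the Objective 1 analysis (not the project's actual pipeline), the snippet below trains the same small TensorFlow classifier on image tiles from each flight height and compares validation accuracy; the directory layout, tile size, MobileNetV2 backbone, and training settings are assumptions for demonstration only.

# Hypothetical sketch for Objective 1: train the same small classifier on UAV
# image tiles from each flight height and compare validation accuracy.
# Directory layout, tile size, backbone, and epochs are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (224, 224)  # assumed tile size
BATCH = 32

def load_split(root):
    # Expects root/<class_name>/*.jpg, e.g. root/healthy and root/diseased
    return tf.keras.utils.image_dataset_from_directory(
        root, image_size=IMG_SIZE, batch_size=BATCH)

def build_model(num_classes):
    # Transfer learning keeps training cheap enough to repeat per dataset.
    base = tf.keras.applications.MobileNetV2(
        input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
    base.trainable = False
    return tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # scale pixels to [-1, 1]
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

# Hypothetical per-height dataset folders, e.g. tiles/100ft/train and tiles/100ft/val.
for height in ["200ft", "100ft", "50ft", "25ft", "15ft"]:
    train_ds = load_split(f"tiles/{height}/train")
    val_ds = load_split(f"tiles/{height}/val")
    model = build_model(num_classes=len(train_ds.class_names))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, validation_data=val_ds, epochs=10)
    loss, acc = model.evaluate(val_ds)
    print(f"{height}: validation accuracy = {acc:.3f}")

Repeating an identical model and training budget across heights keeps the comparison focused on the datasets themselves (resolution, coverage, dataset size) rather than on model tuning.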
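Objective 2 rests on comparing NDVI, which requires the NIR band, with VARI, which can be computed from RGB alone. The short sketch below shows both index calculations from a 5-band reflectance stack; the band order and the synthetic array stand in for a real MicaSense RedEdge orthomosaic and are assumptions for illustration.

# Hypothetical sketch for Objective 2: compute NDVI (needs the NIR band) and
# VARI (visible bands only) from a 5-band reflectance stack.
import numpy as np

def safe_ratio(num, denom):
    # Elementwise num / denom, returning 0 where the denominator is 0.
    return np.divide(num, denom, out=np.zeros_like(num), where=denom != 0)

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red); uses the near-infrared band.
    return safe_ratio(nir - red, nir + red)

def vari(red, green, blue):
    # VARI = (Green - Red) / (Green + Red - Blue); uses visible bands only.
    return safe_ratio(green - red, green + red - blue)

# Synthetic reflectance stack shaped (bands, rows, cols) in the RedEdge band
# order assumed here: blue, green, red, red edge, NIR.
stack = np.random.rand(5, 100, 100).astype(np.float32)
blue, green, red, red_edge, nir = stack

ndvi_map = ndvi(nir, red)
vari_map = vari(red, green, blue)
print("NDVI range:", float(ndvi_map.min()), float(ndvi_map.max()))
print("VARI range:", float(vari_map.min()), float(vari_map.max()))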

Progress 03/15/18 to 03/14/20

Outputs
Target Audience: Nothing Reported
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? Provided training for remote pilot certification for undergraduate students in order to train for image collection of agricultural fields using unmanned aerial vehicles (UAVs).
How have the results been disseminated to communities of interest? Datasets have been shared with 2 collaborating research labs (Greg Roth Agronomy lab and Paul Esker Plant Pathology lab) to be used in future research projects.
What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals?
- Created dataset of wheat fields for training of deep learning (AI) models for plant disease detection and yield prediction.
- Developed new protocol for collecting imagery data for wheat plant diseases.
- Developed new protocol for collecting imagery data for yield trial experiments.

Publications


Progress 03/15/18 to 03/14/19

Outputs
Target Audience: Academic audience (professors, graduate students, undergraduate students), public.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? Provided training for remote pilot certification for undergraduate students in order to train for image collection of agricultural fields using unmanned aerial vehicles (UAVs).
How have the results been disseminated to communities of interest? Datasets have been shared with 2 collaborating research labs (Greg Roth Agronomy lab and Paul Esker Plant Pathology lab) to be used in future research projects.
What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals?
- Trained 1 undergraduate and 1 research technician to collect drone imagery data in an agricultural research field.
- Created weekly imagery datasets of 4 research fields to be used for research activities by the David Hughes, Paul Esker, and Greg Roth research labs at Penn State.
- Developed new protocol for collecting imagery data for wheat plant diseases.
- Developed new protocol for collecting imagery data for yield trial experiments.
- Created dataset of wheat fields for training of deep learning (AI) models for plant disease detection and yield prediction.

Publications