Source: UNIV OF IDAHO submitted to NRP
FACT-AI: HARNESSING ARTIFICIAL INTELLIGENCE FOR IMPLEMENTING INTEGRATED PEST MANAGEMENT IN SMALL-GRAIN PRODUCTION SYSTEMS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1025693
Grant No.
2021-67021-34253
Cumulative Award Amt.
$499,500.00
Proposal No.
2020-08861
Multistate No.
(N/A)
Project Start Date
Mar 1, 2021
Project End Date
Feb 28, 2025
Grant Year
2021
Program Code
[A1541]- Food and Agriculture Cyberinformatics and Tools
Recipient Organization
UNIV OF IDAHO
875 PERIMETER DRIVE
MOSCOW,ID 83844-9803
Performing Department
(N/A)
Non Technical Summary
A cornerstone of successful implementation of integrated pest management (IPM) in agriculture is correct pest identification, but identification requires time and expertise. As a result, many growers forgo it and apply pesticides unnecessarily. This is a problem because indiscriminate use of pesticides is costly to growers and harmful to human health and the environment. Machine learning and artificial intelligence (AI) for automated pest identification from photographs have the potential to address this issue. The goal of this project is to develop such an AI-based decision support system for pest identification in wheat-based production systems of the Inland Pacific Northwest (PNW), USA, with extensibility to other wheat-growing regions. In addition to the identification tool, this system will allow users to share photos and other information, as well as expertise about the pests found. The system will be available to growers, crop researchers, and extension educators as a desktop computer interface as well as iOS and Android apps.
Animal Health Component
50%
Research Effort Categories
Basic
50%
Applied
50%
Developmental
(N/A)
Classification

Knowledge Area (KA): 211
Subject of Investigation (SOI): 1599
Field of Science (FOS): 1130
Percent: 100%
Goals / Objectives
The long-term goal of this project is to develop and deploy an open-source, AI-based application for pest identification in wheat-based production systems of the inland Pacific Northwest (PNW) of the USA, with extensibility to other wheat-growing regions. The main objectives and sub-objectives of the project are:
1. Create an open-source artificial intelligence software framework for automated identification of Inland PNW cereal system crop pests from cell phone photographs
1a. Collect images of current and anticipated insect pests of Inland PNW cereal crops and rotational crops and train an artificial neural network (ANN) to classify them to species
1b. Refine the image processing, ANN training, and prediction for efficiency in various visual contexts and pest combinations, and to utilize real-time interactions with users
2. Incorporate the framework from (1) into an AI-aided decision support system (DSS) and community-based resource for managing pests in Inland PNW cereal systems
2a. Couple the identification framework with recommendations within a mobile application for use by producers and pest advisors
2b. Build in the capacity for users to upload images into a web portal for community feedback and supervised inclusion in the ongoing training database
3. Refine and disseminate the system developed in (2) to the target user populations
3a. Solicit volunteer test users through extension outlets, grower meetings, and other conduits
3b. Release Version 1.0 and disseminate throughout the Inland PNW and in other regions with shared rotational crops and pest complexes
Project Methods
1. To build an AI-enabled tool for automated identification of insect pests in Inland Pacific Northwest cereal crop systems, we will:
- Conduct field work and collect images of pests
- Build a scalable, open-source data infrastructure management system consisting of a database, software, and interface
- Ensure that data generated are securely archived in Findable, Accessible, Interoperable, and Re-usable (FAIR) repositories with accompanying metadata
2. To refine the deep learning classifier capable of identifying insect pests from photographs, we will go through an iterative process of training and comparing the classifier under different conditions, including:
- Different network architectures
- Model ensembling strategies
- Data augmentation techniques
- Incorporating image metadata
3. We will couple the identification tool with a mobile and desktop application for use by producers and growers. This will be accomplished by building a web application developed with PHP/Drupal running on a dedicated virtual web server. The application will interact with a mobile app for both iOS and Android operating systems through a web service API.
4. Finally, we will refine and disseminate the tool to the target audience through the following efforts:
- Extension outlets and grower meetings, providing user training and soliciting feedback
- Involving stakeholders (growers, small grain and pulse commission representatives, and extension educators) in an advisory board
- Producing extension bulletin materials and peer-reviewed publications
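The data augmentation comparison in step (2) can be illustrated with a minimal sketch. This is not project code: the function name is ours, the array operations stand in for a deep learning framework's augmentation pipeline, and the sketch assumes a 2-D grayscale image array.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def augment(image):
    """Apply simple random augmentations of the kind compared during
    classifier training: random flips plus a random padded crop.
    Assumes a 2-D grayscale NumPy array; returns an array of the same shape."""
    if rng.random() < 0.5:
        image = image[:, ::-1]  # horizontal flip
    if rng.random() < 0.5:
        image = image[::-1, :]  # vertical flip
    h, w = image.shape
    # Pad by 4 pixels on each side, then take a random h-by-w crop,
    # shifting the image by up to +/-4 pixels in each direction.
    pad = np.pad(image, ((4, 4), (4, 4)), mode="reflect")
    top = rng.integers(0, 9)   # 0..8 inclusive (2*4 + 1 offsets)
    left = rng.integers(0, 9)
    return pad[top:top + h, left:left + w]
```

Real training pipelines would add color jitter, rotations, and resizing, but the principle is the same: each epoch sees a slightly different variant of every training photograph.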

Progress 03/01/21 to 03/21/23

Outputs
Target Audience: The target audience reached during this first stage of the project included individual cereal farmers in northern Idaho and professional biologists/researchers. Changes/Problems: PD Borowiec departed the University of Idaho to take on another role at Colorado State University. This caused temporary disruption to project activities. CoPD Rashed transitioned from the University of Idaho to Virginia Tech and no longer wishes to be involved in the project. We are currently working on transferring the project to Colorado State University and setting up subawards for the University of Idaho. What opportunities for training and professional development has the project provided? In year 1, this project provided the most significant professional development opportunities to Alexander McKeeken, a master's student funded by the project in this reporting period. Alexander gained substantial experience in machine learning software engineering, research, and scientific writing. He participated in writing a review publication accepted in the high-impact journal Methods in Ecology and Evolution. These opportunities allowed Alexander to gain important skills and helped him continue on his chosen professional trajectory. He applied and was admitted to a PhD program at the University of Minnesota. Alexander graduated from the University of Idaho in May 2022. In year 2, the project provided training to PhD student Stephanie Eskew, who became involved in image annotation and gained cereal pest identification skills. How have the results been disseminated to communities of interest? Our results thus far have been disseminated to stakeholders who assist in data collection and will be involved in feedback for the AI-assisted tool and on-line community portal. CoPI Eigenbrode presented the project at a number of cereal schools: the WSU Wheat Academy, Pullman, Washington, Dec. 14, 2022; the Idaho County Prairie Cereal School, Greencreek, Idaho, Jan. 24, 2023; the Northwest Grass Growers Association meeting, Greencreek, Idaho, Jan. 26, 2023; the Nez Perce County Cereal School, Lewiston, Idaho, Feb. 8; the Palouse Alternative Cropping Symposium, Colfax, Washington, Feb. 23, 2023. We have also published an open-access review article about deep learning for similar applications, referenced in Products. What do you plan to do during the next reporting period to accomplish the goals? During the next reporting period we plan to:
- Gather 2,500 additional images
- Train the deep learning model with all images collected and integrate it with the app and web interface
- Meet with the advisory board for feedback
- Refine the mobile phone application and web portal interface with feedback from stakeholders
- Disseminate the results to a wider audience through extension newsletters and presentations at workshops

Impacts
What was accomplished under these goals? In the first and second year of the project we made significant progress towards the three main objectives. First, we collected 3,565 images of pests and beneficials for the project (objective 1a). This is a slower than anticipated rate of image collection, but we have a plan in place to increase image collection effort in years three and four. In year 2 we also annotated all collected images, sorting them to pest species for neural network training. Additionally, we are in the process of cropping and annotating the collected images with bounding boxes, which will enable us to extract more information from each training image. This has been done for 1,940 images so far. We carried out all annotations in the program LabelStudio, which was designed specifically for manual labeling of images intended for training supervised machine learning algorithms. This will enable efficient transfer of images and annotations for neural network training in the next reporting period. Work by graduate student Alexander McKeeken revealed best practices that increase the accuracy of artificial neural networks at insect identification (objective 1b). His findings suggest that using the most recent neural network architectures (EfficientNet) combined with image augmentation, class balancing, and an optimal input image size contributes to accuracy. He also found that averaging predictions across multiple trained networks (so-called model ensembling), combined with cropping, resizing, and flipping to create multiple variants of the input image, both increases accuracy and substantially decreases error variance. All these findings constitute a set of best practices that we will use in future model training and refinement. Alexander also laid the groundwork for future improvements to neural network training by writing well-documented code in self-contained units that can be easily reused.
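The ensembling plus test-time-augmentation recipe described above can be sketched as follows. This is a minimal NumPy illustration under our own naming, not the project's code: the models are stand-ins for trained networks such as EfficientNet, and the variant set (flips and a center crop) is one simple choice among many.

```python
import numpy as np

def tta_variants(image):
    """Create test-time-augmentation variants of a 2-D image array:
    the original, horizontal flip, vertical flip, and a center crop
    resized back to the input shape via nearest-neighbor index mapping."""
    h, w = image.shape[:2]
    crop = image[h // 8:h - h // 8, w // 8:w - w // 8]
    rows = np.arange(h) * crop.shape[0] // h
    cols = np.arange(w) * crop.shape[1] // w
    resized = crop[rows][:, cols]
    return [image, image[:, ::-1], image[::-1, :], resized]

def ensemble_predict(models, image):
    """Average class probabilities over every (model, variant) pair.
    `models` is a list of callables mapping an image to a probability
    vector; averaging across models and variants is what reduced both
    error and error variance in our experiments."""
    preds = [m(v) for m in models for v in tta_variants(image)]
    return np.mean(preds, axis=0)
```

With two trained models and four variants per image, each prediction is the mean of eight probability vectors, which smooths out the idiosyncrasies of any single network or single view of the insect.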
In working towards objective 2, incorporating AI-assisted identification into a decision support system and community-based resources, we have so far built a website with compiled information about relevant insect pests in Inland Northwest cereal crop systems. This information is available at https://cerealpestaid.net/insect/. We also have an on-line system and database for collecting, identifying, and storing images for training, developed by Jennifer Hinds and Lucas Sheneman. In working towards objective 3, disseminating the developed system, we have so far identified diverse stakeholders who will help collect image data, form an advisory board, and provide feedback on the system. We have not been able to convene the board thus far but have a plan in place to meet for the first time during the next reporting period. Significant progress was made with software development (objectives 1, 2a, 2b, 3b) in year 2 of the project. Web and mobile app interfaces have been developed. For the web application, the following functionality was implemented:
- User accounts
- An interactive map of observations
- Pest information and sorting
- Observation filtering
- An observation view containing the image and classification
- Commenting on observations
- Ability to correct an observation
- Bulk image upload
- A moderation portal; no images reach public view without approval
- An image upload form linked to a TensorFlow model, ready to switch out with the official model
The mobile app version of the interface has the following functionality:
- User accounts, interactive map, pest information and sorting, observation filtering, and observation view (reflecting the functionality of the web portal)
- A camera interface to take photos
- TensorFlow model integration
- Geolocation to tag where the photo was taken
- An input form that allows the user to tag which crop the image was taken on, as well as other comments
This interface was developed by CoPD Sheneman and his team with input from other PIs, and is ready for a trained neural network model and stakeholder feedback.

Publications

  • Type: Journal Articles Status: Published Year Published: 2022 Citation: Borowiec M.L., Dikow R.B., Frandsen P.B., McKeeken A., Valentini G., White A.E. 2022. Deep learning as a tool for ecology and evolution. Methods in Ecology and Evolution 13(8): 1640-1660. https://doi.org/10.1111/2041-210X.13901


Progress 03/01/21 to 02/28/22

Outputs
Target Audience: The target audience reached during this first stage of the project included individual cereal farmers in northern Idaho and professional biologists/researchers. Changes/Problems: Nothing Reported. What opportunities for training and professional development has the project provided? This project provided the most significant professional development opportunities to Alexander McKeeken, a master's student funded by the project in this reporting period. Alexander gained substantial experience in machine learning software engineering, research, and scientific writing. He participated in writing a review publication accepted in the high-impact journal Methods in Ecology and Evolution. These opportunities allowed Alexander to gain important skills and helped him continue on his chosen professional trajectory. He applied and was admitted to a PhD program at the University of Minnesota. Alexander graduated from the University of Idaho in May 2022. How have the results been disseminated to communities of interest? Our results thus far have been disseminated to a limited group of stakeholders who assist in data collection and will be involved in feedback for the AI-assisted tool and on-line community portal. General information about deep learning for similar applications will soon be published in the open-access review article referenced in Products, with a target audience of professional biologists. What do you plan to do during the next reporting period to accomplish the goals? During the next reporting period we plan to:
- Gather upwards of 4,000 additional images
- Further refine the deep learning model for identification
- Meet with the advisory board for feedback
- Begin development of the mobile phone application and its integration with the on-line portal
- Disseminate the results to a wider audience through extension newsletters and presentations at workshops

Impacts
What was accomplished under these goals? The aim of this project is to benefit cereal crop system growers in the Inland Northwest. This will be accomplished by supporting the decision-making process around treatment for insect pests or beneficials. Our project will deliver AI-assisted identification tools for growers and an on-line support community. The goal of these tools is to allow growers and other users (crop advisers, extension specialists) to make quick and accurate decisions about treatment of potential insect pests or non-treatment of beneficials. Accurate identification of insects is usually time- and resource-intensive, and our tools aim to make this process more efficient. Lack of identification resources encourages indiscriminate use of pesticides, which incurs costs on the grower, is detrimental to the environment, and is potentially harmful to beneficial insects that can help fight crop pests. In the first year of the project we made significant progress towards the three main objectives. First, we collected 1,703 images of pests and beneficials for the project (objective 1a). We anticipate a much larger number of images will be collected in the next phase, as the last season was extremely dry and difficult for cereal agriculture in our region, resulting in fewer insects observed. Work by graduate student Alexander McKeeken revealed best practices that increase the accuracy of artificial neural networks at insect identification (objective 1b). His findings suggest that using the most recent neural network architectures (EfficientNet) combined with image augmentation, class balancing, and an optimal input image size contributes to accuracy. He also found that averaging predictions across multiple trained networks (so-called model ensembling), combined with cropping, resizing, and flipping to create multiple variants of the input image, both increases accuracy and substantially decreases error variance. All these findings constitute a set of best practices that we will use in future model training and refinement. Alexander also laid the groundwork for future improvements to neural network training by writing well-documented code in self-contained units that can be easily reused. In working towards objective 2, incorporating AI-assisted identification into a decision support system and community-based resources, we have so far built a website with compiled information about relevant insect pests in Inland Northwest cereal crop systems. This information is available at https://cerealpestaid.net/insect/. We also have an on-line system and database for collecting, identifying, and storing images for training, developed by Jennifer Hinds and Lucas Sheneman. In working towards objective 3, disseminating the developed system, we have so far identified diverse stakeholders who will help collect image data, form an advisory board, and provide feedback on the system. The board will meet for the first time during the next reporting period.

Publications

  • Type: Journal Articles Status: Accepted Year Published: 2022 Citation: Borowiec M.L., Dikow R.B., Frandsen P.B., McKeeken A., Valentini G., White A.E. 2022. Deep learning as a tool for ecology and evolution. Methods in Ecology and Evolution. https://doi.org/10.1111/2041-210X.13901