Progress 11/01/23 to 10/31/24
Outputs

Target Audience: Target audiences included individual cereal farmers in northern Idaho and the Pacific Northwest, as well as professional biologists and researchers.

Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? In the previous reporting period, graduate student Stephanie Eskew was supported by this project to collect, identify, and label field photographs. This honed her skills in insect identification and familiarized her with the basics of preparing data for training computer vision models. Three undergraduate students, Alyzabeth Bass, Bryli Jensen, and Milo Flint, helped annotate photos and participated in field photo sessions, which required training them in insect identification and fieldwork in agricultural ecosystems. PD Borowiec traveled to the USDA NIFA Project Directors' meeting in Manhattan, Kansas, where he presented a poster on the project. He learned much about engineering and precision agriculture solutions and networked with colleagues from fields very different from his primary area of expertise, entomology.

How have the results been disseminated to communities of interest? Our results thus far have been disseminated to a limited group of stakeholders who assist in data collection and will provide feedback on the AI-assisted tool and online community portal. In a prior reporting period, we published an open-access review article aimed at professional biologists. A version of the tool was demonstrated in a presentation to the Wheat Academy at Washington State University, reaching more than 60 producers and crop advisors, in preparation for testing this version with selected beta testers during the project's final field season.

What do you plan to do during the next reporting period to accomplish the goals?
- Expand and clean the image dataset for neural network training
- Train the final deep learning model for identification
- Meet with the advisory board for feedback
- Share the mobile application with beta testers (field consultants with The McGregor Company) and make final adjustments
- Publish the finalized mobile phone application and online portal
- Disseminate the results to a wider audience through extension newsletters and presentations at workshops
Impacts

What was accomplished under these goals?
Summary of progress for Objective 1. In the 2024 season, we collected 5,600 images of cereal crop pests in the field (Sub-objective 1a), bringing the project total to 9,165 images. Of these, 8,280 have been manually annotated with identifications and bounding boxes in a self-hosted Label Studio web application and made ready for neural network training and evaluation (an illustrative sketch of parsing such annotations follows this section). In addition to these field-collected images, we supplemented our dataset with publicly available images of crop pests from sources such as the Global Biodiversity Information Facility (GBIF) and Kaggle. Building on experiments with best practices from previous reporting periods, we trained a preliminary neural network to recognize 25 classes of insect pests and beneficials in cereal cropping systems (Sub-objective 1b). We trained and evaluated three convolutional neural network (CNN) architectures for our classifier: InceptionResNetV2, MobileNetV3, and EfficientNetB6. EfficientNetB6 performed best overall. We mitigated the effects of unavoidable class imbalance in our training data by adopting a weighted data augmentation strategy in which less common classes were sampled more frequently (sketched after this section). Importantly, this preliminary model achieved 85% validation accuracy. We are currently expanding, cleaning, and refining the dataset for another round of training.

Summary of progress for Objective 2. We launched a website compiling information about relevant insect pests of Inland Northwest cereal cropping systems (https://cerealpestaid.net/insect/). This information is now integrated with an interface (not yet public) available as both a cross-platform mobile application (Sub-objective 2a) and an online portal (Sub-objective 2b) that provides AI-assisted identification of, and community feedback on, insect pest photographs. The web portal functionality implemented to date includes:
- User accounts
- An interactive map of observations
- Pest information and sorting
- Observation filtering
- An observation view containing the image and its classification
- Commenting on observations
- The ability to correct an observation
- Bulk image upload
- A moderation portal; no images reach public view without approval
- An image upload form linked to a TensorFlow model that can be swapped out for the trained model

The mobile app version of the interface was implemented in Flutter for both iOS and Android. It mirrors the functionality of the web portal and additionally provides:
- An intuitive camera interface for taking photographs
- Geolocation tagging of the photographs taken
- An input form that allows the user to record which crop the image was taken on, along with other comments and metadata

Importantly, the preliminary deep learning model has been successfully converted to a portable TensorFlow Lite representation, integrated into the mobile app framework, and tested there (a conversion sketch follows this section). This allows a simple swap of the model when improved models become available toward the end of the project. This interface was developed by CoPD Sheneman and his team with input from the other PIs and is ready for the trained neural network model and stakeholder feedback.

Summary of progress for Objective 3. In December 2024, CoPD Eigenbrode presented the mobile application interface at the WSU Wheat Academy, targeting cereal farmers, crop consultants, and other stakeholders from northern Idaho and eastern Washington.
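The report contains no code; as a hypothetical illustration of the annotation workflow described under Sub-objective 1a, the Python sketch below reads a Label Studio JSON export into simple training records. The file name and labeling configuration are assumptions; the field names follow Label Studio's standard JSON export for rectangle labels and may need adjusting to the project's actual setup.

```python
# Illustrative sketch, not the project's actual code: reading a Label Studio
# JSON export into (image, label, bounding box) records for training.
import json

def parse_export(export_path):
    records = []
    with open(export_path) as f:
        tasks = json.load(f)  # the export is a list of annotated tasks
    for task in tasks:
        image = task["data"]["image"]  # key depends on the labeling config
        for ann in task.get("annotations", []):
            for result in ann.get("result", []):
                if result.get("type") != "rectanglelabels":
                    continue
                v = result["value"]  # x/y/width/height as percent of image size
                records.append({
                    "image": image,
                    "label": v["rectanglelabels"][0],
                    "bbox_pct": (v["x"], v["y"], v["width"], v["height"]),
                })
    return records

# records = parse_export("labelstudio_export.json")  # hypothetical file name
```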
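The weighted data augmentation strategy described for Sub-objective 1b could be realized in several ways; the minimal tf.data sketch below oversamples rare classes so each batch is approximately balanced, assuming a dictionary mapping class indices to image paths. The helper names, augmentations, and batch size are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch, assuming files_by_class = {class_index: [image paths]}.
import tensorflow as tf

IMG_SIZE = (528, 528)  # EfficientNetB6's default input resolution

def load_image(path, label):
    """Decode a JPEG and resize it to the model's input size."""
    image = tf.io.decode_jpeg(tf.io.read_file(path), channels=3)
    return tf.image.resize(image, IMG_SIZE), label

def augment(image, label):
    """Basic augmentation; rarer classes pass through here more often."""
    image = tf.image.random_flip_left_right(image)
    image = tf.image.random_brightness(image, max_delta=0.1)
    return image, label

def make_balanced_dataset(files_by_class, batch_size=32):
    per_class, counts = [], []
    for label, paths in sorted(files_by_class.items()):
        ds = tf.data.Dataset.from_tensor_slices(
            (paths, [label] * len(paths))
        ).shuffle(len(paths)).repeat()  # infinite per-class stream
        per_class.append(ds.map(load_image).map(augment))
        counts.append(len(paths))
    # Inverse-frequency weights: the rarer the class, the more often sampled.
    inv = [1.0 / c for c in counts]
    weights = [w / sum(inv) for w in inv]
    return tf.data.Dataset.sample_from_datasets(
        per_class, weights=weights
    ).batch(batch_size).prefetch(tf.data.AUTOTUNE)
```

Because the resulting dataset is infinite, training with it would set steps_per_epoch explicitly rather than iterating to exhaustion.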
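Finally, a minimal sketch of the TensorFlow Lite conversion mentioned under Objective 2, assuming a trained Keras model; both file names are hypothetical placeholders.

```python
# Illustrative sketch: converting a trained Keras classifier to a portable
# TensorFlow Lite file that the mobile app can swap in.
import tensorflow as tf

model = tf.keras.models.load_model("pest_classifier_effnetb6.keras")  # hypothetical path

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Default optimizations shrink the model (e.g., weight quantization),
# which matters for on-device inference.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("pest_classifier.tflite", "wb") as f:  # hypothetical output name
    f.write(tflite_model)
```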
Publications