Progress 03/01/21 to 03/21/23
Outputs Target Audience: The target audience reached during this stage of the project included individual cereal farmers in northern Idaho and professional biologists/researchers.
Changes/Problems: PD Borowiec departed the University of Idaho to take on another role at Colorado State University, which caused a temporary disruption to project activities. CoPD Rashed transitioned from the University of Idaho to Virginia Tech and no longer wishes to be involved in the project. We are currently working on transferring the project to Colorado State University and setting up subawards for the University of Idaho.
What opportunities for training and professional development has the project provided? In year 1, this project provided the most significant professional development opportunities to Alexander McKeeken, a master's student funded by the project during this reporting period. Alexander gained substantial experience in machine learning software engineering, research, and scientific writing. He participated in writing a review publication that was accepted in the high-impact journal Methods in Ecology and Evolution. These opportunities allowed Alexander to gain important skills and helped him continue on his chosen professional trajectory: he applied and was admitted to a PhD program at the University of Minnesota, and he graduated from the University of Idaho in May 2022. In year 2, the project provided training to PhD student Stephanie Eskew, who became involved in image annotation and gained cereal pest identification skills.
How have the results been disseminated to communities of interest? Our results thus far have been disseminated to stakeholders who assist in data collection and will be involved in providing feedback on the AI-assisted tool and on-line community portal. CoPI Eigenbrode presented the project at a number of cereal schools: the WSU Wheat Academy, Pullman, Washington, Dec. 14, 2022; the Idaho County Prairie Cereal School, Greencreek, Idaho, Jan. 24, 2023; the Northwest Grass Growers Association meeting, Greencreek, Idaho, Jan. 26, 2023; the Nez Perce County Cereal School, Lewiston, Idaho, Feb. 8, 2023; and the Palouse Alternative Cropping Symposium, Colfax, Washington, Feb. 23, 2023. We have also published an open-access review article about deep learning for similar applications, referenced in Products.
What do you plan to do during the next reporting period to accomplish the goals? During the next reporting period we plan to:
- Gather 2,500 additional images
- Train the deep learning model with all collected images and integrate it with the app and web interface
- Meet with the advisory board for feedback
- Refine the mobile phone application and web portal interface based on feedback from stakeholders
- Disseminate the results to a wider audience through extension newsletters and presentations at workshops
Impacts What was accomplished under these goals?
In the first and second years of the project we have made significant progress toward the three main objectives.
First, we collected 3,565 images of pests and beneficials for the project (objective 1a). This is a slower-than-anticipated rate of image collection, but we have a plan in place to increase image collection effort in years three and four. In year 2 we also annotated all collected images, sorting them to pest species for neural network training. Additionally, we are in the process of cropping and annotating the collected images with bounding boxes, which will enable us to extract more information from each training image; this has been done for 1,940 images so far. We carried out all annotations in the program Label Studio, which was designed specifically for manual labeling of images intended for training supervised machine learning algorithms. This will enable efficient transfer of images and annotations for neural network training in the next reporting period.
Work by graduate student Alexander McKeeken revealed best practices that increase the accuracy of artificial neural networks at insect identification (objective 1b). His findings suggest that using recent neural network architectures (EfficientNet) combined with image augmentation, class balancing, and an optimal input image size contributes to accuracy. He also found that averaging predictions across multiple trained networks (so-called model ensembling), combined with cropping, resizing, and flipping to create multiple variants of the input image, both increases accuracy and substantially decreases error variance. These findings constitute a set of best practices that we will use in future model training and refinement (an illustrative sketch follows the objective summaries below). Alexander also laid the groundwork for future improvements to neural network training by writing well-documented code in self-contained units that can be easily reused.
In working toward objective 2, incorporating AI-assisted identification into a decision support system and community-based resources, we have built a website with compiled information about relevant insect pests in Inland Northwest cereal crop systems. This information is available at https://cerealpestaid.net/insect/. We also have an on-line system and database, developed by Jennifer Hinds and Lucas Sheneman, for collecting, identifying, and storing images for training.
In working toward objective 3, disseminating the developed system, we have identified diverse stakeholders who will help collect image data, form an advisory board, and provide feedback on the system. We have not yet been able to convene the board, but we have a plan in place to meet for the first time during the next reporting period.
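As a minimal sketch of the training best practices identified under objective 1b, the Python code below shows how an EfficientNet classifier could be set up in TensorFlow/Keras with on-the-fly image augmentation and class weighting. The number of classes, input resolution, file names, and dataset variables are illustrative assumptions, not the project's final configuration.

# Minimal training sketch, assuming TensorFlow 2.x; class counts, input size,
# and dataset objects are hypothetical placeholders.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 12   # hypothetical number of pest/beneficial categories
IMG_SIZE = 380     # default input resolution for EfficientNetB4

# Image augmentation applied on the fly during training.
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal_and_vertical"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])

base = tf.keras.applications.EfficientNetB4(
    include_top=False, weights="imagenet", input_shape=(IMG_SIZE, IMG_SIZE, 3))

inputs = layers.Input(shape=(IMG_SIZE, IMG_SIZE, 3))
x = augment(inputs)
x = base(x)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Class balancing: weight under-represented classes more heavily during training.
# `train_ds` and `class_weights` stand in for the project's actual data.
# model.fit(train_ds, epochs=20, class_weight=class_weights)

In practice, several such models trained with different random seeds can then be combined through the ensembling and test-time augmentation approach described above.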
Significant progress was made with software development (objectives 1, 2a, 2b, 3b) in year 2 of the project. Web and mobile app interfaces have been developed. For the web application, the following functionality was implemented:
- User accounts
- An interactive map of observations
- Pest information and sorting
- Observation filtering
- An observation view containing the image and classification
- Commenting on observations
- Ability to correct an observation
- Bulk image upload
- A moderation portal; no images reach public view without approval
- An image upload form linked to a TensorFlow model that is ready to be swapped out for the official trained model

The mobile app version of the interface has the following functionality:
- User accounts, interactive map, pest information and sorting, observation filtering, and observation view (reflecting the functionality of the web portal)
- A camera interface to take photos
- TensorFlow model integration
- Geolocation to tag where the photo was taken
- An input form that allows the user to tag which crop the image was taken on, as well as other comments

This interface was developed by CoPD Sheneman and his team with input from the other PIs, and is ready for integration of the trained neural network model and for stakeholder feedback.
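Because the app's TensorFlow integration is designed so that the placeholder model can be swapped for the trained one, the handoff will likely resemble the short sketch below, which exports a trained Keras model to TensorFlow Lite for mobile deployment. File names here are hypothetical placeholders.

# Hypothetical conversion sketch: export a trained Keras classifier to
# TensorFlow Lite for use in the mobile app. File paths are placeholders.
import tensorflow as tf

model = tf.keras.models.load_model("pest_classifier.h5")  # placeholder path

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize to shrink the model
tflite_model = converter.convert()

with open("pest_classifier.tflite", "wb") as f:
    f.write(tflite_model)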
Publications
- Type: Journal Articles
  Status: Published
  Year Published: 2022
  Citation: Borowiec M.L., Dikow R.B., Frandsen P.B., McKeeken A., Valentini G., White A.E. 2022. Deep learning as a tool for ecology and evolution. Methods in Ecology and Evolution 13(8): 1640-1660. https://doi.org/10.1111/2041-210X.13901
Progress 03/01/21 to 02/28/22
Outputs Target Audience: The target audience reached during this first stage of the project included individual cereal farmers in northern Idaho and professional biologists/researchers. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? This project provided the most significant professional development opportunities to Alexander McKeeken, a master's student funded by the project in this reporting period. Alexander gained substantial experience in machine learning software engineering, research, and scientific writing. He participated in writing a review publication that was accepted for publication in the high-impact journal Methods in Ecology and Evolution. These opportunities allowed Alexander to gain important skills and helped him continue on his chosen professional trajectory. He applied and was admitted to a PhD program at the University of Minnesota. Alexander graduated from the University of Idaho in May 2022.
How have the results been disseminated to communities of interest? Our results thus far have been disseminated to a limited group of stakeholders who assist in data collection and will be involved in providing feedback on the AI-assisted tool and on-line community portal. General information about deep learning for similar applications will soon be published in an open-access review article referenced in Products, with a target audience of professional biologists.
What do you plan to do during the next reporting period to accomplish the goals? During the next reporting period we plan to:
- Gather upwards of 4,000 additional images
- Further refine the deep learning model for identification
- Meet with the advisory board for feedback
- Begin development of the mobile phone application and its integration with the on-line portal
- Disseminate the results to a wider audience through extension newsletters and presentations at workshops
Impacts What was accomplished under these goals?
The aim of this project is to benefit cereal crop system growers in the Inland Northwest by supporting the decision-making process around treatment for insect pests and beneficials. The project will deliver AI-assisted identification tools for growers and an on-line support community. The goal of these tools is to allow growers and other users (crop advisers, extension specialists) to make quick and accurate decisions about treating potential insect pests or not treating beneficials. Accurate identification of insects is usually time- and resource-intensive, and our tools aim to make this process more efficient. A lack of identification resources encourages indiscriminate use of pesticides, which incurs costs for the grower, is detrimental to the environment, and is potentially harmful to beneficial insects that can help fight crop pests.
In the first year of the project we have made significant progress toward the three main objectives.
First, we collected 1,703 images of pests and beneficials for the project (objective 1a). We anticipate a much larger number of images to be collected in the next phase, as the last season was extremely dry and difficult for cereal agriculture in our region, resulting in fewer insects observed.
Work by graduate student Alexander McKeeken revealed best practices that increase the accuracy of artificial neural networks at insect identification (objective 1b). His findings suggest that using recent neural network architectures (EfficientNet) combined with image augmentation, class balancing, and an optimal input image size contributes to accuracy. He also found that averaging predictions across multiple trained networks (so-called model ensembling), combined with cropping, resizing, and flipping to create multiple variants of the input image, both increases accuracy and substantially decreases error variance. These findings constitute a set of best practices that we will use in future model training and refinement. Alexander also laid the groundwork for future improvements to neural network training by writing well-documented code in self-contained units that can be easily reused.
In working toward objective 2, incorporating AI-assisted identification into a decision support system and community-based resources, we have built a website with compiled information about relevant insect pests in Inland Northwest cereal crop systems. This information is available at https://cerealpestaid.net/insect/. We also have an on-line system and database, developed by Jennifer Hinds and Lucas Sheneman, for collecting, identifying, and storing images for training.
In working toward objective 3, disseminating the developed system, we have identified diverse stakeholders who will help collect image data, form an advisory board, and provide feedback on the system. The board will meet for the first time during the next reporting period.
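As a minimal sketch of the model-ensembling and test-time-augmentation approach described above, the Python code below averages predictions from several trained networks over cropped, resized, and flipped variants of a single photo. The model file names, input resolution, and helper function names are illustrative assumptions, not the project's actual implementation.

# Illustrative sketch of ensembling with test-time augmentation, assuming
# TensorFlow 2.x; model paths and the input resolution are hypothetical.
import numpy as np
import tensorflow as tf

IMG_SIZE = 380  # assumed input resolution matching the trained networks

def tta_variants(image):
    # Create simple cropped, resized, and flipped variants of one input image.
    resized = tf.image.resize(image, (IMG_SIZE, IMG_SIZE))
    cropped = tf.image.resize(
        tf.image.central_crop(image, central_fraction=0.9), (IMG_SIZE, IMG_SIZE))
    flipped = tf.image.flip_left_right(resized)
    return tf.stack([resized, cropped, flipped])

def ensemble_predict(image, model_paths):
    # Average softmax outputs across models and augmented variants.
    variants = tta_variants(image)
    preds = []
    for path in model_paths:
        model = tf.keras.models.load_model(path)
        preds.append(model.predict(variants, verbose=0))
    return np.mean(np.concatenate(preds, axis=0), axis=0)

# Example usage (hypothetical file names):
# probs = ensemble_predict(photo, ["model_1.h5", "model_2.h5", "model_3.h5"])
# predicted_class = int(np.argmax(probs))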
Publications
- Type: Journal Articles
  Status: Accepted
  Year Published: 2022
  Citation: Borowiec M.L., Dikow R.B., Frandsen P.B., McKeeken A., Valentini G., White A.E. 2022. Deep learning as a tool for ecology and evolution. Methods in Ecology and Evolution. https://doi.org/10.1111/2041-210X.13901