Source: UNIVERSITY OF MISSOURI submitted to
DISCOVERY OF PLANT SECRETS USING AN AIRBORNE HYPERSPECTRAL IMAGING SYSTEM (AHSIS)
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1031481
Grant No.
2023-70410-41181
Cumulative Award Amt.
$400,403.00
Proposal No.
2023-05415
Multistate No.
(N/A)
Project Start Date
Sep 1, 2023
Project End Date
Aug 31, 2027
Grant Year
2023
Program Code
[EGP]- Equipment Grants Program
Project Director
Zhou, J.
Recipient Organization
UNIVERSITY OF MISSOURI
(N/A)
COLUMBIA,MO 65211
Performing Department
(N/A)
Non Technical Summary
This project aims to acquire a shared cyber airborne hyperspectral imaging system (AHSIS) to enhance collaborative research capacity in artificial intelligence (AI)-enabled agriculture at 11 collaborating institutions in the region. The AHSIS will support 21 researchers across multiple disciplines in acquiring, processing, and analyzing high-volume, high-resolution, full-spectrum (400 - 2,500 nm) hyperspectral imagery of crops. The integrated advanced hyperspectral imager and high-performance computing (HPC) system will provide a shared testbed for uncovering hidden crop traits and gaining insight into the interactions between crop genotype, environment, and management (G×E×M). The project aims to enable and enhance fundamental and applied research and extension activities in AI-enabled agriculture. The AHSIS will initially support six multidisciplinary research and extension projects in crop breeding and genetics, soil health and crop management (including forage), and AI innovation. The airborne full-spectrum imaging system is expected to greatly enhance the capacity to identify novel crop traits for accurately selecting superior crop genotypes (varieties) and distinguishing crop responses to different stresses. We will also use the AHSIS and the collaborating network to develop a shared spectral database for different crops under different environments. The large-volume hyperspectral imagery, crop genomic data, and associated crop data will provide an ideal dataset for developing use-inspired AI innovations. The AHSIS will be accessed by 200+ postdocs, graduate students, and undergraduate students through research, teaching, and training activities. The project aligns with the EGP goal of increasing access to shared special-purpose equipment for scientific research in the food and agricultural sciences programs.
Animal Health Component
60%
Research Effort Categories
Basic
30%
Applied
60%
Developmental
10%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
402 | 2499 | 2020 | 50%
102 | 0110 | 2080 | 30%
205 | 1699 | 1020 | 20%
Goals / Objectives
The overall goal of this project is to enable and enhance fundamental and applied research on the theme of AI-enabled agriculture at MU and collaborating institutions. The project will fulfill the following objectives: (1) provide a shared cyber instrument to acquire, process, and analyze advanced hyperspectral imagery of crops and soil for conducting collaborative research and extension activities in AI-enabled agriculture; (2) enhance foundational and applied research in AI-enabled agriculture through deep collaboration across interdisciplinary studies; and (3) conduct broader research and extension projects for science and technology dissemination and for training the next-generation workforce in food and agriculture. The requested instrument will support the research and extension activities of nine PI/Co-PIs/SP and 11 collaborators (major users), who will collaboratively work on six planned research areas (projects) and extension activities. The project will generate a regional impact on MU and 10 external institutions. We expect the project to enhance the application of emerging sensing technologies and AI to improve climate-smart and sustainable agriculture. The project is expected to benefit over 200 postdoctoral, graduate, and undergraduate students through research projects, education programs, and hands-on coursework, as well as farmers, industry, stakeholders, and policymakers.
Project Methods
The instrument system will be integrated with ongoing and new research projects that are currently or previously supported by USDA-NIFA, NSF, and commodity groups (see Current and Pending Support). The project will enable six planned research projects through collaboration in precision agriculture, high-throughput crop phenotyping, and environmental science, as well as innovations in AI algorithms and cyber-physical systems. A Steering Committee consisting of the project's PIs and selected collaborators will be established to ensure that instrument use and priorities remain flexible and responsive to user needs. The scheduling request form and updated schedule will be posted online and visible to the public through the project website. The shared instrument will be accessed by 80+ postdoctoral, graduate, and undergraduate students in research projects (approximately four students per research group), who will be trained on advanced sensing and data analytics technologies. The instrument will be integrated into 5+ undergraduate and graduate courses at MU across different degree programs, including Ag Systems Technology, Plant Science, and Computer Science, reaching 200+ students each year.

Progress 09/01/23 to 08/31/24

Outputs
Target Audience: We received the instrument in early 2024 and completed the initial setup by spring 2024. We have since tested the instrument and developed operation manuals through preliminary research projects. In this reporting period, we reached the following target audiences through research and outreach activities. - Researchers and university faculty: We reached out to a wide range of researchers from the University of Missouri, collaborating institutions, and USDA ARS groups. Through collaboration, we worked with more than 10 faculty and 15 scientists/engineers from multiple disciplines. - Formal students: The project was integrated into all undergraduate courses in the Agricultural Systems Technology teaching program at the University of Missouri. The drone and instrument were used in particular for teaching AST 3225: Sensor and Control Technology for Agriculture, AST 4160: Internet of Things for Agriculture, and AST 4930: Capstone. Collected data were used in precision agriculture classes. The total number of students involved in the period is estimated at 120.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? We provided training opportunities for research communities, including drone imaging system operation and data analysis. We plan to hold workshops to reach a wide range of audiences, including undergraduate and graduate students, research scientists, and professionals.
How have the results been disseminated to communities of interest? Nothing Reported
What do you plan to do during the next reporting period to accomplish the goals? In the next cycle, we plan to achieve the following objectives: - Develop a project website to share data and increase project impact. - Collaborate with more researchers to maximize use of the shared instrument. - Develop AI models for analyzing hyperspectral imagery. - Publish two more papers.

Impacts
What was accomplished under these goals? In this reporting period, we acquired and tested the proposed cyber instrument, including the computing system, drone control system, and hyperspectral imaging system. Through preliminary testing, the instrument was introduced to a broad audience, including researchers, professionals, undergraduate and graduate students, and other stakeholders. In this reporting period, we had the following achievements: - Acquired the cyber hyperspectral imaging system at the University of Missouri, with access provided to collaborating institutions. - Developed a data collection and processing pipeline for the instrument through preliminary research. - Collected preliminary research data for publications. - Initialized a project webpage to share hyperspectral data. - Conducted teaching activities using the hyperspectral images.
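As a rough illustration of the kind of per-pixel step a hyperspectral processing pipeline like the one above might include, the sketch below computes a vegetation index from a reflectance cube. The band spacing, wavelength choices (670 nm red, 800 nm NIR), and toy data are assumptions for illustration only, not the project's actual pipeline or data format.

```python
import numpy as np

def nearest_band(wavelengths, target_nm):
    """Index of the band whose center wavelength is closest to target_nm."""
    return int(np.argmin(np.abs(np.asarray(wavelengths) - target_nm)))

def ndvi(cube, wavelengths, red_nm=670.0, nir_nm=800.0):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel.

    cube: reflectance array of shape (rows, cols, bands).
    wavelengths: band-center wavelengths in nm, one per band.
    """
    red = cube[:, :, nearest_band(wavelengths, red_nm)].astype(float)
    nir = cube[:, :, nearest_band(wavelengths, nir_nm)].astype(float)
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids divide-by-zero

# Toy example: a 2x2 scene with bands every 10 nm across 400-2,500 nm,
# matching the full-spectrum range described in the summary.
wl = np.arange(400.0, 2501.0, 10.0)
cube = np.full((2, 2, wl.size), 0.05)       # low reflectance everywhere
cube[:, :, nearest_band(wl, 800.0)] = 0.45  # strong NIR response (vegetation-like)

vi = ndvi(cube, wl)
print(vi.shape)  # (2, 2); each value near (0.45 - 0.05) / (0.45 + 0.05) = 0.8
```

Any real pipeline would add radiometric calibration, georeferencing, and atmospheric correction before an index step like this; the sketch only shows the band-selection and ratio arithmetic.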

Publications