Progress 09/01/23 to 08/31/24
Outputs Target Audience: The audience of the project includes engineers, data scientists, plant breeders and geneticists, and industry stakeholders (cotton growers and extension agents). Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? In PI Li's lab, the grant primarily supported one doctoral student, Daniel Petti, in the Agricultural and Biological Engineering (ABE) Department at the University of Florida. In addition, two visiting doctoral students (Chenjiao Tan and Lizhi Jiang) were partially supported to conduct research on this project in the ABE Department at the University of Florida. In Co-PI Chee's lab, the funds supported one MS student, Dalton West.

How have the results been disseminated to communities of interest? In addition to publications and conference presentations, we demonstrated our field data collection system to the NIFA director during his visit to UF in January 2024. Li's lab also demonstrated the system to multiple undergraduate student cohorts.

What do you plan to do during the next reporting period to accomplish the goals? We plan to collect and analyze data in year 2 and publish our results in peer-reviewed journals.
Impacts What was accomplished under these goals?
Objective 1: Manage multi-scale heterogeneous field data by developing data collection, curation, storage, and sharing workflows.

We have developed MALLARD, a robust data management system designed to streamline the collection, curation, storage, and sharing of multi-scale heterogeneous field data. MALLARD is a single-page web application comprising a Python-based backend, implemented with FastAPI, and a responsive frontend for user interaction. The backend is divided into microservices: the gateway (API management), edge (static file delivery), and transcoder (video processing). The system integrates with MARS-X, an agricultural robotics platform, enabling seamless upload and processing of field-collected data, including automated metadata extraction from EXIF tags and ROSbag files. MALLARD currently supports uploading and managing image and video data through both web-based interfaces and direct robot integration, enhancing accessibility and efficiency for researchers. In the following year, we plan to expand functionality, optimize video transcoding, and enhance data visualization and retrieval capabilities.

We have also made progress in improving the MARS-X robot platform, focusing on its data collection and integration capabilities. The platform now supports multi-camera video recording using Raspberry Pi HQ camera modules, each running ROS nodes that transmit compressed video data to an onboard controller. To streamline data processing, we integrated MARS-X with the MALLARD data management system, enabling automated extraction, transcoding, and metadata generation for the video streams. These advancements have improved the robot's ability to handle and transfer field data efficiently, reducing operator workload and increasing data accessibility for research and analysis.
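The upload-and-curation workflow described above can be sketched with a minimal metadata record for one uploaded artifact. The field names and the content-addressed storage key below are illustrative assumptions for this report, not MALLARD's actual schema:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class ArtifactRecord:
    """Hypothetical metadata record for one uploaded image or video."""
    filename: str
    capture_time: str   # ISO 8601 timestamp, e.g. parsed from an EXIF DateTimeOriginal tag
    latitude: float     # e.g. parsed from EXIF GPS tags or a ROSbag GPS topic
    longitude: float
    media_type: str     # "image" or "video"

    def storage_key(self) -> str:
        # Content-addressed key: re-uploading the same capture maps to the same object,
        # so duplicates are detected instead of stored twice.
        digest = hashlib.sha256(f"{self.filename}|{self.capture_time}".encode()).hexdigest()
        return f"{self.media_type}/{digest[:16]}"

rec = ArtifactRecord("row3_cam1.jpg", "2024-07-15T10:32:00", 29.64, -82.36, "image")
print(rec.storage_key())
```

A record like this would be populated by the automated metadata-extraction step and indexed by the gateway service for later retrieval.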
Objective 2: Plant organ-level high-resolution phenotyping through real-time video tracking and 3D deep learning models.

Manual flower counting is valuable to plant breeders but is often too labor-intensive to be practical, and existing automated approaches are computationally demanding and lack user-friendly interfaces. In the first year of the project, we addressed these challenges by developing a self-supervised active learning framework to build a lightweight flower tracking model deployable on ground robots for real-time operation. Using camera and GPS data, the system also automates flower location extraction. Tested on an NVIDIA Jetson AGX Xavier, our approach achieved a mean absolute percentage error (MAPE) below 10% in flower counts while maintaining real-time performance, making it a practical and integrated solution for cotton phenotyping.

Cotton breeding programs require efficient phenotypic trait measurements to develop high-yield varieties, yet manual methods are time-intensive. We studied a novel 3D Gaussian splatting (3DGS) method for automated measurement using instance segmentation. A 360-degree video of a cotton plant is processed to generate 2D images, estimate camera poses, and create a sparse point cloud with COLMAP. The 3DGS model reconstructs the plant scene, and YOLOv8x generates 2D masks, which are combined with SAGA to segment individual cotton bolls and stems in 3D. Phenotypic traits, including boll count and stem length, are estimated with MAPEs of 11.43% and 10.45%, respectively. This approach improves trait measurement accuracy and offers a novel 3D tool for cotton breeding programs.

Objective 3: Applying phenomics data in a phenotypic selection system towards ideotype breeding.

In the first year, our goal was to collect phenotype data; the results will be used for ideotype breeding in the later stages of the project.
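The counting accuracy reported above is expressed as mean absolute percentage error (MAPE), which compares predicted organ counts against manual ground-truth counts. A minimal sketch of the metric follows; the per-plot counts are made-up illustrative values, not project data:

```python
def mape(predicted, actual):
    """Mean absolute percentage error, in percent, over paired count observations."""
    errors = [abs(p - a) / a for p, a in zip(predicted, actual)]
    return 100.0 * sum(errors) / len(errors)

# Made-up per-plot flower counts: tracker predictions vs. manual counts
pred = [12, 9, 15, 7]
true = [11, 10, 16, 8]
print(round(mape(pred, true), 2))  # 9.46, i.e. under the 10% threshold reported above
```

Because the error is normalized by the true count in each plot, MAPE is comparable across plots with very different flower or boll densities.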
Publications
- Type: Peer Reviewed Journal Articles. Status: Published. Year Published: 2024.
  Citation: Tan, Chenjiao; Sun, Jin; Paterson, Andrew H.; Song, Huaibo; Li, Changying. Three-view cotton flower counting through multi-object tracking and RGB-D imagery. Biosystems Engineering, vol. 246, pp. 233-247, 2024.
- Type: Peer Reviewed Journal Articles. Status: Published. Year Published: 2024.
  Citation: Petti, Daniel; Zhu, Ronghang; Li, Sheng; Li, Changying. Graph Neural Networks for lightweight plant organ tracking. Computers and Electronics in Agriculture, vol. 225, pp. 109294, 2024.
- Type: Conference Papers and Presentations. Status: Published. Year Published: 2024.
  Citation: Jiang, Lizhi; Li, Changying; Sun, Jin; Chee, Peng; Fu, Longsheng. Estimation of Cotton Boll Number and Main Stem Length Based on 3D Gaussian Splatting. ASABE, St. Joseph, MI, 2024.
- Type: Conference Papers and Presentations. Status: Published. Year Published: 2024.
  Citation: Tan, Chenjiao; Li, Changying; Sun, Jin; Song, Huaibo. Multi-Object Tracking for Cotton Boll Counting in Ground Videos Based on Transformer. ASABE, St. Joseph, MI, 2024.
- Type: Conference Papers and Presentations. Status: Published. Year Published: 2024.
  Citation: Petti, Daniel J.; Li, Changying. Active Learning for Real-Time Flower Counting with a Ground Mobile Robot. ASABE, St. Joseph, MI, 2024.