Source: UNIV OF CALIFORNIA submitted to
DSFAS: AI-BASED FAST AUTOMATED 3D SCENE RECONSTRUCTION OF PLANTS AND FARMS FOR AGRICULTURAL ROBOTS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1032345
Grant No.
2024-67021-42528
Cumulative Award Amt.
$727,999.00
Proposal No.
2023-11702
Multistate No.
(N/A)
Project Start Date
Aug 1, 2024
Project End Date
Jul 31, 2028
Grant Year
2024
Program Code
[A1541]- Food and Agriculture Cyberinformatics and Tools
Project Director
Jawed, M.
Recipient Organization
UNIV OF CALIFORNIA
(N/A)
LOS ANGELES, CA 90095
Performing Department
(N/A)
Non Technical Summary
Precision agriculture holds immense potential for improving farming practices, but current methods for monitoring and managing crops are often outdated and labor-intensive. Our project aims to address this challenge by developing an integrated software and hardware system for 3D reconstruction of farming environments. Led by a team of experts from the University of California, Los Angeles, and North Dakota State University, we will design, develop, and field-test an unmanned ground vehicle (UGV) equipped with advanced imaging capabilities. By autonomously capturing images from various camera angles, the UGV will enable farmers, agronomists, and roboticists to create virtual reality (VR) environments of farms, allowing them to visualize and interact with their crops in 3D.

This system will allow users to specify parameters such as crop type, time of day, and season, and the VR environment will be able to emulate "time," generating dynamic 3D scenes that represent the entire life cycle of crops. We envision three main application areas for this project: first, providing a complete package for robots and sensors used in precision agriculture, thus facilitating the testing and deployment of robotic systems in real-world farming environments; second, offering a non-invasive tool for phenotyping in plant breeding and precision agriculture through automated 3D reconstruction of entire plants and farms; and third, enabling remote work for breeders and agronomists by providing accurate 3D representations of farming environments.

Through collaboration with farmers and industry stakeholders, our project aims to advance knowledge and adoption of precision agriculture technologies. By providing new tools and techniques for monitoring and managing crops, we hope to improve farm productivity, efficiency, and sustainability, ultimately contributing to a more sustainable and environmentally friendly agricultural industry.
Animal Health Component
25%
Research Effort Categories
Basic
50%
Applied
25%
Developmental
25%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
402 | 7410 | 2020 | 50%
205 | 2499 | 2080 | 50%
Goals / Objectives
The major goals of this project can be summarized as follows:

1. Design, development, and field-testing of an integrated software and hardware system: The primary goal of the project is to create an integrated system comprising software (computer vision, machine learning) and hardware (an unmanned ground vehicle) to facilitate 3D reconstruction of farming environments. This system will enable users such as farmers, agronomists, and roboticists to create virtual reality (VR) environments of farms.

2. Dynamic 3D scene generation: The project aims to develop a system that can generate dynamic 3D scenes representing the entire life cycle of crops. Users will be able to specify parameters such as the type of crop, time of day, and season. Importantly, the VR environment will be able to emulate "time," providing users with a visual representation of the crop's entire life cycle (see the sketch after this list).

3. Addressing the gap in robotics and precision agriculture: The project aims to fill the gap in the testing of robots and sensors used in precision agriculture by providing a complete package for simulation environments. This will enable testing and validation of robotic systems and sensors before deployment in real-world farming environments.

4. Developing a fully automated robotic platform for 3D reconstruction: The project seeks to develop a fully automated robotic platform capable of creating 3D scenes of entire farms. This will provide an emerging non-invasive tool for phenotyping in plant breeding and precision agriculture.

5. Enabling remote work for breeders and agronomists: As technology advances, the project anticipates that breeders and agronomists will often work from remote locations. By providing an accurate 3D representation of farming environments, the project aims to enable remote tasks such as the quantification of phenotypic traits.

The project will be led by a qualified team comprising PD Jawed (expertise: robotics for precision agriculture) and co-PD Joo (expertise: computer vision, virtual reality, and AI) from UCLA, together with co-PD Rahman, an agronomist at NDSU. Additionally, the team will collaborate with farmers and industry stakeholders for field trials, ensuring the relevance and applicability of the developed system in real-world farming scenarios.
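As a rough illustration of the "time" emulation in goal 2, the sketch below linearly blends two timestamped 3D captures of the same plant to approximate the scene on an intermediate date. This is a minimal toy stand-in, not the project's pipeline: it assumes the two point clouds are already registered and in point-to-point correspondence (which a real system would have to establish), and all names are hypothetical.

```python
"""Toy sketch: approximate a 3D scene at an intermediate date by blending
two timestamped captures. Assumes registered, corresponding point clouds."""
import numpy as np

def interpolate_scene(points_t0: np.ndarray, points_t1: np.ndarray,
                      day: float, day0: float, day1: float) -> np.ndarray:
    """Linearly blend two (N, 3) point clouds to approximate the scene at `day`."""
    alpha = np.clip((day - day0) / (day1 - day0), 0.0, 1.0)
    return (1.0 - alpha) * points_t0 + alpha * points_t1

# Hypothetical usage: captures at day 10 and day 40, query day 25.
rng = np.random.default_rng(0)
scan_early = rng.uniform(0.0, 1.0, size=(1000, 3))      # sparse young plant
scan_late = scan_early * np.array([1.0, 1.0, 2.5])      # taller mature plant
mid = interpolate_scene(scan_early, scan_late, 25.0, 10.0, 40.0)
print(mid[:, 2].max())  # canopy height roughly halfway between captures
```

In practice, a growth model learned from the image data would replace the linear blend, but the interface idea is the same: the user supplies a date and the system returns a plausible 3D scene for it.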
Project Methods
The project will be conducted in several stages, including design, prototyping, testing, and evaluation of an integrated software and hardware system for creating 3D scenes of plants and farms that evolve with time. The general scientific methods will involve a combination of computer vision, machine learning, robotics, and virtual reality (VR) techniques. Unique aspects of the project include the development of a fully autonomous unmanned ground vehicle (UGV) with a manipulator for image capture, as well as the integration of time-based 3D scene evolution in the virtual reality environment.

Project Phases:

Design and Prototyping:
- Software Development: Design and development of computer vision and machine learning algorithms for image processing and 3D reconstruction.
- Hardware Design: Design and prototyping of an unmanned ground vehicle (UGV) with a manipulator for autonomous image capture.
- Integration: Integration of software and hardware components to create a unified system for 3D scene capture and reconstruction.

Testing and Optimization:
- Field Testing: Testing the integrated system in real-world farming environments to assess performance and reliability.
- Optimization: Iterative optimization of algorithms and hardware components based on testing results.

Data Collection and Analysis:
- Image Capture: Autonomous image capture by the UGV at various camera angles using the robotic manipulator (see the viewpoint-planning sketch after this section).
- 3D Reconstruction: Reconstruction of farm environments and plants in a virtual reality (VR) environment using computer vision techniques.
- Time-based Evolution: Incorporation of time-based evolution of 3D scenes to simulate plant growth and farm activities over time.

Evaluation and Impact Assessment:
- Change in Knowledge: Formal and informal educational programs will be conducted to disseminate project findings to target audiences, including farmers, agricultural professionals, roboticists, and educators. Workshops, training programs, and outreach activities will be organized to increase awareness and understanding of the project outcomes.
- Change in Action: Adoption of precision agriculture technologies and practices will be measured through surveys, interviews, and assessments of farm management practices. Changes in behavior and practices among target audiences will be assessed through pre- and post-project surveys and interviews.
- Change in Condition: Improvement in farm productivity, efficiency, and sustainability will be evaluated through quantitative measures such as yield increase, resource savings, and environmental impact. Enhanced research capabilities and innovation in the agricultural sector will be assessed through metrics such as publications, patents, and technology transfer activities.

Evaluation Plan:
- Milestone Evaluation: Evaluation of project milestones and deliverables, including software prototypes, hardware prototypes, and field testing results. Assessment of project progress against timeline and budget.
- Performance Evaluation: Evaluation of the performance and reliability of the integrated software and hardware system in real-world farming environments. Assessment of the accuracy and efficiency of 3D scene reconstruction and time-based evolution.
- Impact Assessment: Evaluation of the impact of the project on target audiences, including changes in knowledge, actions, and conditions. Measurement of the adoption of precision agriculture technologies and practices among farmers and agricultural professionals. Assessment of the contribution of the project to research innovation and technology transfer in the agricultural sector.

Data Collection:
- Quantitative data collection through surveys, interviews, and assessments of farm management practices.
- Qualitative data collection through case studies, focus groups, and expert opinions.
- Longitudinal data collection to assess the long-term impact of the project on target audiences and the agricultural sector.
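A minimal sketch of one sub-problem named under Data Collection and Analysis: choosing camera viewpoints for the manipulator-mounted camera. It samples candidate poses on a hemisphere around a plant and builds look-at rotation matrices in the OpenCV camera convention (+z toward the target, +y down). All function and parameter names here are illustrative assumptions; the project's actual planner and hardware interface are not described in this report.

```python
"""Hypothetical viewpoint sampler for multi-view image capture of a plant."""
import numpy as np

def look_at(eye: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Rotation whose columns are the camera x (right), y (down), z (forward)
    axes in the world frame (OpenCV convention). Assumes the view direction
    is never parallel to the world z axis."""
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, np.array([0.0, 0.0, 1.0]))
    right /= np.linalg.norm(right)
    down = np.cross(forward, right)
    return np.stack([right, down, forward], axis=1)

def hemisphere_viewpoints(center: np.ndarray, radius: float,
                          n_azimuth: int, n_elevation: int):
    """Yield (position, rotation) camera poses on a hemisphere over `center`."""
    for el in np.linspace(np.deg2rad(15), np.deg2rad(75), n_elevation):
        for az in np.linspace(0.0, 2 * np.pi, n_azimuth, endpoint=False):
            offset = radius * np.array([np.cos(el) * np.cos(az),
                                        np.cos(el) * np.sin(az),
                                        np.sin(el)])
            eye = center + offset
            yield eye, look_at(eye, center)

# Hypothetical usage: 24 views of a plant at the origin from 0.8 m away.
poses = list(hemisphere_viewpoints(np.array([0.0, 0.0, 0.0]), 0.8, 8, 3))
print(len(poses))  # 24 candidate camera poses for multi-view capture
```

A real planner would additionally filter these candidates by manipulator reachability and collision constraints before feeding the captured images to the 3D reconstruction stage.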