Source: NORTH DAKOTA STATE UNIV submitted to NRP
INTELLIGENCE ON FARM WEED CONTROL SOLUTION BASED ON EDGE-AI AND UGV SYSTEMS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1031641
Grant No.
2024-67022-42001
Cumulative Award Amt.
$590,397.00
Proposal No.
2022-11354
Multistate No.
(N/A)
Project Start Date
Mar 1, 2024
Project End Date
Feb 29, 2028
Grant Year
2024
Program Code
[A1551]- Engineering for Precision Water and Crop Management
Recipient Organization
NORTH DAKOTA STATE UNIV
1310 BOLLEY DR
FARGO,ND 58105-5750
Performing Department
(N/A)
Non Technical Summary
This project is dedicated to addressing the pressing need for more effective and environmentally conscious weed control in agriculture. Leveraging advanced Edge-AI technology, we are developing a cooperative Unmanned Ground Vehicle (UGV) system equipped with multiple tanks for precise targeted spraying. Built upon our newly designed Multifunctional Robotic Vehicle, this UGV employs onboard computer vision to accurately identify and treat weed patches in the field. Our interdisciplinary team of experts, including a weed scientist, autonomous vehicle engineer, and AI specialist, collaborates closely with local farmers and stakeholders to validate field experiments. Through this precision agriculture research, we aim to significantly impact not only the North Dakota farming industry but also agricultural practices worldwide, providing farmers with sustainable and efficient weed management solutions that reduce herbicide usage and benefit both their operations and the environment.
Animal Health Component
40%
Research Effort Categories
Basic
30%
Applied
40%
Developmental
30%
Classification

Knowledge Area (KA): 402; Subject of Investigation (SOI): 2300; Field of Science (FOS): 2020; Percent: 100%
Knowledge Area
402 - Engineering Systems and Equipment;

Subject Of Investigation
2300 - Weeds;

Field Of Science
2020 - Engineering;
Goals / Objectives
The overall goal of this project is to develop a UGV system that can be used in precision weed management to automatically recognize and spray weed patches and species in specific locations in real time. Our short-term goal is to develop a UGV system that recognizes and sprays weed species and patches at specific locations using onboard sensors and Edge-AI technologies. Our long-term goal is to deploy these systems and AI algorithms for large-scale weed control and reduce herbicide usage, contributing to the agricultural industry in the US and worldwide.
The specific objectives of the proposed research are:
1. Develop artificial intelligence algorithms based on Edge-AI computing technology to identify weed patches between crop rows.
2. Develop an autonomous UGV system based on the pre-developed Multifunctional Robotic Platform.
3. Develop a UGV system that can autonomously travel to weed locations, recognize the weed on-site, and target spray.
Project Methods
1: Develop artificial intelligence algorithms based on Edge-AI computing and TPU technologies for weed identification in between-row and within-row patches.
The first step will be to import the more than 1 million greenhouse crop and weed species images we previously collected into EfficientNet-EdgeTPU models to test prediction accuracy. We will utilize the North Dakota State University (NDSU) Center for Computationally Assisted Science and Technology (CCAST) cloud computing resources for image processing. The images will be divided into training (80%) and validation (20%) sets. This step will verify that the AI algorithm can identify the crop and weed species.
The second step will be collecting additional image data for field validation. Image data (RGB, thermal, and multispectral) will be collected through sensors installed on the UGV platform. Field data will be acquired biweekly, depending on weather conditions.
The third step will be deploying and validating the selected EfficientNet-EdgeTPU model using the Edge-AI TPU device on UGV systems in the field in Casselton, ND.
2: Develop an autonomous UGV system based on the pre-developed Multifunctional Robotic Platform.
Phase 1. Design and fabrication of drive and steering units. To enhance the driving capacity of the UGV system on soft-soil farm fields and fields with slopes, the power actuators for the drive wheels will be re-calculated and simulated by developing a simulation model in MATLAB/Simulink that accounts for real soil conditions in the study fields.
Phase 2. Central Electronic Control Unit (CECU) design and implementation. To make the UGV system operate autonomously, a control algorithm will be developed and embedded in the system's CECU. First, the CECU algorithms will be developed and validated via computer simulation models in the MATLAB/Simulink environment.
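The 80%/20% training-validation split described above can be sketched as a simple reproducible partition of the image set. This is an illustrative helper, not the project's actual pipeline; the function name, seed, and placeholder filenames are assumptions.

```python
import random

def split_dataset(image_paths, train_ratio=0.8, seed=42):
    """Shuffle image paths reproducibly, then split into training
    and validation lists at the given ratio."""
    paths = list(image_paths)
    random.Random(seed).shuffle(paths)    # deterministic shuffle
    cut = int(len(paths) * train_ratio)   # index of the 80% boundary
    return paths[:cut], paths[cut:]

# Placeholder filenames standing in for the collected imagery:
images = [f"img_{i:04d}.jpg" for i in range(1000)]
train, val = split_dataset(images)
print(len(train), len(val))  # 800 200
```

In practice a split like this would be stratified by species so that rare weed classes appear in both sets.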
The CECU system architecture shown in Figure 6 will have several input signals, such as GPS coordinates, camera data to identify plant rows, and sonic sensor readings to identify plant rows and weeds to spray, and will compute the coordinates of the UGV wheels with respect to the plant rows.
Phase 3. Implementation and validation. The control strategies and algorithms of the CECU, developed and validated via computer simulation models, will be implemented in the field using NI myRIO and Arduino boards and DC motors. Fields with different plant row patterns will be selected, and the system's autonomous drive and steering control algorithms will be tested. Moreover, the UGV's power capacity will be tested in different soil conditions and in fields with slopes.
3: Develop a UGV-based weed identification system that can recognize weed location and species in real time between crop rows and target spray automatically.
Phase 1. We will plant five different weed species (red weed, kochia, pigweed, ragweed, horseweed) at one of our experiment fields in Casselton, ND. Two crop species (soybean and corn) will also be planted at the same location. A duplicate field with the exact experimental design for weed and crop layout will be maintained to minimize any loss of weed or crop data. Six backup plots for weed species will be prepared on the side of each field to serve as spare samples for any additional data collection.
Phase 2. Install RGB, thermal, and 3D cameras on the UGV platform, and design the smart sprayer system. The image acquisition system will first be installed on the UGV; image preprocessing, the AI model, and image post-processing steps will then run on the Edge-AI component. We will collect imagery data from each camera sensor twice a week during the summer in ND.
Phase 3. Once the Edge-AI model is developed, it will be transferred to the Edge-AI kit installed on the back of the UGV system.
A smart sprayer system will be integrated with the Edge-AI kit to receive the signal to target-spray different weed species.
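The detection-to-sprayer signal described above could be as simple as mapping a detection's horizontal position to the nozzle covering that band of the image. The sketch below is a hypothetical illustration; the frame width, nozzle count, and function name are assumed values, not the project's implementation.

```python
def nozzle_for_detection(bbox, frame_width=1280, n_nozzles=4):
    """Map a detection bounding box (x1, y1, x2, y2) in image pixels
    to the index of the spray nozzle covering that horizontal band."""
    x1, _, x2, _ = bbox
    center_x = (x1 + x2) / 2
    band = frame_width / n_nozzles      # pixels covered per nozzle
    return min(int(center_x // band), n_nozzles - 1)

# A weed detected near the left edge triggers nozzle 0;
# one near the right edge triggers nozzle 3:
print(nozzle_for_detection((40, 100, 120, 180)))    # 0
print(nozzle_for_detection((1150, 90, 1270, 200)))  # 3
```

A real system would also account for the UGV's forward speed, delaying actuation until the weed passes under the nozzle.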

Progress 03/01/24 to 02/28/25

Outputs
Target Audience: During this reporting period, our research on robotic actuation-based precision mechanical weeding has reached academic researchers, industry professionals, farmers, students, and policymakers. We engaged with the academic community through conference presentations and collaborations with researchers specializing in AI-driven agricultural automation. Industry stakeholders, including agri-tech companies and robotic system developers, showed interest in our work, leading to discussions on commercialization and technology integration. Our outreach also extended to farmers and agricultural practitioners, particularly in the Midwest and Northern Great Plains, where we explored real-world applications of precision mechanical weeding. We conducted farm events, such as the Grand Farm Autonomous Nation Conference and Big Iron Show, and demonstrations to showcase the benefits of AI-driven weed detection, emphasizing reduced herbicide use and improved efficiency. A key focus has been on workforce development, training graduate students in robotics, AI, and precision agriculture. Students have actively participated in system development, gaining hands-on experience in mechanical design, AI model training, and real-time system integration. These efforts ensure the next generation of researchers and engineers is equipped with the skills needed to advance agricultural automation and sustainability. Changes/Problems: While our simulated environment in RViz and Gazebo has proven invaluable for initial algorithm development and testing, transitioning to real-world deployment with the robot has presented expected, yet significant, challenges. As is typical in robotics, the discrepancy between simulated and real-world performance is apparent. Real-world sensor data exhibits greater noise and susceptibility to outliers than our idealized simulations. Furthermore, the dynamics of the physical environment introduce complexities not fully captured in our models.
These factors necessitate a refinement of our approach to ensure robust and reliable navigation. Going forward, we anticipate implementing more robust filtering techniques to mitigate the impact of unexpected noise and outliers in the real-world sensor data. We are also exploring the incorporation of adaptive control strategies, which will allow the robot to dynamically adjust its behavior based on real-time feedback from the environment, improving its ability to handle unforeseen circumstances and navigate more effectively in complex and unpredictable settings. This iterative process of real-world testing, analysis, and algorithm refinement is crucial for achieving our objective of robust navigation performance. What opportunities for training and professional development has the project provided? Through this project, we have provided students with hands-on experience with Delta robots, focusing on kinematic modeling and mechanical design. We have also provided training in electronics, particularly with the Arduino Mega 2560, and in integrating mechanical components with electronic systems. How have the results been disseminated to communities of interest? We have actively engaged in outreach programs with local communities to share the progress and outcomes of this project. For example, we have collaborated with Grand Farm, a local non-profit organization, which hosts events such as the Autonomous Nation Conference. This conference brings together farm producers, stakeholders, researchers, and the public to discuss advancements in agricultural technology. Through this platform, the Principal Investigator (PI) has actively promoted the project's goals and provided professional insights by serving as a panelist and invited speaker at similar events.
These efforts have allowed us to disseminate our research findings, demonstrate the practical applications of our autonomous UGV system and AI algorithms, and gather valuable feedback from the agricultural community. This engagement has not only raised awareness about the project but also fostered collaboration and knowledge exchange with key stakeholders in the industry. What do you plan to do during the next reporting period to accomplish the goals?
1. Enhance system performance and field testing: Optimize AI algorithms (YOLO models) and sensor fusion (GPS, LiDAR, IMU) for improved weed detection and navigation accuracy, and conduct comprehensive field trials to validate the UGV system's performance in real-world agricultural conditions.
2. Develop dynamic operational logic and outreach: Implement object tracking and dynamic operational logic for real-time weed removal in moving field scenarios, and expand outreach efforts by collaborating with organizations like Grand Farm and participating in conferences to disseminate results and gather stakeholder feedback.
3. Publish research findings: Submit at least three manuscripts to peer-reviewed journals to share advancements in AI-based weed detection, autonomous navigation, and precision mechanical weeding.
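The robust filtering mentioned under Changes/Problems could take a form as simple as a sliding median filter over raw sensor readings, which suppresses isolated outlier spikes without smearing the signal the way a mean filter would. This is a minimal sketch under assumed values; the window size is a tuning parameter, not a project specification.

```python
from statistics import median

def median_filter(readings, window=5):
    """Replace each sensor reading with the median of a sliding
    window centered on it, suppressing isolated outlier spikes."""
    half = window // 2
    out = []
    for i in range(len(readings)):
        lo, hi = max(0, i - half), min(len(readings), i + half + 1)
        out.append(median(readings[lo:hi]))
    return out

# A single 99.0 spike in otherwise smooth distance readings is removed:
print(median_filter([1.0, 1.1, 99.0, 1.2, 1.3]))
```

More sophisticated alternatives, such as the Extended Kalman Filter already used for localization, additionally weight each sensor by its noise model.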

Impacts
What was accomplished under these goals?
1. Develop AI Algorithms for Weed Identification (Objective 1). We have developed and trained deep learning models (YOLOv8-YOLOv11) for real-time weed detection, deploying them on an edge device (Jetson AGX Orin) for efficient processing. These AI algorithms have been integrated with robotic systems to enable real-time weed localization and targeting. Additionally, we collected and processed multispectral UAV imagery to create detailed weed and crop segmentation maps using vegetation indices (NDVI, VARI) and deep learning models such as U-Net. To support these efforts, we used annotation tools such as SAM (Segment Anything Model) to create high-quality training datasets for accurate weed and crop classification.
2. Develop an Autonomous UGV System (Objective 2). We have developed a robust GPS-based navigation system that uses sensor fusion (IMU, encoder, and GPS) combined with an Extended Kalman Filter (EKF) to achieve precise localization and waypoint-based navigation. Simultaneously, we collected and analyzed LiDAR data to enable obstacle detection, row detection, and path planning, with ongoing work to integrate LiDAR-based SLAM for more reliable navigation. Furthermore, we built and tested two robotic platforms, MiniWeedBot, which are integrated with GPS, Jetson, and Pixhawk systems to demonstrate autonomous navigation capabilities in controlled environments.
3. Develop a UGV System for Autonomous Weed Recognition and Targeted Spraying (Objective 3). We have designed and fabricated a Delta robotic arm for precision mechanical weeding, integrating its inverse kinematics with real-time weed detection models to enable accurate targeting and removal of weeds. The system uses an auger drill bit as the end-effector, selected for its ability to minimize crop damage while effectively uprooting weeds.
We successfully synchronized weed detection and robotic actuation, allowing the system to identify and remove weeds in real time under static conditions. A graphical user interface (GUI) was also developed to control the system, including features for homing, object detection, and arm actuation, ensuring seamless operation.
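Synchronizing detection with arm actuation, as described above, requires converting a detected weed's pixel coordinates into the arm's workspace coordinates before the inverse kinematics can be solved. The sketch below assumes a fixed downward-facing camera with a known millimetres-per-pixel scale; all numeric values and the function name are illustrative assumptions, not measured calibration data.

```python
def pixel_to_workspace(px, py, frame_w=1280, frame_h=720,
                       mm_per_px=0.5, offset_x_mm=0.0, offset_y_mm=300.0):
    """Convert a detected weed's pixel coordinates to (x, y) in
    millimetres in the Delta arm's frame, assuming a fixed overhead
    camera whose image centre maps to (offset_x_mm, offset_y_mm)."""
    x_mm = (px - frame_w / 2) * mm_per_px + offset_x_mm
    y_mm = (py - frame_h / 2) * mm_per_px + offset_y_mm
    return x_mm, y_mm

# A detection at the image centre lands at the calibrated offset point:
print(pixel_to_workspace(640, 360))  # (0.0, 300.0)
```

In a deployed system the scale and offsets would come from a camera-to-arm calibration step, and the target would be handed to the Delta arm's inverse kinematics solver.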

Publications