Progress 03/01/24 to 02/28/25
Outputs
Target Audience: During this reporting period, our research on robotic actuation-based precision mechanical weeding has reached academic researchers, industry professionals, farmers, students, and policymakers. We engaged the academic community through conference presentations and collaborations with researchers specializing in AI-driven agricultural automation. Industry stakeholders, including agri-tech companies and robotic system developers, showed interest in our work, leading to discussions on commercialization and technology integration. Our outreach also extended to farmers and agricultural practitioners, particularly in the Midwest and Northern Great Plains, where we explored real-world applications of precision mechanical weeding. We participated in farm events such as the Grand Farm Autonomous Nation Conference and Big Iron Show, giving demonstrations to showcase the benefits of AI-driven weed detection and emphasizing reduced herbicide use and improved efficiency. A key focus has been workforce development: training graduate students in robotics, AI, and precision agriculture. Students have actively participated in system development, gaining hands-on experience in mechanical design, AI model training, and real-time system integration. These efforts help ensure the next generation of researchers and engineers is equipped with the skills needed to advance agricultural automation and sustainability.
Changes/Problems: While our simulated environment in RViz and Gazebo has proven invaluable for initial algorithm development and testing, transitioning to real-world deployment with the robot has presented expected, yet significant, challenges. As is typical in robotics, the discrepancy between simulated and real-world performance is apparent. Real-world sensor data exhibits greater noise and susceptibility to outliers than our idealized simulations. Furthermore, the dynamics of the physical environment introduce complexities not fully captured in our models.
These factors necessitate a refinement of our approach to ensure robust and reliable navigation. Going forward, we anticipate implementing more robust filtering techniques to mitigate the impact of the noise and outliers present in real-world sensor data. We are also exploring the incorporation of adaptive control strategies, which will allow the robot to dynamically adjust its behavior based on real-time feedback from the environment, improving its ability to handle unforeseen circumstances and to navigate effectively in complex and unpredictable settings. This iterative process of real-world testing, analysis, and algorithm refinement is crucial for achieving our objective of robust navigation performance.
What opportunities for training and professional development has the project provided? Through this project, we have provided students with hands-on experience with Delta robots, focusing on kinematic modeling and mechanical design. We have also provided training in electronics, particularly with the Arduino Mega 2560, teaching students to integrate mechanical components with electronic systems.
How have the results been disseminated to communities of interest? We have actively engaged in outreach programs with local communities to share the progress and outcomes of this project. For example, we have collaborated with Grand Farm, a local non-profit organization that hosts events such as the Autonomous Nation Conference. This conference brings together farm producers, stakeholders, researchers, and the public to discuss advancements in agricultural technology. Through this platform, the Principal Investigator (PI) has actively promoted the project's goals and provided professional insights by serving as a panelist and invited speaker at similar events.
These efforts have allowed us to disseminate our research findings, demonstrate the practical applications of our autonomous UGV system and AI algorithms, and gather valuable feedback from the agricultural community. This engagement has not only raised awareness of the project but also fostered collaboration and knowledge exchange with key stakeholders in the industry.
What do you plan to do during the next reporting period to accomplish the goals?
Enhance System Performance and Field Testing: Optimize AI algorithms (YOLO models) and sensor fusion (GPS, LiDAR, IMU) for improved weed detection and navigation accuracy. Conduct comprehensive field trials to validate the UGV system's performance in real-world agricultural conditions.
Develop Dynamic Operational Logic and Outreach: Implement object tracking and dynamic operational logic for real-time weed removal in moving field scenarios. Expand outreach efforts by collaborating with organizations like Grand Farm and participating in conferences to disseminate results and gather stakeholder feedback.
Publish Research Findings: Submit at least three manuscripts to peer-reviewed journals to share advancements in AI-based weed detection, autonomous navigation, and precision mechanical weeding.
Impacts
What was accomplished under these goals?
1. Develop AI Algorithms for Weed Identification (Objective 1)
We have developed and trained deep learning models (YOLOv8-YOLOv11) for real-time weed detection, deploying them on an edge device (Jetson AGX Orin) for efficient processing. These AI algorithms have been integrated with robotic systems to enable real-time weed localization and targeting. Additionally, we collected and processed multispectral UAV imagery to create detailed weed and crop segmentation maps using vegetation indices (NDVI, VARI) and deep learning models such as U-Net. To support these efforts, we used annotation tools such as SAM (Segment Anything Model) to create high-quality training datasets for accurate weed and crop classification.
2. Develop an Autonomous UGV System (Objective 2)
We have developed a robust GPS-based navigation system that uses sensor fusion (IMU, encoder, and GPS) combined with an Extended Kalman Filter (EKF) to achieve precise localization and waypoint-based navigation. Simultaneously, we collected and analyzed LiDAR data to enable obstacle detection, row detection, and path planning, with ongoing work to integrate LiDAR-based SLAM for more reliable navigation. Furthermore, we built and tested two robotic platforms, MiniWeedBot, integrated with GPS, Jetson, and Pixhawk systems, to demonstrate autonomous navigation capabilities in controlled environments.
3. Develop a UGV System for Autonomous Weed Recognition and Targeted Spraying (Objective 3)
We have designed and fabricated a Delta robotic arm for precision mechanical weeding, integrating its inverse kinematics with real-time weed detection models to enable accurate targeting and removal of weeds. The system uses an auger drill bit as the end-effector, selected for its ability to minimize crop damage while effectively uprooting weeds. We successfully synchronized weed detection and robotic actuation, allowing the system to identify and remove weeds in real time under static conditions.
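To illustrate the inverse-kinematics step described above, the sketch below solves the standard rotary Delta-arm geometry: given a target end-effector position, it returns the three servo angles. The geometry parameters (base side f, effector side e, upper-arm length rf, forearm length re) are placeholder values for illustration, not the dimensions of the fabricated arm.

```python
import math

def _angle_yz(x0, y0, z0, f, e, rf, re):
    """Solve one arm's servo angle (degrees) in its local YZ plane."""
    t = math.tan(math.radians(30.0))
    y1 = -0.5 * t * f                  # shoulder joint offset on the fixed base
    y0 -= 0.5 * t * e                  # shift target to the effector-side joint
    a = (x0 * x0 + y0 * y0 + z0 * z0 + rf * rf - re * re - y1 * y1) / (2.0 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + rf * (b * b * rf + rf)   # discriminant
    if d < 0:
        return None                    # target outside the reachable workspace
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1.0)  # outer elbow solution
    zj = a + b * yj
    theta = math.degrees(math.atan(-zj / (y1 - yj)))
    return theta + (180.0 if yj > y1 else 0.0)

def delta_ik(x, y, z, f=100.0, e=50.0, rf=150.0, re=300.0):
    """Return the three servo angles for effector position (x, y, z), or None."""
    angles = []
    for i in range(3):
        phi = math.radians(120.0 * i)
        # rotate the target into each arm's local frame
        xr = x * math.cos(phi) + y * math.sin(phi)
        yr = -x * math.sin(phi) + y * math.cos(phi)
        theta = _angle_yz(xr, yr, z, f, e, rf, re)
        if theta is None:
            return None
        angles.append(theta)
    return angles
```

In the deployed pipeline, the (x, y) target would come from the detection model's weed coordinates projected into the arm's workspace frame before the angles are sent to the actuators.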
A graphical user interface (GUI) was also developed to control the system, including features for homing, object detection, and arm actuation, ensuring seamless operation.
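The GPS/IMU/encoder fusion described under Objective 2 follows the standard predict-update cycle of a Kalman filter. The snippet below is a simplified linear sketch of that cycle, using a constant-velocity motion model with GPS position updates only; the actual system fuses IMU and encoder data through a full EKF, and the noise parameters here are illustrative, not tuned values.

```python
import numpy as np

def fuse_gps_step(x, P, z_gps, dt, q=0.1, r=1.0):
    """One predict/update cycle. State x = [px, py, vx, vy]; GPS fix z_gps = [px, py]."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                 # constant-velocity motion model
    Q = q * np.eye(4)                      # process noise (illustrative)
    H = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])   # GPS observes position only
    R = r * np.eye(2)                      # GPS measurement noise (illustrative)
    # Predict: propagate state and covariance through the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the GPS fix
    y = z_gps - H @ x                      # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

Because the filter also estimates velocity, it smooths between GPS fixes and rejects much of the jitter in raw positions, which is what makes waypoint-based navigation stable.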
Publications