Source: WEST TEXAS A&M UNIVERSITY submitted to NRP
NRI: CASS: CONFIGURABLE, ADAPTIVE, AND SCALABLE SWARM OF GROUND AND AERIAL ROBOTS FOR COLLABORATIVE SMART AGRICULTURE
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1027373
Grant No.
2021-67021-35959
Cumulative Award Amt.
$1,000,000.00
Proposal No.
2021-10941
Multistate No.
(N/A)
Project Start Date
Sep 1, 2021
Project End Date
Aug 31, 2025
Grant Year
2021
Program Code
[A7301]- National Robotics Initiative
Recipient Organization
WEST TEXAS A&M UNIVERSITY
(N/A)
CANYON, TX 79016
Performing Department
(N/A)
Non Technical Summary
The US has been the world's leader in bringing innovative solutions to modern agriculture aimed at improving the quantity and quality of agricultural products while minimizing economic and environmental costs. Large farm sizes have pushed the industry toward big machinery, driven by the need to complete field operations in a timely manner with minimal labor. However, larger machinery has also led to long-term economic and environmental concerns resulting from soil compaction, limited ability to address small-scale field variability, and reduced crop productivity. Several field tasks, such as monitoring crop health, weed scouting and control, assessing soil fertility status, and harvesting, are still typically performed by human workers and require trained labor. Many low-cost, small robots with embedded swarm intelligence bring novel approaches to address these challenges and achieve system-level flexibility and improved robustness. This project will establish a Configurable, Adaptive, and Scalable Swarm (CASS) system consisting of unmanned ground and aerial robots for collaborative smart agriculture by: (1) establishing the overall system architecture and laying the technical groundwork; (2) developing the CASS system consisting of the physical hardware platform, its digital twin, and a user interface (UI); and (3) evaluating the CASS system for its technical functionality, task performance, and usability through testbed evaluations and two real-world case studies. Our long-term goal is to establish a deployable, easy-to-use CASS system that can serve as a universal platform for broad agricultural applications by: (a) establishing the technical and theoretical groundwork for the deployable, scalable swarm system, (b) developing and integrating core component technologies, and (c) producing evidence-based knowledge via comprehensive evaluations.
This NRI project serves as a critical pathway toward this long-term goal. Utilizing a scalable mobile ad-hoc network and a novel swarm decision-making technique based on a probabilistic finite state machine model, the CASS system will seamlessly integrate the hardware platform, its digital twin, and the UI. The hardware platform will employ a heterogeneous team of configurable ground and aerial robots. The digital twin will support i) high-fidelity simulations of individual robots for locomotion and individual sensor performance and ii) low-fidelity simulations of the entire swarm and larger task teams for swarm behavior and collective task performance on digital environmental models created for each target application. Two real-world case studies will be used for evaluation: (Study I) field scouting and weed management and (Study II) determining the locations and behavior of grazing beef cows.
Animal Health Component
40%
Research Effort Categories
Basic
40%
Applied
40%
Developmental
20%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
402 | 1719 | 2020 | 50%
402 | 1830 | 2020 | 25%
402 | 3310 | 2020 | 25%
Goals / Objectives
The overall goal of this project is to develop a Configurable, Adaptive, and Scalable Swarm (CASS) system consisting of unmanned ground and aerial vehicles (UGVs & UAVs) that can dynamically change team formation, size, and interaction with humans when presented with user-defined tasks, which may themselves change dynamically. The three objectives are:
Objective 1. Establish the overall system architecture and lay the technical groundwork for scalable robotic swarms with local communication, decision-making, task planning, and human-swarm collaboration capabilities.
Objective 2. Establish the CASS system consisting of the physical hardware platform, its digital twin, and user interface for collaborative smart agriculture applications.
Objective 3. Evaluate the CASS system for its 1) technical functionality (locomotion, sensing, and communication), 2) task performance at the individual robot and task group levels, and 3) overall system robustness and resilience through testbed evaluations and real-world case studies.
Project Methods
Objective 1. Establish the overall system architecture and lay the technical groundwork for scalable robotic swarms with local communication, decision-making, task planning, and human-swarm collaboration capabilities. Methods: This objective will be carried out in three tasks: 1) establish the overall system architecture; 2) develop the technical framework for a mobile ad-hoc wireless network (MANET); and 3) develop a universal consensus algorithm and its specific use for task allocation in CASS. The overall system architecture will comprise the physical CASS platform consisting of multiple UGVs and UAVs, the digital twin of CASS, and user interface (UI) devices, which are integrated via the ROS 2 middleware. For connecting individual robotic agents and the computing systems, we propose a hybrid wireless communication strategy for establishing a scalable wireless network for CASS via a MANET. Robot-to-hub/cloud and vice versa (R2H) will use WiFi and/or cellular, and robot-to-robot (R2R) will rely on ZigBee. The consensus algorithm, which will form one of the basic functions of the individual robots in the swarm, will allow the robots to make lower-level decisions collectively for autonomous task performance and response to adverse events. The proposed algorithm is simple and scalable and thus ideally suited for implementation in CASS and the proposed ZigBee-based R2R communication strategy. Objective 2. Establish the CASS system consisting of the physical hardware platform, its digital twin, and user interface for collaborative smart agriculture applications. Methods: This objective will be accomplished through the following three tasks: 1) establish the CASS hardware platform; 2) develop the digital twin of CASS; and 3) develop an adaptive UI environment for potential end-users. The CASS hardware will optimally leverage existing resources and new robots built and purchased from this project. The target size is 25 robots, including 15 ground vehicles and 10 aerial vehicles.
Existing and new robots will be configured to share common wireless communication capabilities and feature various sensors for target experiments. The digital twin of CASS will be developed to offer high-fidelity local-level simulations of individual robots and small task teams, with capabilities for simulating onboard sensors, wireless communication, and mobility, and low-fidelity swarm-level simulations of larger task teams or the entire swarm for collective task performance. Building reliable and valid environmental models for various agriculture applications involves accurate sensor data captured by UAVs and UGVs and processing and extrapolation of the data to create a valid digital counterpart of the real environment. Our primary focus will be on a row-crop field, while the research outcomes will be transformative and thus extensible to other application areas, such as livestock farms. Two specific UI devices considered for this project are touchscreen tablets and wearable AR devices. Objective 3. Evaluate the CASS system for its 1) technical functionality (locomotion, sensing, and communication), 2) task performance at the individual robot and task group levels, and 3) overall system robustness and resilience through testbed evaluations and real-world case studies. Methods: Technical evaluation of the CASS system will be conducted in the following phases: [Phase I] technical evaluations using a testbed environment (i.e., cotton fields) throughout the technical development phases at each milestone; and [Phase II] two case studies to evaluate the real-world performance of CASS, the validity of the digital twin for predicting individual robot- and swarm-level task performance and for simulating larger-scale crop field performance, and the usability of the UI modalities. Phase I will adopt standard engineering protocols for the technical evaluation of each component.
Phase II focuses on two real-world case studies selected to demonstrate CASS capabilities in diverse applications: Case Study I) field scouting and field management in cotton and peanut fields; and Case Study II) localization and grazing behavior detection for beef cows. Case Study I will employ an optimal number of UGVs and UAVs determined by the low-fidelity simulator. Multiple swarm algorithms for coordinated field scouting tasks will be selected and embedded for each type of platform. Given the entire CASS system with available UGVs/UAVs, the consensus algorithm will determine a specific task team to be deployed. When weeds are detected while scouting, the robots will broadcast an alert message with the location and type (small or large) of weeds through the established hybrid MANET. A specific robot platform with the necessary weed treatment equipment will then be dispatched. Since we have a limited number of robots with such specialized capabilities, this process will not require local decision-making. Case Study II will focus on using CASS consisting of multiple UAVs to collect aerial images of cattle in pastures to determine the feasibility of UAV-collected images for creating a digital twin model of a herd of 300-350 cows, tracking their locations, and projecting their behavior.
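The report does not give the details of the consensus algorithm. As an illustration only, a probabilistic finite-state-machine style update of the kind referenced in the project summary might look like the following sketch; the states, the gain k, and the neighbor-sampling scheme are all assumptions, not the project's actual design:

```python
import random

# Minimal PFSM-style consensus sketch: each robot keeps a discrete state
# and probabilistically switches toward the option it hears most often
# from nearby robots over the R2R link. All parameters are illustrative.

STATES = ["scout", "treat_weeds", "idle"]

def update_state(my_state, neighbor_states, k=0.2, rng=random):
    """Switch to the locally most common state with probability
    proportional to its share among the sampled neighbors."""
    if not neighbor_states:
        return my_state
    counts = {s: neighbor_states.count(s) for s in STATES}
    majority = max(counts, key=counts.get)
    p_switch = k * counts[majority] / len(neighbor_states)
    if majority != my_state and rng.random() < p_switch:
        return majority
    return my_state

# Toy swarm: repeated local updates drive the group toward agreement.
rng = random.Random(42)
swarm = [rng.choice(STATES) for _ in range(20)]
for _ in range(200):
    i = rng.randrange(len(swarm))
    neighbors = [swarm[j] for j in range(len(swarm)) if j != i][:5]
    swarm[i] = update_state(swarm[i], neighbors, rng=rng)
print(max(set(swarm), key=swarm.count))  # the locally dominant state
```

Because each robot needs only its neighbors' states, an update like this fits a low-bandwidth ZigBee R2R link and scales with swarm size.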

Progress 09/01/23 to 08/31/24

Outputs
Target Audience: Nothing Reported Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? This project brought together a diverse group of graduate and undergraduate students from various disciplines, fostering a collaborative learning environment where participants could exchange ideas and expertise across fields. During this reporting period, the team included four graduate students from Mechanical Engineering, one from Soil & Crop Sciences, one from Animal Science, and one from Agricultural Engineering. These students worked together to develop several foundational technologies that are integral to this project's goals and participated in the experimental case studies. A significant aspect of the project was the interdisciplinary training provided to students. Several engineering students were newly trained and certified to conduct animal research under Institutional Review Board (IRB)-approved protocols. Those with no prior experience working with aerial vehicles were trained to comply with Federal Aviation Administration (FAA) regulations. Students from non-engineering disciplines were also introduced to robot operations and the technical aspects of ground support systems. To facilitate this knowledge exchange, we held monthly project meetings and an annual project workshop in August 2024. How have the results been disseminated to communities of interest? The results have been disseminated through peer-reviewed publications (as listed under "Products") and through the presentations and invited talks listed below: (Invited Panelist) Kiju Lee, Inventures Conference, Farming Futures Panel Discussion, Calgary, Canada, May 30, 2024 (Abstract + Oral Presentation) (Speaker: Kiju Lee) K. Lee, K-N. Lee, Y. Wei, J. Johnson, J. Landivar, E. Mendes, C. Lindhorst, D. Um, J. Cason, R. Hardin, L. Tedeschi, J. Landivar-Bowles, M.
Bhandari, "Challenges and Opportunities of Swarm Robotics for Smart and Collaborative Agriculture," International Commission of Agricultural and Biosystems Engineering, Jeju, Korea, May 19-23, 2024 (Invited Speaker) Kiju Lee, National Institute for Occupational Safety and Health - Robotics Interest Forum Seminar, Online, May 15, 2024, Title: "Swarm Robotics and Its Potential for Complex Real-World Problem" (Invited Speaker) Kiju Lee, Interagency Working Group (IWG), The Intelligent Robotics and Autonomous Systems (IRAS), Title: "Swarm Robotics and Its Potential for Complex Real-World Problem" [National] (Formed in 2017, the IRAS IWG coordinates federal R&D across 28 participating agencies, chaired by Dr. Thomson, National Program Lead of the Division of Agricultural Systems at USDA NIFA, and Dr. Wachs, Program Director, Division of Information and Intelligent Systems, NSF) (Invited Presenter) Kiju Lee, Data-Driven Intelligent Agricultural Systems Symposium, Texas A&M AgriLife, February 8, 2024, Title: "Challenges and Opportunities of Swarm Robotics and Its Real-World Application in Agriculture" [Regional] What do you plan to do during the next reporting period to accomplish the goals? Research: We will finalize the technical development of the physical CASS system; test, evaluate, and implement multirobot communication protocols/algorithms; implement multirobot algorithms tailored for target agricultural tasks (i.e., scouting, weed localization/mapping, soil mapping); and focus on carrying out the case studies designed to showcase the fully integrated CASS system and its capabilities. As the project enters the last year of the 4-year project period, the team will collectively seek continued funding opportunities to expand and further develop this core technology and investigate ways to bring it to farmers.
Dissemination of the results: We are currently working on a data paper that will make the UGV-collected data publicly available to support other researchers in robotics, agriculture, and other relevant fields. We also plan several co-authored journal and conference papers reporting the results of system-level integration and experiments. The team is also actively involved with several university-wide initiatives, allowing us to present and showcase the results from this project on various occasions.

Impacts
What was accomplished under these goals? OBJECTIVE 1. Establish the overall system architecture and lay the technical groundwork for scalable robotic swarms with local communication, decision-making, task planning, and human-swarm collaboration capabilities. During this reporting period, we focused on developing coverage control and task allocation algorithms and on system-level integration and implementation for multi-robot task performance. This is a critical milestone towards achieving swarm-level intelligence for target agricultural tasks. We have also continued improving our hardware capabilities by building additional robots and developing prototypes for new capabilities. Hardware development: We have built three additional hammerhead-bots (HHBs) in addition to the two existing ones, for a total of 12 fully functional ground robots (5 HHBs, 4 AION-R1s, 2 CLAWbots, 1 α-WaLTR) for this project. In addition, we worked on equipping some of these robots with special functions, such as adding a soil sensor. This custom-built ground robot with autonomous scouting capabilities has been developed for specialized, real-time soil parameter measurement. The system integrates robust mechanical components, advanced sensing technologies, and an efficient control system, optimized for field applications through the use of cost-effective and lightweight materials. Software development: During the first two years of technical development (2021-2023), we primarily focused on the autonomous algorithms necessary for individual robots. During the current reporting period, we focused on multi-robot algorithms, including coverage control and task allocation, and on integrating component technologies and algorithms for system-level integration. The architecture of the autonomous crop field scouting system using a heterogeneous multirobot system (MRS) is built on the Robot Operating System (ROS) 2 framework. Wireless communication is facilitated by a local radio mesh network.
High-resolution aerial images from a UAV are used to detect the paths between crop rows. Task allocation algorithms then determine the optimal number and type of robots and assign them to specific locations, minimizing robot use while considering capacities and field constraints. During scouting, robots transmit pose data to a central computer for real-time visualization via the UI. All robots run ROS 2 Humble on Ubuntu 22.04. Individual robots are equipped with autonomous localization and navigation algorithms for field scouting. Task allocation, within the scope of the present work, aims to assign robots to a target field that may involve more than one distinct region. The robot team may include diverse robots, each with unique locomotion capabilities and physical dimensions; therefore, the types of robots that can operate in specific regions may be constrained by these factors. Optimized task allocation seeks to assign appropriate robots to specific regions based on the constraints and capacities of each robot type while minimizing the total number of robots used. Consider distinct regions that are subject to coverage. For each region, the crop row detection algorithm determines the number of rows and the length of each row. The size of the region is calculated based on the total travel length required for scouting. Based on that information, the system calculates the total length of the crop rows to be covered and automatically determines the number and types of robots required to complete the task within a user-defined time limit. OBJECTIVE 2. Establish the CASS system consisting of the physical hardware platform, its digital twin, and user interface for collaborative smart agriculture applications. UGV-enabled data collection on crop fields: We continued collecting data for cotton and peanut fields at two locations, Corpus Christi and College Station, TX. The data collection was performed using UGVs.
The locations for the data collection were the cotton field in Corpus Christi and the cotton and peanut fields in College Station. RGB and depth images were collected according to the seasons. We are also annotating the images for training a machine-learning-based model to detect plant growth, weeds, and other key features. The team is collectively working on a data publication with case studies demonstrating the use of the UGV-collected data. Digital Twin and technical updates from the Corpus Christi sites: Early-season yield predictions in commercial cotton fields can serve to explore various production and marketing management strategies. The advancement of remote sensing technology has enabled the collection of vast amounts of precise, spatio-temporal data on field crops. This, coupled with progress in data processing and analytics, presents an opportunity to develop dynamic, data-driven models capable of simulating crop performance from the growing season to harvest. In this study, we evaluated a yield prediction method that utilizes high-resolution UAS imagery to create Digital Twin models. This process involves the development of temporal Digital Twin models for canopy cover, plant height, canopy volume, and vegetative indices, which are updated on a weekly basis. These Digital Twin models are then employed as input to an artificial intelligence algorithm that predicts cotton lint yield. The resulting platform, based on Digital Twins, demonstrated the ability to obtain reliable yield estimates approximately 8-10 weeks before harvest.
The 2024 cotton growing season in the Lower Coastal Bend Region of Texas was characterized by low accumulation of heat units and adequate soil water during the vegetative phase. However, the crop experienced high temperatures and low rainfall during the boll filling period, followed by excessive rainfall during the boll opening phase. A commercial cotton field located at Driscoll, Texas, was monitored weekly, from planting to harvest, using an Unmanned Aircraft System (UAS) equipped with an RGB camera. Data extracted from the collected images included canopy cover, plant height, canopy volume, and the Excess Greenness vegetative index. The temporal plant features were used as input to a Digital Twin model to evaluate the performance of the crop in terms of yield potential throughout the season and to forecast the crop defoliation date. The Digital Twin model forecasted an increasing yield trend until the early bloom period, followed by a drastic reduction in yield potential during the boll filling period and an unexpectedly early crop maturity date. OBJECTIVE 3. Evaluate the CASS system for its 1) technical functionality (locomotion, sensing, and communication), 2) task performance at the individual robot and task group levels, and 3) overall system robustness and resilience through testbed evaluations and real-world case studies. The physical demonstration was conducted at the Texas A&M AgriLife Research farm, where peanut and cotton plants are grown. The autonomous crop row detection algorithm was applied to extract the geo-coordinates between the crop rows in these two regions. The algorithm determined the sizes of these regions to be 508 and 5018, respectively. Although both peanut and cotton rows were planted at 1-m row spacing, the usable distance between the rows is reduced as the plants grow larger and begin to close the canopy.
At the time of the field testing, the average gap between the two crop row canopies was 0.4 m for peanut (Region 1), which has a prostrate plant morphology, and 0.6 m for cotton (Region 2). We employed three robots custom-built on the AION-R1 chassis (AION Robotics) [Type 1] and three built from the Hammerhead Chassis Kit. All of these robots are equipped with localization and autonomous navigation functions.
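The row-based task-allocation step described above (total row length, chassis-vs-gap constraints, and a user-defined time limit determining robot counts) can be sketched as follows. The robot speeds, the two platform entries, and the greedy fastest-first rule are illustrative assumptions, not the project's actual parameters or algorithm:

```python
import math

# Illustrative sketch: estimate how many robots of which type are needed
# to cover a region's total crop-row length within a time limit, subject
# to whether each chassis fits the inter-row gap. Values are assumptions.

ROBOT_TYPES = {
    # name: (travel speed in m/s, minimum row gap the chassis fits, m)
    "AION-R1": (0.8, 0.5),
    "Hammerhead": (0.6, 0.35),
}

def robots_needed(total_row_length_m, row_gap_m, time_limit_s):
    """Return {type: count} using the fewest robots that can cover the
    region within the time limit, honoring the chassis-width constraint."""
    plan = {}
    feasible = {n: s for n, (s, gap) in ROBOT_TYPES.items() if gap <= row_gap_m}
    if not feasible:
        return plan  # no platform fits between these rows
    # Greedy: prefer the fastest feasible platform to minimize robot count.
    name = max(feasible, key=feasible.get)
    per_robot = feasible[name] * time_limit_s  # coverable length per robot
    plan[name] = math.ceil(total_row_length_m / per_robot)
    return plan

# Example: a region with 5018 m of rows and 0.6 m gaps (cf. Region 2),
# to be scouted within 30 minutes.
print(robots_needed(5018, 0.6, 30 * 60))  # prints {'AION-R1': 4}
```

With the narrower 0.4 m peanut gaps, the wider platform drops out of the feasible set and the smaller chassis is selected instead, mirroring the canopy-closure constraint described in the text.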

Publications

  • Type: Other Journal Articles Status: Published Year Published: 2024 Citation: Lee K, Lee K. Terrain-aware path planning via semantic segmentation and uncertainty rejection filter with adversarial noise for mobile robots. Journal of Field Robotics. 2024.
  • Type: Other Journal Articles Status: Published Year Published: 2024 Citation: Wei Y, Lee K. CLAW: Cycloidal Legs-Augmented Wheels for Stair and Obstacle Climbing in Mobile Robots. IEEE/ASME Transactions on Mechatronics. 2024 Aug 22.
  • Type: Other Journal Articles Status: Published Year Published: 2024 Citation: Lee K, Lee K. Adaptive Centroidal Voronoi Tessellation With Agent Dropout and Reinsertion for Multi-Agent Non-Convex Area Coverage. IEEE Access. 2024 Jan 8.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Wei Y, Lee K, Lee K. Autonomous Field Navigation of Mobile Robots for Data Collection and Monitoring in Agricultural Crop Fields. In: 2024 21st International Conference on Ubiquitous Robots (UR), 2024 Jun 24 (pp. 707-712). IEEE.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Jarecki A, Lee K. Mixed Reality-Based Teleoperation of Mobile Robotic Arm: System Apparatus and Experimental Case Study. In: 2024 21st International Conference on Ubiquitous Robots (UR), 2024 Jun 24 (pp. 198-203). IEEE.


Progress 09/01/22 to 08/31/23

Outputs
Target Audience: Nothing Reported Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? This project involved graduate and undergraduate students from different majors and disciplines, enabling them to collaborate with and learn from each other. During this reporting period, four students in Mechanical Engineering, two in Soil & Crop Sciences, one in Animal Science, and three in Agricultural Engineering participated in this project, developing several foundational technologies to support the project and case studies. Several of the engineering students were newly trained and certified to conduct animal research through IRB-approved protocols; those who had never worked with aerial vehicles were trained on FAA regulations; and those in non-engineering majors learned about robot operations and the technical groundwork. We held three on-site, in-person workshops to bring this highly multidisciplinary team together to learn about the real environments and limitations and to conduct field experiments. To explore the opportunity to use an existing custom-built UAV from the Texas A&M AgriLife group, we invited the developer of the system to Texas A&M University, College Station, to provide technical training and share insights about the functional extension of the system for our project team. All students involved in this project actively participated in monthly project meetings, presenting research updates and engaging in discussions, helping them develop professional communication and presentation skills. Senior personnel worked closely with their advisees as well as other graduate students in this project by serving on their advisory committees and co-supervising task-specific research activities. How have the results been disseminated to communities of interest? The results have been disseminated through journal publications and conference proceedings/presentations. C. Zheng, A. Jarecki, and K.
Lee (2023), "Integrated System Architecture for Virtual-Physical Hybrid Robotic Swarms and Human-Swarm Interaction," Scientific Reports, 13(1): 14761, doi: 10.1038/s41598-023-40623-6; M. Hammond and K. Lee (2023), "ARMoR: Amphibious Robot for Mobility in Real-World Applications," IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), June 27 - July 1, 2023, Seattle, WA, USA; B. Gurjar, S. Kumar, J. Johnson, R. Hardin, M. Bagavathiannan (2023), "Designing and Testing a Micro-Volume Spray System for Site-Specific Herbicide Application Using Ground Robots," ASA, CSSA, SSSA International Annual Meeting; J. Johnson, D. Wang, C. Hu, S. Xie, D. Song, and M. Bagavathiannan, "Development of a robotic flaming weed control system using a six-axis manipulator mounted on a quadruped robot," WSSA 2024 (under review); K-N. Lee and K. Lee, "Terrain-Aware Path Planning via Semantic Segmentation and Uncertainty Rejection Filter with Adversarial Noise for Mobile Robots," submitted to Journal of Field Robotics (under review); Y. Wei and K. Lee, "CLAW: Cycloidal Legs-Augmented Wheels for Stair and Obstacle Climbing in Mobile Robots," submitted to IEEE/ASME Transactions on Mechatronics (under review); K-N. Lee and K. Lee, "Decentralized Multi-Robot Coverage Control of Non-Convex Domains using Consensus-based Self-Dropout with Voronoi Tessellation," submitted to IEEE International Conference on Robotics and Automation (under review); K-N. Lee and K. Lee, "Adaptive Centroidal Voronoi Configuration with Agent Dropout and Reinsertion for Multi-Robot Area Coverage in Non-Convex Domains," submitted to IEEE Access (under review). What do you plan to do during the next reporting period to accomplish the goals? OBJECTIVE 1: Test multirobot mesh communication at different scales, up to 10+ units. Conduct physical experiments for multirobot task allocation and coverage control employing 10+ robotic agents. Evaluate and improve the UI.
OBJECTIVE 2: Finalize the simulation software that takes a satellite image of the target field as input and generates the simulation environment. Integrate the UAV-collected data to create a high-fidelity simulation environment and technically merge the component technologies for full-scale swarm simulations. Conduct user evaluation studies for the UI design, modalities, and usability. OBJECTIVE 3: Finalize the design and detailed protocols of the two case studies and prepare the necessary paperwork.

Impacts
What was accomplished under these goals? OBJECTIVE 1 Hardware development: We continued to develop a multi-robot platform to establish a configurable team of 10+ heterogeneous robots sharing wireless communication and collaborative capabilities. Building on the α-WaLTR platforms reported in the previous annual report, we built and tested new robots, including CLAWbot (1), AION-R1 platforms with custom sensors (4), hammerhead robots with custom sensors (2*), ARMoR platforms (3*), and a custom quadcopter UAV (1). The numbers in parentheses indicate the number of units, where * indicates units still being assembled. Together with the two α-WaLTR platforms, we expect to have 12 ground units and 1 aerial unit fully functional by early Spring 2024. The α-WaLTR platforms have a Jetson TX2, and all other platforms have a Raspberry Pi as the main onboard computer. We adopted the Rajant wireless mesh network device as the primary wireless communication device for all robots except the ARMoRs, which are smaller robots configured for localized tasks within a short distance of one of the other platforms equipped with the Rajant devices. All robots have an XBee device for short-range, low-bandwidth, low-power communication among themselves. All robots operate on ROS 2 (Robot Operating System 2). Software development: To equip the ground platforms with core autonomous functions, several component algorithms have been developed for autonomous localization and navigation and for multi-agent task allocation in agricultural environments. For localization, it is essential to integrate data from diverse sensors for a comprehensive result. The Extended Kalman Filter (EKF) algorithm has been widely implemented for robot state estimation. We applied a comparable approach to integrate data from GPS, IMU, and wheel encoders for accurate robot state estimation. For autonomous navigation, we implemented the artificial potential field method.
This method establishes a virtual potential field in which surrounding obstacles generate repulsive forces that propel the robot away from them to avoid collisions. We also investigated algorithms for task allocation in a multirobot system. Given an agricultural field subject to monitoring or data collection, the field must be divided and assigned to individual robots. Such problems are generally referred to as coverage control problems in multiagent systems. Many solutions to these problems exist, but covering agricultural fields using UGVs involves additional constraints. OBJECTIVE 2 UGV-enabled data collection: The data collection was performed using UGVs. The locations for the data collection were the cotton field in Corpus Christi and the cotton and peanut fields in College Station. RGB and depth images were collected according to the seasons. The images of the cotton field in Corpus Christi were captured from April to August 2023. In College Station, images of the cotton and peanut fields were collected from June to October 2023. We collected 45,147 images (32,668 from Corpus Christi and 12,479 from College Station). From the collected data, we plan to annotate the images for object detection and semantic segmentation. The annotated images will serve as a training dataset for object detection and semantic segmentation neural networks. To create the training dataset, we will perform the following procedure: 1) select representative images, 2) manually annotate the selected images, 3) train the neural networks, 4) generate annotated images from a subset of raw images using the trained neural networks, 5) manually correct mislabeled areas in the annotated images created by the neural networks, 6) merge the annotated images from the neural networks into the training dataset, and 7) return to step 2. UAV-enabled autonomous weed detection: The research was conducted at the David Wintermann Rice Research and Extension Center in Eagle Lake, TX.
The specific objectives of this study were to 1) identify weed patches in a rice crop using image analysis techniques and 2) compare the efficacy of drone-based precision herbicide application with conventional backpack spray application. The weed species targeted in this research were Amazon sprangletop (Leptochloa panicoides), yellow nutsedge (Cyperus esculentus), and hemp sesbania (Sesbania herbacea). The You Only Look Once (YOLOv5) model was used to detect late-season weeds, and a transformation matrix was used to convert image coordinates into global coordinates for site-specific herbicide applications to manage individual weed patches in rice using a UAS sprayer. Results showed that the detection accuracy of the YOLOv5l model for hemp sesbania, Amazon sprangletop, and yellow nutsedge was 70%, 62%, and 56%, respectively. The findings also showed that model-extracted coordinates had lower positional accuracy than manual GPS coordinates when the transformation matrix had errors. The GPS-based site-specific aerial spray application saved up to 45% of herbicide volume compared to the broadcast application. These findings demonstrate the high potential of deep-learning models and UAS for site-specific management of weed escapes in rice. Future improvements will include real-time weed detection and spraying using an onboard data processing system. Additional experiments are also vital for improving the detection accuracy of the model in rice. Preliminary investigation of a mixed-reality (MR) based user interface (UI): The primary UI modality we proposed was an MR-based interface using Microsoft's HoloLens 2. As the first phase of development and evaluation, we designed a simple MR-based UI for a user to remotely control a UGV with a robotic manipulator mounted on top for mobile manipulation.
The robot and HoloLens communicated with each other via WebSocket, supporting long-distance and outdoor communication. The UI was designed using the Unity game engine. It visualizes interactive holograms of the main menu buttons, a camera panel, and a holographic representation of the robot arm. The overall color scheme was selected to reduce user eye strain. A preliminary test with 10 participants was conducted, and we are currently analyzing the results. The participants were divided into two groups, one using a conventional computer interface and the other using the HoloLens interface. Each participant was directed to identify four small, brightly colored targets in a room by controlling the robot with the assigned UI device. We administered a brief assessment regarding their experience along with the NASA TLX questionnaire. The results are being analyzed collectively and will be reported in the next annual report.

Hybrid Robotic Swarm (HyRoS) simulator: We have also developed a hybrid swarm simulator to support this project (Zheng, Jarecki, and Lee, 2023). The Hybrid Robotic Swarm (HyRoS) simulator can simulate a hybrid swarm of physical and virtual agents to develop, implement, and evaluate swarm algorithms and related technologies. The system comprises three main modules: (1) the virtual module simulating robotic agents, (2) the physical module consisting of real robotic agents, and (3) the UI module (Figure 12). Communication among the modules is facilitated by the Photon Network and ROS bridges. We also adopted the HoloLens as the UI modality here.

OBJECTIVE 3

Yuan Wei, a graduate research assistant, has obtained a Remote Pilot Certificate from the FAA to fly the UAV under the FAA's Small UAS Rule (Part 107). To fly the drone on the university's property (cattle farm and plant farm), we need approval from the Texas A&M University System Office of Risk Management. We have currently gained permission to fly the drone at the university cattle farm.
We plan to obtain permission to fly the drone over the crop fields next year. The team is finalizing specific demonstration scenarios and experimental protocols for the case studies.
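The artificial potential field method for obstacle avoidance described under Objective 1 can be sketched as follows. This is a minimal illustration rather than the project's implementation; the repulsive gain `k_rep` and influence radius `d0` are assumed parameters, not values from the project.

```python
import math

def repulsive_force(robot, obstacles, k_rep=1.0, d0=2.0):
    """Sum the repulsive forces pushing the robot away from nearby obstacles.

    Each obstacle within the influence radius d0 contributes a force along the
    obstacle-to-robot direction whose magnitude grows as the distance shrinks
    (the negative gradient of the potential U = 0.5 * k_rep * (1/d - 1/d0)^2).
    """
    fx, fy = 0.0, 0.0
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if d < 1e-9 or d >= d0:
            continue  # obstacle outside its influence range (or coincident)
        mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
        fx += mag * dx / d
        fy += mag * dy / d
    return fx, fy

# An obstacle 1 m ahead on the x-axis pushes the robot in the -x direction;
# an obstacle 5 m away lies beyond d0 and contributes nothing.
force = repulsive_force((0.0, 0.0), [(1.0, 0.0), (0.0, 5.0)])
```

In a full controller, this repulsive term is summed with an attractive force toward the goal, and the resultant vector steers the robot around obstacles.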

Publications

  • Type: Journal Articles Status: Under Review Year Published: 2024 Citation: K-N. Lee and K. Lee, "Terrain-Aware Path Planning via Semantic Segmentation and Uncertainty Rejection Filter with Adversarial Noise for Mobile Robots," submitted to the Journal of Field Robotics
  • Type: Conference Papers and Presentations Status: Under Review Year Published: 2024 Citation: K-N. Lee and K. Lee, "Decentralized Multi-Robot Coverage Control of Non-Convex Domains using Consensus-based Self-Dropout with Voronoi Tessellation," submitted to the IEEE International Conference on Robotics and Automation
  • Type: Journal Articles Status: Under Review Year Published: 2024 Citation: Y. Wei and K. Lee, "CLAW: Cycloidal Legs-Augmented Wheels for Stair and Obstacle Climbing in Mobile Robots," submitted to IEEE/ASME Transactions on Mechatronics
  • Type: Journal Articles Status: Under Review Year Published: 2024 Citation: K-N. Lee and K. Lee, "Adaptive Centroidal Voronoi Configuration with Agent Dropout and Reinsertion for Multi-Robot Area Coverage in Non-Convex Domains," submitted to IEEE Access
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Designing and Testing a Micro-Volume Spray System for Site-Specific Herbicide Application Using Ground Robots, Authors: Bholuram Gurjar, Sunil Kumar, Joe Johnson, Robert G Hardin, Muthu Bagavathiannan
  • Type: Conference Papers and Presentations Status: Under Review Year Published: 2024 Citation: Development of a robotic flaming weed control system using a six-axis manipulator mounted on a quadruped robot, Authors: Joe Johnson, Di Wang, Chengsong Hu, Shuangyu Xie, Dezhen Song, and Muthukumar Bagavathiannan. Under review: WSSA 2024
  • Type: Journal Articles Status: Published Year Published: 2023 Citation: Zheng, C., Jarecki, A. & Lee, K. Integrated system architecture with mixed-reality user interface for virtual-physical hybrid swarm simulations. Sci Rep 13, 14761 (2023). https://doi.org/10.1038/s41598-023-40623-6
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: M. G. Hammond and K. Lee, "ARMoR: Amphibious Robot for Mobility in Real-World Applications," 2023 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Seattle, WA, USA, 2023, pp. 910-915, doi: 10.1109/AIM46323.2023.10196173.


Progress 09/01/21 to 08/31/22

Outputs
Target Audience: Nothing Reported Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? This project involved graduate and undergraduate students from different majors and disciplines, enabling them to collaborate with and learn from each other. During this reporting period, four students in Mechanical Engineering, two in Soil & Crop Sciences, one in Animal Science, and three in Agricultural Engineering participated in this project, developing several foundational technologies to support the project and case studies. Several of the engineering students were newly trained and certified to conduct animal research through IRB-approved protocols; those who had never worked with aerial vehicles were trained on FAA regulations; and those in non-engineering majors learned about robot operations and the technical groundwork. We held three on-site, in-person workshops to bring this highly multidisciplinary team together to learn about the real environments and their limitations and to conduct field experiments. To explore the opportunity to use an existing custom-built UAV from the Texas A&M AgriLife group, we invited the developer of the system to Texas A&M University, College Station, to provide technical training and share insights about the functional extension of the system for our project team. All students involved in this project actively participated in monthly project meetings, presenting research updates and engaging in discussions, which helped them develop professional communication and presentation skills. Senior personnel worked closely with their advisees as well as other graduate students in this project by serving on their advisory committees and co-supervising task-specific research activities. How have the results been disseminated to communities of interest? MARS Conference: PI Lee presented her newest ground robot, α-WaLTR, at a private conference hosted by Amazon founder Jeff Bezos.
This robot, developed under a prior DARPA project, is an important part of this NRI project. Lee showcased this NRI project as the first real-world application of the robot in complex missions. Papers under review and in preparation: Based on the outcomes of the first project year, we have one journal and one conference paper under review and several more (two journal and one conference papers) currently in preparation. Upon acceptance, the publication information will be uploaded accordingly. Media publicity: This project has been featured in external and institutional media, as listed below.

The Eagle, "Adaptive swarm robotics could revolutionize smart agriculture." December 13, 2021. https://theeagle.com/landandlivestockpost/adaptive-swarm-robotics-could-revolutionize-smart-agriculture/article_dea08796-4c9b-11ec-9902-032be0e63d11.html

Texas Farm Bureau, "Swarm robotics could help farmers and ranchers," by Jennifer Whitlock. December 2, 2021. https://texasfarmbureau.org/swarm-robotics-could-help-farmers-and-ranchers/

Texas A&M Today; Texas A&M Engineering News, "Adaptive swarm robotics could revolutionize smart agriculture." November 15, 2021. https://today.tamu.edu/2021/11/15/adaptive-swarm-robotics-could-revolutionize-smart-agriculture/

What do you plan to do during the next reporting period to accomplish the goals?

Objective 1. Establish the overall system architecture and lay the technical groundwork for scalable robotic swarms with local communication, decision-making, task planning, and human-swarm collaboration capabilities.
Task 1-1. Test, modify, and upgrade wireless networking using the selected radio devices for 10+ dynamic nodes.
Task 1-2. Improve the localization function to reduce the resolution from several meters to the 1-2-meter range.
Task 1-3. Establish the swarm system supporting virtual-physical hybrid swarm simulations and develop swarm tactical algorithms that can support broad agricultural tasks (e.g., scouting, area coverage, target localization, tracking, etc.).
Objective 2. Establish the CASS system consisting of the physical hardware platform, its digital twin, and a user interface for collaborative smart agriculture applications.
Task 2-1. Conduct year-long data collection from target agricultural fields to 1) develop new navigation algorithms that are reliable and efficient in agricultural environments; 2) establish digital twin environments; and 3) test mobile platforms for autonomous data collection.
Task 2-2. Complete technical development and customization of four Aion-R1 platforms, four α-WaLTR platforms, and one aerial testbed platform.
Task 2-3. Develop a physics-based digital twin environment using the data from Task 2-1 for high- and low-fidelity simulations.
Objective 3. Evaluate the CASS system for its 1) technical functionality (locomotion, sensing, and communication), 2) task performance at the individual-robot and task-group levels, and 3) overall system robustness and resilience, through testbed evaluations and real-world case studies.
Task 3-1. Complete technical preparation to support Case Study II (cattle farm) and develop 2-3 experimental scenarios involving 1-2 UASs, cows with GPS ear tags, and sensors installed in the feeding stations, water chutes, and mineral feeders.
Task 3-2. Prepare for technology field operations by equipping the system with a portable communication system and developing deployment plans and protocols.
Task 3-3. Develop tentative scenarios for Case Study I (cotton/peanut fields), involving diverse tasks commonly required in these fields.
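The area-coverage capability planned in Task 1-3 builds on coverage control, where a field is partitioned among robots and each robot moves toward the centroid of its region. A minimal Lloyd-iteration sketch over a discretized field illustrates the idea; the uniform 10x10 grid, the two robots, and their start positions are assumed for illustration and are not the project's decentralized algorithm:

```python
def voronoi_partition(cells, robots):
    """Assign each grid cell to the nearest robot (discrete Voronoi regions)."""
    regions = {i: [] for i in range(len(robots))}
    for cx, cy in cells:
        nearest = min(range(len(robots)),
                      key=lambda i: (cx - robots[i][0]) ** 2 + (cy - robots[i][1]) ** 2)
        regions[nearest].append((cx, cy))
    return regions

def lloyd_step(cells, robots):
    """One Lloyd iteration: move each robot to the centroid of its region."""
    regions = voronoi_partition(cells, robots)
    new_robots = []
    for i, pos in enumerate(robots):
        region = regions[i]
        if not region:
            new_robots.append(pos)  # empty region: robot stays put
            continue
        cx = sum(p[0] for p in region) / len(region)
        cy = sum(p[1] for p in region) / len(region)
        new_robots.append((cx, cy))
    return new_robots

# Illustrative 10x10 field with two robots starting in one corner; repeated
# Lloyd steps spread the robots until each covers roughly half the field.
field = [(x, y) for x in range(10) for y in range(10)]
robots = [(0.0, 0.0), (1.0, 0.0)]
for _ in range(20):
    robots = lloyd_step(field, robots)
```

Decentralized variants (as in the project's coverage-control papers under review) compute each Voronoi region locally from neighbor positions instead of from a global cell list.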

Impacts
What was accomplished under these goals?

IMPACT

This project aims to address the challenges of large heterogeneity and high complexity in agriculture automation by developing a new integrated system that seamlessly combines a scalable physical hardware platform, its simulator (referred to as a digital twin), and user interface (UI) modalities. This is a multifaceted, highly complex research problem requiring transdisciplinary approaches for technology development and field evaluation. This NRI project will serve as a critical pathway toward our long-term goal of establishing a deployable, easy-to-use robotic swarm system that can serve as a universal platform for broad agriculture applications. To do so, this project will 1) establish the technical and theoretical groundwork of deployable robotic swarms for human-swarm collaborative smart agriculture, 2) develop and integrate core component technologies, and 3) conduct comprehensive evaluations producing evidence-based knowledge. The proposed Configurable, Adaptive, and Scalable Swarm (CASS) system will bring societal impact and agricultural benefits by improving product quality and quantity while reducing cost and environmental impact. Significant research outcomes and progress for the three objectives are detailed below.

OBJECTIVE 1

Mobile platforms: We upgraded the current custom-developed robotic platform α-WaLTR by replacing (1) the fragile acrylic casing with carbon-fiber plates and (2) the embedded sensors with an integrated sensor suite containing GPS, IMU, and XBee modules for improved sensing capabilities for autonomous navigation. Four existing commercial UGV platforms have also been fully prepared with custom control boards and sensor suites. These UGV platforms were brought to field testing to identify locomotion challenges and address them in future development efforts. The robots performed well in most cases, except for long hay winding around the wheels of α-WaLTR.
The original wheels, designed for urban environments, are being modified to operate on diverse agricultural fields.

A robotic platform for herbicide spray: A software and hardware design was developed for automating micro-volume herbicide spraying using a mobile robot for early-stage weed control. The robotic system has a stereo camera, an inertial measurement unit, and multiple linearly actuating spray nozzles. The space occupied by weeds is represented as candidate line segments for spraying, from which a directed acyclic graph (DAG) is constructed to encode the feasible nozzle paths among weeds, and an optimal K-nozzle assignment and motion-planning problem is formulated for optimal coverage. The lateral errors of the herbicide spray were observed to be at sub-centimeter levels.

Wireless networking: Establishing reliable robot-to-robot (R2R) and hub-to-robot (H2R) wireless communication is one of the most important pieces of technical groundwork required for further technology development in both hardware and software. We tested the radio modules for signal quality and communication distance during our visit to the AgriLife Research Extension Center at Corpus Christi (Co-PI Landivar, Co-I Bhandari, Co-I Um). When a robot equipped with the ES device enters the cornfield, the communication cost increases significantly. The maximum connection distance in the cornfield was around 80 m, significantly shorter than the 200 m communication range expected in an open field. One solution to extend the communication range is to mount the ground station higher so that the signal covers the whole field.

Autonomous navigation system: The algorithm consists of perception modules, localization modules, path planning modules, and a control module. The modules in the red-dashed box are fully customized for our purpose. The modules in the blue-dashed box are partially modified from open-source software for our purpose.
The others are configured for autonomous navigation considering the hardware specifications of our mobile robots.

Terrain classification and navigation: For ground robots to navigate diverse agricultural environments, it is important that they perceive surface conditions and determine whether the surrounding environment is traversable. We first developed an algorithm that can be applied to common terrain types, using PSPNet to perform terrain classification. RTAB-Map, an open-source ROS package, is adopted for obstacle and ground detection using a 3D point cloud. An open-source EKF is used to estimate the robot pose from GPS, IMU, and wheel-encoder measurements. The GPS measures the longitude and latitude of the robot. The IMU measures the yaw/pitch/roll angles and rates, as well as the robot's vertical, lateral, and horizontal acceleration. The wheel encoder measures the wheel speed. The EKF integrates these measurements with the 3D kinematic model of a wheeled mobile robot.

OBJECTIVE 2

Mixed-reality (MR) based user interface (UI) development: Microsoft HoloLens was adopted as the primary UI device for this project. It provides the hardware for an MR-based UI, where the head-mounted display shows a graphical representation of holograms in front of the user, which can be interacted with for robot control. This platform, a UGV equipped with a robotic arm and a high-quality depth camera, can be used for several agricultural applications, including observation of livestock in remote locations and up-close inspection of produce in fields. We are currently working on a small-scale user evaluation study to gain feedback on the current UI system.

Development of a user interface for data transfer and visualization: A copy of the original UASHub (developed at Texas A&M AgriLife Research) was deployed to manage, visualize, and share the UAS datasets for this project.
This hub supports upload and download of raw and processed UAS data and products, visualization of geospatial data products such as orthomosaics and 3D point clouds, and general processing capabilities. A DJI Phantom 4 RTK UAS equipped with RGB sensors and RTK GPS was used to collect data at weekly intervals from an experimental cotton field located at Corpus Christi. The collected datasets were processed to generate 3D point clouds and DSMs to develop the digital twin of the cotton environment. Two mission plans, a grid mission plan and a double-grid mission plan, were tested to develop an efficient mission-planning approach for UAS to generate digital twin environments.

Improving georeferencing of UAS-based imagery data: Generally, Ground Control Points (GCPs) are placed at different sites when collecting UAS data to improve georeferencing and to allow analysis over time. However, in this case, we want to minimize or eliminate the use of GCPs. We tested the performance of a local RTK base station and the Texas Department of Transportation (TxDOT) Real-Time Network (RTN) with respect to geolocation accuracy. We found the TxDOT RTN to be as accurate as the local RTK base station, or better. Thus, we plan to utilize this network to collect UAS data in the 2023 cotton growing season and further evaluate its performance over time.

OBJECTIVE 3

We focused on preparing the necessary technical components to support our future evaluation studies. We identified the technology needs for the proposed case study. We purchased and installed 10 GPS collar tags (Moovement) for tracking the GPS locations of the cows. These tags require a station to connect to the internet.
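The EKF-based pose estimation described under terrain classification and navigation fuses GPS, IMU, and wheel-encoder measurements. A one-dimensional Kalman filter along a single axis illustrates the predict-with-encoder / correct-with-GPS cycle; this is a simplified stand-in for the full 3D EKF, and all noise variances and velocities here are assumed for illustration, not values from the project:

```python
class ScalarKF:
    """1-D Kalman filter: predict with wheel-encoder velocity, correct with GPS."""

    def __init__(self, x0=0.0, p0=1.0, q=0.05, r=4.0):
        self.x = x0   # position estimate (m)
        self.p = p0   # estimate variance
        self.q = q    # process noise per step (e.g., wheel slip)
        self.r = r    # GPS measurement noise variance

    def predict(self, v, dt):
        """Dead-reckon forward using encoder velocity v over dt seconds."""
        self.x += v * dt
        self.p += self.q

    def update(self, z):
        """Blend in a GPS position fix z via the Kalman gain."""
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Drive at a true 1 m/s for 10 s while the encoder under-reports speed;
# periodic GPS fixes pull the estimate back toward the true track.
kf = ScalarKF()
dead_reckon = 0.0
for t in range(1, 11):
    kf.predict(v=0.9, dt=1.0)   # biased encoder velocity
    dead_reckon += 0.9          # pure dead reckoning for comparison
    kf.update(z=float(t))       # GPS fix at the true position t
```

The full EKF replaces the scalar state with the 3D pose and linearizes the wheeled-robot kinematic model at each step, but the predict/update structure is the same.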

Publications