Progress 09/01/23 to 08/31/24
Outputs Target Audience:The target audience of this project includes farmers, researchers, and private industry. Organic farmers are in critical need of robust precision weed management solutions that limit excessive tillage and increase the competitiveness of the cash crop. These technologies must be affordable and accessible. By building image repositories for organic field crops and making them available to the research community, we will accelerate innovation around organic weed management. Lastly, by taking proof-of-concept technologies and operationalizing them for organic farmers, we will de-risk innovations that can be expanded on by private industry. In the second reporting period of this project, team members presented at events including state and national soybean board workshops, the Weed Science Society of America annual meeting, and USDA ARS leadership meetings and departmental seminars. PI Mirsky gave keynote presentations at several commodity-based, farmer-focused events, technology conferences, government agency events, and farmer field days. All of the aforementioned events were attended by varying combinations of farmers, researchers, and private industry personnel. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?Several graduate students and postdoctoral researchers are involved in the project. We also present at annual conferences and host hackathons that provide professional development for graduate students. How have the results been disseminated to communities of interest?Objective 5. Facilitate knowledge exchange about alternative weed control tactics using robotics among organic farmers, researchers, and educators. Team members presented at events including state and national soybean board workshops, the Weed Science Society of America annual meeting, and USDA ARS leadership meetings and departmental seminars. PI Mirsky gave keynote presentations at several commodity-based, farmer-focused events, technology conferences, government agency events, and farmer field days. All of the aforementioned events were attended by varying combinations of farmers, researchers, and private industry personnel. Work was also disseminated through journal publications as noted in the Products section of this report. What do you plan to do during the next reporting period to accomplish the goals?We will continue research and development on training data and sprayer technology and continue professional development activities.
Impacts What was accomplished under these goals?
We designed and built a computer vision system, including a machine vision camera, lens, artificial light, computational platform, GPS receiver, 5G modem, and height sensor. We integrated system firmware and software with these hardware components to create a working prototype. We then iteratively optimized the camera parameters with a focus on travel speed, available light, and the range of weed sizes. Finally, we installed server infrastructure and set up remote access to the camera system during operation, as well as automatic data upload.

With this system, we collected a representative dataset of crops (maize, beets) under Danish field conditions. The system was mounted on four different tractors and two full-size autonomous robots (Robotti and Farmdroid). The collected dataset consists of approximately 200,000 images of 16 megapixels each. We then annotated 2,500 of these images on a per-plant basis using bounding boxes and per-pixel masks with human-in-the-loop methods. Specifically, an existing YOLO-based detection model was used to populate the images with suggested plant bounding boxes. After manual refinement of the boxes, a modified SAM (Segment Anything Model) was used to create species-specific instance masks of all the plants in the annotated dataset (a minimal sketch of this box-to-mask flow is given at the end of this subsection).

Based on this training dataset, a series of instance segmentation models were developed and trained. These models were trained to differentiate between major species groups within the annotated dataset, specifically monocot weeds, dicot weeds, maize, beets, cereals, beans, lupins, buckwheat, and peas. Utilizing the computational platform within the camera and optimization libraries (PyTorch, NVIDIA TensorRT), the trained models were optimized to run in real time under field conditions with more than 100% image overlap at the designated travel speed of 6 km/hour, thus providing an accurate map of all plants within the traversed area.

With the developed camera system we are able to capture approximately 10 images per second. Depending on the driving speed, each plant can be imaged at least three times, providing varied perspective views of each plant instance. To exploit this, we developed a tracking module to 1) detect crop rows and inter-rows to boost classification accuracy, 2) keep track of each individual plant for accurate mapping and spraying, and 3) make use of multi-perspective views of plants to increase detection and species classification accuracy. With this tracking system, we have shown an increase in the stability of the computer vision algorithm and resolved issues stemming from detection uncertainty in overlapping image regions.

At the Beltsville, Maryland location, we collected images of corn and common summer annual weed species relevant to the Maryland region for the image repository. At the North Carolina State University location, we extended evaluation of a mobile inkjet sprayer for targeted application of microvolume herbicides and collected images of soybean and common summer annual weed species relevant to the North Carolina region. At the Texas A&M University site, we developed a multi-nozzle spray system, including the electronic controller for the sprayer, and analyzed sprayer output; we also collected images of cotton and common summer annual weed species relevant to the Texas region.

Services you provided (e.g., consulting, counseling, tutoring): N/A
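To illustrate the box-to-mask annotation flow described above, the following is a minimal Python sketch, not the project's production pipeline: it uses an off-the-shelf YOLO detector to propose plant bounding boxes and Meta's Segment Anything Model to turn the (manually refined) boxes into per-plant instance masks. The weight files, image path, and confidence threshold are illustrative placeholders.

import cv2
from ultralytics import YOLO
from segment_anything import sam_model_registry, SamPredictor

# Load an off-the-shelf YOLO detector and SAM (weight paths are placeholders).
detector = YOLO("plant_detector.pt")
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

image_bgr = cv2.imread("field_image.jpg")
image_rgb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)

# Step 1: propose per-plant bounding boxes (these are refined by a human annotator).
boxes = detector.predict(image_rgb, conf=0.25)[0].boxes.xyxy.cpu().numpy()

# Step 2: prompt SAM with each (refined) box to obtain a per-plant instance mask.
predictor.set_image(image_rgb)
masks = []
for box in boxes:
    mask, _, _ = predictor.predict(box=box, multimask_output=False)
    masks.append(mask[0])  # boolean HxW array for this plant instance
print(f"{len(masks)} instance masks generated for review")

In practice the manual box refinement happens in an annotation tool between the two stages; the sketch only shows the automated ends of the loop.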
Objective 1. Expand and adapt existing machine vision. See the description of activities for Denmark (above) for an overview of the machine vision work. Supporting efforts continued among the Maryland, North Carolina, and Texas partners to build and expand a national image repository. This data collection used a high-resolution camera system mounted on a robotic platform arm that, on a daily or bi-daily basis, traversed a large outdoor potting table containing corn, cotton, soybean, and common summer annual weed seedlings. By seeding crops and weeds into sterile soil, we know the correct species of each emerging plant from the seedling stage onward, which ensures accurate image annotation.

Objective 2. Develop a low-volume organic herbicide sprayer prototype that targets intra-row (0-15 cm) weeds (cotyledon size and larger). The Denmark team has constructed a field-deployable cell sprayer with 25 nozzles. The working width of the sprayer is 1 meter, corresponding to the field of view of the camera system, orthogonal to the driving direction. It will be field-tested in the 2025 spring season in combination with the computer vision system. The Texas A&M collaborator has developed a multi-nozzle sprayer system. Each nozzle provides approximately one-centimeter spray resolution, and the valve response time is two milliseconds. The system integrates a computer vision system for real-time weed detection, nozzle control circuits, and accompanying hardware, including the nozzle mount, accumulator tank, and power supply. The entire setup is mounted on a farm-ng Amiga® robot to test its functionality. Experiments were conducted to synchronize the vision system, ground robot, and sprayer for precise microdose application of herbicides (a simplified timing sketch appears at the end of this section). The system will be tested in varied agricultural systems in 2025.

Objective 3. Field test and iteratively improve the smart sprayer system on multiple robotic and tractor-mounted platforms. As noted above, the current computer vision part of the smart sprayer system has been successfully deployed on two full-size robots as well as four different tractors without issues. This was achieved with minimal dependence on the tool carrier: the only external requirements are 12 V power and an A-mount. To simplify installation, field tests were powered by a portable car battery, giving an installation time of about five to ten minutes on any of the tractors or robots.

Objective 4. Create a vision-based "look back" or reassessment system(s) within 1.5 years to enable a camera to assess accuracy of the targeting system and provide system optimization data to continuously retrain the system. With the chosen components (dual RTK GPS, lidar-based height sensor), we have demonstrated the ability to georeference each individual plant in the traversed area of the field. Because of this GPS positioning capability, plants can later be re-identified across multiple passes. With this in mind, we have not yet installed a second "look back" camera; the system, however, is designed with the option of expanding the number of cameras from one to two. The team has also experimented with another approach involving extension of the single camera's field of view in the driving direction to overlap with the spray zone. This would allow a single camera to be used for crop and weed detection while simultaneously using the lowest part of the images to inspect sprayed areas.
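To make the sprayer synchronization under Objective 2 concrete, the timing sketch below (Python) maps a detected weed to a nozzle index and a valve opening time. The ~1 cm nozzle resolution, ~2 ms valve response, and 6 km/h travel speed are taken from the work described above; the nozzle count, camera-to-boom offset, and coordinate conventions are hypothetical placeholders rather than the project's actual controller design.

# Minimal timing sketch (illustrative values, not the project's controller).
NOZZLE_SPACING_M = 0.01      # ~1 cm spray resolution per nozzle (from the prototype above)
NUM_NOZZLES = 100            # hypothetical nozzle count for a 1 m working width
CAMERA_TO_NOZZLE_M = 0.50    # hypothetical forward offset between camera line and boom
VALVE_RESPONSE_S = 0.002     # ~2 ms valve response time (from the prototype above)

def schedule_spray(weed_x_m, weed_y_m, speed_m_s, t_detect_s):
    """Return (nozzle_index, valve_open_time_s) for one detected weed.

    weed_x_m   -- lateral position across the working width (0 = left edge)
    weed_y_m   -- forward distance from the camera line to the weed at detection
    speed_m_s  -- current ground speed
    t_detect_s -- timestamp of the detection
    """
    nozzle = min(int(weed_x_m / NOZZLE_SPACING_M), NUM_NOZZLES - 1)
    travel_m = CAMERA_TO_NOZZLE_M - weed_y_m   # distance left before the weed reaches the boom
    open_time = t_detect_s + travel_m / speed_m_s - VALVE_RESPONSE_S
    return nozzle, open_time

# Example: weed 0.37 m from the left edge, 0.10 m ahead of the camera line,
# travelling at 6 km/h (about 1.67 m/s).
print(schedule_spray(0.37, 0.10, 6 / 3.6, t_detect_s=0.0))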
Publications
- Type:
Conference Papers and Presentations
Status:
Submitted
Year Published:
2024
Citation:
Jørgensen, R.N., Skovsen, S.K., Green, O., Sørensen, G. 2024. Enhancing Precision Agriculture Through Dual Weed Mapping: Delineating Inter and Intra-row Weed Populations for Optimized Crop Protection. 16th International Conference on Precision Agriculture.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Madsen, M.S.N., Skovsen, S.K., Melander, B., Jørgensen, R.N. 2024. Enhancing Precision Weeding with YoloV11 Object Tracking for Robust Early Crop Detection as a Foundation for Future Organic Precision Spraying Trials. 5th NJF - EurAgEng - Agromek Joint Seminar.
- Type:
Conference Papers and Presentations
Status:
Submitted
Year Published:
2024
Citation:
Gurjar, B., Kumar, S., Johnson, J., Hardin, R. G., & Bagavathiannan, M. (2023, October). Designing and Testing a Micro-Volume Spray System for Site-Specific Herbicide Application Using Ground Robots. In ASA, CSSA, SSSA International Annual Meeting. ASA-CSSA-SSSA.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Mirsky et al. Precision organic weed management; 18 seminars, webinars, government agency presentations, stakeholder groups, and private industry.
Progress 09/01/22 to 08/31/23
Outputs Target Audience:The target audience of this project includes farmers, researchers, and private industry. Organic farmers are in critical need of robust precision weed management solutions that limit excessive tillage and increase the competitiveness of the cash crop. These technologies must be affordable and accessible. By building image repositories for organic field crops and making them available to the research community, we will accelerate innovation around organic weed management. Lastly, by taking proof-of-concept technologies and operationalizing them for organic farmers, we will de-risk innovations that can be expanded on by private industry. In the first reporting period of this project, team members presented at events including state and national soybean board workshops, the August 2023 Keck/NASA Jet Propulsion Lab carbon workshop, the Weed Science Society of America annual meeting, the American Chemical Society Spring 2023 conference, a Northeast Climate Hub meeting, a Southeast Climate Hub meeting, and USDA ARS leadership meetings and departmental seminars. PI Mirsky gave keynote presentations at the National Predictive Modeling Tool Initiative (Feb. 2023) and the International Agro-Geoinformatics 2023 conference; he also participated in the Sustainable Precision Agriculture in the Era of IoT and AI BARD/NSF workshop and the Center for AI Image Repo workshop. All of the aforementioned events were attended by varying combinations of farmers, researchers, and private industry personnel. Changes/Problems:Work was delayed for some project objectives due to difficulty putting in place the agreement between USDA and Aarhus University. However, we still expect to meet our objectives by the end of the project. What opportunities for training and professional development has the project provided?Project leadership and staff participated in an intensive five-day webinar series on the beta-testing of technology led by user-testing experts from Game Theory, a Vermont-based software development collective. This webinar series was followed by an in-person training event at North Carolina State University in June 2023. The project also hosted a visiting Fulbright postdoctoral researcher from Australia. This enabled professional development of the Australian Fulbright researcher working with the US and Danish teams, as well as promoting cross-cultural exchange and further international collaboration. How have the results been disseminated to communities of interest?In the first reporting period of this project, team members presented at events including state and national soybean board workshops, the August 2023 Keck/NASA Jet Propulsion Lab carbon workshop, the Weed Science Society of America annual meeting, the American Chemical Society Spring 2023 conference, a Northeast Climate Hub meeting, a Southeast Climate Hub meeting, and USDA ARS leadership meetings and departmental seminars. PI Mirsky gave keynote presentations at the National Predictive Modeling Tool Initiative (Feb. 2023) and the International Agro-Geoinformatics 2023 conference; he also participated in the Sustainable Precision Agriculture in the Era of IoT and AI BARD/NSF workshop and the Center for AI Image Repo workshop. All of the aforementioned events were attended by varying combinations of farmers, researchers, and private industry personnel. Work was also disseminated through journal publications as noted in the Products section of this report.
What do you plan to do during the next reporting period to accomplish the goals?During the next reporting period we plan to expand on the computer vision work to develop a real-time-capable computer vision model. The model will initially be developed and tested for corn and will differentiate between corn and emerging weeds at the early stages of crop growth (0-15 cm tall). Using the NVIDIA Jetson Orin AGX platform, we intend to train a YOLOv8 model capable of analyzing high-resolution color images fast enough for spraying directly after detection. We plan to integrate weed detection logging using RTK-GPS, which will later support logging of spray patterns as well. We also intend to conduct field testing, which would put us ahead of the official project timeline. This field testing will refine real-time detection of weeds using the developed computer vision system. With regard to sprayer-centric objectives, greenhouse experiments will be carried out to evaluate sprayer efficacy on emerging weeds and fine-tune the integration of the spray system, machine vision, and targeting algorithm. This includes 1) choice of organic herbicide, 2) hit rate of the nozzle system, and 3) accuracy of the computer vision system for delivering control signals to the nozzles.
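As an illustration of the planned detector workflow described above, the sketch below (Python, using the ultralytics library) trains a YOLOv8 model on annotated corn/weed images, exports it as a TensorRT engine for the Jetson Orin AGX, and appends detections to a georeferenced log. The dataset configuration, base weights, hyperparameters, and log format are placeholders, not the project's final choices.

# Hedged sketch of the planned workflow, not final project code.
import datetime
from ultralytics import YOLO

model = YOLO("yolov8s.pt")                                    # pretrained starting point (placeholder)
model.train(data="corn_weeds.yaml", epochs=100, imgsz=1280)   # hypothetical dataset config and settings
model.export(format="engine", device=0, half=True)            # TensorRT engine for the Jetson Orin AGX

def log_detection(det_class, confidence, box_xyxy, gps_fix, logfile="weed_log.csv"):
    """Append one weed detection with its RTK-GPS fix (hypothetical log format)."""
    with open(logfile, "a") as f:
        f.write(f"{datetime.datetime.utcnow().isoformat()},{gps_fix['lat']},{gps_fix['lon']},"
                f"{det_class},{confidence:.2f},{','.join(str(int(v)) for v in box_xyxy)}\n")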
Impacts What was accomplished under these goals?
Objective 1. Expand and adapt existing machine vision. Efforts continued among the Maryland, North Carolina, and Texas partners to build and expand upon a national image repository. This data collection involved a high-resolution camera system mounted on a robotic platform arm that, on a daily or bi-daily basis, traversed a large outdoor potting table containing corn, cotton, soybean, and common summer annual weed seedlings. Using this system, an extensive image dataset of the three crops and weeds was created. By seeding crops and weeds into sterile soil, we know the correct species of each emerging plant from the seedling stage onward. Denmark has been part of this effort on the image processing side, transforming the collected images into a readily usable dataset. The Denmark and NCSU collaborators worked together to develop a computer vision system prototype consisting of a weatherproof machine vision camera, machine vision lens, and LED-based ring flash. The computational platform, power supply, and GPS components must still be ruggedized. The Denmark, Maryland, and North Carolina team members collaborated to collect field images in the three different climates/regions. While each camera system was built on the same principle, they differed in expected image quality and field of view.

Objective 2. Develop a low-volume organic herbicide sprayer prototype that targets intra-row (0-15 cm) weeds (cotyledon size and larger). NCSU explored several off-the-shelf hardware options for precise, low-volume application of liquid products that could be adapted to fulfill project needs. Most work has revolved around adapting an inkjet printhead (from a large-character, drop-on-demand printing system used for coding and marking text, dates, or logos) to apply herbicide. The printhead consists of 32 linearly arranged inkjet valves and nozzles, spaced 4.5 mm apart, that can be individually controlled. Experiments have been conducted to 1) evaluate the ability to control the nozzles accurately, in real time, from a computational platform, including integration with a camera and machine vision system; 2) characterize the performance of the printhead across a range of pressures, valve open times, ground heights, and ground velocities; 3) determine the accuracy with which the printhead can hit various arrangements of targets; and 4) characterize the dispersed droplet morphologies (with high-speed imagery) and the effectiveness of different spray configurations on target coverage. This prototype test system was mounted on a farm-ng Amiga platform for lab and field experiments. The Texas A&M collaborator worked extensively to design a multi-nozzle sprayer from the ground up using off-the-shelf nozzles. This included development of control circuits and the accompanying hardware (nozzle mount, accumulator tank, power supply). Experiments were then carried out to evaluate the usability of the sprayer for microdose application of herbicides.

Objective 3. Field test and iteratively improve the smart sprayer system on multiple robotic and tractor-mounted platforms. As noted above, field tests of the computer vision system were carried out as part of field image collection.
This included evaluation of the camera-related components, specifically investigating whether the artificial light source provided enough light to 1) reliably collect images at a fast shutter speed to avoid motion blur while driving and 2) sufficiently brighten hard shadows so that weeds can be detected under bright, sunlit conditions (an illustrative exposure-time calculation is sketched at the end of this section). We also evaluated the performance of the camera, lighting, and computing integration hardware when operating under field conditions.

Objective 4. Have a vision-based "look back" or reassessment system(s) within 1.5 years to enable a camera to assess accuracy of the targeting system and provide system optimization data to continuously retrain the system. We have not yet progressed to this objective, which relies on observation of a field-deployed sprayer planned for upcoming reporting periods.

Objective 5. Facilitate knowledge exchange about alternative weed control tactics using robotics among organic farmers, researchers, and educators. In the first reporting period of this project, team members presented at events including state and national soybean board workshops, the August 2023 Keck/NASA Jet Propulsion Lab carbon workshop, the Weed Science Society of America annual meeting, the American Chemical Society Spring 2023 conference, a Northeast Climate Hub meeting, a Southeast Climate Hub meeting, and USDA ARS leadership meetings and departmental seminars. PI Mirsky gave keynote presentations at the National Predictive Modeling Tool Initiative (Feb. 2023) and the International Agro-Geoinformatics 2023 conference; he also participated in the Sustainable Precision Agriculture in the Era of IoT and AI BARD/NSF workshop and the Center for AI Image Repo workshop. All of the aforementioned events were attended by varying combinations of farmers, researchers, and private industry personnel. Work was also disseminated through journal publications as noted in the Products section of this report.
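To illustrate the shutter-speed constraint evaluated under Objective 3 above, the back-of-the-envelope sketch below (Python) shows why the artificial light source matters: at working speed the exposure must be very short to keep motion blur within a small pixel budget. The ground sample distance, blur tolerance, and 6 km/h speed are assumed values for illustration, not measured system parameters.

# Illustrative only: assumed ground sample distance and blur tolerance.
GSD_M_PER_PX = 0.0005   # assume 0.5 mm of ground per pixel
MAX_BLUR_PX = 2.0       # assume at most 2 pixels of motion blur is acceptable

def max_exposure_s(speed_m_s):
    """Longest exposure (s) that keeps motion blur within MAX_BLUR_PX at this speed."""
    return MAX_BLUR_PX * GSD_M_PER_PX / speed_m_s

# At an assumed working speed of 6 km/h (about 1.67 m/s) this allows roughly a
# 1/1700 s exposure, short enough that an LED ring flash is needed to keep
# images bright under variable field light.
print(max_exposure_s(6 / 3.6))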
Publications
- Type:
Journal Articles
Status:
Published
Year Published:
2022
Citation:
Dobbs, A.M., Ginn, D., Skovsen, S.K., Bagavathiannan, M.V., Mirsky, S.B., Reberg-Horton, C.S. and Leon, R.G. 2022. New directions in weed management and research using 3D imaging. Weed Science 70:641-647.
- Type:
Journal Articles
Status:
Published
Year Published:
2022
Citation:
Hu, C., Xie, S., Song, D., Thomasson, J.A., Hardin IV, R.G. and Bagavathiannan, M. 2022. Algorithm and system development for robotic micro-volume herbicide spray towards precision weed management. IEEE Robotics and Automation Letters 7(4): 11633-11640.
- Type:
Journal Articles
Status:
Published
Year Published:
2022
Citation:
Coleman, G.R., Bender, A., Hu, K., Sharpe, S.M., Schumann, A.W., Wang, Z., Bagavathiannan, M.V., Boyd, N.S. and Walsh, M.J. 2022. Weed detection to weed recognition: reviewing 50 years of research to identify constraints and opportunities for large-scale cropping systems. Weed Technology 36(6): 741-757.
- Type:
Journal Articles
Status:
Published
Year Published:
2022
Citation:
Sapkota, B.B., Popescu, S., Rajan, N., Leon, R.G., Reberg-Horton, C., Mirsky, S. and Bagavathiannan, M.V. 2022. Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton. Scientific Reports 12(1):19580.