Progress 11/01/23 to 10/31/24
Outputs Target Audience: We have participated in local and regional events attended by other researchers beyond our team who work on similar topics, as well as various stakeholders. These include (1) a meeting organized by UC ANR at the Desert Station, where we provided demos of some of the technology developed in this project, (2) a meeting organized at UC Davis (jointly with an internally-supported project in collaboration with UC Merced, UC Davis and UC Berkeley), and (3) a meeting organized at UCR within the Center of Robotics and Intelligent Systems. We also organized a special track on agricultural robotics and vision as part of the 2024 ISVC conference, continuing a successful track organized last year in the same venue. We have published and presented three conference papers (2024 IEEE ICRA international conference, 2024 ASME AIM international conference, and Springer 2024 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners. Two book chapters (Elsevier Academic Press "Hyperautomation in Precision Agriculture" and CRC Press "Mobile Robots for Digital Farming") have been published. Additionally, three journal papers (all in Elsevier Computers and Electronics in Agriculture) have been published. Furthermore, we have disclosed to UCR new IP outcomes, which are currently undergoing marketability analysis. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? A total of eight PhD students (Amel Dechemi, Hanzhe Teng, Dimitris Chatziparaschis, Cody Simons, Pamodya Peiris, Aritra Samanta, Jingzong Zhou, and Xiao'ao Song) and one postdoctoral researcher (Caio Mucchiani) have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences, ranging from other researchers to growers; 4) interaction with stakeholders. Three of the PhD students have now graduated (Dechemi, Teng, and Simons). How have the results been disseminated to communities of interest? We have participated in local and regional events attended by other researchers beyond our team who work on similar topics, as well as various stakeholders. These include (1) a meeting organized by UC ANR at the Desert Station, where we provided demos of some of the technology developed in this project, (2) a meeting organized at UC Davis (jointly with an internally-supported project in collaboration with UC Merced, UC Davis and UC Berkeley), and (3) a meeting organized at UCR within the Center of Robotics and Intelligent Systems. We also organized a special track on agricultural robotics and vision as part of the 2024 ISVC conference, continuing a successful track organized last year in the same venue. We have published and presented three conference papers (2024 IEEE ICRA international conference, 2024 ASME AIM international conference, and Springer 2024 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners.
All these papers were presented in the community. Presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more about these efforts. Lastly, we continued open-sourcing different types of data we collect in the field, developed digital twins, as well as algorithms, most notably the research dataset and code for our 2025 Computers and Electronics in Agriculture journal paper. What do you plan to do during the next reporting period to accomplish the goals? We are currently working on deploying a user-centered automated pressure chamber, offering the capability to make quick SWP measurements. Our method leverages visual learning and mobile application development. We hope that it can also lend itself as a useful training tool for future workers and specialists conducting SWP measurements in orchards.
Impacts What was accomplished under these goals?
In this period we prioritized work under all four objectives. We developed an improved prototype of an automated pressure chamber (Elsevier CompAg 2024 papers) and deployed advanced visual learning algorithms to detect instances of dry/bubbling/wet xylem in video footage (Springer ISVC paper). We focused on end-effector design (2025 book chapter and IEEE/ASME AIM paper) as well as different aspects of autonomy, including field testing and deployment (2024 book chapter), multi-sensor fusion (IEEE ICRA paper), and mapping and odometry estimation (Elsevier CompAg 2025 paper). The dataset, developed algorithm, and comparisons with baselines of the latter work have been open-sourced (https://github.com/UCR-Robotics/AG-LOAM) to further stimulate and support relevant agricultural robotics and automation technology research.
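To illustrate the kind of temporal reasoning a video-based xylem wetness pipeline involves (a hypothetical sketch for the reader, not the classifier from the ISVC paper; function names are invented for illustration): once a per-frame model emits dry/bubbling/wet labels, a simple majority-vote smoother can suppress spurious single-frame flips before the first sustained "wet" frame is taken as the endpoint cue for the SWP reading.

```python
from collections import Counter

LABELS = ("dry", "bubbling", "wet")

def smooth_labels(frame_labels, window=5):
    """Majority-vote smoothing over a sliding window of per-frame predictions."""
    smoothed = []
    half = window // 2
    for i in range(len(frame_labels)):
        lo = max(0, i - half)
        hi = min(len(frame_labels), i + half + 1)
        votes = Counter(frame_labels[lo:hi])
        smoothed.append(votes.most_common(1)[0][0])
    return smoothed

def first_wet_frame(smoothed):
    """Index of the first smoothed 'wet' frame, or None if never reached."""
    for i, lab in enumerate(smoothed):
        if lab == "wet":
            return i
    return None
```

For example, an isolated "wet" misclassification early in the video is voted away by its "dry" neighbors, so the endpoint is only declared once "wet" persists across the window.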
Publications
- Type:
Book Chapters
Status:
Published
Year Published:
2025
Citation:
Dechemi, A., Hale, T.O., Eng, C. and Karydis, K., 2025. Design, integration, and field evaluation of a robotic leaf cutting and retrieving system. In Hyperautomation in Precision Agriculture (pp. 203-215). Academic Press.
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2025
Citation:
Teng, H., Wang, Y., Chatziparaschis, D. and Karydis, K., 2025. Adaptive LiDAR odometry and mapping for autonomous agricultural mobile robots in unmanned farms. Computers and Electronics in Agriculture, 232, p.110023.
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2024
Citation:
C. Mucchiani and K. Karydis, Development of an Automated and Artificial Intelligence Assisted Pressure Chamber for Stem Water Potential Determination. Computers and Electronics in Agriculture, 2024, vol. 222, pp. 109016.
- Type:
Peer Reviewed Journal Articles
Status:
Published
Year Published:
2024
Citation:
C. Mucchiani, D. Zaccaria, and K. Karydis, Assessing the potential of integrating automation and artificial intelligence across sample-destructive methods to determine plant water status: A review and score-based evaluation. Computers and Electronics in Agriculture, 2024, vol. 224, pp. 108992.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
J. Zhou, X. Song, and K. Karydis, Design of an End-effector with Application to Avocado Harvesting. In IEEE/ASME Int. Conf. on Advanced Intelligent Mechatronics (AIM), 2024, pp. 1241-1246.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
D. Chatziparaschis, H. Teng, Y. Wang, P. Peiris, E. Scudiero, and K. Karydis, On-the-Go Tree Detection and Geometric Traits Estimation with Ground Mobile Robots in Fruit Tree Groves. In IEEE Int. Conf. on Robotics and Automation (ICRA), 2024, pp. 15840-15846.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2025
Citation:
Peiris, P., Samanta, A., Mucchiani, C., Simons, C., Roy-Chowdhury, A., Karydis, K. (2025). Vision-Based Xylem Wetness Classification in Stem Water Potential Determination. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2024. Lecture Notes in Computer Science, vol 15047. Springer, Cham.
- Type:
Book Chapters
Status:
Published
Year Published:
2024
Citation:
Chatziparaschis, D., Scudiero, E. and Karydis, K., 2024. Robot-assisted soil apparent electrical conductivity measurements in orchards. In Mobile Robots for Digital Farming (pp. 55-88). CRC Press.
Progress 11/01/22 to 10/31/23
Outputs Target Audience: We have participated in local and regional events attended by other researchers beyond our team who work on similar topics, as well as various stakeholders. These include a meeting we organized at UCR (jointly with an internally-supported project in collaboration with UC Merced, UC Davis and UC Berkeley), a field data collection together with Dr. Robert Krueger, Research Leader of the USDA-ARS National Clonal Germplasm Repository for Citrus & Dates, stationed at UCR, as well as a field data collection in the Gallo Vineyard, where we worked together with Gallo's agricultural engineering researchers. We also organized a workshop as part of the 2023 IEEE/RSJ IROS conference (https://sites.google.com/view/agrobotics) that was received very well by the agricultural robotics and automation technology researchers attending the workshop (both from industry and academia). We have published and presented two conference papers (2023 IEEE/CVF WACV international conference and Springer 2023 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners. For the latter we also organized a special track focusing on datasets for agriculture. Presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more. One additional journal paper (IEEE RAM) has been published, and was invited for presentation at the 2024 IEEE ICRA. Furthermore, we have disclosed to UCR new IP outcomes, which are currently undergoing marketability analysis (to be reported in detail in the next period). Changes/Problems: Because of delays caused by COVID-19 lab closures, some experimental testing was completed during this review period, which caused delays in later tasks focusing on data analysis and machine intelligence. We are now on track to complete these tasks as well, although a short no-cost extension may be needed.
What opportunities for training and professional development has the project provided? Four PhD students (Amel Dechemi, Hanzhe Teng, Dimitris Chatziparaschis, Cody Simons) and one postdoctoral researcher (Caio Mucchiani) have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences, ranging from other researchers to growers; 4) interaction with stakeholders. How have the results been disseminated to communities of interest? We organized a stakeholder meeting at UCR (jointly with an internally-supported project in collaboration with UC Merced, UC Davis and UC Berkeley), a field data collection together with Dr. Robert Krueger, Research Leader of the USDA-ARS National Clonal Germplasm Repository for Citrus & Dates, stationed at UCR, as well as a field data collection in the Gallo Vineyard, where we worked together with Gallo's agricultural engineering researchers. We also organized a workshop as part of the 2023 IEEE/RSJ IROS conference (https://sites.google.com/view/agrobotics) that was received very well by the agricultural robotics and automation technology researchers attending the workshop (both from industry and academia). We have published and presented two conference papers (2023 IEEE/CVF WACV international conference and Springer 2023 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners. For the latter we also organized a special track focusing on datasets for agriculture. Presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more.
One additional journal paper (IEEE RAM) has been published, and was invited for presentation at the 2024 IEEE ICRA. Furthermore, we have open-sourced a dataset collected on citrus tree farms (https://ucr-robotics.github.io/Citrus-Farm-Dataset/) to further stimulate and support relevant agricultural robotics and automation technology research. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period we plan to continue our efforts under all objectives. Specifically, we are planning to fully automate the pressure chamber method while deploying stronger machine learning-based tools for visual detection of xylem status. We are also seeking to deploy our mobile manipulation robots in larger field settings to sample leaves at faster rates than our current prototypes.
Impacts What was accomplished under these goals?
In this period we prioritized work under all four objectives. We developed an initial prototype of a machine-vision-enhanced pressure chamber, and demonstrated how visual learning can be employed to detect instances of dry/wet xylem in video footage (IEEE RAM paper). We also focused on deploying planners developed at the project's collaborating site (UC Merced) on physical robots (a mobile manipulator) in the field to sample leaves (IEEE RAM paper). Further, we developed vision-based tools in support of autonomous robot navigation (IEEE/CVF WACV paper), and generated a large multi-modal dataset taken in citrus tree farms in support of machine vision, artificial intelligence, and autonomous robot navigation (Springer ISVC paper). We have made the dataset publicly available (https://ucr-robotics.github.io/Citrus-Farm-Dataset/) to further stimulate and support relevant agricultural robotics and automation technology research.
Publications
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
Hanzhe Teng, Yipeng Wang, Xiaoao Song, and Konstantinos Karydis. "Multimodal Dataset for Localization, Mapping and Crop Monitoring in Citrus Tree Farms." In International Symposium on Visual Computing, pp. 571-582. Cham: Springer Nature Switzerland, 2023.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
Hanzhe Teng, Dimitrios Chatziparaschis, Xinyue Kan, Amit K. Roy-Chowdhury, and Konstantinos Karydis. "Centroid distance keypoint detector for colored point clouds." In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, pp. 1196-1205. 2023.
- Type:
Journal Articles
Status:
Published
Year Published:
2023
Citation:
Amel Dechemi, Dimitrios Chatziparaschis, Joshua Chen, Merrick Campbell, Azin Shamshirgaran, Caio Mucchiani, Amit Roy-Chowdhury, Stefano Carpin, and Konstantinos Karydis. "Robotic Assessment of a Crops Need for Watering: Automating a Time-Consuming Task to Support Sustainable Agriculture." IEEE Robotics & Automation Magazine, vol. 30, no. 4, pp. 52-67, Dec. 2023.
Progress 11/01/21 to 10/31/22
Outputs Target Audience: This project relies on a strong collaboration with the Almond Board of California (www.almonds.com). During this second year we have engaged with scientists and practitioners from the board to keep our research aligned with the interests of the growers they represent. In particular, on May 13, 2022 we visited an almond orchard located near Merced (CA) and, together with the owner, actively took part in a complete set of pressure chamber measurements throughout the orchard. This process has allowed us to refine our approach for the selection of the sampling locations. In this reporting period we published and presented one paper in a major venue (IEEE/RSJ IROS international conference) attended by robotics and automation researchers and practitioners, which features segments of the agricultural robotics and technology that this award pertains to. The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more. Furthermore, we have pushed forward two patent applications. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Three PhD students (Amel Dechemi, Keran Ye, Cody Simons), two MS students (Merrick Campbell, Joshua Chen; both graduated in this reporting period), and one postdoctoral researcher (Caio Mucchiani) have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences, ranging from other researchers to growers; 4) interaction with stakeholders. How have the results been disseminated to communities of interest? One paper has been presented at the IEEE/RSJ IROS 2022 conference. The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone. We have also been discussing regularly with the Almond Board of California, whereby we gather their feedback via short demos and seek to acquire access to commercial farms to test our technology. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period we plan to continue our efforts under objectives 1 and 2, as well as initiate controlled field experiments that will guide further development. In addition, a prototype of our current leaf extraction mechanism will be replicated at the UC Merced site so that we can conduct multiple field experiments for distinct tree crops at faster rates. A large dataset is an asset for visual analysis, and being able to expedite data collection will also have positive effects on objective 2 activities. Finally, with our commercial partner we will continue to engage almond growers to get domain feedback about our proposed solution.
Impacts What was accomplished under these goals?
In this period we prioritized work under objectives 1 and 2 in multiple concurrent directions. The one most developed included robotic leaf extraction (IEEE/RSJ IROS conference paper and one submitted patent application). At the same time, we conducted field experiments of stem water potential measurements manually and collected a first dataset to enable automated visual sensing of water expression at the cut stem. Initial findings were submitted to a conference but the paper was not accepted; we are currently revising this work to include the automated pressure chamber design, and results are forthcoming. One more patent application, on automated soil salinity estimation, was also submitted in this period.
Publications
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2022
Citation:
M. Campbell, A. Dechemi, and K. Karydis, An Integrated Actuation-Perception Framework for Robotic Leaf Retrieval: Detection, Localization, and Cutting. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2022, pp. 9210-9216.
Progress 11/01/20 to 10/31/21
Outputs Target Audience: In this reporting period we published and presented papers in major venues attended by robotics and automation researchers and practitioners, which feature segments of the agricultural robotics and technology that this award pertains to. The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more. Furthermore, we started investigating the potential for tech transfer, and as such we interacted with several stakeholders (practitioners, ag advisors, and growers) as part of customer discovery for an NSF I-Corps program at UCR we participated in. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Two PhD students, one MS student, one undergraduate student and one postdoctoral researcher have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences, ranging from other researchers to growers; 4) interaction with stakeholders. How have the results been disseminated to communities of interest? The research papers have been presented at their respective conferences (the IEEE RA-L journal paper was jointly accepted and presented at the IEEE ICRA 2021 conference). The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone. Furthermore, results have been communicated directly to stakeholders as part of our customer discovery for an NSF I-Corps program at UCR we participated in. We have also been discussing regularly with the Almond Board of California, whereby we gather their feedback via short demos and seek to acquire access to commercial farms to test our technology. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period we plan to focus primarily on Objectives 1, 2 and 4, so that we can create the physical sampling and automation tools which can then be integrated in a refined multi-robot coordination and planning framework (Objective 3).
Impacts What was accomplished under these goals?
In this period we prioritized work under objective 3 (published IEEE RA-L journal paper), which determines how to optimally sample in the field by factoring in any prior information bias (e.g., past soil moisture maps). We also developed new means to automate some of the processes required to attain soil information maps in a scalable manner, by developing a robot to measure soil apparent electrical conductivity, which can be linked to soil salinity (published IEEE CASE conference paper). The latter was tested and evaluated in UCR's experimental fields. We have also been working actively under the goals of Objectives 1 and 2 in multiple concurrent directions (leaf extraction: a conference paper is in preparation; automated pressure chamber design: a first prototype is being fabricated; visual sensing: a dataset is being developed).
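To make the idea of prior-biased sampling concrete (an illustrative sketch only, not the stochastic aisle-graph planner of the RA-L paper; the function and parameter names are invented): given a prior variance map over field cells, e.g. derived from past soil moisture surveys, a greedy selector can pick a budget-limited set of high-uncertainty sampling sites while keeping them spatially spread out.

```python
def select_sample_sites(variance_map, budget, min_sep=1):
    """Greedily pick the grid cells with the highest prior variance,
    enforcing a minimum Manhattan separation so that the chosen
    sampling sites spread across the field rather than cluster.

    variance_map: dict mapping (row, col) -> prior variance
    budget: maximum number of sites to sample
    """
    candidates = sorted(variance_map, key=variance_map.get, reverse=True)
    chosen = []
    for cell in candidates:
        if len(chosen) >= budget:
            break
        if all(abs(cell[0] - c[0]) + abs(cell[1] - c[1]) > min_sep for c in chosen):
            chosen.append(cell)
    return chosen
```

The separation constraint is one simple way to trade off exploiting high-uncertainty regions against covering the field; a real planner would also account for robot travel costs along the orchard rows.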
Publications
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2021
Citation:
Campbell, M., Ye, K., Scudiero, E. and Karydis, K., 2021, August. A Portable Agricultural Robot for Continuous Apparent Soil Electrical Conductivity Measurements to Improve Irrigation Practices. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE) (pp. 2228-2234). IEEE.
- Type:
Journal Articles
Status:
Published
Year Published:
2021
Citation:
Kan, X., Thayer, T.C., Carpin, S. and Karydis, K., 2021. Task Planning on Stochastic Aisle Graphs for Precision Agriculture. IEEE Robotics and Automation Letters, 6(2), pp.3287-3294.