Source: The Regents of the University of California submitted to NRP
COLLABORATIVE RESEARCH: NRI: INT: MOBILE ROBOTIC LAB FOR IN-SITU SAMPLING AND MEASUREMENT
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1024633
Grant No.
2021-67022-33453
Cumulative Award Amt.
$544,835.00
Proposal No.
2020-08996
Multistate No.
(N/A)
Project Start Date
Nov 1, 2020
Project End Date
Oct 31, 2025
Grant Year
2021
Program Code
[A7301]- National Robotics Initiative
Recipient Organization
The Regents of the University of California
200 University Office Building
Riverside, CA 92521
Performing Department
Electrical and Computer Engineering
Non Technical Summary
Increasing population, decreasing arable land, climate change, and a declining skilled workforce pose unprecedented challenges to our ability to satisfy the growing demand for food on a global scale. Accurate assessments of multiple spatiotemporal conditions, such as evapotranspiration, are instrumental to fine-tune the amount of water used in farming. Today, precise water use measurement requires a pressure chamber, an instrument that is cumbersome to operate and greatly limits the number of measurements that can be made, given the need for human collection of plant specimens in the field. Consequently, critical parameters for large orchards are obtained by interpolating very sparse sample sets, thus failing to capture the inherent variability needed to precisely adjust agricultural inputs. In this project, for the first time, we will develop a mobile robotic lab that is capable of autonomously selecting regions to sample, physically collecting leaves, and immediately performing on-board analysis to measure leaf water potential, improving on the accuracy, precision, and efficiency of present-day pressure chamber technology. The mobile lab system features both aerial vehicles and ground robots, and during the project we will design a novel robotized pressure chamber enabling the measurement of leaf water potential at scale. The system will be tested and validated in the field in four different agronomic testbeds in California on four different perennial crops in collaboration with commercial partners.
This project will develop the scientific and technological foundations to create a new mobile robotic lab to perform sampling and analysis of leaf water potential at a spatiotemporal scale not achievable with current technologies. The PIs will work on hardware/software co-design of novel actuators to autonomously acquire and analyze specimens in the field. We will tackle fundamental questions about coordination of heterogeneous robotic systems operating under resource constraints, and perception algorithms that learn how to best perform leaf water potential measurements by observing human behaviors. Finally, collected data will be used to quantitatively confirm or disprove our hypothesis that current sampling practices fail to capture the existing heterogeneity in leaf water potential.
The proposed system has broad applicability and can be used to improve agricultural practices in a wide variety of crops throughout the US, with the potential for great savings in the use of water and agrichemicals. Both participating universities are Hispanic Serving Institutions (HSIs), and students from all backgrounds will be involved. Findings will be presented at leading conferences and papers will be made freely available. Hardware designs and code will be open source, and data collected during the project will also be made freely available to the scientific community. Results will be disseminated to the broader public through the University of California TV, and existing outreach initiatives at both institutions will be leveraged to engage K-12 students as well as industry stakeholders.
Animal Health Component
0%
Research Effort Categories
Basic
70%
Applied
0%
Developmental
30%
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
102                   0999                             2020                     33%
402                   7410                             2020                     34%
404                   7310                             2020                     33%
Goals / Objectives
The overarching goal of this project is to develop and deploy heterogeneous teams of autonomous robots (specifically, aerial and ground robots) to enable frequent and dense sampling in the field. The motivating hypothesis is that an increase in sampling density and frequency can reveal noticeable spatiotemporal variability in water potential that would otherwise remain undetected because of insufficient sampling resolution. In pursuit of this goal, the project tackles four key objectives, described below.
Objective 1 [Robotized Pressure Chamber Development]: We will develop all the components needed to sample individual leaves and assess their water potential autonomously. The system will consist of a ground mobile base, a manipulator, a pneumatic control board, and an integrated leaf acquisition and pressure chamber device. Components will be optimized through co-design of hardware, sensing, and control.
Objective 2 [Visual Sensing for Accurate Determination of Leaf Water Potential]: We will rely on visual sensing to determine the leaf water potential. To do so, we will establish new image quality enhancement algorithms using limited training data and develop new algorithms for imitation learning from human experts to help address the long-standing challenges of bubbling and the presence of non-xylem water when measuring leaf water potential.
Objective 3 [Multi-robot Coordination and Planning]: We will study how to effectively coordinate multiple aerial and ground vehicles to perform targeted sampling in areas of interest while being cognizant of the inherent operational constraints due to the limited energy supply provided by the batteries. Coordination tasks will be cast as optimization problems related to the orienteering problem, as sketched below.
Objective 4 [Evaluation]: We will test our developed system in the field at separate locations in northern and southern California where at least four different specialty crops (grapes, almonds, citrus, and avocados) are grown. The validation task serves two purposes. First, it will provide feedback for the iterative evolution of the design and implementation of the hardware/software system we will develop. Second, collected data will be used to test our working hypothesis that current sampling practices fail to capture spatiotemporal variability in leaf water potential.
This project is collaborative between UC Merced and UC Riverside.
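As an illustration of the kind of formulation Objective 3 refers to, the minimal sketch below (Python) casts a single robot's sampling route as a budget-constrained orienteering problem solved with a simple greedy heuristic. This is not the project's planner; the site coordinates, reward model, and energy budget are hypothetical placeholders.

    # Minimal sketch, assuming 2D sampling sites, a Euclidean travel-cost proxy
    # for energy, and a per-site "reward" (expected information gain). The greedy
    # ratio rule below is illustrative only and stands in for the orienteering
    # formulations studied in the project.
    import math

    def travel_cost(a, b):
        # Euclidean distance between two (x, y) sites, used as an energy proxy.
        return math.dist(a, b)

    def greedy_orienteering(depot, sites, reward, budget):
        # Select sampling sites to visit so that total travel cost, including
        # the return to the depot, stays within the energy budget.
        route, pos, spent = [], depot, 0.0
        remaining = set(sites)
        while remaining:
            best, best_ratio = None, 0.0
            for s in remaining:
                cost = travel_cost(pos, s)
                # Keep the option only if the robot can still return to the depot.
                if spent + cost + travel_cost(s, depot) > budget:
                    continue
                ratio = reward[s] / max(cost, 1e-9)
                if ratio > best_ratio:
                    best, best_ratio = s, ratio
            if best is None:
                break
            spent += travel_cost(pos, best)
            route.append(best)
            pos = best
            remaining.remove(best)
        return route, spent + travel_cost(pos, depot)

    # Example with hypothetical values: three candidate sites, energy budget of 30.
    sites = [(10.0, 0.0), (10.0, 5.0), (0.0, 12.0)]
    reward = {sites[0]: 3.0, sites[1]: 5.0, sites[2]: 2.0}
    print(greedy_orienteering((0.0, 0.0), sites, reward, budget=30.0))

The project's coordination work concerns multiple aerial and ground vehicles under such energy constraints; this single-robot sketch only conveys the budget-constrained structure of the problem.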
Project Methods
Efforts: Two types of experiments will be performed during the four-year project. The first set of experiments aims at perfecting the accuracy of the robotized pressure chamber we will develop. To this end, measurements obtained with the robotized pressure chamber will be cross-validated with leaf water potential measurements obtained with a manually operated portable pressure chamber, as sketched below. These initial experiments will ensure adequate data accuracy before we perform the data analysis process described in the next subsection. We anticipate these experiments to take place in the first and second year of the project. Once the robotized pressure chamber has been perfected, data collection experiments will involve the entire system. We note that UAVs and ground robots do not need to operate at the same time; it is instead foreseeable that imagery collected by the UAV will be processed off-line. This is an acceptable approach because the underlying physical phenomena are slowly varying. Ground robots will then collect data from both the pressure chamber and the soil probe and store them as entries with spatio-temporal references for the subsequent data analysis.
Evaluation: The co-robot system we will develop will be tested in the field to prove or disprove the hypothesis that current interpolation approaches based on a few measurements per tens of acres fail to capture significant variability in leaf water potential. The value of the hypothesis is that more accurate estimates are essential for tuning inputs and implementing precision agriculture practices. All outdoor robot testing will adhere to applicable regulations, as for example in the case of aerial robot testing, which is regulated by UC and FAA policies (and for which we will work with the UC Center of Excellence on Unmanned Aircraft System Safety hosted at UC Merced). All robot testing and equipment use will adhere to Standard Operating Procedures (SOPs) developed by the PIs and approved by both institutions' Environmental and Health Safety (EHS) departments. In addition to simulation models and lab tests, the proposed system will be deployed and evaluated in various testbeds where different specialty crops are grown. Specifically, we have identified four testbeds to evaluate system performance. These testbeds include (1) an experimental vineyard managed by one of our industrial partners located in Firebaugh (Fresno County, CA), and (2) an almond orchard located in Merced County. In Riverside County, in consultation with local commercial partners, tests will be performed in the Agricultural Operations (Ag Ops) facilities managed by UC Riverside where (3) citrus and (4) avocados are grown. These testbeds are geographically distributed and used to grow different crops, thus allowing the proposed solution to operate under heterogeneous conditions.
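For concreteness, the sketch below (Python, illustrative only and not drawn from the project) shows one way the cross-validation described under Efforts could be summarized: paired leaf water potential readings from the robotized chamber and a manually operated chamber are compared through their mean bias and root-mean-square error. All variable names and example values are hypothetical.

    # Minimal sketch, assuming paired readings (in bars, negative = drier leaf)
    # taken on the same leaves with the robotized chamber and with a manually
    # operated pressure chamber.
    import math
    import statistics

    def agreement_stats(robot_psi, manual_psi):
        # Mean bias and RMSE between paired leaf water potential readings.
        assert len(robot_psi) == len(manual_psi)
        errors = [r - m for r, m in zip(robot_psi, manual_psi)]
        bias = statistics.fmean(errors)
        rmse = math.sqrt(statistics.fmean(e * e for e in errors))
        return bias, rmse

    # Example with hypothetical readings:
    robot_psi = [-11.8, -13.1, -9.9, -14.6]
    manual_psi = [-12.0, -13.0, -10.2, -14.5]
    bias, rmse = agreement_stats(robot_psi, manual_psi)
    print(f"bias = {bias:.2f} bar, RMSE = {rmse:.2f} bar")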

Progress 11/01/23 to 10/31/24

Outputs
Target Audience: We have participated in local and regional events attended by other researchers beyond our team who work on similar topics, as well as various stakeholders. These include (1) a meeting organized by UC ANR at the Desert Station where we provided demos of some of the technology developed in this project, (2) a meeting organized at UC Davis (jointly with an internally-supported project in collaboration with UC Merced, UC Davis, and UC Berkeley), and (3) a meeting organized at UCR within the Center of Robotics and Intelligent Systems. We also organized a special track on agricultural robotics and vision as part of the 2024 ISVC conference, continuing a successful track organized last year in the same venue. We have published and presented three conference papers (at the 2024 IEEE ICRA international conference, the 2024 ASME AIM international conference, and the Springer 2024 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners. Two book chapters (Elsevier AP "Hyperautomation in Precision Agriculture" and CRC Press "Mobile Robots for Digital Farming") have been published. Additionally, three journal papers (all in Elsevier Computers and Electronics in Agriculture) have been published. Furthermore, we have disclosed to UCR new IP outcomes, which are currently undergoing marketability analysis.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? A total of eight PhD students (Amel Dechemi, Hanzhe Teng, Dimitris Chatziparaschis, Cody Simons, Pamodya Peiris, Aritra Samanta, Jingzong Zhou, and Xiao'ao Song) and one postdoctoral researcher (Caio Mucchiani) have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences ranging from other researchers to growers; 4) interaction with stakeholders. Three of the PhD students have now graduated (Dechemi, Teng, and Simons).
How have the results been disseminated to communities of interest? We have participated in local and regional events attended by other researchers beyond our team who work on similar topics, as well as various stakeholders. These include (1) a meeting organized by UC ANR at the Desert Station where we provided demos of some of the technology developed in this project, (2) a meeting organized at UC Davis (jointly with an internally-supported project in collaboration with UC Merced, UC Davis, and UC Berkeley), and (3) a meeting organized at UCR within the Center of Robotics and Intelligent Systems. We also organized a special track on agricultural robotics and vision as part of the 2024 ISVC conference, continuing a successful track organized last year in the same venue. We have published and presented three conference papers (at the 2024 IEEE ICRA international conference, the 2024 ASME AIM international conference, and the Springer 2024 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners. All of these papers were presented in the community. Presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more about these efforts.
Lastly, we continued open-sourcing different types of data collected in the field, digital twins, and algorithms, most notably the research dataset and code for our 2025 Computers and Electronics in Agriculture journal paper.
What do you plan to do during the next reporting period to accomplish the goals? We are currently working on deploying a user-centered automated pressure chamber, offering the capability to make quick stem water potential (SWP) measurements. Our method leverages visual learning and mobile application development. We hope that it can also lend itself as a useful training tool for future workers and specialists conducting SWP measurements in orchards.

Impacts
What was accomplished under these goals? In this period we prioritized work under all four objectives. We developed an improved prototype of an automated pressure chamber (Elsevier Computers and Electronics in Agriculture 2024 papers) and deployed advanced visual learning algorithms to detect instances of dry/bubbling/wet xylem in video footage (Springer ISVC paper). We focused on end-effector design (2025 book chapter and IEEE/ASME AIM paper) as well as different aspects of autonomy, including field testing and deployment (2024 book chapter), multi-sensor fusion (IEEE ICRA paper), and mapping and odometry estimation (Elsevier Computers and Electronics in Agriculture 2025 paper). The dataset, developed algorithm, and comparisons with baselines of the latter work have been open-sourced (https://github.com/UCR-Robotics/AG-LOAM) to further stimulate and support relevant agricultural robotics and automation technology research.

Publications

  • Type: Book Chapters Status: Published Year Published: 2025 Citation: Dechemi, A., Hale, T.O., Eng, C. and Karydis, K., 2025. Design, integration, and field evaluation of a robotic leaf cutting and retrieving system. In Hyperautomation in Precision Agriculture (pp. 203-215). Academic Press.
  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2025 Citation: Teng, H., Wang, Y., Chatziparaschis, D. and Karydis, K., 2025. Adaptive LiDAR odometry and mapping for autonomous agricultural mobile robots in unmanned farms. Computers and Electronics in Agriculture, 232, p.110023.
  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2024 Citation: C. Mucchiani and K. Karydis, Development of an Automated and Artificial Intelligence Assisted Pressure Chamber for Stem Water Potential Determination. Computers and Electronics in Agriculture, 2024, vol. 222, pp. 109016.
  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2024 Citation: C. Mucchiani, D. Zaccaria, and K. Karydis, Assessing the potential of integrating automation and artificial intelligence across sample-destructive methods to determine plant water status: A review and score-based evaluation. Computers and Electronics in Agriculture, 2024, vol. 224, pp. 108992.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: J. Zhou, X. Song, and K. Karydis, Design of an End-effector with Application to Avocado Harvesting. In IEEE/ASME Int. Conf. on Advanced Intelligent Mechatronics (AIM), 2024, pp. 1241-1246.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: D. Chatziparaschis, H. Teng, Y. Wang, P. Peiris, E. Scudiero, and K. Karydis, On-the-Go Tree Detection and Geometric Traits Estimation with Ground Mobile Robots in Fruit Tree Groves. In IEEE Int. Conf. on Robotics and Automation (ICRA), 2024, pp. 15840-15846.
  • Type: Book Chapters Status: Published Year Published: 2025 Citation: Peiris, P., Samanta, A., Mucchiani, C., Simons, C., Roy-Chowdhury, A., Karydis, K. (2025). Vision-Based Xylem Wetness Classification in Stem Water Potential Determination. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2024. Lecture Notes in Computer Science, vol 15047. Springer, Cham.
  • Type: Book Chapters Status: Published Year Published: 2024 Citation: Chatziparaschis, D., Scudiero, E. and Karydis, K., 2024. Robot-assisted soil apparent electrical conductivity measurements in orchards. In Mobile Robots for Digital Farming (pp. 55-88). CRC Press.


Progress 11/01/22 to 10/31/23

Outputs
Target Audience: We have participated in local and regional events attended by other researchers beyond our team who work on similar topics, as well as various stakeholders. These include a meeting we organized at UCR (jointly with an internally-supported project in collaboration with UC Merced, UC Davis, and UC Berkeley), a field data collection together with Dr. Robert Krueger, Research Leader of the USDA-ARS National Clonal Germplasm Repository for Citrus & Dates stationed at UCR, as well as a field data collection in the Gallo Vineyard where we worked together with Gallo's agricultural engineering researchers. We also organized a workshop as part of the 2023 IEEE/RSJ IROS conference (https://sites.google.com/view/agrobotics) that was received very well by the agricultural robotics and automation technology researchers attending the workshop (both from industry and academia). We have published and presented two conference papers (at the 2023 IEEE/CVF WACV international conference and the Springer 2023 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners. For the latter we also organized a special track focusing on datasets for agriculture. Presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more. One additional journal paper (IEEE RAM) has been published and was invited for presentation at the 2024 IEEE ICRA. Furthermore, we have disclosed to UCR new IP outcomes, which are currently undergoing marketability analysis (to be reported in detail in the next period).
Changes/Problems: Because of delays caused by COVID-19 lab closures, some experimental testing was only completed during this review period, which delayed later tasks focusing on data analysis and machine intelligence. We are now on track to complete these tasks as well, although a short no-cost extension may be needed.
What opportunities for training and professional development has the project provided? Four PhD students (Amel Dechemi, Hanzhe Teng, Dimitris Chatziparaschis, Cody Simons) and one postdoctoral researcher (Caio Mucchiani) have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences ranging from other researchers to growers; 4) interaction with stakeholders.
How have the results been disseminated to communities of interest? We organized a stakeholder meeting at UCR (jointly with an internally-supported project in collaboration with UC Merced, UC Davis, and UC Berkeley), a field data collection together with Dr. Robert Krueger, Research Leader of the USDA-ARS National Clonal Germplasm Repository for Citrus & Dates stationed at UCR, as well as a field data collection in the Gallo Vineyard where we worked together with Gallo's agricultural engineering researchers.
We also organized a workshop as part of the 2023 IEEE/RSJ IROS conference (https://sites.google.com/view/agrobotics) that was received very well by the agricultural robotics and automation technology researchers attending the workshop (both from industry and academia). We have published and presented two conference papers (at the 2023 IEEE/CVF WACV international conference and the Springer 2023 ISVC international symposium) attended by robotics and automation, machine vision, and artificial intelligence researchers and practitioners. For the latter we also organized a special track focusing on datasets for agriculture. Presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more. One additional journal paper (IEEE RAM) has been published and was invited for presentation at the 2024 IEEE ICRA. Furthermore, we have open-sourced a dataset collected on citrus tree farms (https://ucr-robotics.github.io/Citrus-Farm-Dataset/) to further stimulate and support relevant agricultural robotics and automation technology research.
What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period we plan to continue our efforts under all objectives. Specifically, we are planning to fully automate the pressure chamber method while deploying stronger machine learning based tools for visual detection of xylem status. We are also seeking to deploy our mobile manipulation robots in larger field settings to sample leaves at faster rates than our current prototypes.

Impacts
What was accomplished under these goals? In this period we prioritized work under all four objectives. We developed an initial prototype of a machine-vision-enhanced pressure chamber and demonstrated how visual learning can be employed to detect instances of dry/wet xylem in video footage (IEEE RAM paper). We also focused on deploying planners developed at the project's collaborating site (UC Merced) on physical robots (a mobile manipulator) in the field to sample leaves (IEEE RAM paper). Further, we developed vision-based tools in support of autonomous robot navigation (IEEE/CVF WACV paper), and generated a large multi-modal dataset taken in citrus tree farms in support of machine vision, artificial intelligence, and autonomous robot navigation (Springer ISVC paper). We have made the dataset publicly available (https://ucr-robotics.github.io/Citrus-Farm-Dataset/) to further stimulate and support relevant agricultural robotics and automation technology research.

Publications

  • Type: Book Chapters Status: Published Year Published: 2023 Citation: Hanzhe Teng, Yipeng Wang, Xiaoao Song, and Konstantinos Karydis. "Multimodal Dataset for Localization, Mapping and Crop Monitoring in Citrus Tree Farms." In International Symposium on Visual Computing, pp. 571-582. Cham: Springer Nature Switzerland, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Hanzhe Teng, Dimitrios Chatziparaschis, Xinyue Kan, Amit K. Roy-Chowdhury, and Konstantinos Karydis. "Centroid distance keypoint detector for colored point clouds." In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, pp. 1196-1205. 2023.
  • Type: Journal Articles Status: Published Year Published: 2023 Citation: Amel Dechemi, Dimitrios Chatziparaschis, Joshua Chen, Merrick Campbell, Azin Shamshirgaran, Caio Mucchiani, Amit Roy-Chowdhury, Stefano Carpin, and Konstantinos Karydis. "Robotic Assessment of a Crops Need for Watering: Automating a Time-Consuming Task to Support Sustainable Agriculture." IEEE Robotics & Automation Magazine, vol. 30, no. 4, pp. 52-67, Dec. 2023.


Progress 11/01/21 to 10/31/22

Outputs
Target Audience: This project relies on a strong collaboration with the Almond Board of California (www.almonds.com). During this second year we have engaged with scientists and practitioners from the board to keep our research aligned with the interests of the growers they represent. In particular, on May 13, 2022 we visited an almond orchard located near Merced (CA) and, together with the owner, actively took part in a complete set of pressure chamber measurements throughout the orchard. This process has allowed us to refine our approach for the selection of the sampling locations. In this reporting period we published and presented one paper in a major venue (the IEEE/RSJ IROS international conference) attended by robotics and automation researchers and practitioners, which features segments of the agricultural robotics and technology that this award pertains to. The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more. Furthermore, we have pushed forward two patent applications.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? Three PhD students (Amel Dechemi, Keran Ye, Cody Simons), two MS students (Merrick Campbell, Joshua Chen; both graduated in this reporting period), and one postdoctoral researcher (Caio Mucchiani) have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences ranging from other researchers to growers; 4) interaction with stakeholders.
How have the results been disseminated to communities of interest? One paper has been presented at the IEEE/RSJ IROS 2022 conference. The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone. We have also been discussing regularly with the Almond Board of California, whereby we gather their feedback via short demos and seek to acquire access to commercial farms to test our technology.
What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period we plan to continue our efforts under Objectives 1 and 2, as well as initiate controlled field experiments that will guide further development. In addition, a prototype of our current leaf extraction mechanism will be replicated at the UC Merced site so that we can conduct multiple field experiments for distinct tree crops at faster rates. A large dataset is an asset for visual analysis, and being able to expedite data collections will also have positive effects on Objective 2 activities. Finally, with our commercial partner we will continue to engage almond growers to get domain feedback about our proposed solution.

Impacts
What was accomplished under these goals? In this period we prioritized work under Objectives 1 and 2 in multiple concurrent directions. The most developed direction was robotic leaf extraction (IEEE/RSJ IROS conference paper and one submitted patent application). At the same time, we conducted field experiments of stem water potential measurements manually and collected a first dataset to enable automated visual sensing of water expression at the cut stem. Initial findings were submitted to a conference but the paper was not accepted; we are currently revising this work to include the automated pressure chamber design, and results are forthcoming. One more patent application, on automated soil salinity estimation, was also submitted in this period.

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: M. Campbell, A. Dechemi, and K. Karydis, An Integrated Actuation-Perception Framework for Robotic Leaf Retrieval: Detection, Localization, and Cutting. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2022, pp. 9210-9216.


Progress 11/01/20 to 10/31/21

Outputs
Target Audience: In this reporting period we published and presented papers in major venues attended by robotics and automation researchers and practitioners, which feature segments of the agricultural robotics and technology that this award pertains to. The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone interested in learning more. Furthermore, we started investigating the potential for tech transfer, and as such we interacted with several stakeholders (practitioners, ag advisors, and growers) as part of customer discovery for an NSF I-Corps program at UCR that we participated in.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? Two PhD students, one MS student, one undergraduate student, and one postdoctoral researcher have been involved in aspects of this project. The project has provided them with training and professional development opportunities related to: 1) hardware design, rapid prototyping, and iterative fabrication; 2) robot motion planning and visual sensing algorithm development, and integration of artificial intelligence; 3) presenting research findings to a wide variety of audiences ranging from other researchers to growers; 4) interaction with stakeholders.
How have the results been disseminated to communities of interest? The research papers have been presented at their respective conferences (the IEEE RA-L journal paper was jointly accepted and presented at the IEEE ICRA 2021 conference). The presented material is available online on the lab's YouTube channel for easy and unobtrusive access to anyone. Furthermore, results have been communicated directly to stakeholders as part of our customer discovery for an NSF I-Corps program at UCR that we participated in. We have also been discussing regularly with the Almond Board of California, whereby we gather their feedback via short demos and seek to acquire access to commercial farms to test our technology.
What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period we plan to focus primarily on Objectives 1, 2, and 4, so that we can create the physical sampling and automation tools which can then be integrated into a refined multi-robot coordination and planning framework (Objective 3).

Impacts
What was accomplished under these goals? In this period we prioritized work under Objective 3 (published IEEE RA-L journal paper), which determines how to optimally sample in the field by factoring in any prior information bias (e.g., past soil moisture maps). We also developed new means to automate some of the processes required to attain soil information maps in a scalable manner, by developing a robot that measures soil apparent electrical conductivity, which can be linked to soil salinity (published IEEE CASE conference paper). The latter was tested and evaluated in UCR's experimental fields. We have also been working actively under the goals of Objectives 1 and 2 in multiple concurrent directions (leaf extraction: a conference paper is in preparation; automated pressure chamber design: a first prototype is being fabricated; visual sensing: a dataset is being developed).

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2021 Citation: Campbell, M., Ye, K., Scudiero, E. and Karydis, K., 2021, August. A Portable Agricultural Robot for Continuous Apparent Soil Electrical Conductivity Measurements to Improve Irrigation Practices. In 2021 IEEE 17th International Conference on Automation Science and Engineering (CASE) (pp. 2228-2234). IEEE.
  • Type: Journal Articles Status: Published Year Published: 2021 Citation: Kan, X., Thayer, T.C., Carpin, S. and Karydis, K., 2021. Task Planning on Stochastic Aisle Graphs for Precision Agriculture. IEEE Robotics and Automation Letters, 6(2), pp.3287-3294.