Source: UNIVERSITY OF ILLINOIS submitted to NRP
CPS: FRONTIER: COLLABORATIVE RESEARCH: COALESCE: CONTEXT AWARE LEARNING FOR SUSTAINABLE CYBER-AGRICULTURAL SYSTEMS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
ACTIVE
Funding Source
Reporting Frequency
Annual
Accession No.
1025720
Grant No.
2021-67021-34418
Cumulative Award Amt.
$2,000,000.00
Proposal No.
2020-10740
Multistate No.
(N/A)
Project Start Date
Mar 15, 2021
Project End Date
Mar 14, 2026
Grant Year
2021
Program Code
[A7302]- Cyber-Physical Systems
Recipient Organization
UNIVERSITY OF ILLINOIS
2001 S. Lincoln Ave.
URBANA, IL 61801
Performing Department
Ag and Bio Engineering
Non Technical Summary
This project seeks to transform agriculture by developing a novel, context-aware cyber-agricultural system that encompasses sensing, modeling, and actuation to enable farmers to respond to crop stressors with lower cost, greater agility, and significantly lower environmental impact than current practices. The technical objectives are to create better AI for individualized plant health estimation, to implement data-driven, multi-scale planning and reasoning, and to develop individualized sensing and actuation via autonomous robots with dexterous manipulators.
Animal Health Component
30%
Research Effort Categories
Basic
30%
Applied
30%
Developmental
40%
Classification

Knowledge Area (KA): 402 | Subject of Investigation (SOI): 5310 | Field of Science (FOS): 2020 | Percent: 100%
Knowledge Area
402 - Engineering Systems and Equipment;

Subject Of Investigation
5310 - Machinery and equipment;

Field Of Science
2020 - Engineering;
Goals / Objectives
The COALESCE project seeks to transform CPS capabilities in agriculture by developing a novel, context-aware cyber-agricultural system that encompasses sensing, modeling, and actuation to enable farmers to respond to crop stressors with lower cost, greater agility, and significantly lower environmental impact than current practices. The technical objectives are to embed biophysics in machine learning for individualized crop modeling, to apply multi-modal information fusion and robust learning for individualized sensing, to implement data-driven, multi-scale planning and reasoning, and to develop individualized sensing and actuation via autonomous robots with dexterous manipulators. Modeling of biophysical conditions in reasoning models contributes to the CPS research areas of data analytics, information management, and real-time systems. The individualized sensing thrust contributes to the data analytics, information management, and autonomy research areas. Multi-scale reasoning and farm management based on individualized actuation contributes to the control, data analytics, autonomy, networking, safety, and real-time systems research areas. This is a collaborative project between UIUC and Iowa State. The UIUC portion of this project focuses on the robotics part. Our primary goals are: (1) data-driven planning and control for multi-scale reasoning, and (2) individualized sensing and actuation via autonomous robots with dexterous manipulators.
Project Methods
The key methods used in this project are from the realm of Cyber-Physical Systems (CPS). The project will advance key CPS science, technology, and engineering. Our algorithms and tools will address foundational questions such as: Will machine learning models embedded with domain knowledge, physical laws, invariances, and conservation principles lead to more data-efficient predictions? Can data fusion strategies combine robust and uniform abstraction of diverse data with minimum information loss? Will distributed robotic systems capable of treating biotic and abiotic stressors on an individual plant basis lead to fundamentally more sustainable and robust farming systems? Will learning-based control and decision making lead to robust distributed autonomy and coordination in harsh, changing, and uncertain field environments? Will soft arms and manipulators with reinforcement-learning-based controllers lead to new mechanisms for controlling biotic and abiotic stressors that require little or no chemicals? In summary, our effort will contribute to the following core CPS research areas: control, data analytics, autonomy, information management, networking, real-time systems, and safety. We note that while the notion of real-time may be slightly relaxed for Ag applications (compared to other CPS sectors such as aerospace), the large spatial scale of agriculture operations still makes it time-critical, i.e., it demands fast sensing, information processing, and actuation for timely mitigation of crop stresses over large fields.

Progress 03/15/23 to 03/14/24

Outputs
Target Audience: 1) Graduate students: The team of researchers and graduate students working on this project and other members of the agriculture and robotics community at UIUC and Iowa State University. 2) Farmers: We showcased the robot's capabilities to farmers through several outreach and extension activities. 3) Specialty crop growers: We interacted with and showed demos to specialty crop growers through two extension events. 4) K-12 students: We incorporated material from this project into various undergraduate and graduate courses. We also invited and communicated with K-12 students who were interested in learning more about agriculture and the role of technology in agriculture. 5) Engagement with the research community: A paper on the unique low-form-factor dexterous manipulator on a mobile platform and its ability to detect and grasp fruits and berries with high precision was submitted to IROS 2024. Furthermore, we organized a workshop that brought together agricultural roboticists from all over the world to discuss the state of the art, gaps, and future directions for robotics in agriculture. 6) Engagement with farmers/growers: Our team made significant strides in engaging with urban farmer stakeholders in the past year through participation in the Dixon Springs Agricultural Center High Tunnel Production Day at Dixon Springs, IL; the University of Illinois Extension Twilight Meeting Series, Simpson, IL (July 13; attendance of 29 people); and the 2023 Specialty Crops Field Day, Urbana, IL (August 7; attendance of 15 people). Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? Faculty and students participated in several conferences (e.g., NeurIPS, AAAI, MLCAS, ICRA, IROS) that enhanced professional development as well as visibility.
Students (including multiple women and underrepresented minority students) associated with the COALESCE project had the opportunity to participate in bi-weekly meetings to present project updates and share their research findings. These opportunities provide valuable feedback on their progress and allow them to receive direction for further research and collaboration. In addition to these presentations, an OSU postdoc visited ISU for data collection, and ISU students visited the UIUC campus/labs and vice versa to further solidify our collaborations and enhance learning experiences. How have the results been disseminated to communities of interest? The key results have been published in prestigious journals and conferences such as Trends in Plant Science, Frontiers in Plant Science, Plant Phenome, Plant Phenomics, and IEEE Robotics and Automation Letters, and presented at ICRA, AAAI, SIGCOMM, and the AI-Ag workshops at IROS 2023. All the papers are listed in products. Specifically, we would like to highlight that the core group of COALESCE PIs came together to write an authoritative review paper in Trends in Plant Science (IF: 20.5), the most prestigious plant science review journal, to formally establish the discipline of Cyber-Ag Systems. The paper's details are below: Sarkar, S., Ganapathysubramanian, B., Singh, A., Fotouhi, F., Kar, S., Nagasubramanian, K., Chowdhary, G., Das, S.K., Kantor, G., Krishnamurthy, A. and Merchant, N., 2023. Cyber-agricultural systems for crop breeding and sustainable production. Trends in Plant Science. PI Chowdhary and Co-PIs have delivered multiple invited talks (selected talks listed below) that included results from this project. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period, we will continue advancing the Year 3 objectives. In addition, we will pursue the following new technical objectives: 1. Simultaneous sensor selection and planning for robotics in plant manipulation problems. 2. Design of field-scale decision support algorithms for production agriculture.

Impacts
What was accomplished under these goals? 1. Visual Servoing with a Hybrid Arm for intelligent in-field manipulation: We developed Detect2Grasp, a low-cost, compact manipulation system designed for urban high tunnels that uses a depth camera to scan for fruit and a slender end effector mounted with an RGB camera and a two-finger gripper for grasping. We report a 100% success rate, with an average grasp time of 11.35 s, for grasping berries (over 100 tests). The tests also show that Detect2Grasp is robust to various lighting conditions, sensor noise, and plant types, and shows promise in harvesting clusters of fruit. 2. Farmernetes: Centralized Traffic Engineering for Networked Farm Applications: We conducted measurements of WiFi performance in cornfields. We discovered that 5 GHz performs very poorly for under-canopy transmission even at small distances; the 2.4 GHz band fares better under canopy, but the number of channels (bandwidth) is quite limited. Using these measurements, we designed a network architecture for Farmernetes' WiFi mesh network, reserving the 2.4 GHz band for under-canopy communication and using the 5 GHz band for the mesh capabilities of Farmernetes' network. To manage limited network capacity efficiently, we enable different resource management decisions in Farmernetes. We also propose a novel abstraction and a set of resource management algorithms. Real-world (Illinois Autonomous Farm) and simulator results show significant improvements in application quality of experience (QoE) (2.5-4x) and network throughput (1.5x) compared to baseline WiFi meshes. 3. Robust Insect Detection and Identification Model: We developed a hybrid model, UP-DETR, which combines CNN (a highly accurate self-supervised-learning pre-trained insect classifier) and transformer architectures. Trained on about 70,000 images and bounding box annotations, UP-DETR operates class-agnostically. This integrated approach enables the detection and classification of approximately 2,000 insect species. With a mean average precision of around 0.6, our detection model demonstrates strong performance in identifying insects against challenging backgrounds, such as scenarios with multiple insects of the same or different species in a single frame, insects against camouflaging backgrounds, and insects of various sizes.
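The Detect2Grasp workflow described above is two-stage: a depth camera proposes candidate fruit locations, and the wrist-mounted RGB camera and gripper handle the final approach. The following is an illustrative sketch only; the function names, the naive grid-binning heuristic, and the data layout are our own assumptions, not the actual Detect2Grasp implementation:

```python
import numpy as np

def detect_candidates(depth_cloud, min_pts=10):
    """Stage 1 (hypothetical sketch): cluster a depth point cloud into
    candidate fruit locations by naive grid binning on (x, y)."""
    bins = {}
    for p in depth_cloud:
        key = (round(float(p[0]), 1), round(float(p[1]), 1))  # ~10 cm bins
        bins.setdefault(key, []).append(p)
    # A candidate is the centroid of any sufficiently dense bin
    return [np.mean(pts, axis=0) for pts in bins.values() if len(pts) >= min_pts]

def grasp_pipeline(depth_cloud, servo_and_grasp):
    """Stage 2: visit each candidate and let the wrist-camera servo loop
    (passed in as a callable) attempt the final approach and grasp."""
    return [servo_and_grasp(c) for c in detect_candidates(depth_cloud)]

# Toy example: a dense cluster of depth returns around one berry
cloud = np.array([[0.5, 0.5, 1.0]] * 20)
grasps = grasp_pipeline(cloud, lambda c: True)  # stand-in servo routine
```

The split mirrors the reported hardware: coarse detection from the fixed depth camera, fine servoing from the end-effector camera.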

Publications

  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2024 Citation: Sarkar, S., Ganapathysubramanian, B., Singh, A., Fotouhi, F., Kar, S., Nagasubramanian, K., Chowdhary, G., Das, S.K., Kantor, G., Krishnamurthy, A. and Merchant, N., 2023. Cyber-agricultural systems for crop breeding and sustainable production. Trends in plant science.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Ripperger, Evan and Krishnan, Girish. (2023). Design Space Enumerations for Pneumatically Actuated Soft Continuum Manipulators.
  • Type: Conference Papers and Presentations Status: Submitted Year Published: 2024 Citation: Walt, Benjamin and Krishnan, Girish. (2023). Grasp State Classification in Agricultural Manipulation


Progress 03/15/22 to 03/14/23

Outputs
Target Audience: COALESCE has actively shared its research with diverse audiences, from local and national media to conferences and workshops. Through these efforts, we have educated farmers, industry professionals, and academic scientists on the sustainable technologies we are developing. Our work has reached public audiences through local news stories and national exposure via features such as the Big Ten Network's B1G Impact Research segment. We are also collaborating with companies and farmers to tailor our innovations to real-world needs through demonstrations and hands-on training. Beyond industry engagement, COALESCE has inspired the next generation of innovators, showcasing our technology to nearly 500 students through the Engineering Open House, summer farm visits, and outreach events hosted by Agricultural and Biological Engineering. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? Graduate students: Graduate students worked on key aspects of the research. They were provided opportunities and mentoring to lead papers and research activities. Undergraduate students: Undergraduate students helped with key research aspects and were provided research mentorship and the opportunity to co-author papers. Postdocs: Postdocs were provided with training and mentoring to lead teams of researchers. Faculty: Faculty directed students and postdocs to drive research program objectives, and worked together to collaboratively solve problems that lie at the intersection of disciplines. How have the results been disseminated to communities of interest? The key results have been published in prestigious journals such as IEEE Robotics and Automation Letters and presented at the AI-Ag workshops at AAAI 2022. All the papers are listed in products. PI Sarkar and Co-PIs have delivered multiple invited talks that included results from this project.
In order to help the nascent Cyber-Ag research community grow and thrive, the COALESCE team has organized international workshops such as the Third International Workshop on Machine Learning for Cyber-Agricultural Systems (MLCAS) and the AI for Agriculture and Food Systems (AIAFS) workshop under the umbrella of the AAAI 2022 conference. We also submitted a proposal for a workshop to be organized at IROS 2023 in Detroit. What do you plan to do during the next reporting period to accomplish the goals? Pest detection in the field: Develop better perception models in collaboration with ISU students, test the robot in the field, and integrate the perception models on the robot. Visual servoing: Our main goal will be to utilize cameras on the tip of the robot and on the base of the robot to navigate through obstacles. Networking for field CPS: We expect to perform field experiments to answer the key questions raised through last year's analysis. We will conduct field tests to determine the different attenuation of signals under the canopy and over the canopy that can be used for data transfer between the under-canopy robots working in the field. Under-canopy cover-crop planting: We expect to continue to work with industry to advance under-canopy cover-crop robot planting in the 2023 season.

Impacts
What was accomplished under these goals? We built a setup in the lab that consists of the BR2 soft continuum arm (SCA) and the myCobot rigid arm by Elephant Robotics, along with various targets and obstacles. The setup also contains a tip camera that captures RGB data and a RealSense camera that captures RGB-D data. We built a new B3 SCA to replace the BR2 SCA in the setup. We developed a system to reach a target object from the current position in cluttered environments: the method uses 3D reconstruction for scene understanding; the target location is found using object localization on the image from the RealSense camera; and path planning from the current location to the desired location is done using 3D Dijkstra's algorithm. For the BR2 and BR2 + extrusion arms, we developed a forward model network, dynamic position and orientation controllers, and generalization to unseen workspace points. We developed an algorithm called CropNav that can switch between different modes of autonomy to ensure reliable long-duration autonomy. CropNav showed that long-duration under-canopy autonomy is achievable, with runs that spanned multiple kilometers. In 2022 we demonstrated that it was possible to monitor a number of robots from the edge of the field using LoRa signals. Working with collaborators from AIFARMS, we demonstrated that it is possible to reduce the size of the network to enable visual CropFollow to run on a low-cost computer.
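The path-planning step above (3D Dijkstra's algorithm over the reconstructed scene) can be sketched as follows. The voxel-grid representation, 6-connected neighborhood, and unit step costs are illustrative assumptions, not details of our implementation:

```python
import heapq
import numpy as np

def dijkstra_3d(occupancy, start, goal):
    """Shortest collision-free path on a 3D voxel grid via Dijkstra.

    occupancy: 3D boolean array, True marks an obstacle voxel.
    start, goal: (i, j, k) index tuples. Returns a voxel path or None.
    """
    shape = occupancy.shape
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    dist = {start: 0.0}
    parent = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            # Reconstruct the path by walking parents back to start
            path = [u]
            while u in parent:
                u = parent[u]
                path.append(u)
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for m in moves:
            v = (u[0] + m[0], u[1] + m[1], u[2] + m[2])
            if not all(0 <= v[i] < shape[i] for i in range(3)):
                continue
            if occupancy[v]:
                continue
            nd = d + 1.0  # unit cost per step
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                parent[v] = u
                heapq.heappush(pq, (nd, v))
    return None

# Toy example: a 5x5x5 grid with an obstacle wall pierced by one opening
grid = np.zeros((5, 5, 5), dtype=bool)
grid[2, :, :] = True       # wall at i == 2 ...
grid[2, 4, 4] = False      # ... with a single free voxel
path = dijkstra_3d(grid, (0, 0, 0), (4, 0, 0))
```

With uniform step costs this reduces to breadth-first search, but the Dijkstra form also accommodates weighted voxels (e.g., penalizing paths near foliage).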

Publications

  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2023 Citation: AlBeladi, A., Ripperger, E., Hutchinson, S., & Krishnan, G. (2022). Hybrid Eye-in-Hand/Eye-to-Hand Image Based Visual Servoing for Soft Continuum Arms. IEEE Robotics and Automation Letters, 7(4), 11298-11305. https://doi.org/10.1109/LRA.2022.3194690
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Benjamin Walt and Girish Krishnan, Grasp State Classification in Agricultural Manipulation, in IEEE International Conference on Intelligent Robots and Systems, 2023
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Yao, S. and Hauser, K., 2023, May. Estimating tactile models of heterogeneous deformable objects in real time. In 2023 IEEE International Conference on Robotics and Automation (ICRA) (pp. 12583-12589). IEEE.


Progress 03/15/21 to 03/14/22

Outputs
Target Audience: The following target audiences were reached: 1. Researchers: This includes graduate students, undergraduate students, faculty, and postdoctoral associates on the performing team. It also includes other researchers in the community who are not part of this project, with whom we communicated through papers and presentations. These researchers were from diverse fields of agriculture, computer science, and engineering. 2. Farmers: We interacted with a number of farmers to discuss technologies and the potential of mechanical weeding. 3. Press: We interacted with several journalists to help them communicate the potential of mechanical weeding with small robots. 4. Stakeholders: We communicated with government and industry stakeholders about the potential of mechanical weeding agbots. This includes startup companies, more established companies, program managers in government funding organizations, and politicians. 5. Students: We incorporated material from this project into various undergraduate and graduate courses. We also invited and communicated with K-12 students who were interested in learning more about agriculture and the role of technology in agriculture. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? Graduate students: Graduate students worked on key aspects of the research. They were provided opportunities and mentoring to lead papers and research activities. Undergraduate students: Undergraduate students helped with key research aspects and were provided research mentorship and the opportunity to co-author papers. Postdocs: Postdocs were provided with training and mentoring to lead teams of researchers. In particular, Dr. Andres Baqueros is being trained to lead his own lab, as he is interested in an academic position.
Faculty: Faculty directed students and postdocs to drive research program objectives, and worked together to collaboratively solve problems that lie at the intersection of disciplines. Our project team members participated in the following conferences: the ASA/CSSA/SSSA International Annual Meeting, Salt Lake City, UT (ISU); the American Society of Plant Biologists (ASPB) Biology Summit (virtual) (ISU); the Neural Information Processing Systems (NeurIPS) Conference (virtual) (ISU, UIUC); the North American Plant Phenotyping Network Annual Conference (hybrid) (ISU); the Association for the Advancement of Artificial Intelligence (AAAI) Conference (virtual) (ISU, UIUC); the RF Baker Plant Breeding Symposium (hybrid) (ISU); and the American Geophysical Union Fall Meeting (OSU). Our project team members took part in demonstrations/talks including: Plant Phenomics and Artificial Intelligence to Glean Information from Plant Sensing Technologies, Genetics, Genomics and Bioinformatics Symposium, ASAS Midwest Section Meeting, March 2022 (ISU); Role of Interpretable Machine Learning in Cyber-Agricultural Systems, International Conference on Digital Technologies for Sustainable Crop Production (DIGICROP), March 2022 (ISU); A Cyber-Physical Systems Approach to Agricultural Sustainability, Japanese Society of Agricultural Informatics (JSAI) Annual Conference, May 2021 (ISU); Application of Machine Learning and Artificial Intelligence in Plant Breeding, Tri-Societies Meetings, Salt Lake City, UT (ISU); Soybean Breeding at ISU, Iowa Soybean Association, Ankeny, IA (ISU); Artificial Intelligence (AI), Human Cognition, and Ethics, CRAI-CIS Seminar Series, Department of Computer Science, Aalto University, Helsinki, Finland (GMU); Building an Infrastructure to Help Scientists Handle Big Data, Arizona Science (National Public Radio, Episode 297), Tucson, AZ (UofA); and Coupling Vegetation Biophysics and Ecohydraulics for Improved Simulation of Land-Atmosphere Carbon-Water-Energy Exchange, American Geophysical Union Fall Meeting (OSU). Our project team members provided education and outreach opportunities including: Fundamentals of Plant Physiology, online (ISU); Uni-High FarmBot Educational Outreach, AGORA Day, Champaign, IL (UIUC); and a COntext Aware LEarning for Sustainable CybEragricultural Systems poster presentation at ISU Day, Des Moines, IA (ISU). Additionally, students (including multiple women and underrepresented minority students) associated with the COALESCE project had the opportunity to participate in bi-weekly meetings to present project updates and share their research findings. These opportunities provide valuable feedback on their progress and allow them to receive direction for further research and collaboration. How have the results been disseminated to communities of interest? The key results have been published in prestigious journals such as Plant Phenomics and IEEE Robotics and Automation Letters and presented at the AI-Ag workshops at NeurIPS 2021 and AAAI 2022. All the papers are listed in products. PI Sarkar and Co-PIs have delivered multiple invited talks that included results from this project.
In order to help the nascent Cyber-Ag research community grow and thrive, the COALESCE team has organized international workshops such as the Third International Workshop on Machine Learning for Cyber-Agricultural Systems (MLCAS) and the AI for Agriculture and Food Systems (AIAFS) workshop under the umbrella of the AAAI 2022 conference. What do you plan to do during the next reporting period to accomplish the goals? Debugging the Field problem: Robotic insect monitoring and mitigation that includes the following objectives: learn areas frequented by pests, detect and distinguish harmful pests, and spray pests with pesticide locally on plants with robots. Visual servoing: Our main goal will be to utilize cameras on the tip of the robot and on the base of the robot to navigate through obstacles. Networking for field CPS: We expect to perform field experiments to answer the key questions raised through last year's analysis. Under-canopy cover-crop planting: We expect to continue to work with industry to advance under-canopy cover-crop robot planting in the 2022 season.

Impacts
What was accomplished under these goals? Formulation of the Debugging the Field problem: On average, 20% of crops are lost due to pests. Farmers are unable to systematically identify, localize, and mitigate pests in a timely and cost-effective manner. A fleet of mobile ground robots, equipped with dexterous sensors and manipulators, is uniquely suited to manage insect infestations under the canopy and increase plant yield. Currently, soft robotic arms lack the precision and robustness for such applications. A soft arm controller, based on a nonlinear autoregressive network with exogenous inputs (NARX) architecture, is being designed and tested to solve this problem. Precision within three cm has currently been shown in simulation and tested on the physical system. Improvements to this precision and added orientation tracking will soon be available. Chowdhary, Krishnan, and Arti Singh will work together on this problem. Visual Servoing for Agricultural Robots: For soft continuum arms, visual servoing is a popular control strategy that relies on visual feedback to close the control loop. However, robust visual servoing is challenging, as it requires reliable feature extraction from the image as well as accurate control models and sensors to perceive the shape of the arm, both of which can be hard to implement in a soft robot. We circumvent these challenges by presenting a deep neural network-based method to perform smooth and robust 3D positioning tasks on a soft arm by visual servoing using a camera mounted at the distal end of the arm. A convolutional neural network is trained to predict the actuations required to achieve the desired pose in a structured environment. Integrated and modular approaches for estimating the actuations from the image are proposed and experimentally compared. A proportional control law is implemented to reduce the error between the desired and current image as seen by the camera. The model, together with the proportional feedback control, makes the described approach robust to several variations such as new targets, lighting, loads, and diminution of the soft arm. Furthermore, the model lends itself to being transferred to a new environment with minimal effort. Krishnan and Chowdhary are working together on this problem. Control of Soft Continuum Arms: Soft continuum arms (SCAs) that are controlled by visual servoing (VS) present trade-offs between camera range and tracking accuracy. Cameras placed at a distance (eye-to-hand) can observe a larger workspace area and the SCA tip, while a camera at the end effector (eye-in-hand) can more accurately survey the target. In this project, we present a hybrid eye-to-hand and eye-in-hand VS scheme to track a desired object in the SCA's workspace. When the target is not in the field of view of the tip camera, eye-to-hand VS is implemented using a wide field-of-view camera on the soft robot's base, to servo the soft robot's tip to a feasible region where the target is expected to be seen by the tip camera. This region is estimated by solving an optimization problem that finds the best region to place the SCA, assuming a constant curvature model for the SCA. When the target is seen by the tip camera, the system switches to an eye-in-hand controller that keeps the target at the desired image position of the tip camera. Experimental results on the popular BR2 SCA demonstrate the effectiveness of the hybrid VS scheme under practical settings that include external disturbances. Krishnan and Chowdhary are working together on this problem. Networking for Field CPS: Emerging autonomous farm applications have diverse requirements; e.g., teleoperation and virtual walkthrough have low latency requirements, whereas data collection for precision agriculture requires high throughput. Unfortunately, the IoT network in autonomous farms has limited connectivity options and power constraints. Every task or data transfer utilizes limited resources and depletes power. It is useful to have a centralized traffic controller or scheduling system that schedules tasks such that it satisfies the requirements of tasks while managing limited resources. We will start off by studying different applications and understanding their workload requirements. Then we will study the different connectivity options that are optimal for different applications. Given these connectivity options, a centralized traffic controller can schedule jobs from different applications over different options, as per the application requirements and power constraints. Mittal, Arti Singh, and Chowdhary are working together on this problem. Foundational Advances in Planning for High-Degree-of-Freedom Robotic Manipulators: Hauser and students continued to advance the foundations of planning and reasoning AI for control of high-degree-of-freedom manipulator arms. These advances are designed to enable arms to plan more complex trajectories faster and more efficiently. The problems are inspired by difficult challenges in manipulation afforded by agricultural robotic CPS. Hauser is working on this problem. Under-Canopy Cover-Crop Planting: We developed algorithms and software for under-canopy cover-crop planting robots. We showed through numerical simulations that a team of five such robots can cover an eighty-acre field planting cover crops in under five hours when operating fully autonomously. We evaluated our algorithms for autonomy on robots created by EarthSense Inc., and demonstrated the feasibility of vision- and LiDAR-based autonomy for field operation. This has inspired further development at EarthSense toward bringing this technology to farmers as a lower-cost option for the farms of the future. These activities were conducted at the Illinois Autonomous Farm, where we also demonstrated the feasibility of autonomous control of under-canopy robots using vision sensors, by extending our work from Narenthiran et al. 2020 RSS (CropFollow). Chowdhary is working on this problem. Feasibility of Mechanical Weeding: In our latest paper, published in the prestigious IEEE Transactions on Robotics, we proposed AgBots 3.0, an algorithmic framework that leverages prediction of weed emergence patterns to optimize robot paths so that fields can be covered optimally with the fewest robots. Our results demonstrate through extensive numerical simulation that mechanical weeding robots can feasibly keep fields weed-free, and highlight ways in which cost can be reduced by minimizing the number of robots through smart planning. This work was a continuation of Chowdhary and Adam Davis' earlier NSF-CPS project funded through NIFA.
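The hybrid eye-to-hand/eye-in-hand scheme described above switches cameras based on whether the tip camera sees the target, with a proportional law driving the image-feature error toward zero in each mode. Below is a minimal sketch of that switching logic, assuming a generic proportional IBVS update u = -lambda * L_pinv @ (s - s_star); the gain value, function names, and identity interaction matrix are illustrative assumptions, and the feasible-region optimization and constant-curvature model are omitted:

```python
import numpy as np

LAMBDA = 0.5  # illustrative proportional gain, not a tuned value

def ibvs_step(s, s_star, L_pinv, lam=LAMBDA):
    """One proportional IBVS update: u = -lam * L_pinv @ (s - s_star),
    where L_pinv is the pseudo-inverse of the interaction (image
    Jacobian) matrix mapping feature error to actuation velocity."""
    return -lam * L_pinv @ (s - s_star)

def hybrid_controller(target_in_tip_fov, s_tip, s_tip_star, L_tip_pinv,
                      s_base, s_base_star, L_base_pinv):
    """Switching logic: servo with the wide-FOV base camera (eye-to-hand)
    until the tip camera sees the target, then hand off to the tip
    camera (eye-in-hand) for fine positioning."""
    if target_in_tip_fov:
        return "eye-in-hand", ibvs_step(s_tip, s_tip_star, L_tip_pinv)
    return "eye-to-hand", ibvs_step(s_base, s_base_star, L_base_pinv)

# Toy example: 2D image-feature error, identity interaction matrix
s = np.array([0.2, -0.1])
s_star = np.zeros(2)
L_pinv = np.eye(2)
mode, u = hybrid_controller(False, s, s_star, L_pinv, s, s_star, L_pinv)
```

In practice L_pinv differs per camera and per arm configuration; the sketch only shows how the two servo loops share one proportional structure.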

Publications

  • Type: Journal Articles Status: Published Year Published: 2022 Citation: W. McAllister, J. Whitman, J. Varghese, A. Davis and G. Chowdhary. 2022. Agbots 3.0: Adaptive Weed Growth Prediction for Mechanical Weeding Agbots. In: IEEE Transactions on Robotics, vol. 38, no. 1, pp. 556-568, Feb. 2022, doi: 10.1109/TRO.2021.3083204.
  • Type: Journal Articles Status: Published Year Published: 2022 Citation: S. Kamtikar, S. Marri, B. Walt, N.K. Uppalapati, G. Krishnan and G. Chowdhary. 2022. Visual Servoing for Pose Control of Soft Continuum Arm in a Structured Environment. In: IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 5504-5511, April 2022, doi: 10.1109/LRA.2022.3155821.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: S. Kamtikar, S. Marri, B. Walt, N. K. Uppalapati, G. Krishnan and G. Chowdhary. 2022. Visual Servoing for Pose Control of Soft Continuum Arm in a Structured Environment. In: RoboSoft Conference, and jointly in IEEE Robotics and Automation Letters. Edinburgh, Scotland, April 2022.