Source: CORNELL UNIVERSITY submitted to NRP
CPS: TTP OPTION: MEDIUM: TOUCH SENSITIVE TECHNOLOGIES FOR IMPROVED VINEYARD MANAGEMENT
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1019074
Grant No.
2019-67021-29225
Cumulative Award Amt.
$1,191,236.00
Proposal No.
2018-09054
Multistate No.
(N/A)
Project Start Date
Apr 1, 2019
Project End Date
Dec 31, 2023
Grant Year
2019
Program Code
[A7302]- Cyber-Physical Systems
Recipient Organization
CORNELL UNIVERSITY
(N/A)
ITHACA,NY 14853
Performing Department
Electrical and Computer Engineering
Non Technical Summary
It has been predicted that the world population will reach 10 billion people by 2050, demanding critical improvements to today's agricultural systems. This proposal addresses several novel techniques that may extend to a range of fruit, but the focus will be on viticulture for ease of access. There are 1,392 grape farms in New York State alone, with 39,216 acres of vineyards generating more than $4.8 billion in annual revenue. These producers depend on predictions of how much fruit is coming in from the vineyard to predict revenue and allocate resources such as harvest and winery labor, tank space, bottles, and other required shipping-capable packaging. Improved methods for sensing and processing data in the field will lead to more accurate yield predictions well ahead of the harvest, without added labor cost. Such systems may further permit more targeted application of fertilizers and pesticides, reducing cost and environmental impact. State-of-the-art precision agriculture focuses largely on automated visual assessment of crops in the field using high-end camera and laser ranging systems. However, in many scenarios, including when heavy foliage is present, remote vision is not adequate to determine important measures such as crop growth, ripeness, and health. This proposal aims to augment such Cyber-Physical Systems through inexpensive vision-based techniques deployed before foliage becomes excessive, and through novel soft and touch-sensitive technologies that can operate in close proximity to the crop. Specifically, the work proposed involves four integrative thrusts: 1) soft manipulators capable of safely obtaining close-range data on the fruit, 2) a soft, porous veil with integrated ultrasonic transducers for detection of microbes and fruit ripeness, 3) use of low-end cameras and machine learning techniques for automated early-season cluster counting, and 4) modeling frameworks to interpret diverse multi-modal sensor information for improved yield prediction and targeted pesticide application. The full range of acquired data will include cluster number, visual appearance, geometry, weight, temperature, elastic modulus, thermal conductivity, and humidity, processed into information on leaf area, berry count, berry hue, berry soundness, berry contact, surface wetness, microbes, cluster closure, size, and compactness. To demonstrate the advantages of introducing these technologies to agriculture, the PIs will develop an integrated robotic solution for improved vineyard management. Tracking specific clusters over days and weeks may significantly enhance yield predictions and additionally ease gathering of subsequent data. These research efforts will culminate in several in-field demonstrations, as well as a decision support system that allows farmers to make use of data collected from multiple systems in the field. This work represents a transformative step towards superior Cyber-Physical Systems for Agriculture, through the combined use of real-time quantitative global and qualitative local data.
Animal Health Component
30%
Research Effort Categories
Basic
60%
Applied
30%
Developmental
10%
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
402 | 1131 | 2020 | 100%
Knowledge Area
402 - Engineering Systems and Equipment;

Subject Of Investigation
1131 - Wine grapes;

Field Of Science
2020 - Engineering;
Goals / Objectives
Cutting-edge vision-based systems for precision agriculture are capable of rapidly gathering quantitative data about the state of the field. In this proposal we aim to lower the cost of these vision systems and augment them with dedicated systems capable of gathering qualitative, close-range data about the crop. We focus on inexpensive solutions that remain compliant and capable of interacting safely with the crop. We will combine a range of different high-resolution sensor modalities and sensor readings over time for improved understanding. Furthermore, rather than exploring stand-alone sensors, we focus on an integrated system capable of automatically gathering data from many sources in the field, and on improving current models and predictions to provide decision support for the end user. We specifically target the challenging scenarios of Northeastern vineyards, which involve cluttered natural environments with rapidly changing weather and light conditions. Although we focus on viticulture, the techniques we propose scale to a wide range of field-grown fruits. We will deploy our system on several farms for evaluation at a large scale. The proposed project will span three years; the PIs will target incremental technical development and tests annually. We have four high-level objectives:

1) Soft, touch-sensitive manipulators capable of safely interacting with fruits in the field. These will be based on inflatable polymer manipulators with integrated carbon resistive channels that can encompass clusters on the vine to measure their approximate dimensions, cluster compactness, level of ripeness, and weight. This soft skin may further help to press hard sensors safely against the skin of the berries to measure properties like temperature and humidity. Because these manipulators create an enclosure in which lighting and other factors can be controlled, we will also examine close-range vision techniques to estimate berry number, size, and hue, which have a direct impact on the quality of the harvest. We will build on the current manipulators present in our lab to increase the spatial density of sensors in the polymer; to accurately interpret data; and to incorporate automated sanitation to avoid transferring diseases between clusters. We may also look into the design of collapsible and flexible manipulators for easier motion.

2) Ultrasonic transducers to detect, monitor, and classify destructive fungal organisms on the fruit and their growth in the field. PI Lal will continue to develop stand-alone integrated ultrasonic transducers capable of measuring the acoustic impedance of contacting organisms with a resolution down to 20 µm. PI Shepherd will integrate these transducers into porous silicone veils that can sustain season-length deployment on the surface of the fruit. The porosity should permit the clusters to grow as normal. The veils will be passive; however, upon application of external heat they will contract, translating the ultrasonic sensor over the surface of the berry much like a scanner. It is possible that the natural heat cycle throughout the day will provide enough variation to produce repeated scans. We will invest effort into permitting off-board power to the sensor, such that a nearby robot manipulator can power and download data from the chip. We will gather hundreds of samples of fungal organisms of high concern to local growers, and use supervised learning to train classifiers to detect them.
We will work closely with pathologists to support feature engineering for these classifiers.

3) Low-cost vision-based estimation of the number of clusters on the vine, using simple camera phones and machine learning. We will show that recordings from low-end camera phones operated by farmers (or autonomous robots) can be used to count the number of clusters on the vine with high relative correlation to harvest counts, by using state-of-the-art convolutional neural networks and trackers. By running our tests at night we can ensure stable lighting conditions, and by further deploying this scheme very early in the season (before bud break), we may prevent occlusions from foliage from impacting our results. We will further use this method to detect the amount of foliage and the fruit-to-leaf area ratio, and potentially perform early identification of shoots, which will indicate the location of clusters later in the season. Beyond directly impacting the yield prediction, approximate localization of clusters may help support simpler path planning methods for the automated robot arms on which the aforementioned manipulators will be mounted, as they move through the cluttered environment near the vine.

4) Development of a decision support framework for farmers and stakeholders. We will integrate data from traditional vision-based techniques and novel touch sensors in the field with computational models for improved yield predictions. With the end user in mind, we will incorporate these into a simple decision support framework for local farmers and evaluate it through a Transition-to-Practice option.
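To make the count-by-tracking idea in objective 3 concrete, the following is a minimal sketch rather than the final pipeline: a trained detector proposes cluster boxes per frame, and a greedy intersection-over-union (IoU) tracker links boxes across frames so each physical cluster is counted only once. The function detect_clusters is a hypothetical stand-in for any trained CNN detector.

    # Minimal count-by-tracking sketch. `detect_clusters` is a hypothetical
    # stand-in for a trained CNN detector returning (x1, y1, x2, y2) boxes.

    def iou(a, b):
        """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter + 1e-9)

    def count_unique_clusters(frames, detect_clusters, iou_thresh=0.3):
        """Count unique clusters in a video by greedy frame-to-frame linking."""
        total, active = 0, []                    # `active`: boxes tracked from the previous frame
        for frame in frames:
            current = []
            for box in detect_clusters(frame):
                best = max(active, key=lambda t: iou(t, box), default=None)
                if best is not None and iou(best, box) >= iou_thresh:
                    active.remove(best)          # matched: continue an existing track
                else:
                    total += 1                   # unmatched: first sighting of a cluster
                current.append(box)
            active = current                     # tracks not re-detected are dropped
        return total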
Project Methods
The proposed work will tie together several existing technologies towards the aim of automatically assessing vines in the field. It relies on a holistic and interdisciplinary approach spanning the fields of viticulture, pathology, electrical engineering, computer science, and mechanical engineering. Several efforts will be addressed in order to reconstruct cluster geometry and thus meet objective 1. These include 1) continued development of a soft pneumatic end effector, 2) integration and improvement of resistive touch sensors, and 3) correlation of sensor outputs with cluster geometry, compactness, integrity, and ripeness. To evaluate the performance of the system, cluster integrity will also be estimated manually by determining the proportion of damaged berries (reflected by loss of berry membrane integrity) on each cluster. In contrast to traditional rigid end effectors, soft manipulators are inexpensive, compliant, durable, and can be brought into contact with the crop without risk of damage. A circular pneumatic chamber will be molded out of food-safe Smooth-Sil® polymer with a Shore hardness of 35-40 to avoid berry damage, and mounted against a hard outer shell. We will integrate a conductive gel based on carbon black into resistive channels in the chamber. Due to quick, imprecise manufacturing and material hysteresis, the absolute resistance is a poor indicator of cluster geometry; however, by controlling the pressure in the chamber and computing only the differential resistance as the pressure is changed, we can accurately determine when contact is made with the fruit. Using a compressor and feedback control, we can accurately control the inflation of the chamber. Upon initiation, the chamber will be under negative pressure, revealing a large opening into which the cluster can be placed. By releasing a valve back to ambient pressure, the polymer will attempt to shrink to its original state and encompass the convex geometry of the cluster. Through further inflation, it will fill the concave regions of the cluster. Finally, with added inflation we may check the hardness (related to the ripeness and integrity) of the berries. An Arduino Uno R3 will control the manipulator and collect and process data from the sensor channels. Using the enclosed space within the manipulator, we will further test close-range cameras and artificial lighting to determine berry hue. To have the end effector automatically reach the clusters, we will augment the pneumatic chamber with a camera and mount it on a 5-degree-of-freedom commercial robot arm from Universal Robots. Using prior knowledge from visual assessment of clusters on the vine, as well as close-range cameras, we will produce sparse path plans to guide the manipulator in place.
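As a minimal illustration of the differential-resistance contact test described above, the sketch below modulates chamber pressure by a small step and flags the sensor channels whose resistance tracks the step. The wrappers set_pressure and read_resistances are hypothetical interfaces to the Arduino-controlled hardware, and the step size and threshold are placeholders to tune on the real device.

    # Sketch of the differential-resistance contact test, assuming
    # hypothetical hardware wrappers `set_pressure` (kPa) and
    # `read_resistances` (one reading per carbon channel, in ohms).

    import time

    def contact_map(set_pressure, read_resistances, p0, dp=0.5, thresh=0.02):
        """Flag channels whose resistance tracks a small pressure step."""
        set_pressure(p0)
        time.sleep(0.2)                  # allow the chamber to settle
        r_low = read_resistances()
        set_pressure(p0 + dp)
        time.sleep(0.2)
        r_high = read_resistances()
        # Absolute resistance is unreliable (hysteresis, manufacturing
        # spread), so decide contact from the relative change instead.
        return [abs(rh - rl) / rl > thresh for rl, rh in zip(r_low, r_high)]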
To comply with objective 2, we will develop stand-alone integrated devices based on previously developed ultrasonic transducer arrays, and integrate these into polymer veils that will be attached to clusters. Grape ripeness, along with surface perturbations such as fungal or bacterial growth, are important indicators of grape health. Using the ultrasonic response of the fruit surface and bulk, one can image the acoustic impedance of the grape tissue, which depends on its Young's modulus and density. We aim to measure the ultrasonic impedance at GHz frequencies and image the grape surface using focused ultrasonic beams to identify surface fungal pathogens, using tiny chips that can be interfaced to grape surfaces. A set of sonar beams, formed by phased arrays of transducers, can interrogate the surface in contact with the chip. Each sonar beam transmits GHz pulses that reflect off whatever touches the chip surface. The signal reflected from the material contacting the chip encodes the ultrasonic impedance through the reflection coefficient. The complex reflection coefficient can be measured from the in-phase and quadrature components of the reflected pulses, and measured across the bandwidth of the transducers. We will inoculate Botrytis and powdery mildew on agar and on real grape surfaces, in both the lab and the field, to test this technique.
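As an illustration of the reflection-coefficient measurement above, here is a minimal sketch assuming per-pixel in-phase/quadrature (I/Q) samples and a known reference echo (e.g., air-backed); the array names are hypothetical, and the impedance inversion uses the standard boundary relation Gamma = (Z_load - Z0) / (Z_load + Z0).

    # Sketch of recovering a complex reflection coefficient from I/Q
    # samples of the reflected pulse, normalized by a reference echo.

    import numpy as np

    def reflection_coefficient(i_sig, q_sig, i_ref, q_ref):
        """Complex reflection coefficient: echo divided by reference echo."""
        echo = np.asarray(i_sig) + 1j * np.asarray(q_sig)
        ref = np.asarray(i_ref) + 1j * np.asarray(q_ref)
        return echo / ref        # |Gamma| drops as energy couples into the load

    def load_impedance(gamma, z0):
        """Invert Gamma = (Z - Z0)/(Z + Z0) for the load's acoustic impedance."""
        return z0 * (1 + gamma) / (1 - gamma)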
In addition to ultrasonic impedance imaging, the technology enables measurement of surface thermal conductivity and temperature, providing multi-dimensional data for higher confidence in determining grape health. The elasticity of the grape surface, any water on the berry surface, and microbes will each respond differently and can be used to identify the surface. The time-of-flight is a function of the speed of sound in the silicon chip. As an object hotter or cooler than the chip touches the surface, the temperature of the silicon chip will change as the thermal front moves from the surface into the silicon. In the case of grapes, interfaced through a thin protective plastic layer on the sensors, the grapes will likely remain in constant contact with the sensor chips, thermally equilibrating with the sensor. To measure the thermal conductivity, we will heat the sensor chip 1-3 °C above ambient using integrated resistors, launching a thermal wavefront from the chip into the fruit surface, an active approach to measuring the surface thermal conductivity. We will develop porous silicone veils which can be mounted directly on a subset of fruit in the field. The veil will be 3D printed using stereolithography according to a chemistry and process previously published by PI Shepherd. Silicone has a linear coefficient of thermal expansion of 0.0007/°C, so a 100 mm veil will stretch 700 µm when heated by just 10 °C. We will apply the heat using the aforementioned robot arm; in the future, this heat could potentially be induced simply by the daily cycle of the sun. The silicone will be both transparent and porous to permit the fruit and microbes to experience normal conditions. The maximum strain of the silicone is 200%; the mesh-like structure will further ensure high flexibility so as not to impede cluster growth. Finally, we will implement readout methods that provide veil-scale ensemble measurements using RF interrogation. Measurements made without physical contact from a nearby robot are best accomplished via RF readout; to read the transducers at a small distance, we will explore the use of RF backscatter from the piezoelectric transducers.

To comply with objective 3, we will mount a camera phone capable of high-speed recording (240 fps) on a gimbal, alongside LED projectors, on a garden tractor. We will drive through the rows after dusk to image the vines over several weeks early in the season (typically May). We will then hand-label a number of clusters in the videos and use supervised training of convolutional neural networks to automatically identify clusters in the frames, with standard tracking schemes to correlate clusters between frames. We can use similar methods to extract the amount and growth of foliage, the presence and location of shoots, and the average leaf area. To evaluate the performance of these algorithms, the automated cluster counts will be compared with harvest counts at the end of the season.

Where the first three objectives target development of basic technological features, objective 4 aims to produce a coherent system of direct use to the farmer. All of the data will be measured over the full season and correlated with the measured yields from the vine. A decision support system based on a user-friendly GUI for farmers will explain how, when, and where to set up the robot, and inform them of the predicted yield along with the uncertainty of the prediction. The accuracy of the system will be checked against manual observations in the following year, or against data gathered simultaneously from other areas of the vineyard.

Progress 04/01/19 to 12/31/23

Outputs
Target Audience: Our target audience was an interdisciplinary group composed of vine growers; researchers across robotics and automation, MEMS devices, and enology; and high school and college students. Changes/Problems: The original grant was awarded for work on soft robots in vineyards. Unfortunately, a confluence of student visa issues and the pandemic hindered access to the lab. To keep up momentum, we switched focus early on to developing new low-cost methods for vineyard management using smartphones, especially targeting small- to medium-sized vineyards in the Northeastern U.S. where foliage is significant. What opportunities for training and professional development has the project provided? One graduate student defended his thesis on this project, publishing 2 papers on the topic of low-cost smartphone-based applications in precision viticulture, including with the IEEE Intl. Conference on Robotics and Automation 2024 and Frontiers in Agronomy in 2021. The student has another paper in submission with OENO One and a review paper in preparation. In addition, his software is now freely available to growers through the My Efficient Vineyard (myEV) app. Two graduate students worked with an M.Eng. student and co-authored a paper on the soft robotic gripper for shape estimation, which was published in IEEE Robotics and Automation Letters in 2020. Two graduate students worked on the topic of ultrasonic detection of Botrytis spores, publishing three conference papers in the 2019 IEEE Intl. Electron Devices Meeting, the 2020 Joint IEEE Intl. Frequency Control Symposium and Intl. Symposium on Applications of Ferroelectrics, and the 2021 IEEE Intl. Ultrasonics Symposium. We further provided opportunities for four undergraduates to help develop software based on computer vision and convolutional neural networks for various viticulture-related data sets. An M.Eng. student helped develop the initial iPhone app which was eventually incorporated into the myEV app; another worked on automating a stage and pipetting system for testing the ultrasonic sensors. One M.Sc. student worked to develop flexible polymer sheets to keep the ultrasonic sensors in close contact with the grapes; unfortunately this project was abandoned when lab access was restricted during the pandemic. Another four students helped gather data from the vineyard. We established the "Robots, Vine, and Food" cross-college seminar class in 2020, which catered to 35+ students, often with additional alumni and faculty checking in depending on the topic of the particular seminar. The main PI, under guidance of the senior PIs, is now an executive member of the Cornell Institute for Digital Agriculture, and has been promoted to associate professor with tenure. How have the results been disseminated to communities of interest? We reached vine growers through presentations at tailgate meetings such as the Shaulis Symposium on New Tools for Precision Management arranged by the American Society for Enology and Viticulture in July 2019, the Cornell Institute for Digital Agriculture's annual retreat 2019-2022, and the first Cornell Robotics Industry Day in November 2019. We have also enabled new features for low-cost automated cluster counting in the https://my.efficientvineyard.com/ app, which is freely available to growers as of Summer 2024. Our work has also been covered in popular magazines such as the Cornell Chronicle, MorningAgClips, the Florida Times, and Corbeau Innovation.
Finally, we organized an open-access panel on Agricultural Robotics around the World as part of the Annual Retreat for the Cornell Institute for Digital Agriculture in 2022. We reached researchers through 3 journal publications, 4 conference papers and presentations, 3 poster presentations, and 7 invited seminars and workshop talks. We have another two articles to be submitted this summer. The PI further joined the Executive Committee of the Cornell Institute for Digital Agriculture in 2021, and as part of this group continues to help manage annual symposiums, international hackathons, and Research Innovation Funds internal to Cornell for PI and graduate summer effort grants related to digital agriculture. The PI further moderated a panel at Grow NY in 2023. We reached college students, other faculty, and alumni across life sciences and engineering by co-teaching a seminar class, "ECE6680 / VIEN4940: Robots, Wine, and Food", on precision and digital agriculture in 2020. The class was cross-listed between plant sciences and the college of engineering and attended by approximately 35 undergraduates, graduates, faculty, and alumni with great success. This class stimulated enough excitement between the colleges to later instigate the introduction of a digital agriculture minor for undergraduates. Finally, we gave semi-annual outreach talks to high school students, college freshmen, and the general public, including at the Ithaca Science Center, the South Seneca School District, the Cornell Keeton House, Science on Tap, and as part of the week-long workshops for high school students, "Fabricating the Future". What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? Soft Robots for Cluster Assessment: Our original focus area on soft robotic grippers yielded a new low-cost, easy-access method for touch-based shape reconstruction, based on soft, resistive carbon composite sensors capable of up to 135% strain. We showed how this fabrication technique could be easily customized to two different applications: a stretchable, tactile interface for passive sensing, and an active, soft pneumatic gripper that could fully encompass an object to reconstruct its shape. GHz Ultrasonic Imagers for Botrytis Detection: We showed how our novel technology based on a GHz ultrasonic CMOS integrated imager could be used to detect Botrytis cinerea, a necrotrophic fungus that can infect grapes. The infection is caused by spores which disperse through air and is initially asymptomatic. By the time Botrytis becomes visible, the crop is already heavily infected and it is too late for effective control. Preventative efforts include leaf removal, epidermal waxes, and excessive fungicide application. Currently, detecting Botrytis involves multispectral microscopy, which requires expensive processing by offsite laboratories. The introduction of a handheld device based on GHz ultrasonic sensing for early detection of Botrytis may significantly lower management costs and fungicide applications. The concept of our technology is to place the skin of a grape against the imager, then apply nutrients regularly to speed up growth such that the fungus can be detected before the actual grapes become symptomatic. We have an internal summer grant to further this technology towards transition to practice, in collaboration with the company Geegah. Low-Cost, Easily Accessible Computer Vision-Based Methods to Improve Vineyard Management: Targeting easy and rapid technology adoption by small and medium-sized grapevine growers, particularly in the Northeast where rain and soil nutrients lead to additional foliage, we developed several new methods to automate vineyard management using smartphone cameras and computer vision. Cluster Closure: We developed methods to monitor grape clusters continuously throughout the season, which may serve as a valuable open-source research tool for studying grapevine cluster phenology and the factors affecting cluster closure, i.e., the contacting surface area between the berries. Berries with higher contacting surface area also have increased susceptibility to diseases. When clusters close, they are subject to poor air circulation and microcracks in the cuticle membranes, and it becomes harder for fungicides to penetrate to the interior. While other automatic techniques exist, many rely on cumbersome and imprecise 3D scanning techniques, or on expensive equipment which is inaccessible to growers. We took 809 cluster images with a smartphone of Riesling, Pinot gris, and Cabernet Franc from fruit set to veraison, and analyzed these using two image segmentation methods (a Pyramid Scene Parsing Network and Otsu's image thresholding), both demonstrating high accuracy. Interestingly, we found that the cluster closure curve showed an asymptotic trend, with a higher rate of progression observed in the first three weeks, followed by a gradual approach towards an asymptote. Based on these results, we propose that the number of weeks past berry set at which the cluster closure progression curve reaches the asymptote should be considered the official phenological stage of cluster closure.
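As an illustration of the thresholding step, the sketch below computes a percent-closure estimate from a grayscale cluster image and a segmentation mask using OpenCV's Otsu threshold. Treating bright pixels as berry surface and dark pixels as inter-berry gaps is a simplifying assumption here, not the exact formulation of the published method.

    # Sketch of an Otsu-based percent-closure estimate, assuming OpenCV,
    # an 8-bit grayscale image, and a cluster mask from the segmentation
    # network (e.g., PSPNet).

    import cv2
    import numpy as np

    def percent_closure(gray, cluster_mask):
        """%CC = berry pixels / all pixels inside the cluster mask."""
        pixels = gray[cluster_mask > 0].reshape(1, -1)   # 8-bit values inside the cluster
        # Otsu picks the threshold separating the bimodal histogram of
        # bright berry surfaces vs. dark gaps between berries.
        thresh, _ = cv2.threshold(pixels, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return 100.0 * float(np.count_nonzero(pixels > thresh)) / pixels.size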
To reach transition to practice for this grant, we submitted a USDA/NIFA CPPM proposal on Incorporating Cluster Closure into the Grape Botrytis Prediction Model earlier this year. Yield Prediction: We developed tools for low-cost, computer vision-based, pre-bloom yield prediction in vineyards. Yield estimates are used prior to harvest to allocate resources such as labor, tank space, and packaging, and to predict revenue. Traditional methods for estimating the number of grape clusters in a vineyard involve manual counting, which is laborious, costly, and of an accuracy that depends on the sample size. We demonstrated that traditional cluster counting has a high variance in accuracy and is highly sensitive to the particular counter and to the choice of the subset of counted vines. Our method detects, tracks, and counts clusters and shoots in videos collected using a smartphone camera that is driven or walked through the vineyard at night. With a random selection of calibration data, our method achieved an average cluster count error of 4.9% across two growing seasons and two cultivars by detecting and counting clusters. The proposed method can be deployed before flowering, while the canopy is thin, which improves maximum visibility of clusters and shoots, generalizability across different cultivars and growing seasons, and earlier yield estimates compared to prior work in the area. We submitted and received a grant on Building Cluster Counting into My Efficient Vineyard to Aid Yield Prediction from the Pennsylvania Wine Marketing and Research Board. As part of this effort, the software was released on a trial basis this summer. Pruning Weight: Pruning weight is indicative of a vine's ability to produce crop the following year, informing vineyard management. Current methods for estimating pruning weight are costly, laborious, and/or require specialized know-how and equipment. We recently demonstrated an affordable, simple, computer vision-based method to measure pruning weight using a smartphone camera and structured light, which produces results better than state-of-the-art techniques for vertical shoot position (VSP) vines, and demonstrated initial steps towards estimating pruning weight in high cordon (HC) procumbent vines such as Concord. The simplicity and affordability of this technique lend itself to deployment by farmers today or on future viticulture robotics platforms. We achieved R2 = 0.80 for VSP vines (better than state-of-the-art computer vision-based methods) and R2 = 0.29 for HC vines (not previously attempted with computer vision-based methods). We are currently exploring the addition of this software to the myEV app.
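The structured-light step can be illustrated with a short sketch: isolate the laser-lit pixels in each night frame by color thresholding. This assumes OpenCV and a red line laser; the HSV bounds are placeholders to tune per camera and laser.

    # Sketch of isolating the structured-light line in a night frame by
    # color thresholding; the bounds below are illustrative placeholders.

    import cv2
    import numpy as np

    def laser_mask(frame_bgr, lo=(0, 120, 150), hi=(12, 255, 255)):
        """Binary mask of pixels lit by the line laser in one video frame."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        # Morphological opening removes speckle, leaving contiguous lit wood.
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))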
Crop Coefficient: The crop coefficient is used by farmers to estimate evapotranspiration, which in turn informs the irrigation and water consumption needs of their field. The current state of the art relies on multispectral satellite or UAV-based imaging, or on expensive LiDAR sensors deployed on mobile platforms. A number of computer vision-based methods have also been released to measure the leaf area as a proxy for the crop coefficient. Previous work has demonstrated a relationship between the crop coefficient and the surface area of shade produced by the vine canopy. A manual method relies on a solar cell-based sensor which is placed under the vines at noon to approximate the shade. This is cumbersome, and often leads to inaccuracies because of mischaracterizations of the current-sunlight area curve. We developed a method for estimating the crop coefficient based on smartphone recordings. This method leverages structure-from-motion photogrammetry and image segmentation neural networks to extract geometrical data from the ground under the vine. We tested this on 10 panels of vertical shoot positioned Riesling at three separate times of the day, for a total of 30 datapoints in 2023. We also collected separate datasets for generating labeled images for training the segmentation neural network. Results correlated with solar cell-based methods with R2 = 0.68, and the image segmentation network validated at 0.91 intersection over union (the IoU metric is sketched below). Review: Our work has led us to conduct a comprehensive review of the state of the art in viticulture sensing techniques as they relate to pruning, crop coefficient, and canopy. We are in the process of writing up this review, comparing and contrasting methods not just on accuracy, but on applicability to different grape cultivars and pruning methods, and on the potential for technology adoption by farmers.
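For reference, the intersection-over-union metric used to validate the segmentation network above can be computed as in the following sketch, assuming binary numpy masks for prediction and label.

    # Intersection-over-union of a predicted mask against a labeled mask.

    import numpy as np

    def mask_iou(pred, truth):
        """IoU of two binary masks of equal shape."""
        pred, truth = pred.astype(bool), truth.astype(bool)
        union = np.logical_or(pred, truth).sum()
        return np.logical_and(pred, truth).sum() / union if union else 1.0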

Publications

  • Type: Journal Articles Status: Published Year Published: 2023 Citation: Trivedi, Manushi, Yuwei Zhou, Jonathan Hyun Moon, James Meyers, Yu Jiang, Guoyu Lu, and Justine Vanden Heuvel. "A Preliminary Method for Tracking In-Season Grapevine Cluster Closure Using Image Segmentation and Image Thresholding." Australian Journal of Grape and Wine Research (2023).
  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2024 Citation: Jonathan Jaramillo, Aaron Wilhelm, Nils Napp, Justine Vanden Heuvel, and Kirstin Petersen. "Inexpensive, Automated Pruning Weight Estimation in Vineyards." In 2024 International Conference on Robotics and Automation (ICRA), pp. TBD. IEEE, 2024.
  • Type: Journal Articles Status: Submitted Year Published: 2024 Citation: Jonathan Jaramillo, Justine Vanden Heuvel, and Kirstin Petersen. "Computer Vision-based Estimation of Crop Coefficient in Vineyards using a Smartphone Camera." In submission with OENO One.


Progress 04/01/22 to 03/31/23

Outputs
Target Audience: The graduate student presented a poster at an ICRA workshop in May 2022 and did a high school outreach event at the South Seneca School District in New York State in May 2023. We further have a journal article accepted with minor revisions to the Australian Journal of Grape and Wine Research. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? The graduate student who was working on the GHz ultrasonic imagers graduated in Fall 2022 and started a postdoc at Princeton University. A new graduate student will take over their work and will be funded through a Cornell Institute of Digital Agriculture grant to do a proof of concept of this imager deployed on the Cornell research farm to image fungal spore populations. The second graduate student is working on the vision-based vineyard estimation tools and, upon submitting his last 3 papers, aims to graduate late Fall 2023. How have the results been disseminated to communities of interest? We are awaiting a publication in the Australian Journal of Grape and Wine Research. Furthermore, we conducted an outreach workshop at a local high school and partook in a field robotics workshop as part of ICRA 2022. What do you plan to do during the next reporting period to accomplish the goals? We are in the final stages of the project and are focusing on resulting publications.

Impacts
What was accomplished under these goals? We are in a no-cost extension (NCE) year, aiming to wrap up several papers related to aim 3. First, we finalized our study of tracking in-season grapevine cluster closure using image segmentation and thresholding. Grapevine cluster morphology significantly affects disease risk in vineyards, as diseases such as Botrytis and black rot can spread faster when berries have higher contact surface area, resulting in an increased duration of wetness. Cluster closure (CC) is the stage when berries touch each other. This study used mobile phone images to develop a direct quantification method for tracking CC in three grapevine cultivars (Riesling, Pinot gris, and Cabernet Franc). A total of 809 cluster images from fruit set to veraison were analyzed using two image segmentation methods: (i) a Pyramid Scene Parsing Network (PSPNet) to extract cluster boundaries, and (ii) Otsu's image thresholding method to calculate %CC based on gaps between the berries. PSPNet produced high accuracy (mean accuracy = 0.98, mean intersection over union (mIoU) = 0.95), with mIoU > 0.90 for both cluster and non-cluster classes. Otsu's thresholding method resulted in < 2% falsely classified gap and berry pixels. The progression of CC was described using basic statistics (mean, standard deviation) and a curve fit. The CC curve showed an asymptotic trend, with a higher rate of progression observed in the first three weeks, followed by a gradual approach towards an asymptote. We propose that the X value (in this example, number of weeks) at which the CC progression curve reaches the asymptote be considered the official phenological stage of CC. The developed method provides a continuous scale of CC throughout the season, potentially serving as a valuable open-source research tool for studying grapevine cluster phenology and the factors affecting CC.
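A minimal sketch of such an asymptotic fit follows, assuming a saturating-exponential form CC(t) = A * (1 - exp(-k * t)); the functional form is an assumption here, as the study specifies only "a curve fit" approaching an asymptote.

    # Sketch of fitting the asymptotic %CC progression with a saturating
    # exponential; A is the asymptote and k the rate.

    import numpy as np
    from scipy.optimize import curve_fit

    def cc_model(weeks, a, k):
        return a * (1.0 - np.exp(-k * weeks))

    def fit_closure(weeks, cc_percent):
        """Fit asymptote A and rate k to weekly %CC observations."""
        (a, k), _ = curve_fit(cc_model, weeks, cc_percent, p0=(90.0, 0.5))
        # Week at which the fit comes within 1% of its asymptote, a
        # candidate for the proposed phenological stage of cluster closure:
        t_closure = -np.log(0.01) / k
        return a, k, t_closure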
Second, we are in the process of finalizing our analysis and write-up of our new approach to low-cost, vision-based estimation of pruning weight and crop coefficient, which in turn inform the Ravaz index and the capacity of the vine for the next growing season. Fruit weight is collected (on a per-vine basis) at harvest and pruning weight is collected in the following spring. This collection is arduous for grape growers, so they rarely quantify it, despite the fact that it should guide the pruning practices for the following spring to ensure the appropriate crop load for the upcoming season. To estimate the pruning weight, our technique involves capturing videos at night using an iPhone mounted alongside an inexpensive line laser. We have developed a computer vision pipeline to process these images. On vertically positioned vines (common for wine grapes), our method performs as well as the state of the art (R2 = 0.78); on sprawl canopy (common for juice grapes) it performs markedly worse (R2 = 0.3) due to ineffective capture of horizontal growth. We are currently testing ways to improve the latter by extending the reconstruction techniques to include 3D spatial information about the vine structures. The crop coefficient is a vineyard parameter that is essential to estimating and managing water usage in grapevines in arid climates. Moreover, it is used as a metric for overall vine size in studies and experiments comparing different vine treatments or testing new rootstock hybrids. The crop coefficient is most easily measured by estimating the shaded percentage of the vineyard at high noon. We collected video and Paso Panel (a technique that uses solar panels) data in New York and California to begin development and testing of the computer vision pipeline. We have generated annotated training data for the machine learning models and tested 3D spatial models using monocular photogrammetry. These models in conjunction should provide an accurate estimation of the crop coefficient. A variant of this computer vision model is also being developed to measure canopy size by estimating exposed leaf area in the canopy instead of shaded area.
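For reference, the R2 values quoted above can be computed as the coefficient of determination between per-vine predictions and scale measurements; a minimal sketch, assuming plain numpy arrays:

    # Coefficient of determination between predicted and measured weights.

    import numpy as np

    def r_squared(predicted, measured):
        """R2 = 1 - residual sum of squares / total sum of squares."""
        predicted, measured = np.asarray(predicted), np.asarray(measured)
        ss_res = np.sum((measured - predicted) ** 2)
        ss_tot = np.sum((measured - measured.mean()) ** 2)
        return 1.0 - ss_res / ss_tot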

Publications

  • Type: Journal Articles Status: Accepted Year Published: 2023 Citation: Manushi Trivedi, Yuwei Zhou, Jonathan Hyun Moon, James Meyers, Yu Jiang, Guoyu Lu, and Justine Vanden Heuvel. "Preliminary method for tracking in-season grapevine cluster closure using image segmentation and image thresholding". Australian Journal of Grape and Wine Research.
  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2022 Citation: Jonathan Jaramillo, Justine Vanden Heuvel, Kirstin Petersen. 2022. Towards Low-cost, Vision-based Estimation of Yield and Crop Load in Vineyards. In ICRA 2022 Workshop on Agricultural Robotics and Automation, May 27, 2022, Philadelphia, PA.


Progress 04/01/21 to 03/31/22

Outputs
Target Audience: This is a highly interdisciplinary grant; our target audience therefore spans researchers in engineering, computer science, robotics, and plant sciences, as well as growers. We have reached the former through invited academic seminars (Cornell Plant Sciences and UPenn MEAM), workshop talks (IEEE/RAS Intl. Conference on Soft Robotics), and panel sessions (the Cornell Institute for Digital Agriculture (CIDA) online open-access panel on Agricultural Robotics around the World, and GrowNY on agricultural robotics). We also published on our gigahertz imaging technology for Botrytis in the 2021 IEEE Intl. Ultrasonics Symposium (IUS). Our paper in Frontiers in Agronomy was featured on numerous news sites around the world, including the Cornell Chronicle, MorningAgClips, the Florida Times, and Corbeau Innovation. We know these articles reached growers and start-up companies, because we received frequent emails expressing interest after the fact. Promoting the topic of digital agriculture, the PIs further served on the CIDA executive committee, partook as judges in the CIDA Annual Intl. Hackathon, and served on the CIDA RIF student award committee. Changes/Problems: There have been no major changes to our research plan over the last budget period, with the exception of various COVID-related episodes that have left individual researchers out of office for several weeks. The main PI is also headed on maternity leave in the Fall of 2022, but we do not expect that this will interfere significantly with our progress, as the graduate students are all at a senior level and quite independent by now. What opportunities for training and professional development has the project provided? For undergraduates, professional development involved internships to help us gather data on clusters and canopy density in the field. For graduate students, it involved paper writing and oral presentations (at IUS 2021), as well as seminar presentations (Cornell Plant Sciences seminar series and Houghton Engineering seminars); one graduate student was further featured in the Cornell Engineering spotlights. For the junior PI, it involved membership of the CIDA executive committee at Cornell, seminar and workshop talks, and acting as panel moderator (the CIDA annual symposium on Agricultural Robotics Around the World and the GrowNY panel on Robots and Agriculture). How have the results been disseminated to communities of interest? To reiterate, one paper was published at the IUS 2021 conference, the Frontiers in Agronomy paper from the last reporting period received numerous press mentions in magazines popular with growers, and our work was presented at both academic seminars and research workshops. What do you plan to do during the next reporting period to accomplish the goals? This next period will be the final period of the grant (we are on a NCE). During this period we intend to continue tests with the touch-based ultrasonic transducer on additional species of fungus, test the touch-less ultrasonic transducer on real grapes, wrap up our proposed methods to automatically estimate the Ravaz index, and wrap up and deploy our cluster closure analysis. Furthermore, we are working with Terry Bates, the driver behind myEV (https://my.efficientvineyard.com/login), an online platform for vineyard owners, to incorporate our cluster/shoot detection algorithms.
As part of this effort, co-PI Vanden Heuvel also submitted a proposal to the Pennsylvania Wine Marketing and Research Board ($53,695, title: Building Cluster Counting into My Efficient Vineyard to Aid Yield Prediction) to fund the effort to integrate the cluster counting software into myEV.

Impacts
What was accomplished under these goals? Objective 2. Ultrasonic transducers to detect, monitor, and classify fungal organisms: We recently published on our method to detect Botrytis cinerea, a necrotrophic fungus that can infect grapes, with the GHz ultrasonic CMOS integrated imager from co-PI Lal's lab. To do this, we applied a Botrytis spore solution (~0.1 spores/μm) to the imaging surface of the ultrasonic imager, and then pumped water onto the surface periodically to compensate for water evaporation and water consumption by the spores. We demonstrated that we can detect and record the presence of Botrytis and its growth cycle, and that the results of our ultrasound imaging are consistent with those of optical imaging. We also compared the Botrytis scans with those obtained from Metschnikowia pulcherrima yeast, another ubiquitous organism on the surface of grapes. The experiments, consistent with expectations, show that Botrytis grows slowly for the first 6-7 hours and then rapidly, whereas the yeast grows exponentially at first and then almost linearly, which helps us disambiguate between them. While there is more work to be done to make this method available to farmers, and to make it work directly with grape skins, it demonstrates a pathway to early fungus detection directly on the farm, omitting the need to wait for the return of off-site lab results. Consequently, it is an opportunity to mitigate prophylactic use and overuse of fungicides. Since this publication, we have had two major achievements: 1) automating and miniaturizing the peristaltic pump which delivers water to the sample, for integration on the PCB; and 2) designing 100 kHz electronics to process reflected air-borne ultrasonic signals for touchless evaluation of grape shape and moisture content. The latter setup consists of an imaging system composed of a spiral array of 40 kHz MA40S4S transducers, an acoustic levitation frame, and an FSL40 XYZ stage. Currently, we have demonstrated its ability to 'image' a coin remotely. Objective 3. Low-cost vision-based techniques for vineyard management: With our method for automated cluster and shoot counting concluded in early 2021, we moved on to focus on 1) a smartphone app to help growers capture good videos in the field for post-processing by our machine learning approach, and 2) new methods based on similar low-cost computer vision techniques to also estimate the Ravaz index and canopy area, both of which can inform improved vineyard management. Our new iPhone app ensures the correct camera settings and capture frame rates, and guides the user on when to record videos and where to position the phone with respect to the vines. Finally, it provides an overview of sampled data alongside GPS data on where videos were captured, and gives users the ability to upload videos to a local database which can run the cluster detection software. The Ravaz index is calculated as yield divided by pruning weight, i.e., the ratio of reproductive to photosynthetic vegetation, and informs the capacity of the vine for the next growing season. Fruit weight is collected (on a per-vine basis) at harvest and pruning weight is collected in the following spring. This collection is arduous for grape growers, so they rarely quantify it, despite the fact that it should guide the pruning practices for the following spring to ensure the appropriate crop load for the upcoming season. Over the last year we refined our data capturing and processing methods.
Over two seasons, we have captured full 3D scans of 4-6 rows of Riesling using an Azure Kinect LiDAR and an iPhone recording video at night, mounted alongside inexpensive, lightweight line lasers pointed at the vine. The former will help us ground-truth our data; the latter is the (inexpensive and easily accessible) technology intended for the grower. To process the Azure data, we filter the point clouds to remove sensor noise, crop out the ground, use iterative closest point (ICP) registration to generate transformation matrices between adjacent frames, and finally stitch the point clouds back together to achieve a full 3D scan of every row, zeroing out the rotation component of the ICP registration transformation matrix (a minimal stitching sketch appears below). To process the iPhone video, we simply threshold the images to extract where the line lasers illuminate the growth, stitch all images together for a full 2D scan of the row, and then calibrate for drift and gait signatures using optical flow. We are now working on automatically distinguishing shoots from canes, wires, and posts. Towards the same measure, we spent the summer working on canopy estimation. Previous work indicates that sunlight-exposed leaf area is a more important metric than total leaf area for informing vineyard irrigation practices. However, conventional methods for measuring the shaded area under a vine canopy are time consuming, tedious, and error prone. Our data is captured at night, with an iPhone mounted next to inexpensive and lightweight line lasers, as well as an Azure LiDAR scan to capture ground truth data. The sensors were raised approximately 2.3 m on a pole looking down on the canopy. To process the data, our method 1) generates a sparse 3D reconstruction using structure from motion, selecting key image frames that visibly capture the entire vine, 2) segments images to isolate the vine of interest from the background, and 3) estimates the visible surface area of the vine. Qualitative assessment is promising; however, due to the extra wet summer of 2021, which resulted in uncommonly large canopies, we will include an additional season's worth of data before finalizing our results.
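A minimal sketch of the adjacent-frame registration and stitching step follows, assuming the Open3D library; the voxel size and correspondence distance are placeholders, and the rotation-zeroing step described above is omitted for brevity.

    # Sketch: register each frame to its neighbor with point-to-point ICP
    # and accumulate transforms to merge all frames into one row scan.

    import numpy as np
    import open3d as o3d

    def stitch_row(clouds, voxel=0.01, max_dist=0.05):
        """Merge a list of o3d.geometry.PointCloud frames into one row scan."""
        down = [c.voxel_down_sample(voxel) for c in clouds]
        merged = o3d.geometry.PointCloud(clouds[0])
        pose = np.eye(4)                       # cumulative frame-to-row transform
        for k in range(1, len(clouds)):
            # ICP transform mapping frame k into frame k-1 coordinates.
            reg = o3d.pipelines.registration.registration_icp(
                down[k], down[k - 1], max_dist, np.eye(4),
                o3d.pipelines.registration.TransformationEstimationPointToPoint())
            pose = pose @ reg.transformation   # chain into frame-0 coordinates
            merged += o3d.geometry.PointCloud(clouds[k]).transform(pose)
        return merged.voxel_down_sample(voxel)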
Objective 4. For TTP, we have focused on continued development of a simple method to analyze cluster closure in grapes. Starting with pictures of clusters in the vineyard with a whiteboard behind, we delineated five classes (shoot, cluster, board, hand, and canopy) using a CNN. However, these five classes were later merged into two classes (background and cluster) because 1) the cluster was the main focus of the study, and 2) the ability of the pyramid scene parsing network (PSPNet) to map global image context through multiple resolutions of the image improved with two classes. The mean accuracy (mAcc) for the five-class model was 42.5% overall and 60.02% for the cluster class. Using the two-class segmentation, the original images were segmented into cluster and background and then converted into binary images. These binary images were used to mask the original images, and using hue, saturation, and image thresholding, the total percentage of cluster closure was calculated (91.2%). Even though the mAcc for two classes was significantly improved (73.96%), the mAcc for the cluster class remains low (48.79%), resulting in wrong semantic classifications or additional unwanted delineated cluster boundaries, mainly because berries apart from the targeted cluster are present in the image, or because younger clusters have higher similarity to the grapevine canopy. Moreover, the preliminary model was trained on only 100 images; 100 more images have been annotated for training. The segmentation will be re-run to accurately delineate cluster boundaries. Additionally, handling of sparse unwanted boundary patches of clusters using binary thresholding will be implemented. We have two growing seasons of cluster closure images on multiple cultivars from across New York State. Once the cluster closure quantification method is finalized, we will run all previously collected images and combine them with viticultural and environmental data to build a predictive model to be used by growers. We plan for the model to be accessed through two channels: the newa.cornell.edu grape disease models, and the vineyard updates available through Cornell Cooperative Extension for Eastern NY.

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2021 Citation: Liu, Yutong, Justin Kuo, Kerik Cox, Justine Vanden Heuvel, Kirstin Petersen, and Amit Lal. "Imaging and Detection of Botrytis Cinerea with Gigahertz Ultrasonic Imager." In 2021 IEEE International Ultrasonics Symposium (IUS), pp. 1-4. IEEE, 2021.


Progress 04/01/20 to 03/31/21

Outputs
Target Audience: Our work has reached academic researchers in plant sciences and engineering/robotics through conference presentations and poster exhibits. We also completed a cross-listed seminar course, ECE6970/VIEN4940 Robots, Wine, and Food, which was frequented by 35+ undergraduates, graduate students, professors, industrial collaborators, and alumni engaged in farm practices. Finally, we have reached New York State growers and US companies (e.g., Microsoft, MOOG, and Dropcopter) through the Cornell Institute for Digital Agriculture (online) venues. Changes/Problems: As previously mentioned, the pandemic has drastically limited our lab access, impeding both development of the soft robotic manipulator and the experiments with the ultrasonic transducer. PI Petersen took several weeks off during March-April 2020 to co-organize emergency manufacturing of face shields, gowns, and PAPR hoods for New York City and Upstate New York hospitals. Furthermore, PI Vanden Heuvel has been on sick leave throughout the Fall of 2020. What opportunities for training and professional development has the project provided? Three graduate students have worked on this project to 1) present on and partially further development of the soft, sensorized gripper, 2) develop the framework for low-cost, vision-based yield estimation, 3) set up the ultrasonic GHz transducer array to measure Botrytis spores, and 4) develop a beamforming 40 kHz transducer array to remotely detect individual grapes. One of these graduate students has published his first journal article; the other two had the opportunity to present their work at multiple conference presentations related to this project. The graduate students have also helped mentor four Master of Engineering students and an undergraduate student at Cornell. One of the graduate students further took the offered class on Robots, Food, and Wine, and all of them got to present their research at these seminars. PIs Petersen, Vanden Heuvel, and Shepherd further partook as judges in the Cornell Institute for Digital Agriculture Hackathon in March 2020, giving feedback to student teams worldwide. PIs Shepherd and Petersen were also part of a large team that submitted and prepared a follow-on site visit for the NSF Science and Technology Centers call on "STC CROPPS". Due to pandemic protocols, we offered no high school student internship opportunities in the Summer of 2020. How have the results been disseminated to communities of interest? We have disseminated our results through two journal and two conference papers/presentations. Beyond publications, we have given talks and poster presentations at the following venues: the Cornell Institute for Digital Agriculture (CIDA) annual retreat, January 2021; the CIDA round table, August 2020; the IEEE International Ultrasonics Symposium (IUS), online, July 2020; the IEEE International Frequency Control Symposium and International Symposium on Applications of Ferroelectrics (IFCS-ISAF), online, July 2020; the IEEE International Conference on Soft Robotics (RoboSoft 2020), online, June 2020; and an invited seminar at the annual retreat for Cornell Information Technologies (CIT), March 2020. Finally, PIs Petersen and Vanden Heuvel organized a 1-credit semester-long seminar class on digital agriculture titled "Robots, Wine, and Food", cross-listed between the College of Engineering and the College of Agriculture and Life Sciences at Cornell.
This class was an exciting combination of research and industry talks, as well as guest lectures from growers. The class elicited interest from 35+ undergraduate seniors, M.Eng. students, and graduate students, as well as several professors and alumni. Two out of sixteen lectures were dedicated specifically to the viticulture project, taught by all PIs, and these were co-organized with the Cornell Robotics Seminar for an even greater audience. The Cornell Institute for Digital Agriculture is currently considering extending the concept of this class to a permanent offering. What do you plan to do during the next reporting period to accomplish the goals? Given the great success with our low-cost vision-based yield prediction, we intend to focus on extending this framework to also include the leaf-area-to-fruit ratio and the Ravaz index, while continuing to develop phone-based apps that will encourage grape growers to use our technology. We further hope to test drive the robot and arm this summer, to examine good path planning procedures near cluttered growth in the vineyard. We will continue to work on imaging Botrytis spores in growth media using the Geegah ultrasonic transducers, and plan to construct a 4x4 array of 40 kHz remote transducers as a building block for a more extensive collection. For the TTP option, we will continue to integrate NEWA weather data and our cluster closure protocol at several vineyards to inform better spray schedules.

Impacts
What was accomplished under these goals? Objective 1: Last year, we published on an inexpensive and easy-to-manufacture soft, sensorized robotic gripper capable of characterizing the 2D geometry of encompassed objects. Due to the pandemic shutdown, further iterations of this gripper were halted, but we recently started preliminary experiments towards using this gripper to characterize the elastic modulus of encompassed objects. We have also prepared the technical infrastructure to test the gripper when ready: through collaborations, we have set up a commercial 7-DOF Kinova robot arm on a Clearpath robot, ready to be launched into the experimental vineyard in the Spring of 2021. Objective 2: Upon consultation with pathology and plant-microbiome specialists, we decided to omit the silicone netting which we originally planned to use to hold the sensors onto the clusters. Early last year, we attempted our first single-pixel ultrasonic gigahertz transducer scans of Botrytis spores, using micro-XYZ stages to move the sample over the sensor; however, these experiments did not yield reliable results. Instead, our focus moving forward is to rely on detection of the grape fruit rot pathogen, Botrytis cinerea, by analyzing swabs of the berry surface using sensor arrays which can be set up by individual farmers. Specifically, we focus on 1) ultrasonic gigahertz transducer arrays, to which we recently gained access, and 2) remote ultrasonic 40 kHz arrays to detect individual berries in a cluster. First, we aimed to use a CMOS integrated aluminum nitride (AlN) array to monitor the growth of Botrytis spores in media by measuring the acoustic impedance of the contact surface. In these early tests, we were able to reliably detect and observe the features of mycelial mats of B. cinerea. Since those early efforts, we've been working to optimize the preparation and conditions needed to detect the spores of B. cinerea. Since these wind- and water-dispersed spores serve as the means of infection in grapes, it is important that germinating "infective" spores be detected by the sensor technology. We've examined the possibility of spore detection using dry spores and spores in liquid media. We are working to overcome some challenges with germination media and sensor drying during our proof-of-concept studies. The chip in use was designed by Geegah, a startup from the SonicMEMS group, which has made a 128x128 pixel imager. The ultrasonic images will help us describe and predict spore growth on real farms. The size of the transducer elements in the array is about 20 µm, so the resolution of the spore images is also of this scale. To image the growth cycle of Botrytis spores, the spores have to be placed in media for 12 hours. However, 1 hour after the start of imaging the media dries, and we now have a microliter syringe pump setup to continuously wet the samples. Along a parallel thread, we conducted tests to determine the feasibility of using handheld smartphone microscopes. In comparison to stereo light microscopy systems, which routinely allow observations of Botrytis spores at 1000x magnification, these devices were not able to resolve the presence of spores of B. cinerea, as they rely on digital post-processing of images, rather than optics, to achieve their claimed magnification.
Second, we hypothesize that the addition of ultrasonic transmitters to standard computer vision techniques will improve cluster characterization, because ultrasonic reflection measures mechanical parameters such as density and sound speed, whereas an optical image measures only optical reflectance. Specifically, we plan to develop an inexpensive 12-by-12 40 kHz transducer array; all of the building components (driver, switch, and ADC) are commercially available. Phased-array beamforming allows electrically controlled wave steering. Thus, it is possible to scan across the range of a grape cluster without moving the sensor, achieving a resolution of 2 cm (similar to berry size).

Objective 3: We recently had our paper on low-cost, computer-vision-based, pre-bloom yield prediction in vineyards accepted to the journal Frontiers in Agronomy. Traditional methods for estimating the number of grape clusters in a vineyard involve manual counting, which is laborious and costly, with an accuracy that depends on the sample size. We demonstrated that traditional cluster counting has a high variance in accuracy and is highly sensitive to the particular counter and to the choice of the subset of counted vines. We then proposed a simple computer-vision-based method for improving the reliability of yield estimates using cheap, easily accessible hardware for growers. This method detects, tracks, and counts clusters and shoots in videos collected using a smartphone camera that is driven or walked through the vineyard at night (a simplified sketch of this detect-track-count principle follows this objective). With a random selection of calibration data, our method achieved an average cluster count error of 4.9% across two growing seasons and two cultivars, whereas traditional methods yielded an average cluster count error of 7.9% on the same dataset. Moreover, the proposed method yielded a maximum error of 12.6%, against a maximum error of 23.5% for the traditional method. The proposed method can be deployed before flowering, while the canopy is thin, which maximizes the visibility of clusters and shoots and, compared to prior work in the area, improves generalizability across cultivars and growing seasons and provides earlier yield estimates. We are now looking at automated methods to determine the Ravaz Index, calculated as yield divided by pruning weight. Specifically, the ability of a grapevine to ripen fruit is based on the ratio of exposed leaf area to fruit weight. Fruit weight is collected (on a per-vine basis) at harvest, and pruning weight is collected in the following spring. This collection is arduous for grape growers, so they rarely quantify it, despite the fact that the Ravaz Index should guide pruning practices in the following spring to ensure the appropriate crop load for the upcoming season. Our current efforts are focused on developing an inexpensive, simple, easy-to-use, vision-based system to quantify pruning weight across an entire vineyard block. To do this, we designed a fully enclosed pull-cart able to withstand long operation in freezing temperatures and snow-covered vineyards, equipped with a computer, a space heater, a generator, and various relatively inexpensive sensors that permit vine recording, including 1) a phone mounted on a gimbal, 2) a phone mounted on a gimbal next to line lasers to illuminate the foreground over the background, 3) an Azure Kinect LiDAR mounted on a gimbal, and 4) an Intel RealSense D455 with active IR stereo mounted on a gimbal.
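As referenced above, the following is a highly simplified outline of the detect-track-count principle behind the published yield estimation method; it is a sketch under stated assumptions, not our released implementation. The function detect_clusters is a hypothetical stand-in for a trained detector, and the greedy intersection-over-union (IoU) matching stands in for the more careful tracking the real system uses.

# Simplified sketch of counting unique clusters across video frames.
# detect_clusters(frame) is a hypothetical stand-in for a trained
# detector returning (x1, y1, x2, y2) boxes; matching is greedy IoU only.
import itertools

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def count_clusters(frames, detect_clusters, iou_thresh=0.3):
    """Each detection either continues the track it overlaps best, or opens
    a new track; the final count is the number of tracks ever opened."""
    track_ids = itertools.count()
    last_box = {}  # track id -> most recently matched box
    for frame in frames:
        for box in detect_clusters(frame):
            best = max(last_box, key=lambda t: iou(last_box[t], box), default=None)
            if best is not None and iou(last_box[best], box) >= iou_thresh:
                last_box[best] = box             # same cluster, seen again
            else:
                last_box[next(track_ids)] = box  # first sighting: count it
    return len(last_box)

In the actual system the detector is a deep network and the association uses feature tracking rather than raw box overlap, but the counting principle, one track per first-seen cluster, is the same.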
Objective 4: To promote use of the previously described computer vision yield estimation methods, we are now working with a Master of Engineering student to develop a phone-based app that will help direct growers to take similar videos in their own vineyards, as well as a simple method to upload them to our server, where we can process the data. For close-range visual detection of cluster closure, we have collected data from multiple regions and cultivars in NY, including Long Island: Chardonnay, Riesling, Gewurztraminer; Western NY: Pinot gris, Riesling, Gewurztraminer; and the Finger Lakes: Chardonnay, Riesling, Cabernet Franc. Specifically, we imaged 50 clusters on the vine for each cultivar roughly every 10 days, from berry set through closure. We now have thousands of pictures and are developing automated Matlab scripts to process them (a rough outline of this processing is sketched below). We further collected viticulture data (weather, shoot number, cluster number, etc.) to test for inclusion in the predictor model. We will continue this effort at the same sites in 2021.
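For reference, the outline below is a rough Python/OpenCV analogue (assuming the OpenCV 4.x API) of what the Matlab scripts compute from each backlit cluster photo: segment the cluster against the bright light board, then report the light transmitted through the cluster and a border-to-area ratio, following the ground-truthing protocol described under the previous reporting period. The fixed threshold is a placeholder assumption, not a tuned value.

# Rough analogue of the cluster closure processing: segment a cluster
# photographed against a light board, then measure light transmission
# and a border-to-area compactness proxy. The threshold value is a guess.
import cv2
import numpy as np

def closure_metrics(image_path, thresh=128):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Bright board, darker cluster: invert so the cluster is foreground.
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    cluster = max(contours, key=cv2.contourArea)  # largest blob = the cluster
    area = cv2.contourArea(cluster)
    border = cv2.arcLength(cluster, True)
    filled = np.zeros_like(mask)
    cv2.drawContours(filled, [cluster], -1, 255, thickness=-1)
    transmission = float(gray[filled > 0].mean())  # light leaking through gaps
    return transmission, border / area

As berries grow and the cluster closes, the transmitted light should fall and the outline should smooth, so both metrics should trend downward toward closure.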

Publications

  • Type: Journal Articles Status: Accepted Year Published: 2021 Citation: Jonathan Jaramillo, Justine Vanden Heuvel, and Kirstin H. Petersen. "Low-Cost, Computer Vision-Based, Prebloom Yield Prediction in Vineyards". Frontiers in Agronomy. DOI: 10.3389/fagro.2021.648080
  • Type: Conference Papers and Presentations Status: Published Year Published: 2020 Citation: Y. Liu et al., "Characterization of AlScN on CMOS," 2020 Joint Conference of the IEEE International Frequency Control Symposium and International Symposium on Applications of Ferroelectrics (IFCS-ISAF), Keystone, CO, USA, 2020, pp. 1-5, doi: 10.1109/IFCS-ISAF41089.2020.9234939.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2020 Citation: Y. Liu, J. Kuo, A. Lal, J. Sharma and N. Singh, "Characterization of Metal Effect on Solidly Mounted AlScN on CMOS," 2020 IEEE International Ultrasonics Symposium (IUS), Las Vegas, NV, USA, 2020, pp. 1-4, doi: 10.1109/IUS46767.2020.9251681.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Ma, Danna, Steven Ceron, Gregory Kaiser, and Kirstin Petersen. "Simple, Low-Cost Fabrication of Soft Sensors for Shape Reconstruction." IEEE Robotics and Automation Letters 5, no. 3 (2020): 4049-4054.


Progress 04/01/19 to 03/31/20

Outputs
Target Audience: Our work has reached academic researchers in Plant Sciences and Engineering/Robotics through numerous academic seminars, conference presentations, and poster exhibits. The work has also reached undergraduate and graduate students interested in agricultural robotics through a cross-listed seminar course, "ECE6680 / VIEN4940 Robots, Wine, and Food", which we introduced this semester. Finally, we have reached New York State growers and US companies (e.g., Microsoft, MOOG, and Dropcopter) through several Digital Agriculture venues and tailgate meetings. Changes/Problems: Beyond the coronavirus lockdown, which will impact our plans for the summer, PI Petersen was on maternity leave during the 2019 fall semester, and two of our graduate students carried an additional load caused by qualifying exams. What opportunities for training and professional development has the project provided? One postdoc has worked on this project, with a focus on developing a protocol and image processing software to determine cluster closure. Four graduate students have worked on this project to 1) miniaturize the ultrasonic GHz transducer, 2) develop low-cost, easily accessible soft sensors for feature reconstruction, and 3) develop the framework for low-cost, vision-based yield estimation. Two of these graduate students have submitted their first publications, two have given their first conference/symposium presentations, and one has taken this opportunity to mentor a Master of Engineering student as well as two undergraduate students. One MSc student has been designing and characterizing the silicone netting, with training on confocal microscopy. One MEng student has been coached to develop the automated XYZ stage and chip testing framework. One undergraduate has been mentored to write image processing software for automated feature tracking, and another has been working with the field data. Finally, two high school interns spent six weeks working in Dr. Lal's lab testing the GHz sensors. How have the results been disseminated to communities of interest? We have disseminated our results through two journal and conference papers/presentations, and have another journal paper in the rebuttal phase with IEEE Robotics and Automation Letters. We are furthermore writing a third journal paper, which will target the journal Precision Agriculture. Beyond publications, we have given talks and poster presentations at the following venues: Shaulis Symposium, New Tools for Precision Management, American Society for Enology and Viticulture, July 2019; the Cornell Institute for Digital Agriculture (CIDA) annual retreat, October 2019; the NSF-CPS annual PI meeting, November 2019; Robotics Industry Day, Cornell University, November 2019; and "The Lower Hanging Fruit: Towards a Simpler Approach to Agricultural Robotics", Cornell Institute for Digital Agriculture seminar series, January 2020. We have also given related outreach talks: "Agricultural Robots", Ithaca Science Center, January 2019; "Robotic Superorganisms", Keeton House Talk, Cornell University college student outreach program, March 2019; "Robotic Superorganisms", Science on Tap, Cornell University general public outreach program, April 2019; "Fabricating the Future: Robotics & Programming", Cornell University high school outreach program, July 2019; "Agricultural Robotics", Cornell China Center opening event, November 2019; and "Robots and Artificial Intelligence", Henriette Hoerluck Elementary School, November 2019.
Finally, as previously mentioned, we are currently co-teaching a seminar course on Robots, Wine, and Food for students across the College of Engineering and the College of Agriculture and Life Sciences. Beyond students, this course is frequented by several faculty and alumni. What do you plan to do during the next reporting period to accomplish the goals? We are currently strategizing future work around the coronavirus lockdown. Beyond limiting access to the labs, the lockdown is especially critical to this project because of its high dependence on field access during the growing season. Options for remote work include continued use of the video footage gathered in the 2019 growing season. Specifically, we intend to look into visual estimation of canopy density and leaf area ratio. We may also gain access to volunteer grower footage from other areas of the world, which will permit us to test how well the classifiers translate. We are hoping to submit a patent application based on the shoot tracking software. We further intend to take on a team of Master of Engineering students to help develop the decision support framework app. We have furthermore sent one of our students home with a robotic arm, and hope to mount it on a Husky rover this summer to bring either a camera or soft end effectors into closer range of the clusters, to start analyzing data such as hue or visually accessible Botrytis. When we regain access to the labs, we plan to expand the soft end effector with higher-resolution measurements in three dimensions, and to show how it may be used to determine the Shore hardness of an object. We will further continue integration of the silicone netting with the grape clusters, and plan to collect a large dataset of fungal spore scans using the GHz transducer, which in turn can be used to train a classifier. Finally, an important thrust involves continued work to translate the stand-alone GHz transducer chip out of the lab and onto our robot end effectors. We are also looking into simply mounting small fans that blow airborne spores onto the surface of the transducer, to enable a more cost-effective solution for detecting fungal infections in the field. For the TTP option, if we gain access to the field, we plan to integrate NEWA weather data and our cluster closure protocol at several vineyards to inform better spray schedules.

Impacts
What was accomplished under these goals? Objective 1. Current prevalent techniques for soft, sensitive end effectors, as needed for touch-based cluster shape reconstruction, require expertise, expensive materials, and high-end processing equipment, which limits both their transition to practice and their accessibility to researchers and users. To address this issue, we have developed easily accessible, low-cost, rapid fabrication methods for soft resistive carbon-composite sensors. We characterized their repeatability and durability in response to strains up to 35%. We further showed how this fabrication technique may be easily customized to two different applications: a stretchable, tactile interface for passive sensing, and an active, soft pneumatic gripper that can fully encompass an object to estimate its (planar) geometry. Using these mechanical designs and simple control and analysis, we showed how to achieve high relative accuracy despite the low manufacturing tolerances of the sensors. Our hope is that these techniques are so simple that even small-scale farmers can make, optimize, and deploy them on their own. We are also developing methods for ground-truthing the timing of cluster closure and cluster compactness. These techniques involve taking photos of harvested clusters against a light board for contrast. An automated script outlines the cluster and determines the transmission of light through the cluster as well as a border-to-area ratio. We are currently working with a collaborator in Australia to gather data during their growing season, to start building the model based on environmental and viticultural parameters.

Objective 2. Our goal is to develop technology to image the surfaces of grapes using GHz ultrasonic transducers. At GHz frequencies, the ultrasonic wavelength is small, down to 1-10 µm, which potentially enables imaging of individual spores on the grape surface (a numerical illustration of the underlying reflection measurement appears at the end of this section). Over the last funding cycle this work has consisted of four separate thrusts. 1 (PI Cox): We have developed a culturing infrastructure for Botrytis cinerea, a devastating fruit rot pathogen of grapes. The system produces a continual supply of active pathogenic cultures, which serve as testing material for sensor development and validation of sensor detections. The cultures readily sporulate and can be used to dust sensor surfaces, or to generate spore clouds with small fans that settle onto them. We have also identified several locations where Erysiphe necator and Plasmopara viticola, the causal agents of grape powdery and downy mildew, respectively, can be obtained. These are obligate biotrophic pathogens that cannot be artificially cultured in the lab, but they could be grown on potted plants in a quarantined greenhouse. The sites identified reliably develop these diseases in the late summer and early fall, and can serve as a source of wild-type spore material for sensor development and validation. During this project, we established that spores from these pathogens can be harvested and stored for a few weeks at 3°C while still retaining their key structural characteristics. This allows us to establish brief windows of time for sensor development and validation for these two additional pathogens. 2 (PIs Lal and Petersen): We have set up an XYZ stage with a single-pixel transducer, controllable from Matlab in 5-10 µm increments depending on the axis of translation. We have furthermore manufactured several test pieces to validate the performance.
Our hope is that this setup will help us generate a large dataset which we can use to train future classifiers. 3 (PI Lal): While our past results demonstrated the feasibility of using AlN transducers driven by commercial off-the-shelf electronics and CMOS electronics on a separate chip, we have now accomplished a completely monolithic integration of rigidly mounted AlN transducers on the CMOS electronics that drives them for GHz ultrasonic reflectometry. The monolithic integration reduces parasitics (yielding higher SNR), enables near-complete elimination of RF feedthrough, and reduces packaging cost. This work towards a stand-alone sensor will aid integration into soft robotic end effectors and the silicone netting. 4 (PI Shepherd): We have created a silicone netting that can incorporate the transducers and be placed over grapes. We have started to characterize its thermal expansion using laser scanning confocal microscopy. We are recording shape change, but the amount is inconsistent, and more careful, statistically grounded measurements are needed before we can draw conclusions on the degree of expansion. Further, to take advantage of this phenomenon, we are designing pleating that should amplify the throw in regions of interest based on the thermal expansion. We have started working with Dr. Lal's group to measure the displacement of the silicone over his sensors: by fixing the sensors to the ground and laying the silicone slab over them, we can directly measure the effect of thermal expansion on the net.

Objective 3. Over the 2019 growing season we captured a large dataset from the Cornell Lansing Vineyard, including video footage captured from a tractor driven along the trellis every second night, double manual counts of clusters and shoot tips, and harvest data. We set up software to enable manual marking of bounding boxes around clusters and shoot tips in the videos, creating a large training set. We then developed a framework that automatically segments the videos and runs DNN classifiers and OpenCV feature tracking to estimate counts in the videos. We are currently comparing the applicability and accuracy of three major architectures. Our preliminary analysis indicates a large variance in manual cluster counts, and that the automated cluster counts perform only slightly worse than the manual counts. Note, however, that farmers typically estimate the yield of their vineyard based on manual counts in only a few panels, not the entire vineyard. We are currently examining the potential to track shoot tips instead of clusters, and analyzing how well the number of shoot tips correlates with the final yield. This method has great potential both because we can record earlier in the season, before foliage occludes many samples, and because we can predict yield even earlier.

Objective 4. The TTP option is set to start in the summer of 2020.
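As referenced under Objective 2, the following is a minimal numerical illustration of the reflectometry principle: each pixel measures the echo from the chip-medium interface, whose amplitude is governed by the normal-incidence reflection coefficient R = (Z_load - Z_substrate)/(Z_load + Z_substrate). The impedance values below are textbook approximations (the figure for hydrated fungal material in particular is an assumption), not project measurements.

# Sketch of the impedance contrast a GHz reflectometry pixel sees when
# biological material settles on the sensor. Impedances are approximate
# literature values; Z_BIO is an assumed figure for hydrated fungal matter.
Z_SI = 19.8e6     # silicon substrate, longitudinal [Rayl]
Z_WATER = 1.48e6  # water / liquid growth media [Rayl]
Z_BIO = 1.6e6     # assumed: spores or mycelium loading the surface [Rayl]

def reflection_coefficient(z_substrate, z_load):
    """Normal-incidence pressure reflection coefficient at the interface."""
    return (z_load - z_substrate) / (z_load + z_substrate)

r_bare = reflection_coefficient(Z_SI, Z_WATER)
r_load = reflection_coefficient(Z_SI, Z_BIO)
print(f"|R| bare: {abs(r_bare):.4f}  |R| loaded: {abs(r_load):.4f}")
# Pixel-to-pixel differences in |R| relative to a bare-surface baseline
# are what make settled material, e.g., mycelial mats, visible in an
# impedance image.

The change in |R| per pixel is small, which is why the high SNR afforded by monolithic integration matters for resolving settled spores against the bare-surface baseline.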

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Abdelmejeed, Mamdouh, Adarsh Ravi, Yutong Liu, Justin Kuo, Jaibir Sharma, Srinivas Merugu, Navab Singh, and Amit Lal. "Monolithic 180nm CMOS Controlled GHz Ultrasonic Impedance Sensing and Imaging." In 2019 IEEE International Electron Devices Meeting (IEDM), pp. 34-3. IEEE, 2019.
  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2020 Citation: Danna Ma, Steven Ceron, Gregory Kaiser, and Kirstin Petersen, Simple Low-Cost Fabrication of Soft Sensors for Feature Reconstruction, Intl. Conference on Soft Robotics (RoboSoft), 2020.
  • Type: Journal Articles Status: Under Review Year Published: 2020 Citation: Danna Ma, Steven Ceron, Gregory Kaiser, and Kirstin Petersen, Simple Low-Cost Fabrication of Soft Sensors for Feature Reconstruction, under review with Robotics Automation Letters, 2020.
  • Type: Conference Papers and Presentations Status: Other Year Published: 2019 Citation: Poster presentation: Jonathan Jaramillo, Justine Vanden Heuvel, and Kirstin Petersen. Low-Cost Vision-Based Estimation of Yield in Vineyards. Shaulis Symposium, New Tools for Precision Management, American Society for Enology and Viticulture, July 2019.