Source: UNIVERSITY OF CALIFORNIA, DAVIS submitted to
ROBOTIC HARVEST-AIDING ORCHARD PLATFORMS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
TERMINATED
Funding Source
Reporting Frequency
Annual
Accession No.
1007731
Grant No.
2016-67021-24535
Project No.
CA-D-BAE-2314-CG
Proposal No.
2015-09455
Multistate No.
(N/A)
Program Code
A7301
Project Start Date
Dec 1, 2015
Project End Date
Nov 30, 2019
Grant Year
2016
Project Director
Vougioukas, S. G.
Recipient Organization
UNIVERSITY OF CALIFORNIA, DAVIS
410 MRAK HALL
DAVIS, CA 95616-8671
Performing Department
Biological & Agricultural Eng.
Non Technical Summary
In fresh market fruit production, harvesting is one of the most labor-intensive operations, incurring high cost and dependence on a large seasonal semi-skilled workforce, which is becoming less available. With an eye toward mechanization, growers are increasingly adopting high-density orchards that feature narrow, almost two-dimensional canopies that create "fruiting walls", which are easier to harvest. Harvest-aid platforms are also being developed and offered by several companies as an intermediate step towards full mechanization. These machines aim at increased harvest efficiency by eliminating ladders. However, adoption by US growers is still limited, even in modern high-density orchards. The main reasons are high cost and often inadequate efficiency and labor savings. The major factor that limits harvesting throughput is the mismatch between the non-uniform fruit distribution and the pickers' uneven and varying picking speeds. This project aspires to develop affordable technologies for next-generation, robotic harvest-aid orchard platforms. The machines we envision will function as co-bots that collaborate with fruit pickers by physically carrying them and intelligently adjusting their vertical positioning with respect to the canopies in the orchard. The proposed machines will use advanced sensing to estimate incoming fruit yield and worker picking speeds. Based on these data, they will modulate the incoming fruit flow by optimally adjusting their own travel speed, and dynamically elevate each picker to a 'fruit zone' that matches her/his speed and maximizes the overall platform throughput. Improved picker safety and ergonomics will also be incorporated in robotic platform design and operation.
Intellectual merit. The project investigates a novel human-robot physical interaction mode, in which a team of humans harvests fruit while standing on a robotic orchard platform, which controls each worker's vertical positioning to optimize collective performance. A rigorous approach to automated harvest-aid platform operation analysis and control is proposed. The platform, the workers, and the fruiting wall are regarded as a processing line in a flexible manufacturing system. The optimizing principle of matching incoming flow with machine capacity is applied to robotic platforms, leading to the original concept of controlling platform speed and picker placement to match picker speeds with appropriate harvesting zones of the fruiting wall. The theoretical activities will be complemented with advancement of the state of the art in imaging and sensing technologies to estimate fruit densities and picker harvest rates. If robotic picking is to become practical, multi-arm robotic harvesters will face the same efficiency, sensing and control issues. Therefore, by addressing robotic platform operation, researchers are "paving the way" to fully mechanized selective harvesting.
Broader impact. By accelerating the adoption of robotic harvest-aid platforms, the project will bring financial benefits for U.S. fruit growers; market advantages for SMEs building agricultural equipment; increased safety and higher wages for farm workers; work opportunities for a wider, less physically capable labor pool, including women, who are the backbone of rural communities; and fresh, affordable fruits for consumers. The project's educational agenda spans the graduate, undergraduate and K-12 levels and utilizes project-based learning and fieldwork to cross-pollinate among disciplines.
The researchers will leverage the themes of the project and the increasing awareness and concern for sustainable and healthy food production to engage K-12 students in STEM-promoting activities.
Animal Health Component
0%
Research Effort Categories
Basic
(N/A)
Applied
100%
Developmental
(N/A)
Classification

Knowledge Area (KA) | Subject of Investigation (SOI) | Field of Science (FOS) | Percent
404 | 5310 | 2020 | 100%
Knowledge Area
404 - Instrumentation and Control Systems;

Subject Of Investigation
5310 - Machinery and equipment;

Field Of Science
2020 - Engineering;
Goals / Objectives
Our long-term goal is to develop the theoretical and technological tools that will enable the design, optimization, prototyping and field-testing of consistently high-throughput, cost-effective mechanized harvesting systems for modern orchards. As part of this effort, this proposal aspires to robotize harvest-aid orchard platforms. The machines we envision will function as co-bots that collaborate with fruit pickers by physically carrying them and intelligently adjusting their vertical positioning with respect to the canopies in the orchard. The proposed machines will use advanced sensing to estimate incoming fruit yield and worker picking speeds, in order to optimally regulate platform travel speed and picking stations' elevations. Essentially, these robotic platforms will (a) modulate the incoming fruit flow by optimally adjusting their own travel speed; and (b) dynamically assign each picker to a 'fruit zone' that matches their speed and maximizes the overall platform throughput.
The proposed project must address three major research challenges: 1) model the machine-aided harvesting process that results from the complex interactions of orchard, pickers and robotic platform, in order to facilitate design, optimization and controller development for the automated harvesting system; 2) develop a robust, accurate and relatively inexpensive sensing system for fruit yield density and worker picking rates; 3) develop a human-in-the-loop control system for platform operation that increases efficiency while ensuring picker safety. The project will also evaluate the performance of the prototype system during commercial fruit harvesting in real-world conditions.
Research Goal 1: Modeling of fruit harvesting with a robotic orchard platform. This goal aims at developing a harvesting simulation tool that will be used to explore, gain intuition, and develop and fine-tune control schemes for platform speed and worker positioning. The performance measure will be the platform's harvesting speed, i.e., the collective picking rate of all workers on the platform. Picking speed depends on orchard layout and tree architecture, fruit distribution, worker positioning on the platform, individual worker picking speed and pattern, and platform speed. Simulating these coupled interacting systems (orchard, pickers, platform, controller) is a major challenge. The following objectives must be accomplished toward this research goal.
Objective 1.1: Create instances of virtual orchards. This objective aims at creating 3D models of fruit trees in rows of SNAP-architecture commercial orchards, and at digitizing the positions of the fruits in the canopies. These digitized trees will serve as input to the robotic platform harvest-aid simulator; fruit positions will provide fruit density as a function of height at any position along a digitized orchard row.
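As an informal illustration of how digitized fruit coordinates could be turned into the height-dependent density that the simulator needs, the short Python sketch below bins fruit positions by height over a stretch of row. The data layout, bin size and function name are assumptions made for this example, not part of the project's actual pipeline.

import numpy as np

def fruit_density_by_height(fruit_xyz, row_start, row_end,
                            wall_height=3.0, dz=0.25):
    """Estimate fruit count per m^2 of fruiting wall as a function of height.

    fruit_xyz : (N, 3) array of digitized fruit coordinates (x along the row,
                y across the row, z = height), in meters (assumed layout).
    row_start, row_end : x-interval of the wall segment of interest.
    Returns bin-center heights and fruits per square meter of wall.
    """
    fruit_xyz = np.asarray(fruit_xyz, dtype=float)
    # keep only fruits within the requested stretch of the row
    in_segment = (fruit_xyz[:, 0] >= row_start) & (fruit_xyz[:, 0] < row_end)
    heights = fruit_xyz[in_segment, 2]

    edges = np.arange(0.0, wall_height + dz, dz)
    counts, _ = np.histogram(heights, bins=edges)

    wall_area_per_bin = (row_end - row_start) * dz  # m^2 of wall per height bin
    density = counts / wall_area_per_bin            # fruits per m^2
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, density

# Example: 200 synthetic fruit positions over a 4 m stretch of row
rng = np.random.default_rng(0)
fake_fruit = np.column_stack([rng.uniform(0, 4, 200),
                              rng.uniform(-0.3, 0.3, 200),
                              rng.uniform(0.5, 2.8, 200)])
z, rho = fruit_density_by_height(fake_fruit, 0.0, 4.0)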
Objective 1.2: Develop human picking models. Just as digital models of actual trees will be used as input to the platform-based harvesting simulator, human picking models are needed to provide realistic individual picking rates. Our approach will be to model the process of manual fruit picking and estimate the stochastic parameters of the process from captured picker motion data. This approach is inspired by the literature on production systems modeling, and it has recently been applied in the agricultural (greenhouse) sector.
Objective 1.3: Develop a platform-based fruit-picking simulator. The Gazebo open-source robotics simulator will be used to model platform-based harvesting by coupling the tree, picker and platform models. This simulator features a robust physics engine to calculate machine kinematics and dynamics, and a computational geometry engine to detect obstacle collisions.
Research Goal 2: Develop a sensing system for platform-based fruit picking. The platform's control system has four sensory inputs: the fruit density, the picker rates, the heights of the picker platforms, and the velocity of the harvest platform. The latter two will be implemented using off-the-shelf sensors. Estimating fruit density and picking rates poses technical challenges that are discussed next.
Objective 2.1: Estimate fruit yield density. It is essential that the yield density be estimated in real time, so that worker positioning and platform speed can be controlled in an optimal manner. We will leverage and build upon recent work from partner CMU on image-based fruit yield mapping in orchards and vineyards for detecting green fruit over a green-leaf background.
Objective 2.2: Real-time estimation of each worker's fruit picking rate. Existing efforts to produce harvested yield data for manually picked fruits do not work in real time (i.e., they report yield after harvest) and either work at the bin level or use expensive technology that requires changes in workers' harvesting activities. We propose to design and build a novel, low-cost, instrumented picking bag/basket that will measure harvested fruit weight in real time using load cells and transmit the data wirelessly to the platform's control system.
Objective 2.3: Online calibration of the imaging system with picking rate estimates. We need a framework to reconcile and fuse the multiple sensing modalities, updating the sensing parameters and picker performance models during operation to provide high-fidelity input into our control system. To reconcile fruit density (fruit per area of fruiting wall) from the imaging system with picker rates, we can use the vehicle velocity to convert the picker rates, integrated over all pickers, into a harvested fruit-density measurement.
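A back-of-the-envelope form of this reconciliation, under the simplifying assumption (ours, not stated in the project description) that the harvested fruit are spread uniformly over a fruiting-wall band of height H, is

\[ \hat{\rho}_{\mathrm{harv}}(t) \;\approx\; \frac{\sum_{i=1}^{n} r_i(t)}{v(t)\,H}, \]

where r_i(t) is picker i's picking rate (fruits/s), v(t) is the platform travel speed (m/s), and H is the band height (m); the result then has units of fruits per unit area of wall and can be compared directly with the image-based density estimate.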
Research Goal 3: Develop a human-in-the-loop platform control system.
Objective 3.1: Develop a control algorithm for platform operation. The platform's controller must increase picking efficiency and ensure picker safety. The controller inputs are: incoming fruit densities on both picking sides; platform speed; and the pickers' harvest speeds and elevations. A binary 'picking/not picking' signal is also available. A model predictive controller that optimizes throughput over a finite time horizon will be developed and tested.
Objective 3.2: Integrate real-time sensing and control. For this objective, we will design an application-programming interface (API) through which software running on the control system can access the sensing data and interact with the actuators (vehicle velocity and picker heights). The API design will enable the control interface to request the current fruit density estimate at a given location and height, as measured from the image data.
Research Goal 4: System prototyping and field evaluation.
Objective 4.1: Robotize a harvest-aid platform. A self-propelled harvest-aid platform is available at UC Davis. The platform will undergo significant physical modifications to support independently actuated worker positions: scissor-lift tables will be installed on appropriately prepared sides of the platform chassis. Special attention will be paid to platform safety.
Objective 4.2: Evaluate the robotic platform in orchards. A direct comparison of robotic and conventional platform operating modes will be conducted in high-density orchards in California, for different crops and architectures (e.g., V-trellis apple trees, Kearney-V peach trees). The response variables of interest will be harvesting throughput (average minutes to fill one bin and total number of filled bins per person per hour) and ergonomics.
Project Methods
Methods
Activity 1. Collect fruit position data in SNAP orchards and digitize orchard trees to extract structural-geometrical data in order to create instances of virtual orchards. Fruit positions and tree geometries will be digitized using a Power TRAK 360™ digitizer. A frame structure is being designed to hold six digitizer sources that collectively establish a large-area magnetic field around a tree. We shall use the fruit coordinates to estimate the nonparametric spatial probability density function (s-pdf) of fruits in canopies. The estimated fruit distribution of a given orchard row will be validated by measuring fruit locations on test trees (in the same orchard block) that were not included in the calculation of the histogram, and by using a standard two-sample Kolmogorov-Smirnov test to check whether the marginal distributions (heights, radii and angles) of the test-tree fruits come from the corresponding CDFs. The tree geometries and the fruit locations will then be integrated. A possible approach is to generate the tree first, and then set rules for accepting and rejecting fruit locations as they are sampled from the s-pdf.
Activity 2. Model the process of manual fruit picking and collect motion data to estimate the stochastic parameters of the process. Our approach will be to model the process of manual fruit picking and estimate the stochastic parameters of the process from captured picker motion data. Picking motions and their durations will be calculated by kinematic simulation that accounts for branch avoidance and uses stochastic picker joint-velocity trajectories, which will be estimated through motion capture of pickers.
Activity 3. Develop a platform-based fruit-picking simulator using open-source software, such as the Gazebo simulator. The Gazebo open-source robotics simulator will be used to model platform-based harvesting by coupling the tree, picker and platform models. This simulator features a robust physics engine to calculate machine kinematics and dynamics, and a computational geometry engine to detect obstacle collisions. The simulator will be used to perform model-based control system development. A CAD model and the kinematic model of the prototype harvest-aid platform will be used to perform virtual harvesting. All execution data will be logged, and worker and platform harvest speeds will be calculated for different control system schemes.
Activity 4. Develop hardware and software for an imaging system for estimation of fruit yield density on SNAP trees. A high-resolution calibrated stereo camera pair will be used in conjunction with powerful learning and self-calibrating algorithms. A specially designed algorithm will use HDR (High Dynamic Range) imaging techniques and contour and shading cues to detect fruits amongst leaves.
Activity 5. Develop hardware and software for a real-time fruit-picking rate measuring system. A novel, low-cost, instrumented picking bag/basket will be designed to measure harvested fruit weight in real time using load cells, and to transmit data wirelessly to the platform's control system. Load cell data are very noisy; therefore, the data stream will be filtered in real time using order-statistic and outlier-rejecting filters. Each bag will also provide a 'binary' signal that is true only when a picker unloads fruit. The signal will either be generated physically by electrical contact-break (bag clips open to unload), or via signal processing of the continuous weight data stream (detection of a sudden weight drop).
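A minimal sketch of the kind of filtering and unload detection described in Activity 5, assuming a fixed-rate scalar weight stream; the window size, threshold and function names are illustrative assumptions rather than the bag's actual firmware.

import numpy as np
from scipy.signal import medfilt

def clean_weight_stream(raw_weights, kernel=9):
    """Median-filter a raw load-cell stream to suppress spikes from bag motion."""
    return medfilt(np.asarray(raw_weights, dtype=float), kernel_size=kernel)

def detect_unload_events(weights, drop_threshold=3.0):
    """Flag samples where the filtered weight drops sharply (bag emptied into a bin).

    drop_threshold is in kg per sample interval; a sudden large negative step
    is taken to mean the picker opened the bag clips and unloaded.
    """
    steps = np.diff(weights, prepend=weights[0])
    return steps < -drop_threshold

# toy usage: a picker slowly fills the bag, then dumps it
t = np.arange(200)
raw = np.minimum(0.1 * t, 15.0)                            # ramp up to ~15 kg
raw[120:] = 0.5                                            # sudden unload at sample 120
raw += np.random.default_rng(1).normal(0, 0.2, raw.size)   # sensor noise
w = clean_weight_stream(raw)
unloading = detect_unload_events(w)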
Activity 6. Develop a controller for the platform. The control algorithm will integrate sensing and control; it will perform speed control, picking-station height control and bin height control, and will supervise all automatic platform operations. The computed controls must also satisfy all throughput constraints. A model predictive controller will be developed that optimizes throughput over a finite time horizon. Additionally, worker safety and comfort will be encoded as limits on the derivatives of the actuated variables. The control system will be programmed in ROS (Robot Operating System).
Activity 7. Develop a robotic harvest-aid platform. A commercial platform will undergo significant physical modifications to support independently actuated worker positions: scissor-lift tables will be installed on appropriately prepared sides of the platform chassis. Special attention will be paid to platform safety, which will be incorporated in the design by adhering to relevant ISO, Cal/OSHA and California Code of Regulations standards.
Activity 8. Field experiments and analysis of the robotic platform in orchards. A direct comparison of robotic and conventional platform operating modes will be conducted in high-density orchards, for different crops and architectures. The response variables of interest will be harvesting throughput (average minutes to fill one bin and total number of filled bins per person per hour) and ergonomics. A fully replicated, randomized complete block statistical design will be utilized for the study, where fruit species, tree architecture, and mean fruit load will be block effects; operating mode (robotic vs. conventional) will be the main fixed effect; workers and worker gender will be random effects; and travel speed, depth of foliage to canopy edge, number of missing trees, fruit size, and post-harvest fruit quality will be covariates. Post-harvest fruit quality assessments will be done. The motion/postural information obtained from workers performing the two harvesting approaches will be used to conduct postural load risk assessments on various body joints, including the lower back, upper extremities, and neck and shoulders.
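As one possible way to analyze a design of this kind (a simplified stand-in for the full randomized complete block model described in Activity 8, with hypothetical file and column names), a mixed-effects fit in Python might look like the following sketch.

import pandas as pd
import statsmodels.formula.api as smf

# hypothetical columns: bins_per_hour, mode ('robotic'/'conventional'),
# block (species x architecture x fruit-load combination), worker, speed
df = pd.read_csv("harvest_trials.csv")

# operating mode and block enter as fixed effects, travel speed as a covariate,
# and worker as a random (grouping) effect
model = smf.mixedlm("bins_per_hour ~ C(mode) + C(block) + speed",
                    data=df, groups=df["worker"])
result = model.fit()
print(result.summary())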
Product evaluation
Product P1: The product will be evaluated by the number of website visits.
Product P2: The product will be evaluated by the number of downloads.
Product P3: The product will be evaluated by the number of downloads.
Product P4: The product will be evaluated by the number of downloads.
Product P5: The product will be evaluated by the success of the patent applications.
Product P6: The product will be evaluated by the number of downloads.
Product P7: The product will be evaluated by the number of papers and presentations.
Product P8: The product will be evaluated by the number of graduate, undergraduate and high-school students trained.
Efforts
Graduate and undergraduate students will acquire theoretical knowledge and practical skills on robotic and automation technologies for machine-aided harvesting through thesis research, senior design projects, research internships and attendance of the EBS289K graduate course "Topics in Agricultural Robotics". High-school students will acquire theoretical knowledge and practical skills on robotic and automation technologies for machine-aided harvesting through summer internships or programs like COSMOS (California State Summer School for Mathematics and Science). Researchers (US and international) will gain a better understanding of the upper limits of robotic platform-aided harvesting efficiency and throughput through journal publications, presentations at conferences and access to the project's website. Platform manufacturers will also have access to the information through the website, targeted correspondence and attendance of field-day trials. Growers will gain a better understanding of the spatial variability of fruits in commercial SNAP orchard trees and of the effects of tree architecture on platform-aided harvesting efficiency and throughput. These will be achieved through presentations at grower meetings and field days at selected orchards, as well as specialized publications, such as the "Good Fruit Grower", that reach growers at the national level. Orchard workers, who are mainly Hispanic, will be exposed to automation and platform technologies by attending field days at selected orchards.

Progress 12/01/15 to 11/30/19

Outputs
Target Audience: One post-doctoral researcher and four Ph.D. students were mentored in the context of the project. Our research team worked closely with growers in Lodi, CA and conducted field experiments and time studies in their fields. One Ph.D. student presented findings at the 2019 ASABE Intl. Meeting in Boston, MA to researchers, students, and academics. The PI and Co-PI gave several presentations related to robotic harvest-aids. The audiences included growers, entrepreneurs, researchers, students, academic staff and the general public. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? UC Davis: One post-doctoral researcher (Seyyedhasani) and two Ph.D. students (Zhenghao - male; Pueyo Svoboda - female) were mentored in the context of the project and gained experience in real-time robotic control, sensing and data communications, and field experimental work. The post-doc and the Ph.D. students worked closely with apple growers in Lodi, CA and conducted field experiments and time studies in their fields. One Ph.D. student presented findings at the 2019 ASABE Intl. Meeting in Boston, MA to researchers, students, and academics. CMU: Two undergraduate interns were employed in the past year, gaining training and experience in the development of deep learning-based computer vision systems and field validation experiments (Cohen, Walker). Three of the CMU staff employed by the project made the transition to becoming master's students in order to learn advanced AI and robotics techniques as applicable to agriculture (Baweja, Panaje, Anjana). How have the results been disseminated to communities of interest? UC Davis: The results of this research were presented at the 2019 ASABE Intl. Meeting in Boston, MA. Also, the PI gave several presentations related to the project and robotic harvest-aids in general. The audiences were growers, entrepreneurs, researchers, students, academic staff and the general public: December 3, 2018, Almond Board of California, Davis, CA; January 30, 2019, Cling Peach Board, Sacramento, CA; February 5, 2019, California Pear Board, Davis, CA; February 13, 2019, Morning Star Company, Davis, CA; March 19, 2019, World Bank Headquarters, Washington, D.C.; April 1, 2019, Western Center for Agricultural Health and Safety, Davis, CA; April 3, 2019, California Strawberry Commission, Cal Poly, San Luis Obispo; August 27, 2019, Taylor Farms, Salinas, CA. CMU: Co-PI Kantor has given several public presentations that include the results of this project: June 20, 2019: Presentation titled "Cameras for Sensing in Vineyards". Invited speaker, American Society for Enology and Viticulture (ASEV) National Meeting, Napa, CA (audience: grape growers). June 21, 2019: "Bringing AI to the Field: Robotics and Sensing in Agricultural Applications". Invited speaker, AI For Good, San Francisco, CA (audience: academic researchers in agriculture). August 25-29, 2019: "Robotic Field Measurements for Plant Breeding and Crop Management". Invited speaker, National Association of Plant Breeders (NAPB) Annual Meeting, Pine Mountain, GA. November 21, 2019: Workshop "CPS for Agriculture Mini Workshop". NSF Cyber-Physical Systems PI Meeting, Arlington, VA (audience: academic researchers in plant biology, genetics, computer vision, and robotics). December 9, 2019: Presentation titled "Technology in Tree Fruit: What's Here? What's Close? What's Coming?"
Keynote speaker, Washington State Tree Fruit Association Annual Meeting, Wenatchee WA, (audience: Apple growers). January 15, 2020: Emerging AI Tools for Specialty Crops. Invited speaker, PrecisionAg Vision Conference, Seattle WA. What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? ENTIRE PROJECT IMPACT
Major activities completed / experiments conducted. The fruit locations of high-density SNAP apple trees were digitized in a Lodi orchard in order to create a small part of a virtual orchard. A low-cost dual stereo camera was developed, along with fruit detection software based on machine learning. Two fully instrumented picking bags were developed to monitor, in real time, the weight of manually harvested fruit. A commercial platform was retrofitted with two hydraulic cylinders and lifts, in order to individually elevate pickers, via computer, to different heights. The platform's throttle was also actuated via a stepper motor to enable computer control of the platform's speed. The fruit harvesting process was modeled as a Markovian process, with appropriate state and action spaces and a stochastic transition function, which essentially simulates the picking process. A stochastic model predictive controller based on sparse sampling was developed; it takes as input the fruit distribution in the canopies in front of the platform, the picking rate of each picker, and the current platform speed and picker heights, and computes the new platform speed and picker-lift actions (up/down/stay) that result in maximum machine harvest efficiency. The PI and co-PI presented the project's aims, approaches and results at several grower meetings (e.g., sweet cherry growers, Washington Tree Fruit Commission, Almond Board of California), to general audiences (e.g., Silicon Valley Forum, World Food Center, CITRIS Agricultural Technology Fair) and at conferences (e.g., ASABE, Phenome). The project was also covered in popular press venues (e.g., a Wall Street Journal article and podcast (10/1/18), The New Yorker (4/15/19) and several news websites).
Data collected. Picker apple harvesting rates as a function of time and position in the orchard were measured by recording and processing the weight of the fruit carried by the picker in the picking bag. Harvest-related data were collected and stored during commercial harvesting over three consecutive summers, in apple orchard blocks in Lodi, CA. Data included video streams of tree canopies with fruits; picking-bag real-time weight data; platform actuator control and state signals; and platform GNSS position and speed data.
Summary statistics and discussion of results. The apple detection system achieved a precision of 0.92, recall of 0.91 and an overall F1 score of 0.91, thus pushing forward the state of the art. The RMSE and 90th-percentile errors of the weight monitored by the picking bag were less than 0.36 kg and 0.56 kg, respectively (~2.7 apples); these errors correspond to 1.8% and 2.8% of the bag capacity. The robotic platform's harvesting throughput increased by 9.5% when worker elevations were controlled by the model predictive controller vs. when workers picked at fixed heights. However, when platform speed was controlled concurrently with worker elevations, the corresponding throughput increased by 24.5%.
Key outcomes or other accomplishments realized. The major accomplishment of this work is that it demonstrated that when an orchard platform's speed and worker elevations are controlled by a computer that utilizes fruit distribution and picking rate information, harvest throughput can increase significantly, by up to 25%. Another major accomplishment is the design and development of a low-cost ($3,000) dual stereo camera that covers the entirety of the canopy. The camera can be used for yield mapping and robotic harvesting applications. A third major accomplishment is the development of fully instrumented picking bags that monitor in real time the weight of manually harvested fruit. The bags can be used for labor monitoring and management, and for yield mapping.
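The stochastic, sparse-sampling model predictive controller described above was implemented in C++ and ROS; the self-contained Python sketch below only illustrates the general idea of sampling candidate speed and lift actions over a short horizon and keeping the combination with the best simulated harvest. The toy state, density map, dynamics and parameter values are invented for illustration and are not the project's code.

import itertools
import numpy as np

rng = np.random.default_rng(42)

# Simplified state: platform position x (m), picker heights h (m), and a toy
# fruit-density map rho(x, h) ahead of the platform.
def toy_density(x, h):
    return 2.0 + np.sin(0.8 * x) + 0.5 * np.cos(2.0 * h)

def simulate_step(x, heights, speed, lift_moves, picker_rates, dt=5.0):
    """Advance the toy picking process one step and return harvested fruit."""
    new_heights = np.clip(heights + 0.3 * np.array(lift_moves), 0.5, 3.0)
    # each picker harvests at most their own rate, limited by local fruit supply
    harvested = sum(min(r * dt, toy_density(x, h) * speed * dt)
                    for r, h in zip(picker_rates, new_heights))
    return x + speed * dt, new_heights, harvested

def sparse_sampling_controller(x, heights, picker_rates, horizon=3, rollouts=20):
    """Pick the (speed, lift actions) pair whose sampled rollouts harvest most."""
    speeds = [0.03, 0.05, 0.08]                       # candidate speeds (m/s)
    lift_options = list(itertools.product([-1, 0, 1], repeat=len(heights)))
    best, best_value = None, -np.inf
    for speed, moves in itertools.product(speeds, lift_options):
        total = 0.0
        for _ in range(rollouts):
            xs, hs, value = x, heights.copy(), 0.0
            s, m = speed, moves
            for _ in range(horizon):
                xs, hs, gain = simulate_step(xs, hs, s, m, picker_rates)
                value += gain
                # sample future actions randomly (sparse sampling of the tree)
                s = rng.choice(speeds)
                m = lift_options[rng.integers(len(lift_options))]
            total += value
        if total > best_value:
            best_value, best = total, (speed, moves)
    return best

action = sparse_sampling_controller(x=0.0,
                                    heights=np.array([1.0, 2.0]),
                                    picker_rates=[0.6, 0.4])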

Publications

  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Vougioukas, S.G. (2019). Agricultural Robotics. Annual Review of Control, Robotics, and Autonomous Systems. 2:365-392.
  • Type: Journal Articles Status: Submitted Year Published: 2020 Citation: Fei, Z., Shepard, J., Vougioukas, S.G. (2020). Instrumented Picking Bag for Measuring Fruit Weight During Manual Harvesting. Transactions of the American Society of Agricultural and Biological Engineering.
  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Charlton, D., Edward Taylor, J.E., Vougioukas, S.G., Rutledge, Z. (2019). Can Wages Rise Quickly Enough to Keep Workers in the Fields? Choices, 2nd Quarter 34(2). http://www.choicesmagazine.org/choices-magazine/submitted-articles/can-wages-rise-quickly-enough-to-keep-workers-in-the-fields
  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Charlton, D., Edward Taylor, J.E., Vougioukas, S.G., Rutledge, Z. (2019). Innovations for a Shrinking Agricultural Workforce. Choices, 2nd Quarter 34(2). http://www.choicesmagazine.org/choices-magazine/submitted-articles/estimating-value-damages-and-remedies-when-farm-data-are-misappropriated/innovations-for-a-shrinking-agricultural-workforce
  • Type: Conference Papers and Presentations Status: Submitted Year Published: 2020 Citation: Fei, Z., Silwal, A., Pothen, Z., Kantor, G., Vougioukas, S.G. (2020). A Co-robotic Harvest-aid Platform for Increased Harvest Efficiency. SUBMITTED. Intl. Conference on Agricultural Engineering (AgEng 2020, July 5-9), Evora, Portugal.
  • Type: Conference Papers and Presentations Status: Submitted Year Published: 2020 Citation: Fei, Z., Silwal, A., Pothen, Z., Kantor, G., Vougioukas, S.G. (2020). Dynamic Height Control for Robotized Apple Orchard Harvest-aid Platform with Multiple Pickers to Achieve Optimal Picking Efficiency. SUBMITTED. ASABE Annual International Meeting. Omaha, Nebraska.


Progress 12/01/17 to 11/30/18

Outputs
Target Audience: Students: PI Vougioukas presented the project and main results to undergraduate students in the context of the course EBS1 "Introduction to Biosystems Engineering" and the EBS289K graduate course in Agricultural Robotics, both at UC Davis. Co-PI Kantor gave a presentation titled "AI and Robotics for Crop Management and Breeding" at the University of Maryland, College Park, MD, on September 14, 2018, in the context of the Electrical and Computer Engineering Distinguished Colloquium Series (audience: UMD faculty and students). Growers & Stakeholders: PI Vougioukas presented results from this project in grower and industry stakeholder meetings such as: sweet cherries grower meeting on January 10, 2018; Lake County North Coast Pear Growers meeting on February 7, 2018; Sacramento River District Growers meeting, February 6, 2018; Innovation Summit, UC Davis, April 23, 2018; UC Extension Tree Fruit Orchard Tour, June 26, 2018. Co-PI Kantor gave a Keynote Address at the International Tree Fruit Association Annual Meeting, Rochester, NY, February 25, 2018 (apple growers). The title was: "Robotics and Artificial Intelligence for Tree Fruit Crops: Emerging Technologies and How to Get Ready for Them". Also, he gave a presentation titled "Artificial Intelligence Technologies to Support Crop Management and Breeding" to industry representatives at the University Industry Consortium Annual Meeting, Pasco, WA, April 25, 2018. General public: PI Vougioukas: Wall Street Journal article, October 2, 2018 (https://www.wsj.com/articles/robots-head-for-the-fields-1538426976) and podcast (https://www.wsj.com/podcasts/browse/wsj-the-future-of-everything). Co-PI Kantor: Presentation "AI Will Help Feed a Growing Planet", SXSW 2018, Austin, TX, March 13, 2018. Researchers: PI Vougioukas presented project progress at the NRI meeting in Arlington, in November 2018. Co-PI Kantor: "Robotics and AI for Rapid In-Field Phenotyping". Invited speaker, Phenome 2019, Tucson, AZ, February 7, 2019 (audience: academic researchers in plant biology, genetics, computer vision, and robotics). Changes/Problems: A no-cost extension was granted. The major reasons for the requested extension were: 1) Field experiments in apple orchards that have trees of suitable canopies can only be performed once a year (late August to early September), during a very short picking season. 2) We concluded this year's data collection and need to make some enhancements/changes to the robotic orchard platform that were not anticipated. 3) Therefore, we need more time to finalize the hardware and software design, which we must evaluate in the next picking season. What opportunities for training and professional development has the project provided? At UC Davis: One Ph.D. student, Zhenghao Fei, was mentored and gained experience in model-based control for the platform, electronics development for the picking bag, hydraulic actuation and real-time software development, and field experiments. One Ph.D. student, Chen Peng, worked on real-time hydraulics control. One Ph.D. student, Natalie Pueyo Svoboda, helped with field experiments and data processing. Two post-doctoral researchers, Adrien Durand-Petiteville and Hasan Seyyedhasani, worked on integration of computer vision and hydraulic actuation using ROS and also gained field work experience.
At CMU: Two undergraduate interns were employed in the past year, gaining training and experience in the development of deep-learning based computer vision systems and field validation experiments (Cohen, Walker). Two of the CMU staff employed by the project made the transition to become master's students in order to learn advanced AI and robotics techniques as applicable to agriculture (Baweja, Panaje). How have the results been disseminated to communities of interest? PI Vougioukas presented results from this project in grower and industry stakeholder meetings such as: sweet cherries grower meeting on January 10, 2018; Lake County North Coast Pear Growers meeting on February 7, 2018; Sacramento River District Growers meeting, February 6, 2018; Innovation Summit, UC Davis, April 23, 2018; UC Extension Tree Fruit Orchard Tour, June 26, 2018. He also interviewed for the Wall Street Journal; see article, October 2, 2018 (https://www.wsj.com/articles/robots-head-for-the-fields-1538426976) and podcast (https://www.wsj.com/podcasts/browse/wsj-the-future-of-everything). Co-PI Kantor has given several public presentations that include the results of this project: "Robotics and Artificial Intelligence for Tree Fruit Crops: Emerging Technologies and How to Get Ready for Them". Keynote Address, International Tree Fruit Association Annual Meeting, Rochester, NY, February 25, 2018 (audience: apple growers). "Robotics and AI for Rapid In-Field Phenotyping". Invited speaker, Phenome 2019, Tucson, AZ, February 7, 2019 (audience: academic researchers in plant biology, genetics, computer vision, and robotics). "AI and Robotics for Crop Management and Breeding". Electrical and Computer Engineering Distinguished Colloquium Series, University of Maryland, College Park, MD, September 14, 2018 (audience: UMD faculty and students). "AI Will Help Feed a Growing Planet". SXSW 2018, Austin, TX, March 13, 2018 (audience: general public). What do you plan to do during the next reporting period to accomplish the goals? Some hardware modifications will be made to the platform to facilitate easier and safer picking for the workers. Also, platform moving-speed control will be fully integrated into the control algorithm and will be implemented physically by controlling the engine's throttle. Another goal is to increase the camera's field of view (FoV). From field trials, we observed that both the original and the new camera models cover approximately 75% of the canopy. In our previous experiments, this field of view sufficed due to the limited vertical reach of the platform that raised and lowered the pickers. Our plan for the next field test is to increase that vertical reach, which requires an increase of the camera FoV. To achieve this, we will arrange two cameras (new model) in a vertical configuration that, combined, will cover the entire canopy. A related goal is to register fruit counts from multiple cameras. The vertical dual-camera system mentioned above will have significant overlap between the top and bottom stereo pairs, which will result in multiple counts of the same fruit. To prevent that, we will calibrate the multi-stereo system to obtain the proper coordinate transformation between the pairs. Once this is achieved, fruit coordinates will be registered using standard approaches such as the Iterative Closest Point (ICP) algorithm. Finally, apple-picking experiments will be performed to evaluate the final version of the system and quantify the labor savings.
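As a rough, hypothetical sketch of the final deduplication step only, assuming the two stereo pairs' fruit centers have already been registered into a common frame (e.g., after calibration and ICP), a simple nearest-neighbor radius test could merge the overlapping detections; the radius and data layout are assumptions for this example.

import numpy as np
from scipy.spatial import cKDTree

def merge_overlapping_detections(top_pts, bottom_pts, radius=0.04):
    """Merge fruit centers (in a shared frame, meters) from two stereo pairs.

    Any bottom-pair detection within `radius` of a top-pair detection is
    treated as the same apple and dropped; the rest are appended.
    """
    top_pts = np.asarray(top_pts, dtype=float)
    bottom_pts = np.asarray(bottom_pts, dtype=float)
    if len(top_pts) == 0:
        return bottom_pts
    tree = cKDTree(top_pts)
    dists, _ = tree.query(bottom_pts, k=1)
    unique_bottom = bottom_pts[dists > radius]
    return np.vstack([top_pts, unique_bottom])

# toy usage: three fruits seen by the top pair, two by the bottom pair,
# one of which (the last) is the same apple seen by both
top = [[0.1, 0.0, 1.9], [0.4, 0.0, 2.1], [0.8, 0.0, 2.0]]
bottom = [[0.5, 0.0, 1.2], [0.81, 0.0, 2.01]]
merged = merge_overlapping_detections(top, bottom)   # 4 unique fruits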

Impacts
What was accomplished under these goals?
Research Goal 2: Develop a sensing system for platform-based fruit picking.
Objective 2.1: Estimate fruit yield density.
Cost-effective camera system. One of the major objectives of this research is to develop a robust, accurate and relatively inexpensive sensing system to detect fruit yield density and worker picking rates. A cost-effective (≈$3,000) stereo camera system was developed that uses active lighting to significantly reduce the effect of natural illumination in images. In these cameras, bright pulses of light from high-power xenon flashes are synchronized with the camera's trigger mechanism.
Computer vision algorithms. In the course of this project, we developed two apple-detecting computer vision algorithms. The first one, the Angular Invariant Maximal (AIM) detector, utilizes specular reflection of the flash from the shiny curved surface of apples; it robustly classified fruit instances and calculated their sizes. However, in regions where fruits are heavily occluded by leaves, branches, and surrounding fruits, and in poorly lit areas, the algorithm does not perform as well. The second vision algorithm was based on a Faster Region-based Convolutional Neural Network (F-RCNN) to detect apples. This year, the performance of the two algorithms and their dependence on the camera platform were evaluated and compared.
Experiment design. Fourteen randomly selected sections of apple trees were marked with QR tags as calibration plots in a test row of a commercial apple orchard. These tags covered approximately a 1 m x 1 m area and were hung vertically between the third and the fourth trellis wire. Images were taken both by the original, expensive camera and by the new, less expensive system. We determined the ground truth, i.e., the actual number of apples in a region, by manually counting the apples in the static images of the QR-tagged regions of the canopies. We evaluated accuracy by comparing the ground truth to the apples detected by the two computer vision algorithms. For performance metrics, we used precision, recall, and F1 scores.
Experiment 1: Comparing computer vision algorithms. The two computer vision algorithms were applied to the images taken by the new camera system and compared based on their precision and recall. The performance objective for the two computer vision algorithms in this domain is to achieve a good balance between precision and recall rather than sacrificing one metric for the other. We saw that F-RCNN was able to achieve this balance far better than AIM. The precision-recall (P-R) curve of F-RCNN is concentrated in the region with higher precision and recall values. On the other hand, the AIM algorithm has high precision, but its recall is almost half that of F-RCNN. Hence, F-RCNN detects most of the fruit, and most of the detected fruits are true detections with few false positives.
Experiment 2: Evaluating the effect of the camera platform on the computer vision algorithms. To compare the new and previous camera systems, the outcomes of F-RCNN and AIM on images of the same calibration plots from the two cameras were compared. The results of F-RCNN for both cameras were similar and closely matched the ground truth numbers, but never exceeded them (no false positives). For AIM, the old camera detected more apples than the new one; there were also more false positives. The fluctuation between the two camera systems was mainly because of the extensive hyperparameter tuning required for proper application of AIM to data from different cameras. This further supports our inclination to use F-RCNN, which is more general than AIM and does not require this kind of parameter customization for each camera system. In addition, F-RCNN has better R² values than AIM for both the old camera (0.5 vs. 0.39) and the new camera (0.59 vs. 0.61). Note also that the R² values for the old camera for both F-RCNN and AIM are actually less than the ones for the new camera, which supports use of the new camera system.
Research Goal 3: Develop a human-in-the-loop platform control system.
Objective 3.1: Develop a control algorithm for platform operation. The fruit harvesting process was modeled as a Markovian process, with appropriate state and action spaces and a stochastic transition function, which essentially simulates the picking process. A Monte-Carlo search controller was developed to optimize the vertical positioning of the pickers' lifts as a function of the sensed fruit densities in front of the platform and the workers' picking rates. The controller was written in C++ and ROS and runs in real time.
Objective 3.2: Integrate real-time sensing and control. The Ph.D. student who wrote the controller code (Zhenghao Fei) travelled to CMU and collaborated with local researchers to integrate the code of the camera fruit-sensing system, the platform GPS and the optimizing controller. All software components were implemented within ROS and were tested in the lab, in a harvest mock-up at UC Davis, and in an apple orchard.
Research Goal 4: System prototyping and field evaluation.
Objective 4.1: Robotize harvest-aid platform. The orchard platform was retrofitted so that workstations are actuated and elevated via computer control. The entire upper deck of the platform was removed, and two lifts were designed and fabricated at UC Davis. Special attention was given to safe operation. Hydraulic cylinders, control circuits and valves were installed, along with displacement sensors that measure the lifts' vertical displacements. An Arduino was interfaced to control the hydraulic valves and read sensor outputs, and was programmed to control the vertical motions of the two lifts. The system was tested and worked very well. Prior to the main experiment in the field, we tested the integration of the camera and the orchard platform in a controlled setup. Fruits were manually hung at different heights and densities to simulate real apple tree canopies. The camera system was mounted at a fixed height and distance from the fruit wall. The platform was manually driven at slow speeds while the camera imaged the simulated canopy. From the consecutive images, the fruit detection algorithm identified the fruits, and the centers of the detected fruits were forwarded to the control system, which adjusted the picking heights for two pickers.
Objective 4.2: Evaluate robotic platform in orchards. The platform with the integrated camera and lift systems was tested in an apple orchard in Lodi, CA on September 11, 12 and 13, 2018. In these experiments, platform speed was not controlled by the optimizing controller; this capability will be added in the next period. The main results were the following. When driving at constant speed along one row, the mean difference in the time needed to fill a bin between the fixed-lift case (40 minutes) and the computer-controlled case (37.5 minutes) was 2.5 minutes. In another row, when the lifts were fixed and the speed was constant, it took 57 minutes to fill a bin; however, when the lifts were dynamically controlled and the speed was also adjusted (by the operator), this time decreased to 45 minutes. In another row, the bin-fill time did not change when the lifts were static or moving, but it did change when the speed was adjusted dynamically. Overall, results were inconclusive, but it became clear that controlling the travel speed plays an important role in platform performance.
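For reference, the evaluation metrics quoted in this and the other reporting periods (precision, recall, F1, and the per-plot count R² used to compare counts against the QR-tagged ground truth) can be reproduced with a few lines of code; the numbers below are made up for illustration, and the squared-correlation convention for R² is an assumption.

import numpy as np

def detection_metrics(tp, fp, fn):
    """Precision, recall and F1 from true/false positives and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

def count_r2(manual_counts, detected_counts):
    """Squared correlation between per-plot detected and manually counted apples."""
    r = np.corrcoef(manual_counts, detected_counts)[0, 1]
    return r ** 2

# made-up numbers for illustration only
print(detection_metrics(tp=182, fp=16, fn=18))
print(count_r2([34, 27, 41, 22, 30], [31, 25, 37, 21, 27]))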

Publications

  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Durand-Petiteville, A., Le Flecher, E., Cadenat, V., Sentenac, T., Vougioukas, S.G. (2018). Tree detection with low-cost 3D sensors for autonomous navigation in orchards. IEEE Robotics and Automation Letters. 3(4): 3876-3883.


Progress 12/01/16 to 11/30/17

Outputs
Target Audience: Students: One Ph.D. student was mentored in the context of this project. Informal laboratory instruction was the key instrument. Growers & Stakeholders: Our team visited one grower in Lodi, CA, informed him about the progress of the project and conducted field experiments. The PI presented results from this project in the following grower and industry stakeholder meetings: sweet cherries grower meeting on Jan. 5, 2017 at Cabral Agricultural Center, Stockton, CA; Washington Tree Fruit Commission, at UC Davis, March 1, 2017; Almond Board of California, Harvest Technology Roundtable at UC Davis, March 15, 2017; Silicon Valley Forum at UC Davis, April 4, 2017. General public: Precision Ag conference series, organized by the World Food Center at UC Davis, April 27, 2017; CITRIS Agricultural Technology Fair, UC Merced, March 8, 2017. Researchers: The PI and co-PI presented project progress at the NRI meeting in Arlington, in November 2017. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? At UC Davis, one Ph.D. student, Zhenghao Fei, was mentored and learned to perform printed circuit board design, calibrate load cells, and write code for signal processing. He also wrote a journal manuscript and presented at the ASABE Intl. meeting. Also, project scientist John Shepard gained experience on real-time closed-loop control of hydraulic cylinders using an Arduino, and R&D engineer Dennis Sadowski gained experience in designing and building hydraulically actuated equipment. At CMU, project scientist Abhisesh Silwal received training on using the imaging sensor setup for logging image data from the test orchard. How have the results been disseminated to communities of interest? Ph.D. student Zhenghao Fei presented results to researchers and professionals with an oral presentation at the 2017 Intl. ASABE meeting. Work was presented by the PI at the National Robotics Initiative (NRI) Principal Investigators meeting in November 2017. A short presentation was given to the entire NRI audience, along with a 10-minute focus session with the USDA Program Manager and other agricultural robotics principal investigators in a side meeting. The project's aims, approaches and results were also presented by PI Vougioukas at various meetings: sweet cherries grower meeting on Jan. 5, 2017 at Cabral Agricultural Center, Stockton, CA; Washington Tree Fruit Commission, at UC Davis, March 1, 2017; Almond Board of California, Harvest Technology Roundtable at UC Davis, March 15, 2017; Silicon Valley Forum at UC Davis, April 4, 2017; Precision Ag conference series, organized by the World Food Center at UC Davis, April 27, 2017; CITRIS Agricultural Technology Fair, UC Merced, March 8, 2017. What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals?
Research Goal 1: Modeling of fruit harvesting with a robotic orchard platform.
Objective 1.1: Create instances of virtual orchards. Twenty cling peach trees with fruits were manually digitized in summer 2017, whereas dormant cling peach trees (without leaves) were digitized using a high-resolution 3D lidar in late fall 2017.
Objective 1.2: Develop human picking models. Picker apple harvesting rates as a function of time and position in the orchard were measured by recording and processing the weight of the fruit carried by the picker in the picking bag (see Objective 2.2). The statistics of this rate are used to model and simulate the picking process.
Research Goal 2: Develop a sensing system for platform-based fruit picking.
Objective 2.1: Estimate fruit yield density. Data were collected at the Lodi orchard on September 7, 2017. The CMU imaging sensor was used to capture high-resolution stereo images of one side of a row (60 m) in the orchard. The collected images were geo-referenced with GPS data. For assessing the accuracy of the imaging sensor, 15 plots of 1 meter length were selected along the test row. The start and end of each plot were marked with QR tags. The total fruit within each of these plots was manually counted and recorded. For detecting the apples in the images, the CMU team used the Angular Invariant Maximal key-point detector to locate potential fruit centers across the image. For each detected key-point, a 6 x 8 grid is constructed around it. For each grid cell, a feature vector is generated by combining SURF and Radial HOG features. These grid feature vectors are then passed through a random forest classifier to determine whether the grid cell belongs to part of an apple. If more than 50% of the grid cells associated with a keypoint have been classified positively, then the corresponding keypoint is classified as an apple. This algorithm is able to accurately detect the fruit in an image in about 0.3 seconds per image and hence can run in real time. Using this algorithm, CMU achieved an R² of 0.72 against the ground-truth data. The CMU team has also developed ROS nodes to publish: 1. the global fruit coordinates of the detected fruit (with respect to the Bandit platform) along the orchard row; 2. the visible fruit count for every 0.5-meter grid cell (length and height) along the row. CMU also started exploring the use of a Generative Adversarial Network for constructing an imaging unit without flash. For this purpose, an experimental stereo rig was used to image the same test plot. This sensor was a low-resolution color camera from PointGrey (Chameleon 3) with a spatial resolution of 2048 x 1536 pixels. This experimental sensor was not equipped with the active lighting system used by the CMU imaging sensor. To compensate for the variability caused by outdoor lighting, images with three different exposures were taken. It is expected that fusion of multiple exposures will reduce the effect of variable illumination without active lighting. The output, in the form of an exposure-fused image, will then be passed to a Generative Adversarial Network that will attenuate the background using depth information. The end result would be similar to images acquired using the current CMU sensor. This concept is currently being worked on as a cost-effective alternative.
Objective 2.2: Real-time estimation of each worker's fruit picking rate. A second-generation instrumented picking bag was used to measure harvested fruit weight. Load cells were repositioned, and all electronics were placed on a specially designed printed circuit board (PCB) inside an enclosure. The electronics included an Arduino, signal conditioning circuits, an XBee shield, a GPS, and an SD card. The microcontroller transmits all time-stamped data in real time via ROS. Data are filtered using a median and a low-pass filter to reduce noise. Static and dynamic calibration tests were performed anew in the lab over the weight range of the bag's capacity (20 kg) using baseballs and apples. Results showed a mean error of 0.36 kg, or about 2.5 average-size apples. The bag was used in the Lodi orchard by a professional picker. Forty-five bags were filled, and the collected data provided real-time estimation of picker harvesting speed. The average picking pace was 1.65 seconds per fruit.
Research Goal 3: Develop a human-in-the-loop platform control system.
Objective 3.1: Develop a control algorithm for platform operation. The fruit harvesting process was modeled as a Markovian process, with appropriate state and action spaces and a stochastic transition function, which essentially simulates the picking process. A stochastic model predictive control framework is under development, and search techniques are being considered to compute optimal actions (picker elevations) given the sensed fruit yield distribution, the platform's moving speed and the workers' picking speeds.
Research Goal 4: System prototyping and field evaluation.
Objective 4.1: Robotize harvest-aid platform. The platform was retrofitted so that the right-rear workstation is actuated hydraulically and elevated via computer control. A CAD design and the fabrication of the lifting mechanism were completed, and a hydraulic cylinder was installed. A displacement sensor was also installed, and closed-loop position control of the hydraulic cylinder was achieved using an Arduino. The right-front workstation will also be retrofitted for hydraulic actuation.
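The closed-loop cylinder positioning mentioned above was implemented on an Arduino; the following Python snippet is only a conceptual sketch of a proportional position loop with output limiting of the sort such a lift controller might use. The gains, limits, deadband and the crude lift-response model are invented for illustration and do not reflect the actual embedded code.

class LiftPositionController:
    """Toy proportional controller driving a hydraulic lift toward a target height."""

    def __init__(self, kp=2.0, max_valve_cmd=1.0, deadband=0.01):
        self.kp = kp                      # proportional gain (valve command per meter)
        self.max_valve_cmd = max_valve_cmd
        self.deadband = deadband          # meters; ignore tiny position errors

    def update(self, target_height, measured_height):
        error = target_height - measured_height
        if abs(error) < self.deadband:
            return 0.0                    # close the valve near the setpoint
        cmd = self.kp * error
        return max(-self.max_valve_cmd, min(self.max_valve_cmd, cmd))

# toy simulation of one lift converging to a 1.5 m setpoint
controller = LiftPositionController()
height, dt = 0.8, 0.05
for _ in range(100):
    cmd = controller.update(1.5, height)
    height += 0.2 * cmd * dt              # crude lift response to the valve command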

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Fei, Z., Shepard, J., Vougioukas, S. (2017). Instrumented picking bag for measuring fruit weight during harvesting. ASABE Annual International Meeting. Paper Number 1701385. Spokane, Washington.
  • Type: Journal Articles Status: Submitted Year Published: 2017 Citation: Fei, Z., Shepard, J., Vougioukas, S. Instrumented picking bag for measuring fruit weight during harvesting. Transactions of the ASABE.


Progress 12/01/15 to 11/30/16

Outputs
Target Audience: Students: The PI mentored two Ph.D. students in the context of this project. Informal laboratory instruction was the key instrument. Growers: Our team visited one grower in Lodi, CA, informed them about the project and conducted field experiments. The PI also presented at a sweet cherries grower meeting on Jan. 5, 2017 at Cabral Agricultural Center, Stockton, CA. Researchers: The PI and co-PI presented project progress at the NRI meeting in Arlington, in November 2017. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? At UC Davis, Ph.D. candidate Zhenghao Fei learned to use the Robot Operating System, to integrate electronics, microcontrollers and sensors to measure fruit weight in an instrumented bag, and to perform static and dynamic calibration of the sensors. Research engineer Dennis Sadowski came up to speed with orchard platform technologies as he started designing the platform's retrofit for robotic operation. Research engineer John Shepard improved his skills in Arduino-based coding, filtering and sensor calibration. At CMU, research engineers Zania Pothen and Devdatta Narote received training during the first year of the program. They were taught how to use the Robot Operating System, how to use the C++ apple detection software framework, and how to integrate it with a GPS geolocation data stream. How have the results been disseminated to communities of interest? Work was presented at the National Robotics Initiative (NRI) Principal Investigators meeting in November 2016. A short 2-minute presentation and a 1-hour poster presentation were given to the entire NRI audience, along with a 10-minute focus session with the USDA Program Manager and other agricultural robotics principal investigators in a side meeting. The project's aims, approaches and preliminary results were also presented by PI Vougioukas at CA sweet cherry grower meetings: one on Jan. 5 and another on Feb. 6, both in Stockton, CA. What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals?
Research Goal 1: Modeling of fruit harvesting with a robotic orchard platform.
Objective 1.1: Create instances of virtual orchards. The fruit locations of high-density SNAP apple trees were digitized in a Lodi orchard in order to create a small part of a virtual orchard. The collected data were processed, filtered and stored in '.csv' format.
Objective 1.2: Develop human picking models. Picker apple harvesting rates as a function of time and position in the orchard were measured by recording and processing the weight of the fruit carried by the picker in the picking bag (see Objective 2.2).
Research Goal 2: Develop a sensing system for platform-based fruit picking.
Objective 2.1: Estimate fruit yield density. Data were collected at the test Lodi orchard on September 9, 2016. The CMU imaging sensor was used to image a test plot of the orchard. GPS data, pick sensor data and ground-truth apple coordinate data were also captured. There are several challenges associated with detecting fruit in images taken in outdoor, unstructured environments such as apple orchards: a large number of apples are partially occluded by foliage and surrounding apples; the surface of the fruit is smooth and therefore lacks texture and defining features; and detection of the same apple in multiple images results in multiple counts. We overcame the first two limitations by using the Angular Invariant Maximal detector, which utilizes the gradual variation of intensity and gradient orientation formed on the surface of the fruit. For evaluation purposes, "leave-one-out" cross-validation was performed on 20 randomly selected images. The visible apples in each of these 20 images were manually labeled to serve as ground truth for validation. The detection algorithm achieved an overall F1 score of 0.84 and an R² of 0.69. The algorithm achieved 100% precision and 73% recall. Moderate recall is not a concern because multi-view imaging can compensate, with multiple images captured of any one apple. Yield measurements were also made from the images and assigned to world locations using the GPS data. Initial results showed a correlation between image-based and ground-truth yields with R² = 0.60.
Objective 2.2: Real-time estimation of each worker's fruit picking rate. A commercial picking bag was instrumented to measure harvested fruit weight. Two load cells were placed inside an enclosure, which was placed between the bag and its shoulder straps without hindering picking motions. The load cells measure the forces exerted on the straps by the bag and fruits. All electronics were placed inside the enclosure and included an Arduino, signal conditioning circuits, an XBee shield, a GPS, and an SD card. The microcontroller transmits data in real time and saves time-stamped data on the SD card. Data are filtered using a median and a low-pass filter to reduce noise. Dynamic calibration was performed in the lab over the weight range of the bag's capacity (20 kg). Baseballs provided consistent weight and volume and were placed in the bag to provide a staircase ground-truth weight signal. Two people carried the bag and moved in a manner analogous to pickers. Results showed a mean error of 0.39 kg, a standard deviation of 0.42 kg, and a 95th percentile of 1.04 kg. Major error sources included bag acceleration and body reaction force.
Research Goal 4: System prototyping and field evaluation.
Objective 4.1: Robotize harvest-aid platform.
The platform needs to be retrofitted so that workstations are actuated and elevated via computer control. A CAD design of the lifting mechanism was completed and hydraulic actuators were selected.
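A tiny sketch of how the dynamic-calibration error statistics reported above for the instrumented bag (mean, standard deviation, 95th percentile) could be computed from a staircase ground-truth run; the noise level, sampling scheme and the use of absolute error are assumptions made only for illustration.

import numpy as np

def calibration_error_stats(measured_kg, true_kg):
    """Mean, standard deviation and 95th percentile of the absolute weight error."""
    err = np.abs(np.asarray(measured_kg, dtype=float) - np.asarray(true_kg, dtype=float))
    return err.mean(), err.std(), np.percentile(err, 95)

# toy staircase ground truth: weight added in 0.5 kg steps up to 20 kg,
# held for 40 samples each, with noisy load-cell readings on top
true_weights = np.repeat(np.arange(0.0, 20.5, 0.5), 40)
rng = np.random.default_rng(3)
measured = true_weights + rng.normal(0.0, 0.4, true_weights.size)

mean_err, std_err, p95_err = calibration_error_stats(measured, true_weights)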

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2017 Citation: Fei, Z., Shepard, J., Vougioukas, S. (2017). Instrumented picking bag for measuring fruit weight during harvesting. ASABE Annual International Meeting. Spokane, Washington.