Progress 12/01/15 to 11/30/19
Outputs Target Audience:One post-doctoral researcher and four Ph.D. students were mentored in the context of the project. Our research team worked closely with growers in Lodi, CA and conducted field experiments and time studies in their fields. One Ph.D. student presented findings at the 2019 ASABE Intl. Meeting in Boston, MA to researchers, students, and academics. The PI and Co-PI gave several presentations related to robotic harvest-aids. The audiences included growers, entrepreneurs, researchers, students, academic staff and the general public. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?UC Davis: One post-doctoral researcher (Seyyedhasani) and two Ph.D. students (Zhenghao - male; Pueyo Svoboda - female) were mentored in the context of the project and gained experience in real-time robotic control, sensing and data communications, and field experimental work. The post-doc and the Ph.D. students worked closely with apple growers in Lodi, CA and conducted field experiments and time studies in their fields. One Ph.D. student presented findings at the 2019 ASABE Intl. Meeting in Boston, MA to researchers, students, and academics. CMU: Two undergraduate interns were employed in the past year, gaining training and experience in the development of deep learning-based computer vision systems and field validation experiments (Cohen, Walker). Three of the CMU staff employed by the project made the transition to becoming master's students in order to learn advanced AI and robotics techniques as applicable to agriculture (Baweja, Panaje, Anjana). How have the results been disseminated to communities of interest?UC Davis: The results of this research were presented at the 2019 ASABE Intl. Meeting in Boston, MA. Also, the PI gave several presentations related to the project and robotic harvest-aids in general. The audiences were growers, entrepreneurs, researchers, students, academic staff and the general public. December 3, 2018. Almond Board of California, Davis, CA. January 30, 2019. Cling Peach Board, Sacramento, CA. February 5, 2019. California Pear Board, Davis, CA. February 13, 2019. Morning Star Company, Davis, CA. March 19, 2019. World Bank Headquarters, Washington, D.C. April 1, 2019. Western Center for Agricultural Health and Safety, Davis, CA. April 3, 2019. California Strawberry Commission, Cal Poly, San Luis Obispo. August 27, 2019. Taylor Farms, Salinas, CA.
CMU: Co-PI Kantor has given several public presentations that include the results of this project: June 20, 2019: Presentation titled "Cameras for Sensing in Vineyards." Invited speaker, American Society for Enology and Viticulture (ASEV) National Meeting, Napa CA (audience: grape growers). June 21, 2019: "Bringing AI to the Field: Robotics and Sensing in Agricultural Applications." Invited speaker, AI For Good, San Francisco CA (audience: academic researchers in agriculture). August 25-29, 2019: "Robotic Field Measurements for Plant Breeding and Crop Management." Invited speaker, National Association of Plant Breeders (NAPB) Annual Meeting, Pine Mountain GA. November 21, 2019: Mini-workshop titled "CPS for Agriculture." NSF Cyber-Physical Systems PI Meeting, Arlington VA (audience: academic researchers in plant biology, genetics, computer vision, and robotics). December 9, 2019: Presentation titled "Technology in Tree Fruit: What's Here? What's Close? What's Coming?" Keynote speaker, Washington State Tree Fruit Association Annual Meeting, Wenatchee WA (audience: apple growers). January 15, 2020: "Emerging AI Tools for Specialty Crops." Invited speaker, PrecisionAg Vision Conference, Seattle WA. What do you plan to do during the next reporting period to accomplish the goals?
Nothing Reported
Impacts What was accomplished under these goals?
ENTIRE PROJECT IMPACT Major activities completed / experiments conducted. The fruit locations of high-density SNAP apple trees were digitized in a Lodi orchard in order to create a small part of a virtual orchard. A low-cost dual stereo camera was developed, along with fruit detection software based on machine learning. Two fully instrumented picking bags were developed to monitor in real time the weight of manually harvested fruit. A commercial platform was retrofitted with two hydraulic cylinders and lifts, so that a computer can elevate each picker individually to a different height. The platform's throttle was also actuated via a stepper motor to enable computer-controlled platform speed. The fruit harvesting process was modeled as a Markovian process, with appropriate state and action spaces and a stochastic transition function, which essentially simulates the picking process. A stochastic model predictive controller that performs sparse sampling was developed; it takes as input the fruit distribution in the canopies in front of the platform, each picker's picking rate, the current platform speed, and the picker heights, and computes the new platform speed and picker-lift actions (up/down/stay) that maximize machine harvest efficiency. The PI and co-PI presented the project's aims, approaches and results at several grower meetings (e.g., sweet cherries growers, Washington Tree Fruit Commission, Almond Board of California), to general audiences (e.g., Silicon Valley Forum, World Food Center, CITRIS Agricultural Technology Fair) and at conferences (e.g., ASABE, Phenome). The project was also covered in popular press venues (e.g., a Wall Street Journal article and podcast (10/1/18), the New Yorker (4/15/19) and several press websites). Data collected.
Picker apple harvesting rates as a function of time and position in the orchard were measured by recording and processing the weight of the fruit carried by the picker in the picking bag. Harvest-related data were collected and stored during commercial harvesting over three consecutive summers, in apple orchard blocks in Lodi, CA. Data included video streams of tree canopies with fruits; picking bag real-time weight data; platform actuator control signals and state signals; and platform GNSS position and speed data. Summary statistics and discussion of results. The apple detection system achieved a precision of 0.92, recall of 0.91 and an overall F1 score of 0.91, thus pushing forward the state of the art. The RMSE and 90th percentile errors of the weight monitored by the picking bag were less than 0.36 kg and 0.56 kg, respectively (~2.7 apples); these errors correspond to 1.8% and 2.8% of the bag capacity. The robotic platform's harvesting throughput increased by 9.5% when worker elevations were controlled by the model-predictive controller vs. when workers picked at fixed heights. However, when platform speed was controlled concurrently with worker elevations, the corresponding throughput increased by 24.5%. Key outcomes or other accomplishments realized. The major accomplishment of this work is that it demonstrated that when an orchard platform's speed and worker elevations are controlled by a computer that utilizes fruit distribution and picking rate information, harvest throughput can increase significantly, by up to 25%. Another major accomplishment is the design and development of a low-cost ($3,000) dual stereo camera that covers the entirety of the canopy. The camera can be used for yield mapping and robotic harvesting applications. A third major accomplishment is the development of fully instrumented picking bags that monitor in real time the weight of manually harvested fruit. The bags can be used for labor monitoring and management, and for yield mapping.
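The sparse-sampling controller described above can be illustrated with a toy sketch for a two-picker platform. Everything below (state variables, probabilities, candidate speeds, and the transition model) is a simplified placeholder, not the project's calibrated picking model:

```python
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    speed: float      # platform travel speed (m/s)
    heights: tuple    # picker lift heights (m)
    fruit_ahead: int  # fruit count in the canopy section ahead

SPEEDS = (0.02, 0.05, 0.08)       # candidate platform speeds (m/s)
LIFT_MOVES = (-0.3, 0.0, 0.3)     # per-picker lift action: down / stay / up
ACTIONS = [(s, (m1, m2)) for s in SPEEDS
           for m1 in LIFT_MOVES for m2 in LIFT_MOVES]

def step(state, action, rng):
    """Toy stochastic transition: each picker removes a fruit with a
    probability that drops as the platform moves faster. Lift heights
    are tracked but, in this toy model, do not affect pick probability."""
    speed, moves = action
    heights = tuple(min(3.0, max(0.0, h + m))
                    for h, m in zip(state.heights, moves))
    picked = sum(rng.random() < 0.6 / (1.0 + 20.0 * speed) for _ in heights)
    picked = min(picked, state.fruit_ahead)
    arriving = int(100 * speed)   # new canopy entering the pickers' reach
    return State(speed, heights, state.fruit_ahead - picked + arriving), picked

def plan(state, depth=2, samples=4, rng=None):
    """Sparse sampling: score every action with a few random rollouts of
    the picking model and return the highest-scoring one."""
    rng = rng or random.Random(42)

    def q(st, a, d):
        total = 0.0
        for _ in range(samples):
            nxt, picked = step(st, a, rng)
            if d > 1:
                total += picked + max(q(nxt, b, d - 1) for b in ACTIONS)
            else:
                total += picked
        return total / samples

    return max(ACTIONS, key=lambda a: q(state, a, depth))

best_speed, best_lifts = plan(State(speed=0.05, heights=(1.0, 2.0), fruit_ahead=40))
```

The key idea matches the text: instead of evaluating the full stochastic tree, only a small number of sampled rollouts per action are used, which keeps the controller fast enough to run in real time.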
Publications
- Type:
Journal Articles
Status:
Published
Year Published:
2019
Citation:
Vougioukas, S.G. (2019). Agricultural Robotics. Annual Review of Control, Robotics, and Autonomous Systems. 2:365-392.
- Type:
Journal Articles
Status:
Submitted
Year Published:
2020
Citation:
Fei, Z., Shepard, J., Vougioukas, S.G. (2020). Instrumented Picking Bag for Measuring Fruit Weight During Manual Harvesting. Transactions of the ASABE.
- Type:
Journal Articles
Status:
Published
Year Published:
2019
Citation:
Charlton, D., Taylor, J.E., Vougioukas, S.G., Rutledge, Z. (2019). Can Wages Rise Quickly Enough to Keep Workers in the Fields? Choices, 2nd Quarter 34(2).
http://www.choicesmagazine.org/choices-magazine/submitted-articles/can-wages-rise-quickly-enough-to-keep-workers-in-the-fields
- Type:
Journal Articles
Status:
Published
Year Published:
2019
Citation:
Charlton, D., Taylor, J.E., Vougioukas, S.G., Rutledge, Z. (2019). Innovations for a Shrinking Agricultural Workforce. Choices, 2nd Quarter 34(2).
http://www.choicesmagazine.org/choices-magazine/submitted-articles/estimating-value-damages-and-remedies-when-farm-data-are-misappropriated/innovations-for-a-shrinking-agricultural-workforce
- Type:
Conference Papers and Presentations
Status:
Submitted
Year Published:
2020
Citation:
Fei, Z., Silwal, A., Pothen, Z., Kantor, G., Vougioukas, S.G. (2020). A Co-robotic Harvest-aid Platform for Increased Harvest Efficiency. SUBMITTED. Intl. Conference on Agricultural Engineering (AgEng 2020, July 5-9), Evora, Portugal.
- Type:
Conference Papers and Presentations
Status:
Submitted
Year Published:
2020
Citation:
Fei, Z., Silwal, A., Pothen, Z., Kantor, G., Vougioukas, S.G. (2020). Dynamic Height Control for Robotized Apple Orchard Harvest-aid Platform with Multiple Pickers to Achieve Optimal Picking Efficiency. SUBMITTED. ASABE Annual International Meeting. Omaha, Nebraska.
Progress 12/01/17 to 11/30/18
Outputs Target Audience:Students: PI Vougioukas presented the project and main results to undergraduate students in the context of the course EBS1 "Introduction to Biosystems Engineering" and the graduate course EBS289K "Agricultural Robotics," both at UC Davis. Co-PI Kantor gave a presentation titled "AI and Robotics for Crop Management and Breeding" at the University of Maryland, College Park, MD, on September 14, 2018, in the context of the Electrical and Computer Engineering Distinguished Colloquium Series (audience: UMD faculty and students). Growers & Stakeholders: PI Vougioukas presented results from this project in grower and industry stakeholder meetings such as: sweet cherries grower meeting on January 10, 2018; Lake County North Coast Pear Growers meeting on February 7, 2018; Sacramento River District Growers meeting, February 6, 2018; Innovation Summit, UC Davis, April 23, 2018; UC Extension Tree Fruit Orchard Tour, June 26, 2018. Co-PI Kantor gave a Keynote Address at the International Tree Fruit Association Annual Meeting, Rochester, NY, February 25, 2018 (audience: apple growers). The title was: "Robotics and Artificial Intelligence for Tree Fruit Crops: Emerging Technologies and How to Get Ready for Them." Also, he gave a presentation titled "Artificial Intelligence Technologies to Support Crop Management and Breeding" to industry representatives at the University Industry Consortium Annual Meeting, Pasco, WA, April 25, 2018. General public: PI Vougioukas: Wall Street Journal article, October 2, 2018 (https://www.wsj.com/articles/robots-head-for-the-fields-1538426976) and podcast (https://www.wsj.com/podcasts/browse/wsj-the-future-of-everything). Co-PI Kantor: Presentation "AI Will Help Feed a Growing Planet" at SXSW 2018, Austin, TX, March 13, 2018. Researchers: PI Vougioukas presented project progress at the NRI meeting in Arlington, in November 2018. Co-PI Kantor: Robotics and AI for Rapid In-Field Phenotyping.
Invited Speaker, Phenome 2019, Tucson AZ, February 7, 2019 (audience: academic researchers in plant biology, genetics, computer vision, and robotics). Changes/Problems:A no-cost extension was granted. The major reasons for the requested extension were: 1) Field experiments in apple orchards that have trees with suitable canopies can only be performed once a year - late August to early September - during a very short picking season. 2) We concluded this year's data collection and need to make some enhancements/changes to the robotic orchard platform that were not anticipated. 3) Therefore, we need more time to finalize the hardware and software design, which we must evaluate in the next picking season. What opportunities for training and professional development has the project provided?At UC Davis: One Ph.D. student, Zhenghao Fei, was mentored and gained experience in model-based control for the platform, electronics development for the picking bag, hydraulic actuation and real-time software development, and field experiments. One Ph.D. student, Chen Peng, worked on real-time hydraulics control. One Ph.D. student, Natalie Pueyo Svoboda, helped with field experiments and data processing. Two post-doctoral researchers, Adrien Durand-Petiteville and Hasan Seyyedhasani, worked on the integration of computer vision and hydraulic actuation using ROS and also gained field work experience. At CMU: Two undergraduate interns were employed in the past year, gaining training and experience in the development of deep learning-based computer vision systems and field validation experiments (Cohen, Walker). Two of the CMU staff employed by the project made the transition to becoming master's students in order to learn advanced AI and robotics techniques as applicable to agriculture (Baweja, Panaje).
How have the results been disseminated to communities of interest?PI Vougioukas presented results from this project in grower and industry stakeholder meetings such as: sweet cherries grower meeting on January 10, 2018; Lake County North Coast Pear Growers meeting on February 7, 2018; Sacramento River District Growers meeting, February 6, 2018; Innovation Summit, UC Davis, April 23, 2018; UC Extension Tree Fruit Orchard Tour, June 26, 2018. He was also interviewed for the Wall Street Journal; see the article, October 2, 2018 (https://www.wsj.com/articles/robots-head-for-the-fields-1538426976) and podcast (https://www.wsj.com/podcasts/browse/wsj-the-future-of-everything). Co-PI Kantor has given several public presentations that include the results of this project: "Robotics and Artificial Intelligence for Tree Fruit Crops: Emerging Technologies and How to Get Ready for Them." Keynote Address, International Tree Fruit Association Annual Meeting, Rochester, NY, February 25, 2018 (audience: apple growers). "Robotics and AI for Rapid In-Field Phenotyping." Invited Speaker, Phenome 2019, Tucson AZ, February 7, 2019 (audience: academic researchers in plant biology, genetics, computer vision, and robotics). "AI and Robotics for Crop Management and Breeding." Electrical and Computer Engineering Distinguished Colloquium Series, University of Maryland, College Park, MD, September 14, 2018 (audience: UMD faculty and students). "AI Will Help Feed a Growing Planet." SXSW 2018, Austin, TX, March 13, 2018 (audience: general public). What do you plan to do during the next reporting period to accomplish the goals?Some hardware modifications will be made to the platform to facilitate easier and safer picking for the workers. Also, platform moving speed control will be fully integrated into the control algorithm and implemented physically by controlling the engine's throttle.
Another goal is to increase the camera's field of view (FoV). From field trials, we observed that both the original and the new camera models cover approximately 75% of the canopy. In our previous experiments, this coverage sufficed because of the limited vertical reach of the platform that raised and lowered the pickers. Our plan for the next field test is to increase that vertical reach, which requires an increase in camera FoV. To achieve this, we will arrange two cameras (new model) in a vertical configuration that, combined, will cover the entire canopy. A related goal is to register fruit counts from multiple cameras. The vertical dual camera system mentioned above will have significant overlap between the top and bottom stereo pairs, which would result in multiple counts of the same fruit. To prevent that, we will calibrate the multi-stereo system to obtain the proper coordinate transformation between the pairs. Once this is achieved, fruit coordinates will be registered using standard approaches such as the Iterative Closest Point (ICP) algorithm. Finally, apple-picking experiments will be performed to evaluate the final version of the system and quantify the labor savings.
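The de-duplication step described above can be sketched as follows, assuming the bottom-to-top rigid transform (R, t) has already been estimated, e.g., by extrinsic calibration or ICP; the function name, merge radius, and toy coordinates are all illustrative:

```python
import numpy as np

def merge_fruit(top_pts, bottom_pts, R, t, radius=0.04):
    """Map bottom-camera fruit centers into the top-camera frame and
    drop any point closer than `radius` (m) to an existing detection."""
    mapped = bottom_pts @ R.T + t                 # bottom frame -> top frame
    merged = list(top_pts)
    for p in mapped:
        d = np.linalg.norm(np.asarray(merged) - p, axis=1)
        if d.min() > radius:                      # genuinely new fruit
            merged.append(p)
    return np.array(merged)

# Toy example: identity rotation, 0.5 m vertical offset between pairs;
# the first bottom-camera fruit is the same apple seen by the top pair.
R = np.eye(3)
t = np.array([0.0, 0.5, 0.0])
top = np.array([[0.1, 0.55, 1.0], [0.3, 0.9, 1.1]])
bottom = np.array([[0.1, 0.05, 1.0], [0.2, -0.2, 1.2]])
fruit = merge_fruit(top, bottom, R, t)            # 3 unique fruit, not 4
```

A full pipeline would estimate (R, t) from the overlap region itself (this is where ICP applies) and merge per canopy cell rather than globally, but the counting logic is the same.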
Impacts What was accomplished under these goals?
Research Goal 2: Develop a sensing system for platform-based fruit picking. Objective 2.1: Estimate fruit yield density. Cost-effective camera system One of the major objectives of this research is to develop a robust, accurate and relatively inexpensive sensing system to detect fruit yield density and worker picking rates. A cost-effective (≈ $3,000) stereo camera system was developed that uses active lighting to significantly reduce the effect of natural illumination in images. In these cameras, bright pulses of light from high-power xenon flashes are synchronized with the camera's trigger mechanism. Computer Vision Algorithms In the course of this project, we developed two apple-detecting computer vision algorithms. The first one, the Angular Invariant Maximal (AIM) detector, utilizes the specular reflection of the flash from the shiny curved surface of apples to robustly classify fruit instances and calculate their sizes. However, in regions where fruits are heavily occluded by leaves, branches, and surrounding fruits, and in poorly lit areas, the algorithm does not perform as well. The second vision algorithm detects apples with a Faster Region-based Convolutional Neural Network (F-RCNN). This year, the performance of the two algorithms and their dependence on the camera platform were evaluated and compared. Experiment Design Fourteen randomly selected sections of apple trees were marked with QR tags as calibration plots in a test row of a commercial apple orchard. These tags covered an area of approximately 1 m x 1 m and were hung vertically between the third and the fourth trellis wire. Images were taken both by the original, expensive camera and by the new, less expensive system. We determined the ground truth, i.e., the actual number of apples in a region, by manually counting the apples in the static images of the QR-tagged regions of the canopies.
We evaluated accuracy by comparing the ground truth to the apples detected by the two computer vision algorithms. For performance metrics, we used precision, recall, and F1 scores. Experiment 1: Comparing Computer Vision Algorithms The two computer vision algorithms were applied to the images taken by the new camera system and compared based on their precision and recall. The performance objective for the two computer vision algorithms in this domain is to balance high precision with high recall, and not to sacrifice one metric for the other. We saw that F-RCNN was able to achieve this balance far better than AIM. The precision-recall (P-R) curve of F-RCNN is concentrated in the region with higher precision and recall values. On the other hand, the AIM algorithm has high precision, but its recall is almost half that of F-RCNN. Hence, F-RCNN detects most of the fruit, and most of the detected fruits are true detections with few false positives. Experiment 2: Evaluating Effect of Camera Platform on Computer Vision Algorithms To compare the new and previous camera systems, the outcomes of F-RCNN and AIM on the images of the same calibration plots from these two cameras were compared. The results of F-RCNN for both cameras were similar and closely matched the ground truth numbers, but never exceeded them (no false positives). For AIM, the old camera detected more apples than the new one, and there were also more false positives. The fluctuation between the two camera systems was mainly due to the multiple hyperparameters that must be tuned for proper application of AIM to data from different cameras. This further supports our inclination to use F-RCNN, which is more general than AIM and does not require this kind of parameter customization for each camera system. In addition, F-RCNN achieved a higher R2 value than AIM for the old camera (0.50 vs. 0.39), while the two were comparable for the new camera (0.59 vs. 0.61).
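For reference, the precision, recall, and F1 metrics used in these experiments are computed from detection counts as follows; the counts in the example are illustrative, not measured values from the experiments above:

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall and F1 from true positives (correct detections),
    false positives (spurious detections), and false negatives (missed fruit)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts: 91 correct detections, 8 spurious, 9 missed.
p, r, f1 = detection_metrics(tp=91, fp=8, fn=9)
```

F1, the harmonic mean of precision and recall, is low whenever either metric is low, which is why it captures the "balance" objective discussed above better than either metric alone.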
Note also that the R2 values for the old camera for both F-RCNN and AIM are lower than those for the new camera, which supports the use of the new camera system. Research Goal 3: Develop human-in-the-loop platform control system. Objective 3.1: Develop control algorithm for platform operation. The fruit harvesting process was modeled as a Markovian process, with appropriate state and action spaces and a stochastic transition function, which essentially simulates the picking process. A Monte-Carlo search controller was developed to optimize the vertical positioning of the pickers' lifts as a function of the sensed fruit densities in front of the platform and the workers' picking rates. The controller was written in C++ and ROS and runs in real time. Objective 3.2: Integrate real-time sensing and control. The Ph.D. student who wrote the controller code (Zhenghao Fei) travelled to CMU and collaborated with local researchers to integrate the code of the camera fruit sensing system, the platform GPS and the optimizing controller. All software components were implemented within ROS and were tested in the lab, in a harvest mock-up at UC Davis, and in an apple orchard. Research Goal 4: System prototyping and field evaluation. Objective 4.1: Robotize harvest-aid platform. The orchard platform was retrofitted so that workstations are actuated and elevated via computer control. The entire upper deck of the platform was removed and two lifts were designed and fabricated at UC Davis. Special attention was given to safe operation. Hydraulic cylinders, control circuits and valves were installed, along with displacement sensors that measure the lifts' vertical displacements. An Arduino was interfaced to control the hydraulic valves and read sensor outputs, and was programmed to control the vertical motions of the two lifts. The system was tested and performed reliably.
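A closed-loop lift-positioning routine of the kind described under Objective 4.1 can be sketched as follows, simulated in Python rather than on the Arduino; the gain, deadband, loop rate, and plant model are illustrative assumptions, not the project's tuned values:

```python
def valve_command(position, target, deadband=0.01, gain=2.0):
    """Return a hydraulic valve command in [-1, 1]; positive extends the
    cylinder, and zero holds the lift once the error is inside the deadband."""
    error = target - position
    if abs(error) < deadband:
        return 0.0
    return max(-1.0, min(1.0, gain * error))

# Simulate the loop at 20 Hz with a toy plant: the cylinder moves at
# 0.3 m/s under a full valve command.
pos, dt = 0.0, 0.05
for _ in range(200):
    pos += 0.3 * valve_command(pos, target=1.2) * dt
```

The deadband keeps the valve from chattering around the setpoint, which matters for hydraulic hardware even though a pure proportional law would settle slightly closer to the target.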
Prior to the main experiment in the field, we tested the integration of the camera and the orchard platform in a controlled setup. Fruits were manually hung at different heights and densities to simulate real apple tree canopies. The camera system was mounted at a fixed height and distance from the fruit wall. The platform was manually driven at slow speeds while the camera imaged the simulated canopy. From the consecutive images, the fruit detection algorithm identified the fruits, and the centers of the detected fruits were forwarded to the control system, which adjusted the picking heights for two pickers. Objective 4.2: Evaluate robotic platform in orchards. The platform with the integrated camera and lift systems was tested in an apple orchard in Lodi, CA on September 11, 12 and 13, 2018. In these experiments, platform speed was not controlled by the optimizing controller; this capability will be added in the next period. The main results were the following. When driving at constant speed along a certain row, the mean difference in the time needed to fill a bin between fixed lifts (40 minutes) and computer-controlled lifts (37.5 minutes) was 2.5 minutes. In another row, when the lifts were fixed and the speed was constant, it took 57 minutes to fill a bin; however, when the lifts were dynamically controlled and the speed was also adjusted (by the operator), this time decreased to 45 minutes. In yet another row, the bin-fill time did not change whether the lifts were static or moving, but it did change when the speed was adjusted dynamically. Overall, results were inconclusive, but it became clear that controlling the moving speed plays an important role in platform performance.
Publications
- Type:
Journal Articles
Status:
Published
Year Published:
2018
Citation:
Durand-Petiteville, A., Le Flecher, E., Cadenat, V., Sentenac, T., Vougioukas, S.G. (2018). Tree detection with low-cost 3D sensors for autonomous navigation in orchards. IEEE Robotics and Automation Letters. 3(4): 3876-3883.
Progress 12/01/16 to 11/30/17
Outputs Target Audience:Students: One Ph.D. student was mentored in the context of this project. Informal laboratory instruction was the key instrument. Growers & Stakeholders: Our team visited one grower in Lodi, CA, informed him about the progress of the project, and conducted field experiments. The PI presented results from this project in the following grower and industry stakeholder meetings: sweet cherries grower meeting on Jan. 5, 2017 at Cabral Agricultural Center, Stockton, CA; Washington Tree Fruit Commission, at UC Davis, March 1, 2017; Almond Board of California, Harvest Technology Roundtable at UC Davis, March 15, 2017; Silicon Valley Forum at UC Davis, April 4, 2017. General public: Precision Ag conference series, organized by the World Food Center at UC Davis, April 27, 2017; CITRIS Agricultural Technology Fair, UC Merced, March 8, 2017. Researchers: The PI and co-PI presented project progress at the NRI meeting in Arlington, in November 2017. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?At UC Davis, one Ph.D. student, Zhenghao Fei, was mentored and learned to perform printed circuit board design, calibrate load cells, and write code for signal processing. He also wrote a journal manuscript and presented at the ASABE Intl. Meeting. Also, project scientist John Shepard gained experience in real-time closed-loop control of hydraulic cylinders using Arduino, and R&D engineer Dennis Sadowski gained experience in designing and building hydraulically actuated equipment. At CMU, project scientist Abhisesh Silwal received training on using the imaging sensor setup for logging image data from the test orchard. How have the results been disseminated to communities of interest?Ph.D. student Zhenghao Fei presented results to researchers and professionals with an oral presentation at the 2017 Intl. ASABE Meeting. Work was presented by the PI at the National Robotics Initiative (NRI) Principal Investigators meeting in November 2017. A short presentation was given to the entire NRI audience, as well as a 10-minute focus session with the USDA Program Manager and other agricultural robotics principal investigators in a side meeting. The project's aims, approaches and results were also presented by PI Vougioukas at various meetings: sweet cherries grower meeting on Jan. 5, 2017 at Cabral Agricultural Center, Stockton, CA; Washington Tree Fruit Commission, at UC Davis, March 1, 2017; Almond Board of California, Harvest Technology Roundtable at UC Davis, March 15, 2017; Silicon Valley Forum at UC Davis, April 4, 2017; Precision Ag conference series, organized by the World Food Center at UC Davis, April 27, 2017; CITRIS Agricultural Technology Fair, UC Merced, March 8, 2017. What do you plan to do during the next reporting period to accomplish the goals?
Nothing Reported
Impacts What was accomplished under these goals?
Research Goal 1: Modeling of fruit harvesting with a robotic orchard platform Objective 1.1: Create instances of virtual orchards. Twenty cling peach trees with fruits were manually digitized in summer 2017, whereas dormant cling peach trees (without leaves) were digitized using a high-resolution 3D lidar in late fall 2017. Objective 1.2: Develop human picking models. Picker apple harvesting rates as a function of time and position in the orchard were measured by recording and processing the weight of the fruit carried by the picker in the picking bag (see Objective 2.2). The statistics of this rate are used to model and simulate the picking process. Research Goal 2: Develop a sensing system for platform-based fruit picking. Objective 2.1: Estimate fruit yield density. Data were collected at the Lodi orchard on September 7, 2017. The CMU imaging sensor was used to capture high-resolution stereo images of one side of a row (60 m) in the orchard. The collected images were geo-referenced with GPS data. For assessing the accuracy of the imaging sensor, 15 plots of 1 meter length were selected along the test row. The start and end of each plot were marked with QR tags. The total fruit within each of these plots was manually counted and recorded. For detecting the apples in the images, the CMU team used the Angular Invariant Maximal key-point detector to locate potential fruit centers across the image. For each detected key-point, a 6 x 8 grid is constructed around it. For each grid cell, a feature vector is generated by combining SURF and radial HOG features. These grid feature vectors are then passed through a random forest classifier to determine whether the cell belongs to a part of an apple. If more than 50% of the grid cells associated with a keypoint have been classified positively, then the corresponding keypoint is classified as an apple. This algorithm is able to accurately detect the fruit in an image in 0.3 seconds per image and hence can run in real time.
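The grid-voting rule described above can be sketched as follows; a stub classifier stands in for the trained random forest, and random feature vectors stand in for the SURF/radial-HOG descriptors, so only the majority-vote logic mirrors the text:

```python
import numpy as np

def stub_forest_predict(grid_features):
    """Stand-in for the random forest's predict(): here a cell is called
    'apple' if its first feature is positive (purely illustrative)."""
    return (grid_features[:, 0] > 0).astype(int)

def is_apple(grid_features, predict=stub_forest_predict):
    """grid_features: (48, d) array, one feature vector per cell of the
    6 x 8 grid around a keypoint. The keypoint is kept as an apple if
    more than 50% of its cells are classified positively."""
    votes = predict(grid_features)
    return votes.mean() > 0.5

# Toy grids: one with mostly positive cells, one with mostly negative.
rng = np.random.default_rng(0)
apple_grid = rng.normal(loc=1.5, size=(48, 16))
leaf_grid = rng.normal(loc=-1.5, size=(48, 16))
```

The voting step makes the detector tolerant of individual misclassified cells (e.g., a leaf partially covering one corner of the grid), since a keypoint is only rejected when a majority of its cells disagree.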
Using this algorithm, CMU achieved an R2 value of 0.72 against the ground-truth data. The CMU team has also developed ROS nodes to publish: 1. The global fruit coordinates of the detected fruit (w.r.t. the Bandit platform) along the orchard row. 2. The visible fruit count for every 0.5 meter grid cell (length and height) along the row. CMU also started exploring the use of a Generative Adversarial Network for constructing an imaging unit without flash. For this purpose, an experimental stereo rig was used to image the same test plot. This sensor was a lower-resolution color camera from PointGrey (Chameleon 3) with a spatial resolution of 2048 x 1536 pixels. This experimental sensor was not equipped with the active lighting system of the CMU imaging sensor. To compensate for the variability caused by outdoor lighting, images with three different exposures were taken. It is expected that fusion of multiple exposures will reduce the effect of variable illumination without active lighting. The exposure-fused image will then be passed to a Generative Adversarial Network that will attenuate the background using depth information. The end result would be similar to images acquired using the current CMU sensor. This concept is currently being developed as a cost-effective alternative. Objective 2.2: Real-time estimation of each worker's fruit picking rate. A 2nd-generation instrumented picking bag was used to measure harvested fruit weight. Load cells were repositioned and all electronics were placed on a specially designed printed circuit board (PCB) inside an enclosure. Electronics included an Arduino, signal conditioning circuits, an XBee shield, a GPS, and an SD card. The microcontroller transmits all time-stamped data in real time via ROS. Data are filtered using a median and a low-pass filter to reduce noise. Static and dynamic calibration tests were performed anew in the lab over the weight range of the bag's capacity (20 kg) using baseballs and apples.
Results showed a mean error of 0.36 kg, or about 2.5 average-size apples. The bag was used in the Lodi orchard by a professional picker. Forty-five bags were filled, and the collected data provided real-time estimates of the picker's harvesting rate. The average picking pace was 1.65 seconds per fruit. Research Goal 3: Develop human-in-the-loop platform control system. Objective 3.1: Develop control algorithm for platform operation. The fruit harvesting process was modeled as a Markovian process, with appropriate state and action spaces and a stochastic transition function, which essentially simulates the picking process. A stochastic model predictive control framework is under development, and search techniques are being considered to compute optimal actions (picker elevations) for the sensed fruit yield distribution, platform moving speed, and worker picking speeds. Research Goal 4: System prototyping and field evaluation. Objective 4.1: Robotize harvest-aid platform. The platform was retrofitted so that the right rear workstation is actuated hydraulically and elevated via computer control. A CAD design and the fabrication of the lifting mechanism were completed and a hydraulic cylinder was installed. A displacement sensor was also installed, and closed-loop position control of the hydraulic cylinder was achieved using an Arduino. The right-front workstation will also be retrofitted for hydraulic actuation.
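The median-plus-low-pass filtering of the bag signal, described under Objective 2.2, can be sketched as follows; the window length, smoothing factor, sampling rate, and per-apple weight below are illustrative assumptions, not the project's calibrated values:

```python
import numpy as np

def median_filter(x, k=5):
    """Sliding-window median: rejects isolated spikes (e.g., bag motion)."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([np.median(xp[i:i + k]) for i in range(len(x))])

def low_pass(x, alpha=0.2):
    """First-order IIR low-pass (exponential smoothing) for sensor noise."""
    y = np.empty_like(x, dtype=float)
    y[0] = x[0]
    for i in range(1, len(x)):
        y[i] = alpha * x[i] + (1 - alpha) * y[i - 1]
    return y

# Simulated bag weight at 20 Hz: one 0.2 kg apple dropped in every 33
# samples (~1.65 s/fruit), plus load-cell noise and occasional spikes.
rng = np.random.default_rng(1)
t = np.arange(2000)
true_w = 0.2 * (t // 33)
raw = true_w + rng.normal(0, 0.05, t.size)
raw[::200] += 1.0                     # motion-induced spikes
clean = low_pass(median_filter(raw))
```

Applying the median filter first matters: the low-pass filter alone would smear each spike into the weight estimate, whereas the median removes it entirely before smoothing.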
Publications
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2017
Citation:
Fei, Z., Shepard, J., Vougioukas, S. (2017). Instrumented picking bag for measuring fruit weight during harvesting. ASABE Annual International Meeting. Paper Number 1701385. Spokane, Washington.
- Type:
Journal Articles
Status:
Submitted
Year Published:
2017
Citation:
Fei, Z., Shepard, J., Vougioukas, S. Instrumented picking bag for measuring fruit weight during harvesting. Transactions of the ASABE.
Progress 12/01/15 to 11/30/16
Outputs Target Audience:Students: The PI mentored two Ph.D. students in the context of this project. Informal laboratory instruction was the key instrument. Growers: Our team visited one grower in Lodi, CA, informed them about the project, and conducted field experiments. The PI also presented at a sweet cherry growers meeting on Jan. 5, 2017, at the Cabral Agricultural Center, Stockton, CA. Researchers: The PI and co-PI presented project progress at the NRI meeting in Arlington in November 2016. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?At UC Davis, Ph.D. candidate Zhenghao Fei learned to use the Robot Operating System (ROS), to integrate electronics, microcontrollers, and sensors to measure fruit weight in an instrumented bag, and to perform static and dynamic calibration of the sensors. Research engineer Dennis Sadowski came up to speed on orchard platform technologies as he began designing the platform's retrofit for robotic operation. Research engineer John Shepard improved his skills in Arduino-based coding, filtering, and sensor calibration. At CMU, research engineers Zania Pothen and Devdatta Narote received training during the first year of the program: they were taught how to use ROS, and how to use the C++ apple detection software framework and integrate it with a GPS geolocation data stream. How have the results been disseminated to communities of interest?Work was presented at the National Robotics Initiative (NRI) Principal Investigators meeting in November 2016. A short 2-minute presentation and a 1-hour poster presentation were given to the entire NRI audience, along with a 10-minute focus session for the USDA Program Manager and other agricultural robotics principal investigators in a side meeting. The project's aims, approaches, and preliminary results were also presented by PI Vougioukas at CA sweet-cherry grower meetings: one on Jan. 5 and another on Feb. 6, both in Stockton, CA. What do you plan to do during the next reporting period to accomplish the goals?
Nothing Reported
Impacts What was accomplished under these goals?
Research Goal 1: Modeling of fruit harvesting with a robotic orchard platform.
Objective 1.1: Create instances of virtual orchards. The fruit locations of high-density SNAP apple trees were digitized in a Lodi orchard in order to create a small part of a virtual orchard. The collected data were processed, filtered, and stored in '.csv' format.
Objective 1.2: Develop human picking models. Picker apple-harvesting rates as a function of time and position in the orchard were measured by recording and processing the weight of the fruit carried by the picker in the picking bag (see Objective 2.2).
Research Goal 2: Develop a sensing system for platform-based fruit picking.
Objective 2.1: Estimate fruit yield density. Data were collected at the Lodi test orchard on September 9, 2016. The CMU Imaging Sensor was used to image a test plot of the orchard; GPS data, pick-sensor data, and ground-truth apple coordinates were also captured. Detecting fruit in images taken in outdoor, unstructured environments such as apple orchards poses several challenges: a large number of apples are partially occluded by foliage and surrounding apples; the surface of the fruit is smooth and therefore lacks texture and defining features; and detection of the same apple in multiple images results in multiple counts. We overcame the first two limitations by using the Angular Invariant Maximal detector, which exploits the gradual variation of intensity and gradient orientation formed on the surface of the fruit. For evaluation purposes, leave-one-out cross-validation was performed on 20 randomly selected images; the visible apples in each of these images were manually labeled to serve as ground truth. The detection algorithm achieved an overall F1-score of 0.84 and an R2 of 0.69, with 100% precision and 73% recall. The moderate recall is not a concern, because multi-view imaging can compensate by capturing multiple images of any one apple.
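The reported detection scores are related in the standard way: with 100% precision and 73% recall, the F1-score (the harmonic mean of the two) works out to about 0.84. A short check, with illustrative true/false-positive counts chosen to match the reported rates:

```python
def detection_scores(tp, fp, fn):
    """Precision, recall, and F1-score from true positives, false
    positives, and false negatives of a detector evaluation."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts: 73 of 100 labeled apples detected, no false detections
p, r, f1 = detection_scores(tp=73, fp=0, fn=27)
print(p, r, round(f1, 2))  # 1.0 0.73 0.84
```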
Yield measurements were also performed from the images and assigned to world locations using the GPS data. Initial results showed a correlation between image-based and ground-truth yield with R2 = 0.60.
Objective 2.2: Real-time estimation of each worker's fruit picking rate. A commercial picking bag was instrumented to measure harvested fruit weight. Two load cells were placed inside an enclosure, which was mounted between the bag and its shoulder straps without hindering picking motions; the load cells measure the forces exerted on the straps by the bag and fruit. All electronics were placed inside the enclosure and included an Arduino, signal conditioning circuits, an XBee shield, a GPS, and an SD card. The microcontroller transmits data in real time and saves time-stamped data on the SD card. Data are filtered with a median filter and a low-pass filter to reduce noise. Dynamic calibration was performed in the lab over the bag's full capacity range (20 kg). Baseballs, which provide consistent weight and volume, were placed in the bag to produce a staircase ground-truth weight signal, and two people carried the bag while moving in a manner analogous to pickers. Results showed a mean error of 0.39 kg, a standard deviation of 0.42 kg, and a 95th percentile of 1.04 kg. Major error sources included bag acceleration and body reaction forces.
Research Goal 4: System prototyping and field evaluation.
Objective 4.1: Robotize the harvest-aid platform. The platform needs to be retrofitted so that the workstations are actuated and elevated under computer control. A CAD design of the lifting mechanism was completed, and hydraulic actuators were selected.
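The median-plus-low-pass filtering of the raw load-cell signal can be sketched as follows. The window size and smoothing factor here are illustrative assumptions, not the project's actual parameters; the low-pass stage is shown as simple first-order exponential smoothing:

```python
from statistics import median

def smooth(samples, window=5, alpha=0.2):
    """Median filter (rejects single-sample spikes from bag jolts)
    followed by a first-order low-pass filter (exponential smoothing)
    applied to raw load-cell weight readings."""
    half = window // 2
    med = [median(samples[max(0, i - half): i + half + 1])
           for i in range(len(samples))]
    out = [med[0]]
    for x in med[1:]:
        out.append(out[-1] + alpha * (x - out[-1]))
    return out

# Illustrative signal: a steady 10 kg load with one acceleration spike
raw = [10.0, 10.0, 25.0, 10.0, 10.0, 10.0]
print(smooth(raw))  # spike removed: [10.0, 10.0, 10.0, 10.0, 10.0, 10.0]
```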
Publications
- Type:
Conference Papers and Presentations
Status:
Accepted
Year Published:
2017
Citation:
Fei, Z., Shepard, J., Vougioukas, S. (2017). Instrumented picking bag for measuring fruit weight during harvesting. ASABE Annual International Meeting. Spokane, Washington.