Source: UNIVERSITY OF FLORIDA submitted to NRP
AUTOMATION AND MECHATRONIC DEVELOPMENT OF HORTICULTURAL PRODUCTION SYSTEMS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
0205000
Grant No.
(N/A)
Cumulative Award Amt.
(N/A)
Proposal No.
(N/A)
Multistate No.
(N/A)
Project Start Date
Oct 1, 2005
Project End Date
Sep 30, 2010
Grant Year
(N/A)
Program Code
[(N/A)]- (N/A)
Recipient Organization
UNIVERSITY OF FLORIDA
G022 MCCARTY HALL
GAINESVILLE,FL 32611
Performing Department
AGRICULTURAL & BIOLOGICAL ENGINEERING
Non Technical Summary
There is a growing interest among researchers, industries, and growers to pursue automation solutions to reduce the increasing disparity between U.S. production labor costs and those of developing countries. However, it is clear that novel approaches need to be taken to solve the technological problems, as well as the manufacturing and maintenance challenges which will surface as high-tech equipment systems are implemented in harsh agricultural environments. Past automation efforts have demonstrated that research efforts that jointly design the machine and plant systems have the greatest opportunity for success. Consequently, it is important to closely coordinate research in both areas. This program will combine research from two major Florida horticultural production applications (greenhouse spraying and citrus harvesting), which will provide a basis for building upon the fundamental technologies necessary to implement robotic solutions in horticultural production. Specific research in sensing technologies, manipulator configuration, visual servo control, end effector development, and autonomous guidance systems will be pursued to advance these technologies as they apply to the specific applications listed, and eventually to extend them to other horticultural production systems. In addition, research in optimal grove and tree factor design will be integrated with machine systems development to improve the plant-machine system viability with regard to optimal production efficiency.
Animal Health Component
40%
Research Effort Categories
Basic
30%
Applied
40%
Developmental
30%
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
205                   0999                             1060                     10%
205                   0999                             2020                     10%
216                   2199                             1060                     10%
216                   2199                             2020                     10%
402                   0999                             2020                     10%
402                   2199                             2020                     10%
402                   5310                             2020                     20%
601                   0999                             3010                     10%
601                   2199                             3010                     10%
Goals / Objectives
1. Evaluate VIS/NIR/FIR and other sensor technologies useful for selective fruit harvest, with appropriate sensor fusion, to improve fruit detection and enable tree fruit grading by maturity and size.
2. Implement and improve visual servo control strategies to target and track fruit during harvest, and develop path planning strategies that optimize harvesting time.
3. Develop novel end-effectors and manipulator arm configurations that optimally harvest tree fruit.
4. Improve tree characteristics, orchard design, and cultural practices to enhance the harvestability of citrus.
5. Develop robust vehicle guidance technologies for operation in orchards and greenhouses where traditional GPS-based techniques cannot maintain navigation information from satellites.
Project Methods
Past robotic harvesting attempts have not achieved economic viability for three primary reasons: 1) harvesting efficiency, 2) cost of equipment, and 3) grower acceptance. This project will seek to address fundamental technology barriers to effective implementation of automation in two horticultural applications in Florida, namely, greenhouse spraying and citrus harvesting. The principal technologies that will be explored are fruit detection, manipulator and end effector design, vehicle guidance, and robot control. Numerous sensors are commonly employed in robotic production systems where target identification must be realized, e.g., CCD color vision, ultrasonic range, laser range, capacitive proximity, LED range, and so on. Common problems such as light variability, target occlusion, sensor resolution, and noise make it unlikely that a single sensor will be adequate for the task. Efforts in sensor fusion suggest that the use of multiple sensors may hold significant promise. In order to enhance the performance of the end-effector over past development efforts, a better understanding of fruit removal mechanics and physical properties is required. Several studies are proposed which will look at issues such as the relationship between rind penetration forces and the applied surface area on common Florida citrus cultivars; positive and negative pressure thresholds as associated with fruit bruising and shelf life; and applied force thresholds for plugging during harvesting. Several manipulator architectures have been attempted for fruit harvesting. In order to avoid obstacles and harvest interior canopy fruit, the optimal configuration for a robotic harvester may require more degrees of freedom than previously employed. Modeling techniques will be used to develop and evaluate manipulator configurations. Guidance technologies are fundamental to any autonomous robotic production system, whether applied to greenhouse or field agriculture applications.
Our primary emphasis will be on machine vision based vehicle guidance, with supplemental devices such as laser radar, ultrasonic, gyroscopes, and inclinometers. Grove design for optimal economic gain is being considered for citrus production. In this concept, a grove will be designed for the optimal combination of varieties, rootstocks, grove layout, production practices, and harvesting methodologies, which will provide maximum economic yield. Three statistically and horticulturally designed grove sites are being considered, which would represent the major citrus producing areas in the state of Florida. Each of the three sites will have approximately ten acres of oranges, as well as ten acres of grapefruit at one site. The model grove will attempt to determine the optimal tree spacing and density for fruit yield, mechanical harvestability, hedging practices and economic return on investment. The Model Orchard will have a small block of special tree scion and rootstock combinations which will attempt to find a more suitable tree design for robotic harvesting. The designs will attempt to provide uniform tree characteristics, better access to the fruit and minimize tree hedging and pruning requirements.

Progress 10/01/05 to 09/30/10

Outputs
OUTPUTS: Our work in citrus disease detection has sought to evaluate the performance of existing packing line equipment schemes when using our new technology for grading citrus canker. An inspection module was developed on a one-line commercial fruit sorting machine. The camera unit was a two-band spectral imaging system. It mainly consisted of a beamsplitter with an equal transmission-reflection ratio, two narrow bandpass filters with central wavelengths at 730 nm and 830 nm, respectively, and two identical Gigabit Ethernet monochrome cameras. Using an exposure time of 10 ms, the imaging system can capture narrow-band images without blurring from samples moving at a speed of 5 fruits/s. Spatial resolution of the acquired images was 2.3 pixels/mm. Real-time image processing and classification algorithms were developed based on a two-band ratio approach. The system was tested on 360 grapefruits with normal surfaces, canker lesions, and other peel diseases and defects. The overall classification accuracy was 95.3%, demonstrating that the methodology, as well as the hardware and software developed in this study, is effective and suitable for real-time citrus canker detection. Greasy spot, melanose, and sooty mold could generate false positive errors for fruits without canker. The current system setup cannot inspect the two lateral sides of the fruits. Future work will be conducted with an emphasis on whole-surface inspection of each fruit. Our work in fruit detection for robotic harvesting has sought to improve detection rates, which would result in higher fruit harvesting efficiencies since fruit detection and harvesting efficiency are directly correlated. Fruit visibility, the ratio of the number of fruits visible to a human observer to the total number of fruits inside a region of interest (ROI), was analyzed for a robotic citrus fruit harvesting application using RGB visible spectrum imaging technology.
The ROI was a volume in the tree enclosed by a 0.125 m3 bounding cube. Multiple images of the region of interest were acquired using two viewing methods: orthographic viewing and multiple-perspective viewing. From this research, we can conclude: 1) Combining the number of visible fruit in the images of the ROI acquired from both orthographic and multiple-perspective viewing methods improved fruit visibility over that from the front view only. 2) The multiple-perspective viewing method performed significantly better than orthographic viewing in improving fruit visibility. 3) When combined with fruit visibility, more fruits were detected with the multiple-perspective viewing method (0.87) than with the orthographic viewing method (0.74). Our work in autonomous guidance has been focused on citrus grove applications. We developed two different systems. The first adapted a hydraulically controlled steering unit to a John Deere 6410 tractor, while the second was applied to a John Deere E-Gator. In both cases, machine vision and ladar were the primary sensor devices used to steer the vehicle. A fuzzy logic-based sensor fusion approach was adopted to achieve optimal control and performance of the steering system. PARTICIPANTS: Not relevant to this project. TARGET AUDIENCES: Nothing significant to report during this reporting period. PROJECT MODIFICATIONS: Not relevant to this project.
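The two-band ratio grading described above could be sketched roughly as follows. This is a minimal illustration only, assuming co-registered 730 nm and 830 nm images; the ratio direction, threshold, and pixel-count cutoff are invented for the example and are not the calibrated parameters of the actual system.

```python
import numpy as np

def classify_canker(img_730, img_830, ratio_threshold=1.2, min_pixels=50):
    """Illustrative sketch of a two-band ratio canker classifier.

    img_730, img_830: co-registered monochrome images (float arrays) from
    the 730 nm and 830 nm narrow-band cameras. ratio_threshold and
    min_pixels are assumed values for illustration, not the study's
    calibrated parameters.
    """
    # Per-pixel band ratio; guard against division by zero in dark pixels.
    ratio = img_830 / np.maximum(img_730, 1e-6)
    # Pixels whose band ratio exceeds the threshold are flagged as lesion-like.
    lesion_mask = ratio > ratio_threshold
    # Grade the fruit by the number of lesion-like pixels detected.
    return "canker" if lesion_mask.sum() >= min_pixels else "normal"
```

In the real system this per-fruit decision would run inside the 10 ms exposure / 5 fruits-per-second timing budget described above.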

Impacts
The experimental results for our canker detection studies to date have given us confidence in our ability to discriminate between canker and other confounding disease conditions under static conditions. Under several different scenarios we have been able to discriminate at accuracies around 96%. We have also demonstrated that canker lesion reflectance spectra do not change during the season, and that we can detect lesions down to 2 mm in diameter under static conditions. Finally, we have demonstrated the potential for implementing these approaches in a multi-spectral on-line system operating at line speeds in excess of 5 fruit per second. We do not expect any problems with operating these approaches at 10 fruit per second. We have therefore developed a technology which could be implemented on-line under packinghouse conditions. The development work on enhanced fruit detection for robotic harvesting has documented a new approach for scanning a robotic harvesting region of interest which has the potential to significantly improve harvesting efficiency over approaches proposed in earlier work. This approach can be combined with a normal ROI harvesting strategy to effectively scan the ROI and then map fruit within the ROI that may not be detectable from a single viewing perspective. The potential for improving fruit detection to above 90%, compared to earlier efforts in the 75% range, represents a significant improvement. Our work in autonomous guidance focused on citrus grove applications, where we developed a machine vision and ladar based guidance system to steer the E-Gator vehicle. A fuzzy logic-based sensor fusion approach was adopted to achieve optimal control and performance of the steering system. The outcomes of this work demonstrated the utility of non-DGPS-based steering in citrus groves.
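The fuzzy logic-based fusion of machine vision and ladar mentioned above might be reduced, in its simplest form, to a confidence-weighted combination of the two sensors' error estimates. The sketch below is a loose stand-in under stated assumptions: each sensor supplies a lateral path-error estimate and a membership-style confidence grade in [0, 1]; the weighting rule is illustrative and is not the project's actual fuzzy rule base.

```python
def fuse_steering_error(vision_err, ladar_err, vision_conf, ladar_conf):
    """Illustrative confidence-weighted fusion of two lateral-error estimates.

    vision_err, ladar_err: lateral path errors (m) from machine vision and
    ladar. vision_conf, ladar_conf: confidence grades in [0, 1], e.g. from
    segmentation quality or ladar return density (assumed inputs).
    """
    total = vision_conf + ladar_conf
    if total == 0:
        return 0.0  # no reliable sensor: hold the current course
    # Weighted (defuzzified) error: each sensor contributes in proportion
    # to its confidence grade.
    return (vision_conf * vision_err + ladar_conf * ladar_err) / total
```

A full fuzzy controller would replace the single weighting line with a rule base and defuzzification step, but the proportional-contribution idea is the same.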

Publications

  • Subramanian, V., T.F. Burks, and W.E. Dixon. 2009. Sensor Fusion Using Fuzzy Logic Enhanced Kalman Filter for Autonomous Vehicle Guidance in Citrus Groves. Transactions of the ASABE 52(5): 1-12.


Progress 10/01/08 to 09/30/09

Outputs
OUTPUTS: Our efforts during this past year have been focused on two primary areas of research and development. The outcomes of these two efforts have resulted in publications, presentations at scientific meetings, and presentations at statewide industry events. Our work in citrus disease detection has sought to evaluate the performance of existing packing line equipment schemes when using our new technology for grading citrus canker. An inspection module was developed on a one-line commercial fruit sorting machine. The camera unit was a two-band spectral imaging system. It mainly consisted of a beamsplitter with an equal transmission-reflection ratio, two narrow bandpass filters with central wavelengths at 730 nm and 830 nm, respectively, and two identical Gigabit Ethernet monochrome cameras. Using an exposure time of 10 ms, the imaging system can capture narrow-band images without blurring from samples moving at a speed of 5 fruits/s. Spatial resolution of the acquired images was 2.3 pixels/mm. Real-time image processing and classification algorithms were developed based on a two-band ratio approach. The system was tested on 360 grapefruits with normal surfaces, canker lesions, and other peel diseases and defects. The overall classification accuracy was 95.3%, demonstrating that the methodology, as well as the hardware and software developed in this study, is effective and suitable for real-time citrus canker detection. Greasy spot, melanose, and sooty mold could generate false positive errors for fruits without canker. The current system setup cannot inspect the two lateral sides of the fruits. Future work will be conducted with an emphasis on whole-surface inspection of each fruit. Our work in fruit detection for robotic harvesting has sought to improve detection rates, which would result in higher fruit harvesting efficiencies since fruit detection and harvesting efficiency are directly correlated.
Fruit visibility, the ratio of the number of fruits visible to a human observer to the total number of fruits inside a region of interest (ROI), was analyzed for a robotic citrus fruit harvesting application using RGB visible spectrum imaging technology. The ROI was a volume in the tree enclosed by a 0.125 m3 bounding cube. Multiple images of the region of interest were acquired using two viewing methods: orthographic viewing and multiple-perspective viewing. From this research, we can conclude: 1) Combining the number of visible fruit in the images of the ROI acquired from both orthographic and multiple-perspective viewing methods improved fruit visibility over that from the front view only. 2) The multiple-perspective viewing method performed significantly better than orthographic viewing in improving fruit visibility. 3) When the fruit detection algorithm was applied to the images, the fruit detection efficiency was 0.91 and 0.96 for the orthographic and multiple-perspective viewing methods, respectively. When combined with fruit visibility, more fruits were detected with the multiple-perspective viewing method (0.87) than with the orthographic viewing method (0.74). PARTICIPANTS: Nothing significant to report during this reporting period. TARGET AUDIENCES: Nothing significant to report during this reporting period. PROJECT MODIFICATIONS: Nothing significant to report during this reporting period.

Impacts
The experimental results for our canker detection studies to date have given us confidence in our ability to discriminate between canker and other confounding disease conditions under static conditions. Under several different scenarios we have been able to discriminate at accuracies around 96%. We have also demonstrated that canker lesion reflectance spectra do not change during the season, and that we can detect lesions down to 2 mm in diameter under static conditions. Finally, we have demonstrated the potential for implementing these approaches in a multi-spectral on-line system operating at line speeds in excess of 5 fruit per second. We do not expect any problems with operating these approaches at 10 fruit per second. We have therefore developed a technology which could be implemented on-line under packinghouse conditions. The development work on enhanced fruit detection for robotic harvesting has documented a new approach for scanning a robotic harvesting region of interest which has the potential to significantly improve harvesting efficiency over approaches proposed in earlier work. This approach can be combined with a normal ROI harvesting strategy to effectively scan the ROI and then map fruit within the ROI that may not be detectable from a single viewing perspective. The potential for improving fruit detection to above 90%, compared to earlier efforts in the 75% range, represents a significant improvement.

Publications

  • Bulanon, D.M., T.F. Burks, and V. Alchanatis. 2009. Image Fusion of Visible and Thermal Images for Fruit Detection. Biosystems Engineering 103(1): 12-22.
  • Kim, D.G., T.F. Burks, J. Qin, and D.M. Bulanon. 2009. Classification of Grapefruit Peel Diseases using Color Texture Feature Analysis. International Journal of Agricultural & Biological Engineering 2(3): 41-50.
  • Zhao, X., T.F. Burks, and J. Qin. 2009. Digital Microscopic Imaging for Citrus Peel Disease Classification using Color Texture Features. Applied Engineering in Agriculture 25(5): 1-8.
  • Kim, D.G., T.F. Burks, A.W. Schumann, M. Zekri, and Z. Zhao. 2009. Detection of Citrus Greening using Microscopic Imaging. Agricultural Engineering International: the CIGR Ejournal, Volume XI. http://www.cigrjournal.org/index.php/Ejounral/article/view/1194
  • Bulanon, D.M., T.F. Burks, and V. Alchanatis. 2009. Fruit Visibility Analysis for Citrus Harvesting. Transactions of the ASABE 52(1): 277-283.
  • Qin, J., T.F. Burks, M.A. Ritenour, and G.W. Bonn. 2009. Detection of Citrus Canker using Hyperspectral Reflectance Imaging with Spectral Information Divergence. Journal of Food Engineering 93: 183-191.


Progress 10/01/07 to 09/30/08

Outputs
OUTPUTS: During this reporting cycle we have focused primarily on citrus disease detection studies for citrus canker detection in the packinghouse, as well as citrus greening (HLB) in the grove. A digital color microscope system (Keyence VHX-600K) was used for acquiring magnified images of citrus fruit samples, including citrus canker. Images of citrus samples with six peel conditions were taken at six magnifications from 5X to 50X. For each fruit sample, 39 texture features were computed using the color co-occurrence method, and redundant texture features were eliminated using SAS PROC STEPDISC. The results suggested that a reduced hue, saturation, and intensity texture feature model, coupled with microscope imaging at magnifications of 5X to 20X, was a good tool to differentiate citrus peel conditions. The microscopic imaging system was used to acquire images from citrus leaves with eight conditions: young flush, normal mature, greening blotchy mottle, greening island, manganese deficiency, iron deficiency, zinc deficiency, and yellowed dead. Color texture features were generated as described above for the various leaf conditions. A classification model using 16 texture features achieved 90% accuracy when discriminating all leaf conditions, while another model, excluding blotchy mottle and young flush, gave the best accuracy (93.3%). The results suggested that the normal young flush and blotchy mottle conditions have very similar leaf textures, which caused misclassification. This is likely due to subtle differences in image characteristics, which are difficult to contrast in the visible spectrum. A hyperspectral imaging system was developed for acquiring reflectance images from citrus samples. It is a push-broom, line-scan based imaging system that utilizes an electron-multiplying charge-coupled-device (EMCCD) imaging device with an imaging spectrograph and a C-mount lens. Ruby Red grapefruits with the peel conditions described above were tested.
Hyperspectral images were analyzed using two classification methods based on principal component analysis (PCA) and spectral information divergence (SID), respectively. The hyperspectral imaging system was also used for collecting images from citrus leaf samples with citrus greening. PCA was performed to compress the 3-D hyperspectral image data and extract useful image features that could be used to discriminate cankerous samples from normal and other diseased samples. The overall accuracy for canker detection was 92.7%. The SID classification method, which quantifies spectral similarity against a predetermined canker reference spectrum, was applied to the hyperspectral images of the grapefruits to differentiate canker from normal fruit peels and other citrus diseases. The SID-based classifier could differentiate canker from normal fruit peels and other citrus diseases, and it also could avoid the negative effects of stem-ends and calyxes. The optimal SID threshold value was determined to be 0.008, which yielded an overall classification accuracy of 96.2%. PARTICIPANTS: Nothing significant to report during this reporting period. TARGET AUDIENCES: Nothing significant to report during this reporting period. PROJECT MODIFICATIONS: Nothing significant to report during this reporting period.
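The SID measure used for the threshold classification above can be sketched as follows. The 0.008 threshold is the optimal value reported in the text; the function names and the small eps clamp for zero reflectance values are illustrative choices, not the study's implementation.

```python
import numpy as np

def spectral_information_divergence(s1, s2, eps=1e-12):
    """Spectral information divergence between two reflectance spectra.

    Each spectrum is normalized to a probability distribution, and SID is
    the sum of the two relative entropies (a symmetric divergence).
    """
    p = np.asarray(s1, float)
    q = np.asarray(s2, float)
    p = np.maximum(p / p.sum(), eps)  # eps clamp avoids log(0)
    q = np.maximum(q / q.sum(), eps)
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def is_canker(pixel_spectrum, canker_reference, threshold=0.008):
    """Flag a pixel as cankerous when its spectrum is close (low SID) to the
    predetermined canker reference spectrum. 0.008 is the optimal threshold
    reported in the text."""
    return spectral_information_divergence(pixel_spectrum, canker_reference) < threshold
```

Identical spectra give SID of zero, so a pixel matching the reference exactly is flagged; dissimilar spectra give large SID values and are rejected.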

Impacts
The results of this project are very encouraging. Preliminary results have demonstrated the potential for discriminating between various citrus diseases using optically based systems in the laboratory. It is now necessary to expand our work to real-world scenarios. The first major effort will be directed toward the development of real-time detection of citrus canker in the packing line. A real-time machine vision system is being developed and integrated with a small-scale commercial fruit-sorting machine to construct an on-line inspection prototype. Computer programs for image acquisition and processing will be incorporated into the system control software of the sorting machine to fulfill the goal of real-time citrus canker detection. The measurement system could be integrated with existing packing lines for detecting citrus canker in real time during common packing operations. The detection system may also be adjusted or modified for inspecting other citrus diseases. Preliminary results in greening detection are encouraging, demonstrating the potential to differentiate citrus greening from nutrient deficiencies that are commonly confused with greening. However, the extension to the field will be challenging. It is anticipated that a new study will be conducted during the upcoming year to evaluate the potential for a hyperspectral sensing system to be used for greening detection.

Publications

  • Bulanon, D.M., Burks, T.F., and Alchanatis, V. 2008. Study on temporal variation in citrus canopy using thermal imaging for citrus fruit detection. Biosystems Engineering 101 (2008) 161-171.
  • Balasundaram, D., Burks, T.F., Bulanon, D.M., Schubert, T., and W.S. Lee. 2008. Spectral reflectance characteristics of citrus canker and other peel conditions of grapefruit. Postharvest Biology and Technology 51 (2009) 220-226.
  • Qin, J., Burks, T. F., Kim, M. S., Chao, K., and Ritenour, M. A. 2008. Citrus canker detection using hyperspectral reflectance imaging and PCA-based image classification method. Sensing and Instrumentation for Food Quality and Safety 2(3): 168-177.


Progress 10/01/06 to 09/30/07

Outputs
The proposed project aims to enhance tree fruit identification and targeting for robotic harvesting through the selection of appropriate sensor technology, sensor fusion, and visual servo-control approaches. The primary challenges are fruit occlusion, light variability, peel color variation with maturity, range to target, and the computational requirements of image processing algorithms. Results from past research showed robotic harvesting efficiencies of less than 75%, which would make robotic harvesting economically infeasible. To increase efficiency, the performance of the three factors cited above should be improved. In this study, the focus is on improved fruit detection and fruit visibility, which is the first task of the robot. The objectives of this study were: a) to develop an image processing algorithm that can deal with varying lighting conditions and fruit occlusion; b) to improve fruit visibility using multiple viewpoints; c) to investigate the potential of thermal imaging for fruit detection; and d) to investigate image fusion to enhance fruit detection. The Valencia citrus variety was used in the image processing development and fruit visibility study. Images were acquired from the middle of May to the end of July 2006 in a commercial orchard in the state of Florida. An off-the-shelf digital camera was used to capture images under varying lighting conditions. In addition, the region of interest (ROI) was defined by a bounding box (0.5 m) that enclosed a certain volume of the citrus canopy. The orthogonal views of the ROI and nine different viewpoints of the front view of the ROI were captured. A total of 25 ROIs from five trees were investigated. The fruits in the images were identified visually and automatically by the fruit detection algorithm. Fruit visibility was evaluated by comparing the number of fruits recognized in the image to the total number of fruits in the ROI, which was the ground truth.
The image processing algorithm for fruit detection consisted of segmentation, labeling, size filtering, perimeter extraction, and circle detection. Segmentation, which separated the fruit from the background, was implemented by thresholding on the red chromaticity coefficient. Results showed that this method was robust under different lighting conditions. In addition, the combination of perimeter extraction and circle detection enabled the detection of fruits partially occluded by leaves or other fruit. The fruit detection algorithm showed a detection rate of more than 90% in the 110 images tested. The study on fruit visibility showed that front viewing of the ROI yielded an average of 50% fruit visibility. This was increased to approximately 80% by taking the other five orthogonal views of the ROI. Fruit visibility was further improved to 90% by capturing multiple views of the front view from different viewing angles.
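The red chromaticity segmentation step might look like the following sketch. The 0.4 threshold is an assumed value for illustration, not the study's calibrated one; the point is that r = R / (R + G + B) normalizes out overall brightness, which is why the text reports robustness under varying lighting.

```python
import numpy as np

def segment_fruit(rgb, threshold=0.4):
    """Illustrative fruit segmentation by red chromaticity thresholding.

    rgb: H x W x 3 float image. The red chromaticity coefficient
    r = R / (R + G + B) is largely insensitive to uniform illumination
    changes. threshold=0.4 is an assumed value for the example.
    """
    rgb = np.asarray(rgb, float)
    total = rgb.sum(axis=2)
    # Guard against division by zero in fully dark pixels.
    r = rgb[..., 0] / np.maximum(total, 1e-6)
    return r > threshold  # boolean fruit mask
```

Scaling the whole image by a constant leaves the mask unchanged, mirroring the lighting robustness claimed above; the mask would then feed the labeling, size filtering, and circle-detection stages.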

Impacts
A study on enhancing fruit detection for robotic citrus harvesting was conducted. Enhancements to fruit detection focused on an image processing algorithm that is robust under variable lighting conditions and capable of detecting partially occluded fruits, and on employing thermal imaging as an alternative for fruit sensing. A fruit detection rate of 90% was achieved by the developed fruit detection algorithm coupled with the multiple-viewing scene acquisition technique. Achieving 90% detection significantly improves the potential for robotic harvesting to become economically viable. Future tests will be conducted to verify that harvesting can be accomplished at similarly high efficiencies.

Publications

  • Hannan M; Burks T F; Bulanon D M (2007). A Real-time Machine Vision Algorithm for Robotic Citrus Harvesting. ASABE paper no. 073125
  • Bulanon, D M; Burks T F; Alchanatis V (2007). Study on Fruit Visibility for Robotic Harvesting. ASABE paper no. 073124
  • Bulanon, D M; Burks T F; Alchanatis V (2008). Enhancement of Robotic Citrus Harvesting: Fruit recognition and visibility. (To be presented in Application of Precision Agriculture on Fruits and Vegetables, January 2008, Orlando, FL)
  • Hannan M; Burks T F; Bulanon D M (2007). A Machine Vision Algorithm for Robotic Citrus Harvesting Part I Algorithm Development. Transactions of the ASABE (under review)
  • Bulanon, D M; Burks T F; Alchanatis V (2007). A Machine Vision Algorithm for Robotic Citrus Harvesting Part II Fruit Visibility Analysis. Transactions of the ASABE (under review)


Progress 10/01/05 to 09/30/06

Outputs
During the past year, research has focused on the development of critical functional components of a robotic harvesting system. A prototype end effector and manipulator concept has been developed. The end effector was designed, fabricated, and tested in a grove setting. The performance of the new end effector was promising, although several design improvements will be necessary prior to advancing toward commercialization. The gripping power and actuation performed very well, while the cycle rate leaves room for improvement. The end effector is still over-sized for the task and will need to be more compact in future upgrades. In addition, a new highly dexterous prototype harvesting arm has been designed, and a sub-assembly of the system has been fabricated and tested. The arm appears to have adequate power and compactness for a phase 1 prototype. However, future enhancements will need to improve the routing of conduit and provide more robust fluid power controls. The prototype was tested and appears mechanically sound and capable of meeting workspace accuracies. Future work will try to improve the design of the manipulator and address the shortcomings mentioned. One area in significant need of improvement is fruit detection in the canopy. A series of studies was conducted to improve fruit detection rates. Ten unique regions of interest were selected from a group of orange trees having a common rootstock and scion. The objective was to determine whether additional viewing angles would increase detection rates. Two methods were employed, based on the position of the camera relative to the ROI. In the first method, the six orthogonal views of the control volume (top, front, right side, left side, rear, bottom) were acquired. In the second method, nine different view angles of the front view were taken, similar to looking forward, downward, sideward, and upward relative to the front view of the ROI.
The visibility of the fruits from the different views was analyzed to see whether all the fruits recognized from the different views account for the total number of fruits inside the ROI. Preliminary results from visual inspection of the images for the first acquisition method showed that the front view had the highest fruit visibility rate, at more than 50%, while the top view showed less than 10% visibility. By combining the front, left, right, and back views, we obtained the highest fruit visibility, at about 80%. Although the camera position in the second method was limited to the front side of the cube, it had a higher visibility rate. Combining the forward-looking and upward-looking views gave a visibility rate of more than 80%, and combining all the different views increased the visibility rate to about 90%. The final research area was autonomous steering in the citrus grove. New sensor fusion approaches were employed to combine machine vision and laser radar for steering a tractor appropriately equipped with sensors and steering control. The results of this sensor fusion approach demonstrated significant improvement over earlier experiments that used single-sensor guidance.
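The view-combination arithmetic above amounts to a set union: a fruit counts as visible if it is recognized in at least one of the combined views, so adding views can only raise the visibility rate. The sketch below assumes each fruit carries a ground-truth ID for bookkeeping, which is an illustrative device (the study matched fruits manually against the ground-truth count).

```python
def fruit_visibility(views, total_fruits):
    """Illustrative multi-view fruit visibility computation.

    views: list of sets, each holding the ground-truth IDs of fruits
    recognized in one view of the ROI (IDs are an assumed bookkeeping
    device for this sketch). Returns the fraction of all fruits in the
    ROI seen in at least one view.
    """
    # Union across views: a fruit is visible if any view recognized it.
    seen = set().union(*views) if views else set()
    return len(seen) / total_fruits
```

For example, a front view seeing 5 of 10 fruits gives 0.5 visibility, and adding a side view that sees 4 fruits (one overlapping) raises it to 0.8, matching the monotone improvements reported above.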

Impacts
The research conducted this year has provided several important accomplishments. First, a new citrus fruit harvesting end-effector has been developed that appears to meet the requirements for harvesting Florida fresh citrus. Several improvements are still needed, but this mechanism should open the door for future enhancements. In addition, a highly dexterous, hydraulically actuated manipulator custom designed for citrus harvesting has been developed. It offers the potential for an affordable manipulator with the dexterity necessary to reach inside the tree canopy. Past robotic harvesting attempts have been limited by the detection rates of occluded fruit; approaches are being developed which will hopefully improve those detection rates. Finally, a robust sensor fusion approach to autonomous navigation in the citrus grove has been developed which has the potential for numerous applications in citrus production. Once fully implemented with a path planning algorithm and a headland turning approach, there will be the opportunity for fully autonomous navigation in the citrus grove.

Publications

  • Flood, S.J., T.F. Burks and J.K. Schueller. 2006. Studies in the Optimization of Harvesting Motion Mechanics using a 7-DOF Manipulator with a 6-Axis Force/Torque Sensor. ASABE Annual International Meeting. Portland, OR. July 9-13, 2006. Paper No. 063083.
  • Sivaraman, B., T. F. Burks, and J. K. Schueller. 2006. Using modern robot synthesis and analysis tools for the design of agricultural manipulators. Invited Overview. CIGR. Vol. VIII
  • Sivaraman, B., and T.F. Burks. 2006. Geometric Performance Indices for Analysis and Synthesis of Manipulators for Robotic Harvesting. Transactions of ASAE 49(5): 1589-1597.
  • Subramanian, V., T.F. Burks, and A.A. Arroyo. 2006. Machine Vision and Laser Radar-based Vehicle Guidance Systems for Citrus Grove Navigation. Computers and Electronics in Agriculture 53: 130-143.
  • Flood, S.J., T.F. Burks, and A.A. Teixeira. 2006. Physical Properties of Oranges in Response to Applied Gripping Forces for Robotic Harvesting. Transactions of ASAE. 49(2): 341-346.


Progress 10/01/04 to 09/30/05

Outputs
The development of the robotic greenhouse sprayer was initiated in 2002 through funding provided by the National Foliage Foundation. The goal of the project is to develop a robotic system that can autonomously navigate through a greenhouse, applying chemicals at uniform rates optimized for chemical efficacy while minimizing over-spray. Phase I focused on developing a prototype capable of navigating greenhouse alleyways. A prototype vehicle was developed using a six-wheel differential-drive steering system and ultrasonic navigation sensors. The vehicle has been tested in several modes of operation (vehicle only, self-contained sprayer, and towed spray trailer) on various surface conditions, including finished concrete, rough concrete, compacted sand, and loose sand. Test results demonstrated that the RoboSprayer responds quickly to position errors and achieves and maintains an acceptably centered path through the simulated test track. Additional tests used machine vision and laser radar to enhance the vehicle's steering control on curved pathways and corners, and new studies are planned to develop a fully integrated vehicle guidance system. Through the funding and support of the Florida Department of Citrus, a research program was begun at the University of Florida in the summer of 2002 that seeks to address the fundamental technology barriers that have prevented past citrus robotics efforts from succeeding. Successful fruit harvesting has been achieved using a seven-degree-of-freedom research manipulator prototype. The following topics have been explored: 1) We are currently exploring sensor technology that will improve our ability to locate fruit in the tree canopy, overcoming some of the difficulties associated with traditional machine vision approaches attempted in the past. 2) The harvesting manipulator has undergone preliminary field testing, in which the current sensors and manipulator controls operated under actual harvesting conditions. 3) The current prototype harvesting arm has been adequate for developing visual servoing capabilities, but it lacks the range of motion, speed, and cost efficiency necessary for a marketable system; we have been working on concepts that should lead to significant advances in harvesting arm technology, and we will continue modeling manipulator configurations, design a new prototype, and fabricate a harvesting manipulator and its control system. 4) Citrus material handling properties and harvesting action tests will be conducted during the coming calendar year; these tests will provide valuable information for developing an efficient end-effector for harvesting citrus. 5) The sensors, controls, and software for basic vehicle steering in the citrus grove have been advanced to the point of field testing. Once fundamental steering is accomplished, we will pursue fully autonomous navigation, including motion control, cornering, obstacle avoidance, and path planning.
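The alleyway-centering behavior described for the ultrasonic-guided differential-drive vehicle can be sketched as a simple proportional controller: the vehicle turns toward the larger of the left and right ranges until it is centered. The gains, speeds, and signal names below are illustrative assumptions, not the RoboSprayer's actual control law.

```python
# Sketch: centering a differential-drive vehicle in a greenhouse alley
# from left/right ultrasonic range readings. Parameter values are
# hypothetical and chosen only to illustrate the idea.

def wheel_speeds(left_range_m, right_range_m, base_speed=0.5, kp=0.8):
    """Proportional correction of the lateral offset from the alley center."""
    error = right_range_m - left_range_m   # > 0: vehicle is left of center
    correction = kp * error
    left = base_speed + correction    # speed up the left side to steer right,
    right = base_speed - correction   # back toward the alley centerline
    return left, right

# Vehicle 0.2 m left of center in a 2 m wide alley:
print(wheel_speeds(0.8, 1.2))   # left wheels faster -> steer right
```

A real controller would also clamp the wheel speeds, filter the noisy ultrasonic readings, and likely add derivative action to damp oscillation about the centerline, but the proportional term captures the basic error-correcting response the tests evaluated.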

Impacts
Florida horticultural producers are experiencing growing external pressures from global markets, as well as internal pressures from environmentalists, a shrinking labor force, rising workers' compensation and insurance costs, increasing urbanization, and rising real estate costs. As a result, producers must find ways to cut production costs, improve productivity, and increase profitability. Automation technologies offer a means of improving labor productivity and safety while cutting operating costs. This project seeks automation solutions that will benefit producers of horticultural crops.

Publications

  • Subramanian V., T.F. Burks, and S. Singh. 2005. Autonomous Greenhouse Sprayer Vehicle using Machine Vision and Ladar for Steering Control. Applied Engineering in Agriculture 21(5):1-9.
  • Pydipati, R., T.F. Burks, and W.S. Lee, 2005. Statistical and Neural Network Classifiers for Citrus Disease Identification Using Machine Vision. Applied Engineering in Agriculture 48(5):1-8.
  • Singh, S., T.F. Burks, and W.S. Lee. 2005. Autonomous Robotic Vehicle Development for Greenhouse Spraying. Transactions of ASAE (accepted September 27, 2005).