Source: WEST VIRGINIA UNIVERSITY
PRECISION POLLINATION ROBOT
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
EXTENDED
Funding Source
Reporting Frequency
Annual
Accession No.
1011249
Grant No.
2017-67022-25926
Project No.
WVA00N/A
Proposal No.
2016-07920
Multistate No.
(N/A)
Program Code
A7301
Project Start Date
Nov 15, 2016
Project End Date
Nov 14, 2019
Grant Year
2019
Project Director
Gu, Y.
Recipient Organization
WEST VIRGINIA UNIVERSITY
886 CHESTNUT RIDGE RD RM 202
MORGANTOWN, WV 26505-2742
Performing Department
Mech. & Aero. Engineering
Non Technical Summary
One of the major issues concerning current agricultural production is crop pollination. Approximately $24 billion worth of crops per year in the U.S. rely on pollination by various pollinators. However, the recent decline of honey bees (i.e., colony collapse disorder) has greatly threatened productivity. Declines of other native pollinators, including various insects and animals, have also been reported. These pollinator shortages have significantly increased the cost to U.S. farmers of renting pollinators for pollination services. From both economic and food sustainability points of view, there is an urgent need to seek alternative pollination systems. In this project, a multi-disciplinary team of researchers will develop a prototype precision pollination robot for bramble (i.e., blackberry and raspberry) pollination in a greenhouse environment. The project team will use a robotic arm carried by a ground rover for precise flower manipulation. Computer vision algorithms will be used to estimate each flower's position, size, orientation, and physical condition, and to guide the robotic arm to capture and interact with flowers. A set of soft brush tips, mimicking bees' hairs and motion, will then be used to pollinate flowers. The precision rover navigation, mapping, and localization of individual flowers within complex greenhouse environments will be provided through a fusion of multiple types of sensor measurements. A database will be automatically generated and updated by the robot, recording the history of flower development and pollination status. A human operator will collaborate with the robot by supplying agricultural domain knowledge, providing high-level decisions, and correcting mistakes made by the robot.
The efficiency and throughput of the prototype pollinator robot will be evaluated through a comprehensive set of experiments comparing multiple pollination methods. The successful completion of this research project will significantly impact the field of precision agriculture. First, robotic pollinators bypass many current issues with natural pollinators in agriculture, such as honeybee colony collapse disorder, pollinator parasites and diseases, predators, pesticide sprays, adverse weather, and the timely availability of pollinators. Second, robotic pollinators will improve fruit quality and production, through applications such as selective pollination, selective abortion of flowers, and digital cataloging, condition tracking, and yield prediction for fruit production. Finally, the precision localization, evaluation, and manipulation of small and delicate plant parts provide fundamental capabilities for enabling a variety of other precision agriculture applications, such as automated irrigation, fertilization, and harvest; monitoring plant damage; and weed and pest control. The outcomes of this research project will be broadly disseminated to the research community, growers, and the general public. This project will also enhance regional educational and outreach activities and broaden the participation of students from underrepresented groups.
Animal Health Component
0%
Research Effort Categories
Basic
70%
Applied
30%
Developmental
(N/A)
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
402                   1123                             2020                     85%
205                   1123                             1130                     15%
Goals / Objectives
The goal of this research project is to develop a prototype pollinator robot and perform proof-of-concept demonstrations of its effectiveness for bramble pollination in a greenhouse environment. In support of this goal, the project involves the following four main research objectives:
1. Investigate the detailed mechanisms of pollination between bees and flowers in order to provide the knowledge and bio-inspiration for the pollination robot design.
2. Develop autonomous capabilities to precisely locate, evaluate, interact with, and manipulate small and delicate plant structures within unstructured, low-dynamic, and GPS-challenged greenhouse environments.
3. Perform system integration and proof-of-concept demonstrations of precision robotic pollination.
4. Perform a detailed evaluation of the prototype pollinator robot's efficiency and throughput as compared to existing pollination methods.
Project Methods
We will use a robotic arm mounted on an existing ground rover for precision flower access and manipulation. Computer vision algorithms, using images captured in both the visual and UV spectra, will be used to estimate the position, orientation, size, and condition of the flowers. A set of soft brush tips, mimicking bees' hairs (i.e., scopa) and motion, will then be used to pollinate flowers. The design parameters of the delicate robot-flower interface will be driven by a series of insect pollination experiments. The precision rover navigation, mapping, and localization of individual flowers within complex greenhouse environments will be provided through a fusion of GPS, Lidar, camera, inertial sensor, and wheel encoder measurements. A database will be automatically generated and updated by the robot, recording the history of flower development and pollination status. A human operator will collaborate with the robot by supplying agricultural domain knowledge, providing high-level decisions, and correcting mistakes made by the robot. This intelligent system will allow more selective, consistent, and uniform pollination, which has the potential to lead to better fruit set and production at a large scale. The evaluation of the project will be based on a study that compares yield and fruit quality under several different conditions: no pollination (i.e., control), natural insect pollination, manual pollination, robotic pollination, and collaborative human-robot pollination. The effectiveness of pollination will be evaluated by determining the fruit yield per plant, fruit size, fruit weight, harvest time, and the overall distribution of fruit across a plant.

Progress 11/15/17 to 11/14/18

Outputs
Target Audience: Findings from the project were presented at an entomological conference, four U.S. institutions, and two Korean institutions during project Year 2. The importance of bees and pollination robots was presented via outreach with the "WVU Insect Zoo" to more than 200 K-12 students, growers, and members of the public. The WVU greenhouse hosted four tours, including elementary students and Master Gardeners, in which the outline of the project and the importance of the research were explained. Robotic presentations and demonstrations were also provided to the public through several outreach activities, including but not limited to a robotics lab open house, high school visitations, and the WVROX event for STEM outreach. The project attracted attention from several media outlets, with articles written by Wired, Fast Company, and NowThis Future. The project was also reported by three grower trade magazines (Greenhouse Grower Magazine, Produce Grower Magazine, and Fruit Grower News - cover story).
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? During project Year 2, a total of 14 graduate students and 13 undergraduate students were involved in research. The students received hands-on training through participation in this interdisciplinary research project. They also learned teamwork and leadership skills when interacting with other project members.
How have the results been disseminated to communities of interest? The results of this project have been disseminated to the public and scientific communities through two conference papers and several technical presentations. Numerous outreach activities with the WVU Insect Zoo, the WVU greenhouse, and the WVU robotics labs were also performed.
What do you plan to do during the next reporting period to accomplish the goals? As outlined in the original proposal, project Year 3 will focus on: 1) integrating the full pollinator robot system, with the ability to complete the entire pollination sequence autonomously; and 2) evaluating the efficacy and efficiency of multiple pollination methods. In terms of robot system development and integration, the emphasis will be on improving initial robot pose (i.e., position and orientation) estimation using a prior map of the greenhouse, reliably detecting and estimating the poses of individual flowers in dense flower clusters, optimizing planning algorithms for both the drive base and the arm, refining the pollination end-effector design, extending autonomy capabilities to make flexible pollination decisions, and refining the final sequence of pollination actions on real flowers. The evaluation of the prototype pollinator robot's effectiveness will be performed in the WVU Greenhouse. Five methods of pollination will be compared: bee pollination, manual pollination, autonomous robot pollination, and mixed human-robot teaming (with the human performing supplementary manual pollination tasks), in addition to a no-pollination control. The effectiveness of each method will be evaluated by determining the fruit yield per plant, fruit size, fruit weight, harvest time, and overall distribution of fruit across a plant.

Impacts
What was accomplished under these goals?
Horticulture
Two cultivars of blackberry, 'Darrow' and 'Prime-Ark Freedom', were grown in the WVU Greenhouse. 'Darrow' is a thorny floricane blackberry that flowers in the spring, while 'Prime-Ark Freedom' is a thornless primocane that flowers in late summer. These cultivars were selected to provide flowers over two seasons, ensuring BrambleBee would have more opportunities for testing. Flowers were photographed to determine pollen development. Pollen is immature when flowers begin to open; pollination attempts prior to pollen release should be avoided, as they could damage the flower. Identifying pollination-ready flowers with viable pollen is a crucial step for successful pollination.
Entomology
To provide insights into bioinspired robotic arm movement, the mechanisms of pollination between bees and flowers were investigated. An unmanned aerial system was used to map bramble flowers, and videography was used to trace bees' movement among flowers. Although videography provided details on bees' movement on a flower, tracing bees among flowers and plants was very challenging because bees were not readily distinguishable from the background. To overcome this problem, we adopted thermal imaging to track the bees. It was found that when bees were not active (i.e., resting or sleeping), their body temperature was not distinguishable from the background because bees are poikilothermic animals. However, when bees were actively foraging, their thorax temperature increased by 4-6 °C. This study indicates that bees' movement among plants can be readily traced and modeled to provide biological data for designing and planning the robotic arm movement of BrambleBee. We also continued to study the structure of the bee hairs used for pollen gathering (i.e., scopa, the hairs under the abdomen of bees) to acquire parameters that can support the design of the end effector of BrambleBee.
We found that the location and general structure of the scopa differed among Osmia bee species. The efficiency of the hairs is being investigated by measuring the pollen load in the scopa of each species. In addition, we have investigated the mechanism of pollen loading from flowers and unloading to nests. It was found that bees used their middle and hind legs to gather and pack pollen inside the scopa. These legs have evolved to be flattened, and bees used them to sweep pollen from anthers. When pollen was unloaded, the legs moved in the opposite direction to sweep the pollen from the scopa into the nest cells.
BrambleBee Localization and Mapping
Robot localization is performed primarily with a real-time 3D Simultaneous Localization and Mapping (SLAM) algorithm. A factor graph based framework is used to fuse the data from the sensor suite on board BrambleBee to estimate the robot platform's states (i.e., position, orientation, and velocity). To improve the reliability of loop-closure detection, each incoming point cloud scan from the 3D Lidar is matched against a prior 3D map of the greenhouse using the Generalized Iterative Closest Point (Generalized-ICP) algorithm. If the percentage of matching points in the incoming scan is above a user-defined threshold, the loop-closure detection is marked as successful. A transformation of the robot's current pose with respect to the origin of the prior map's coordinate frame (i.e., the global frame) is then added to the factor graph as an observation link, and the factor graph is optimized based on the new link. A 2D occupancy grid map is also generated from the 3D SLAM map for path planning.
BrambleBee Flower Detection
Flower detection is accomplished using computer vision techniques, with the downward-looking fisheye camera for initial, long-range identification and the RGB-D camera on the robotic arm for precise, short-range positioning of the flowers.
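The loop-closure test described under Localization and Mapping (the fraction of scan points that match the prior map, compared against a user-defined threshold) can be sketched as follows. This is a minimal illustration only: a simple nearest-neighbor check stands in for the Generalized-ICP alignment, and the distance and fraction thresholds are assumed values, not the project's actual parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

def loop_closure_detected(scan, prior_map, pose,
                          max_dist=0.10, min_fraction=0.6):
    """Check whether an incoming Lidar scan matches the prior greenhouse map.

    scan      -- (N, 3) points in the robot frame
    prior_map -- (M, 3) points of the prior map in the global frame
    pose      -- 4x4 homogeneous transform taking robot frame -> global frame
    """
    # Express the scan in the global frame using the current pose estimate.
    hom = np.hstack([scan, np.ones((len(scan), 1))])
    scan_global = (pose @ hom.T).T[:, :3]

    # Distance from every scan point to its nearest neighbor in the map.
    dists, _ = cKDTree(prior_map).query(scan_global)

    # Mark the loop closure as successful when the fraction of matching
    # points exceeds the user-defined threshold.
    return float(np.mean(dists < max_dist)) >= min_fraction
```

When the check succeeds, the estimated map-relative pose would be added to the factor graph as an observation link, as described above.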
The long-range algorithm can be separated into two parts. First, a rough segmentation is performed based on color to extract the parts of the image that are most likely to belong to a flower. This is done with a naive Bayes classifier that labels pixels as belonging or not belonging to flowers. The resulting segmentation produces a number of false positives due to colors shared between flowers and other objects, so an approach based on transfer learning is adopted to distinguish true from false positives in the initial segmentation. After the flowers are identified, they are positioned in the map using unit vectors pointing from the camera to each flower. The short-range algorithm, using the RGB-D camera on the arm, performs a reconstruction of the plant using real-time dense SLAM. After the plant is reconstructed, the classification algorithm described above is used to identify the flowers in the series of images used for reconstruction.
BrambleBee Autonomy
Given information about the locations of flower clusters and the robot's location within the environment, BrambleBee makes efficient plans to pollinate the plants in the greenhouse. First, to obtain up-to-date information about flower cluster locations and pollination readiness, BrambleBee makes an "inspection pass" of the greenhouse. The inspection path is generated by first discretizing the greenhouse environment into a Voronoi graph and then solving for a path through the nodes of the graph. The path is found using visibility graph techniques and is constrained so that the robot inspects all rows of plants while maintaining a balance between distance driven and safety from obstacles. As BrambleBee drives the inspection path, nearby flower clusters are detected using the on-board fisheye camera, and the locations of the detected clusters are recorded into a map of the plant rows.
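The per-pixel naive Bayes color classification used in the long-range algorithm can be sketched as follows. This minimal Gaussian model over the three RGB channels is an illustrative assumption, not the project's trained classifier; the class names and training data here are hypothetical.

```python
import numpy as np

class PixelNaiveBayes:
    """Gaussian naive Bayes over RGB channels: 1 = flower, 0 = background."""

    def fit(self, pixels, labels):
        # pixels: (N, 3) RGB values; labels: (N,) class indices
        self.classes = np.unique(labels)
        self.mean, self.var, self.prior = {}, {}, {}
        for c in self.classes:
            x = pixels[labels == c].astype(float)
            self.mean[c] = x.mean(axis=0)
            self.var[c] = x.var(axis=0) + 1e-6   # avoid division by zero
            self.prior[c] = len(x) / len(pixels)
        return self

    def predict(self, pixels):
        # Log-likelihood under an independent Gaussian per channel, per class,
        # plus the log class prior; pick the highest-scoring class per pixel.
        pixels = pixels.astype(float)
        scores = []
        for c in self.classes:
            ll = -0.5 * np.sum(
                np.log(2 * np.pi * self.var[c])
                + (pixels - self.mean[c]) ** 2 / self.var[c], axis=1)
            scores.append(ll + np.log(self.prior[c]))
        return self.classes[np.argmax(scores, axis=0)]
```

As in the text, a classifier like this over-segments (false positives from shared colors), which is why a second, learned stage is needed to filter the candidates.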
Each row of plants is discretized into an equal number of "grid cells" on both sides of the row. The number of detected flower clusters that intersect a particular grid cell is updated whenever the robot is within reliable visible range of that cell. After the inspection phase is completed and flower locations have been identified, BrambleBee decides where to go to pollinate flowers. Pollination locations are chosen by balancing the number of reachable flower clusters ready to be pollinated against the distance driven in the robot drive base's configuration space. The order in which to visit pollination locations is found using a greedy heuristic that chooses the next grid cell to pollinate by selecting the one with the minimum cost. Paths are then planned to reach these locations efficiently while avoiding obstacles.
BrambleBee Manipulation
Once the robot arrives at a pollination location, BrambleBee uses its robotic arm to precisely map individual flowers in the local workspace and then plans and executes an efficient sequence of actions to pollinate all reachable flowers. The workspace mapping is performed by moving the end-effector of the robotic arm in a survey pattern around the nearby plant while the poses of detected flowers are estimated and recorded into a local database using the depth camera on the end of the arm. Once all flowers in the local workspace have been identified, the sequence of flowers to pollinate is chosen to minimize the total distance traveled in the robot arm's joint configuration space. After the sequence has been determined, collision-free paths are planned to reach observation positions directly in front of each flower. Once the end-effector arrives at one of these goal destinations, it is parked in front of the flower, ready to perform the precise final approach and pollinate the flower.
During the final approach maneuver, visual-servoing is performed to guide the tip of the end-effector precisely into the flower.
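Both ordering problems above (which grid cell to pollinate next, and which flower to visit next in the arm's joint space) follow the same greedy pattern: repeatedly pick the cheapest remaining option. A minimal nearest-neighbor sketch of the joint-space flower sequencing is shown below; the true planner may minimize total distance more exactly, and the joint configurations are illustrative assumptions.

```python
import numpy as np

def greedy_sequence(start, joint_goals):
    """Order flower goals by repeatedly taking the nearest one in joint space.

    start       -- (D,) current arm joint configuration
    joint_goals -- (K, D) joint configurations that reach each flower
    Returns the visit order as a list of indices into joint_goals.
    """
    remaining = list(range(len(joint_goals)))
    order, current = [], np.asarray(start, dtype=float)
    while remaining:
        # Pick the unvisited goal with the smallest joint-space distance.
        nxt = min(remaining,
                  key=lambda i: np.linalg.norm(joint_goals[i] - current))
        order.append(nxt)
        current = joint_goals[nxt]
        remaining.remove(nxt)
    return order
```

For example, with the arm at the origin of a two-joint space and goals at distances 5, 1, and 2 along one joint, the greedy order is the nearest goal first, then the next nearest from there.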

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2018 Citation: Ohi, N., Lassak, K., Watson, R., Strader, J., Du, Y., Yang, C., Hedrick, G., Nguyen, J., Harper, S., Reynolds, D., Kilic, C., Hikes, J., Mills, S., Castle, C., Buzzo, B., Waterland, N., Gross, J., Park, Y., Li, X., Gu, Y., Design of an Autonomous Precision Pollination Robot, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, October, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Park, Y.-L. 2018. Osmia cornifrons as a model for pollinator robot. The Apicultural Society of Korea Annual Conference. Gwangju, Korea.


Progress 11/15/16 to 11/14/17

Outputs
Target Audience: Findings from research in the project were presented at two annual conferences: the Entomological Society of America Annual Meeting and the West Virginia Entomological Society Meeting. At these conferences, which over 3,000 people attended, two presentations associated with the project were given. In addition, applications of the pollinator robot were presented to stakeholder groups, including the WV Beekeepers Association and the WV Master Gardeners Conference, attended by more than 200 people. Current pollinator issues and the development of pollinator robots were presented at the Organic Farm Field Day at West Virginia University (WVU), attended by about 50 growers, extension agents, and members of the public. The importance of pollinator bees and a potential solution with bio-inspired engineering were presented to the public via outreach with the "WVU Insect Zoo" that a co-PI operates at WVU. More than 4,000 West Virginia citizens, including K-12 students, visited the project's booth at the WV State Fair, and 14 additional outreach services were provided to serve more than 400 citizens in 2017. Robotic presentations and demonstrations were also provided to the public through several outreach activities, including but not limited to a robotics lab open house, high school visitations, and a "day in the park" event for STEM outreach.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? During project Year 1, a total of 12 graduate students and 2 undergraduate students were involved in this project. The students received hands-on training through participation in this interdisciplinary research project. They also learned teamwork and leadership skills when interacting with other project members.
How have the results been disseminated to communities of interest? The results of this project have been disseminated to the public and scientific communities through two annual entomological conferences, two extension conferences, the WVU Farm Field Day, and numerous outreach activities with the WVU Insect Zoo, the WVU greenhouse, and the WVU robotics labs.
What do you plan to do during the next reporting period to accomplish the goals? With the development of the BrambleBee robot and a suite of software functions, the Year 1 research effort provided a solid foundation toward achieving the proposed robotic pollination capabilities. During Year 2, we will focus on system integration, bringing currently isolated capabilities into a unified system. Specifically, the robot localization and mapping algorithms will run onboard BrambleBee, along with the ability to navigate efficiently and safely in the cluttered greenhouse environment. The planning algorithms will also run online, allowing the robot to reach different parts of the greenhouse efficiently. The flower detection and pose estimation algorithms will be coupled with the robotic arm control schemes, leading to demonstrations of robotic pollination (from a fixed location) of real flowers. Toward the end of Year 2, the robotic arm will be integrated onto the mobile robot platform, paving the way for full system demonstrations and evaluations in Year 3. These integration tasks will be supported by technology development and refinement in several areas, such as sensor calibration and rolling shutter compensation, flower pose estimation, semantic mapping, robot autonomy architecture design, flower pollination sequence planning, bee-inspired pollination interface design, detailed pollination motions, and the human-machine interface.
Many problems and inspirations are expected to arise during the system development, integration, and testing phases, leading to new research problems to be addressed by the team. During Year 2, the optimal foraging and movement patterns of bees will be investigated to understand the foraging mechanism (e.g., paths and approaches to flowers) in support of the robot development. We will use videography to determine patterns of foraging and movement through dense leaves and branches. In addition, a behavioral bioassay will be conducted with a high-speed video camera to describe the bees' movement within flowers and to determine how each bee uses its hairs and body parts to access and gather pollen. The horticulture team will continue preparing to validate the efficacy of the robotic pollinator. The team will conduct a preliminary experiment comparing the efficiency of pollination methods; this experiment will help identify any potential issues that might occur during the final validation. The experiment will consist of five treatments (bee pollination, manual pollination, autonomous robot pollination, mixed human-robot team, and a no-pollination control) with six replications.

Impacts
What was accomplished under these goals?
Horticulture
Four cultivars of red raspberry and blackberry were purchased and grown in the WVU Evansdale Greenhouse. Cultivar selection was based upon various traits such as formation of primocanes/floricanes, capability of self-pollination, presence or absence of thorns, trellising requirements, winter chilling hours, flowering time, number of flowers, flower size, and fruit quality. Two blackberry cultivars, 'Darrow' and 'Prime-Ark Freedom', were selected for the robotic experiments. The optimum growing temperature (24/21 °C, day/night) for both species of brambles was determined. A trellis was designed and built to facilitate logistic movement of the robot as well as to support the bee netting in the greenhouse.
Entomology
The structure of hairs, and the function and behavior of bees, were examined in order to acquire parameters that can support the robot design. First, the structure of hairs on bee bodies (i.e., scopa) was examined. Mason bees, belonging to the genus Osmia, were used as a model. We found that the location and general structure of the scopa were similar among different species of Osmia bees, but their ultrastructure showed variation in size, density, and shape. Further analyses of the data will help design the structure of the pollination interface on the end effector of the robotic arm. Second, the visiting behavior of bees in the field was observed with the aid of digital recording. It was found that different bees showed distinctive flower searching and visiting patterns. Lastly, we collected data with a high-speed camera for a behavioral bioassay to describe the movement of bees within flowers and to determine how each bee uses its hairs and body parts to access and gather pollen.
BrambleBee Robot System Design
A prototype pollination robot, named BrambleBee, was built around a Clearpath Robotics Husky platform and a Kinova JACO2 robotic arm.
BrambleBee was instrumented with a collection of high-quality sensors to provide measurements of the robot state and to perceive the surrounding environment. These include a Novatel SPAN GPS/INS unit, a Velodyne HDL-32E 32-channel 3D Lidar, a FLIR 5MP Blackfly-S camera outfitted with a fisheye lens, and a SoftKinetic DS525 time-of-flight camera mounted on the robotic arm. To acquire accurate sensor measurements in support of the robot perception system, a series of sensor calibration efforts is currently underway. First, the wide-field-of-view fisheye camera is being calibrated to reduce optical distortions. Second, the alignment between the Lidar and the fisheye camera is being estimated to improve the matching between objects recognized in the images (e.g., flower clusters) and the associated depth measurements provided by the Lidar.
Localization and Mapping
To achieve fast, accurate, and robust robot localization in the greenhouse, two approaches were developed. The first approach matched Lidar measurements with prior knowledge of the greenhouse geometry to produce a fast estimate of the robot pose (i.e., position and orientation). The second approach used Lidar and wheel encoder measurements to perform Simultaneous Localization and Mapping (SLAM).
Robot Autonomy
A detailed onboard autonomy software architecture for BrambleBee is being designed and implemented. In this design, a Mission Planning node contains most of the high-level planning and decision making algorithms. It decides favorable locations for the robot to park that would maximize the robotic arm's pollination efficiency. A Map Server node maintains a dynamic map of flower cluster locations and estimates the number of flowers to be pollinated, which is used by the Mission Planning node to decide where to pollinate next. It also provides prior knowledge of obstacles, in the form of occupancy grid maps, to a Path Planner node.
The Path Planner node takes in driving destinations, in the form of a desired robot pose, as well as information about known obstacles from the Map Server node. It dynamically detects obstacles from the Lidar to plan or re-plan paths to reach the destination. A set of Sensor Sources and Localization nodes provides information about the robot and its environment. The Sensor Transforms nodes represent the coordinate transforms between the robot's sensors and the robot body centroid. Finally, a Base Controller node takes the commanded wheel velocities from the Path Planner node and controls the robot's drivetrain. A planning algorithm was developed for the initial greenhouse inspection; its objective is to enable BrambleBee to see each plant from all reachable viewing angles while minimizing the total drive distance.
Computer Vision
Over 2,400 images were collected at the McConnell Berry Farm covering various growth stages of the plants. The flowers in the images were manually labeled as either a single flower or a cluster of flowers using a custom-developed Graphical User Interface. For each single flower, the pose and growth stage were specified using a set of reference images; for each flower cluster, the number of flowers in the cluster was specified. The purpose of the labeling effort was to train machine learning algorithms and to develop a system for reliably evaluating the developed segmentation, classification, and pose estimation approaches. An image segmentation method was developed to quickly remove most parts of the images that are unlikely to be bramble flowers. To accelerate the segmentation process, a color lookup table was prebuilt in the RGB color space. During the segmentation step, the raw RGB values of each image are used to index into the lookup table and generate a binary mask. A deep learning algorithm was then developed to identify flowers ready for pollination in the segmented images.
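The prebuilt RGB lookup-table segmentation described above can be sketched as follows. The quantization to 32 bins per channel and the construction of the table directly from example flower pixels are illustrative assumptions; the actual table was presumably populated from trained color statistics.

```python
import numpy as np

def build_lut(flower_pixels, bins=32):
    """Mark every quantized RGB cell that contains known flower pixels."""
    lut = np.zeros((bins, bins, bins), dtype=bool)
    q = (flower_pixels.astype(int) * bins) // 256    # map 0..255 -> 0..bins-1
    lut[q[:, 0], q[:, 1], q[:, 2]] = True
    return lut

def segment(image, lut):
    """Binary mask: True where a pixel's quantized color is in the table."""
    bins = lut.shape[0]
    q = (image.astype(int) * bins) // 256
    return lut[q[..., 0], q[..., 1], q[..., 2]]
```

Because segmentation reduces to one table lookup per pixel, this step is fast enough to run over full images before the slower deep-learning stage.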
Research was also performed on the 3D reconstruction of the local environment (i.e., the robot workspace). Using data collected with a DS525 camera attached to the robotic arm, an incremental structure-from-motion approach was implemented to estimate the camera poses and three-dimensional points in the robot workspace.
Manipulation
The basic functionality required to move the robot end effector into poses from which it can pollinate flowers has been established. This ability was demonstrated through subsystem testing in which pollination was simulated on a dummy plant with artificial markers while the arm was mounted on a table. In the tested pollination procedure, the arm was first controlled to a pre-determined search position, where the robot could detect markers using the DS525 camera. From the markers' pose estimates, goal poses of the end effector were determined. The inverse kinematics were solved to determine the robot joint angles required to reach each goal pose. The sequence in which to visit the markers was then determined to minimize the motion needed to visit each goal pose in an obstacle-free environment. Path planning was then performed to determine a collision-free path to the next goal pose in the sequence. Once the end effector reached a goal pose, the arm began the final approach to the flower. In the final approach, visual servoing was performed in a two-step process: first parallel to the marker, to align the pollinator axis with the center of the marker; then along the pollinator axis until contact with the marker was made. After contact was estimated to have been made, based on position estimates, the arm proceeded to the next flower in the sequence until all simulated flowers had been visited. A more detailed report with supporting figures and diagrams is available upon request.
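One way to realize the two-step final approach described above is a simple proportional rule: first translate perpendicular to the pollinator axis until the lateral error is small, then advance along the axis until contact. The sketch below works in positions only; the gain, tolerance, and step size are illustrative assumptions, not BrambleBee's actual controller.

```python
import numpy as np

def visual_servo_step(tip, target, axis, lateral_tol=0.002,
                      gain=0.5, approach_step=0.005):
    """One control update of the two-step final approach.

    tip, target -- 3D positions of the end-effector tip and the flower center
    axis        -- unit vector along the pollinator's approach axis
    Returns the new tip position.
    """
    error = target - tip
    # Component of the error perpendicular to the approach axis.
    lateral = error - np.dot(error, axis) * axis
    if np.linalg.norm(lateral) > lateral_tol:
        # Step 1: translate parallel to the flower plane to center the tip.
        return tip + gain * lateral
    # Step 2: advance along the pollinator axis until contact is detected.
    return tip + approach_step * axis
```

Iterating this update drives the tip laterally onto the axis through the flower center first, then straight in along that axis, matching the two-phase behavior described in the subsystem test.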

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Pollinator robot inspired by structure and behavior of Osmia bees (Hymenoptera: Megachilidae). Annual Meeting of Entomological Society of America, Denver, CO.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Precision pollinator robot. Annual Meeting of West Virginia Entomological Society, Cairo, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Bees: current issues, future prediction, and mitigation. Annual West Virginia Extension and Master Gardeners Conference, Roanoke, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Mason bees: propagation and management. West Virginia Panhandle Beekeepers Association Meeting, Martinsburg, WV.