Source: WEST VIRGINIA UNIVERSITY submitted to
PRECISION POLLINATION ROBOT
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
NEW
Funding Source
Reporting Frequency
Annual
Accession No.
1011249
Grant No.
2017-67022-25926
Project No.
WVA00N/A
Proposal No.
2016-07920
Multistate No.
(N/A)
Program Code
A7301
Project Start Date
Nov 15, 2016
Project End Date
Nov 14, 2018
Grant Year
2017
Project Director
Gu, Y.
Recipient Organization
WEST VIRGINIA UNIVERSITY
886 CHESTNUT RIDGE RD RM 202
MORGANTOWN, WV 26505-2742
Performing Department
Mech. & Aero. Engineering
Non Technical Summary
One of the major issues concerning current agricultural production is crop pollination. Approximately $24 billion per year worth of crops in the U.S. rely on pollination by various pollinators. However, the recent decline of honey bees (i.e., colony collapse disorder) has greatly threatened productivity. Declines of other native pollinators, including various insects and animals, have also been reported. Such shortages of pollinators in the U.S. have significantly increased the cost to farmers of renting them for pollination services. From both economic and food-sustainability points of view, there is an urgent need to seek alternative pollination systems.
In this project, a multi-disciplinary team of researchers will develop a prototype precision pollination robot for bramble (i.e., blackberry and raspberry) pollination in a greenhouse environment. The project team will use a robotic arm carried by a ground rover for precise flower manipulation. Computer vision algorithms will be used to estimate flower position, size, orientation, and physical condition, and to guide the robotic arm to capture and interact with flowers. A set of soft brush tips, mimicking bees' hairs and motion, will then be used to pollinate flowers. Precision rover navigation, mapping, and localization of individual flowers within complex greenhouse environments will be provided through a fusion of multiple types of sensor measurements. A database will be automatically generated and updated by the robot, recording the history of flower development and pollination status. A human operator will collaborate with the robot by supplying agricultural domain knowledge, providing high-level decisions, and correcting mistakes made by the robot.
The efficiency and throughput of the prototype pollinator robot will be evaluated through a comprehensive set of experiments comparing multiple pollination methods.
The successful completion of this research project will significantly impact the field of precision agriculture. First, a robotic pollinator bypasses many current issues with natural pollinators in agriculture, such as honey bee colony collapse disorder, pollinator parasites and diseases, predators, pesticide sprays, adverse weather, and the timely availability of pollinators. Second, robotic pollinators will improve fruit quality and production through applications such as selective pollination, selective abortion of flowers, and digital cataloging, condition tracking, and yield prediction for fruit production. Finally, the precision localization, evaluation, and manipulation of small and delicate plant parts provide fundamental capabilities for enabling a variety of other precision agriculture applications, such as automated irrigation, fertilization, and harvest; monitoring of plant damage; and weed and pest control.
The outcomes of this research project will be broadly disseminated to the research community, growers, and the general public. This project will also enhance regional educational and outreach activities and broaden the participation of students from underrepresented groups.
Animal Health Component
0%
Research Effort Categories
Basic
70%
Applied
30%
Developmental
(N/A)
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
402                   1123                             2020                     85%
205                   1123                             1130                     15%
Goals / Objectives
The goal of this research project is to develop a prototype pollinator robot and perform proof-of-concept demonstrations of its effectiveness for bramble pollination in a greenhouse environment. In support of this goal, the project involves the following four main research objectives:
1. Investigate the detailed mechanisms of pollination between bees and flowers in order to provide the knowledge and bio-inspiration for the pollination robot design.
2. Develop autonomous capabilities to precisely locate, evaluate, interact with, and manipulate small and delicate plant structures within unstructured, low-dynamic, and GPS-challenged greenhouse environments.
3. Perform system integration and proof-of-concept demonstrations of precision robotic pollination.
4. Perform a detailed evaluation of the prototype pollinator robot's efficiency and throughput as compared to existing pollination methods.
Project Methods
We will use a robotic arm mounted on an existing ground rover for precision flower access and manipulation. Computer vision algorithms, using images captured in both the visual and UV spectra, will be used to estimate the position, orientation, size, and condition of the flowers. A set of soft brush tips, mimicking bees' hairs (i.e., scopa) and motion, will then be used to pollinate flowers. The design parameters of the delicate robot-flower interface will be driven by a series of insect pollination experiments. Precision rover navigation, mapping, and localization of individual flowers within complex greenhouse environments will be provided through a fusion of GPS, Lidar, camera, inertial sensor, and wheel encoder measurements. A database will be automatically generated and updated by the robot, recording the history of flower development and pollination status. A human operator will collaborate with the robot by supplying agricultural domain knowledge, providing high-level decisions, and correcting mistakes made by the robot. This intelligent system will allow more selective, consistent, and uniform pollination, which has the potential to lead to better fruit set and production at a large scale. The evaluation of the project will be based on a study that compares yield and fruit quality under several conditions: no pollination (i.e., control), natural insect pollination, manual pollination, robotic pollination, and collaborative human-robot pollination. The effectiveness of pollination will be evaluated by determining fruit yield per plant, fruit size, fruit weight, harvest time, and the overall distribution of fruit across a plant.

Progress 11/15/16 to 11/14/17

Outputs
Target Audience: Findings from research in the project were presented at two annual conferences: the Entomological Society of America Annual Meeting and the West Virginia Entomological Society Meeting. At these conferences, which over 3,000 people attended, two presentations associated with the project were given. In addition, applications of the pollinator robot were presented to stakeholder groups, including the WV Beekeepers Association and the WV Master Gardeners Conference, where more than 200 people attended. Current pollinator issues and the development of pollinator robots were presented at the Organic Farm Field Day at West Virginia University (WVU), attended by about 50 growers, extension agents, and members of the public. The importance of pollinator bees and a potential solution using bio-inspired engineering were presented to the public via outreach with the "WVU Insect Zoo" that a co-PI operates at WVU. More than 4,000 West Virginia citizens, including K-12 students, visited the project's booth at the WV State Fair, and 14 additional outreach services were provided to more than 400 citizens in 2017. Robotic presentations and demonstrations were also provided to the public through several outreach activities, including but not limited to a robotics lab open house, high school visitations, and a "day in the park" STEM outreach event.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? During project Year 1, a total of 12 graduate students and 2 undergraduate students were involved in this project. The students received hands-on training through participation in this interdisciplinary research project. They also learned teamwork and leadership skills while interacting with other project members.
How have the results been disseminated to communities of interest? The results of this project have been disseminated to the public and scientific communities through two annual entomological conferences, two extension conferences, the WVU Farm Field Day, and numerous outreach activities with the WVU Insect Zoo, the WVU greenhouse, and the WVU robotics labs.
What do you plan to do during the next reporting period to accomplish the goals? With the development of the BrambleBee robot and a suite of software functions, the Year 1 research effort provided a solid foundation toward achieving the proposed robotic pollination capabilities. During Year 2, we will focus on system integration, bringing currently isolated capabilities into unified systems. Specifically, the robot localization and mapping algorithms will run onboard BrambleBee, along with the ability to navigate efficiently and safely in the cluttered greenhouse environment. The planning algorithms will also run online, allowing the robot to reach different parts of the greenhouse efficiently. The flower detection and pose estimation algorithms will be coupled with the robotic arm control schemes, leading to demonstrations of robotic pollination (from a fixed location) of real flowers. Toward the end of Year 2, the robotic arm will be integrated onto the mobile robot platform, paving the way for full system demonstrations and evaluations in Year 3. These integration tasks will need to be supported by technology development and refinement in several areas, such as sensor calibration and rolling-shutter compensation, flower pose estimation, semantic mapping, robot autonomy architecture design, flower pollination sequence planning, bee-inspired pollination interface design, detailed pollination motions, and the human-machine interface.
Many problems and inspirations are expected to arise during the system development, integration, and testing phases, which will lead to new research problems for the team to address. During Year 2, the optimal foraging and movement patterns of bees will be investigated to understand the foraging mechanism (e.g., paths and approaches to flowers) in support of the robot development. We will use videography to determine patterns of foraging and movement through dense leaves and branches. In addition, a behavioral bioassay will be conducted with a high-speed video camera to describe the bees' movement within flowers and to determine how each bee uses its hairs and body parts to access and gather pollen. The horticulture team will continue preparing to validate the efficacy of the robotic pollinator. The team will conduct a preliminary experiment comparing the efficiency of pollination methods. This experiment will help identify any potential issues that might occur during the final validation. The experiment will consist of five treatments (bee pollination, manual pollination, autonomous robot pollination, a mixed human-robot team, and a no-pollination control) with six replications.

Impacts
What was accomplished under these goals?
Horticulture
Four cultivars of red raspberry and blackberry were purchased and grown in the WVU Evansdale Greenhouse. Cultivar selection was based upon various traits such as formation of primocanes/floricanes, capability of self-pollination, presence or absence of thorns, trellising requirements, winter chilling hours, flowering time, number of flowers, flower size, and fruit quality. Two blackberry cultivars, 'Darrow' and 'Prime-Ark Freedom', were selected for the robotic experiments. The optimum growing temperature (24/21 °C, day/night) for both bramble species was determined. A trellis was designed and built to facilitate movement of the robot as well as to support the bee netting in the greenhouse.
Entomology
The structure and function of hairs and the behavior of bees were examined in order to acquire parameters that can support the robot design. First, the structure of hairs on bee bodies (i.e., scopa) was examined. Mason bees, belonging to the genus Osmia, were used as a model. We found that the location and general structure of the scopa were similar among different species of Osmia bees, but their ultrastructure showed variation in size, density, and shape. Further analyses of the data will help design the structure of the pollination interface on the end effector of the robotic arm. Second, the visiting behavior of bees in the field was observed with the aid of digital recording. It was found that different bees showed distinctive flower searching and visiting patterns. Lastly, we have collected data with a high-speed camera for a behavioral bioassay to describe the movement of bees within flowers and to determine how each bee uses its hairs and body parts to access and gather pollen.
BrambleBee Robot System Design
A prototype pollination robot, named BrambleBee, was built around a Clearpath Robotics Husky platform and a Kinova JACO2 robotic arm.
BrambleBee was instrumented with a collection of high-quality sensors to provide measurements of the robot state and to perceive the surrounding environment. These include a NovAtel SPAN GPS/INS unit, a Velodyne HDL-32E 32-channel 3D Lidar, a FLIR 5MP Blackfly-S camera outfitted with a fisheye lens, and a SoftKinetic DS525 time-of-flight camera mounted on the robotic arm. To acquire accurate sensor measurements in support of the robot perception system, a series of sensor calibration efforts are currently underway. First, the wide-field-of-view fisheye camera is being calibrated to reduce optical distortions. Second, the alignment between the Lidar and the fisheye camera is being estimated to improve the matching between objects recognized in the images (e.g., flower clusters) and the associated depth measurements provided by the Lidar.
Localization and Mapping
To achieve fast, accurate, and robust robot localization in the greenhouse, two approaches were developed. The first approach matched Lidar measurements with prior knowledge of the greenhouse geometry to produce a fast estimate of the robot pose (i.e., position and orientation). The second approach used Lidar and wheel encoder measurements to perform Simultaneous Localization and Mapping (SLAM).
Robot Autonomy
A detailed onboard autonomy software architecture for BrambleBee is being designed and implemented. In this design, a Mission Planning node contains most of the high-level planning and decision-making algorithms. It decides favorable locations for the robot to park at to maximize the robotic arm's pollination efficiency. A Map Server node contains a dynamic map of flower cluster locations and estimates the number of flowers to be pollinated, which is used by the Mission Planning node to decide where to pollinate next. It also provides prior knowledge of obstacles in the form of occupancy grid maps to a Path Planner node.
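The SLAM approach fuses Lidar with wheel encoder measurements. As a minimal, self-contained illustration of the encoder side only (not BrambleBee's actual implementation), the sketch below integrates one step of differential-drive odometry for a skid-steer platform such as the Husky; the function name and the midpoint-integration choice are illustrative assumptions.

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Integrate one step of differential-drive odometry.

    d_left / d_right are the distances (m) each wheel side travelled
    since the last update, derived from the wheel encoders; wheel_base
    is the lateral distance between the wheels. Returns the updated
    planar pose (x, y, theta).
    """
    d_center = (d_left + d_right) / 2.0        # forward travel of the body
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    # Midpoint integration: advance along the average heading of the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
    return x, y, theta
```

In a full SLAM pipeline, this dead-reckoned pose would serve only as the motion prior; the Lidar scan matching supplies the corrections that keep the estimate from drifting.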
The Path Planner node takes in driving destinations in the form of a desired robot pose, as well as information about known obstacles from the Map Server node. It dynamically detects obstacles from the Lidar to plan or re-plan paths to the destination. A set of Sensor Sources and Localization nodes provides information about the robot and its environment. The Sensor Transforms nodes represent coordinate transforms between the robot's sensors and the robot body centroid. Finally, a Base Controller node takes the commanded wheel velocities from the Path Planner node and controls the robot's drivetrain. A planning algorithm was developed for initial greenhouse inspection. The objective of this algorithm is to enable BrambleBee to see each plant from all reachable viewing angles while minimizing the total drive distance.
Computer Vision
Over 2,400 images were collected at the McConnell Berry Farm across various growth stages of the plants. The flowers in the images were manually labeled as either a single flower or a cluster of flowers using a custom-developed graphical user interface. For each single flower, the pose and growth stage of the flower were specified using a set of reference images. For each flower cluster, the number of flowers in the cluster was specified. The purpose of the labeling effort was to train machine learning algorithms and to develop a system for reliably evaluating the developed segmentation, classification, and pose estimation approaches. An image segmentation method was developed to quickly remove most parts of the images that are unlikely to be bramble flowers. To accelerate the segmentation process, a color lookup table was prebuilt in the RGB color space. During the segmentation step, the raw RGB values of each image were used to index the lookup table and generate a binary mask. A deep-learning-based algorithm was then developed to identify flowers ready for pollination in the segmented images.
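The lookup-table segmentation described above can be sketched as follows. This is an illustrative reconstruction, not the project's code: the color rule, quantization level, and function names are assumptions, and the real table would be derived from the labeled flower images rather than a hand-written rule. The key idea it demonstrates is that per-pixel classification reduces to an array indexing operation once the table is prebuilt.

```python
import numpy as np

def build_lut(classifier, levels=64):
    """Precompute a binary 'flower-like color' lookup table.

    `classifier` is any function mapping an (r, g, b) triple in [0, 255]
    to True/False. RGB space is quantized to levels**3 bins so the table
    stays small enough to precompute and hold in memory.
    """
    q = 256 // levels  # quantization step per channel
    lut = np.zeros((levels, levels, levels), dtype=bool)
    for r in range(levels):
        for g in range(levels):
            for b in range(levels):
                lut[r, g, b] = classifier(r * q, g * q, b * q)
    return lut, q

def segment(image, lut, q):
    """Generate a binary mask by indexing the LUT with quantized RGB values."""
    idx = image // q  # (H, W, 3) integer bin indices
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]
```

For example, a placeholder rule flagging bright, near-white pixels (bramble petals are white) could be passed in as `lambda r, g, b: r > 200 and g > 200 and b > 200`; the per-image cost is then a single vectorized lookup regardless of how expensive the classifier was to evaluate.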
Research was also performed on 3D reconstruction of the local environment (i.e., the robot workspace). Using data collected with the DS525 camera attached to the robotic arm, an incremental structure-from-motion approach was implemented to estimate the camera poses and three-dimensional points in the robot workspace.
Manipulation
The basic functionality required to maneuver the robot end effector into poses in which it can pollinate flowers has been established. This ability was demonstrated through subsystem testing in which pollination was simulated on a dummy plant with artificial markers while the arm was mounted on a table. In the tested pollination procedure, the arm was first controlled to a pre-determined search position, where the robot could detect markers using the DS525 camera. From the markers' pose estimates, goal poses of the end effector were determined. The inverse kinematics were solved to determine the robot joint angles required to reach each goal pose. The sequence in which to visit the markers was then determined so as to minimize the motion needed to visit each goal pose in an obstacle-free environment. Path planning was then performed to determine a collision-free path to the next goal pose in the sequence. Once the end effector reached a goal pose, the arm began the final approach to the flower. In the final approach, visual servoing was performed in a two-step process: first parallel to the marker, to align the pollinator axis with the center of the marker; then along the pollinator axis until contact with the marker was made. After contact was estimated to have been made, based on position estimates, the arm proceeded to the next flower in the sequence, until all simulated flowers had been visited. A more detailed report with supporting figures and diagrams is available upon request.
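The report does not specify how the visiting sequence is computed; a greedy nearest-neighbor ordering is one simple way to approximate "minimize the motion needed to visit each goal pose" (the exact problem is a traveling-salesman instance). The sketch below uses hypothetical names and straight-line distance between goal positions as the motion cost, which is an assumption.

```python
import math

def visit_order(start, goals):
    """Greedy nearest-neighbor ordering of flower goal positions.

    `start` is the end effector's current (x, y, z) position and `goals`
    a list of goal positions. Repeatedly visits the closest unvisited
    goal; returns the indices of `goals` in visiting order.
    """
    remaining = list(range(len(goals)))
    order, current = [], start
    while remaining:
        nxt = min(remaining, key=lambda i: math.dist(current, goals[i]))
        order.append(nxt)
        current = goals[nxt]
        remaining.remove(nxt)
    return order
```

A greedy ordering is not guaranteed optimal, but with the handful of flowers reachable from one parking spot it is cheap to compute and typically close to the shortest tour.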

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Pollinator robot inspired by structure and behavior of Osmia bees (Hymenoptera: Megachilidae). Annual Meeting of Entomological Society of America, Denver, CO.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Precision pollinator robot. Annual Meeting of West Virginia Entomological Society, Cairo, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Bees: current issues, future prediction, and mitigation. Annual West Virginia Extension and Master Gardeners Conference, Roanoke, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Mason bees: propagation and management. West Virginia Panhandle Beekeepers Association Meeting, Martinsburg, WV.