Source: WEST VIRGINIA UNIVERSITY
PRECISION POLLINATION ROBOT
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
TERMINATED
Funding Source
Reporting Frequency
Annual
Accession No.
1011249
Grant No.
2017-67022-25926
Project No.
WVA00N/A
Proposal No.
2016-07920
Multistate No.
(N/A)
Program Code
A7301
Project Start Date
Nov 15, 2016
Project End Date
Feb 14, 2021
Grant Year
2019
Project Director
Gu, Y.
Recipient Organization
WEST VIRGINIA UNIVERSITY
886 CHESTNUT RIDGE RD RM 202
MORGANTOWN, WV 26505-2742
Performing Department
Mech. & Aero. Engineering
Non Technical Summary
One of the major issues concerning current agricultural production is crop pollination. Approximately $24 billion per year worth of crops in the U.S. rely on pollination by various pollinators. However, the recent decline of honey bees (i.e., colony collapse disorder) has greatly threatened productivity. Declines of other native pollinators, such as other insect species and animals, have also been reported. These pollinator shortages have significantly increased the cost to U.S. farmers of renting pollinators for pollination services. From both economic and food-sustainability points of view, there is an urgent need to seek alternative pollination systems.

In this project, a multi-disciplinary team of researchers will develop a prototype precision pollination robot for bramble (i.e., blackberry and raspberry) pollination in a greenhouse environment. The project team will use a robotic arm carried by a ground rover for precise flower manipulation. Computer vision algorithms will be used to estimate flower position, size, orientation, and physical condition, and to guide the robotic arm to capture and interact with flowers. A set of soft brush tips, mimicking bees' hairs and motion, will then be used to pollinate flowers. Precision rover navigation, mapping, and localization of individual flowers within complex greenhouse environments will be provided through a fusion of multiple types of sensor measurements. A database will be automatically generated and updated by the robot, recording the history of flower development and pollination status. A human operator will collaborate with the robot by supplying agricultural domain knowledge, providing high-level decisions, and correcting mistakes made by the robot. The efficiency and throughput of the prototype pollinator robot will be evaluated through a comprehensive set of experiments comparing multiple pollination methods.

The successful completion of this research project will significantly impact the field of precision agriculture. First, a robotic pollinator bypasses many current issues with natural pollinators in agriculture, such as honeybee colony collapse disorder, pollinator parasites and diseases, predators, pesticide spraying, adverse weather, and the timely availability of pollinators. Second, robotic pollinators will improve fruit quality and production through applications such as selective pollination, selective abortion of flowers, and digital cataloging, condition tracking, and yield prediction for fruit production. Finally, the precision localization, evaluation, and manipulation of small and delicate plant parts provide fundamental capabilities for enabling a variety of other precision agriculture applications, such as automated irrigation, fertilization, and harvest; monitoring of plant damage; and weed and pest control.

The outcomes of this research project will be broadly disseminated to the research community, growers, and the general public. This project will also enhance regional educational and outreach activities and broaden the participation of students from underrepresented groups.
Animal Health Component
0%
Research Effort Categories
Basic
70%
Applied
30%
Developmental
(N/A)
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
402                   1123                             2020                     85%
205                   1123                             1130                     15%
Goals / Objectives
The goal of this research project is to develop a prototype pollinator robot and perform proof-of-concept demonstrations of its effectiveness for bramble pollination in a greenhouse environment. In support of this goal, the project involves the following four main research objectives:
1. Investigate the detailed mechanisms of pollination between bees and flowers in order to provide the knowledge and bio-inspiration for the pollination robot design.
2. Develop autonomous capabilities to precisely locate, evaluate, interact with, and manipulate small and delicate plant structures within unstructured, low-dynamic, and GPS-challenged greenhouse environments.
3. Perform system integration and proof-of-concept demonstrations of precision robotic pollination.
4. Perform a detailed evaluation of the prototype pollinator robot's efficiency and throughput as compared to existing pollination methods.
Project Methods
We will use a robotic arm mounted on an existing ground rover for precision flower access and manipulation. Computer vision algorithms, using images captured in both the visual and UV spectra, will be used to estimate the position, orientation, size, and condition of the flowers. A set of soft brush tips, mimicking bees' hairs (i.e., scopa) and motion, will then be used to pollinate flowers. The design parameters of the delicate robot-flower interface will be driven by a series of insect pollination experiments. Precision rover navigation, mapping, and localization of individual flowers within complex greenhouse environments will be provided through a fusion of GPS, Lidar, camera, inertial sensor, and wheel encoder measurements. A database will be automatically generated and updated by the robot, recording the history of flower development and pollination status. A human operator will collaborate with the robot by supplying agricultural domain knowledge, providing high-level decisions, and correcting mistakes made by the robot. This intelligent system will allow more selective, consistent, and uniform pollination, which has the potential to lead to better fruit set and production at a large scale. The evaluation of the project will be based on a study comparing yield and fruit quality under several conditions: no pollination (i.e., control), natural insect pollination, manual pollination, robotic pollination, and collaborative human-robot pollination. The effectiveness of pollination will be evaluated by determining the fruit yield per plant, fruit size, fruit weight, harvest time, and the overall distribution of fruit across a plant.
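Since the planned evaluation is a multi-treatment comparison, the analysis lends itself to a standard one-way ANOVA followed by a post-hoc test. The sketch below shows one way such a comparison could be run; the treatment labels follow the list above, but every number, and the analysis choice itself, is an illustrative assumption rather than project data or the project's actual statistical plan.

```python
# Hypothetical sketch of the planned treatment comparison; treatment names
# follow the list above, but all yield values are illustrative placeholders.
from scipy.stats import f_oneway

yields = {  # fruit yield per plant (g), one list per pollination treatment
    "control":     [4.1, 3.8, 4.5, 3.9, 4.2, 4.0],
    "insect":      [6.3, 6.8, 6.1, 6.5, 6.9, 6.4],
    "manual":      [7.0, 7.4, 6.9, 7.2, 7.5, 7.1],
    "robot":       [5.8, 6.0, 5.5, 6.2, 5.9, 6.1],
    "human+robot": [6.6, 6.9, 6.4, 6.7, 7.0, 6.5],
}

# One-way ANOVA across treatments; a significant F suggests at least one
# treatment mean differs, after which a post-hoc test (e.g., Tukey's HSD)
# would identify which pairs differ.
f_stat, p_value = f_oneway(*yields.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```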

Progress 11/15/16 to 02/14/21

Outputs
Target Audience: The results of this project were presented at several conferences and symposia, including the Entomological Society of America Annual Meetings, the IEEE/RSJ International Conference on Intelligent Robots and Systems, and other national and international venues. In addition, findings from the project were presented at 15 research seminars at different universities, NASA, and the National Institute for Occupational Safety and Health (NIOSH). The design of BrambleBee was also presented to 26 researchers who gave robotics seminar talks at West Virginia University (WVU). Applications of the pollinator robot were presented to stakeholder groups, including the WV Beekeepers Association and the WV Master Gardeners Conference, where more than 200 people attended. Current pollinator issues and the development of pollinator robots were presented at the Organic Farm Field Day at WVU, where about 50 growers, extension agents, and members of the public attended each year. The WVU greenhouse hosted numerous tours for K-12 students, undergraduate students, parents, Master Gardeners, and scientists attending the Midwest American Society of Plant Biologists (ASPB) conference. In those tours, the outline of the project and the importance of the research were explained. A multidisciplinary approach to addressing the current pollination decline was presented via 12 entomological outreach activities, including participation in the WV State Fair with the "WVU Insect Zoo", where more than 8,000 K-12 students, growers, and members of the public attended. The project was also reported by three grower trade magazines (Greenhouse Grower Magazine, Produce Grower Magazine, and Fruit Grower News - cover story). Robotic presentations and demonstrations were provided to the public through several outreach activities, such as a robotics lab open house, high school visitations, and the WVROX event for STEM outreach. During the COVID-19 pandemic, we gave virtual lab tours to other researchers, undergraduate students, high school students, and younger kids through activities such as "Girls in STEM".

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided? A total of 24 graduate students, 12 undergraduate students, and one postdoctoral research fellow were involved in the project. The team members received hands-on training through participation in this multidisciplinary research project. They also learned team-working and leadership skills when interacting with other project members. The research outcomes developed through the project were also integrated into the lecture materials for a Mobile Robotics course, an Autonomous Robot Systems course, an Introduction to Digital Image Processing course, and a Robotics Capstone Design course offered at WVU.

How have the results been disseminated to communities of interest? The results of this project have been disseminated to the science community through conference papers, posters, Ph.D. dissertations, M.S. theses, and seminar talks. The outcomes from the research were used in developing curriculum material for both undergraduate and graduate classes. The experimental videos were shared on YouTube, and an open-source robotic pollination simulator and data sets were also shared with the research community.
The project attracted significant media attention, with articles by Wired, Fast Company, NowThis Future, IEEE Spectrum, SpringWise, Growing Produce, National Geographic Italia, Digital Trends, and TechXplore, among others, and was the subject of a cover story in Fruit Grower News. It was also featured in videos by NASA 360. Project members participated in a live interview on the radio station J-WAVE and were interviewed as subject matter experts in the area of robotic pollination by CNN, AAAS Science, and ScienceNews.

What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals?

Entomology
The development of the end-effector of the precision pollination robot, BrambleBee, was bio-inspired by the Japanese hornfaced bee (Osmia cornifrons). Through examination of bee hairs using a scanning electron microscope (SEM), we were able to group and categorize seven different types of bee hairs: branched hair, side-branched hair, abdominal scopal hair, brush hair, basitarsal brush hair, chisel-tip brush hair, and enlarged-tip brush hair. The hair types could be grouped further into simple, compound, and complex hairs. Branched hair and side-branched hair (mostly responsible for pollination) were compound hairs, while enlarged-tip and chisel-tip hairs (mostly for cleaning the body) were complex hairs. Branches on compound hairs (about 10 µm in length) help increase the chance of collecting pollen. Our further investigation with SEM, together with observations of the bees' flower-visiting and nesting behaviors, revealed that the role of branched hair is to collect pollen by providing a large surface for pollen acquisition and heat conservation; we observed a considerable number of pollen grains adhered to or trapped by branched hairs. Abdominal scopal hairs were elongated, stiff setae slanted backward. Compared with branched hairs, they were thicker and spirally twisted. Most scopal hairs were about 1.2 mm long and stood about 40-60 µm apart, a spacing that packs the collected pollen for carrying to the nest for pollen provision. We found that goose feathers are branched and flexible, similar to the branched bee hairs responsible for retaining pollen. In addition, the pollen-gathering behavior of bees was examined using videography, and we found that bees (mason bee, bumblebee, and honeybee) moved around the center of a flower with different movement frequencies and patterns. A spiral movement of the end-effector could therefore be an important factor for collecting pollen.

Robotics
BrambleBee was built around a ClearPath Robotics Husky platform and a Kinova JACO2 robotic arm. It was instrumented with a Novatel SPAN GPS/INS unit, a Velodyne HDL-32E 32-channel 3D Lidar, a FLIR Blackfly-S camera outfitted with a fisheye lens, and an Intel RealSense D435 RGB-D camera mounted on the robotic arm. Several iterations of the robotic pollination end-effector were designed to mimic the methods bees use when pollinating flowers. The final end-effector design consists of three linear servos attached to a flexible 3D-printed plate outfitted with a soft membrane covered with goose feathers. Robot localization is performed primarily with a real-time 3D Simultaneous Localization and Mapping (SLAM) algorithm. A factor graph-based framework is used to fuse the data from the sensor suite on board BrambleBee to estimate the robot platform's states (i.e., position, orientation, and velocity). A time synchronization module was developed to mitigate time delays that occur during message transfer, and the raw point cloud scans from the Lidar sensor were de-distorted using the inertial measurements. These steps improved the performance and robustness of the point cloud matching algorithm. Computer vision algorithms were developed for flower detection and localization. The flower identification system performs the segmentation and classification of flowers using Mask R-CNN, trained with a set of flower data collected and hand-labeled by project team members.
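The report does not include code, but a minimal sketch of this kind of detector, using torchvision's off-the-shelf Mask R-CNN, might look as follows. The class count, score threshold, and placeholder input are assumptions for illustration, not the project's actual configuration or training data.

```python
# Minimal Mask R-CNN fine-tuning/inference sketch (torchvision); the project's
# dataset, classes, and thresholds are not reproduced here.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 2  # background + flower (assumed labeling scheme)

# Start from COCO-pretrained weights and swap in fresh heads for fine-tuning,
# the usual strategy when the labeled dataset is small. (Training loop omitted.)
model = maskrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
in_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_mask, 256, NUM_CLASSES)

model.eval()
image = torch.rand(3, 480, 640)  # placeholder for an RGB greenhouse image
with torch.no_grad():
    pred = model([image])[0]     # dict with 'boxes', 'labels', 'scores', 'masks'
keep = pred["scores"] > 0.5      # confidence threshold (assumed)
flower_boxes = pred["boxes"][keep]
flower_masks = pred["masks"][keep]  # per-instance soft masks, shape [N,1,H,W]
```

A fine-tuned model of this shape returns, for each image, the per-flower boxes and masks that the localization step described next consumes.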
The depth images corresponding to each flower are used to localize the flowers and improve the pose estimation resolution. The flower localization algorithm works by loosely modeling each flower as a plane. Using the point clouds computed from the depth images, the best-fit plane is computed; the normal to the plane represents the orientation of each flower, and the centroid of the point cloud represents its position.

The flower manipulation algorithms were developed for BrambleBee to pollinate flowers after it parks in front of a crop. The process starts by mapping the flowers and obstacles in the workspace. This is achieved by maneuvering the end-effector through a set of poses that cover the workspace to identify flowers, estimate the flower poses, and map out the obstacles in the workspace. After mapping the workspace, a trajectory is planned for the end-effector through a set of vantage points in front of each flower to refine the pose estimates. The end-effector then aligns itself to the flower and activates the visual servoing procedure to guide the end-effector toward the flower until contact is made. Once the flower is reached, the precision pollination procedure is executed, which actuates the end-effector to perform a motion that facilitates pollen transfer. This process is repeated until all flowers in the workspace are pollinated. Experiments with artificial flowers showed that the robot is capable of operating with high precision, achieving 93.1% detection accuracy and a 76.9% estimated pollination success rate on average. Limited experiments on real flowers were also performed.

During the COVID-19 pandemic, physical testing of robots was limited. An open-source pollination simulator was therefore developed using Gazebo in conjunction with ROS. This provided a high-fidelity environment that could run the same code as the physical robots, using simulated hardware interfaces. The Gazebo environment uses Universal Robot Description Format (URDF) files as simulation models for the robots. The robot's sensors, such as the IMU, Lidar, and encoders, are also modeled in the simulation environment, with uncertainty and noise factored in. The greenhouse testing facility, plants, and flowers were modeled and then exported with texture images into Gazebo. All the previously developed algorithms, such as localization, mapping, motion planning, control, and the flower classifier, were ported over to the simulation environment.

Horticulture
Two blackberry cultivars, 'Darrow' and 'Prime-Ark Freedom', were selected for the robotic experiments. A trellis was built to facilitate movement of the robot and to support the bee netting in the greenhouse. To determine the optimal time for pollination, pollen production and viability tests were conducted using pollen collected at different stages of flower development. Pollen was collected by shaking the flower and was placed on germination media. Pollen was incubated for three hours at the same temperature at which the plants were grown (23.9 ± 1.8 / 20.2 ± 1.6 °C, day/night ± st. dev.). Pollen production was evaluated by counting the number of pollen grains. Pollen was considered viable if the pollen tube was longer than the diameter of the pollen grain. Pollen length was measured using the imaging software cellSens™ 1.6 (Olympus America, Inc., Center Valley, PA). Differences in pollen viability were determined by Tukey's significance test at P ≤ 0.05.
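As an illustration of the post-hoc comparison named above, a Tukey HSD test on viability grouped by flower age could be run as follows; the numbers are placeholders, not the project's measurements.

```python
# Hedged sketch of a Tukey HSD comparison of pollen viability by flower age;
# all values are illustrative placeholders, not the project's data.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

viability = np.array([0.05, 0.04, 0.06,   # 1 day after anthesis
                      0.88, 0.85, 0.90,   # 2 days
                      0.80, 0.78, 0.83,   # 3 days
                      0.27, 0.25, 0.30])  # 4 days
days = np.repeat(["day1", "day2", "day3", "day4"], 3)

result = pairwise_tukeyhsd(viability, days, alpha=0.05)  # P <= 0.05, as above
print(result.summary())  # pairwise mean differences and reject/accept flags
```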
We found that viable pollen can be collected two and three days after anthesis (flower opening). Anthers had not yet dehisced one day after anthesis, and pollen collected four days after anthesis had significantly lower viability. Two methods of pollination (using the robot's end-effector under human control, and hand pollination) were compared to a control in which flowers were not intentionally pollinated. There was no difference in the number of drupelets between the no-pollination control and robot pollination. Robot pollination produced larger berries than the control, although hand pollination produced the highest number of drupelets and the largest fruit. Due to the COVID-19 pandemic, comprehensive robotic pollination evaluations that require multiple participants became difficult to perform. We therefore focused effort on developing a simulator to facilitate the development and sharing of the robotic pollination technology.

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Precision pollinator robot. Annual Meeting of West Virginia Entomological Society, Cairo, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Bees: current issues, future prediction, and mitigation. Annual West Virginia Extension and Master Gardeners Conference, Roanoke, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Mason bees: propagation and management. West Virginia Panhandle Beekeepers Association Meeting, Martinsburg, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Pollinator robot inspired by structure and behavior of Osmia bees (Hymenoptera: Megachilidae). Annual Meeting of Entomological Society of America, Denver, CO.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Ohi, N., Lassak, K., Watson, R., Strader, J., Du, Y., Yang, C., Hedrick, G., Nguyen, J., Harper, S., Reynolds, D., Kilic, C., Hikes, J., Mills, S., Castle, C., Buzzo, B., Waterland, N., Gross, J., Park, Y., Li, X., Gu, Y., Design of an Autonomous Precision Pollination Robot, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, October, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Park, Y.-L. 2018. Osmia cornifrons as a model for pollinator robot. The Apicultural Society of Korea Annual Conference. Gwangju, Korea.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Strader, J., Nguyen, J., Tatsch, C., Du, Y., Lassak, K., Buzzo, B., Watson, R., Cerbone, H., Ohi, N., Yang, C., Gu, Y., Flower Interaction Subsystem for a Precision Pollination Robot, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019, Macau, China, Nov 2019.
  • Type: Theses/Dissertations Status: Published Year Published: 2019 Citation: Du, Y., Selected Topics in Computer Vision with Deep Learning, West Virginia University Ph.D. Dissertation, May 2019
  • Type: Conference Papers and Presentations Status: Published Year Published: 2020 Citation: Mills, S. A., Gu, Y., Gross, J., Li, X., Park, Y. L., & Waterland, N. L. Evaluation of an Autonomous Robotic Pollinator. American Society for Horticultural Science Annual Conference. August 10-14, 2020. (Virtual presentation)
  • Type: Other Status: Published Year Published: 2020 Citation: Mills, S. A., Gu, Y., Gross, J., Li, X., Park, Y. L., & Waterland, N. L. 2020. Evaluation of an Autonomous Robotic Pollinator. HortScience (Abstr.)
  • Type: Theses/Dissertations Status: Published Year Published: 2019 Citation: Watson, R., Enabling Robust State Estimation through Covariance Adaptation, WVU Ph.D. Dissertation, Dec 2019.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Yang, C., Watson, R., Gross, J., Gu, Y., Localization Algorithm Design and Evaluation for an Autonomous Pollination Robot, ION GNSS+ Conference, Miami, FL, Sep 2019.
  • Type: Theses/Dissertations Status: Published Year Published: 2018 Citation: Hikes, J. Calibration of Cameras and LiDAR for Applications in Autonomous Navigation, West Virginia University Masters Thesis, December 2018.
  • Type: Theses/Dissertations Status: Awaiting Publication Year Published: 2021 Citation: Buzzo, M., Design and Practical Control Methods for Soft Parallel Robots
  • Type: Theses/Dissertations Status: Published Year Published: 2020 Citation: Tatsch, C., Route Planning for Long-term Robotics Missions, WVU M.S. Thesis, 2020


Progress 11/15/19 to 11/14/20

Outputs
Target Audience: Findings from this project were presented at an annual national conference, the American Society for Horticultural Science, which is attended by students, faculty, and industry scientists, and at a department seminar at West Virginia University. Due to COVID-19, other outreach activities were limited during this period. We gave virtual lab tours to other researchers, undergraduate students, and high school students.

Changes/Problems: Due to the COVID-19 pandemic, complex robot experiments that require multiple participants became very difficult to perform. To partially mitigate this issue, we developed a robotic pollination simulation to continue developing and testing the precision robotic pollination technology.

What opportunities for training and professional development has the project provided? During this project year, a total of eight graduate students, four undergraduate students, and one postdoctoral research fellow were involved in the project. The team members received hands-on training through participation in this multidisciplinary research project. They also learned team-working and leadership skills when interacting with other project members.

How have the results been disseminated to communities of interest? During this project year, the results of this project were disseminated to the science community through a conference paper, a Ph.D. dissertation, and a seminar talk. The outcomes from the research were used in developing curriculum material for an Introduction to Digital Image Processing class and a Mobile Robotics class. The WVU greenhouse hosted a tour for undergraduate horticulture students, during which a summary of the project and the importance of the research were explained. The team members also gave several robotics lab tours (either physical or virtual) and presentations to high school students.

What do you plan to do during the next reporting period to accomplish the goals? We will continue to develop the pollination robotics simulation, leading to an open release of the code. We will also write manuscripts on the robotics technology and on bee hair structure and foraging behavior.

Impacts
What was accomplished under these goals?

Entomology
To develop an effective end-effector for the BrambleBee robot, we examined detailed hair structure and pollen adhesion to bee hairs using a scanning electron microscope (SEM). Through examination of bee hairs on the body of Osmia cornifrons, the Japanese hornfaced bee, we were able to group and categorize seven different types of bee hairs: branched hair, side-branched hair, abdominal scopal hair, brush hair, basitarsal brush hair, chisel-tip brush hair, and enlarged-tip brush hair. The seven hair types could be grouped further into simple, compound, and complex hairs. Branched hair and side-branched hair (mostly responsible for pollination) were compound hairs, while enlarged-tip and chisel-tip hairs (mostly for cleaning the body) were complex hairs. Branches on compound hairs (about 10 µm in length) help increase the chance of collecting pollen. We found that the role of branched hair was to collect pollen by providing a large surface for pollen acquisition and heat conservation; we observed a considerable number of pollen grains adhered to or trapped by branched hairs. Abdominal scopal hairs were elongated, stiff setae slanted backward. Compared with branched hairs, they were thicker and spirally twisted. Most scopal hairs were about 1.2 mm long and stood about 40-60 µm apart, a spacing that packs the collected pollen for carrying to the nest for pollen provision. All the brush hairs on the trochanter, femur, and tibia were short (about 0.3 mm), twisted, and pointed at the tip. Enlarged-tip brush hairs had spatula-like tips and were found on the tibia and basitarsus. Chisel-tip brush hairs were located on the tibia and femur of the hind leg. These brushes were used mostly for grooming, but we also found many pollen grains on them.

BrambleBee Robot System
Pollination End-Effector Design: A new iteration of the robotic pollination end-effector was designed to mimic the methods that natural pollinators use when pollinating flowers. The manipulator consists of three linear servos attached to a flexible 3D-printed plate outfitted with a soft membrane that collects and distributes pollen onto the flowers. It is controlled using a lookup table constructed from known positions of the flexible plate for given motor outputs.

Robot Localization: The localization algorithm received two improvements. First, since our localization system is built on the Robot Operating System (ROS) and time delays occur during message transfer, a time synchronization module was implemented after receiving messages from the sensors. With the assistance of the synchronization module, the motion estimates are more stable than our previous results. Second, the raw point cloud scans from the Lidar sensor are de-distorted using the inertial measurements from the INS. This step improves the performance of the point cloud matching algorithm.

Computer Vision for Flower Detection and Localization: The improved computer vision system includes 1) a new set of data collected in the greenhouse, 2) a new classifier trained on the collected dataset, and 3) a new method for localizing the flowers with increased precision using a combination of depth and color images. The flower identification system replaces the segmentation and classification steps of the previous system with a single step using Mask R-CNN. This method provides both a bounding box containing the segmented flowers and a mask (or binary image) identifying the segmented flowers.
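The scan de-distortion mentioned under Robot Localization above is, at its core, per-point motion compensation within a single Lidar sweep. A minimal sketch of the idea follows, simplified to a constant-velocity planar model rather than the project's INS-driven correction; the function and all names are illustrative.

```python
# Minimal Lidar scan de-skew sketch: each point is shifted according to the
# platform motion accrued between its own timestamp and the end of the scan.
# Constant linear/angular velocity over one sweep is assumed for simplicity;
# the project used inertial measurements, which this only approximates.
import numpy as np

def deskew_scan(points, timestamps, linear_vel, yaw_rate):
    """points: (N,3) array; timestamps: (N,) seconds within the sweep."""
    t_end = timestamps.max()
    out = np.empty_like(points)
    for i, (p, t) in enumerate(zip(points, timestamps)):
        dt = t_end - t                    # time left until scan end
        dtheta = yaw_rate * dt            # yaw accrued over dt
        c, s = np.cos(dtheta), np.sin(dtheta)
        R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
        out[i] = R @ p + linear_vel * dt  # move point into end-of-scan frame
    return out
```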
The depth images corresponding to each flower are now used to localize the flowers and improve the pose estimation resolution. The flower localization algorithm works by loosely modeling each flower as a plane. Using the point clouds computed from the depth images, the best-fit plane is computed; the normal to the plane represents the orientation of each flower, and the centroid of the point cloud represents its position. Each plane normal is converted to a transformation that can be used to move the end-effector to the center of each flower for pollination.

Robust Pollination under Uncertainty: One of the key objectives of a precision pollination robot is robust pollination under a variety of sources of uncertainty. The computed flower pose is often inaccurate due to noisy sensors and estimation errors, and our experiments have shown that the primary mode of failure is poor estimation of the flower pose. The goal of this effort is to arrive at a principled approach that improves the success rate of the pollination task. To achieve that goal, the research team is actively investigating two approaches: 1) a combined planning and learning approach, in which the problem is modeled as a Partially Observable Markov Decision Process (POMDP); and 2) framing the pollination task as a hard exploration problem, in which a single positive reward is given only when the robot successfully pollinates the flower.

Simulator Development: Given the COVID-19 pandemic, physical testing of robots has been limited. A pollination simulator is being developed using Gazebo in conjunction with ROS. This provides a high-fidelity environment that can run the same code as the physical robots, using simulated hardware interfaces. The Gazebo environment uses Universal Robot Description Format (URDF) files as simulation models for the robots. The robot's sensors, such as the IMU, Lidar, and encoders, are also modeled in the simulation environment, with uncertainty and noise factored in. The greenhouse testing facility, plants, and flowers were modeled in Blender, then exported with texture images as COLLADA files that can be used directly in Gazebo. This textured design allows for the future use and testing of different flowers. All the previously developed algorithms, such as localization, mapping, robot motion planning, control, and the flower classifier, can be ported over to the simulated environment with minimal changes, limited to accounting for the differences between the physical and modeled sensors.

Horticulture
To determine the optimal time for pollination, pollen production and viability tests were conducted using pollen collected at different stages of flower development. Pollen was collected by shaking the flower and was placed on germination media. Pollen was incubated for three hours at the same temperature at which the plants were grown (23.9 ± 1.8 / 20.2 ± 1.6 °C, day/night ± st. dev.). Pollen production was evaluated by counting the number of pollen grains at 4X magnification using an Olympus BX53 microscope and DP26 digital camera (Olympus America, Inc., Center Valley, PA). Pollen was considered viable if the pollen tube was longer than the diameter of the pollen grain. Pollen length was measured using the imaging software cellSens™ 1.6 (Olympus America, Inc., Center Valley, PA). Data were analyzed by PROC GLM using SAS version 9.3 (SAS Institute, Inc., Cary, NC).
Differences in pollen viability were determined by Tukey's significance test at P ≤ 0.05. We found that viable pollen can be collected two and three days after anthesis (flower opening). Anthers had not yet dehisced one day after anthesis, and pollen collected four days after anthesis had significantly lower viability. Two methods of pollination (the robot's end-effector and hand pollination) were compared to a control in which flowers were not intentionally pollinated. There was no difference in the number of drupelets between the no-pollination control and robot pollination. Robot pollination produced larger berries than the control, although hand pollination produced the highest number of drupelets and the largest fruit.
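Returning to the flower localization described earlier in this section: the plane-fit pose estimate reduces to a small amount of linear algebra. A minimal numpy sketch, with an assumed camera-frame convention for orienting the normal, is:

```python
# Minimal sketch of the plane-based flower pose estimate described above: fit
# a plane to the flower's depth points by SVD, take the centroid as the flower
# position and the plane normal as its orientation.
import numpy as np

def flower_pose(points):
    """points: (N,3) array of 3D points back-projected from the depth image."""
    centroid = points.mean(axis=0)
    # Singular vector with the smallest singular value = best-fit plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    if normal[2] > 0:        # orient the normal toward the camera (an assumed
        normal = -normal     # camera-frame convention, not the project's)
    return centroid, normal
```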

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2020 Citation: Mills, S. A., Gu, Y., Gross, J., Li, X., Park, Y. L., & Waterland, N. L. Evaluation of an Autonomous Robotic Pollinator. American Society for Horticultural Science Annual Conference. August 10-14, 2020. (Virtual presentation)
  • Type: Other Status: Awaiting Publication Year Published: 2020 Citation: Mills, S. A., Gu, Y., Gross, J., Li, X., Park, Y. L., & Waterland, N. L. 2020. Evaluation of an Autonomous Robotic Pollinator. HortScience (Abstr.)
  • Type: Theses/Dissertations Status: Published Year Published: 2019 Citation: Watson, R., Enabling Robust State Estimation through Covariance Adaptation, WVU Ph.D. Dissertation, Dec 2019.


Progress 11/15/18 to 11/14/19

Outputs
Target Audience: Findings from this project were presented at seminars hosted by several universities and research institutions, one entomological conference, one international symposium, and two robotics/navigation conferences. In addition, a multidisciplinary approach to addressing the current pollination decline was presented via seven entomological outreach activities, including participation in the WV State Fair with the "WVU Insect Zoo", where more than 8,000 K-12 students, growers, and members of the public attended. We hosted numerous (>12) tours at the greenhouse and robotics labs introducing the ideas and importance of the research project. Attendees included K-12 students, undergraduate students, parents, Master Gardeners, and scientists attending the Midwest American Society of Plant Biologists (ASPB) conference.

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided? During project Year 3, a total of 10 graduate students, 12 undergraduate students, and one postdoctoral research fellow were involved in the research. The students received hands-on training through participation in this multidisciplinary research project. They also learned team-working and leadership skills when interacting with other project members.

How have the results been disseminated to communities of interest? The results of this project have been disseminated to the public and science communities through conference papers, student theses, and several technical presentations. Numerous outreach activities with the WVU Insect Zoo, the greenhouse, and the robotics labs were also performed. Materials generated from the project were also integrated into three courses at WVU: EE 465 (Intro. to Digital Image Processing), MAE/CpE 412 (Mobile Robotics), and MAE 593B (Autonomous Robot Systems).

What do you plan to do during the next reporting period to accomplish the goals? A research problem in the area of robotic perception within the greenhouse environment that the team plans to continue investigating is the use of degraded GPS signals for robot localization. The team recently recorded raw RF signals for processing with a software-defined GPS receiver while BrambleBee was moving in the greenhouse environment. Our plan is to conduct a study that compares the robot's Lidar-based localization system with a GPS solution in which the signals are processed by a robust estimation approach recently developed at WVU. Additional research will be performed on planning appropriate robot actions to improve the estimation of flower poses and on refining the planning algorithms to enable a fully autonomous sequence of greenhouse inspection, parking location selection, and flower pollination. In the spring, we will assess the performance of BrambleBee in performing pollination activities. We will compare five methods of pollination: bee pollination, manual pollination, autonomous robot pollination, a human-robot team, and a no-pollination control. The effectiveness of pollination will be evaluated by determining the fruit yield per plant, fruit size, fruit weight, harvest time, and distribution of fruit across a plant.

Impacts
What was accomplished under these goals?

Horticulture
Two cultivars of blackberry, 'Darrow' and 'Prime-Ark Freedom', were grown in the WVU greenhouse. To determine the optimal window for pollination, pollen was collected from flowers at different stages of flower development. Pollen was collected by shaking the flower and plated on germination media as described by Brevis et al. (2006). Pollen was incubated for three hours at the same temperature at which the plants were grown (24 °C). Pollen production was evaluated by counting the number of pollen grains using an Olympus BX53 microscope at 4X equipped with a DP26 digital camera. Pollen was considered viable if the pollen tube was longer than the diameter of the pollen grain, and pollen length was measured using the imaging software cellSens™ 1.6. Data were analyzed by PROC GLM using SAS version 9.3. Differences in pollen viability among flowers of different ages were determined by Tukey's significance test at P < 0.05. We found that viable pollen could be collected from flowers two days (88%) and three days (80%) after anthesis (flower opening). No pollen was collected from flowers one day after anthesis because the anthers had not dehisced (released their pollen), and pollen collected from flowers four days after anthesis had significantly lower viability, yielding only 27% viable pollen.

Entomology
The development of an end-effector for BrambleBee is in part inspired by the model bee of this project, the Japanese hornfaced bee (Osmia cornifrons). The fine structure and function of hairs on the bee were examined using scanning electron microscopy (SEM). We found that the hairs could be classified into two major groups: branched hairs and unbranched hairs (scopa). Branched hairs ranged from 7 µm to 1 mm in length and were found on almost all body parts of the bee. Based on the results of the SEM examination, we speculated that this hair type is unlikely to be a sensory receptor but might play a role in the removal of pollen during grooming. Unbranched hairs were found on the abdomen and legs. Abdominal scopa were simple, long (1 mm) hairs and were clustered. There were three different shapes of unbranched hairs on the legs; most were thin, long brush hairs whose role was grooming and collecting pollen. To identify materials that mimic bee hairs for the development of the BrambleBee end-effector, we examined paintbrushes and feathers for comparison with bee hairs. Paintbrushes are commonly used to pollinate flowers artificially, and we found that their bristles were long, thin, and tapered, very similar to the scopa found on the bee abdomen. Goose feathers were branched and flexible, like the branched hairs of bees responsible for retaining pollen. The pollen-gathering behavior of bees was examined using videography, and we found that all three bees studied (mason bee, bumble bee, and honey bee) moved around the center of a flower. A spiral movement of the end-effector could therefore be an important factor for collecting pollen. A pollination end-effector was designed based on these findings.

BrambleBee Robot Perception
Machine learning algorithms, including convolutional neural networks, are used to solve the problem of flower detection in an image. To train the neural network, we created a new dataset of bramble flowers this year: we collected around 2,000 images and manually annotated the flowers in each image. Given the relatively small size of our dataset, we opted for the strategy of fine-tuning a pretrained model.
We chose Mask R-CNN (He et al., 2017), pre-trained on the COCO dataset (Lin et al., 2014). To enable robotic pollination, an accurate method for inferring a flower's pose (i.e., its position and orientation) is required. To facilitate this inference on board BrambleBee, a pose estimator that uses an imaging system on the end of the robotic arm was developed. The eye-in-hand imaging system provides RGB and depth information. Using the RGB information, the flowers within the larger image scene can be segmented. Then, within the segmented scene, the depth information can be utilized to infer the flower's pose through a robust plane fitting technique (Fusiello, 2006). In addition to the estimated pose, the associated uncertainty can also be inferred through the eigen-decomposition associated with the plane fitting technique (Liounis & Christian, 2015). This framework for estimating a flower's pose and its associated uncertainty has been experimentally validated.

Another improvement to the BrambleBee robot during the reporting period was the determination of an extrinsic calibration (i.e., rotation and translation) between the 3D Lidar sensor and the downward-facing fisheye camera. The team employed an automated calibration tool that uses a simple box in view of both sensors (Lyu et al., 2019) and developed a dedicated calibration board that includes fiducials easily identifiable by both sensors (Hikes, 2018). An additional study was a performance analysis of BrambleBee's Simultaneous Localization and Mapping (SLAM) system. Indoor and outdoor field tests were conducted, and the algorithm, software implementation, and experimental data were made available to the research community (Yang et al., 2019).

BrambleBee Autonomy and Testing
Emphasis has been placed on the refinement of the pollination subsystem (Strader et al., 2019), which is activated after BrambleBee is positioned in front of the plants. The system starts by mapping the flowers and obstacles in the workspace. This is achieved by maneuvering the end-effector through a set of poses that cover the workspace to identify flowers, estimate the flower poses, and map out the obstacles in the workspace. The resulting obstacle map is used to avoid possible collisions that could damage the plant or the robot. After mapping the workspace, a trajectory is planned for the end-effector through a set of vantage points in front of each flower to refine the pose estimates. The end-effector then aligns itself to the flower and activates the visual servoing procedure to guide the end-effector toward the flower until contact is made. Once the flower is reached, the precision pollination procedure is executed, which actuates the end-effector to perform a motion that releases pollen from the anthers of the flower. This process is repeated until all flowers in the workspace are pollinated. Experiments with artificial flowers show that the robot is capable of operating with high precision and is able to achieve 93.1% detection accuracy and a 76.9% estimated pollination success rate on average (Strader et al., 2019). Limited experiments on real flowers were also performed. The capabilities of the developed system are demonstrated in: https://youtu.be/ZbgtP9CHycA.

References
Brevis, P. A., et al. (2006). Production and viability of pollen and pollen-ovule ratios in four rabbiteye blueberry cultivars. Journal of the American Society for Horticultural Science.
Fusiello, A. (2006). Elements of geometric computer vision. Available: http://homepages.inf.ed.ac.uk/rbf/CVonline/LOCAL_COPIES/FUSIELLO4/tutorial.html
He, K., et al. (2017). Mask R-CNN. IEEE International Conference on Computer Vision (pp. 2961-2969).
Hikes, J. (2018). Calibration of Cameras and LiDAR for Applications in Autonomous Navigation. WVU Master's Thesis.
Lin, T., et al. (2014). Microsoft COCO: Common Objects in Context. arXiv:1405.0312.
Liounis, A. J., & Christian, J. A. (2015). Techniques for Generating Analytic Covariance Expressions for Eigenvalues and Eigenvectors. IEEE Transactions on Signal Processing.
Lyu, Y., et al. (2019). An Interactive LiDAR to Camera Calibration. arXiv:1903.02122.
Strader, J., et al. (2019). Flower Interaction Subsystem for a Precision Pollination Robot. IEEE/RSJ IROS.
Yang, C., et al. (2019). Localization Algorithm Design and Evaluation for an Autonomous Pollination Robot. 32nd ION GNSS+.

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Strader, J., Nguyen, J., Tatsch, C., Du, Y., Lassak, K., Buzzo, B., Watson, R., Cerbone, H., Ohi, N., Yang, C., Gu, Y., Flower Interaction Subsystem for a Precision Pollination Robot, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019, Macau, China, Nov 2019.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Yang, C., Watson, R., Gross, J., Gu, Y., Localization Algorithm Design and Evaluation for an Autonomous Pollination Robot, ION GNSS+ Conference, Miami, FL, Sep 2019.
  • Type: Theses/Dissertations Status: Accepted Year Published: 2018 Citation: Hikes, J. Calibration of Cameras and LiDAR for Applications in Autonomous Navigation, West Virginia University Masters Thesis, December 2018.
  • Type: Theses/Dissertations Status: Accepted Year Published: 2019 Citation: Du, Y., Selected Topics in Computer Vision with Deep Learning, West Virginia University Ph.D. Dissertation, May 2019


Progress 11/15/17 to 11/14/18

Outputs
Target Audience: Findings from the project were presented at an entomological conference, four U.S. institutions, and two Korean institutions during project Year 2. The importance of bees and the pollination robot was presented via outreach with the "WVU Insect Zoo" to more than 200 K-12 students, growers, and members of the public. The WVU greenhouse hosted four tours, including for elementary students and Master Gardeners, in which the outline of the project and the importance of the research were explained. Robotic presentations and demonstrations were also provided to the public through several outreach activities, including but not limited to a robotics lab open house, high school visitations, and the WVROX event for STEM outreach. The project attracted considerable media attention, with articles written by Wired, Fast Company, and NowThis Future. The project was also reported by three grower trade magazines (Greenhouse Grower Magazine, Produce Grower Magazine, and Fruit Grower News - cover story).

Changes/Problems: Nothing Reported

What opportunities for training and professional development has the project provided? During project Year 2, a total of 14 graduate students and 13 undergraduate students were involved in the research. The students received hands-on training through participation in this interdisciplinary research project. They also learned team-working and leadership skills when interacting with other project members.

How have the results been disseminated to communities of interest? The results of this project have been disseminated to the public and science communities through two conference papers and several technical presentations. Numerous outreach activities with the WVU Insect Zoo, the WVU greenhouse, and the WVU robotics labs were also performed.

What do you plan to do during the next reporting period to accomplish the goals? As outlined in the original proposal, project Year 3 will focus on: 1) integrating the full pollinator robot system with the ability to complete the entire pollination sequence autonomously; and 2) evaluating the efficacy and efficiency of multiple pollination methods. In terms of robot system development and integration, the emphasis will be on improving initial robot pose (i.e., position and orientation) estimation using a prior map of the greenhouse, reliably detecting and estimating the poses of individual flowers in dense flower clusters, optimizing planning algorithms for both the drive base and the arm, refining the pollination end-effector design, extending autonomy capabilities to make flexible pollination decisions, and refining the final sequence of pollination actions on real flowers. The evaluation of the prototype pollinator robot's effectiveness will be performed in the WVU greenhouse. Five treatments (bee pollination, manual pollination, autonomous robot pollination, mixed human-robot teaming in which a human performs supplementary manual pollination tasks, and a no-pollination control) will be performed, and the efficiency of each pollination method will be compared. The effectiveness of pollination will be evaluated by determining the fruit yield per plant, fruit size, fruit weight, harvest time, and overall distribution of fruit across a plant.

Impacts
What was accomplished under these goals?

Horticulture
Two cultivars of blackberry, 'Darrow' and 'Prime-Ark Freedom', were grown in the WVU greenhouse. 'Darrow' is a thorny floricane blackberry that flowers in the spring, while 'Prime-Ark Freedom' is a thornless primocane that flowers in late summer. These cultivars were selected to provide flowers over two seasons, so that BrambleBee would have more opportunities for testing. Flowers were photographed to determine pollen development. Pollen is immature when flowers begin to open; pollination attempts prior to pollen release should be avoided, as they could damage the flower. Identification of pollination-ready flowers with viable pollen is a crucial step for successful pollination.

Entomology
To provide insight into bio-inspired robotic arm movement, the mechanisms of pollination between bees and flowers were investigated. An unmanned aerial system was used to map bramble flowers, and videography was used to trace bees' movement among flowers. Although videography provided details on a bee's movement on a flower, tracing bees among flowers and plants was very challenging because bees were not readily distinguishable from the background. To overcome this problem, we adopted thermal imaging to track the bees. We found that when bees were not active (i.e., resting or sleeping), their body temperature was not distinguishable from the background because bees are poikilothermic animals. However, when bees were actively foraging, their thorax temperature increased by 4-6 °C. This study indicates that bees' movement among plants can be readily traced and modeled to provide biological data for designing and planning the robotic arm movement of BrambleBee. We also continued to study the structure of the bee hairs used for pollen gathering (i.e., scopa, the hairs under the abdomen) to acquire parameters that can support the design of the BrambleBee end-effector. We found that the location and general structure of the scopa differed among Osmia bee species. The efficiency of the hairs is being investigated by measuring the pollen load in the scopa of each species. In addition, we investigated the mechanism of pollen loading from flowers and unloading into nests. We found that bees used their middle and hind legs to gather and pack pollen inside the scopa. These legs have evolved to be flattened, and the bees used them to sweep pollen from the anthers. When pollen was unloaded, the legs moved in the opposite direction to sweep the pollen from the scopa into the nest cells.

BrambleBee Localization and Mapping
Robot localization is performed primarily with a real-time 3D Simultaneous Localization and Mapping (SLAM) algorithm. A factor graph-based framework is used to fuse the data from the sensor suite on board BrambleBee to estimate the robot platform's states (i.e., position, orientation, and velocity). To improve the reliability of loop-closure detection, each incoming point cloud scan from the 3D Lidar is matched with a prior 3D map of the greenhouse using the Generalized Iterative Closest Point (Generalized-ICP) algorithm. If the percentage of matching points in the incoming scan is above a user-defined threshold, the loop-closure detection is marked as successful. A transformation of the robot's current pose with respect to the origin of the prior map's coordinate frame (i.e., the global frame) is then added to the factor graph as an observation link, and the factor graph is optimized based on the new link.
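The report does not name the factor-graph library used, but the formulation above maps directly onto pose-graph tooling such as GTSAM. A minimal 2D sketch follows, with odometry between-factors and a prior-like factor standing in for a successful scan-to-prior-map match; all poses and noise values are illustrative.

```python
# Minimal GTSAM pose-graph sketch: odometry between-factors plus a prior-like
# factor standing in for the Generalized-ICP match against the prior map.
# All numbers are illustrative; the project's actual graph fuses more sensors.
import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
map_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02, 0.02, 0.01]))

# Anchor the first pose, then chain odometry measurements (x, y, yaw).
graph.add(gtsam.PriorFactorPose2(0, gtsam.Pose2(0, 0, 0), map_noise))
graph.add(gtsam.BetweenFactorPose2(0, 1, gtsam.Pose2(1.0, 0.0, 0.0), odom_noise))
graph.add(gtsam.BetweenFactorPose2(1, 2, gtsam.Pose2(1.0, 0.0, 0.0), odom_noise))

# A successful scan-to-prior-map match adds a global "observation link":
graph.add(gtsam.PriorFactorPose2(2, gtsam.Pose2(2.1, 0.05, 0.0), map_noise))

initial = gtsam.Values()
for i, pose in enumerate([(0, 0, 0), (1.1, 0.1, 0.0), (2.2, 0.1, 0.0)]):
    initial.insert(i, gtsam.Pose2(*pose))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose2(2))  # refined global pose after optimization
```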
A 2D occupancy grid map is also generated from the 3D SLAM map for path planning.

BrambleBee Flower Detection
Flower detection is accomplished using computer vision techniques, with the downward-looking fisheye camera for initial, long-range identification and then the RGB-D camera on the robotic arm for precise, short-range positioning of the flowers. The long-range algorithm can be separated into two parts. First, a rough segmentation is performed based on color to extract the parts of the image that are most likely to belong to a flower. This is done with a naive Bayes classifier that assigns pixels as belonging or not belonging to flowers. The resulting segmentation produces a number of false positives due to colors shared between flowers and other objects. Thus, an approach based on transfer learning is adopted to distinguish between true and false positives from the initial segmentation. After the flowers are identified using the described approach, they are positioned in the map using unit vectors pointing from the camera to each flower. The short-range algorithm, using the RGB-D camera on the arm, performs a reconstruction of the plant using real-time dense SLAM. After the plant is reconstructed, the previously presented classification algorithm is used to identify the flowers in the series of images used for reconstruction.

BrambleBee Autonomy
Given information about the locations of flower clusters and the robot's location within the environment, BrambleBee makes efficient plans to pollinate the plants in the greenhouse. First, to obtain up-to-date information about flower cluster locations and pollination readiness, BrambleBee makes an "inspection pass" of the greenhouse. The inspection path is generated by first discretizing the greenhouse environment into a Voronoi graph and then solving for a path through the nodes of the graph. The path is found using visibility graph techniques and is constrained so that the robot inspects all rows of plants while maintaining a balance between distance driven and safety from obstacles. As BrambleBee drives the inspection path, nearby flower clusters are detected using the on-board fisheye camera, and the locations of the detected clusters are recorded in a map of the plant rows. Each row of plants is discretized into an equal number of "grid cells" on both sides of the row. The count of flower clusters intersecting a particular grid cell is updated whenever the robot is within reliable viewing range of that cell. After the inspection phase is completed and flower locations have been identified, BrambleBee decides where to go to pollinate flowers. Pollination locations are chosen by balancing the number of reachable flower clusters ready to be pollinated against the distance driven in the drive base's configuration space. The order in which pollination locations are visited is found using a greedy heuristic that selects the next grid cell with the minimum cost. Paths are then planned to reach these locations efficiently while avoiding obstacles.

BrambleBee Manipulation
Once the robot arrives at a pollination location, BrambleBee uses its robotic arm to precisely map individual flowers in the local workspace and then plans and executes an efficient sequence of actions to pollinate all reachable flowers.
The workspace mapping is performed by moving the end-effector of the robotic arm in a survey pattern around the nearby plant while the poses of detected flowers are estimated and recorded in a local database using the depth camera on the end of the arm. Once all flowers in the local workspace have been identified, the sequence of flowers to pollinate is chosen by finding the ordering that minimizes the total distance traveled in the arm's joint configuration space. After the sequence has been determined, collision-free paths are planned to reach observation positions directly in front of each flower. Once the end-effector arrives at one of these goal destinations, it is parked in front of the flower, ready to perform the precise final approach and pollinate the flower. During the final approach maneuver, visual servoing is performed to guide the tip of the end-effector precisely into the flower.
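The final-approach visual servoing can be viewed as a simple proportional law that centers the detected flower at the image point where the end-effector tip projects while creeping forward. A minimal sketch, with assumed gains and frame conventions, is:

```python
# Minimal image-based visual servoing sketch for the final approach: a
# proportional law drives the detected flower center toward the image point
# where the end-effector tip projects. Gains and conventions are assumptions.
import numpy as np

K_LATERAL = 0.002   # m/s per pixel of image error (illustrative gain)
V_APPROACH = 0.01   # slow constant approach speed along the camera axis (m/s)

def servo_command(flower_px, target_px, contact_made):
    """Return a camera-frame velocity command [vx, vy, vz] for the arm."""
    if contact_made:
        return np.zeros(3)                              # stop and pollinate
    error = np.asarray(target_px) - np.asarray(flower_px)  # pixels
    v_lateral = K_LATERAL * error                       # center the flower
    return np.array([v_lateral[0], v_lateral[1], V_APPROACH])
```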

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2018 Citation: Ohi, N., Lassak, K., Watson, R., Strader, J., Du, Y., Yang, C., Hedrick, G., Nguyen, J., Harper, S., Reynolds, D., Kilic, C., Hikes, J., Mills, S., Castle, C., Buzzo, B., Waterland, N., Gross, J., Park, Y., Li, X., Gu, Y., Design of an Autonomous Precision Pollination Robot, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, October, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Park, Y.-L. 2018. Osmia cornifrons as a model for pollinator robot. The Apicultural Society of Korea Annual Conference. Gwangju, Korea.


Progress 11/15/16 to 11/14/17

Outputs
Target Audience: Findings from research in the project were presented at two annual conferences: the Entomological Society of America Annual Meeting and the West Virginia Entomological Society Meeting. At these conferences, which over 3,000 people attended, two presentations associated with the project were given. In addition, applications of the pollinator robot were presented to stakeholder groups, including the WV Beekeepers Association and the WV Master Gardeners Conference, attended by more than 200 people. Current pollinator issues and the development of pollinator robots were presented at the Organic Farm Field Day at West Virginia University (WVU), attended by about 50 growers, extension agents, and members of the public. The importance of pollinator bees and a potential solution based on bio-inspired engineering were presented to the public via outreach with the "WVU Insect Zoo" that a co-PI operates at WVU. More than 4,000 West Virginia citizens, including K-12 students, visited the project's booth at the WV State Fair, and 14 additional outreach services reached more than 400 citizens in 2017. Robotic presentations and demonstrations were also provided to the public through several outreach activities, including but not limited to a robotics lab open house, a high school visitation, and a "day in the park" STEM outreach event.
Changes/Problems: Nothing Reported
What opportunities for training and professional development has the project provided? During project Year 1, a total of 12 graduate students and 2 undergraduate students were involved in this project. The students received hands-on training through participation in this interdisciplinary research project, and they also learned teamwork and leadership skills while interacting with other project members.
How have the results been disseminated to communities of interest? The results of this project have been disseminated to the public and scientific communities through two annual entomological conferences, two extension conferences, the WVU Farm Field Day, and numerous outreach activities with the WVU Insect Zoo, the WVU greenhouse, and the WVU robotics labs.
What do you plan to do during the next reporting period to accomplish the goals? With the development of the BrambleBee robot and a suite of software functions, the Year 1 research effort provided a solid foundation toward achieving the proposed robotic pollination capabilities. During Year 2, we will focus on system integration, bringing currently isolated capabilities into unified systems. Specifically, the robot localization and mapping algorithms will run onboard BrambleBee, along with the ability to navigate efficiently and safely in the cluttered greenhouse environment. The planning algorithms will also run online, allowing the robot to reach different parts of the greenhouse efficiently. The flower detection and pose estimation algorithms will be coupled with the robotic arm control schemes, leading to demonstrations of robotic pollination (from a fixed location) of real flowers. Toward the end of Year 2, the robotic arm will be integrated onto the mobile robot platform, paving the way for full system demonstrations and evaluations in Year 3.
These integration tasks will be supported by technology development and refinement in several areas, such as sensor calibration and rolling-shutter compensation, flower pose estimation, semantic mapping, robot autonomy architecture design, flower pollination sequence planning, bee-inspired pollination interface design, detailed pollination motions, and the human-machine interface. Many problems and inspirations are expected to arise during the system development, integration, and testing phases, leading to new research problems for the team to address. During Year 2, the optimal foraging and movement patterns of bees will be investigated to understand the foraging mechanism (e.g., paths and approaches to flowers) in support of the robot development. We will use videography to determine patterns of foraging and movement through dense leaves and branches. In addition, a behavioral bioassay will be conducted with a high-speed video camera to describe the bees' movement within flowers and to determine how each bee uses its hairs and body parts to access and gather pollen. The horticulture team will continue preparing to validate the efficacy of the robotic pollinator. The team will conduct a preliminary experiment comparing the efficiency of pollination methods; this experiment will help identify potential issues that might arise during the final validation. The experiment will consist of five treatments (bee pollination, manual pollination, autonomous robot pollination, a mixed human-robot team, and a no-pollination control) with six replications, as sketched below.
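As a purely hypothetical illustration of how the treatments might be assigned, the sketch below randomizes the five treatments within each of six replication blocks (a randomized complete block layout); the actual design, including any blocking structure, will be determined by the horticulture team.

```python
import random

TREATMENTS = [
    "bee pollination",
    "manual pollination",
    "autonomous robot pollination",
    "mixed human-robot team",
    "no-pollination control",
]
N_REPLICATIONS = 6

random.seed(1)  # fixed seed for a reproducible layout
layout = []
for block in range(1, N_REPLICATIONS + 1):
    order = TREATMENTS[:]
    random.shuffle(order)  # randomize treatment order within each block
    layout.append((block, order))

for block, order in layout:
    print(f"Replication {block}: {order}")
```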

Impacts
What was accomplished under these goals?
Horticulture
Four cultivars of red raspberry and blackberry were purchased and grown in the WVU Evansdale Greenhouse. Cultivar selection was based on traits such as primocane/floricane formation, capability for self-pollination, presence or absence of thorns, trellising requirements, winter chilling hours, flowering time, number of flowers, flower size, and fruit quality. Two blackberry cultivars, 'Darrow' and 'Prime-Ark Freedom', were selected for robotic experiments. The optimum growing temperature (24/21 °C, day/night) for both bramble species was determined. A trellis was designed and built to facilitate movement of the robot and to support the bee netting in the greenhouse.
Entomology
The structure of bees' hairs and the function and behavior of bees were examined to acquire parameters that can support the robot design. First, the structure of hairs on bee bodies (i.e., the scopa) was examined, using mason bees of the genus Osmia as a model. We found that the location and general structure of the scopa were similar among different Osmia species, but their ultrastructure varied in size, density, and shape. Further analyses of these data will help design the structure of the pollination interface on the end effector of the robotic arm. Second, the visiting behavior of bees in the field was observed with the aid of digital recording; different bees showed distinctive flower searching and visiting patterns. Lastly, we collected high-speed camera data for a behavioral bioassay to describe the movement of bees within flowers and to determine how each bee uses its hairs and body parts to access and gather pollen.
BrambleBee Robot System Design
A prototype pollination robot, named BrambleBee, was built around a Clearpath Robotics Husky platform and a Kinova JACO2 robotic arm. BrambleBee was instrumented with a collection of high-quality sensors to measure the robot state and perceive the surrounding environment, including a NovAtel SPAN GPS/INS unit, a Velodyne HDL-32E 32-channel 3D Lidar, a FLIR 5MP Blackfly-S camera outfitted with a fisheye lens, and a SoftKinetic DS525 time-of-flight camera mounted on the robotic arm. To acquire accurate sensor measurements in support of the robot perception system, a series of sensor calibration efforts is currently underway. First, the wide field-of-view fisheye camera is being calibrated to reduce optical distortions. Second, the alignment between the Lidar and the fisheye camera is being estimated to improve the matching between objects recognized in the images (e.g., flower clusters) and the associated depth measurements provided by the Lidar.
Localization and Mapping
To achieve fast, accurate, and robust robot localization in the greenhouse, two approaches were developed. The first matches Lidar measurements against prior knowledge of the greenhouse geometry to produce a fast estimate of the robot pose (i.e., position and orientation). The second uses Lidar and wheel encoder measurements to perform Simultaneous Localization and Mapping (SLAM).
Robot Autonomy
A detailed onboard autonomy software architecture for BrambleBee is being designed and implemented. In this design, a Mission Planning node contains most of the high-level planning and decision-making algorithms; it decides favorable parking locations for the robot that would maximize the robotic arm's pollination efficiency.
A Map Server node maintains a dynamic map of flower cluster locations and estimates the number of flowers to be pollinated, which the Mission Planning node uses to decide where to pollinate next. It also provides prior knowledge of obstacles, in the form of occupancy grid maps, to a Path Planner node. The Path Planner node takes in driving destinations, expressed as desired robot poses, along with information about known obstacles from the Map Server node, and it dynamically detects obstacles from the Lidar to plan or re-plan paths to the destination. A set of Sensor Sources and Localization nodes provides information about the robot and its environment, and Sensor Transforms nodes represent the coordinate transforms between the robot's sensors and the robot body centroid. Finally, a Base Controller node takes the commanded wheel velocities from the Path Planner node and controls the robot's drivetrain. A planning algorithm was also developed for the initial greenhouse inspection; its objective is to enable BrambleBee to see each plant from all reachable viewing angles while minimizing the total drive distance.
Computer Vision
Over 2,400 images were collected at the McConnell Berry Farm, covering various growth stages of the plants. The flowers in the images were manually labeled as either a single flower or a cluster of flowers using a custom-developed graphical user interface. For each single flower, the pose and growth stage were specified using a set of reference images; for each flower cluster, the number of flowers in the cluster was specified. The purpose of the labeling effort was to train machine learning algorithms and to develop a system for reliably evaluating the segmentation, classification, and pose estimation approaches. An image segmentation method was developed to quickly remove the parts of each image that are unlikely to be bramble flowers. To accelerate segmentation, a color lookup table was prebuilt in the RGB color space; during segmentation, the raw RGB values of each image are used to index the lookup table and generate a binary mask (a sketch of this step appears at the end of this section). A deep-learning-based algorithm was then developed to identify flowers ready for pollination in the segmented images. Research was also performed on 3D reconstruction of the local environment (i.e., the robot workspace): using data collected with the DS525 camera attached to the robotic arm, an incremental structure-from-motion approach was implemented to estimate the camera poses and three-dimensional points in the robot workspace.
Manipulation
The basic functionality required to move the robot end effector into poses from which it can pollinate flowers has been established. This ability was demonstrated through subsystem testing in which pollination was simulated on a dummy plant with artificial markers while the arm was mounted on a table. In the tested pollination procedure, the arm was first controlled to a pre-determined search position from which the robot could detect markers using the DS525 camera. From the markers' pose estimates, goal poses of the end effector were determined, and inverse kinematics were solved to determine the robot joint angles required to reach each goal pose. The sequence in which to visit the markers was then chosen to minimize the motion needed to visit each goal pose in an obstacle-free environment, and path planning was performed to find a collision-free path to the next goal pose in the sequence.
Once the end effector reached a goal pose, the arm began the final approach to the flower. In the final approach, visual servoing was performed in a two-step process: first parallel to the marker, to align the pollinator axis with the marker's center; then along the pollinator axis until contact with the marker was made. After contact was estimated to have occurred, based on position estimates, the arm proceeded to the next flower in the sequence until all simulated flowers had been visited. A more detailed report with supporting figures and diagrams is available upon request.
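The two-step final approach described above can be summarized in a small control sketch. This is a hypothetical illustration, not the tested controller: it assumes the marker center is expressed in the end-effector frame with z along the pollinator axis, and the gain and thresholds are made-up values.

```python
import numpy as np

def final_approach_step(marker_in_ee, aligned_tol=0.005,
                        contact_dist=0.01, gain=0.5):
    """One visual-servoing update for the two-step final approach.

    marker_in_ee: (3,) marker center in the end-effector frame,
        with z along the pollinator axis (meters).
    Returns a commanded end-effector velocity (3,) in the same frame.
    """
    lateral = marker_in_ee[:2]          # error parallel to the marker plane
    if np.linalg.norm(lateral) > aligned_tol:
        # Step 1: move parallel to the marker to center the pollinator axis.
        return gain * np.array([lateral[0], lateral[1], 0.0])
    if marker_in_ee[2] > contact_dist:
        # Step 2: advance along the pollinator axis toward contact.
        return gain * np.array([0.0, 0.0, marker_in_ee[2]])
    return np.zeros(3)                  # contact reached; stop

# Example: marker 2 cm off-axis and 10 cm ahead of the pollinator tip.
print(final_approach_step(np.array([0.02, 0.0, 0.10])))
```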
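Similarly, the prebuilt color lookup table described under Computer Vision can be sketched as follows. This is a minimal reconstruction under assumptions (a 32-bin-per-channel quantization and a stand-in color rule in place of the trained classifier), not the project's implementation.

```python
import numpy as np

BINS = 32                      # quantization levels per channel (assumed)
STEP = 256 // BINS

def build_lut(is_flower_color):
    """Precompute a binary lookup table over quantized RGB space.

    is_flower_color: callable mapping an (N, 3) array of RGB values
        to an (N,) boolean array (e.g., a trained color classifier).
    """
    axis = np.arange(BINS) * STEP + STEP // 2        # bin-center values
    r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
    colors = np.stack([r, g, b], axis=-1).reshape(-1, 3)
    return is_flower_color(colors).reshape(BINS, BINS, BINS)

def segment(image, lut):
    """Index the LUT with raw RGB values to produce a binary mask."""
    q = image // STEP                                # quantize each channel
    return lut[q[..., 0], q[..., 1], q[..., 2]]

# Example with a toy rule standing in for the trained classifier:
# treat "red-dominant" pixels as flower-like.
toy = lambda c: (c[:, 0] > 150) & (c[:, 1] < 120) & (c[:, 2] < 120)
lut = build_lut(toy)
img = np.random.randint(0, 256, (4, 4, 3))
print(segment(img, lut))
```

Because the table is built once offline, per-image segmentation reduces to a single array indexing operation, which is what makes this step fast at runtime.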

Publications

  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Pollinator robot inspired by structure and behavior of Osmia bees (Hymenoptera: Megachilidae). Annual Meeting of Entomological Society of America, Denver, CO.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Precision pollinator robot. Annual Meeting of West Virginia Entomological Society, Cairo, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Bees: current issues, future prediction, and mitigation. Annual West Virginia Extension and Master Gardeners Conference, Roanoke, WV.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2017 Citation: Park, Y.-L. 2017. Mason bees: propagation and management. West Virginia Panhandle Beekeepers Association Meeting, Martinsburg, WV.