Source: UNIVERSITY OF NEBRASKA
PAPM EAGER: TRANSITIONING TO THE NEXT GENERATION PLANT PHENOTYPING ROBOTS
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
TERMINATED
Funding Source
Reporting Frequency
Annual
Accession No.
1011763
Grant No.
2017-67007-25941
Project No.
NEB-21-169
Proposal No.
2016-10994
Multistate No.
(N/A)
Program Code
A5172
Project Start Date
Nov 15, 2016
Project End Date
Nov 14, 2020
Grant Year
2017
Project Director
Ge, Y.
Recipient Organization
UNIVERSITY OF NEBRASKA
(N/A)
LINCOLN, NE 68583
Performing Department
Biological Systems Engineering
Non Technical Summary
High throughput plant phenotyping (HTPP), the use of holistic and large-scale approaches to collect plant phenotypic information, holds the promise of sparking a new green revolution. It will be an important part of the solution to the global food security challenge by 2050, when the world population is likely to exceed 9.7 billion. As the current state of the art in HTPP, digital imaging greatly enhances our ability to capture plant phenotypes. However, digital imaging also has an obvious disadvantage: phenotypes are measured in terms of image pixel count or pixel intensity, and these image data, by themselves, convey little information of biological relevance. To maximize the utility of the image data, they have to be analyzed and interpreted jointly with manually measured plant physiological or chemical traits, which are still slow and expensive to collect. The overall goal of this project is to develop next generation plant phenotyping robots to enable autonomous, in vivo (and human-like) measurement of plant physiological and chemical traits. The phenotyping robots will greatly improve throughput and capacity while substantially reducing the cost of plant phenotyping. The overall goal will be realized through three research thrusts. First, novel robotic grippers that integrate specialized plant sensors will be designed and developed. Two specific plant sensors being considered are (i) a Y-shape bifurcating fiber optic sensing head for a variety of optically sensed plant physiological and chemical traits, and (ii) a leaf porometer for stomatal conductance measurement. Second, a novel robotic vision system that combines an RGB camera and a Time-of-Flight 3D depth camera will be constructed. Novel image processing algorithms will be developed for plant leaf segmentation and localization. 
The algorithms will also rank the most suitable plant leaves for automated sensing and calculate the approach vector of the robotic gripper for successful leaf grasping and sensing. Third, the developed plant phenotyping robot will be tested in the high throughput imaging greenhouse at the University of Nebraska-Lincoln. Different corn and soybean lines with known susceptibility to water and nutrient stresses will be used to demonstrate and validate the throughput, accuracy, and capacity of the phenotyping robot. The plant phenotyping robots will be a critical enabling technology to advance the science of plant phenomics and allow better genomics-phenomics analysis for trait discovery and crop improvement, an indispensable part of the overall solution to the long term food and energy security problems facing our society. This project will create an interdisciplinary environment where graduate and undergraduate students will receive training in both plant science and engineering robotics. The undergraduates will be employed to work on the project through BSE-PIE, a program initiated by the team to attract students with engineering backgrounds to conduct research on plant phenotyping. Using material from this project, the PIs will develop new course modules to expose students to this new frontier of automated plant phenotyping. A robotic competition focusing on autonomous plant phenotyping will be created within the PIs' professional society. Finally, results and findings from this project will be broadly disseminated through conferences, peer reviewed publications, and open collaborative platforms including iPlant Collaborative and the Robot Operating System.
Animal Health Component
0%
Research Effort Categories
Basic
20%
Applied
40%
Developmental
40%
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
201                   1510                             2020                     50%
203                   1820                             1080                     50%
Goals / Objectives
The overall goal of this project is to develop automated robotic systems that can realize in vivo, human-like plant phenotyping in the greenhouse. There are three specific objectives: 1) Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing; 2) Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing; 3) Evaluate and validate the plant phenotyping robot.
Project Methods
Objective 1: Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing. We will develop two types of robotic grippers integrated with specialized plant sensors. The first will integrate a Y-shape bifurcating fiber optic sensing head that allows the retrieval of optical properties from plant leaves. This design will allow the measurement of a wide array of optical traits that are closely associated with plant chemical and physiological traits (such as water content, nitrogen, pigments, and photosynthesis). The second will integrate an off-the-shelf leaf porometer from Decagon Devices to measure stomatal conductance and gas exchange. The grippers will then be integrated with a robotic manipulator. Objective 2: Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing. We will develop an RGB and 3D-depth camera based vision system for the plant phenotyping robot. Novel image processing algorithms will be developed to (1) segment individual plant leaves from the background, (2) rank the suitability of each segmented leaf on the plant for automated sensing, and (3) compute the approach vector of each grasping point. After the vision system is developed, "eye-hand" coordination will also be tested extensively to make sure the robotic manipulator/gripper in Obj. 1 and the vision system in Obj. 2 work together harmoniously. Objective 3: Evaluate and validate the phenotyping robot. We will conduct experiments to evaluate and validate the phenotyping robot in the high throughput phenotyping greenhouse at the University of Nebraska-Lincoln. The validation tests will be conducted using maize and soybean. The validation experiments will include two forms of stress common under agronomic conditions: drought and nitrogen limitation. 
Similar measurements by human operators will be carried out concurrently with the measurements by the phenotyping robot. In addition, all plants will be imaged using five different imaging modules in the greenhouse. Data collected from the validation tests will be analyzed to answer the following key questions regarding the overall hypothesis. (1) What is the success rate (η) and average cycle time (t) of the phenotyping robot for automated leaf grasping and sensing, and how do they compare to a human operator? Answering these questions addresses the throughput requirement. We expect η and t will vary substantially according to plant species, canopy architecture, and developmental stage. (2) How do the leaf trait measurements by the phenotyping robot compare to those by human operators? Correlation analysis will be conducted on these two sets of measurements. Answering this question addresses the accuracy requirement of the phenotyping robot. (3) Correlation analysis and comparison will also be made between the plant images and the robotic vs. human measurements; we will attempt to identify existing or novel image-based traits (or combinations of traits) that are reliable predictors of the physiological and biochemical traits measured by the phenotyping robot. This will evaluate the usefulness of the data collected by the phenotyping robot in complementing plant images to improve the capacity of plant phenotyping.
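The evaluation metrics described above (success rate η, average cycle time t, and the coefficient of determination between robot and human measurements) can be sketched as follows. This is a minimal illustration; the function names and the example counts are hypothetical, not part of the project's software.

```python
import numpy as np

def throughput_metrics(attempts, successes, cycle_times_s):
    """Success rate (eta) and mean cycle time (t) for robotic grasping.

    attempts, successes: counts of grasp attempts and successful grasps.
    cycle_times_s: duration of each successful measurement cycle, seconds.
    """
    eta = successes / attempts
    t = float(np.mean(cycle_times_s))
    return eta, t

def accuracy_r2(robot, human):
    """Coefficient of determination (squared Pearson correlation)
    between robot and human trait measurements."""
    robot = np.asarray(robot, dtype=float)
    human = np.asarray(human, dtype=float)
    r = np.corrcoef(robot, human)[0, 1]
    return r ** 2

# Hypothetical example values, for illustration only
eta, t = throughput_metrics(attempts=20, successes=10,
                            cycle_times_s=[36, 38, 35])
```

A correlation-based R2 as sketched here matches the "correlation analysis" the report describes for comparing the two sets of measurements.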

Progress 11/15/16 to 11/14/20

Outputs
Target Audience: The target audiences of the project are (1) the postdoc / graduate student and undergraduate students who were supported by the project and directly participated in the research and outreach activities of the project, (2) members of professional societies (American Society of Agricultural and Biological Engineers, National Association of Plant Breeders), (3) researchers and scientists of universities and industrial partners (e.g., Bayer Crop Science, Corteva), (4) growers and the general public who are interested in agricultural technologies and plant breeding. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? The project supported one PhD student, Abbas Atefi, from 2016-2019. Abbas successfully defended his dissertation and was awarded his PhD in Aug 2019 at the University of Nebraska-Lincoln. He has been a postdoc research associate since then and continued to be supported by this project. The project trained and supported three undergraduate students (Ema Muslic, Arena See, and Yanni Yang). Their responsibilities included caring for the plants in the greenhouse and assisting the graduate student with data collection in the experiments. The project trained and supported the UNL robotics teams that attended the robotic competition at the annual meetings of the American Society of Agricultural and Biological Engineers. The 2018 team took second place in this international-level competition. The project supported a senior design capstone team of 4 undergraduate students on a project related to agricultural robotics. How have the results been disseminated to communities of interest? Professional society. The team gave presentations at the ASABE (American Society of Agricultural and Biological Engineers) meetings in 2018 and 2019. The team gave one presentation at the National Association of Plant Breeders meeting in 2017. Industrial and academic partners. 
Presentations regarding the robotic technologies for plant phenomics were given to the Danforth Plant Science Center, Bayer Crop Science at St. Louis, the Corteva Plant Breeding Symposium Series at the University of Nebraska-Lincoln, and the University of Saskatchewan. General public. The plant phenotyping robot was demonstrated at Huskers' Harvest Day and Wheat Appreciation Day. The team and our phenotyping robot were interviewed and reported on by the Lincoln Journal Star. What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? The overall goal of this project is to develop automated robotic systems that can realize in vivo, human-like plant phenotyping in the greenhouse. There are three specific objectives: 1) Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing; 2) Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing; 3) Evaluate and validate the plant phenotyping robot. Objective 1. We developed and integrated three specialized plant leaf sensors with a robotic manipulator to realize human-like phenotypic data collection for plant leaves and stems. These three sensors were (1) a fiber optic cable coupled to a compact visible and near infrared (VisNIR) spectrometer for plant leaf reflectance measurement, (2) a thermistor for leaf temperature measurement, and (3) a linear potentiometer for plant stalk thickness measurement. Mechanically, we designed robotic end-effectors (grippers) that coupled well with these sensors and at the same time provided the flexibility and agility for the sensors to be gently clamped on plant leaves or stems. The thermistor and linear potentiometer directly reported leaf temperature and stalk thickness. To use the leaf-level VisNIR reflectance data, we also developed multivariate statistical models (partial least squares regression) to estimate a number of leaf biochemical traits (chlorophyll content, water content, nitrogen, phosphorus, and potassium). A fourth end-effector to integrate a leaf porometer (for leaf stomatal conductance measurement) with the robotic manipulator was also developed. This end-effector has not been tested. Objective 2. We investigated two imaging modules, an RGB camera and a depth camera, as the robotic vision system to guide the robotic arm in the approaching, grasping, and sensing of plant leaves and stems. 
The depth camera provided the 3D position information of the targets, which guided the robotic end-effectors to approach and grasp them. Initially, we developed a rule-based image processing algorithm to (1) segment the plants from the background, (2) identify the stem and localize its position in the image, and (3) identify individual leaves and localize their positions in the image. A ranking algorithm was also developed to select the three largest leaves of the plant and calculate the geometric centers of the selected leaves as potential grasping points for the robot. Custom-developed inverse kinematics generated the commands to drive the robotic manipulator to the grasping points. Later on, we developed a convolutional neural network and deep learning (CNN-DL) based method to achieve pixel-level segmentation of background, plant leaves, and plant stem in an image (semantic segmentation). This CNN-DL based approach improved the image segmentation used to guide the robotic manipulator. Finally, all these components, including image processing, inverse kinematics, and robotic control, were integrated into an app running in the MATLAB environment as the software of the robotic phenotyping system. Objective 3. We conducted multiple tests of the developed phenotyping robot, using primarily maize and sorghum plants grown in the greenhouse at their vegetative stage (n=150). The goals of the tests were to (1) obtain performance metrics of the phenotyping robot in terms of accuracy and efficiency (compared to a human operator), and (2) evaluate the accuracy of the phenotypic traits measured by the robot. We demonstrated that it took the robot about 35-38 s to collect one set of measurements from a plant leaf. The process of robotic measurement could be divided into 4 steps: image processing, inverse kinematics, leaf grasping, and sensing. Leaf grasping was the most time consuming step, with an average time of 31-32 s. 
The overall success rate of robotic data collection was 50%. The coefficients of determination (R2) between the traits measured by the phenotyping robot and the ground-truth data were: 0.62 for leaf temperature, 0.53 for leaf chlorophyll, 0.61 for leaf water content, 0.14 for leaf nitrogen, 0.11 for leaf phosphorus, 0.52 for leaf potassium, and 0.99 for stem thickness. Limited testing was also conducted on soybean plants, and we found that robotic phenotyping for soybean was substantially more difficult (dense vegetation made image processing and robotic grasping more challenging). These tests showed the technical feasibility of, and promising results for, a fully-automated, human-like robotic system for plant phenotyping (at least for maize and sorghum). This project resulted in two published journal articles and one PhD dissertation. A third journal article is currently under revision.
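The leaf ranking step described above (select the largest leaves, then take their geometric centers as candidate grasping points) can be sketched as follows. This is a minimal illustration assuming a labeled segmentation mask as input; it is not the project's MATLAB implementation, and the function name is hypothetical.

```python
import numpy as np

def rank_leaves_and_grasp_points(label_img, k=3):
    """From a labeled leaf mask (0 = background, 1..N = leaf IDs),
    pick the k largest leaves by pixel area and return their geometric
    centers (row, col) as candidate grasping points, largest first."""
    ids, counts = np.unique(label_img, return_counts=True)
    keep = ids != 0                        # drop the background label
    ids, counts = ids[keep], counts[keep]
    order = np.argsort(counts)[::-1][:k]   # indices of largest leaves
    points = []
    for i in order:
        rows, cols = np.nonzero(label_img == ids[i])
        points.append((rows.mean(), cols.mean()))  # leaf centroid
    return points

# Hypothetical 4x4 label image with two "leaves"
demo = np.zeros((4, 4), dtype=int)
demo[0:2, 0:2] = 1   # leaf 1: 4 pixels
demo[2:4, 0:3] = 2   # leaf 2: 6 pixels
grasp_points = rank_leaves_and_grasp_points(demo, k=2)
```

In practice the centroid of a strongly curved leaf can fall off the leaf surface, which is one reason real grasp-point selection is harder than this sketch suggests.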

Publications

  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2019. In vivo human-like robotic phenotyping of leaf traits in maize and sorghum in greenhouse. Computers and Electronics in Agriculture 163, 104854. https://doi.org/10.1016/j.compag.2019.104854
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2020. Robotic detection and grasp of maize and sorghum: stem measurement with contact. Robotics 9(3), 58. https://doi.org/10.3390/robotics9030058
  • Type: Theses/Dissertations Status: Published Year Published: 2019 Citation: Atefi, A. In vivo human-like robotic phenotyping of leaf and stem traits in maize and sorghum in greenhouse. University of Nebraska-Lincoln PhD Dissertation. August, 2019.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Atefi, A., Ge, Y., Pitla, S., 2019. An automated robotic system to measure stem diameter of maize and sorghum plants in greenhouse. 2019 ASABE Annual International Meeting. Boston, MA.
  • Type: Journal Articles Status: Under Review Year Published: 2021 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2021. Robotic technologies for high-throughput plant phenotyping: contemporary reviews and future perspectives. Frontiers in Plant Science.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2018. Integration of a plant phenotyping robotic system with LemnaTec high-throughput plant phenotyping system. 2018 ASABE Annual International Meeting. Detroit, MI.


Progress 11/15/18 to 11/14/19

Outputs
Target Audience: Nebraska public and young generation farmers. The project teamed with UNL-IANR (Institute of Agriculture and Natural Resources) and demonstrated our plant phenotyping robot at Husker Harvest Day to many Nebraska citizens, in particular the younger generation interested in agricultural technologies. The phenotyping robot was demonstrated to many groups of visitors to the Nebraska Greenhouse Innovation Center. A group of wheat growers in western Nebraska. The plant phenotyping robot was demonstrated at Growers Appreciation Day in Alliance, Nebraska. ASABE professional society: one presentation was made at the ASABE annual meeting. News media. Our research team and this research were covered by several news media outlets, including the Lincoln Journal Star and BBC. Some examples of the media reports are: https://journalstar.com/news/local/education/robots-in-the-cornfield-unlteam-believes-it-s-the/article_9a9b6659-bc36-53b4-9c20-34d879b2809e.html; https://www.usatoday.com/story/news/50states/2019/07/19/barbie-tupac-scorpions-red-vines-news-around-states/39702985/ Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? The graduate student Abbas Atefi defended his dissertation in July 2019 and was awarded his PhD in Aug 2019 at the University of Nebraska-Lincoln. Abbas is now a Postdoctoral Research Associate and continues to work on robotic plant phenotyping research. The project supported the robotics team (consisting of three undergraduate and three graduate students) of the Biological Systems Engineering Department. The team participated in the 2019 ASABE Student Robotics Competition in agriculture (pre- and post-harvest operations). How have the results been disseminated to communities of interest? Professional societies. The team members attended the ASABE (American Society of Agricultural and Biological Engineers) meeting in July 2019 and gave a presentation on this project. 
PI Ge was invited to the Danforth Plant Science Center in Dec 2018 and gave a seminar on "Engineering and Robotics for High Throughput Plant Phenotyping". This project was highlighted as part of the seminar presentation. Plant breeding industry. PI Ge was also invited to Bayer Crop Science in Dec 2018 and gave a seminar on "Sensing and Robotics for Plant Phenomics Research". This project was highlighted in the presentation. About 25 Bayer scientists attended the seminar and the discussion. UNL Undergraduate Students. PI Ge was invited to give a guest lecture to a class of bioinformatics students in Jan 2019. The lecture was on plant phenomics, and this project was introduced to the students as part of the lecture. General Public. The phenotyping robot was demonstrated at various extension events in Nebraska, as well as to many groups of visitors at the Nebraska Greenhouse Innovation Center. One peer-reviewed journal article was published in 2019. A second one is currently under review. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period, we plan to begin new research using cameras mounted on the MICO2 robotic manipulator for plant root phenotyping. Compared to shoot phenotyping, root phenotyping currently lags far behind. While X-ray imaging has been shown to be useful for root phenotyping in soils, it is very expensive and not suitable for large roots. We will construct a box filled with sandy, coarse soil. The box will also have a network of strings (such as fishing net) running perpendicular in two directions and at multiple depths. The idea is that the root will be supported by this network of strings. When the box walls and sandy soil are removed, the root will remain intact and can be imaged at multiple angles with the camera controlled by the robotic arm. 3D reconstruction will be used to recover the root morphology from the series of 2D root images. 
Since a very high-resolution camera will be used and the position and orientation of the camera can be precisely controlled by the robotic manipulator, it is expected that the root 3D structure can be accurately reconstructed, even for fine root hairs. This research will lead to a low-cost solution (compared to X-ray or MRI) for root phenotyping using robotic technology. It is also anticipated that the characterization of root phenotypes, such as 3D morphology and coarse versus fine roots, will be more accurate. When the system is developed, an experiment likely involving maize and wheat plants will be conducted for system validation.

Impacts
What was accomplished under these goals? In this reporting period, the research focus was on developing a vision system and a robotic gripper that can measure the stem diameter of maize and sorghum plants in the greenhouse. Stem diameter is an important phenotype that determines the vigor and biomass of plants, and is known to exhibit strong genetic control. We used the same Time-of-Flight (TOF) camera that was used previously for the leaf-grasping robot as the vision system to capture plant images. A Convolutional Neural Network and Deep Learning (CNN + DL) based method (i.e., Faster R-CNN) was employed to segment the stem from the image. To train the Faster R-CNN algorithm for stem detection, 60 gray-scale TOF images of maize and sorghum plants were labelled by drawing a bounding box around the stem as ground truth, using the "ImageLabeler" tool in MATLAB. These labeled images were sharpened, blurred, darkened, and brightened to create 240 additional images (300 in total). The dataset was split into two subsets: 80% as the training set and 20% as the testing set. The input image was first passed through a convolutional network, which returned a convolutional feature map. A small neural network called a Region Proposal Network (RPN) was applied to the convolutional feature map to predict the presence of an object and its bounding box (predicted region proposals). Then, a Region of Interest (RoI) pooling layer was used to bring all the predicted region proposals down to the same size. Finally, the reshaped proposals were provided as inputs to fully connected layers to classify the predicted bounding boxes. Each bounding box had a score indicating the likelihood that the box contained the stem, computed as the "Intersection over Union" between the predicted and ground-truth boxes. Finally, the grasping point for the robotic gripper was determined as the center of the identified bounding box below the first (lowest) leaf. 
A laptop computer with an Intel Core i7 processor (2.5 GHz) and 8 GB RAM was used to train the Faster R-CNN network and implement the additional image processing steps. A gripper was designed to hold a linear potentiometer (LP) sensor for stem diameter measurement. The assembly was then attached to the MICO2 4-DOF (degrees of freedom) robotic manipulator for motion planning and control. Finally, a data acquisition system was developed to interface the LP with the robot's task computer. An experiment was conducted in the Greenhouse Innovation Center of the University of Nebraska-Lincoln to evaluate this robotic system for stem diameter measurement. Two lines of maize (B73 and Ohio 47) and two lines of sorghum (Simon and Grassl) were used (8 plants per line, 32 plants in total). The experiment was conducted over a period of 6 weeks (the first three weeks for maize plants and the second three weeks for sorghum) to create large variability in stem diameter. The plant stage was approximately V10 to tasseling. The key results from this validation experiment were as follows. For maize, the robotic stem diameter measurements correlated with the manual measurements with an R2 of 0.98, Root Mean Squared Error (RMSE) of 0.10 cm, and bias of -0.10 cm. For sorghum, the two sets of measurements were also highly correlated, with an R2 of 0.99, RMSE of 0.12 cm, and bias of -0.11 cm. These errors are small relative to the stem diameter range of approximately 1.40 to 3.10 cm, with a mean of 2.20 cm. In terms of robotic stem clamping efficiency, the total execution time averaged 45 s per measurement (CNN-DL, 1.3 s; other image processing components, 1.4 s; stem grasping, 42.2 s).
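The Intersection-over-Union score used above to rate candidate stem bounding boxes against the ground truth can be computed as in this minimal sketch; the (x1, y1, x2, y2) corner-coordinate box format is an assumption for illustration.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes,
    each given as (x1, y1, x2, y2) with x1 < x2 and y1 < y2."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (width/height clamp to 0 when boxes are disjoint)
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    iw, ih = max(0, ix2 - ix1), max(0, iy2 - iy1)
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

An IoU of 1.0 means the predicted box coincides exactly with the labeled stem box; detection pipelines typically treat a prediction as correct above some threshold such as 0.5.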

Publications

  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2019. In vivo human-like robotic phenotyping of leaf traits in maize and sorghum in greenhouse. Computers and Electronics in Agriculture 163, 104854. https://doi.org/10.1016/j.compag.2019.104854
  • Type: Journal Articles Status: Under Review Year Published: 2020 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2020. A robotic system equipped with deep learning based stem detection to measure stem diameter of maize and sorghum plants in greenhouse. Biosystems Engineering. Under Review.
  • Type: Theses/Dissertations Status: Published Year Published: 2019 Citation: Atefi, A. In vivo human-like robotic phenotyping of leaf and stem traits in maize and sorghum in greenhouse. University of Nebraska-Lincoln PhD Dissertation. August, 2019.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: Atefi, A., Ge, Y., Pitla, S., 2019 An automated robotic system to measure stem diameter of maize and sorghum plants in greenhouse. 2019 ASABE Annual International Meeting. Boston, MA.


Progress 11/15/17 to 11/14/18

Outputs
Target Audience: (1) Dec/12/2017, PI Ge attended the American Geophysical Union Conference. Ge gave one oral presentation on "Using engineering technologies for plant phenotyping and plant breeding research" and one poster presentation on "Engineering systems for high throughput plant phenotyping research at University of Nebraska-Lincoln". The AGU conference is an international conference. (2) Mar/24/2018, PI Ge was invited to attend the second Asia-Pacific Plant Phenotyping Conference and gave a presentation on "Imaging, robotics and modeling for non-destructive analysis of cell wall composition in sorghum". This was an international conference with ~400 attendees. About 150 people attended the presentation. (3) July/30/2018, graduate student Abbas Atefi gave a presentation, "Integration of a plant phenotyping robotic system with LemnaTec high-throughput plant phenotyping system", at the American Society of Agricultural & Biological Engineers meeting. The session was attended by ~40 professionals from around the world. (4) Oct/16/2018, PI Ge was invited to attend the third annual Plant Phenotyping and Imaging Research Center Symposium at the University of Saskatchewan, Canada. Ge gave a 30-minute presentation on "Engineering technologies for high throughput plant phenotyping research at UNL". The symposium was attended by over 150 people from around the world. (5) Dec/12/2018, PI Ge was an invited speaker at the Danforth Plant Science Center and gave a 1-hour presentation on using imaging and robotics technologies for high throughput plant phenotyping research. The presentation was attended by ~50 scientists from the Danforth Center. (6) Dec/13/2018, PI Ge was invited to Bayer Crop Science at St. Louis, gave a 1-hour presentation on engineering systems for high throughput plant phenotyping research, and interacted with their scientists about potential research collaborations. 
(7) Jan/17/2019, Ge gave a guest lecture (75 min) to a cohort of 20 undergraduate students in the bioinformatics major at UNL. The guest lecture was on using advanced engineering technologies to solve emerging plant science problems. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? 1.5 graduate students have been trained in this project (Abbas Atefi and Ujjwol Bhandari). They have been responsible for the day-to-day operation of the project, including the design and fabrication of the plant phenotyping robot, its greenhouse testing, and data collection and analysis. Three undergraduate students have also been trained in the project (Ema Muslic, Arena See, and Yanni Yang, all female). Muslic has been in UNL's work study program. Their responsibilities mainly included caring for the plants in the greenhouse and assisting the graduate students with data collection during the validation experiments. How have the results been disseminated to communities of interest? The project results have been widely disseminated by the research team. Research presentations were made to the following groups: (1) national and international professional conferences: ASABE, AGU, and the Asia-Pacific Plant Phenotyping Conference; (2) UNL undergraduate students through a guest lecture; (3) academic and industrial partners: University of Saskatchewan, Danforth Plant Science Center, and Bayer Crop Science. What do you plan to do during the next reporting period to accomplish the goals? First, we would like to draft our second manuscript for publication. This manuscript will cover the robotized measurement of plant stalk thickness, the ConvNets algorithm for stalk segmentation, and the third validation experiment. We plan to submit this manuscript to Biosystems Engineering or Plant Methods. Second, PhD student Abbas Atefi, who was supported by this grant, plans to defend his dissertation in July 2019 and graduate in Aug 2019. 
He plans to continue as a postdoc researcher in the BSE department, working on this line of research (imaging and robotic technologies for high throughput plant phenotyping). Third, we would like to test the robotic system with a maize or sorghum diversity panel in UNL's phenotyping greenhouse. Through this experiment we would like to show that researchers can rely on phenotyping robots for genomics-phenomics analysis. We will test the robotized measurements for their ability to support QTL mapping and GWAS, and compare the results with those from ground-truth measurements.

Impacts
What was accomplished under these goals? Objective 1: Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing. In the first-year report, we reported that we successfully developed a robotic gripper that integrated an optical fiber bundle (to measure leaf-level VisNIR hyperspectral reflectance data) and a thermistor (to measure leaf temperature). In this reporting period, we further developed another robotic gripper that could measure the stalk thickness of maize and sorghum plants. The gripper was specially designed to incorporate a digital LVDT (linear variable differential transformer) and two robotic fingers to realize stalk thickness measurement mechanically. This gripper was also extensively tested with 60 maize and sorghum plants and demonstrated satisfactory performance (see Obj. 3 for more details). Objective 2: Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing. In the first-year report, we reported the hardware of the robotic 3D vision system and image processing algorithms to (1) detect the plant stem, (2) identify leaves, (3) rank leaves, and (4) localize grasping points. In this reporting period, we explored more advanced image processing algorithms, namely Convolutional Neural Networks and Deep Learning, to improve the performance of stem detection and grasp point localization. More specifically, we explored Faster R-CNN and semantic segmentation to separate plant stem pixels from other pixels, using both the 3D depth image and the RGB image of the plants. This is considered an innovative part of our vision system. Combining Obj. 1 and Obj. 2, we further improved the MATLAB GUI (Graphical User Interface) to coordinate the vision and robotic gripper modules (eye-hand coordination). The system could now not only control all the hardware and sensors for robotized measurement, but also visualize the measurements in real time on the GUI. 
Objective 3: Evaluate and validate the plant phenotyping robot. In the first-year report, we reported a first-round evaluation of the robotic phenotyping system using 10 maize, 10 sorghum, 10 soybean, and 10 wheat plants. The key finding from this initial test was that soybean and wheat plants posed more challenges for robotized phenotyping, primarily because of their bushy plant architecture. A decision was therefore made to focus on maize and sorghum plants for further validation experiments. Our second-round validation experiment used 60 maize and 60 sorghum plants in a 2x2 factorial design. The first factor was water treatment with two levels (water-limited vs. control), and the second factor was nutrient treatment with two levels (high nutrient vs. low nutrient). The goal was to create large differences in leaf physiological properties (temperature, water content, nitrogen content, and chlorophyll content) so that they could be modeled and quantified by robotized measurements. In parallel to the robotized measurements, students took ground-truth measurements from the same plant leaves with handheld sensors (an ASD VisNIR spectrometer, a chlorophyll meter, and a handheld thermistor). After the robotized and ground-truth measurements, leaves were destructively sampled to determine leaf water content and nitrogen, phosphorus, and potassium concentrations. We developed multivariate calibration models to relate leaf hyperspectral data to those leaf chemical traits. Key results from this second validation experiment include the correlations between ground-truth and robotized measurements: leaf temperature (R2 = 0.60), chlorophyll content (R2 = 0.52), water content (R2 = 0.61), K (R2 = 0.52), N (R2 = 0.14), and P (R2 = 0.11). The robot took ~37 s per measurement, and the leaf-grasping success rate was around 50%.
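The ground-truth vs. robotized correlations reported above are coefficients of determination; a minimal sketch of that computation (a least-squares linear fit, using made-up numbers rather than the project's data) might look like:

```python
import numpy as np

def r_squared(ground_truth, robotized):
    """Coefficient of determination (R^2) from a least-squares line
    fit of robotized measurements against ground-truth values."""
    x = np.asarray(ground_truth, dtype=float)
    y = np.asarray(robotized, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)       # first-degree fit
    residuals = y - (slope * x + intercept)
    ss_res = np.sum(residuals ** 2)              # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)         # total sum of squares
    return 1.0 - ss_res / ss_tot

# A perfectly linear relationship yields an R^2 of (essentially) 1
print(r_squared([1, 2, 3, 4], [2.0, 4.0, 6.0, 8.0]))
```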
A manuscript reporting the phenotyping robotic system and the results of this second validation test, titled "In vivo Human-Like Robotic Phenotyping of Leaf Traits in Maize and Sorghum in Greenhouse", was submitted to "Computers & Electronics in Agriculture". Please follow this link to see the submitted manuscript, which contains more information about the robotic system and the validation test results: https://unl.box.com/s/ihmumaazywtrvak9c8mtsh15hh3c6y21 We also conducted a third validation experiment to test the performance of the gripper in measuring the stalk thickness of maize and sorghum plants. This time we used two varieties of maize and two varieties of sorghum, for a total of 60 plants (15 per genotype). The results showed that (1) the robotized stalk thickness measurement was highly correlated with the ground-truth measurements (R2 = 0.99), and (2) each robotized measurement took ~30 s.

Publications

  • Type: Journal Articles Status: Under Review Year Published: 2019 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2019. In vivo human-like robotic phenotyping of leaf traits in maize and sorghum in greenhouse. Computers and Electronics in Agriculture. Under Review.


Progress 11/15/16 to 11/14/17

Outputs
Target Audience:The target audiences reached by these efforts include: (1) Nebraska farmers and growers with an interest in agricultural technology. On Feb/2/2017, PI Ge gave a presentation titled "High throughput plant phenotyping research at University of Nebraska-Lincoln" at the Nebraska Agricultural Technologies Association Conference. About 30 growers from all over Nebraska attended the presentation. (2) On Mar/4/2017, PI Ge was invited to give a presentation titled "High throughput plant phenotyping in greenhouse and field - Translational pipelines from gene discovery to crop improvement" at Iowa State University's R.F. Baker Plant Breeding Symposium. About 150 graduate students in plant breeding and professionals from industry attended the presentation. (3) On Apr/7/2017, PI Ge was invited to give a presentation titled "Engineering instruments and robotics for high throughput plant phenotyping" at the Predictive Crop Design: Genome-to-Phenome Symposium (hosted by NSF and Nebraska EPSCoR). About 200 scientists (primarily plant scientists) from universities in NE, KS, MO, ND, SD, and IA attended the Symposium and the presentation. (4) On Apr/10/2017, PI Ge gave a presentation titled "Advanced imaging for phenotyping water-related crop traits" at the 2017 Water for Food Global Conference. The seminar was attended by 50 scholars from around the world with an interest in agricultural water use and management. (5) On July 17-19, PI Ge and graduate student Abbas Atefi attended the annual meeting of the American Society of Agricultural and Biological Engineers (ASABE). Atefi made a presentation titled "Development of a robotic system to grasp plant leaves for phenotyping" in a technical session. The presentation was attended by ~60 people with a technical interest in agricultural and biological engineering. (6) Also at this ASABE meeting, Co-PI Santosh Pitla and PI Ge led a UNL robotics team to participate in the student robotics competition.
Graduate student Abbas Atefi (who is supported by this grant) served as the team captain; the team members were from BSE-PIE (Biological Systems Engineering - Programming, Instrumentation, and Electronics), a program initiated by Santosh with support from this grant. The team leveraged a great deal of knowledge and experience from this project (such as machine vision and the gripper actuation system). There were 17 student teams from all over the world in the competition, and the UNL team won second place. More than 500 professionals and students watched the competition. (7) On Aug 4-5, PI Ge attended the National Association of Plant Breeders meeting and made a poster presentation titled "High throughput plant phenotyping robot". Also at this meeting, Ge gave a 5-min flash talk to introduce the project at the NIFA Workshop: Plant Breeding, Engineering, Cyber Physical Systems, and Breakthrough Technologies. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? 1.5 graduate students have been trained in this project (Abbas Atefi and Ujjwol Bhandari). They have been responsible for the day-to-day operation of the project, including the plant phenotyping robot design and fabrication, its greenhouse testing, and data collection and analysis. Two undergraduate students have also been trained in the project (Ema Muslic and Yanni Yang, both female). Muslic has been in UNL's work-study program. Their responsibilities mainly included caring for the plants in the greenhouse and assisting the graduate students with data collection during the experiment. The UNL robotics team from the BSE-PIE program was a six-member team: Abbas Atefi, Piyush Pandey (graduate student), CheeTown Liew (undergrad from EE), Hesan Sdt (undergrad from EE), Lucas Renker (undergrad from EE), and Jenny Wynn (undergrad from BSE).
The undergraduate students on the team received significant training from the graduate students and faculty advisors (Pitla and Ge) in machine vision, image processing, and robotic design for plant detection and analysis. In addition, the project engaged a four-member senior design capstone team on agricultural robotics, providing significant training to those students: John Shook, Purity Muhia (female), Alec Fuelberth, and Karlie Knoepfler (female), all BSE undergraduate students. How have the results been disseminated to communities of interest? The project results have been widely disseminated by the research team. Research presentations were made to the following groups: (1) a group of farmers in Nebraska interested in agricultural technologies; (2) agricultural engineers from around the world; (3) graduate students in plant breeding at Iowa State University; (4) plant scientists from Midwestern states; (5) the National Association of Plant Breeders; and (6) scientists from around the world working in irrigation and crop water use. Internal to UNL, a few presentations were given (by Ge and Schnable) to a wide group of scientists interested in plant phenomics. What do you plan to do during the next reporting period to accomplish the goals? The PIs plan to do the following during the next reporting period. First, we will further design and develop the robotized gripper. Currently the integrated sensors can measure leaf reflectance and temperature. As proposed, we will build the gripper to measure two more traits at the leaf and whole-plant level: (1) gas exchange rate / stomatal conductance, by integrating a commercial stomatal conductance sensor; and (2) stalk thickness of maize/sorghum, by integrating a mechanical sensor. We already have initial blueprints of these designs, and development of the gripper will start immediately in the next reporting period.
Second, we will further improve the machine vision and image processing algorithms so that they are more efficient for plant leaf segmentation and grasp point localization, in particular for the bushy canopies of soybean and wheat. One potential solution is to merge a high-resolution RGB camera with the 3D TOF camera and use vegetation pixels in the RGB images to improve segmentation and localization. Third, we will analyze data from the second evaluation of the phenotyping robot and prepare a manuscript for publication. The targeted journal for this publication will be "Plant Methods" or "Computers and Electronics in Agriculture". Last, we will plan a final large-scale evaluation of the phenotyping robot once it is complete. We will use well-characterized diversity panels for the evaluation (such as the maize and soybean Nested Association Mapping populations; Co-PI Schnable will ensure the availability of seeds). Plants will be grown under control and low-water/low-nitrogen conditions. When they reach the full vegetative stage, robotized measurements will be taken on these plants together with an array of conventional measurements. There will be two major objectives in data analysis: (1) to see how well the robotized trait measurements correlate with conventional measurements; and (2) to see how the measurements by the phenotyping robot can be used to detect genetic controls of these traits (using GWAS or QTL analysis). The second objective will be important for testing the overall hypothesis that the phenotyping robot can be a useful tool for high throughput plant phenotyping and plant genotype-phenotype research. Finally, we plan to prepare a manuscript to publish the final phenotyping robot and the evaluation results.
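One common way to realize the RGB-based vegetation masking mentioned above is the Excess Green index (ExG = 2G - R - B); the sketch below is a generic illustration of that approach, with an assumed channel scaling and a hypothetical threshold, not the project's implementation.

```python
import numpy as np

def vegetation_mask(rgb, threshold=0.1):
    """Binary vegetation mask from the Excess Green index.

    rgb: H x W x 3 float array with channels scaled to [0, 1].
    ExG = 2*G - R - B highlights green plant material against soil
    and background; the threshold is a hypothetical value that would
    need tuning for a given camera and greenhouse lighting.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b
    return exg > threshold
```

Vegetation pixels found this way could then be projected onto the registered TOF depth image to restrict leaf segmentation and grasp point search to plant material.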

Impacts
What was accomplished under these goals? Please use the link here to see a video clip of the plant phenotyping robot in action: https://unl.box.com/s/du8gqbcb0kmbscnlrvhyhcpytyoqxhme Please use the link here for a slide set of pictures showing the progress: https://unl.box.com/s/vgdbga0tqvsdhmh4xete463uw97yw1l0

Objective 1: Integrate specialized plant sensors with the robotic gripper and manipulator for in vivo, human-like plant sensing. A robotic gripper that integrated (1) an optical fiber cable coupled to a portable VisNIR spectrometer for leaf reflectance measurement, and (2) a thermistor for leaf temperature measurement was successfully built. The gripper went through a few rounds of design, redesign, and improvement, and was coupled to a four degree-of-freedom MICO2 robotic arm (KINOVA Inc., Quebec, Canada). The final gripper was 3D printed. We also developed the inverse kinematics of the system.

Objective 2: Develop the robotic vision system to guide the robotic arm for plant leaf approaching, grasping, and sensing. An SR4500 TOF (Time of Flight) camera (Mesa Imaging, Zurich, Switzerland) was used as the main sensor for the vision system. The camera captured 3D depth images that contained four layers of information (the X, Y, and Z coordinates and the reflectance intensity of each pixel). Using the Z (depth) information, the plant pixels were first segmented from the background. The image processing algorithm then went through four major steps: stem detection and removal; leaf identification; leaf angle calculation; and leaf ranking. Combining Obj. 1 and Obj. 2, we developed a GUI (Graphical User Interface) in MATLAB to coordinate the vision and robotic gripping modules (eye-hand coordination) and to take robotized measurements from test plants.

Objective 3: Evaluate and validate the plant phenotyping robot.
We completed two rounds of evaluation of the prototype phenotyping robot at UNL's high throughput plant phenotyping greenhouse. The first round of evaluation was conducted in June 2017 using 10 maize, 10 sorghum, 10 soybean, and 10 wheat plants. The purpose of this experiment was to evaluate how the plant phenotyping robotic system would perform across different plant sizes and canopy complexities. All plants were grown under normal conditions and no stress was imposed. The test found that the robot's performance was satisfactory for maize and sorghum plants, whereas the performance metrics were lower for soybean and wheat. Both soybean and wheat had bushy canopies that made the identification and localization of suitable grasping points quite challenging. In addition, wheat plants had narrow leaf blades that made robotized grasping difficult. These problems were, however, expected at the beginning of the project. A decision was made to continue with a second round of evaluation focusing on maize and sorghum plants, while the team continued to work on the machine vision and image processing system (Obj. 2) to improve the performance on soybean and wheat. The second round of evaluation started on Oct 10 and focused on 48 maize and 48 sorghum plants. The experiment was a 2x2 factorial treatment design. The first factor was water treatment with two levels (water-limited vs. control), and the second factor was nutrient treatment with two levels (high nutrient vs. low nutrient). The goal was to create large differences in leaf physiological properties (such as temperature, water content, and reflectance) so that they could be detected by robotized measurement. In parallel to the robotized measurements, a student also took measurements from the same plant leaves with handheld sensors (an ASD VisNIR spectrometer, a fluorimeter, and an IR radiometer).
These human-based measurements will later be used to assess the accuracy of the robotized measurements. After all the measurements, the plants were destructively sampled. Leaves were harvested and dried in an oven to determine water content. The dried leaf samples were then sent to Midwest Laboratory for nutrient analysis (nitrogen, phosphorus, and potassium). These lab-based destructive measurements will later be correlated with the leaf-level reflectance collected by the phenotyping robot. Analysis of data from the second evaluation experiment is ongoing.
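The depth-based background removal described under Objective 2 can be sketched as a simple window threshold on the TOF camera's Z channel; the near/far bounds below are hypothetical and would depend on the actual camera-to-plant geometry.

```python
import numpy as np

def segment_plant_pixels(depth_z_m, near_m=0.3, far_m=1.2):
    """Boolean mask of candidate plant pixels from a TOF Z channel.

    Pixels whose depth falls inside a near/far window are kept, on
    the assumption that the plant sits closer to the camera than the
    greenhouse background. The window bounds are placeholders, not
    the values used with the SR4500 in this project.
    """
    z = np.asarray(depth_z_m, dtype=float)
    return (z > near_m) & (z < far_m)
```

The resulting mask would feed the subsequent steps (stem detection and removal, leaf identification, leaf angle calculation, and leaf ranking).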

Publications

  • Type: Conference Papers and Presentations Status: Accepted Year Published: 2017 Citation: Atefi, A., Ge, Y., Pitla, S., Schnable, J., 2017. Development of a robotic system to grasp plant leaves for phenotyping. 2017 ASABE Annual Meeting Abstract.