Source: UNIVERSITY OF GEORGIA submitted to
ROBOT-ASSISTED FIELD-BASED HIGH THROUGHPUT PLANT PHENOTYPING
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1010382
Grant No.
2017-67021-25928
Cumulative Award Amt.
$954,048.00
Proposal No.
2016-07805
Multistate No.
(N/A)
Project Start Date
Jan 1, 2017
Project End Date
Dec 31, 2020
Grant Year
2019
Program Code
[A7301] - National Robotics Initiative
Project Director
Li, C.
Recipient Organization
UNIVERSITY OF GEORGIA
200 D.W. BROOKS DR
ATHENS, GA 30602-5016
Performing Department
COLLEGE OF ENGINEERING
Non Technical Summary
The genomics revolution provides unprecedented power to engineer new and advanced crop cultivars with the gene combinations needed to sustain the rapidly increasing world population amid climate change. Genomic information can now be obtained relatively quickly and inexpensively for thousands of genotypes in plant breeding and selection programs. However, relating these molecular signatures to key differences in phenotype has been laborious, expensive, and imprecise. As such, rapid and repeatable measurement of phenotypic crop parameters is a major bottleneck in plant breeding programs. High-throughput phenotyping technologies that can quickly and repeatedly scan tens of thousands of individuals using an array of advanced sensor and data analytics tools are critical to improving the ability of scientists to dissect the genetics of quantitative traits such as yield and stress tolerance. The proposed project, to develop a robot-assisted field-based high throughput phenotyping system that integrates both ground and unmanned aerial elements to quantitatively measure a suite of key traits iteratively throughout the growing season, is expected to unmask plant responses that will inform a new level and quality of decision making in selection of crop genotypes for specific production conditions. The task coordination between ground and aerial vehicles will result in new discoveries in the area of partitioning and coverage control. Employing the proposed high throughput phenotyping system to acquire data on an unprecedented scale could address challenges that are unique to improvement of cotton (the focal crop), as well as general constraints on the improvement of most crops.
More than 190,000 domestic jobs are related to cotton processing, with an estimated aggregate influence of ~$100 billion/yr on the US gross domestic product. This project will accelerate translation of cotton research into commercially viable products. Moreover, elements of the robotic system can be applied to many other crops, contributing to safe, high-quality agricultural products to sustain the growing population.
The project is closely tied to the education of students and the public at many levels. At the graduate level, the PIs will integrate the robotic programming and sensor interfacing into existing courses. Furthermore, the PIs will encourage motivated undergraduate students to pursue research projects through the UGA CURO program. Finally, the PIs will attract underrepresented students through programs such as the Peach State Louis Stokes Alliance for Minority Participation, an NSF-funded program to draw minority undergraduates into STEM fields.
Animal Health Component
30%
Research Effort Categories
Basic
30%
Applied
30%
Developmental
40%
Classification

Knowledge Area (KA)  Subject of Investigation (SOI)  Field of Science (FOS)  Percent
402                  1719                            2020                    70%
201                  1719                            1081                    30%
Goals / Objectives
The proposed project has four significant aims:
1. Develop ground and aerial robotic systems equipped with advanced sensors (LiDAR, 3D, color, thermal, and multispectral) and algorithms for measuring plant phenotypic traits;
2. Investigate innovative coverage control algorithms that can handle heterogeneous groups of vehicles (ground and aerial) with different capabilities that cooperatively aim at covering the assigned area;
3. Design convolutional neural networks for phenotypic trait extraction from the acquired images;
4. Validate the robotic system in the field and associate genotype and phenotype data for crop improvement.
Project Methods
Ground robotic systems equipped with LiDAR and thermal cameras will be designed to measure plant phenotypic traits (plant architecture and canopy temperature) in proximity. Unmanned aerial systems equipped with color, multispectral, and thermal cameras will be further integrated to acquire traits (flowering, canopy coverage, plant biotic and abiotic stresses) with high temporal resolution. Structure from motion algorithms will be utilized to reconstruct plant 3D structure; GPS positions and features identified by the scale-invariant feature transform (SIFT) algorithm between consecutive images will be used to develop image-stitching algorithms. Wireless data transfer, sensor calibration methods, and an overall data processing pipeline will be developed.

A deep learning convolutional neural network will be studied to identify flowers and stressed plants from multispectral images. The topology of the network and relevant hyper-parameters will be optimized and adapted specifically to high throughput phenotyping. A strong computational core of this research will be to develop a suite of analytical tools aimed at successful deployment of teams of heterogeneous vehicles (ground and aerial) that collaboratively collect different types of data, while minimizing time and energy consumption and mitigating practical constraints pertinent to phenotyping.

The genetic hypothesis underlying the proposed research is that detailed genetic and phenotypic analysis of wild relatives will reveal new alleles that can accelerate cotton improvement. The project capitalizes on a new nested association mapping resource for cotton, using an approach that combines QTL discovery with fine-scale mapping and gene identification.
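The GPS-plus-feature image-stitching step above can be sketched as follows. The matched keypoint coordinates, the blending weight, and both function names are illustrative assumptions; in practice the correspondences would come from a SIFT detector and matcher rather than being hard-coded.

```python
# Sketch: estimating the shift between two consecutive aerial images from
# matched feature points, the core step of the GPS-assisted stitching the
# methods describe. Keypoint coordinates here are invented for illustration;
# in practice they would come from SIFT detection and matching.

def estimate_translation(pts_a, pts_b):
    """Least-squares translation (dx, dy) mapping pts_a onto pts_b."""
    n = len(pts_a)
    dx = sum(bx - ax for (ax, _), (bx, _) in zip(pts_a, pts_b)) / n
    dy = sum(by - ay for (_, ay), (_, by) in zip(pts_a, pts_b)) / n
    return dx, dy

def stitch_offset(gps_offset, feature_offset, feature_weight=0.8):
    """Blend a coarse GPS-derived offset with the feature-based estimate."""
    return tuple(feature_weight * f + (1 - feature_weight) * g
                 for f, g in zip(feature_offset, gps_offset))

# Hypothetical matched keypoints (image A -> image B), consistent with a
# shift of roughly (100, -20) pixels.
pts_a = [(10, 50), (200, 80), (120, 300), (330, 210)]
pts_b = [(111, 29), (299, 61), (221, 279), (430, 191)]
print(estimate_translation(pts_a, pts_b))
```

In a full pipeline the feature-based estimate dominates when many matches are found, while the GPS offset keeps the mosaic anchored where vegetation is too uniform to match features reliably.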

Progress 01/01/17 to 12/31/20

Outputs
Target Audience: Plant breeders, geneticists, growers, engineers, and data scientists working on high throughput plant phenotyping. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? In the Bio-sensing and Instrumentation Laboratory, this project supported and provided training opportunities for three doctoral students (Dr. Yu Jiang, Dr. Shangpeng Sun, and Dr. Rui Xu) and one master's student (Mr. Jawad Iqbal). All of the graduate students graduated successfully and accomplished their research goals (focusing on Tasks 1, 3, and 4). Two doctoral students trained by the project have become faculty members at top-tier research universities (Dr. Jiang is an Assistant Research Professor at Cornell University, and Dr. Sun is an Assistant Professor at McGill University). In addition, the project provided training opportunities for many undergraduate students through CURO (Center for Undergraduate Research Opportunities) projects at the University of Georgia. PI Li has been a faculty mentor of the student robotics team for the ASABE International Student Robotics Competition for the past four years. The team won the second place award in the advanced division in 2018. As a result of the enthusiasm generated by the robotics competition, the students trained by this project formed the UGA Student Robotics Club. In addition, two robotics courses (ELEE 4280/6280 Introduction to Robotics Engineering and ELEE 8XXX Autonomous Mobile Robots and Manipulators) were developed and offered to both undergraduate and graduate students at the UGA College of Engineering during the performance period of the project. In PI Velni's lab, one postdoctoral fellow (Dr. Mohammadreza Davoodi), one graduate student (Ms. Saba Faryadi), and one undergraduate student (Mr. Michael Buzzy) received training and worked on achieving the main objectives of Task 2 of the project (described under Accomplishments).
PI Velni worked very closely with the postdoc, Dr. Davoodi, and provided mentorship to prepare him for a future faculty position (Dr. Davoodi is now a Research Scientist at the University of Texas). The postdoc partially mentored both graduate and undergraduate students. Along with PI Velni, the postdoc and graduate student attended conferences and interacted with peers. The postdoc (along with PI Velni) attended and delivered a keynote talk at the first Southeast Controls Conference at Georgia Tech in November 2019; the details of his talk can be found at https://scc.ece.gatech.edu/events/talk26.html. Finally, the graduate student assisted PI Velni in preparing lab sessions for a freshman-level workshop he teaches every Fall semester on the "applications of feedback control." In the Plant Genome Mapping Laboratory, the primary person working on this project was Mr. Jon Robertson, a B.S.-level Research Coordinator with about a decade of experience in our group. Lesser roles were played by Dr. Min Liu (genotyping), Dr. Wenqian Kong (data analysis), Dr. Tariq Shehzad, Ms. Ellen Skelton (M.S. graduate student), Mr. Jeevan Adhikari (Ph.D. graduate student), Ms. Wiriyanat Ployaram (M.S. graduate student, data analysis), and others in DNA sampling and extraction. How have the results been disseminated to communities of interest? The outcomes of the project have been published in peer-reviewed scientific journals. In addition, presentations were given at national and international conferences, as well as in the form of seminars given at UGA and beyond. What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? Objective 1: PI Li's Bio-sensing and Instrumentation Laboratory (BSAIL) developed two types of mobile robots for in-field high throughput phenotyping: a differential-drive inter-row robot and a modular agricultural robotic system (MARS) with 4-wheel drive and 4-wheel steering. The control software is designed using the Robot Operating System (ROS) with three modules: a control module, a navigation module, and a vision perception module. Two MARS designs were implemented: a low-cost, lightweight robotic system named MARS mini and a heavy-duty robot named MARS X. The MARS X was tested for its performance and navigation accuracy with satisfactory results, and both MARS mini and MARS X were validated in two field tests. The ground system was evaluated by scanning 23 cotton genotypes for quantification of canopy growth and development. In addition, we developed an unmanned aerial system (UAS) that integrates color, multispectral, and thermal cameras and a LiDAR sensor. The acquisition software was designed based on ROS and can visualize and record data. A data processing pipeline was proposed to preprocess the raw data and extract phenotypic traits at the plot level, including morphological traits (canopy height, canopy cover, and canopy volume), canopy vegetation indices, and canopy temperature. The multispectral and thermal cameras were calibrated in the lab and in the field, and the system was validated through field data collection in a cotton field. Objective 2: PI Velni's lab accomplished the following goals: 1) To deploy a team of heterogeneous, autonomous robots in a partially known agricultural field, the field was first represented as a weighted directed graph; then, a new partitioning algorithm was delivered that was capable of capturing the heterogeneity of robots in terms of speed, sensing capability, and onboard power.
Next, a distributed deployment strategy was proposed to optimally distribute robots on the graph to monitor specified regions of interest in the environment. It was proved that the proposed combined partitioning and deployment strategy was optimal in the sense that any partition other than the proposed one resulted in a larger coverage cost. 2) The distributed coverage control problem was examined for deploying a team of heterogeneous robots with nonlinear dynamics in a partially known environment modeled as a weighted mixed graph. By defining an optimal tracking control problem, using a discounted cost function and a state-dependent Riccati equation approach, a new partitioning algorithm was proposed to capture the heterogeneity in robots' dynamics. It was shown that the size of the subgraph associated with each robot depended on its resources and capabilities in comparison to its neighbors. Also, a distributed deployment strategy was proposed to optimally distribute robots aiming at persistently monitoring specified regions of interest. 3) Another objective was to build precision maps, which are useful in agricultural farm management, using a fleet of unmanned ground vehicles (UGVs). Having an environmental model that includes geo-referenced data facilitates the deployment of multi-robot systems, which have emerged in precision agriculture. For this purpose, a reinforcement learning-based method was presented for a team of UGVs to cooperatively learn an unknown dynamic field (i.e., an agricultural field). The problem of interest was to deploy UGVs to map plant rows, find obstacles whose locations are not known a priori, and define regions of interest in the field (e.g., areas with high water stress). Once an environment model was built, UGVs were then distributed to provide full coverage of plants and update the reconstructed map simultaneously.
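The heterogeneity-aware partitioning idea above can be illustrated with a small sketch: each cell of a field graph is assigned to whichever robot reaches it at the lowest speed-scaled cost, so more capable robots receive larger subgraphs. The graph, speeds, depot locations, and function name are illustrative assumptions, not the project's actual algorithm (which also accounts for sensing capability, onboard power, and dynamics).

```python
# A minimal sketch of heterogeneity-aware graph partitioning: multi-source
# Dijkstra where each robot's travel cost is divided by its speed, so faster
# robots "win" more cells. All values here are invented for illustration.
import heapq

def partition(graph, robots):
    """graph: {node: {neighbor: distance}}, robots: {name: (depot, speed)}.
    Returns {node: robot} assigning each node to its cheapest robot."""
    pq = [(0.0, depot, name) for name, (depot, _speed) in robots.items()]
    heapq.heapify(pq)
    owner = {}
    while pq:
        cost, node, name = heapq.heappop(pq)
        if node in owner:
            continue  # already claimed at lower cost by some robot
        owner[node] = name
        speed = robots[name][1]
        for nbr, dist in graph[node].items():
            if nbr not in owner:
                heapq.heappush(pq, (cost + dist / speed, nbr, name))
    return owner

# A 1-D strip of 6 cells; the fast robot (speed 2) starts at cell 0,
# the slow one (speed 1) at cell 5.
strip = {i: {j: 1.0 for j in (i - 1, i + 1) if 0 <= j <= 5} for i in range(6)}
robots = {"fast": (0, 2.0), "slow": (5, 1.0)}
print(partition(strip, robots))
```

On this strip the robot with twice the speed ends up owning four of the six cells, mirroring the reported property that a robot's subgraph size reflects its capabilities relative to its neighbors.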
4) A team-based approach was also proposed to minimize a locational cost function, defined with respect to various regions of interest in the field, while each region was covered by its intended agents. For a large field with a swarm of autonomous agents, the field was first partitioned into smaller regions among teams using the so-called Power diagram, in such a way that larger regions were assigned to teams with higher capabilities. Each team's assigned region was then partitioned among its members by so-called multiplicatively-weighted (MW) Voronoi diagrams with guaranteed collision avoidance. A distributed control law was then developed based on partitioning at the team and agent levels to guarantee the convergence of agents to locally optimal positions. Results of all the above achievements were published in peer-reviewed journal and conference papers. Objective 3: PI Li's BSAIL lab investigated and developed deep learning (DL)-based data analytics approaches for plant phenotyping. For example, a custom 2-layer convolutional neural network (CNN) was trained to determine the presence of white cotton blossoms (flowers) in aerial RGB images, leveraging 2D-to-3D projection to avoid double counting. In another example, a deep learning-based approach (DeepFlower) was developed to detect and count emerging blooms in images collected using the ground platform. The numbers of emerging blooms per plant per day over the flowering period were used to derive flowering curves for individual plants. Statistical analyses showed that imaging-derived flowering characteristics were as effective as manual assessment for identifying differences among genetic categories or genotypes. In another example, we developed the DeepSeedling framework, which uses a multi-object tracking approach and deep learning to detect and track cotton seedlings across video frames.
The seedling counts predicted by the tracking-by-detection approach were highly correlated (R2 = 0.98) with those found through human field assessment for 75 test videos collected in multiple locations over multiple years. In addition, Li's lab developed data analytics on 3D point cloud data for cotton boll mapping, internode distance measurement, and branching pattern characterization. Objective 4: The robotic systems and data analytics have been extensively tested in the field over the past four years. In each of three years, we grew out three populations for QTL mapping of phenotypes collected using the HTP system. These three populations were chosen from about 30 candidates to sample maximal variation between elite cultivated cottons and wild relatives. There was highly significant genetic variation (and also some non-genetic variation due to factors such as blocking and seed source) for all fiber quality parameters, reflecting the high degree of genetic variation among and within these three populations. There was also highly significant variation in canopy cover and volume measurements taken using the HTP system. The second measurement (10 July) showed the highest significance for genetic variation. Over the subsequent measurements, there continued to be significant genetic variation, but it weakened as the season progressed, dropping below significance near the end. There were also high correlations among consecutive measurements. These data, albeit based on only a single season so far, suggested that: (1) canopy cover and volume measurements were both heavily influenced by genetic factors; and (2) canopy establishment may have been related to yield (there was a significant correlation); however, once the canopy was established there was relatively little subsequent change.
DNA was prepared from each of the ~500 families in the three mapping populations, the families were genotyped using a genotyping-by-sequencing system published by the Paterson lab, and genetic maps were constructed for the populations. We are now completing QTL mapping of the measured traits, determining the complexity of genetic control of each trait, identifying diagnostic DNA markers, and investigating whether different traits have partly common or independent genetic control.
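As a rough illustration of the "significant genetic variation" analyses described above, a one-way ANOVA F-statistic for a genotype effect on a plot-level trait such as canopy cover can be computed as below. The genotype groups and trait values are invented for illustration and are not the project's data.

```python
# Sketch: one-way ANOVA F-statistic for a genotype effect on a trait.
# groups: one list of trait values per genotype (invented numbers).

def one_way_anova_f(groups):
    """F = between-genotype mean square / within-genotype mean square."""
    k = len(groups)                        # number of genotypes
    n = sum(len(g) for g in groups)        # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three hypothetical genotypes with clearly different canopy-cover means.
genotypes = [[0.52, 0.55, 0.53], [0.61, 0.64, 0.62], [0.71, 0.69, 0.73]]
print(round(one_way_anova_f(genotypes), 1))
```

With these numbers the genotype effect dominates the within-genotype scatter (F of roughly 81 on 2 and 6 degrees of freedom), the kind of pattern the report describes as highly significant genetic variation.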

Publications

  • Type: Journal Articles Status: Submitted Year Published: 2021 Citation: Xu, R., Changying Li. 2021. Development of a Multi-sensor Unmanned Aerial System for High Throughput Phenotyping. Remote Sensing.
  • Type: Journal Articles Status: Under Review Year Published: 2021 Citation: Xu, R., Changying Li. 2020. Development of the modular agricultural robotic system (MARS): concept and implementation. Journal of Field Robotics.
  • Type: Journal Articles Status: Accepted Year Published: 2020 Citation: Sun, S., C. Li, P. W. Chee, A. H. Paterson, C. Meng, J. Zhang, P. Ma, J. S. Robertson, J. Adhikari. 2020. High Resolution 3D Terrestrial LiDAR for Cotton Plant Main Stalk and Node Detection. Computers and Electronics in Agriculture
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Jiang Y, Li C, Xu R, Sun S, Robertson JS, Paterson AH. 2020. DeepFlower: a deep learning-based approach to characterize flowering patterns of cotton plants in the field. Plant Methods. 16(1):156.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Iqbal, J., Xu, R., Halloran, H., & Li, C. (2020). Development of a Multi-Purpose Autonomous Differential Drive Mobile Robot for Plant Phenotyping and Soil Sensing. Electronics, 9(9), 1550.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Iqbal, J., Xu, R., Sun, S., & Li, C. (2020). Simulation of an Autonomous Mobile Robot for LiDAR-Based In-Field Phenotyping and Navigation. Robotics, 9(2), 46.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Jiang, Y., C. Li. 2020. Convolutional neural networks for image-based high throughput plant phenotyping: A review. Plant Phenomics. Volume 2020, Article ID 4152816, 22 pages.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Jiang, Y., Snider, J. L., Li, C., Rains, G. C., & Paterson, A. H. 2020. Ground Based Hyperspectral Imaging to Characterize Canopy-Level Photosynthetic Activities. Remote Sensing, 12(2), 315.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: Sun, S., C. Li, P. Chee, A. Paterson, Y. Jiang, R. Xu, J. Robertson, J. Adhikari, T. Shehzad. 2020. Three-dimensional Mapping of Cotton Bolls in situ Based on Point Cloud Segmentation and Clustering. ISPRS Journal of Photogrammetry and Remote Sensing. 160: 195-207.
  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Jiang, Y., C. Li, A. Paterson, J. Robertson. 2019. DeepSeedling: Deep convolutional network and Kalman filter for plant seedling detection and counting in the field. Plant Methods. 15 (1):141.
  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Sun, S., C. Li, A. Paterson, and P. Chee. 2019. Image processing algorithms for infield single cotton boll counting and yield prediction. Computers and Electronics in Agriculture. 166, 104976.
  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Jiang, Y., Li, C., Takeda, F., Kramer, E. A., Ashrafi, H., & Hunter, J. (2019). 3D point cloud data to quantitatively characterize size and shape of shrub crops. Horticulture Research, 6(1), 43.
  • Type: Journal Articles Status: Published Year Published: 2019 Citation: Xu, R, C. Li, and A. H. Paterson. Multispectral imaging and unmanned aerial systems for cotton plant phenotyping. PLOS One 14.2 (2019): e0205083.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Sun, S., C. Li, A.H. Paterson, Y. Jiang, R. Xu, J. Robertson, J. Snider, and P. Chee. 2018. In-field high throughput phenotyping and cotton plant growth analysis using LiDAR. Frontiers in Plant Science. 9, 16.
  • Type: Journal Articles Status: Published Year Published: 2017 Citation: Xu, R., C. Li, A.H. Paterson, Y. Jiang, S. Sun, J. Robertson. 2017. Cotton bloom detection using aerial images and convolutional neural network. Frontiers in Plant Science. 8, 2235.
  • Type: Journal Articles Status: Published Year Published: 2017 Citation: Jiang, Y., C. Li, A.H. Paterson, S. Sun, R. Xu, and J. Robertson. 2017. Quantitative analysis of cotton canopy size in field conditions using a consumer-grade RGB-D camera. Frontiers in Plant Science. 8, 2233.
  • Type: Journal Articles Status: Published Year Published: 2017 Citation: Jiang, Y., C. Li, A.H. Paterson, J. Robertson, S. Sun, and R. Xu. GPhenoVision: A ground mobile system with multi-modal imaging for field-based high throughput phenotyping of cotton. Scientific Reports. doi: 10.1038/s41598-018-19142-2.
  • Type: Journal Articles Status: Published Year Published: 2017 Citation: Sun, S., C. Li, and A. H. Paterson. 2017. In-field high-throughput phenotyping of cotton plant height using LiDAR. Remote Sensing, 9(4): 377.
  • Type: Journal Articles Status: Published Year Published: 2021 Citation: M. Davoodi, S. Faryadi, and J. Mohammadpour Velni, A graph theoretic-based approach for deploying heterogeneous multi-agent systems with application in precision agriculture, Journal of Intelligent & Robotic Systems, 101(10), 2021.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: S. Faryadi and J. Mohammadpour Velni, A reinforcement learning-based approach for modeling and coverage of an unknown field using a team of autonomous ground vehicles, International Journal of Intelligent Systems, published online on Nov. 2020.
  • Type: Journal Articles Status: Published Year Published: 2020 Citation: M. Davoodi and J. Mohammadpour Velni, Heterogeneity-aware graph partitioning for distributed deployment of multi-agent systems, IEEE Transactions on Cybernetics, published online on July 2020.
  • Type: Journal Articles Status: Published Year Published: 2019 Citation: A. Mesbahi, F. Abbasi, and J. Mohammadpour Velni, A team-based deployment approach for heterogeneous mobile sensor networks, Automatica, 106: 327-338, 2019.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: M. Davoodi, J. Mohammadpour Velni, and C. Li, Coverage control with multiple ground robots for precision agriculture, ASME. Mechanical Engineering Magazine, 140(06): S4-S8, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2020 Citation: S. Faryadi, M. Davoodi, and J. Mohammadpour Velni, Optimal path planning for a team of heterogeneous drones to monitor agricultural fields, in Proc. ASME Dynamic Systems and Control Conference (DSCC), Pittsburgh, PA, Oct. 2020.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: S. Faryadi, M. Davoodi, and J. Mohammadpour Velni, Agricultural field coverage using cooperating unmanned ground vehicles, in Proc. ASME Dynamic Systems and Control Conference (DSCC), Park City, UT, Oct. 2019.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2019 Citation: S. Faryadi, M. Davoodi, and J. Mohammadpour Velni, Autonomous real-time monitoring of crops in controlled environment agriculture, in Proc. ASME Dynamic Systems and Control Conference (DSCC), Park City, UT, Oct. 2019.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: R. Xu, C. Li, and J. Mohammadpour Velni, Development of an autonomous ground robot for field high throughput phenotyping, in Proc. 6th IFAC Conference on Bio-Robotics, pp. 70-74, Beijing, China, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: F. Abbasi, A. Mesbahi, J. Mohammadpour Velni, and C. Li, Team-based coverage control of moving sensor networks with uncertain measurements, in Proc. IEEE American Control Conference (ACC), pp. 852-857, Milwaukee, WI, 2018.


Progress 01/01/17 to 12/31/17

Outputs
Target Audience: The target audiences include plant breeders, plant scientists, growers, and engineers. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? In the Bio-Sensing and Instrumentation Lab, two doctoral students have been primarily supported by the grant: Mr. Rui Xu (ground and aerial robot development) and Mr. Shangpeng Sun (3D point cloud data analysis and deep learning methods development). The two students will continue to be supported by the grant in the third year. In the Complex Systems Control Lab, a postdoctoral researcher, Dr. Mohammadreza Davoodi, and an M.S. student, Ms. Saba Faryadi, were the primary personnel assigned to the project and worked on the coverage control and path planning-related tasks. We expect the same two people to be supported in the last year of the grant. In the Plant Genome Mapping Laboratory, the primary person working on this project was Mr. Jon Robertson, a B.S.-level Research Coordinator with about a decade of experience in our group. Lesser roles were played by Dr. Wenqian Kong (data analysis), Dr. Tariq Shehzad, Ms. Ellen Skelton (M.S. graduate student), Mr. Jeevan Adhikari (Ph.D. graduate student), and others in DNA sampling and extraction. How have the results been disseminated to communities of interest? We have made promising progress in the first two years and will continue to make progress toward our goal for the remainder of the project. We will disseminate our findings by publishing research papers and presenting our work at professional meetings. What do you plan to do during the next reporting period to accomplish the goals? In the following year, we will focus on integrating the ground and aerial robotic systems into ROS, developing a robust communication protocol between robots, and testing the coverage control algorithms and robotic systems both in the lab and in the field.
We will also develop data analysis pipelines using deep learning methods and a point cloud library. Regarding plant trait measurement, based on the year 1 analysis of plant height measurements, it was decided to repeat the measurements in year 2. Measuring once every two weeks seems sufficient for the first few measurement dates, then once per month later in the season. Further, DNA was prepared from each of the ~500 families in the three mapping populations in year 1 and is slated for genotyping. This will permit us to undertake QTL mapping of the measured traits, determining the complexity of genetic control of each trait, identifying diagnostic DNA markers, and investigating whether different traits have partly common or independent genetic control.

Impacts
What was accomplished under these goals? Objective 1: Robotic and sensing algorithm development for plant phenotypic trait measurement (Li Lab). 1) We designed an autonomous ground robot for in-field high-throughput phenotyping in the first year. The GPS-guided autonomous navigation system allows the robot to navigate through user-defined waypoints in the field. We tested the robot in a cotton field and showed that the collected color images can be used to construct a detailed 3D point cloud of cotton plants. In addition, an unmanned aerial system (UAS) with multiple sensors (color, multispectral, and thermal imaging) was also developed. Currently, an airborne LiDAR is being integrated into the UAS using the Robot Operating System (ROS). 2) A terrestrial 2D laser ranging sensing system with an RTK-GPS was developed to scan plants from overhead in the field. Precise 3D models of the scanned plants were reconstructed from the point cloud data to measure three plot-level morphological traits: plant height, projected canopy area, and plant volume. Validation experiments showed good agreement between LiDAR measurements and manual measurements for all three morphological traits, with R2 > 0.97. 3) We developed an in-field calibration method for aerial thermal imaging to correct for atmospheric effects. We designed a calibration panel that maintains a constant temperature using an active heating system. The calibration panel and one validation target were placed on the ground, and aerial thermal images were collected at 20 m above the ground. The calibrated temperature reduced the error of the validation target from 0.5°C to 0.2°C. The developed robotic systems and data processing pipelines using various sensors demonstrated high throughput data collection and analysis. Objective 2: Coverage control algorithms that can handle heterogeneous robots (Velni Lab).
We worked primarily on: 1) developing a new coverage control algorithm for a group of autonomous ground vehicles (robots), and 2) validating the developed coverage algorithm in a lab-scale test bed. For the former, the field was first represented by a weighted directed graph. The important areas of the field were detected and identified on the graph using a distributed density function. Next, this information was sent as input to a distributed energy-aware deployment algorithm. Based on the proposed strategy, the robots are deployed in such a way that coverage of the whole field and monitoring of the areas of interest are maximized, so that robots can acquire more measurements of those areas. For the latter, the developed coverage control algorithm was implemented and validated experimentally in a lab-scale test bed recently built in Velni's lab. The design of the proposed coverage algorithm is significant because it allows coordination between robots to quickly and efficiently cover the entire field and collect measurements. The coverage control strategy proposed by the team is the first of its kind for agricultural applications. Although tested so far only in the lab environment, our algorithm incorporates characteristics of the problem pertinent to field-based agriculture, facilitating its use in the field, which is our plan for the last year of the project. The project has so far led to two articles (one in conference proceedings and another in a magazine), as well as a few currently in the pipeline. Objective 3: Phenotypic trait extraction using deep learning (Li Lab). 1) We developed a convolutional neural network (CNN) to detect and count cotton flowers using color images acquired by an unmanned aerial system.
Cotton flowers were initially detected by the CNN in raw images, and their 3D locations were calculated using the dense point cloud constructed from the aerial images with the structure-from-motion method. The bloom count using the CNN and aerial images was comparable with the manual count, with an error of -4 to 3 blooms for a field with a single plant per plot. 2) We developed and evaluated a deep learning approach with thermal imaging to measure plant temperature in the field. Thermal images were collected using GPhenoVision, a high-throughput phenotyping system developed by Li and Paterson. Mask R-CNN and a thresholding-based approach were proposed for plant localization, segmentation, and temperature extraction. Experiments showed that Mask R-CNN outperformed the thresholding approach on all tasks, resulting in a significant improvement in the accuracy of extracted plant temperatures. The proposed approach, combining deep learning and thermal imaging, is an accurate and effective tool for measuring plant temperature in the field, which could advance breeding programs and genetic studies of heat-tolerant genotypes.

Objective 4: Associate genotype and phenotype data for crop improvement (Paterson Lab). We are in the second year of growing out three populations for QTL mapping of phenotypes collected using the proposed HTP system. These three populations were chosen from about 30 candidates to sample maximal variation between elite cultivated cottons and wild relatives. Based on the now-complete year 1 growout, there was highly significant genetic variation (as well as some non-genetic variation, such as blocking factors and seed source) for all fiber quality parameters, reflecting the high degree of genetic variation among and within these three populations. There was also highly significant variation in the canopy cover and volume measurements taken using the HTP system. The second measurement (10 July) showed the highest significance for genetic variation.
Over the subsequent measurements, there continued to be significant genetic variation, but it grew progressively weaker as the season progressed, dropping below significance near the end. There were also high correlations among consecutive measurements. These data, albeit based on only a single season so far, suggest that: 1) both canopy cover and volume measurements were heavily influenced by genetic factors; and 2) canopy establishment may have been related to yield (there was a significant correlation), but once the canopy was established there was relatively little subsequent change. Further investigation is needed to determine whether there is some 'threshold' beyond which additional canopy cover no longer increases yield.
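The significance tests described under Objective 4 are, at their core, one-way analyses of variance across genotypes. As a minimal illustration with made-up numbers (and ignoring the blocking and seed-source factors included in the actual analysis), the F statistic comparing between-genotype to within-genotype variation can be computed as:

```python
def f_statistic(groups):
    """One-way ANOVA F statistic: between-group variance over
    within-group variance, e.g. plot-level canopy cover grouped
    by genotype. `groups` is a list of lists of measurements."""
    k = len(groups)                  # number of genotypes
    n = sum(len(g) for g in groups)  # total observations
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical canopy-cover values for two genotypes: a large F means
# between-genotype variation dominates plot-to-plot noise.
f = f_statistic([[1.0, 2.0, 3.0], [7.0, 8.0, 9.0]])
```

The resulting F is compared against an F distribution with (k-1, n-k) degrees of freedom to obtain a p-value; a real analysis of these data would instead fit a model that also includes the blocking and seed-source terms.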

Publications

  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Sun, S., C. Li*, A.H. Paterson, Y. Jiang, R. Xu, J. Roberson, J. Snider, and P. Chee. 2018. In-field high throughput phenotyping and cotton plant growth analysis using LiDAR. Frontiers in Plant Science. 9, 16.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Xu, R., C. Li*, A.H. Paterson, Y. Jiang, S. Sun, J. Roberson. 2017. Cotton bloom detection using aerial images and convolutional neural network. Frontiers in Plant Science. 8, 2235.
  • Type: Journal Articles Status: Published Year Published: 2018 Citation: Davoodi, M.R., J.M. Velni, and C. Li. 2018. Coverage control with multiple ground robots for precision agriculture. ASME Mechanical Engineering Magazine Select Articles. 140(6), S4-S8.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Jiang, Y., L. Shuang, C. Li, A. Paterson, and J. Robertson. Deep learning for thermal image segmentation to measure canopy temperature of Brassica oleracea in the field. ASABE Annual International Meeting Paper No: 1800305. Detroit, Michigan. July 29-August 1, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Sun, S., C. Li, A. Paterson, Y. Jiang, J. Robertson. 3D computer vision and machine learning based technique for high throughput cotton boll mapping under field conditions. ASABE Annual International Meeting Paper No: 1800677. Detroit, Michigan. July 29-August 1, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Xu, R., C. Li, A. Paterson. Develop an in-field calibration method for aerial thermal imaging: preliminary results. ASABE Annual International Meeting Paper No: 1800830. Detroit, Michigan. July 29-August 1, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Xu, R., C. Li, J. Velni. Development of an autonomous ground robot for in-field high throughput phenotyping. IFAC Bio-Robotics Conference. Beijing, July 13-16, 2018.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2018 Citation: Abbasi, F., A. Mesbahi, J.M. Velni, and C. Li. Team-based coverage control of moving sensor networks with uncertain measurements. 2018 Annual IEEE American Control Conference (ACC), pp. 852-857.