Progress 02/01/17 to 01/31/21
Outputs Target Audience: Crop research professionals in all crops and with a broad range of objectives, especially plant breeding and the seed industry; engineers in the fields of agricultural and civil engineering focused on remote sensing, photogrammetry, and sensors for agriculture and agronomy. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Ground reference data collection and the image processing pipeline have employed or engaged ~15 undergraduate students, 1 hourly graduate student, and 4 full-time graduate research assistants. Students were trained to work both in the field (agronomic research and UAS flights) and in the computer lab; through these experiences they learned skills related to the operation of unmanned aircraft systems and the sampling of spectral and phenotypic traits and biomass, including aspects of experimental design, data management, and data analysis. Some of these students worked with the team as hourly employees, and some as part of independent undergraduate research projects. One of the undergraduate students participated through the Purdue University Summer Research Opportunities Program (SROP) and another through the Purdue Engineering Summer Undergraduate Research Fellowship (SURF) program; both programs are designed to recruit undergraduate students into graduate programs by providing projects and mentorship. Three students participated as part of a Senior Design project through the Department of Agricultural and Biological Engineering, with one student taking his work to the annual ASABE conference, where he won the ITSC Community Paper Award for his presentation. In addition, we have a long list of multidisciplinary published outputs, with more such outputs in progress or draft form. How have the results been disseminated to communities of interest? Several manuscripts have been published, and talks given on the tools, the results, and data management for UAS imagery. In the final year of the project, we migrated the software code base from MATLAB, which has an expensive commercial license, to the Python programming language.
This has made the code less expensive to implement and facilitated increased flexibility by allowing us to make use of the wide variety of open source image processing and analysis tools developed by the Python user community. What do you plan to do during the next reporting period to accomplish the goals?
Nothing Reported
Impacts What was accomplished under these goals?
Objective 1: As part of this project, we have developed a robust set of methods to collect imagery from unmanned aircraft systems (UAS) and the necessary ground reference data from experimental soybean plots on a regular basis throughout the growing season. These data are in turn fed through an analysis pipeline developed by the project team, resulting in the ability to extract images of individual plots within an experimental field site, calibrate those images, and calculate vegetation indices that can be used to phenotype the crop in the field. Imagery from a flight operation can now be fully processed within 24 hours, and we can support multiple flights at multiple locations using several cameras (RGB and multispectral) each week of the field season. The processing pipeline starts with commercially available image processing software, such as Pix4DMapper, that orients the UAS imagery and constructs a complete orthomosaic of the field experiment. Images are then run through the software tools developed as part of this project. • Crop Image Extraction (CIE) - CIE is the first step in the high-throughput phenotyping processing pipeline. CIE extracts plot images from designed experiments using RGB and multispectral imagery captured by unmanned aerial systems (UAS). The user configures the designed experiment into CIE by providing experimental metadata such as experiment location, number of rows and ranges, and size of plots. Once the design is configured, CIE segments the canopy and then accurately and precisely identifies plot midpoints, enabling automated and rapid extraction of plot images. CIE works entirely in MATLAB and can run batch processes on Linux computer clusters. • Vegetation Indices Derivation (VID) - VID is the second step in the data processing pipeline. VID uses image attributes (i.e., row, range, date, image band) with customized functions such as vegetation indices to quantify phenotypic traits from the extracted plot images.
The automated and efficient trie structure in VID allows for rapid processing. For instance, an experiment of approximately 400 plots takes less than 30 seconds to process for each vegetation index being evaluated. Moreover, VID can calibrate images from digital numbers to reflectance using different methods; image calibration is described below. VID also runs in MATLAB and can run batch processes on Linux computer clusters. Additionally, multispectral images are run through an image calibration process that requires the use of reflectance panels and/or field spectrometers. The panels are laid out on the field during flight operations. The reflectance panels reflect a specific and consistent percentage of light across the visible and near-infrared spectrum. Currently five panels are used: 7%, 12%, 22%, 36%, and 48%. Handheld field spectrometers are used as well to measure the reflectance of the panels as the multispectral data are being collected via the camera mounted to the UAS. The reflectance values from the panels, along with radiance values of the panels extracted from the generated orthomosaics, are used to fit an empirical line using the empirical line method (Smith et al., 1999). The values generated from the empirical equation can be used as inputs into VID to calibrate the images for accurate reflectance and indices output. In the final year of the project, we migrated the code base from MATLAB, which has an expensive commercial license, to the Python programming language. This has made the code less expensive to implement and facilitated increased flexibility by allowing us to make use of the wide variety of open source image processing and analysis tools developed by the Python user community. Objective 2: We established two methods of phenotyping above ground biomass.
1) A subset of the SoyNAM population (n=383) was grown in multi-environment trials, and destructive AGB measurements were collected along with multispectral and red, green, and blue (RGB) imaging with an unmanned aerial system from 27 to 83 days after planting (DAP). Machine-learning methods resulted in AGB predictions with high R2 values (0.92-0.94). Narrow-sense heritabilities estimated over time using random regression models (RRM) ranged from low to moderate (from 0.02 at 44 DAP to 0.28 at 33 DAP). AGB from adjacent DAP had the highest genetic correlations compared to DAP further apart. We observed high accuracies and low biases of prediction, indicating that genomic breeding values for AGB can be predicted over specific time intervals using RRM. Genomic regions associated with AGB varied with time, and no genetic markers were significant at all time points. In order to predict AGB for all DAP, including days when ground truth data were not available, we considered a linear model using the imagery features as the predictor variables within each environment across all observed DAP. We observed that the distribution of the residuals was highly asymmetric, suggesting that a linear model was not suitable to fit the data (Thoni et al. 1990). To correct the asymmetry we applied a Box-Cox transformation to the AGB, which led to the log-transformed values (data not shown; Box and Cox, 1964). The prediction of AGB was carried out using two different machine-learning methods: Least Absolute Shrinkage and Selection Operator (LASSO) regression (Tibshirani 1996) and Partial Least Squares Regression (PLSR) (Wold et al. 2001). For the PLSR, 10 principal components were selected so that the root mean squared error (RMSE) was minimized. The performance of the predictive models was evaluated using a 10-fold cross-validation strategy, in which the dataset was randomly divided into a training set (90% of the plots) and a validation set (10% of the plots).
The predictive accuracy of the model was measured by the coefficient of determination (R2), which is equal to the fraction of AGB variance explained by the model, and by the RMSE, which measures the average error magnitude. Pearson's correlation coefficient (r) was also considered to quantify the linear correlation between the observations and their estimates, as an indication of model prediction ability. Both models were implemented in the R software (R Core Team 2019), using the package caret (Kuhn 2008). 2) Using the same field experiment described above, the crop field was reconstructed in 3D from temporal RGB images by image-based modeling. Algorithms were combined to compute the canopy volume of each plot in cubic meters. Within-plot height variation was quantified to improve the biomass correlations. The 3D models use a 3D Delaunay triangulation algorithm (Golias and Dutton, 1997) applied to the variability of multiple height measurements per plot quantified from the point cloud. The advantage of this method is that it uses RGB images, so radiometric calibration is not required. Objective 3: Across approximately 10,000 soybean progeny rows over two years, we have used UAS canopy coverage data and/or yield to select lines using different selection criteria. These lines have been tested in preliminary and advanced yield trials in our breeding pipeline, and a student is working on the manuscript.
Publications
- Type:
Journal Articles
Status:
Published
Year Published:
2019
Citation:
F.F. Moreira, A.A.A. Hearst, K.A. Cherkauer, and K.M. RAINEY. 2019. Improving the efficiency of soybean breeding with high-throughput canopy phenotyping. Plant Methods 15, 139. doi:10.1186/s13007-019-0519-4.
- Type:
Journal Articles
Status:
Published
Year Published:
2019
Citation:
M. Herrero-Huerta*, S. Govindarajan, K.A. Cherkauer, and K.M. RAINEY (2019) Triple S: A new tool for Soybean High Throughput Phenotyping from UAS-based Multispectral Imagery. Proc. SPIE 11007, Advanced Environmental, Chemical, and Biological Sensing Technologies XV, 110070K. DOI: 10.1117/12.2519376.
- Type:
Journal Articles
Status:
Accepted
Year Published:
2020
Citation:
B. Lyu, S.D. Smith, Y. Xue, K.M. RAINEY, K.A. Cherkauer. 2020. Deriving Vegetation Indices from High-throughput Images by Using Unmanned Aerial Systems in Soybean Breeding. Transactions of the ASABE.
- Type:
Journal Articles
Status:
Awaiting Publication
Year Published:
2020
Citation:
Herrero-Huerta, Mónica & Rodríguez-Gonzálvez, Pablo & Rainey, Katy. (2020). Yield Prediction by Machine Learning from UAS-Based Multi-Sensor Data Fusion in Soybean. doi:10.21203/rs.3.rs-16958/v1.
- Type:
Journal Articles
Status:
Under Review
Year Published:
2020
Citation:
Moreira, F., Oliveira, H., Lopez, M., Abughali, B., Gomes, G., Cherkauer, K., Brito, L., and K.M. Rainey (2020) High-throughput phenotyping and random regression models reveal temporal genetic control of soybean biomass production, under review for PNAS.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2020
Citation:
Proceedings Phenome 2020, Feb 24-28, Tucson, AZ.
K.M. RAINEY and F.F. Moreira. 2020. UAS Estimation and Genetic Architecture Field-based Soybean Biomass. Abstract was selected for an invited lecture. Contribution: Dr. Rainey was principal investigator.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
M. Herrero-Huerta* and K.M. RAINEY. High Throughput Phenotyping of Physiological Growth Dynamics from UAS-Based 3D Modeling in Soybean. ISPRS-International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 4213 (2019): 357-361. DOI: 10.5194/isprs-archives-XLII-2-W13-357-2019.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2020
Citation:
Proceedings Phenome 2020, Feb 24-28, Tucson, AZ.
M. Herrero-Huerta, K.M. RAINEY and F.F. Moreira. 2020. UAS Estimation and Genetic Architecture Field-based Soybean Biomass. Abstract was selected for an invited lecture. Contribution: Dr. Rainey was principal investigator.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Proceedings National Association of Plant Breeders 2019 Annual Meeting, Aug 25-29, Pine Mountain, GA., K.M. RAINEY*, M. Herrero-Huerta, S.D. Smith, B. Abughali, F.F. Moreira, M.A. Lopez, and K.A. Cherkauer. 2019 NIFA: 3D and precision soybean phenotypes from temporal UAS imagery of yield trials. Contribution: Dr. Rainey was principal investigator.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Proceedings National Association of Plant Breeders 2019 Annual Meeting, Aug 25-29, Pine Mountain, GA., F.F. Moreira*, M.A. Lopez, L. Brito, K.A. Cherkauer, and K.M. RAINEY. 2019. Combining high-throughput phenotyping and GWAS to reveal temporal genetic variation in soybean biomass. Contribution: Dr. Rainey was principal investigator.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2017
Citation:
K.M. RAINEY* and K.A. Cherkauer. 2017. Development of Analytical Tools for Drone-based Canopy Phenotyping in Crop Breeding. Proceedings National Association of Plant Breeders 2017 Annual Meeting, Aug 7-9, Davis CA. Contribution: Dr. Rainey was principal investigator.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2018
Citation:
Proceedings Phenome 2018, Feb 12-16, Tucson, AZ., K.M. RAINEY. 2018. UAS Phenotyping in Soybean Breeding and Phenomic Inference. Abstract for an invited lecture.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2018
Citation:
Proceedings Phenome 2018, Feb 12-16, Tucson, AZ, F.F. Moreira* and K.M. RAINEY. 2018. Improving Efficiency of Soybean Breeding with Phenomic-enabled Canopy Selection.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Proceedings Plant & Animal Genome Conference XXVII, Jan 12-16, 2019, San Diego, CA, F.F. Moreira*, M.A. Lopez, M. Herrero-Huerta, K.A. Cherkauer and K.M. RAINEY. 2019. Genetic Architecture of Soybean Biomass Development Derived from Field-Based High-Throughput Phenotyping.
- Type:
Conference Papers and Presentations
Status:
Accepted
Year Published:
2020
Citation:
Lyu, B., S.D. Smith, Keith A. Cherkauer. Fine-Grained Recognition in High-throughput Phenotyping. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2020. Seattle, WA, June 14-19, 2020
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Lyu, B., S.D. Smith, Yexiang Xue, Keith A. Cherkauer. Deriving Vegetation Indices from High-throughput Images by Using Unmanned Aerial Systems in Soybean Breeding. Abstract 1900279. Presented at the 2019 Annual International Meeting of the American Society of Agricultural and Biological Engineers. Boston, MA, July 7-10, 2019. ITSC Community Paper Award Winner, 2019
- Type:
Journal Articles
Status:
Published
Year Published:
2020
Citation:
M. Herrero-Huerta, P. Rodriguez-Gonzalvez, K.M. RAINEY. 2020. Yield prediction by machine learning from UAS-based multi-sensor data fusion in soybean. Plant Methods 16, 78. https://doi.org/10.1186/s13007-020-00620-6
- Type:
Journal Articles
Status:
Published
Year Published:
2020
Citation:
M. Herrero-Huerta, A. Bucksch, E. Puttonen, K.M. RAINEY. 2020. Canopy Roughness: A New Phenotypic Trait to Estimate Aboveground Biomass from Unmanned Aerial System. Plant Phenomics, vol. 2020, Article ID 6735967, https://doi.org/10.34133/2020/6735967. Attribution: KMR designed the experiment, operated the multi-environment yield trials, and contributed conceptually; MHH performed the analysis and wrote the manuscript; AB and EP contributed to the analysis.
Progress 02/01/19 to 01/31/20
Outputs Target Audience: Crop research professionals in all crops and with a broad range of objectives, especially plant breeding and the seed industry; engineers in the fields of agricultural and civil engineering focused on remote sensing, photogrammetry, and sensors. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Ground reference data collection and the image processing pipeline have employed or engaged ~15 undergraduate students, 1 hourly graduate student, and 4 full-time graduate research assistants. Students were trained to work both in the field (agronomic research and UAS flights) and in the computer lab; through these experiences they learned skills related to the operation of unmanned aircraft systems and the sampling of spectral and phenotypic traits and biomass, including aspects of experimental design, data management, and data analysis. Some of these students worked with the team as hourly employees, and some as part of independent undergraduate research projects. One of the undergraduate students participated through the Purdue University Summer Research Opportunities Program (SROP) and another through the Purdue Engineering Summer Undergraduate Research Fellowship (SURF) program; both programs are designed to recruit undergraduate students into graduate programs by providing projects and mentorship. Three students participated as part of a Senior Design project through the Department of Agricultural and Biological Engineering, with one student taking his work to the annual ASABE conference, where he won the ITSC Community Paper Award for his presentation. In addition, we have a long list of multidisciplinary published outputs, with more such outputs in progress or draft form. How have the results been disseminated to communities of interest? CIE and VID code is in a GitHub repository, and we have been developing a branch of the repository for public release. What do you plan to do during the next reporting period to accomplish the goals? We are working on the public release of CIE and VID code.
Impacts What was accomplished under these goals?
Objective 1: A high-throughput phenotyping processing pipeline was created with two developed tools, Crop Image Extraction, version 2 (CIE) and Vegetation Indices Derivation (VID). CIE and VID are Python programs that enable users to extract, calibrate, and quantify vegetation indices of interest at the plot level. The data processing pipeline is highly modular and efficient. CIE is the first step in the high-throughput phenotyping processing pipeline. CIE has the ability to extract plot images from designed experiments with RGB, multispectral, and thermal imagery captured by the UAS. Plot images are extracted from user-configured inputs that describe the AOI and from outputs generated by image stitching software, such as camera parameter files. User-defined inputs consist of experimental metadata such as location, number of crop rows, ranges, and units, as well as plot length and spacing. After the configuration is completed for the AOI, plots are extracted by segmenting the canopy and gridding the calculated locations of each plot. As the crop develops and changes color, CIE uses multiple segmentation and crop localization functions to identify the crop unit centerline, ensuring the correct number of crop units is identified within each crop plot. The experimental metadata, along with the dimensions of the crop unit (crop length and spacing between crop rows), are then used to calculate the crop plot midpoint and map the remaining crop plots. The result is accurately and precisely identified crop plot midpoints that enable automated and rapid extraction of plot images. Depending on the size of the experiment, the generated CIE outputs can be thousands of plot images. The plot images from CIE are then fed into VID to calibrate images as well as to create and apply functions that calculate indices of interest. VID uses image attributes (i.e., row, range, date, image band, image replicate) with customized functions such as band algorithms to quantify phenotypic traits from the extracted plot images. VID can also calibrate plot images by applying empirical equations generated by extracting reflectance and digital number values from calibration panels positioned within the field during each flight (Smith et al. 1999). The automated and efficient structure in VID allows for rapid processing and the ability to output data in text and image formats for analysis. Objective 2: We established two methods of phenotyping above ground biomass. 1) A subset of the SoyNAM population (n=383) was grown in multi-environment trials, and destructive AGB measurements were collected along with multispectral and red, green, and blue (RGB) imaging with an unmanned aerial system from 27 to 83 days after planting (DAP). Machine-learning methods resulted in AGB predictions with high R2 values (0.92-0.94). Narrow-sense heritabilities estimated over time using random regression models (RRM) ranged from low to moderate (from 0.02 at 44 DAP to 0.28 at 33 DAP). AGB from adjacent DAP had the highest genetic correlations compared to DAP further apart. We observed high accuracies and low biases of prediction, indicating that genomic breeding values for AGB can be predicted over specific time intervals using RRM. Genomic regions associated with AGB varied with time, and no genetic markers were significant at all time points. In order to predict AGB for all DAP, including days when ground truth data were not available, we considered a linear model using the imagery features as the predictor variables within each environment across all observed DAP. We observed that the distribution of the residuals was highly asymmetric, suggesting that a linear model was not suitable to fit the data (Thoni et al. 1990).
To correct the asymmetry we applied a Box-Cox transformation to the AGB, which led to the log-transformed values (data not shown; Box and Cox, 1964). The prediction of AGB was carried out using two different machine-learning methods: Least Absolute Shrinkage and Selection Operator (LASSO) regression (Tibshirani 1996) and Partial Least Squares Regression (PLSR) (Wold et al. 2001). For the PLSR, 10 principal components were selected so that the root mean squared error (RMSE) was minimized. The performance of the predictive models was evaluated using a 10-fold cross-validation strategy, in which the dataset was randomly divided into a training set (90% of the plots) and a validation set (10% of the plots). The predictive accuracy of the model was measured by the coefficient of determination (R2), which is equal to the fraction of AGB variance explained by the model, and by the RMSE, which measures the average error magnitude. Pearson's correlation coefficient (r) was also considered to quantify the linear correlation between the observations and their estimates, as an indication of model prediction ability. Both models were implemented in the R software (R Core Team 2019), using the package caret (Kuhn 2008). 2) Using the same field experiment described above, the crop field was reconstructed in 3D from temporal RGB images by image-based modeling. Algorithms were combined to compute the canopy volume of each plot in cubic meters. Within-plot height variation was quantified to improve the biomass correlations. The 3D models use a 3D Delaunay triangulation algorithm (Golias and Dutton, 1997) applied to the variability of multiple height measurements per plot quantified from the point cloud. The advantage of this method is that it uses RGB images, so radiometric calibration is not required. Objective 3: Across approximately 10,000 soybean progeny rows over two years, we have used UAS canopy coverage data and/or yield to select lines using different selection criteria. These lines have been tested in preliminary and advanced yield trials in our breeding pipeline, and a student is working on the manuscript.
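The Delaunay-based canopy volume computation described under Objective 2 can be sketched as a 2.5D triangulation over a plot's point cloud: triangulate the (x, y) footprint and sum the prism volumes under each triangle using the canopy heights at its vertices. This is an illustrative sketch on a simulated point cloud, not the project's implementation; all coordinates and heights below are made up.

```python
import numpy as np
from scipy.spatial import Delaunay

# Simulated plot point cloud: 200 points over a 1 m x 1 m footprint with
# canopy heights around 0.8 m. The project derived its point clouds
# photogrammetrically from temporal RGB imagery.
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 1.0, size=(200, 2))                  # footprint (m)
z = np.clip(0.8 + 0.1 * rng.normal(size=200), 0.0, None)   # heights (m)

tri = Delaunay(xy)
volume = 0.0
for simplex in tri.simplices:
    a, b, c = xy[simplex]
    # Triangle area from the 2D cross product of two edge vectors.
    area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                     - (b[1] - a[1]) * (c[0] - a[0]))
    volume += area * z[simplex].mean()  # prism under the triangle

print(f"Estimated canopy volume: {volume:.3f} m^3")
```

The spread of `z` within a plot is the within-plot height variation that was quantified to improve the biomass correlations.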
Publications
- Type:
Journal Articles
Status:
Accepted
Year Published:
2020
Citation:
B. Lyu, S.D. Smith, Y. Xue, K.M. RAINEY, K.A. Cherkauer. 2020. Deriving Vegetation Indices from High-throughput Images by Using Unmanned Aerial Systems in Soybean Breeding. Transactions of the ASABE.
- Type:
Journal Articles
Status:
Published
Year Published:
2019
Citation:
F.F. Moreira, A.A.A. Hearst, K.A. Cherkauer, and K.M. RAINEY. 2019. Improving the efficiency of soybean breeding with high-throughput canopy phenotyping. Plant Methods 15, 139. doi:10.1186/s13007-019-0519-4.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
M. Herrero-Huerta* and K.M. RAINEY. High Throughput Phenotyping of Physiological Growth Dynamics from UAS-Based 3D Modeling in Soybean. ISPRS-International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 4213 (2019): 357-361. DOI: 10.5194/isprs-archives-XLII-2-W13-357-2019.
- Type:
Journal Articles
Status:
Published
Year Published:
2019
Citation:
M. Herrero-Huerta*, S. Govindarajan, K.A. Cherkauer, and K.M. RAINEY (2019) Triple S: A new tool for Soybean High Throughput Phenotyping from UAS-based Multispectral Imagery. Proc. SPIE 11007, Advanced Environmental, Chemical, and Biological Sensing Technologies XV, 110070K. DOI: 10.1117/12.2519376.
- Type:
Journal Articles
Status:
Awaiting Publication
Year Published:
2020
Citation:
Herrero-Huerta, Mónica & Rodríguez-Gonzálvez, Pablo & Rainey, Katy. (2020). Yield Prediction by Machine Learning from UAS-Based Multi-Sensor Data Fusion in Soybean. doi:10.21203/rs.3.rs-16958/v1.
- Type:
Journal Articles
Status:
Under Review
Year Published:
2020
Citation:
Moreira, F., Oliveira, H., Lopez, M., Abughali, B., Gomes, G., Cherkauer, K., Brito, L., and K.M. Rainey (2020) High-throughput phenotyping and random regression models reveal temporal genetic control of soybean biomass production, under review for PNAS.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Proceedings National Association of Plant Breeders 2019 Annual Meeting, Aug 25-29, Pine Mountain, GA., K.M. RAINEY*, M. Herrero-Huerta, S.D. Smith, B. Abughali, F.F. Moreira, M.A. Lopez, and K.A. Cherkauer. 2019 NIFA: 3D and precision soybean phenotypes from temporal UAS imagery of yield trials. Contribution: Dr. Rainey was principal investigator.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2018
Citation:
Proceedings Phenome 2018, Feb 12-16, Tucson, AZ., K.M. RAINEY. 2018. UAS Phenotyping in Soybean Breeding and Phenomic Inference. Abstract for an invited lecture.
- Type:
Conference Papers and Presentations
Status:
Accepted
Year Published:
2020
Citation:
Lyu, B., S.D. Smith, Keith A. Cherkauer. Fine-Grained Recognition in High-throughput Phenotyping. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2020. Seattle, WA, June 14-19, 2020
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Proceedings Plant & Animal Genome Conference XXVII, Jan 12-16, 2019, San Diego, CA, F.F. Moreira*, M.A. Lopez, M. Herrero-Huerta, K.A. Cherkauer and K.M. RAINEY. 2019. Genetic Architecture of Soybean Biomass Development Derived from Field-Based High-Throughput Phenotyping.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Lyu, B., S.D. Smith, Yexiang Xue, Keith A. Cherkauer. Deriving Vegetation Indices from High-throughput Images by Using Unmanned Aerial Systems in Soybean Breeding. Abstract 1900279. Presented at the 2019 Annual International Meeting of the American Society of Agricultural and Biological Engineers. Boston, MA, July 7-10, 2019. ITSC Community Paper Award Winner, 2019
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2019
Citation:
Proceedings National Association of Plant Breeders 2019 Annual Meeting, Aug 25-29, Pine Mountain, GA., F.F. Moreira*, M.A. Lopez, L. Brito, K.A. Cherkauer, and K.M. RAINEY. 2019. Combining high-throughput phenotyping and GWAS to reveal temporal genetic variation in soybean biomass. Contribution: Dr. Rainey was principal investigator.