Progress 05/01/23 to 04/30/24
Outputs Target Audience: The project targets a broad audience, from roboticists to agronomists. Moreover, stakeholders like soybean and corn growers may benefit from implementing the proposed tasks. This project is expected to have significant environmental, economic, and educational impacts in all areas where corn is grown (177 million ha). Over-application of fertilizers is a significant environmental threat worldwide. More than 30% of future US agribusiness growth is expected to come from further adoption of PA by farmers. Despite the COVID impediments, the project participants gave presentations (almost all of them virtual at the beginning and then in-person) to all potential stakeholders. More details are provided in the products section of the report. Educational benefits were achieved by using some of the project's data in various robotics and computer vision courses. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Several graduate students from both engineering and agricultural sciences worked on the project. Henry Nelson and Athanasios Bacharis are working towards their Ph.D.s in CS. The objective is to collaborate closely between the different groups at UMN. A new project has been started in conjunction with the LCCMR initiative to deploy some of the sampling methods, robot systems, and sensors (developed as part of this effort) for water quality monitoring in rural Minnesota. The effort will engage middle-school students. How have the results been disseminated to communities of interest? Science-based knowledge was delivered to scientists, graduate students, technical service providers, and farmers through presentations at professional meetings such as ICRA and IEEE/RSJ IROS, with over 1000 attendees (researchers, professors, and students) at both conferences. In addition, we hosted summer tech camps for students in grades 6-8 from the community, where Master's and Ph.D. students taught various robotics activities during the 3-day camp. We also hosted another outreach event between MnRI and CFANS, where participants visited the robotics lab to learn more about the various projects currently being worked on. What do you plan to do during the next reporting period to accomplish the goals? We plan to continue improving the automated acquisition of 3D information in the crop field using real-time optimization methods. In particular, next-best-view and reinforcement learning methods will enable adaptive and autonomous UAV planning for collecting point cloud data efficiently. Additionally, we plan to create methods that account for existing disturbances in crop fields, such as the deformation of leaves caused by wind or other environmental factors, further improving our methods' applicability to real-world scenarios.
This idea will be tested with real-world experiments using corn plants and UAVs for image collection, demonstrating the transition of planning from simulation to the real world. In addition to that, we want to create state-of-the-art algorithms that can predict point cloud information and measure the phenotype of plants. For that, we will utilize the 3D data we will acquire from the view planning pipeline to train and innovate deep learning methods.
Impacts What was accomplished under these goals?
We developed an approximate and scalable field segmentation algorithm to separate plant structures in 3D. In this work, we created an algorithm to roughly segment individual plants and their stems over an entire field. The algorithm is defined as a modification of a general clustering technique so that its computational complexity is low and well-defined. Using common data clustering techniques as an algorithmic basis keeps the new algorithm relatively data- and species-agnostic while maintaining a known behavior when scaling up to large input sizes. This algorithm allows data collection and phenotype measurement to happen at different scales, so the computationally expensive phenotyping algorithms can operate on small portions of the point cloud containing individual plants to maintain a lower runtime. While still only producing approximate segmentations, this algorithm easily scales to large fields. It has been released as open-source software and has already been applied in numerous other works in the agricultural sciences. We continue to organize free summer tech camps for middle-schoolers from underserved communities in the summer of 2024. We expect more than 100 students to attend these events.
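The ground-density idea described above — project the cloud onto a ground-plane density grid and treat connected high-density regions as individual plants — can be sketched as follows. This is a minimal illustration with hypothetical function and parameter names, not the released implementation:

```python
import numpy as np

def segment_plants(points, cell=0.05, min_density=5):
    """Roughly segment individual plants in a field-scale point cloud.

    Projects points onto the ground plane, accumulates a 2D density grid
    (linear in the number of points), and labels 4-connected high-density
    regions as plants; sparse cells stay unassigned (label 0).
    """
    xy = points[:, :2]
    mins = xy.min(axis=0)
    idx = np.floor((xy - mins) / cell).astype(int)
    h, w = idx.max(axis=0) + 1
    grid = np.zeros((h, w), dtype=int)
    np.add.at(grid, (idx[:, 0], idx[:, 1]), 1)   # per-cell point counts

    occupied = grid >= min_density
    labels = np.zeros((h, w), dtype=int)
    n_plants = 0
    for i in range(h):                            # flood-fill connected components
        for j in range(w):
            if occupied[i, j] and labels[i, j] == 0:
                n_plants += 1
                stack = [(i, j)]
                while stack:
                    a, b = stack.pop()
                    if 0 <= a < h and 0 <= b < w and occupied[a, b] and labels[a, b] == 0:
                        labels[a, b] = n_plants
                        stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    # Map each 3D point back to the label of its ground cell
    return labels[idx[:, 0], idx[:, 1]], n_plants
```

Because the expensive phenotyping can then run on one labeled plant at a time, overall runtime stays manageable even at field scale.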
Publications
- Type:
Conference Papers and Presentations
Status:
Awaiting Publication
Year Published:
2024
Citation:
H. J. Nelson, and N. Papanikolopoulos, "Ground-Density Clustering for Approximate Agricultural Field Segmentation," 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Abu Dhabi, UAE, 2024
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Nikos Papanikolopoulos, Keynote Talk, 7th International Workshop on Visual Odometry and Computer Vision Applications Based on Location Clues (associated with CVPR 2024), June 2024.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2024
Citation:
Nikos Papanikolopoulos, Keynote Talk, Long Talk at the NSF PI NRI meeting, 2024
|
Progress 05/01/22 to 04/30/23
Outputs Target Audience: The project targets a broad audience, from roboticists to agronomists. Moreover, stakeholders like soybean and corn growers may benefit from implementing the proposed tasks. This project is expected to have significant environmental, economic, and educational impacts in all areas where corn is grown (177 million ha). Over-application of fertilizers is a significant environmental threat worldwide. More than 30% of future US agribusiness growth is expected to come from further adoption of PA by farmers. Despite the COVID impediments, the project participants gave presentations (almost all of them virtual at the beginning and then in-person) to all potential stakeholders. More details are provided in the products section of the report. Educational benefits were achieved by using some of the project's data in various robotics and computer vision courses. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Several graduate students from both engineering and agricultural sciences worked on the project. Henry Nelson and Athanasios Bacharis are working towards their Ph.D.s in CS. The objective is to collaborate closely between the different groups at UMN and our partner institution, the University of Nevada Reno. How have the results been disseminated to communities of interest? Science-based knowledge was delivered to scientists, graduate students, technical service providers, and farmers through presentations at professional meetings such as ICRA and IEEE/RSJ IROS, with over 1000 attendees (researchers, professors, and students) at both conferences. In addition, we hosted summer tech camps for students in grades 6-8 from the community, where Master's and Ph.D. students taught various robotics activities during the 3-day camp. We also hosted another outreach event between MnRI and CFANS, where participants visited the robotics lab to learn more about the various projects currently being worked on. What do you plan to do during the next reporting period to accomplish the goals? We plan to continue the extensive data collection and data analysis by using teams of ground and aerial robots. Hyperspectral image and corn plant tissue collection across growth stages V5 to V14 will use the approach developed last year at Becker, Rosemount, and Waseca with N, K, and S fertilizer trials. We will use these sites to collect the above-canopy UAV-based RGB video and ground truth canopy attribute measurements to build point cloud canopies that coincide with the locations where hyperspectral data are collected. In the future, we plan to apply the view planning method to various corn fields in Minnesota. This will provide us with point cloud data that can be further used for analysis in precision agriculture.
Furthermore, we plan to improve the automated acquisition of 3D information in the crop field using real-time optimization approaches in the context of next-best-view and reinforcement learning methods. This will allow adaptive and autonomous UAV planning that collects point cloud data in the shortest time. In addition to that, we want to create state-of-the-art algorithms that can predict point cloud information and create complete mesh models of plants. For that, we will utilize the 3D data we acquire from the view planning pipeline to train and innovate deep learning methods. In 2023, we plan to fully explore this idea using multi-spectral and possibly hyperspectral data. Fusing information from hyperspectral and multi-spectral images using DL will be especially interesting for improving performance, as described in our proposal. We will spend most of our time in 2024 validating our algorithms, presenting/publishing our research results, and preparing the final report of our project.
Impacts What was accomplished under these goals?
Our main accomplishments so far include: (i) the development of a new framework for autonomous path planning for precision agriculture applicable to both aerial and ground robots, (ii) the development of innovative path planning algorithms that allow a robotic system to ensure full exploration and mapping of an area given no prior knowledge of the environment, (iii) the integration of metrics for nutrient deficiency to ensure high-quality sensor observations in the most critical areas of the crop, and (iv) the development of a multi-modal alignment algorithm involving images in the visual spectrum and images encoding Near InfraRed (NIR) information provided by multispectral cameras tailored to precision agriculture. We have verified their effectiveness in diverse 3D and 2D environments. We have released stable versions of our software contributions as open-source code and presented/published our research results at various conferences. We also created an automated view planning pipeline that generates a 3D representation of crop plants. This pipeline uses RGB images obtained by a UAV, together with optimization and structure-from-motion methods. The novelty of this method lies in its adaptiveness to different parts of row-crop fields and different plant sizes. The resulting point clouds were of higher quality than those of other baseline approaches.
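As a rough illustration of the optimization flavor of such view planning, the sketch below greedily selects a fixed budget of candidate views that jointly cover the most surface samples. The function name and the generic coverage matrix are hypothetical simplifications for exposition, not the discrete formulation used in the published pipeline:

```python
import numpy as np

def greedy_view_selection(coverage, budget):
    """Pick up to `budget` candidate views maximizing joint surface coverage.

    `coverage` is a boolean (n_views, n_points) matrix: coverage[v, p] is True
    if candidate view v observes surface sample p. Greedy selection on this
    submodular coverage objective carries the classic (1 - 1/e) guarantee.
    """
    covered = np.zeros(coverage.shape[1], dtype=bool)
    chosen = []
    for _ in range(budget):
        gains = (coverage & ~covered).sum(axis=1)  # new samples each view adds
        v = int(gains.argmax())
        if gains[v] == 0:
            break                                   # nothing left to gain
        chosen.append(v)
        covered |= coverage[v]
    return chosen, covered.mean()                   # views, coverage fraction
```

In practice, each row of the coverage matrix would come from visibility checks (ray casting against a coarse crop model) for a sampled UAV pose.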
Publications
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2022
Citation:
A. Bacharis, H. J. Nelson and N. Papanikolopoulos, "View Planning Using Discrete Optimization for 3D Reconstruction of Row Crops," 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, Japan, 2022, pp. 9195-9201
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2023
Citation:
H. J. Nelson, C. E. Smith, A. Bacharis and N. Papanikolopoulos, "Robust Plant Localization and Phenotyping in Dense 3D Point Clouds for Precision Agriculture," 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 2023
|
Progress 05/01/21 to 04/30/22
Outputs Target Audience: Financial and social elements of modern societies are closely connected to cultivating plants like corn and soybean. Due to the massive importance of these plants, nitrogen or potassium deficiencies during their cultivation directly translate to significant financial losses. Therefore, the early detection and treatment of these nutrient deficiencies is a task of great significance and value. However, current standard field surveillance practices are either completed manually or with the assistance of satellite imaging, which offers only infrequent, insufficient (from a spatial resolution perspective), and costly data to farmers. As a result, farmers tend to minimize risk by applying uniform fertilizer rates to the field in the fall before planting in spring (in the case of corn). This approach overestimates the amount of fertilizer needed while producing massive nitrogen contamination of surface and groundwater. This project promotes the use of autonomous teams of small aerial and ground co-robots, armed with efficient plant-centric information gathering algorithms and multi-modal perception abilities that fuse information from the visible spectrum (RGB) as well as the multi- or hyperspectral domains. It uses corn as the target crop. The overarching goal of this work is to introduce an automated strategy for plant field robotic mapping, monitoring, nitrogen and potassium deficiency detection, and crop biomass estimation at satisfactory spatio-temporal resolutions to better estimate the nutrient fertilizer requirements. Through the capacity of the aerial and ground robotic team to autonomously select and follow the viewpoints that enable comprehensive multi-modal 3D reconstruction of the corn canopy structure (biomass) at arbitrary resolutions, a superior alternative to high-altitude aerial imaging is suggested. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Several graduate students from both engineering and agricultural sciences worked on the project. Tyler Nigon just graduated with his Ph.D., while Henry Nelson and Athanasios Bacharis are working towards their Ph.D.s in CS. The objective is to collaborate closely between the different groups not only at UMN but also with our partner institution, the University of Nevada Reno. How have the results been disseminated to communities of interest? Science-based knowledge was delivered to scientists, graduate students, technical service providers, and farmers through a presentation at professional meetings (talk on hyperspectral image processing) at the 2020 Annual Meeting of the Soil Science Society of America and a seminar (2020 Zoom seminar) on precision agriculture to nearly 100 scientists and graduate students at four HBCUs. A Fall 2020 departmental webinar was given to the Department of Biological and Agricultural Engineering at UC Davis on a corn nitrogen uptake study using drone hyperspectral imaging. Approximately 30 faculty members and graduate students attended the seminar. What do you plan to do during the next reporting period to accomplish the goals? We plan to continue the extensive data collection and data analysis by using teams of ground and aerial robots. Hyperspectral image and corn plant tissue collection across growth stages V5 to V14 will use the approach developed last year at Becker, Rosemount, and Waseca with N, K, and S fertilizer trials. We will use these same sites to collect the above-canopy UAV-based RGB video and ground truth canopy attribute measurements to build point cloud canopies that coincide with the locations where hyperspectral data are collected. Hyperspectral data from the first years of the project will be evaluated against ground truth data for plant tissue N, K, and S content, as well as the end-of-season crop yield.
Algorithms will be developed to assess which specific spectral indices or image analysis techniques are diagnostic of N, K, or S deficiency and final crop yield. Canopy point clouds based on RGB imagery collected in the first years will be used to extract crop canopy attributes such as plant height, number of leaves, node distance, leaf width, leaf length, leaf angle, and whorl height. The accuracy of these estimates for a given sampling date will be evaluated using ground truth data. Trends in canopy attributes across sampling dates will be assessed. Below-canopy lidar data will be fused with above-canopy RGB imagery to build more complete point cloud representations of corn from the ground to the top of the canopy at growth stages after canopy closure. Point cloud-based estimates of canopy attributes will be compared with the plant's hyperspectral-based N, K, or S status. A 3D semantic and instance segmentation algorithm for the corn canopy will be produced. This will enable the automation of much more detailed analytics, such as leaf-based staging and measuring properties of individual leaves.
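Of the canopy attributes listed above, plant height is the simplest to extract from a segmented single-plant point cloud. A hedged sketch (the percentile choices here are hypothetical, not the project's calibrated values) uses robust percentiles of the vertical coordinate so a few stray ground or noise points do not dominate the estimate:

```python
import numpy as np

def plant_height(points, ground_pct=2, top_pct=98):
    """Estimate plant height (in the cloud's units) for one plant.

    `points` is an (n, 3) array; the z column is assumed vertical.
    Robust percentiles stand in for min/max to suppress outliers.
    """
    z = points[:, 2]
    return float(np.percentile(z, top_pct) - np.percentile(z, ground_pct))
```

The same percentile trick extends to other coordinate-based attributes (e.g. whorl height), while attributes like leaf angle require the instance segmentation discussed above.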
Impacts What was accomplished under these goals?
Last year we deployed aerial and ground robots to collect data to help us develop the proposed algorithms for their analysis. Data were collected in 2020 and 2021 from experimental corn plots having a wide range of nitrogen (N), potassium (K), or sulfur (S) fertilizer application rates at locations in Becker, Rosemount, and Waseca, MN. Hyperspectral images (spectral range 400-900 nm) were collected at a 2cm pixel resolution using a Resonon Pika II mounted on a DJI Matrice 600 Pro UAV from June 18 to July 8, corresponding to corn growth stages V5 to V14. Six-band multispectral images at mm-level spatial resolution using a Sentera 6X camera on a DJI Matrice 200 Pro UAV were obtained on the same days as the drone hyperspectral image collection. Whole plant samples were collected in selected treatment plots on the same days for analysis of tissue N (72 samples), K (216 samples), and S (190 samples). Crop yield was measured at harvest in each experimental unit. An RGB video camera mounted on a DJI drone collected multi-view angle imagery from 2020 and 2021 corn planting date trials in Becker, Rosemount, Waseca, and St. Paul at growth stages V3 to V7. Data were collected on seven different dates across six weeks to have multiple representations of point cloud data at each growth stage. On one of these dates, Lidar and stereo image data were also collected using a ground robot driving under the corn canopy. This data was then registered with the aerial data to produce views of the corn from both above and below the canopy. Ground truth data collected at the imaging time included plant height, number of leaves, node distance, leaf width, leaf length, leaf angle, and whorl height. We also proposed a number of novel, application-specific algorithms to produce a general and scalable plant segmentation algorithm. 
The novel algorithms presented are shown to produce quantitatively better results than the current state-of-the-art while being less sensitive to input parameters and maintaining the same algorithmic time complexity. When incorporated into field-scale phenotyping systems, the proposed algorithms should work as a drop-in replacement that can significantly improve the accuracy of results while ensuring that performance and scalability remain undiminished. We also proposed a methodology for plant counting and localization based on 3D point clouds of cornfields using voting methods. This method can determine growth stage over a large area of the field, yielding a more robust measure than manual sampling can provide. Additionally, we proposed an optimization-based method for planning the image capture of a cornfield to produce a point cloud that is as complete and accurate as possible. The resulting image sets provide good coverage of an entire area of a field, reducing error and sparse coverage in data collection, which should increase the accuracy of any downstream processing tasks.
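A minimal sketch of the voting idea — every 3D point votes for the ground cell beneath it, and cells that are local vote maxima with enough support become stem hypotheses — might look like the following. The names, cell size, and vote threshold are hypothetical, not the exact published method:

```python
import numpy as np

def count_plants_by_voting(points, cell=0.1, min_votes=10):
    """Count and localize plants in a corn-field point cloud by voting.

    Each point votes for the ground cell directly beneath it; cells that
    are local maxima of the vote grid (within a 3x3 window) and exceed
    `min_votes` are reported as estimated (x, y) stem locations.
    """
    xy = points[:, :2]
    mins = xy.min(axis=0)
    idx = np.floor((xy - mins) / cell).astype(int)
    h, w = idx.max(axis=0) + 1
    votes = np.zeros((h + 2, w + 2), dtype=int)          # pad for easy windows
    np.add.at(votes, (idx[:, 0] + 1, idx[:, 1] + 1), 1)

    centers = []
    for i in range(1, h + 1):
        for j in range(1, w + 1):
            v = votes[i, j]
            if v >= min_votes and v == votes[i - 1:i + 2, j - 1:j + 2].max():
                # Winning padded cell (i, j) maps back to cell centers
                centers.append(mins + (np.array([i, j]) - 0.5) * cell)
    return np.array(centers)
```

Counting the returned centers per unit area then gives the field-scale plant density used for robust growth-stage estimates.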
Publications
- Type:
Conference Papers and Presentations
Status:
Under Review
Year Published:
2022
Citation:
H. Nelson, C. Smith, and N. Papanikolopoulos, Accurate Plant Localization in Dense 3D Point Clouds for Precision Agriculture, Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '22), Kyoto, Japan, October 23-27, 2022
- Type:
Journal Articles
Status:
Published
Year Published:
2021
Citation:
W. Yang, T. Nigon, Z. Hao, G. Dias Paiao, F. Fernandez, D. Mulla, and C. Yang, "Estimation of Corn Yield Based on Hyperspectral Imagery and Convolutional Neural Network", Computers and Electronics in Agriculture 184: 106092, 2021.
- Type:
Journal Articles
Status:
Published
Year Published:
2020
Citation:
D. Zermas, H. Nelson, P. Stanitsas, V. Morellas, D. Mulla, and N. Papanikolopoulos, A Methodology for the Detection of Nitrogen Deficiency in Corn Fields Using High-Resolution RGB Imagery, IEEE Transactions on Automation Science and Engineering, October 2020.
- Type:
Journal Articles
Status:
Accepted
Year Published:
2021
Citation:
K. Mallery, D. Canelon, J. Hong, and N. Papanikolopoulos, "Design and Experiments with a Robot-Driven Underwater Holographic Microscope for Low-Cost In Situ Particle Measurements", Journal of Intelligent & Robotic Systems, 2021
- Type:
Conference Papers and Presentations
Status:
Accepted
Year Published:
2020
Citation:
H. Nelson, and N. Papanikolopoulos, Learning Continuous Object Representations from Point Cloud Data, Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '20), pp 2446-2451, Las Vegas, NV, October 25-29, 2020.
- Type:
Conference Papers and Presentations
Status:
Accepted
Year Published:
2020
Citation:
A. Schwartzwald, and N. Papanikolopoulos, Sim-to-Real with Domain Randomization for Tumbling Robot Control, Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '20), pp 4411- 4417, Las Vegas, NV, October 25-29, 2020.
- Type:
Conference Papers and Presentations
Status:
Under Review
Year Published:
2022
Citation:
A. Bacharis, H. Nelson, and N. Papanikolopoulos, View Planning Using Discrete Optimization for 3D Reconstruction of Row Crops, Proceedings of the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '22), Kyoto, Japan, October 23-27, 2022
|
Progress 05/01/20 to 04/30/21
Outputs Target Audience: The project targets a wide audience, from roboticists to agronomists. Moreover, stakeholders like soybean and corn growers may see benefits from the implementation of the proposed tasks. This project is expected to have significant environmental, economic, and educational impacts in all areas of the world where corn is grown (177 million ha). Over-application of fertilizers is a significant environmental threat worldwide. More than 30% of future US agribusiness growth is expected to come from further adoption of PA by farmers. Despite the COVID impediments, the project participants gave presentations (almost all of them virtual) to all potential stakeholders. More details are provided in the products section of the report. Educational benefits were achieved by using some of the data acquired by the project in various courses in robotics and computer vision. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided? Several graduate students from both engineering and agricultural sciences worked on the project. Tyler Nigon just graduated with his Ph.D., while Henry Nelson is working towards his Ph.D. in CS. The objective is to have a close collaboration between the different groups not only at UMN but also with our partner institution, the University of Nevada Reno. How have the results been disseminated to communities of interest? Science-based knowledge was delivered to scientists, graduate students, technical service providers, and farmers through a presentation at professional meetings (talk on hyperspectral image processing) at the 2020 Annual Meeting of the Soil Science Society of America, and a seminar (2020 Zoom seminar) on precision agriculture to nearly 100 scientists and graduate students at four HBCUs. A Fall 2020 departmental webinar was given to the Department of Biological and Agricultural Engineering at UC Davis on a corn nitrogen uptake study using drone hyperspectral imaging. Approximately 30 faculty members and graduate students attended the seminar. What do you plan to do during the next reporting period to accomplish the goals? We plan to continue the extensive data collection and data analysis by using teams of ground and aerial robots. Hyperspectral image and corn plant tissue collection across growth stages V5 to V14 will use the approach developed last year at Becker, Rosemount, and Waseca with N, K, and S fertilizer trials. We will use these same sites for the collection of above-canopy UAV-based RGB video, below-canopy lidar imagery, and ground truth canopy attribute measurements to build point cloud canopies that coincide with the locations where hyperspectral data are collected. Hyperspectral data from the first years of the project will be evaluated against ground truth data for plant tissue N, K, and S content, as well as the end-of-season crop yield.
Algorithms will be developed to assess which specific spectral indices or image analysis techniques are diagnostic of N, K, or S deficiency, as well as final crop yield. Canopy point clouds based on RGB imagery collected in the first years will be used to extract crop canopy attributes such as plant height, number of leaves, node distance, leaf width, leaf length, leaf angle, and whorl height. The accuracy of these estimates for a given sampling date will be evaluated using ground truth data. Trends in canopy attributes across sampling dates will be assessed. Below-canopy lidar data will be fused with above-canopy RGB imagery to build more complete point cloud representations of corn from the ground to the top of the canopy at growth stages after canopy closure. Point cloud-based estimates of canopy attributes will be compared with the hyperspectral-based N, K, or S status of the plant.
Impacts What was accomplished under these goals?
During the last year, we deployed aerial and ground robots to collect data to help us develop the proposed algorithms for their analysis. Data were collected in 2020 from experimental corn plots having a wide range of nitrogen (N), potassium (K), or sulfur (S) fertilizer application rates at locations in Becker, Rosemount, and Waseca, MN. Hyperspectral images (spectral range 400-900 nm) were collected at a 2cm pixel resolution using a Resonon Pika II mounted on a DJI Matrice 600 Pro UAV from June 18 to July 8, corresponding to corn growth stages V5 to V14. Six-band multispectral images at mm-level spatial resolution using a Sentera 6X camera on a DJI Matrice 200 Pro UAV were obtained on the same days as the drone hyperspectral image collection. Whole plant samples were collected in selected treatment plots on the same days for analysis of tissue N (88 samples), K (102 samples), and S (81 samples). Crop yield was measured at harvest in each experimental unit. An RGB video camera mounted on a DJI drone collected multi-view angle imagery from 2020 corn planting date trials in St. Paul at growth stages V3 to V7. Data were collected on three different dates across three weeks to have multiple representations of point cloud data at each growth stage. Point cloud data were mosaicked using Pix4D and segmented into soil versus vegetation. Ground truth data collected at the time of imaging included plant height, number of leaves, node distance, leaf width, leaf length, leaf angle, and whorl height. We also proposed a number of novel, application-specific algorithms with the goal of producing a general and scalable plant segmentation algorithm. The novel algorithms proposed are shown to produce quantitatively better results than the current state-of-the-art while being less sensitive to input parameters and maintaining the same algorithmic time complexity.
When incorporated into field-scale phenotyping systems, the proposed algorithms should work as a drop-in replacement that can greatly improve the accuracy of results while ensuring that performance and scalability remain undiminished.
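The report does not specify how vegetation was separated from soil in the mosaicked data; a common baseline for RGB imagery is thresholding the Excess Green (ExG) index on chromaticity-normalized channels, sketched here with hypothetical parameter values:

```python
import numpy as np

def vegetation_mask(rgb, thresh=0.1):
    """Separate vegetation from soil pixels with the Excess Green index.

    `rgb` is an (H, W, 3) image. ExG = 2g - r - b on sum-normalized
    channels: vegetation scores high, soil near zero. A fixed threshold
    is the simplest choice; Otsu thresholding is a drop-in alternative.
    """
    rgb = rgb.astype(float)
    s = rgb.sum(axis=-1, keepdims=True)
    s[s == 0] = 1.0                         # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / s, -1, 0)   # chromaticity-normalized channels
    exg = 2 * g - r - b
    return exg > thresh
```

The boolean mask can then gate which mosaic pixels contribute to the per-plot canopy statistics.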
Publications
- Type:
Journal Articles
Status:
Published
Year Published:
2020
Citation:
D. Zermas, H. Nelson, P. Stanitsas, V. Morellas, D. Mulla, and N. Papanikolopoulos, A Methodology for the Detection of Nitrogen Deficiency in Corn Fields Using High-Resolution RGB Imagery, IEEE Transactions on Automation Science and Engineering, October 2020.
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2020
Citation:
H. Nelson, and N. Papanikolopoulos, Learning Continuous Object Representations from Point Cloud Data, Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '20), pp 2446-2451, Las Vegas, NV, October 25-29, 2020.
- Type:
Journal Articles
Status:
Accepted
Year Published:
2021
Citation:
K. Mallery, D. Canelon, J. Hong, and N. Papanikolopoulos, "Design and Experiments with a Robot-Driven Underwater Holographic Microscope for Low-Cost In Situ Particle Measurements", Journal of Intelligent & Robotic Systems, 2021
- Type:
Conference Papers and Presentations
Status:
Published
Year Published:
2020
Citation:
A. Schwartzwald, and N. Papanikolopoulos, Sim-to-Real with Domain Randomization for Tumbling Robot Control, Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '20), pp 4411-4417, Las Vegas, NV, October 25-29, 2020.
- Type:
Journal Articles
Status:
Published
Year Published:
2021
Citation:
W. Yang, T. Nigon, Z. Hao, G. Dias Paiao, F. Fernandez, D. Mulla, and C. Yang, "Estimation of Corn Yield Based on Hyperspectral Imagery and Convolutional Neural Network", Computers and Electronics in Agriculture 184: 106092, 2021.
|