Progress 04/01/20 to 03/31/23
Outputs Target Audience:Content from this project has been incorporated into formal classwork at the undergraduate and graduate levels at UIUC and Stanford University. This content covered various aspects of the project, from image processing and registration techniques for shadow detection to using neural networks to model the dynamic evolution of cloud shadows and the safe control and coordination of multiple drones. Developments from this project were disseminated to the robotics, control, and machine learning communities through five peer-reviewed publications (with an additional two under review) during this reporting period in the areas of collision-free control of multiple unmanned vehicles and recurrent and deep neural networks. We have also reached out to the precision agriculture community through informal dissemination of project activities and outcomes to individuals at the University of Minnesota Research Outreach Centers and the Minnesota Department of Agriculture. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?Each team member on this project has participated in the practical application of robotics, neural networks, machine learning, guidance, navigation, and control, image processing, and Structure from Motion theory to an important problem in precision agriculture. The training and professional development obtained by active participation on a collaborative project, from concept development through implementation, testing, and evaluation, is a unique educational and professional experience. A Stanford undergraduate has had one-on-one mentoring with faculty and graduate students at Stanford as well as with individuals from industry at Sentek Systems. While his work in the previous reporting period focused on image processing for shadow detection and vehicle interfacing, his work in the current period has focused on vehicle guidance algorithms. He has become familiar with real-time software development and with the multi-agent traveling salesman problem. Through presentations to the full team he has also improved his communication and presentation skills. A second undergraduate from Stanford, an under-represented minority (African American), has also been involved in this project during this reporting period. He worked alongside the other Stanford undergraduate on vehicle guidance algorithms. His initial work was on partitioning arbitrary survey regions into components suitable for flying with drones, and through this he was introduced to many concepts in computational geometry. He was mentored by Stanford faculty, a Stanford graduate student, and individuals from industry at Sentek Systems. A graduate student at UIUC has been mentored and guided by project personnel at UIUC, UC Berkeley, and Sentek Systems.
He has supported the project through novel work in stochastic approximation and online learning, which has led to several publications in peer-reviewed journals and refereed conferences. A graduate student at Stanford University was mentored by Stanford faculty and individuals at Sentek Systems. She also served as a mentor to two undergraduate students at Stanford University in this period, and in so doing she developed strong technical management skills. In this reporting period she completed her PhD in deep neural networks and contributed to the project in multiple ways. She often lent technical assistance to other students at Stanford and Berkeley when they ran into difficulty. How have the results been disseminated to communities of interest?Dissemination of project developments has been achieved through multiple publications in peer-reviewed journals and refereed conferences. Material from this project has been incorporated into coursework at Berkeley, Stanford, and UIUC in support of our educational objectives. Outreach to the agricultural community in this reporting period has been through informal discussions with individuals at the University of Minnesota Research Outreach Centers and the Minnesota Department of Agriculture. What do you plan to do during the next reporting period to accomplish the goals?
Nothing Reported
Impacts What was accomplished under these goals?
In this project we aimed to develop a system of multiple cooperating UAVs that will track cloud shadows and collaborate to collect shadow-free imagery of an agricultural field. This will significantly extend the range of lighting and cloud conditions under which drone-based imaging can be conducted, potentially tripling the amount of usable imagery that can be collected in a given amount of time in many parts of the country. Additionally, by enabling the simultaneous use of multiple, collaborating drones in agricultural settings, we aim to make it possible to scale up the use of drones in agriculture to realistic, industrial-scale fields. This will increase the viability of incorporating drones into farm management practices, enabling more modern, efficient, and adaptive nutrient management methodologies, thereby reducing excess nutrients left in the environment and reducing fertilization costs to farmers. The expected audience interested in this project's outcomes will be from the precision agriculture, robotics, control, and machine learning communities. Outreach activities will target underrepresented groups as presented in the broader impacts section of the proposal. Two datasets have been developed to support goal #1 of this project. The first is a large-scale dataset derived from publicly available sources, consisting of short video sequences of cloud shadow evolution. The second dataset is a collection of high-resolution videos that we collected using drones with nadir-looking fisheye cameras showing dynamic cloud shadows cast on agricultural landscapes. Both of these datasets have been expanded during this reporting period, as detailed in the "Other Products" section of this report. Additional neural network models have been explored for predicting cloud shadow evolution in this reporting period.
Additionally, the LSTM model developed in the previous period has been improved through training data augmentation and hyperparameter tuning, to improve consistency of predictions with respect to the direction of cloud motion and to better handle clear-sky conditions. This work has led to multiple publications in peer-reviewed journals and conferences. A separate method for modeling shadow propagation, based on contour detection and optical flow, has also been developed in support of goal #1. This provides both a backup method and a reference for evaluating the neural network models. In this reporting period, work has continued on developing multi-objective and multi-agent control designs, in support of goals #2 and #3. It has also expanded to include a novel approach to multi-agent control based on reinforcement learning techniques. Work on goals #2 and #3 has resulted in one master's thesis and multiple peer-reviewed journal publications. Several activities in this reporting period supported the validation of theory and algorithms being developed under this project (goal #4). An open-source, multi-vehicle ground control station is under development, with major components being completed within this reporting period. This project is called Recon and is hosted on GitHub: https://github.com/poli0048/Recon. In this reporting period the drone interface module became feature-complete and was fully integrated into the project. The companion iOS App also became feature-complete. The Shadow Detection module, which was working by the end of the previous reporting period, has been integrated into Recon, streamlined, and optimized in this period. The Shadow Propagation Module was developed and two separate implementations have become feature-complete in this reporting period (one using contour detection and optical flow, the other using an LSTM). The Guidance Module has seen significant development in this period as well.
It was not feature-complete by the end of this period, but it has reached the major milestone of working with simulated drones in certain test cases. In this period we also moved from component and simulation-based testing to full system testing, both using a single vehicle as well as with multiple vehicles simultaneously. A video demonstration showing a 4-vehicle flight test is available here: https://www.youtube.com/watch?v=aSOjmPsa1Go. In support of goal #5, dissemination of project developments has been achieved through multiple publications in peer-reviewed journals and refereed conferences. Material from this project has been incorporated into coursework at UIUC, UC Berkeley, and Stanford in support of our educational objectives. Outreach to the agricultural community in this reporting period has been through informal discussions with individuals at the University of Minnesota Research Outreach Centers and the Minnesota Department of Agriculture.
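To illustrate the idea behind the contour/optical-flow propagation approach described above, the following is a simplified sketch (not the project's actual implementation): it estimates a single dominant shadow translation between two consecutive binary shadow masks via FFT phase correlation, then advects the latest mask forward under a constant-velocity assumption. The function names and the single-translation simplification are ours; the real module uses dense optical flow.

```python
import numpy as np

def estimate_shift(prev_mask, curr_mask):
    """Estimate the dominant (dy, dx) shadow motion via FFT phase correlation."""
    f1 = np.fft.fft2(prev_mask.astype(float))
    f2 = np.fft.fft2(curr_mask.astype(float))
    cross = f2 * np.conj(f1)
    cross /= np.abs(cross) + 1e-9          # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev_mask.shape                  # map peaks past N/2 to negative shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def propagate(mask, dy, dx, steps=1):
    """Advect a binary shadow mask forward, assuming constant cloud velocity."""
    return np.roll(mask, (steps * dy, steps * dx), axis=(0, 1))

# Toy example: an 8x8 shadow blob drifting 2 px per frame to the right.
m0 = np.zeros((32, 32), dtype=np.uint8)
m0[10:18, 4:12] = 1
m1 = np.roll(m0, 2, axis=1)
dy, dx = estimate_shift(m0, m1)   # recovers the (0, 2) drift
m2_pred = propagate(m1, dy, dx)   # one-frame-ahead prediction
```

A dense optical-flow field would replace the single (dy, dx) with a per-pixel displacement, but the overall predict-by-advection structure is the same.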
Publications
Progress 04/01/21 to 03/31/22
Outputs Target Audience:Content from this project has been incorporated into formal classwork at the undergraduate and graduate levels at UIUC and Stanford University. This content covered various aspects of the project, from image processing and registration techniques for shadow detection to using neural networks to model the dynamic evolution of cloud shadows and the safe control and coordination of multiple drones. Developments from this project were disseminated to the robotics, control, and machine learning communities through five peer-reviewed publications (with an additional two under review) during this reporting period in the areas of collision-free control of multiple unmanned vehicles and recurrent and deep neural networks. We have also reached out to the precision agriculture community through informal dissemination of project activities and outcomes to individuals at the University of Minnesota Research Outreach Centers and the Minnesota Department of Agriculture. Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?Each team member on this project has participated in the practical application of robotics, neural networks, machine learning, guidance, navigation, and control, image processing, and Structure from Motion theory to an important problem in precision agriculture. The training and professional development obtained by active participation on a collaborative project, from concept development through implementation, testing, and evaluation, is a unique educational and professional experience. A Stanford undergraduate has had one-on-one mentoring with faculty and graduate students at Stanford as well as with individuals from industry at Sentek Systems. While his work in the previous reporting period focused on image processing for shadow detection and vehicle interfacing, his work in the current period has focused on vehicle guidance algorithms. He has become familiar with real-time software development and with the multi-agent traveling salesman problem. Through presentations to the full team he has also improved his communication and presentation skills. A second undergraduate from Stanford, an under-represented minority (African American), has also been involved in this project during this reporting period. He worked alongside the other Stanford undergraduate on vehicle guidance algorithms. His initial work was on partitioning arbitrary survey regions into components suitable for flying with drones, and through this he was introduced to many concepts in computational geometry. He was mentored by Stanford faculty, a Stanford graduate student, and individuals from industry at Sentek Systems. A graduate student at UIUC has been mentored and guided by project personnel at UIUC, UC Berkeley, and Sentek Systems.
He has supported the project through novel work in stochastic approximation and online learning, which has led to several publications in peer-reviewed journals and refereed conferences. A graduate student at Stanford University was mentored by Stanford faculty and individuals at Sentek Systems. She also served as a mentor to two undergraduate students at Stanford University in this period, and in so doing she developed strong technical management skills. In this reporting period she completed her PhD in deep neural networks and contributed to the project in multiple ways. She often lent technical assistance to other students at Stanford and Berkeley when they ran into difficulty. How have the results been disseminated to communities of interest?Dissemination of project developments has been achieved through multiple publications in peer-reviewed journals and refereed conferences. Material from this project has been incorporated into coursework at Berkeley, Stanford, and UIUC in support of our educational objectives. Outreach to the agricultural community in this reporting period has been through informal discussions with individuals at the University of Minnesota Research Outreach Centers and the Minnesota Department of Agriculture. What do you plan to do during the next reporting period to accomplish the goals?In the next reporting period the remaining unfinished pieces of the multi-vehicle system testbed will be completed. This primarily entails finishing the guidance module in the Recon ground control station. Additional single-drone and multi-drone flight testing will be conducted to ensure that the system is safe and reliable. Sentek will make necessary changes to their Structure from Motion pipeline to support processing datasets collected by multiple collaborating drones.
This will enable us to collect datasets with the multi-vehicle testbed and evaluate overall system effectiveness and performance by generating shadow-free reconstructions and evaluating their quality. This was delayed due to challenges encountered on other system components. We will conduct full system testing by conducting multi-drone collaborative survey flights with the project testbed and the Recon ground control station over agricultural fields in Minnesota in the presence of clouds. We will develop objective shadow prediction quality metrics to evaluate and compare the LSTM and optical-flow shadow propagation algorithms and establish the practical prediction horizon of the system. In further support of goal #5, the team will continue to publish results that have been supported by this project to disseminate advancements to the interested academic and industrial communities. Additional results will also be incorporated into course curricula at UIUC, Stanford, and UC Berkeley to support ongoing educational objectives. As the testbed is more mature at this stage, outreach to the precision agriculture community will accelerate through live and recorded system demonstrations. One such demonstration is already scheduled to take place at the 15th International Conference on Precision Agriculture in Minnesota on June 29, 2022. We plan to conduct additional demonstrations at events hosted by the University of Minnesota Research Outreach Centers and the Minnesota Department of Agriculture. Demonstrations will also be incorporated into community outreach activities conducted by UIUC, Stanford, and UC Berkeley, including through the UIUC Engineering Open House and the UIUC Engineering Outreach Society.
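One natural candidate for the objective shadow prediction quality metrics mentioned above is intersection-over-union (IoU) between a predicted shadow mask and the mask later observed. The sketch below is an illustrative assumption on our part, not the metric the project has adopted; the clear-sky convention (both masks empty counts as perfect agreement) is likewise a design choice of this sketch.

```python
import numpy as np

def shadow_iou(pred_mask, true_mask):
    """Intersection-over-union of two binary shadow masks (illustrative metric).

    Returns a value in [0, 1]; 1.0 means the predicted and observed shadow
    footprints coincide exactly.
    """
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    union = np.logical_or(pred, true).sum()
    if union == 0:
        return 1.0  # both masks empty (clear sky): treat as perfect agreement
    return np.logical_and(pred, true).sum() / union

# Toy example: two 2x2 shadow blobs overlapping in a single pixel.
pred = np.zeros((4, 4)); pred[0:2, 0:2] = 1
true = np.zeros((4, 4)); true[1:3, 1:3] = 1
score = shadow_iou(pred, true)  # intersection 1, union 7
```

Evaluating such a score as a function of prediction lead time would give the practical prediction horizon the plan refers to.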
Impacts What was accomplished under these goals?
In this project we aim to develop a system of multiple cooperating UAVs that will track cloud shadows and collaborate to collect shadow-free imagery of an agricultural field. This will significantly extend the range of lighting and cloud conditions under which drone-based imaging can be conducted, potentially tripling the amount of usable imagery that can be collected in a given amount of time in many parts of the country. Additionally, by enabling the simultaneous use of multiple, collaborating drones in agricultural settings, we aim to make it possible to scale up the use of drones in agriculture to realistic, industrial-scale fields. This will increase the viability of incorporating drones into farm management practices, enabling more modern, efficient, and adaptive nutrient management methodologies, thereby reducing excess nutrients left in the environment and reducing fertilization costs to farmers. The expected audience interested in this project's outcomes will be from the precision agriculture, robotics, control, and machine learning communities. Outreach activities will target underrepresented groups as presented in the broader impacts section of the proposal. Two datasets have been developed to support goal #1 of this project. The first is a large-scale dataset derived from publicly available sources, consisting of short video sequences of cloud shadow evolution. The second dataset is a collection of high-resolution videos that we collected using drones with nadir-looking fisheye cameras showing dynamic cloud shadows cast on agricultural landscapes. Both of these datasets have been expanded during this reporting period, as detailed in the "Other Products" section of this report. Additional neural network models have been explored for predicting cloud shadow evolution in this reporting period.
Additionally, the LSTM model developed in the previous period has been improved through training data augmentation and hyperparameter tuning, to improve consistency of predictions with respect to the direction of cloud motion and to better handle clear-sky conditions. This work has led to multiple publications in peer-reviewed journals and conferences. A separate method for modeling shadow propagation, based on contour detection and optical flow, has also been developed in support of goal #1. This provides both a backup method and a reference for evaluating the neural network models. In this reporting period, work has continued on developing multi-objective and multi-agent control designs, in support of goals #2 and #3. It has also expanded to include a novel approach to multi-agent control based on reinforcement learning techniques. Work on goals #2 and #3 has resulted in one master's thesis and multiple peer-reviewed journal publications. Several activities in this reporting period supported the validation of theory and algorithms being developed under this project (goal #4). An open-source, multi-vehicle ground control station is under development, with major components being completed within this reporting period. This project is called Recon and is hosted on GitHub: https://github.com/poli0048/Recon. In this reporting period the drone interface module became feature-complete and was fully integrated into the project. The companion iOS App also became feature-complete. The Shadow Detection module, which was working by the end of the previous reporting period, has been integrated into Recon, streamlined, and optimized in this period. The Shadow Propagation Module was developed and two separate implementations have become feature-complete in this reporting period (one using contour detection and optical flow, the other using an LSTM). The Guidance Module has seen significant development in this period as well.
It was not feature-complete by the end of this period, but it has reached the major milestone of working with simulated drones in certain test cases. In this period we also moved from component and simulation-based testing to full system testing, both using a single vehicle as well as with multiple vehicles simultaneously. A video demonstration showing a 4-vehicle flight test is available here: https://www.youtube.com/watch?v=aSOjmPsa1Go. In support of goal #5, dissemination of project developments has been achieved through multiple publications in peer-reviewed journals and refereed conferences. Material from this project has been incorporated into coursework at UIUC, UC Berkeley, and Stanford in support of our educational objectives. Outreach to the agricultural community in this reporting period has been through informal discussions with individuals at the University of Minnesota Research Outreach Centers and the Minnesota Department of Agriculture.
Publications
- Type: Theses/Dissertations
  Status: Published
  Year Published: 2021
  Citation: E. Chai, "Analysis of quantization and normalization effects in deep neural networks," PhD thesis, Stanford University, Aug. 2021. URL: http://searchworks.stanford.edu/view/13971425
Progress 04/01/20 to 03/31/21
Outputs Target Audience:
Nothing Reported
Changes/Problems:
Nothing Reported
What opportunities for training and professional development has the project provided?Significant portions of this project's research tasks were carried out by students training to become more proficient in research. These students received close mentorship and guidance from advanced graduate students as well as their advising faculty. The project also provided ample opportunities for gaining experience in scientific communication, with one student centering their undergraduate honors thesis around their contributions to the project. How have the results been disseminated to communities of interest?Publication on a developed algorithm; see Products. What do you plan to do during the next reporting period to accomplish the goals?Immediate tasks include the development of a predictive cloud shadow model with reasonable performance, the translation of these predictions into UAV control commands, and the creation of an interface with the UAV fleet to facilitate communication between prediction/control processing (i.e., a laptop) and the physical UAVs. Successful progress in these tasks is directly pertinent to the following goals that are currently unmet. Goal: "Develop methods to blend information-theoretic and safety objectives in a novel multi-objective methodology that allows for scalability across multiple robots." Goal: "Derive closed-form vehicle control laws based on the objectives and dynamic characteristics of the vehicles. The design will tolerate imperfect information exchange as well as communication time delays while safely collecting shadow-free imagery." Goal: "Validate the developed theory and algorithms with autonomous vehicles." One specific task is the development of a communication interface between the UAV mobile software platform and the multi-vehicle ground control station (GCS) running the algorithms. This interface is critical to the development and experimental validation of multi-objective UAV control algorithms.
This interface will need to retrieve telemetry data such as GPS locations, drone velocity, and orientation, as well as the live camera feed from the drone fleet, and forward this information over a network socket interface to the GCS for processing. The resulting flight commands from the GCS will need to be forwarded over the same interface to control the UAV fleet to collaboratively image the area while avoiding cloud shadows. Because this interface is critical to the development of the control algorithms, its creation is an immediate task pertinent to all of the unmet goals. Another task is the optimization of the shadow detection and prediction algorithms. Real-time processing is required to ensure that the prediction and control algorithms' results do not become outdated by long shadow map processing latencies and quickly changing cloud dynamics. Outdated results can lead to flawed imagery and unnecessarily long UAV waypoint missions. Therefore, the next step for the shadow detection and prediction algorithms is to carefully profile and optimize the code to meet the latency requirements set by the vehicle control laws and cloud dynamics. More work is also needed toward the goal: "Perform education, outreach, and dissemination activities."
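The telemetry and command forwarding described above could, for instance, use length-prefixed JSON messages over the network socket. The sketch below is purely illustrative: the framing scheme, function names, and telemetry schema are our assumptions, not the actual interface between the mobile platform and the GCS.

```python
import json
import socket
import struct

def send_msg(sock, obj):
    """Send one message as a 4-byte big-endian length prefix plus JSON payload."""
    payload = json.dumps(obj).encode("utf-8")
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_msg(sock):
    """Receive one length-prefixed JSON message (blocking)."""
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return json.loads(_recv_exact(sock, length).decode("utf-8"))

def _recv_exact(sock, n):
    """Read exactly n bytes, since recv() may return partial data."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf

# Hypothetical telemetry record (fields are assumptions for illustration):
telemetry = {"drone_id": 1, "lat": 44.98, "lon": -93.27,
             "alt_m": 120.0, "vel_mps": [3.1, 0.4, 0.0], "yaw_deg": 87.5}
```

Flight commands from the GCS could travel back over the same framing in the opposite direction; a live camera feed would in practice need a higher-bandwidth binary stream rather than JSON.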
Impacts What was accomplished under these goals?
In response to rising populations and anthropogenic climate change, this project is motivated by the need for precise monitoring of crop status using unmanned aerial vehicles (UAVs) and mounted sensors. Covering an average of two-thirds of the Earth's surface, clouds and the shadows that they cast can influence subsequent interpretations of the imagery collected by UAVs, resulting in inaccurate prescriptions of crop inputs. Our goals, therefore, center around enabling UAV-based precision agriculture by devising a comprehensive strategy to consistently collect shadow-free imagery. Successful execution of our goals is of immediate relevance and benefit to mid- and large-scale agriculture, where the combination of wide spatial coverage and the high resolution of precision agriculture is most advantageous. Goal: "Develop machine learning algorithms to model and estimate the dynamics of cloud shadow evolution. These estimation schemes will be time-dependent and based on particular neural network models." The development and training of predictive machine learning algorithms require that a significant quantity of data first be collected and processed. To predict the cloud shadow dynamics, the algorithm needs training data that either directly or indirectly indicates the location of cloud shadows over time. The raw input that we use to produce this data is the video feed taken from a downward-facing camera mounted to a high-flying UAV. As clouds travel above the UAV, their shadows move across the camera's field of view. To maximize the amount of spatial coverage, the UAV uses a wide fisheye lens. Video feed is collected in this manner in several locations across the US. Raw video is subsequently processed for input into our predictive model. In particular, we are interested in extracting the locations of cloud shadows from raw imagery and determining the geographical coordinates of these shadows.
Our cloud shadow extraction algorithm implements video stabilization to produce comparable frames and creates a binary mask of shadowed pixels based on a comparison of frames to a fully-lit reference frame. Determining the geographical coordinates of shadows requires calibration of the fisheye lens to establish a relationship between pixels and geographical coordinates. Key implementations include a fast camera pose solver (determining where the camera is in space) and transformations between several coordinate systems to maximize geographical accuracy and robustness to random error introduced in the calibration process. Additionally, a custom raster format was developed to optimize the storage of the cloud shadow maps and their corresponding geographical coordinates (i.e., georegistration). Through this process, several thousand georegistered cloud shadow frames were produced for later use as input into the predictive models. With these initial samples of cloud shadow frames, development (i.e., model parameterization, training, and evaluation) of the predictive model may proceed in parallel with the collection of additional raw cloud shadow video for processing.
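The reference-frame comparison step can be illustrated with a minimal sketch: after stabilization, pixels whose brightness falls well below the fully-lit reference are flagged as shadowed. The function name and the threshold value below are illustrative assumptions, not the project's tuned algorithm, which also handles stabilization, lens distortion, and georegistration.

```python
import numpy as np

def shadow_mask(frame, reference, ratio_thresh=0.7):
    """Flag pixels noticeably darker than the fully-lit reference frame.

    frame, reference: stabilized grayscale images as float arrays in [0, 1],
    assumed to be pixel-aligned. ratio_thresh is an illustrative value;
    a real system would tune it (and likely smooth the result).
    """
    ratio = frame / np.maximum(reference, 1e-6)  # guard against divide-by-zero
    return (ratio < ratio_thresh).astype(np.uint8)

# Toy example: a uniformly lit field whose top half is dimmed by a cloud.
reference = np.full((4, 4), 0.8)
frame = reference.copy()
frame[0:2, :] *= 0.5          # shadowed region at half brightness
mask = shadow_mask(frame, reference)
```

In the full pipeline, each such mask would then be georegistered and stored in the custom raster format described above.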
Publications
- Type: Journal Articles
  Status: Accepted
  Year Published: 2020
  Citation: Stipanovic, D.M., Kapetina, M.N., Rapaic, M.R., et al. "Stability of Gated Recurrent Unit Neural Networks: Convex Combination Formulation Approach." J Optim Theory Appl (2020). https://doi.org/10.1007/s10957-020-01776-w