Source: SPECTRAL SCIENCES, INC. submitted to NRP
CLOUD EFFECTS CORRECTION OF DRONE-COLLECTED CROP IMAGERY FOR PRECISION AGRICULTURE
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1016091
Grant No.
2018-33610-28555
Cumulative Award Amt.
$100,000.00
Proposal No.
2018-00266
Multistate No.
(N/A)
Project Start Date
Sep 1, 2018
Project End Date
Apr 30, 2019
Grant Year
2018
Program Code
[8.13] - Plant Production and Protection-Engineering
Recipient Organization
SPECTRAL SCIENCES, INC.
4 4TH AVE
BURLINGTON, MA 01803
Performing Department
(N/A)
Non Technical Summary
The effect of cloud shadows on aerial imagery is an unsolved problem that has been identified by many operators of unmanned aerial vehicle (drone)-based remote sensing systems for agriculture. Shadows affect the ability to compare imagery, including multispectral and hyperspectral imagery (MSI and HSI), taken over periods of time, and the ability to quantify image data analytics, including NDVI and more sophisticated metrics for crop health, water stress, pigments, diseases, and soil nutrients. The overall objective of Phase I of this effort is to demonstrate the component technologies needed to provide cloud effects correction in drone imagery, including development of a preliminary design for the hardware and data post-processing software. Critical components include the collection and storage of sky data, algorithms to transform those data into quantifiable shadow and other cloud effects, and finally application of the resulting irradiance maps to correct cloud shadow in the down-looking imagery.
Animal Health Component
75%
Research Effort Categories
Basic
(N/A)
Applied
75%
Developmental
25%
Classification

Knowledge Area (KA)   Subject of Investigation (SOI)   Field of Science (FOS)   Percent
404                   0420                             2020                     40%
132                   0499                             2080                     60%
Goals / Objectives
The effect of cloud shadows on aerial imagery is an unsolved problem that has been identified by many operators of unmanned aerial vehicle (drone)-based remote sensing systems for agriculture. Shadows affect the ability to compare imagery, including multispectral and hyperspectral imagery (MSI and HSI), taken over periods of time, and the ability to quantify image data analytics, including NDVI and more sophisticated metrics for crop health, water stress, pigments, diseases, and soil nutrients. For this reason, Spectral Sciences Inc. (SSI) proposes to develop the ShadowMapper system to support agricultural drone data collection. The system will integrate with crop analytics software, such as PrecisionHawk's PrecisionMapper and Harris's Highland Hub farm management tool, as a step in the correction of crop imagery to produce the most accurate crop health information for farmers and precision agriculture service providers.

The overall objective of Phase I of this effort is to demonstrate the component technologies needed to provide cloud effects correction in drone imagery, including development of a preliminary design for the hardware and data post-processing software. Critical components include the collection and storage of sky data, algorithms to transform those data into quantifiable shadow and other cloud effects, and finally application of the resulting irradiance maps to correct cloud shadow in the down-looking imagery. The tasks to achieve these objectives are:

1. Data simulations supporting development of the sky image processing algorithm.
2. Essential algorithm development.
3. Field demonstration using breadboard hardware and software components.
4. Preliminary design of the Phase II prototype hardware and software for the cloud effects correction system.
5. Documentation of the Phase I project in a Final Technical Report.
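For illustration only (not project code), the sketch below shows how a per-pixel illumination correction of this kind could be applied to down-looking red and near-infrared radiance before computing NDVI. The array names, the simple scaling model, and the clipping limits are assumptions made for the example.

    import numpy as np

    def correct_and_ndvi(red_radiance, nir_radiance, irradiance_map, clear_sky_irradiance):
        """Hypothetical illustration: scale shadowed pixels toward clear-sky
        illumination, then compute NDVI from the corrected bands."""
        # Relative illumination: 1.0 = fully sunlit, lower values = cloud shadow.
        illum = np.clip(irradiance_map / clear_sky_irradiance, 0.05, 1.0)

        # Divide out the illumination deficit in each band.
        red_corr = red_radiance / illum
        nir_corr = nir_radiance / illum

        # Standard NDVI from the corrected bands (small term avoids divide-by-zero).
        return (nir_corr - red_corr) / (nir_corr + red_corr + 1e-6)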
Project Methods
Phase I will be a critical concept study. We will take preliminary data using an in-house drone system and demonstrate critical algorithms for cloud mapping.

Progress 09/01/18 to 04/30/19

Outputs
Target Audience: The target audience for this project is USDA researchers and technologists with an interest in improving farm analytics data collected from airborne and drone instruments. Ultimately this program will result in farmers and farm services analytics data providers increasing available flight days by 20-60%, depending on region and season, improving the value of drone systems and improving the level of crop intelligence available.

Changes/Problems: The original Phase I concept called for mapping clouds near the sun that cast shadows within our primary data collection system's nadir field of view (FOV) and projecting them onto a map of the ground beneath the data collection sensor, to produce an illumination correction that could be applied to the data in post-processing. While the approach is feasible, we have determined that the baseline measurements required are difficult, function well only over limited conditions, and add complexity to typical drone flight. However, we have developed an alternative concept that reduces the complexity of the measurement by directly observing and quantifying the cloud-shadowed illumination projected onto the ground.

What opportunities for training and professional development has the project provided? Nothing Reported

How have the results been disseminated to communities of interest? Nothing Reported

What do you plan to do during the next reporting period to accomplish the goals? The goal of Phase II is to develop a prototype of the cloud shadow correction system, including the required hardware and software components. Our goal is to use off-the-shelf hardware to the extent possible for the prototype and, if feasible, use the existing drone camera and related GPS, timing, and inertial navigation data. Most drones used for remote sensing have an onboard RGB camera for context viewing that could be appropriated. In some cases, this camera may be used as the primary data collection system, and we may then require an additional wide-field-of-view (WFOV) imager. The major focus of Phase II will be development and testing of specialized software to georeference and overlay imagery from the moving platform, detect frame-to-frame changes indicating cloud shadow motion, build up a temporal cloud shadow map including attenuation levels, and apply the maps to the data in a correction algorithm.
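As a hedged illustration of the frame-to-frame change detection step described above, the sketch below builds per-pixel attenuation maps from a sequence of co-registered WFOV ground images, using the brightest value seen at each pixel as a sunlit reference. It assumes the frames have already been georeferenced to a common grid; the function name, threshold, and reference model are notional and not the planned Phase II software.

    import numpy as np

    def shadow_attenuation_sequence(frames, motion_threshold=0.1):
        """Illustrative sketch: estimate cloud shadow attenuation and motion
        from WFOV ground frames taken a few seconds apart."""
        frames = [np.asarray(f, dtype=float) for f in frames]

        # Per-pixel sunlit reference: the brightest value seen at each pixel,
        # assuming every pixel is sunlit in at least one frame of the sequence.
        sunlit_ref = np.maximum.reduce(frames)

        attenuation_maps, change_masks = [], []
        previous = None
        for frame in frames:
            # 1.0 = sunlit; smaller values indicate cloud shadow attenuation.
            atten = np.clip(frame / (sunlit_ref + 1e-6), 0.0, 1.0)
            attenuation_maps.append(atten)
            if previous is not None:
                # Frame-to-frame change flags moving cloud shadow edges.
                change_masks.append(np.abs(atten - previous) > motion_threshold)
            previous = atten
        return attenuation_maps, change_masks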

Impacts
What was accomplished under these goals? Intermittent cloud shadowing on the ground during an image data collection from a manned aircraft or drone platform results in variability in the solar illumination of the data, often rendering it unusable by current data processing routines. Shadows affect the ability to compare imagery, including multispectral and hyperspectral imagery (MSI and HSI), taken over periods of time. Furthermore, shadows affect the ability to quantify spectral image data analytics, specifically vegetative indices based on molecular absorption of light in characteristic spectral bands, that form sophisticated metrics for crop health, water stress, pigments, diseases, and soil nutrients.

Recognizing that direct observation of cloud shadows on the ground is hindered by ground clutter, the original Phase I concept called for an all-sky imager to map the sky during data collection and project the illumination correction onto the 'primary' image data. Critical components of such a system include the acquisition and storage of sky imagery, algorithms to project and transform the imagery into quantifiable ground irradiance maps that account for shadow, and finally application of the irradiance maps to correct cloud shadow in the simultaneously collected primary image data. The key objective of Phase I was to establish this approach. To fulfill the Phase I objective, we undertook a number of tasks, including simulation of the sky and resultant cloud shadow imagery used in the proposed approach, development of essential algorithms for cloud projection, and acquisition of field data to demonstrate the approach in real-world conditions. All of the tasks were completed and are described in this report. However, the key result of our Phase I study was to substantially revise the approach to cloud shadow mapping and correction we have proposed for Phase II. We were able to demonstrate key components of this new approach in Phase I and, based on the results, complete the final technical task: specification of a Phase II system.

The original Phase I concept called for mapping clouds near the sun that cast shadows within our primary data collection system's nadir field of view (FOV) and projecting them onto a map of the ground beneath the data collection sensor, to produce an illumination correction that could be applied to the data in post-processing. While the approach is feasible, we have determined that the baseline measurements required are difficult, function well only over limited conditions, and add complexity to typical drone flight. However, we have developed an alternative concept that reduces the complexity of the measurement by directly observing and quantifying the cloud-shadowed illumination projected onto the ground. Quantification of the cloud shadow requires discrimination of the shadow from a widely varying, cluttered background. Our new Phase II approach accomplishes this discrimination using the shadow motion as it traverses the ground. This requires a very wide field of view that covers a significant swath of the field being measured, not just the instantaneous footprint of the primary image collection system, so that we can capture every pixel in the field under sunlit and changing shadow conditions. The ShadowMapper camera takes imagery of the field at intervals of a few seconds, and the changes in illumination across the field are captured and extracted using change detection algorithms.
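As a minimal sketch of the projection geometry that the original concept described above depends on, the following example computes where a cloud at a known altitude casts its shadow on flat ground for a given sun position. The coordinate conventions and variable names are assumptions for illustration, not the Phase I algorithms themselves.

    import math

    def cloud_shadow_offset(cloud_altitude_m, solar_zenith_deg, solar_azimuth_deg):
        """Horizontal offset (east_m, north_m) of the shadow relative to the
        point directly beneath the cloud, assuming flat terrain.
        solar_azimuth_deg is measured clockwise from north toward the sun;
        the shadow is displaced away from the sun by altitude * tan(zenith)."""
        dist = cloud_altitude_m * math.tan(math.radians(solar_zenith_deg))
        anti_solar_az = math.radians(solar_azimuth_deg + 180.0)
        east = dist * math.sin(anti_solar_az)
        north = dist * math.cos(anti_solar_az)
        return east, north

    # Example: a cloud at 1500 m with the sun 40 degrees from zenith, due south,
    # casts its shadow roughly 1259 m north of the point beneath the cloud.
    print(cloud_shadow_offset(1500.0, 40.0, 180.0))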
Once the shadow radiance is mapped onto the primary data, we use the differences to make adjustments to the radiance measured by the primary system, or to the subsequent reflectance determined using atmospheric correction software. The plan for the Phase II system includes a number of processes:

1) A time sequence of WFOV ground images is collected at the same time as our primary data collection.
2) Change detection algorithms are used to discriminate cloud shadow positions and intensities as they pass over the ground.
3) This information is correlated with imagery from the narrower-field primary data collection device to produce a frame illumination mask.
4) This mask in turn is applied as a correction to the radiant flux obtained with the primary device.
5) The corrected crop imagery data can then be used for crop analytics, such as vegetative indices, which ultimately produce intelligence of relevance to the farmer.

The Phase I project was critical to arriving at the proposed Phase II concept. In Phase I we demonstrated the methods needed to project sky illumination onto the ground, using sky images to determine the cloud altitudes and subsequently the diffuse radiance due to clouds. This exercise clarified the required algorithms, parameters, and limitations of our initial approach. The field demonstrations, from the ground and from our Matrice 600 drone, introduced us to the hardware and inspired the measurement approach that we have adopted for Phase II.
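To illustrate steps 3) and 4) above, the sketch below samples a WFOV attenuation map (such as the one produced in the earlier change detection sketch) at the ground locations seen by the primary sensor and divides out the illumination deficit. The geolocation inputs, the nearest-neighbor lookup, and the clipping limit are illustrative assumptions rather than the planned implementation.

    import numpy as np

    def apply_frame_illumination_mask(primary_radiance, pixel_east, pixel_north,
                                      attenuation_map, map_east0, map_north0, map_gsd):
        """Hypothetical illustration: correct a primary-sensor radiance frame
        using a georegistered WFOV cloud shadow attenuation map."""
        # Nearest-neighbor lookup of the attenuation value under each primary pixel,
        # given the map origin (map_east0, map_north0) and ground sample distance.
        cols = np.clip(np.rint((pixel_east - map_east0) / map_gsd).astype(int),
                       0, attenuation_map.shape[1] - 1)
        rows = np.clip(np.rint((pixel_north - map_north0) / map_gsd).astype(int),
                       0, attenuation_map.shape[0] - 1)
        illumination_mask = attenuation_map[rows, cols]

        # Divide out the illumination deficit to restore clear-sky-equivalent radiance.
        return primary_radiance / np.clip(illumination_mask, 0.05, 1.0)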

Publications