Source: UNIVERSITY OF ILLINOIS submitted to NRP
III: SMALL: DATAG: SCALABLE REAL-TIME SATELLITE-BASED CROP YIELD FORECASTING FRAMEWORK VIA DEEP LEARNING
Sponsoring Institution
National Institute of Food and Agriculture
Project Status
COMPLETE
Funding Source
Reporting Frequency
Annual
Accession No.
1024495
Grant No.
2021-67021-33446
Cumulative Award Amt.
$496,973.00
Proposal No.
2020-08943
Multistate No.
(N/A)
Project Start Date
Nov 15, 2020
Project End Date
Nov 14, 2024
Grant Year
2021
Program Code
[A7302] - Cyber-Physical Systems
Recipient Organization
UNIVERSITY OF ILLINOIS
2001 S. Lincoln Ave.
URBANA, IL 61801
Performing Department
Department of Geography and GIScience
Non Technical Summary
Large-scale crop yield forecasting at fine spatial and temporal granularities is key to characterizing agricultural productivity for individual farm fields towards more intelligent and precise farm management. It has notable implications for predicting future trajectories of food prices, food security, and agricultural development. However, region-wide forecasting of field-level crop yield in a timely fashion remains a long-standing challenge, due to the difficulty of obtaining appropriate field-level data and the poor scalability of existing forecasting models. The rapid advances in satellite remote sensing and recent innovations in deep learning open up new opportunities to tackle this challenge. This project therefore aims to advance and benchmark crop yield forecasting systems at both fine spatial and temporal resolutions with cutting-edge deep learning approaches. The overarching goal of the project is to develop a scalable, real-time, and field-level crop yield forecasting framework with satellite remote sensing. We will develop this yield forecasting framework through the following specific objectives: 1) Develop a hybrid deep learning-based image fusion model to generate dense satellite imagery at the farm field level; 2) Devise a network-based phenological model to estimate crop phenology in a timely fashion; and 3) Develop an innovative deep learning-enabled integrated model to prototype scalable, real-time, and field-level crop yield forecasting systems. Characterized by these unique features, the yield forecasting framework can provide sufficient spatial granularity to predict crop productivity at the field level, which will drastically benefit precise farm management, particularly in small-holder agricultural systems.
The framework will enable crop yield forecasting in real time throughout the growing season, helping governments, stakeholders, and farmers make timely adaptive management, economic, and political decisions. The framework will also empower the paradigm shift from conventional chronological calendar-based crop yield modeling architectures to more scalable phenology-based ones, and thus holds strong potential to be generalized over wide geographical regions.
Animal Health Component
30%
Research Effort Categories
Basic
30%
Applied
30%
Developmental
40%
Classification

Knowledge Area (KA): 601
Subject of Investigation (SOI): 6030
Field of Science (FOS): 2080
Percent: 100%
Goals / Objectives
The overarching goal of this project is to develop a scalable, real-time, and field-level crop yield forecasting framework via cutting-edge deep learning approaches. This goal will be achieved through the following specific objectives: 1) Develop a hybrid deep learning-based image fusion model to generate dense satellite imagery at the farm field level; 2) Devise a network-based phenological model to estimate crop phenology in a timely fashion; and 3) Develop an innovative deep learning-enabled integrated model to prototype scalable, real-time, and field-level crop yield forecasting systems.
Project Methods
The proposed crop yield forecasting framework encompasses three key constituents: a hybrid deep learning-based image fusion model, a network-based crop phenological model, and a deep learning-enabled integrated crop yield forecasting system. Specifically, the hybrid deep learning-based image fusion model integrates convolutional neural network (CNN) and long short-term memory (LSTM) models to fuse imagery of varying spatio-temporal resolutions, and to generate dense satellite imagery tailored to individual farm fields. The network-based crop phenological model facilitates real-time crop phenological characterization with dynamic complex network structures, which is a critical step to standardize crop development variations over space and time for scalable yield forecasting. The deep learning-enabled integrated crop yield forecasting system includes three components: crop model simulated realizations, an LSTM-enabled integrated crop yield forecasting model, and remote sensing field-level crop yield forecasting. It can predict the cumulative yield responses along a sequence of crop growing stages in real time using an innovative phenology-guided LSTM modeling architecture. This modeling architecture can integrate site-specific crop model simulations with region-wide remote sensing monitoring, and possesses immense potential to transform conventional forecasting systems. An open-source cyberGIS toolkit will be devised to facilitate the adoption and continuing development of the proposed framework by taking advantage of advanced cyberinfrastructure. The proposed yield forecasting framework will catalyze a new generation of satellite-based models for scalable, real-time, and field-level crop yield forecasting, which can hardly be achieved by conventional yield forecasting models.
The innovative modeling architectures embedded in the framework enable flexible designs and configurations of yield models for more generalizable, timely, and precise crop yield forecasting.
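The core intuition behind spatio-temporal image fusion can be illustrated with a deliberately simple temporal-delta scheme: add the change observed in frequent coarse-resolution imagery to a fine-resolution reference image. This numpy sketch is only a conceptual illustration of the fusion task (the function name and toy arrays are ours); the project's hybrid CNN-LSTM model learns this mapping rather than applying a fixed rule.

```python
import numpy as np

def delta_fusion(fine_ref, coarse_ref, coarse_pred):
    """Predict a fine-resolution image at the target date by adding the
    coarse-resolution temporal change to a fine reference image.

    All inputs are arrays on the same grid (coarse bands upsampled
    beforehand); learned fusion models replace this fixed additive rule.
    """
    return fine_ref + (coarse_pred - coarse_ref)

# Toy example: reflectance rises uniformly by 0.05 between dates,
# so the fused image should read 0.20 + 0.05 = 0.25 everywhere.
fine_ref = np.full((4, 4), 0.20)
coarse_ref = np.full((4, 4), 0.22)
coarse_pred = np.full((4, 4), 0.27)
fused = delta_fusion(fine_ref, coarse_ref, coarse_pred)
```

The additive rule fails exactly where land cover changes abruptly between dates, which is why learned models such as the hybrid CNN-LSTM are needed for farm fields with rapid phenological transitions.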

Progress 11/15/20 to 11/14/24

Outputs
Target Audience: The target audiences for our research include scientists and researchers who study crop production, agricultural remote sensing, and artificial intelligence applications in agriculture, as well as agricultural stakeholders who benefit from within-season crop yield estimation for improving farm management practices and decision making. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? The PI/Co-PI have recruited about six graduate students as well as summer graduate interns for this project. The students were trained by the PI/Co-PI to collect various types of data, analyze remote sensing imagery of various spatial and temporal resolutions, develop innovative deep learning, crop phenological, and yield estimation models, as well as to develop the CyberGIS-Compute middleware system. The PI/Co-PI also supervised the students in conducting various experiments (e.g., the PB-CNN crop yield estimation model and CyberGIS-Compute middleware design) relevant to the project to help them gain an enhanced understanding of agricultural research and relevant domains. The materials, data, and models developed in this project will dramatically help the students with their dissertation research and future professional development. Over the project period, the students have actively participated in about ten manuscripts resulting from the project. The students have also presented the project work at several academic conferences, such as the Annual Meeting of the Association of American Geographers and the American Geophysical Union (AGU) Fall Meeting. Zhijie Zhou, a student working on the project, won first place in the Student Honors Paper Competition, organized by the remote sensing specialty group, at the Annual Meeting of the Association of American Geographers (2024).
His presentation title is "CropSight: towards a national-scale operational framework for object-based crop type ground truth retrieval using street view and PlanetScope satellite imagery". Tianci Guo, a student working on the project, won third place in the Student Honors Paper Competition, organized by the remote sensing specialty group, at the Annual Meeting of the Association of American Geographers (2023). Her presentation title is "Towards scalable field-level crop yield estimation through integration of crop model and deep learning". Chishan Zhang, a student working on the project, won second place in the Student Honors Paper Competition, organized by the remote sensing specialty group, at the Annual Meeting of the Association of American Geographers (2022). His presentation title is "County-level soybean yield estimation based on Bayesian-CNN incorporating phenology dynamic." The PI significantly refined the course "Introduction to Remote Sensing" using remote sensing data collected in the project and the agricultural remote sensing models devised in it. This course had students from various academic backgrounds, including both undergraduate and graduate students, as well as students from minority and underrepresented groups. The PI prepared training materials in four of the labs to introduce agricultural remote sensing through real-world examples. This course exposes students to agricultural remote sensing and artificial intelligence. The PI has also developed a new course, "Advanced Remote Sensing", using the remote sensing data collected and the models devised in the project. This course is an advanced-level graduate seminar that introduces some of the most innovative advances in remote sensing, and the novel models developed from this project fit the scope of the course well and are intensively discussed in it.
The course has students from various academic backgrounds, including students from minority and underrepresented groups. This course inspires students to solve urgent problems in agriculture using remote sensing imagery and artificial intelligence. How have the results been disseminated to communities of interest? The results from this project have been disseminated through about ten peer-reviewed publications and numerous conference presentations (such as the Annual Meeting of the Association of American Geographers, the American Geophysical Union Fall Meeting, the International Conference on Geoinformatics, the Taylor Geospatial Institute Conference, and the NASA Carbon Cycle & Ecosystems Joint Science Workshop). We have also communicated the research results to a broader community during the university research review, Illinois GIS Day, and the Engineering Open House, as well as to farmers via the Data-Intensive Farm Management meeting/International Conference for On-farm Precision Experimentation. We have closely collaborated with research scientists at USDA on the development of project models (e.g., the hybrid phenology matching model and EMET) and the dissemination of corresponding results to government agencies. We have organized several relevant remote sensing sessions at the annual meeting of the Association of American Geographers to introduce the developed models and share the research findings with colleagues in academic institutions, government agencies, and industrial sectors, including the "Time Series Remote Sensing in Characterizing Land Surface Dynamics" session in 2021, the "Advances in Agricultural Remote Sensing and Artificial Intelligence" sessions in 2022 and 2023, and the "Advances in Multitemporal Remote Sensing for Terrestrial Ecosystems" session in 2024. These activities have been well-received, and sparked strong interest among various stakeholders in our project, deep learning, and agricultural remote
sensing. What do you plan to do during the next reporting period to accomplish the goals? Nothing Reported

Impacts
What was accomplished under these goals? The overarching goal of this project is to develop a scalable, real-time, field-level crop yield forecasting framework via cutting-edge deep learning approaches. Real-time forecasting of crop yields at the field level has become increasingly crucial to investigate spatially and temporally varying factors associated with yield gaps, and to empower farmers to make timely adaptive agronomic decisions throughout the growing season for more efficient farm management. It also provides early warnings for supply chain planning of agricultural industries and trade markets, and helps with the assessment of crop insurance and land rental tailored to individual farm fields. To achieve this goal, we have developed innovative hybrid deep learning-based spatio-temporal image fusion models to fuse the imagery from various earth observation satellites, and devised the CyberGIS-Compute middleware system to achieve scalable and reproducible remote sensing data fusion by harnessing advanced cyberinfrastructure. We have also developed a hybrid phenology matching model to characterize the crop phenological growth cycle, which represents a key step for the subsequent crop yield forecasting. We have further advanced field-level planting date retrieval by devising a novel CropSow framework, which integrates cutting-edge crop models and remote sensing phenology models. To further expand near-real-time (NRT) crop phenological monitoring to the US Corn Belt, we have developed the EMET model for NRT crop type mapping by integrating thermal-time-based phenology normalization with an advanced deep learning classification model. With the innovatively retrieved crop phenology information, we have developed novel deep learning-based yield estimation models.
The crop yield estimation models can largely advance the modeling of crop yield responses to crop growth and environmental conditions across critical crop phenological growth stages, as well as the quantification of yield estimation uncertainty. Such crop yield forecasting and uncertainty information is critical for the price forecasts of agricultural commodities, aiding in crop insurance design, trade decision making, and early warnings of food insecurity. Objective 1: Develop a hybrid deep learning-based image fusion model to generate dense satellite imagery at the farm field level. We have developed an innovative hybrid deep learning model that can effectively and robustly fuse satellite imagery of various spatial and temporal resolutions, and designed the CyberGIS-Compute middleware system to achieve scalable and reproducible remote sensing data fusion across multiple spatiotemporal resolutions by harnessing advanced cyberinfrastructure. To further enhance our capability in conducting spatiotemporal image fusion at large scales, we have also developed a multi-stream spatiotemporal fusion GAN (STGAN) model that can accommodate drastic temporal differences among satellite imagery. STGAN is uniquely designed with the integration of a spatial transformer, a channel attention module, and a U-net structure to accommodate the changes in both spectral reflectance and spatial structures among the satellite imagery series. It outperforms several advanced benchmark spatiotemporal image fusion models, particularly in predicting fusion images without temporally close satellite images. With the hybrid deep learning fusion model and STGAN, as well as the CyberGIS-Compute middleware system for large-scale satellite imagery fusion, we have successfully achieved our first objective. Objective 2: Devise a network-based phenological model to estimate crop phenology in a timely fashion.
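Channel attention of the kind integrated into STGAN can be sketched in the squeeze-and-excitation style: pool each channel to a scalar descriptor, pass the descriptors through a small bottleneck MLP, and rescale the channels by the resulting sigmoid weights. The numpy sketch below is a generic illustration of that mechanism under our own toy shapes and random weights; it does not reproduce STGAN's actual module or parameters.

```python
import numpy as np

def channel_attention(x, w1, w2):
    """Squeeze-and-excitation style channel attention on a (C, H, W) tensor:
    global-average-pool each channel, pass the descriptors through a
    two-layer bottleneck MLP, and rescale channels by sigmoid weights."""
    squeeze = x.mean(axis=(1, 2))                    # (C,) channel descriptors
    hidden = np.maximum(0.0, w1 @ squeeze)           # ReLU bottleneck, (C/r,)
    weights = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))   # sigmoid weights, (C,)
    return x * weights[:, None, None]                # reweight each channel

# Toy feature map with 8 channels and a reduction ratio of 2.
rng = np.random.default_rng(0)
C, H, W, r = 8, 16, 16, 2
x = rng.normal(size=(C, H, W))
w1 = rng.normal(size=(C // r, C)) * 0.1
w2 = rng.normal(size=(C, C // r)) * 0.1
y = channel_attention(x, w1, w2)
```

Because the sigmoid weights lie in (0, 1), the module can only attenuate channels relative to the input, letting the network emphasize spectrally informative bands while suppressing noisy ones.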
We have developed a novel hybrid phenology matching model to robustly retrieve a diverse spectrum of crop phenological stages using satellite time series. The hybrid phenology matching model can retrieve most crop growth stages with high accuracy, yet its retrieval performance for crop planting dates still needs to be improved due to uncertainty caused by human activities. To further advance field-level planting date estimation, we have developed a novel CropSow framework. CropSow integrates the remote sensing phenological detecting method with the crop growth model for retrieving crop planting dates at the field level. The remote sensing phenological detecting method is devised to retrieve the critical crop phenological metrics of farm fields from remote sensing time series, which are then integrated into the crop growth model for field planting date estimation in consideration of the soil-crop-atmosphere continuum. CropSow leverages the rich physiological knowledge embedded in the crop growth model to scalably interpret satellite observations under a variety of environmental and management conditions for field-level planting date retrievals. The developed CropSow outperforms three advanced benchmark models (i.e., the remote sensing accumulative growing degree day method, the weather-dependent method, and the shape model) in crop planting date estimation at the field level. It achieves better generalization performance than the benchmark models, as well as stronger adaptability to abnormal weather conditions, with more robust performance in estimating the planting dates of farm fields. With the hybrid phenology matching model and CropSow, we have successfully achieved our second objective. Objective 3: Develop an innovative deep learning-enabled integrated model to prototype scalable, real-time, and field-level crop yield forecasting systems.
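The thermal-time reasoning that links a satellite-detected emergence date back to a planting date can be illustrated with the standard growing degree day (GDD) computation. This sketch assumes the commonly used maize thresholds (10 °C base, 30 °C cap) and a hypothetical emergence thermal-time requirement of 80 GDD; CropSow's actual crop-model-based parameterization is richer and is not reproduced here.

```python
def daily_gdd(tmax_c, tmin_c, base=10.0, cap=30.0):
    """Daily growing degree days: clamp the temperatures to [base, cap],
    then take the excess of the daily mean over the base (never negative)."""
    tmax_c = min(max(tmax_c, base), cap)
    tmin_c = min(max(tmin_c, base), cap)
    return max(0.0, (tmax_c + tmin_c) / 2.0 - base)

def planting_date_from_emergence(daily_temps, emergence_day, gdd_to_emerge=80.0):
    """Walk backwards from a satellite-detected emergence day, accumulating
    GDD until the crop's emergence thermal-time requirement is met; the day
    reached is the inferred planting day (an index into daily_temps)."""
    total = 0.0
    for day in range(emergence_day, -1, -1):
        total += daily_gdd(*daily_temps[day])
        if total >= gdd_to_emerge:
            return day
    return 0

# Toy season: constant 25/15 C days contribute 10 GDD each, so an 80-GDD
# emergence requirement spans 8 days back from the detected emergence day.
temps = [(25.0, 15.0)] * 60
planting = planting_date_from_emergence(temps, emergence_day=40)
```

The crop growth model's role in CropSow is essentially to supply a physiologically grounded, environment-dependent version of the fixed `gdd_to_emerge` threshold used in this illustration.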
With characteristic crop phenology, we have developed a within-season emergence (WISE)-phenology normalized deep learning model (i.e., EMET) towards scalable within-season crop mapping, a key step for mapping crop distributions in real time for subsequent crop yield forecasting. Through accommodating the spatiotemporal variations in crop phenological dynamics, EMET exhibits more robust performance across the Corn Belt and can be transferred to different years with enhanced scalability. With the timely crop type mapping, we have developed novel deep learning-based models (i.e., PB-CNN and hybrid KG-DL models) for crop yield estimation and uncertainty quantification. The devised PB-CNN yield estimation model is mainly composed of three components: phenology imagery construction, multi-stream Bayesian-CNN modeling, and feature (i.e., yield predictor and phenological stage) and uncertainty analysis. With the innovative integration of critical crop phenological stages in modeling the crop yield response to a heterogeneous set of yield predictors, PB-CNN outperforms three advanced benchmark models (i.e., SVR, RF, and LSTM). Our further devised hybrid KG-DL model synthesizes the advanced crop model for crop growth/yield simulations and the phenology-guided deep learning model for building empirical relationships between crop yield and a variety of environmental and satellite-based predictors. The integration of both types of models eases the challenge of intensive input data requirements and parameter configuration of the crop model for large-scale crop yield prediction. Meanwhile, the massive amounts of simulated crop realizations guided by the crop model can tackle the data scarcity issue of site-specific observations for building the empirical deep learning model as well as enhance its generalization capability.
Among the yield predictor groups, the satellite-based predictor group is the most critical in crop yield estimation, followed by the water- and heat-related predictor groups. Throughout the growing season, the crop early reproductive phenological stage plays a more crucial role than other stages in modeling the yield. The devised model largely enhances our understanding of the complex crop yield response to varying environmental conditions across crop phenological stages for more sustainable agricultural development. With the EMET, PB-CNN, and hybrid KG-DL models, we have successfully achieved our third objective.
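Bayesian uncertainty quantification of the kind PB-CNN performs is often approximated in practice by Monte Carlo dropout: repeating stochastic forward passes and reporting the spread of the predictions. The toy sketch below applies the idea to a linear model with made-up predictor values and weights; it illustrates only the uncertainty-estimation principle, not the PB-CNN architecture itself.

```python
import numpy as np

def mc_dropout_predict(x, w, n_samples=500, p_drop=0.2, seed=1):
    """Approximate Bayesian prediction: repeat stochastic forward passes with
    random dropout masks on the weights, then report the mean prediction and
    its standard deviation as an uncertainty estimate."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_samples):
        mask = rng.random(w.shape) >= p_drop           # keep with prob 1 - p
        preds.append(x @ (w * mask) / (1.0 - p_drop))  # inverted-dropout scaling
    preds = np.array(preds)
    return preds.mean(axis=0), preds.std(axis=0)

# One field described by three hypothetical yield predictors.
x = np.array([[1.0, 2.0, 3.0]])
w = np.array([0.5, -0.2, 0.1])
mean, std = mc_dropout_predict(x, w)  # mean near x @ w = 0.4, std > 0
```

The standard deviation returned here is the analogue of the per-field yield uncertainty that supports applications such as crop insurance design and food insecurity early warning.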

Publications

  • Type: Book Chapters Status: Published Year Published: 2023 Citation: Zhang, C., Diao, C. and T. Guo. (2023). GeoAI for agriculture, in eds. Gao, S., Hu, Y. & Li, W. Handbook of Geospatial Artificial Intelligence, CRC Press/Taylor & Francis Group.
  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2024 Citation: Yang, Z., Diao, C., Gao, F. and B. Li (2024). EMET: An emergence-based thermal phenological framework for near real-time crop type mapping. ISPRS Journal of Photogrammetry and Remote Sensing, 215, 271-291.
  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2024 Citation: Liu, Y., Diao, C., Mei, W. and C. Zhang (2024). CropSight: Towards a large-scale operational framework for object-based crop type ground truth retrieval using street view and PlanetScope satellite imagery. ISPRS Journal of Photogrammetry and Remote Sensing, 216, 66-89.
  • Type: Peer Reviewed Journal Articles Status: Published Year Published: 2025 Citation: Lyu, F., Yang, Z., Diao, C., and S. Wang (2025). Multi-stream STGAN: A spatiotemporal image fusion model with improved temporal transferability. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 18, 1562-1576.
  • Type: Other Journal Articles Status: Under Review Year Published: 2024 Citation: Zhang, C., Li, X., Mieno, T., Diao, C., and D. Bullock (under review). Quadratic-Plateau Geographically Weighted Regression model for estimating site-specific economically optimal input rates.
  • Type: Other Journal Articles Status: Under Review Year Published: 2024 Citation: Guo, T., Diao, C., Yang, Z., Liu, Y. and C. Zhang (under review). Towards scalable field-level crop yield estimation through integration of crop model and deep learning.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Diao, C., Yang, Z., Liu, Y., Zhang, C. and T. Guo. Towards large-scale crop phenological characterization using multi-scale satellite time series. American Geophysical Union (AGU) Fall Meeting, San Francisco, CA, December 11-December 15, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Yang, Z., Diao, C., and F. Gao. EMET: An emergence-based thermal phenological framework for near real-time crop type mapping. American Geophysical Union (AGU) Fall Meeting, San Francisco, CA, December 11-December 15, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Zhou, Z., Liu, Y. and C. Diao. CropSight-US from field(view) to table: an object-based crop type ground truth dataset using street view and Sentinel 2 satellite imagery across the contiguous United States. Geo-Resolution Conference, St. Louis, MO. September 12, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Zhou, Z., Liu, Y. and C. Diao. CropSight - towards a national-scale operational framework for object-based crop type ground truth retrieval using street view and PlanetScope satellite imagery. The Thirty-First International Conference on Geoinformatics (Geoinformatics 2024), Toronto, Canada. August 14-August 16, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Guo, T. and C. Diao. Field-level crop yield estimation through integration of crop model and deep learning. The Thirty-First International Conference on Geoinformatics (Geoinformatics 2024), Toronto, Canada. August 14-August 16, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Zhang, C. and C. Diao. A phenology-guided Bayesian-CNN (PB-CNN) framework for yield estimation and uncertainty analysis. NASA/USAID SERVIR-Global Geo-AI Working Group, Online, June 26, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Yang, Z. and C. Diao. A weakly supervised deep learning framework for within-season field-level crop phenology characterization. Taylor Geospatial Institute Town Hall Event, St. Louis, MO. May 22, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Diao, C., Yang, Z., and Y. Liu. Monitoring diverse crop phenological stages using satellite time series. Annual Meeting of the Association of American Geographers (AAG), Honolulu, HI. April 16-April 20, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Yang, Z. and C. Diao. A novel deep learning framework for within-season field-level crop phenology characterization. Annual Meeting of the Association of American Geographers (AAG), Honolulu, HI. April 16-April 20, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Zhou, Z., Liu, Y., and C. Diao. CropSight: towards a national-scale operational framework for object-based crop type ground truth retrieval using street view and PlanetScope satellite imagery. Annual Meeting of the Association of American Geographers (AAG), Honolulu, HI. April 16-April 20, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Chen, J., Diao, C., Liu, Y., Yang, Z., and Z. Zhou. CropSync: towards a large-scale operational framework for within-season crop type mapping using Google Street View and Harmonized Landsat and Sentinel-2 imagery. Annual Meeting of the Association of American Geographers (AAG), Honolulu, HI. April 16-April 20, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Lyu, F., Yang, Z., Diao, C., and S. Wang. Multi-stream STGAN: a spatiotemporal image fusion model with improved spatial transferability. Annual Meeting of the Association of American Geographers (AAG), Honolulu, HI. April 16-April 20, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2024 Citation: Zhang, C., Li, X., Mieno, T., Diao, C., and D. Bullock. Quadratic-Plateau Geographically Weighted Regression model for estimating site-specific economically optimal input rates. International Conference for On-farm Precision Experimentation (ICOFPE), South Padre Island, TX. January 8-January 11, 2024.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Zhang, C. and C. Diao. Enhancing crop yield prediction across regions with phenology-based domain adaptation: a meta-learned conditional adversarial approach. American Geophysical Union (AGU) Fall Meeting, San Francisco, CA, December 11-December 15, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Yang, Z. and C. Diao. A novel deep learning framework for within-season field-level crop phenology characterization. American Geophysical Union (AGU) Fall Meeting, San Francisco, CA, December 11-December 15, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Liu, Y., Diao, C., Mei, W. and C. Zhang. CropSight: an operational framework for crop type information retrieval using street view and PlanetScope satellite images. American Geophysical Union (AGU) Fall Meeting, San Francisco, CA, December 11-December 15, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Guo, T., Diao, C., Zhang, C., Liu, Y. and Z. Yang. Towards scalable field-level crop yield estimation through integration of crop model and deep learning. American Geophysical Union (AGU) Fall Meeting, San Francisco, CA, December 11-December 15, 2023.


Progress 11/15/22 to 11/14/23

Outputs
Target Audience: The target audiences for our research include scientists and researchers who study crop production, agricultural remote sensing, and artificial intelligence applications in agriculture, as well as agricultural stakeholders who benefit from within-season crop yield estimation for improving farm management practices and decision making. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? The PI/Co-PI have recruited two graduate students as well as summer graduate interns for this project. The students were trained by the PI/Co-PI to collect various types of data, analyze remote sensing imagery of various spatial and temporal resolutions, develop innovative deep learning, crop phenological, and yield estimation models, as well as to develop the CyberGIS-Compute middleware system. The PI/Co-PI also supervised the students in conducting various experiments (e.g., CropSow-based planting date retrieval and CyberGIS-Compute middleware design) relevant to the project to help them gain an enhanced understanding of agricultural research and relevant domains. The materials, data, and models developed in this project will dramatically help the students with their dissertation research and future professional development. Over the past year, the students have actively participated in five manuscripts of the project, with three new manuscripts published and two manuscripts currently under review. The students have also presented the project work at academic conferences, such as the Annual Meeting of the Association of American Geographers and the American Geophysical Union (AGU) Fall Meeting. One graduate student won third place in the Student Honors Paper Competition, organized by the AAG remote sensing specialty group (2023).
How have the results been disseminated to communities of interest? The results from this research have been disseminated through peer-reviewed publications and conference presentations (such as AAG, AGU, and the NASA Carbon Cycle & Ecosystems Joint Science Workshop). We have also communicated the research results to a broader community during Illinois GIS Day, as well as to farmers via the Data-Intensive Farm Management meeting. We have organized a session, "Advances in Agricultural Remote Sensing and Artificial Intelligence", at AAG to introduce the developed models and share the research findings with colleagues in academic institutions, government agencies, and industrial sectors. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period, we will evaluate the computational intensity and performance of the devised phenology matching model and CropSow in conducting large-scale crop phenological retrieval using the fusion imagery. We will continue devising the deep learning-based crop yield estimation model, particularly in quantifying the yield estimation uncertainty with Bayesian inference. We will further integrate the devised deep learning model with the crop simulation model to downscale the yield estimation from the county level to the field level. We will calibrate the crop model with remote sensing and environmental/management inputs, and employ the calibrated crop model to generate massive simulated crop growth and yield profiles to train the deep learning model for more scalable field-level crop yield estimation.

Impacts
What was accomplished under these goals? The overarching goal of this project is to develop a scalable, real-time, field-level crop yield forecasting framework via cutting-edge deep learning approaches. To achieve this goal, we have developed an innovative hybrid deep learning-based spatio-temporal image fusion model to fuse the imagery from various earth observation satellites, and devised the CyberGIS-Compute middleware system to achieve scalable and reproducible remote sensing data fusion by harnessing advanced cyberinfrastructure. We have also developed a hybrid phenology matching model to characterize the crop phenological growth cycle, which represents a key step for the subsequent crop yield forecasting. We have further advanced field-level planting date retrieval by devising a novel CropSow framework. With the innovatively retrieved crop phenology information, we are currently developing a novel deep learning-based yield estimation model. This crop yield estimation model can largely advance the modeling of crop yield responses to crop growth and environmental conditions across critical crop phenological growth stages, as well as the quantification of yield estimation uncertainty. Objective 1: Develop a hybrid deep learning-based image fusion model to generate dense satellite imagery at the farm field level. We have developed an innovative hybrid deep learning model that can effectively and robustly fuse satellite imagery of various spatial and temporal resolutions, and designed the CyberGIS-Compute middleware system (https://github.com/cybergis/cybergis-compute-core) to achieve scalable and reproducible remote sensing data fusion across multiple spatiotemporal resolutions by harnessing advanced cyberinfrastructure.
To further enhance our capability in conducting spatiotemporal image fusion at large scales, we are currently developing a multi-stream spatiotemporal fusion GAN (STGAN) model that can accommodate drastic temporal differences among satellite images. STGAN is uniquely designed with the integration of a spatial transformer, a channel attention module, and a U-net structure to accommodate changes in both spectral reflectance and spatial structures across a satellite image series. It outperforms several advanced benchmark spatiotemporal image fusion models, particularly in predicting fusion images when temporally close satellite images are unavailable. We are currently wrapping up this study and will leverage the hybrid deep learning fusion model, STGAN, and the CyberGIS-Compute middleware system for large-scale satellite imagery fusion. Objective 2: Devise a network-based phenological model to estimate crop phenology in a timely fashion. We have developed a novel hybrid phenology matching model to robustly retrieve a diverse spectrum of crop phenological stages using satellite time series. The hybrid phenology matching model can retrieve most crop growth stages with high accuracy, yet its retrieval performance for crop planting dates still needs improvement due to uncertainty caused by human activities. To further advance field-level planting date estimation, we have developed a novel CropSow framework. CropSow integrates a remote sensing phenological detection method with a crop growth model to retrieve crop planting dates at the field level. The remote sensing phenological detection method retrieves critical crop phenological metrics of farm fields from remote sensing time series, which are then integrated into the crop growth model for field-level planting date estimation in consideration of the soil-crop-atmosphere continuum.
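The thermal-time reasoning that links a remotely sensed emergence date back to a planting date can be sketched with a simplified growing degree day (GDD) scan (all temperatures and thresholds below are hypothetical; CropSow itself couples the phenological metrics to a full crop growth model rather than a fixed GDD requirement):

```python
import numpy as np

def backcast_planting_doy(tmax, tmin, emergence_doy, gdd_req, t_base=10.0):
    """Scan candidate planting days backward from emergence and return
    the latest day of year whose accumulated GDD up to the observed
    emergence day meets a (hypothetical) thermal requirement."""
    daily_gdd = np.maximum((tmax + tmin) / 2.0 - t_base, 0.0)
    for planting in range(emergence_doy - 1, -1, -1):
        if daily_gdd[planting:emergence_doy].sum() >= gdd_req:
            return planting + 1  # convert 0-based index to day of year
    return None

# Synthetic season: constant 20/10 C days -> 5 GDD accumulated per day.
tmax = np.full(200, 20.0)
tmin = np.full(200, 10.0)
# A 60 GDD emergence requirement needs 12 accumulation days.
planting_doy = backcast_planting_doy(tmax, tmin, emergence_doy=120,
                                     gdd_req=60.0)
print(planting_doy)  # 109
```

This rigid back-calculation is essentially the accumulative GDD benchmark; the value of coupling to a crop growth model is that the thermal requirement and stress responses vary with soil, weather, and management rather than being a fixed constant.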
CropSow leverages the rich physiological knowledge embedded in the crop growth model to scalably interpret satellite observations under a variety of environmental and management conditions for field-level planting date retrieval. With corn in Illinois, US as a case study, the developed CropSow outperforms three advanced benchmark models (i.e., the remote sensing accumulative growing degree day method, the weather-dependent method, and the shape model) in field-level crop planting date estimation, with R-squared higher than 0.68, root mean square error (RMSE) lower than 10 days, and mean bias error (MBE) around 5 days from 2016 to 2020. It achieves better generalization performance than the benchmark models, as well as stronger adaptability to abnormal weather conditions, with more robust performance in estimating the planting dates of farm fields. We are currently employing both the hybrid phenology matching model and CropSow to conduct a large-scale phenological characterization of the US Corn Belt. Objective 3: Develop an innovative deep learning-enabled integrated model to prototype scalable, real-time, and field-level crop yield forecasting systems. With characteristic crop phenology, we have developed a within-season emergence (WISE)-phenology normalized deep learning model towards scalable within-season crop mapping, a key step for mapping crop distributions in real time for subsequent crop yield forecasting. The crop time-series remote sensing data are first normalized by the WISE crop emergence dates before being fed into an attention-based one-dimensional convolutional neural network (At1DCNN) classifier. Compared to conventional calendar-based approaches, the WISE-phenology normalization approach largely helps the deep learning crop mapping model accommodate the spatiotemporal variations in crop phenological dynamics.
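The WISE-phenology normalization step can be pictured as re-indexing each field's time series from calendar day of year to days since emergence, so that fields planted on different dates become phenologically comparable before classification (a simplified sketch; names and values are illustrative, not the actual At1DCNN pipeline):

```python
import numpy as np

def wise_normalize(series, doys, emergence_doy, length, fill=np.nan):
    """Re-index a satellite time series from calendar day of year to
    days since emergence, so fields with different planting dates
    present aligned growth curves to the classifier."""
    out = np.full(length, fill)
    for value, doy in zip(series, doys):
        t = int(doy) - emergence_doy      # days since emergence
        if 0 <= t < length:
            out[t] = value
    return out

# Two fields with identical growth curves but emergence 10 days apart.
doys_a, doys_b = np.arange(120, 160), np.arange(130, 170)
ndvi = np.linspace(0.2, 0.8, 40)          # shared green-up trajectory
a = wise_normalize(ndvi, doys_a, emergence_doy=120, length=40)
b = wise_normalize(ndvi, doys_b, emergence_doy=130, length=40)
print(np.allclose(a, b))  # True: the aligned series are identical
```

On the calendar axis the two fields look shifted and a calendar-based classifier must learn that shift; on the days-since-emergence axis they coincide, which is the source of the transferability gains reported below.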
Results in Illinois from 2017 to 2020 indicate that the devised WISE-based deep learning model outperforms benchmark calendar-based approaches and yields over 90% overall accuracy for classifying corn and soybeans at the end of the season. During the growing season, the devised model achieves satisfactory performance (85% overall accuracy) one to four weeks earlier than benchmark calendar-based approaches. With WISE-phenology normalization, the devised model exhibits more stable performance across Illinois and can be transferred to different years with enhanced scalability and robustness. With timely crop type mapping in place, we are currently developing a novel deep learning-based model for crop yield estimation and uncertainty quantification. This deep learning yield estimation model is mainly composed of three components: phenology imagery construction, multi-stream Bayesian-CNN modeling, and feature (i.e., yield predictor and phenological stage) and uncertainty analysis. With the innovative integration of critical crop phenological stages in modeling the crop yield response to a heterogeneous set of yield predictors (i.e., satellite-based, heat-related, water-related, and soil predictors), the devised model outperforms three advanced benchmark models (i.e., support vector regression, random forest, and long short-term memory), achieving an average RMSE of 4.622 bu/ac, an average R-squared of 0.709, and an average bias of -2.057 bu/ac in estimating county-level soybean yields of the US Corn Belt for 2014-2018. Among the yield predictor groups, the satellite-based predictor group is the most critical for crop yield estimation, followed by the water- and heat-related predictor groups. Throughout the growing season, the early reproductive phenological stage plays a particularly crucial role in modeling the yield.
The soil predictor group, as well as the early growing stages, can improve the model's estimation accuracy yet potentially introduces more uncertainty into the yield estimation. The devised model largely enhances our understanding of the complex crop yield response to varying environmental conditions across crop phenological stages for more sustainable agricultural development. We are currently evaluating the yield estimation uncertainty using Bayesian inference, as well as downscaling the yield estimation to the field level.
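Bayesian uncertainty evaluation of this kind commonly uses the standard Monte Carlo variance decomposition; a minimal numpy sketch with simulated predictive samples (not the actual model outputs) shows how total predictive variance splits into aleatoric and epistemic components:

```python
import numpy as np

def decompose_uncertainty(pred_means, pred_vars):
    """Given T Monte Carlo samples of a model's predictive mean and
    variance (e.g., from MC dropout or sampled Bayesian weights),
    split total predictive variance into aleatoric (data noise) and
    epistemic (model uncertainty) components."""
    aleatoric = pred_vars.mean(axis=0)    # average predicted noise variance
    epistemic = pred_means.var(axis=0)    # spread of the sampled means
    return aleatoric, epistemic, aleatoric + epistemic

rng = np.random.default_rng(0)
T = 1000
# Simulated MC samples: means scatter (var ~1) around 50 bu/ac,
# and each sample predicts a noise variance of 4 (bu/ac)^2.
pred_means = 50.0 + rng.normal(0.0, 1.0, size=(T, 1))
pred_vars = np.full((T, 1), 4.0)
alea, epis, total = decompose_uncertainty(pred_means, pred_vars)
print(float(alea[0]), float(epis[0]))  # aleatoric = 4, epistemic near 1
```

The decomposition is what lets an analysis attribute added uncertainty to specific predictor groups: a predictor that raises epistemic variance is one the model is unsure how to use, while aleatoric variance reflects irreducible noise in the data.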

Publications

  • Type: Journal Articles Status: Published Year Published: 2023 Citation: Zhang, C. and C. Diao. (2023). A Phenology-guided Bayesian-CNN (PB-CNN) framework for soybean yield estimation and uncertainty analysis. ISPRS Journal of Photogrammetry and Remote Sensing, 205, 50-73.
  • Type: Journal Articles Status: Published Year Published: 2023 Citation: Liu, Y., Diao, C. and Z. Yang. (2023). CropSow: an integrative remotely sensed crop modeling framework for field-level crop planting date estimation. ISPRS Journal of Photogrammetry and Remote Sensing, 202, 334-355.
  • Type: Journal Articles Status: Published Year Published: 2023 Citation: Yang, Z., Diao, C. and F. Gao. (2023). Towards scalable within-season crop mapping with phenology normalization and deep learning. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 16, 1390-1402.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Diao, C., Yang, Z., Gao, F., Zhang, X., Yang, Z., and G. Li. Remotely sensed hybrid phenology matching model to estimate crop growing stages. NASA Carbon Cycle & Ecosystems Joint Science Workshop, College Park, MD. May 8-May 12, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Yang, Z., Diao, C., and F. Gao. A novel phenology guided deep learning model for within-season field-level crop mapping. NASA Carbon Cycle & Ecosystems Joint Science Workshop, College Park, MD. May 8-May 12, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Diao, C. Development of large-scale crop phenological characterization framework with satellite time series. Annual Meeting of the Association of American Geographers (AAG), Denver, CO. March 23-March 27, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Guo, T. and C. Diao. Towards scalable field-level crop yield estimation through integration of crop model and deep learning. Annual Meeting of the Association of American Geographers (AAG), Denver, CO. March 23-March 27, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2023 Citation: Yang, Z. and C. Diao. Within-season crop mapping at the field level using a phenology-guided deep learning model. Annual Meeting of the Association of American Geographers (AAG), Denver, CO. March 23-March 27, 2023.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Zhang, C. and C. Diao. A novel phenology-guided Bayesian-CNN framework for crop yield estimation. American Geophysical Union (AGU) Fall Meeting, Chicago, IL, December 12-December 16, 2022.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Liu, Y., Diao, C., Yang, Z., and E. Nafziger. CropSow: an integrative remotely sensed crop modeling framework for field-level crop planting date estimation. American Geophysical Union (AGU) Fall Meeting, Chicago, IL, December 12-December 16, 2022.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Yang, Z., Diao, C., and F. Gao. A novel phenology guided deep learning model for within-season field-level crop mapping. American Geophysical Union (AGU) Fall Meeting, Chicago, IL, December 12-December 16, 2022.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Diao, C. and G. Li. Monitoring crop phenology with near-surface and high-resolution satellite time series. American Geophysical Union (AGU) Fall Meeting, Chicago, IL, December 12-December 16, 2022.


Progress 11/15/21 to 11/14/22

Outputs
Target Audience: The target audiences for our research include the scientists and researchers who study crop yield forecasting, agricultural remote sensing, and artificial intelligence applications in agriculture, as well as the farmers and agricultural industries who benefit from within-season crop yield estimation for improving farm management practices and decision making. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? The PI/Co-PI have recruited two graduate students as well as summer graduate interns for this project. The students were trained by the PI/Co-PI to collect various types of data, analyze remote sensing imagery of various spatial and temporal resolutions, develop innovative deep learning and crop phenological models, and develop the CyberGIS-Compute middleware system. The PI/Co-PI also supervised the students in conducting various experiments (e.g., PhenoCam imagery analysis and CyberGIS-Compute middleware design) relevant to the project to help them gain an enhanced understanding of agricultural research and relevant domains. The materials, data, and models developed in this project will dramatically help the students with their dissertation research and future professional development. Over the past year, the students have actively participated in three manuscripts of the project, with two new manuscripts published and one manuscript currently under revision. The students have also presented the project work at academic conferences, such as the Annual Meeting of the Association of American Geographers. One graduate student won second place in the Student Honors Paper Competition organized by the AAG Remote Sensing Specialty Group (2022). How have the results been disseminated to communities of interest? The results from this research have been disseminated through peer-reviewed publications and conference presentations (such as AAG, AGU, and PEARC).
We have also communicated the research results to a broader community during Illinois GIS Day, as well as to stakeholders via the Data-Intensive Farm Management meeting. We organized a session, "Advances in Agricultural Remote Sensing and Artificial Intelligence," at AAG to introduce the developed models and share the research findings with colleagues in academic institutions, government agencies, and industrial sectors. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period, we will evaluate the computational intensity of the hybrid deep learning model in conducting large-scale spatio-temporal image fusion using the devised CyberGIS-Compute middleware system, and write another manuscript for the fusion work. As timely crop type mapping is essential for in-season crop yield forecasting, we will develop a within-season crop mapping model by integrating characteristic phenology and deep learning. We will continue devising the Phenology-guided Bayesian Convolutional Neural Network system for crop yield forecasting, particularly for characterizing the critical yield predictors and phenological stages, as well as the predictive yield uncertainties (aleatoric and epistemic uncertainties).

Impacts
What was accomplished under these goals? The overarching goal of this project is to develop a scalable, real-time, field-level crop yield forecasting framework via cutting-edge deep learning approaches. Real-time forecasting of crop yields at the field level has become increasingly crucial to investigate spatially and temporally varying factors associated with yield gaps, and to empower farmers to make timely adaptive agronomic decisions throughout the growing season for more efficient farm management. It also provides early warnings for supply chain planning of agricultural industries and trade markets, and helps with the assessment of crop insurance and land rental tailored to individual farm fields. To achieve this goal, we have developed an innovative hybrid deep learning-based spatio-temporal image fusion model to fuse the imagery from various earth observation satellites, and devised the CyberGIS-Compute middleware system to achieve scalable and reproducible remote sensing data fusion by harnessing advanced cyberinfrastructure. We have also developed a hybrid phenology matching model to characterize the crop phenological growth cycle, which represents a key step for the subsequent crop yield forecasting. We have rigorously validated the developed crop phenological model using near-surface and high-resolution satellite time series. We are currently expanding the crop phenological monitoring to the US Corn Belt by the integration of image fusion model and crop phenological model. With innovatively retrieved crop phenology information, we are currently developing a phenology-guided deep learning yield modeling system. The yield modeling system can largely advance the modeling of crop yield responses to crop growth and environmental conditions across critical crop phenological growth stages, as well as the quantification of yield estimation uncertainties. 
Such crop yield forecasting and uncertainty information is critical for the price forecasts of agricultural commodities, aiding crop insurance design, trade decision making, and early warning of food insecurity. Objective 1: Develop a hybrid deep learning-based image fusion model to generate dense satellite imagery at the farm field level. Due to the sensor tradeoff, most remote sensing systems cannot provide images with both high spatial and temporal resolutions. We have developed an innovative hybrid deep learning model that can effectively and robustly fuse satellite imagery of various spatial and temporal resolutions. As advanced data fusion models often involve sophisticated deep learning models, the scalability of these models can be limited due to a lack of access to advanced cyberinfrastructure. We are currently designing the CyberGIS-Compute middleware system (https://github.com/cybergis/cybergis-compute-core) to achieve scalable and reproducible remote sensing data fusion across multiple spatiotemporal resolutions by harnessing advanced cyberinfrastructure. CyberGIS-Compute is a key component of the CyberGISX platform that provides streamlined and user-friendly access to advanced cyberinfrastructure and cyberGIS capabilities with an integrated software stack for computationally reproducible and data-intensive geospatial analytics. Compared with conventional methods of running a remote sensing data fusion model using local or cloud computing resources, our system not only enables computationally scalable fusion of satellite remote sensing data with both high spatial and temporal resolutions, but also supports users with little programming background in executing sophisticated and computationally intensive models in a reproducible way. Depending on the computational complexity and intensity, we might further integrate a conditional generative adversarial network model for more robust image fusion.
Objective 2: Devise a network-based phenological model to estimate crop phenology in a timely fashion. We have developed a novel hybrid phenology matching model to robustly retrieve a diverse spectrum of crop phenological stages using satellite time series. The devised hybrid model leverages the complementary strengths of phenometric extraction methods and phenology matching models. Despite the advances in satellite-based crop phenological retrievals, interpreting those retrieved characteristics in the context of on-the-ground crop phenological events remains a long-standing hurdle. Over recent years, the emergence of near-surface phenology cameras (e.g., PhenoCams), along with satellite imagery of both high spatial and temporal resolutions (e.g., PlanetScope imagery), has largely facilitated direct comparisons of retrieved characteristics to visually observed crop stages for phenological interpretation and validation. We are systematically assessing near-surface PhenoCam and high-resolution PlanetScope time series in reconciling sensor- and ground-based crop phenological characterizations. We have retrieved diverse phenological characteristics from both PhenoCam and PlanetScope imagery for a range of agricultural sites across the United States. The curvature-based Greenup and Gu-based Upturn estimates showed good congruence with the visually observed crop emergence stage (RMSE of about 1 week, bias of about 0-9 days, and R-squared of about 0.65-0.75). The threshold- and derivative-based End of greenness falling Season (i.e., EOS) estimates reconciled well with visual crop maturity observations (RMSE of about 5-10 days, bias of about 0-8 days, and R-squared of about 0.6-0.75).
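The derivative- and threshold-based retrieval logic evaluated here can be illustrated on a synthetic double-logistic vegetation index curve (a toy example with hypothetical parameters, not the PhenoCam/PlanetScope processing chain):

```python
import numpy as np

def double_logistic(t, vmin=0.2, vmax=0.8, sos=140, eos=260, rate=0.1):
    """Synthetic vegetation index curve: green-up centered on day `sos`,
    senescence centered on day `eos` (all parameters hypothetical)."""
    up = 1.0 / (1.0 + np.exp(-rate * (t - sos)))
    down = 1.0 / (1.0 + np.exp(-rate * (t - eos)))
    return vmin + (vmax - vmin) * (up - down)

t = np.arange(1, 366, dtype=float)
vi = double_logistic(t)

# Derivative-based start of season: day of fastest green-up
# (searched within the first half of the year).
greenup = t[np.argmax(np.gradient(vi, t)[:180])]
# Threshold-based end of season: last day the VI stays at or above
# 50% of the seasonal amplitude.
amp50 = vi.min() + 0.5 * (vi.max() - vi.min())
eos = t[np.where(vi >= amp50)[0][-1]]
print(greenup, eos)  # lands near the nominal transitions at 140 and 260
```

Real retrievals differ mainly in the curve fitting (noisy, gap-filled observations rather than an analytic curve) and in which inflection or threshold definition is used, which is exactly why the Greenup/Upturn and EOS variants above must be validated against visual observations.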
The concordance among PlanetScope, PhenoCam, and visual phenology demonstrated the potential to interpret sensor-derived phenological characteristics in the context of physiologically well-characterized crop phenological events, paving the way to characterize crop physiological growing stages using remotely sensed phenological models. Objective 3: Develop an innovative deep learning-enabled integrated model to prototype scalable, real-time, and field-level crop yield forecasting systems. With our developed hybrid phenology matching model, we are able to characterize the major crop phenological stages. This phenology information is a key step for building scalable crop yield forecasting systems. We are thus currently developing a Phenology-guided Bayesian-Convolutional Neural Network (PB-CNN) system for crop yield estimation and uncertainty quantification. The PB-CNN system is mainly composed of three components: phenology imagery construction, multi-stream Bayesian-CNN modeling, and feature (i.e., yield predictor and phenological stage) and uncertainty (i.e., aleatoric and epistemic uncertainty) analysis. We have collected satellite data, climate data (temperature, precipitation, and vapor pressure deficit), soil data (clay content mass fraction, sand content mass fraction, water content at 33 kPa, pH in H2O, bulk density, and carbon content), and the Cropland Data Layer for the US Corn Belt for 2008-2018. With the innovative accommodation of diverse phenological stages and the heterogeneous nature of yield predictors, the developed PB-CNN system achieved an RMSE of 4.350 bu/ac, an R-squared of 0.743, and a bias of 0.928 bu/ac in predicting soybean yields of the study site in 2018, outperforming three benchmark machine learning models (i.e., support vector regression, random forest, and long short-term memory). We are currently evaluating the yield estimation uncertainties, including both the aleatoric and epistemic uncertainties, using Bayesian inference.

Publications

  • Type: Journal Articles Status: Published Year Published: 2022 Citation: Diao, C., and G. Li. 2022. Near-surface and high-resolution satellite time series for detecting crop phenology. Remote Sensing, 14(9), 1957.
  • Type: Journal Articles Status: Published Year Published: 2022 Citation: Lyu, F., Yang, Z., Xiao, Z., Diao, C., Park, J., and S. Wang. 2022. CyberGIS for scalable remote sensing data fusion. In Practice and Experience in Advanced Research Computing (pp. 1-4).
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Diao, C. 2022. Towards remote sensing modeling framework for crop phenological characterization. Annual Meeting of the Association of American Geographers (AAG), New York City, NY. February 25-March 1, 2022.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Zhang, C. and C. Diao. 2022. County-level soybean yield estimation based on Bayesian-CNN incorporating phenology dynamic. Annual Meeting of the Association of American Geographers (AAG), New York City, NY. February 25-March 1, 2022.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Yang, Z. and C. Diao. 2022. A phenology-guided deep learning model for early crop mapping at the field level. Annual Meeting of the Association of American Geographers (AAG), New York City, NY. February 25-March 1, 2022.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2022 Citation: Zhang, C. and C. Diao. 2022. A new Probabilistic Bayesian-CNN (PB-CNN) method incorporating phenology dynamic for crop yield estimation. Data-Intensive Farm Management Annual Meeting, Corpus Christi, TX. January 6-January 8, 2022.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2021 Citation: Diao, C., Yang, Z., Gao, F., Zhang, X. and Z. Yang. 2021. A novel hybrid phenology matching model for robust crop growth stage characterization. American Geophysical Union (AGU) Fall Meeting, New Orleans, LA, December 13-December 17, 2021.


Progress 11/15/20 to 11/14/21

Outputs
Target Audience: The target audiences for our research include the scientific community who study crop responses to climate change, crop yield forecasting, agricultural remote sensing, and artificial intelligence applications in agriculture, as well as the farmers and agricultural industries who benefit from within-season crop yield estimation for improving farm management practices and decision making. Changes/Problems: Nothing Reported What opportunities for training and professional development has the project provided? The PIs have recruited two graduate students and one summer graduate intern for this project. The students were trained by the PIs to collect various types of data, analyze remote sensing imagery of various spatial and temporal resolutions, and develop innovative deep learning and crop phenological models. The PIs also supervised the students in conducting various experiments (e.g., image fusion design and crop growth characterization) relevant to the project to help them gain an enhanced understanding of agricultural research and relevant domains. The materials, data, and models developed in this project will dramatically help the students with their dissertation research and future professional development. The students have actively participated in two manuscripts of the project, one as first author (currently under review); the other manuscript is published in a top remote sensing journal (ISPRS Journal of Photogrammetry and Remote Sensing). The students have also presented the project work at academic conferences, such as the Annual Meeting of the Association of American Geographers. One graduate student won the Poster Competition Award (2021) at the School of Earth, Society, & Environment (SESE) Research Review, University of Illinois.
How have the results been disseminated to communities of interest? The results from this research have been disseminated through peer-reviewed publications and conference presentations (such as AAG and AGU). We have also communicated the research results to a broader community during Illinois GIS Day. We organized a session, "Time Series Remote Sensing in Characterizing Land Surface Dynamics," at AAG to introduce the developed models and share the research findings with colleagues in academic institutions, government agencies, and industrial sectors. What do you plan to do during the next reporting period to accomplish the goals? In the next reporting period, we will evaluate the hybrid deep learning model in predicting rapid and/or transient phenological changes, generate high-quality spatio-temporal satellite fusion data for the study region, and write the manuscript for the fusion work. We will extend the hybrid phenology matching model to operate in real time and test the model's performance with the collected data. We will continue developing the phenology-guided Bayesian convolutional neural network model for crop yield forecasting and investigate the role of phenological metrics in the model's scalability.

Impacts
What was accomplished under these goals? The continuing growth of the world population has drastically increased the global demand for agricultural products, posing severe threats to food security. Enhancing agricultural productivity through more intelligent farm management is imperative to meet the rising food demand, which in turn requires a more comprehensive understanding of crop yield variations across a range of spatial and temporal scales. Accurate crop yield estimation has notable implications for future trajectories of food prices, food security, and agricultural development. In particular, real-time forecasting of crop yields at the field level has become increasingly crucial for investigating spatially and temporally varying factors associated with yield gaps, and for empowering farmers to make timely adaptive agronomic decisions throughout the growing season for more efficient farm management. It also provides early warnings for supply chain planning of agricultural industries and trade markets, and helps with the assessment of crop insurance and land rental tailored to individual farm fields. The overarching goal of this project is to develop a scalable, real-time, field-level crop yield forecasting framework via cutting-edge deep learning approaches. To achieve this goal, we are developing a hybrid deep learning-based spatio-temporal image fusion model to fuse imagery from various earth observation satellites, so as to create synthesized imagery of both high spatial and temporal resolutions for field-level crop monitoring. We have also developed a hybrid phenology matching model to characterize the crop phenological growth cycle, which represents a key step for the subsequent crop yield forecasting, for which we are currently developing an innovative phenology-guided deep learning yield modeling system.
The development of this crop yield forecasting framework will help transform conventional region-specific, intermittent, district-level systems into more scalable, real-time, field-level crop yield forecasting systems. The timely yield forecasting can provide new insights to farmers and agricultural industries by facilitating the establishment of early warning systems for food security. Objective 1: Develop a hybrid deep learning-based image fusion model to generate dense satellite imagery at the farm field level. Due to the sensor tradeoff, most remote sensing systems cannot provide images with both high spatial and temporal resolutions. We are developing an innovative hybrid deep learning model that can effectively and robustly fuse satellite imagery of various spatial and temporal resolutions. The proposed model integrates two types of network models: a super-resolution convolutional neural network (SRCNN) and long short-term memory (LSTM). SRCNN can enhance the coarse images by restoring degraded spatial details, while LSTM can learn and extract the temporal changing patterns from the time-series images. We have collected all the high-quality Landsat and MODIS imagery for Champaign County, Illinois over the course of 2017, and generated simulated MODIS-Landsat image pairs for 2017 using the daily MODIS images and the Cropland Data Layer. The results indicated that the hybrid deep learning model can generate high-quality fusion data, with mean RMSE values lower than 0.005 for visible bands and lower than 0.01 for infrared (IR) bands on the simulation data. For the satellite data, errors are also low (mean RMSE < 0.025 for visible bands and < 0.07 for IR bands). By integrating the two types of network models, the hybrid deep learning model can thus yield fused images with accurate temporal information and detailed spatial information. Objective 2: Devise a network-based phenological model to estimate crop phenology in a timely fashion.
We have developed a novel hybrid phenology matching model to robustly retrieve a diverse spectrum of crop phenological stages using satellite time series. The devised hybrid model leverages the complementary strengths of phenometric extraction methods and phenology matching models. It relaxes the geometrical scaling assumption of conventional phenology matching models and can characterize key phenological stages of crop cycles, ranging from farming practice-relevant stages (e.g., planted and harvested) to crop development stages (e.g., emerged and mature). To systematically evaluate the influence of phenological references on phenology matching, we have further designed four representative phenological reference scenarios under varying levels of phenological calibration with crop progress reports (CPRs). The results indicate that the hybrid phenology matching model can achieve high accuracy in estimating corn phenological growth stages in Illinois over the last twenty years, particularly with the year- and region-adjusted phenological reference (R-squared higher than 0.9 and RMSE less than five days for most phenological stages). The inter-annual and regional phenological patterns characterized by the hybrid model correspond well with those in the CPRs from the USDA NASS. Compared to the benchmark phenology matching model, the hybrid model is more robust to decreasing levels of phenological reference calibration, and is particularly advantageous in retrieving early crop phenological stages (e.g., planted and emerged stages) when phenological reference information is limited. Objective 3: Develop an innovative deep learning-enabled integrated model to prototype scalable, real-time, and field-level crop yield forecasting systems. With our developed hybrid phenology matching model, we are able to characterize the major crop phenological stages. This phenology information is a key step for building scalable crop yield forecasting systems.
We are thus devising a phenology-guided deep learning modeling system for crop yield forecasting. We have collected satellite data, climate data (temperature, precipitation, and vapor pressure deficit), soil data (clay content mass fraction, sand content mass fraction, water content at 33 kPa, pH in H2O, bulk density, and carbon content), and the Cropland Data Layer for the US Corn Belt for 2008-2018. With these data, we are currently developing a phenology-guided Bayesian convolutional neural network model for estimating crop yield and quantifying the uncertainties.
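The curve-matching idea underlying the hybrid phenology matching model described under Objective 2 can be sketched as finding the temporal shift that best aligns an observed vegetation index series with a phenological reference curve, then transferring the reference's stage dates through that shift (a simplified, hypothetical illustration using a rigid shift; the actual hybrid model relaxes such rigid matching assumptions):

```python
import numpy as np

def match_shift(observed, reference, max_shift=60):
    """Return the day shift (in days) that minimizes the RMSE between
    an observed VI series and a phenological reference curve."""
    best, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        err = np.sqrt(np.mean((observed - np.roll(reference, s)) ** 2))
        if err < best_err:
            best, best_err = s, err
    return best

t = np.arange(365, dtype=float)
reference = np.exp(-((t - 200) ** 2) / (2 * 30 ** 2))  # reference VI curve
observed = np.exp(-((t - 215) ** 2) / (2 * 30 ** 2))   # season 15 days late
shift = match_shift(observed, reference)

# Reference stage dates (hypothetical) carry over via the matched offset.
ref_stages = {"emerged": 170, "mature": 250}
print(shift, {k: v + shift for k, v in ref_stages.items()})
```

In practice the reference stage dates come from calibrated sources such as the crop progress reports, and the matching admits shape changes beyond a pure shift, which is what makes the hybrid model robust when reference calibration is limited.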

Publications

  • Type: Journal Articles Status: Published Year Published: 2021 Citation: Diao, C., Yang, Z., Gao, F., Zhang, X. and Z. Yang. 2021. Hybrid phenology matching model for robust crop phenological retrieval. ISPRS Journal of Photogrammetry and Remote Sensing, 181, 308-326.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2021 Citation: Diao, C. and Z. Yang. 2021. Retrieval of crop growing progress with remote sensing and phenology-matching models. Annual Meeting of the Association of American Geographers (AAG), Seattle, Washington.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2021 Citation: Li, G. and C. Diao. 2021. Fine-scale crop phenological monitoring with near-surface remote sensing and high-resolution satellite time series. Annual Meeting of the Association of American Geographers (AAG), Seattle, Washington.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2021 Citation: Yang, Z. and C. Diao. 2021. A novel deep learning-based phenology matching model for characterizing crop phenological stages with fused high spatio-temporal resolution imagery. Annual Meeting of the Association of American Geographers (AAG), Seattle, Washington.
  • Type: Conference Papers and Presentations Status: Published Year Published: 2020 Citation: Diao, C. and Z. Yang. 2020. An innovative phenology-matching model to estimate crop growing stages. American Geophysical Union (AGU) Fall Meeting, San Francisco, California.