EMET: An emergence-based thermal phenological framework for near real-time crop type mapping

ISPRS Journal of Photogrammetry and Remote Sensing · IF 10.6 · Q1 (GEOGRAPHY, PHYSICAL) · SCI Zone 1 (Earth Science)
DOI: 10.1016/j.isprsjprs.2024.07.007
Published: 2024-07-16 · Journal Article · Citations: 0
URL: https://www.sciencedirect.com/science/article/pii/S0924271624002740

Abstract

Near real-time (NRT) crop type mapping plays a crucial role in modeling crop development, managing food supply chains, and supporting sustainable agriculture. Low-latency updates on crop type distribution also help assess the impacts of weather extremes and climate change on agricultural production in a timely fashion, aiding early identification of food-insecurity risks as well as rapid damage assessment. Yet NRT crop type mapping is challenging because timely crop type reference labels are difficult to acquire during the current season for building crop mapping models. Meanwhile, crop mapping models built on historical crop type labels and the corresponding satellite imagery may not transfer to the current season in NRT owing to the spatiotemporal variability of crop phenology. The difficulty of characterizing crop phenology in NRT remains a significant hurdle for NRT crop type mapping. To tackle these issues, this study proposes a novel emergence-based thermal phenological framework (EMET) for field-level NRT crop type mapping. The EMET framework comprises three key components: hybrid deep learning spatiotemporal image fusion, NRT thermal-based crop phenology normalization, and NRT crop type characterization. The hybrid fusion model integrates a super-resolution convolutional neural network (SRCNN) and long short-term memory (LSTM) to generate daily satellite observations at high spatial resolution in NRT. The NRT thermal-based crop phenology normalization synthesizes the within-season crop emergence (WISE) model with thermal time accumulation throughout the growing season to normalize, in a timely manner, the crop phenological progress derived from the temporally dense fusion imagery. The NRT-normalized fusion time series are then fed into an advanced deep learning classifier, the self-attention-based LSTM (SAtLSTM) model, to identify crop types. Results in Illinois and Minnesota in the U.S. Corn Belt suggest that the EMET framework significantly enhances model scalability by normalizing crop phenology in NRT for timely crop mapping. The EMET framework yields consistently higher overall accuracy throughout the growing season than the calendar-based and WISE-based benchmark scenarios. When transferred to different study sites and testing years, EMET maintains an advantage of over 5% in overall accuracy during the early to mid season. Moreover, EMET reaches an overall accuracy of 85% a month earlier than the benchmarks and can accurately characterize crop types with an overall accuracy of 90% as early as late July. F1 scores for both corn and soybeans also reach 90% around late July. The EMET framework paves the way for large-scale, satellite-based NRT crop type mapping at the field level, which can help reduce food market volatility to enhance food security and benefit a variety of agricultural applications that optimize crop management toward more sustainable agricultural production.
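The thermal-time normalization at the heart of EMET rests on accumulated growing degree days (GDD) counted from crop emergence. The following is a minimal illustrative sketch of that idea only, not the authors' implementation: it assumes the simple averaging GDD method, a hypothetical base temperature of 10 °C, and plain linear resampling of a vegetation-index (VI) series onto a uniform thermal-time grid, whereas the paper couples thermal time with the WISE emergence model.

```python
def daily_gdd(t_min, t_max, t_base=10.0):
    """Growing degree days for one day (simple averaging method).

    t_base = 10.0 degC is an assumed base temperature for illustration.
    """
    return max((t_min + t_max) / 2.0 - t_base, 0.0)


def accumulated_gdd(t_min_series, t_max_series, t_base=10.0):
    """Cumulative thermal time, day by day, from an assumed emergence date."""
    total, agdd = 0.0, []
    for t_min, t_max in zip(t_min_series, t_max_series):
        total += daily_gdd(t_min, t_max, t_base)
        agdd.append(total)
    return agdd


def normalize_to_thermal_axis(vi_series, agdd, step=100.0):
    """Linearly resample a VI series from calendar days onto a uniform
    accumulated-GDD grid, so that fields with different emergence dates
    or weather trajectories become directly comparable.

    Assumes `agdd` is strictly increasing over the resampled range.
    """
    grid = [step * k for k in range(1, int(agdd[-1] // step) + 1)]
    out, i = [], 0
    for g in grid:
        # advance to the pair of observations bracketing thermal time g
        while agdd[i + 1] < g:
            i += 1
        frac = (g - agdd[i]) / (agdd[i + 1] - agdd[i])
        out.append(vi_series[i] + frac * (vi_series[i + 1] - vi_series[i]))
    return out
```

Resampling onto an accumulated-GDD axis rather than calendar dates is what allows a classifier trained on past seasons to transfer to the current one: a cool spring that delays development simply stretches the calendar axis, while the thermal-time axis stays aligned with phenological stage.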

Source Journal
ISPRS Journal of Photogrammetry and Remote Sensing (Engineering/Technology — Imaging Science & Photographic Technology)
CiteScore: 21.00
Self-citation rate: 6.30%
Articles per year: 273
Review time: 40 days
Journal description: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who work in the disciplines that use photogrammetry, remote sensing, spatial information systems, computer vision, and related fields, facilitating communication and dissemination of advances in these disciplines while serving as a comprehensive source of reference and archive. P&RS publishes high-quality, peer-reviewed research papers that are preferably original and previously unpublished; these may cover scientific/research, technological-development, or application/practical aspects. The journal also welcomes papers based on presentations from ISPRS meetings, provided they are significant contributions to the aforementioned fields. In particular, P&RS encourages submissions that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new scientific or professional directions. Theoretical papers should preferably include practical applications, while papers on systems and applications should include a theoretical background.