High-resolution optical remote sensing image change detection based on dense connection and attention feature fusion network

Daifeng Peng, Chenchen Zhai, Yongjun Zhang, Haiyan Guan
{"title":"基于密集连接和关注特征融合网络的高分辨率光学遥感图像变化检测","authors":"Daifeng Peng, Chenchen Zhai, Yongjun Zhang, Haiyan Guan","doi":"10.1111/phor.12462","DOIUrl":null,"url":null,"abstract":"Abstract The detection of ground object changes from bi‐temporal images is of great significance for urban planning, land‐use/land‐cover monitoring and natural disaster assessment. To solve the limitation of incomplete change detection (CD) entities and inaccurate edges caused by the loss of detailed information, this paper proposes a network based on dense connections and attention feature fusion, namely Siamese NestedUNet with Attention Feature Fusion (SNAFF). First, multi‐level bi‐temporal features are extracted through a Siamese network. The dense connections between the sub‐nodes of the decoder are used to compensate for the missing location information as well as weakening the semantic differences between features. Then, the attention mechanism is introduced to combine global and local information to achieve feature fusion. Finally, a deep supervision strategy is used to suppress the problem of gradient vanishing and slow convergence speed. During the testing phase, the test time augmentation (TTA) strategy is adopted to further improve the CD performance. In order to verify the effectiveness of the proposed method, two datasets with different change types are used. The experimental results indicate that, compared with the comparison methods, the proposed SNAFF achieves the best quantitative results on both datasets, in which F1, IoU and OA in the LEVIR‐CD dataset are 91.47%, 84.28% and 99.13%, respectively, and the values in the CDD dataset are 96.91%, 94.01% and 99.27%, respectively. In addition, the qualitative results show that SNAFF can effectively retain the global and edge information of the detected entity, thus achieving the best visual performance.","PeriodicalId":22881,"journal":{"name":"The Photogrammetric Record","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-09-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"High‐resolution optical remote sensing image change detection based on dense connection and attention feature fusion network\",\"authors\":\"Daifeng Peng, Chenchen Zhai, Yongjun Zhang, Haiyan Guan\",\"doi\":\"10.1111/phor.12462\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract The detection of ground object changes from bi‐temporal images is of great significance for urban planning, land‐use/land‐cover monitoring and natural disaster assessment. To solve the limitation of incomplete change detection (CD) entities and inaccurate edges caused by the loss of detailed information, this paper proposes a network based on dense connections and attention feature fusion, namely Siamese NestedUNet with Attention Feature Fusion (SNAFF). First, multi‐level bi‐temporal features are extracted through a Siamese network. The dense connections between the sub‐nodes of the decoder are used to compensate for the missing location information as well as weakening the semantic differences between features. Then, the attention mechanism is introduced to combine global and local information to achieve feature fusion. Finally, a deep supervision strategy is used to suppress the problem of gradient vanishing and slow convergence speed. During the testing phase, the test time augmentation (TTA) strategy is adopted to further improve the CD performance. 
In order to verify the effectiveness of the proposed method, two datasets with different change types are used. The experimental results indicate that, compared with the comparison methods, the proposed SNAFF achieves the best quantitative results on both datasets, in which F1, IoU and OA in the LEVIR‐CD dataset are 91.47%, 84.28% and 99.13%, respectively, and the values in the CDD dataset are 96.91%, 94.01% and 99.27%, respectively. In addition, the qualitative results show that SNAFF can effectively retain the global and edge information of the detected entity, thus achieving the best visual performance.\",\"PeriodicalId\":22881,\"journal\":{\"name\":\"The Photogrammetric Record\",\"volume\":\"4 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-09-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Photogrammetric Record\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1111/phor.12462\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Photogrammetric Record","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1111/phor.12462","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

The detection of ground-object changes from bi-temporal images is of great significance for urban planning, land-use/land-cover monitoring and natural disaster assessment. To address the problem of incomplete change detection (CD) entities and inaccurate edges caused by the loss of detailed information, this paper proposes a network based on dense connections and attention feature fusion, namely the Siamese NestedUNet with Attention Feature Fusion (SNAFF). First, multi-level bi-temporal features are extracted through a Siamese network. Dense connections between the sub-nodes of the decoder compensate for missing location information and weaken the semantic differences between features. Then, an attention mechanism is introduced to combine global and local information for feature fusion. Finally, a deep supervision strategy is used to alleviate gradient vanishing and slow convergence. During the testing phase, a test-time augmentation (TTA) strategy is adopted to further improve CD performance. To verify the effectiveness of the proposed method, two datasets with different change types are used. The experimental results indicate that, compared with the comparison methods, the proposed SNAFF achieves the best quantitative results on both datasets: F1, IoU and OA on the LEVIR-CD dataset are 91.47%, 84.28% and 99.13%, respectively, and on the CDD dataset they are 96.91%, 94.01% and 99.27%, respectively. In addition, the qualitative results show that SNAFF effectively retains the global and edge information of the detected entities, thus achieving the best visual performance.
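The abstract describes the architecture only at a high level. As an illustration of the general idea of Siamese feature extraction followed by attention-based fusion of bi-temporal features, the sketch below is a minimal PyTorch example; the module names, channel sizes and gating design are assumptions made for illustration and do not reproduce the authors' SNAFF implementation.

```python
# Minimal PyTorch sketch: weight-shared (Siamese) feature extraction for the two
# dates, then an attention-weighted fusion combining global and local cues.
# All design choices here are assumptions for illustration, not the paper's code.
import torch
import torch.nn as nn


class AttentionFusion(nn.Module):
    """Fuse bi-temporal features with a channel gate built from global context
    (pooled statistics) and a per-pixel gate built from local responses."""

    def __init__(self, channels: int):
        super().__init__()
        # Global branch: squeeze-and-excitation style channel weights.
        self.global_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 2 * channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Local branch: per-pixel gate from the concatenated features.
        self.local_gate = nn.Sequential(
            nn.Conv2d(2 * channels, 2 * channels, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )
        self.project = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, feat_t1: torch.Tensor, feat_t2: torch.Tensor) -> torch.Tensor:
        x = torch.cat([feat_t1, feat_t2], dim=1)          # (B, 2C, H, W)
        x = x * self.global_gate(x) * self.local_gate(x)  # apply global + local attention
        return self.project(x)                            # back to (B, C, H, W)


class SiameseEncoderFusion(nn.Module):
    """One shared encoder applied to both dates, followed by attention fusion."""

    def __init__(self, in_channels: int = 3, channels: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(                      # weights shared across dates
            nn.Conv2d(in_channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.fuse = AttentionFusion(channels)

    def forward(self, img_t1: torch.Tensor, img_t2: torch.Tensor) -> torch.Tensor:
        return self.fuse(self.encoder(img_t1), self.encoder(img_t2))


if __name__ == "__main__":
    t1 = torch.randn(2, 3, 256, 256)   # bi-temporal image pair
    t2 = torch.randn(2, 3, 256, 256)
    fused = SiameseEncoderFusion()(t1, t2)
    print(fused.shape)                 # torch.Size([2, 32, 256, 256])
```

In the full SNAFF design such fused features would feed the densely connected decoder nodes; the sketch stops at a single fusion step to keep the idea visible.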

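The abstract also mentions a test-time augmentation (TTA) strategy but does not specify which augmentations are used. Below is a minimal sketch of a common flip-averaging TTA scheme for change detection; the choice of transforms and the model interface (an image pair mapped to change logits) are assumptions, not the paper's exact procedure.

```python
# Minimal test-time augmentation (TTA) sketch: average the change probabilities
# predicted on flipped copies of the input pair. The set of flips is an
# assumption; the paper does not state which augmentations SNAFF applies.
import torch


@torch.no_grad()
def predict_with_tta(model, img_t1, img_t2):
    """Return the mean change-probability map over identity + flips."""
    # Each entry: (forward transform for inputs, inverse transform for outputs).
    flips = [
        (lambda x: x,                   lambda y: y),                    # identity
        (lambda x: torch.flip(x, [-1]), lambda y: torch.flip(y, [-1])),  # horizontal flip
        (lambda x: torch.flip(x, [-2]), lambda y: torch.flip(y, [-2])),  # vertical flip
    ]
    probs = []
    for fwd, inv in flips:
        logits = model(fwd(img_t1), fwd(img_t2))   # assumed (B, 1, H, W) change logits
        probs.append(inv(torch.sigmoid(logits)))   # undo the flip on the prediction
    return torch.stack(probs).mean(dim=0)          # average the aligned maps
```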