Remote Sensing Image Fusion Algorithm Based on Two-Stream Fusion Network and Residual Channel Attention Mechanism

Mengxing Huang, Shi Liu, Zhenfeng Li, Siling Feng, Di Wu, Yuanyuan Wu, Feng Shu
{"title":"基于两流融合网络和残差通道关注机制的遥感图像融合算法","authors":"Mengxing Huang, Shi Liu, Zhenfeng Li, Siling Feng, Di Wu, Yuanyuan Wu, Feng Shu","doi":"10.1155/2022/8476000","DOIUrl":null,"url":null,"abstract":"A two-stream remote sensing image fusion network (RCAMTFNet) based on the residual channel attention mechanism is proposed by introducing the residual channel attention mechanism (RCAM) in this paper. In the RCAMTFNet, the spatial features of PAN and the spectral features of MS are extracted, respectively, by a two-channel feature extraction layer. Multiresidual connections allow the network to adapt to a deeper network structure without the degradation. The residual channel attention mechanism is introduced to learn the interdependence between channels, and then the correlation features among channels are adapted on the basis of the dependency. In this way, image spatial information and spectral information are extracted exclusively. What is more, pansharpening images are reconstructed across the board. Experiments are conducted on two satellite datasets, GaoFen-2 and WorldView-2. The experimental results show that the proposed algorithm is superior to the algorithms to some existing literature in the comparison of the values of reference evaluation indicators and nonreference evaluation indicators.","PeriodicalId":23995,"journal":{"name":"Wirel. Commun. Mob. 
Comput.","volume":"62 1","pages":"8476000:1-8476000:14"},"PeriodicalIF":0.0000,"publicationDate":"2022-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Remote Sensing Image Fusion Algorithm Based on Two-Stream Fusion Network and Residual Channel Attention Mechanism\",\"authors\":\"Mengxing Huang, Shi Liu, Zhenfeng Li, Siling Feng, Di Wu, Yuanyuan Wu, Feng Shu\",\"doi\":\"10.1155/2022/8476000\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A two-stream remote sensing image fusion network (RCAMTFNet) based on the residual channel attention mechanism is proposed by introducing the residual channel attention mechanism (RCAM) in this paper. In the RCAMTFNet, the spatial features of PAN and the spectral features of MS are extracted, respectively, by a two-channel feature extraction layer. Multiresidual connections allow the network to adapt to a deeper network structure without the degradation. The residual channel attention mechanism is introduced to learn the interdependence between channels, and then the correlation features among channels are adapted on the basis of the dependency. In this way, image spatial information and spectral information are extracted exclusively. What is more, pansharpening images are reconstructed across the board. Experiments are conducted on two satellite datasets, GaoFen-2 and WorldView-2. The experimental results show that the proposed algorithm is superior to the algorithms to some existing literature in the comparison of the values of reference evaluation indicators and nonreference evaluation indicators.\",\"PeriodicalId\":23995,\"journal\":{\"name\":\"Wirel. Commun. Mob. 
Comput.\",\"volume\":\"62 1\",\"pages\":\"8476000:1-8476000:14\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-01-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Wirel. Commun. Mob. Comput.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1155/2022/8476000\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Wirel. Commun. Mob. Comput.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1155/2022/8476000","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

A two-stream remote sensing image fusion network (RCAMTFNet) is proposed in this paper by introducing the residual channel attention mechanism (RCAM). In the RCAMTFNet, the spatial features of the PAN image and the spectral features of the MS image are extracted separately by a two-channel feature extraction layer. Multiple residual connections allow the network to adopt a deeper structure without degradation. The residual channel attention mechanism is introduced to learn the interdependence between channels, and the correlated features among channels are then adapted on the basis of that dependency. In this way, spatial and spectral image information are extracted independently, and the pansharpened images are fully reconstructed. Experiments are conducted on two satellite datasets, GaoFen-2 and WorldView-2. The experimental results show that the proposed algorithm outperforms algorithms from the existing literature on both reference and no-reference evaluation metrics.
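The residual channel attention step described in the abstract (learn channel interdependence, reweight channels accordingly, then add the residual input) follows the general squeeze-and-excitation pattern. The sketch below is a minimal NumPy illustration of that pattern, not the paper's exact architecture; the layer shapes, the reduction ratio `r`, and the function names are assumptions for demonstration only:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def residual_channel_attention(x, w1, w2):
    """Residual channel attention on a feature map x of shape (C, H, W).

    1. Squeeze: global average pooling yields one descriptor per channel.
    2. Excitation: two small fully connected layers (w1: C -> C/r,
       w2: C/r -> C) model the interdependence between channels and
       produce a per-channel weight in (0, 1).
    3. Rescale the channels by those weights, then add the residual input.
    """
    z = x.mean(axis=(1, 2))                     # squeeze: (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))   # excitation: ReLU then sigmoid gate
    return x + x * s[:, None, None]             # channel rescale + residual connection

# Toy example: 8 channels, reduction ratio r = 4 (values chosen arbitrarily)
rng = np.random.default_rng(0)
C, r = 8, 4
x = rng.standard_normal((C, 16, 16))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
y = residual_channel_attention(x, w1, w2)
print(y.shape)  # (8, 16, 16)
```

Because the attention weights gate channels multiplicatively and the input is added back, the block can at worst pass features through nearly unchanged, which is what lets such residual connections support deeper networks without degradation.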