FMAMPN: lightweight feature map attention multipath network for semantic segmentation of remote sensing image
Songqi Hou, Ying Yuan
Other Conferences, published 2024-06-06. DOI: 10.1117/12.3031941 (https://doi.org/10.1117/12.3031941)
Citations: 0
Abstract
Deep neural networks excel at semantic segmentation of remote sensing images, but existing methods, despite their sophistication, typically model only the channel and spatial dependencies within a single feature map. This leads to uniform treatment of diverse feature maps, hindering information exchange among them and limiting model efficacy. To address this, we introduce Feature Map Attention, which dynamically modulates weights based on the interdependencies among different feature maps, fostering connections and feature fusion and enhancing the model's capability to represent features. Importantly, this improvement comes at minimal additional computational cost. We also incorporate multipath skip connections that efficiently transmit features at multiple scales from encoder to decoder, further boosting overall model effectiveness. Our FMAMPN, a lightweight neural network, outperforms other state-of-the-art lightweight models across multiple datasets.
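The core idea of weighting whole feature maps by their interdependencies, rather than treating them uniformly, can be sketched as follows. This is a minimal illustrative example, not the authors' exact formulation: it assumes each map is summarized by global average pooling and that a softmax over the resulting per-map descriptors yields the modulation weights.

```python
import numpy as np

def feature_map_attention(feature_maps):
    """Hypothetical sketch of attention across feature maps.

    feature_maps: array of shape (N, H, W), i.e. N distinct feature maps.
    Each map is reduced to a scalar descriptor by global average pooling;
    a softmax over the N descriptors produces one weight per map, and each
    map is rescaled by its weight, so maps are no longer treated uniformly.
    """
    descriptors = feature_maps.mean(axis=(1, 2))        # (N,) one scalar per map
    shifted = descriptors - descriptors.max()           # subtract max for numerical stability
    weights = np.exp(shifted)
    weights /= weights.sum()                            # softmax over the N maps
    return feature_maps * weights[:, None, None], weights

# Usage: three constant 4x4 maps with increasing mean activation.
maps = np.stack([np.full((4, 4), v, dtype=float) for v in (1.0, 2.0, 3.0)])
out, w = feature_map_attention(maps)
```

In the paper the weights are presumably produced by a learned, trainable mapping; the fixed pooling-plus-softmax here only demonstrates the cross-map weighting mechanism with negligible extra computation.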