CGF-Unet: Semantic Segmentation of Sidescan Sonar Based on Unet Combined With Global Features

IF 3.8 | CAS Region 2 (Engineering & Technology) | JCR Q1 (ENGINEERING, CIVIL)
Yushan Sun;Haotian Zheng;Guocheng Zhang;Jingfei Ren;Guoyang Shu
{"title":"CGF-Unet:基于 Unet 结合全局特征的侧扫声纳语义分割","authors":"Yushan Sun;Haotian Zheng;Guocheng Zhang;Jingfei Ren;Guoyang Shu","doi":"10.1109/JOE.2024.3364670","DOIUrl":null,"url":null,"abstract":"In the realm of oceanic exploration, sidescan sonar's significance is indisputable. However, the inherent challenges of low resolution and robust noise interference in sidescan sonar images have presented a formidable barrier to semantic segmentation in target regions. To address this, we propose a novel CGF-Unet framework, amalgamating Unet with global features, for precise and rapid sidescan sonar image segmentation. Leveraging both Transformers and Unet, CGF-Unet strategically introduces Transformer Blocks during downsampling and upsampling, amplifying access to comprehensive global insights and synergizing Transformer's potent sequence encoding with convolutional neural network's (CNN) holistic perception and spatial invariance. The incorporation of Conv-Attention within the Transformer Block streamlines model training parameters, accelerates training pace, and bolsters learning prowess. By implementing a weighted loss function, we navigate the challenge posed by skewed positive and negative samples, thereby elevating segmentation accuracy. Demonstrating its novelty, on distinct sidescan sonar data sets, we achieve exceptional mIOU scores of 89.3% and 86.5%, surpassing existing methodologies in precision. Remarkably, even amidst noise perturbation, the method maintains robust performance.","PeriodicalId":13191,"journal":{"name":"IEEE Journal of Oceanic Engineering","volume":"49 3","pages":"963-975"},"PeriodicalIF":3.8000,"publicationDate":"2024-04-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"CGF-Unet: Semantic Segmentation of Sidescan Sonar Based on Unet Combined With Global Features\",\"authors\":\"Yushan Sun;Haotian Zheng;Guocheng Zhang;Jingfei Ren;Guoyang Shu\",\"doi\":\"10.1109/JOE.2024.3364670\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the realm of oceanic exploration, sidescan sonar's significance is indisputable. However, the inherent challenges of low resolution and robust noise interference in sidescan sonar images have presented a formidable barrier to semantic segmentation in target regions. To address this, we propose a novel CGF-Unet framework, amalgamating Unet with global features, for precise and rapid sidescan sonar image segmentation. Leveraging both Transformers and Unet, CGF-Unet strategically introduces Transformer Blocks during downsampling and upsampling, amplifying access to comprehensive global insights and synergizing Transformer's potent sequence encoding with convolutional neural network's (CNN) holistic perception and spatial invariance. The incorporation of Conv-Attention within the Transformer Block streamlines model training parameters, accelerates training pace, and bolsters learning prowess. By implementing a weighted loss function, we navigate the challenge posed by skewed positive and negative samples, thereby elevating segmentation accuracy. Demonstrating its novelty, on distinct sidescan sonar data sets, we achieve exceptional mIOU scores of 89.3% and 86.5%, surpassing existing methodologies in precision. 
Remarkably, even amidst noise perturbation, the method maintains robust performance.\",\"PeriodicalId\":13191,\"journal\":{\"name\":\"IEEE Journal of Oceanic Engineering\",\"volume\":\"49 3\",\"pages\":\"963-975\"},\"PeriodicalIF\":3.8000,\"publicationDate\":\"2024-04-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal of Oceanic Engineering\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10496477/\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, CIVIL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Oceanic Engineering","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10496477/","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
Citations: 0

Abstract

In the realm of oceanic exploration, sidescan sonar's significance is indisputable. However, the inherent challenges of low resolution and robust noise interference in sidescan sonar images have presented a formidable barrier to semantic segmentation in target regions. To address this, we propose a novel CGF-Unet framework, amalgamating Unet with global features, for precise and rapid sidescan sonar image segmentation. Leveraging both Transformers and Unet, CGF-Unet strategically introduces Transformer Blocks during downsampling and upsampling, amplifying access to comprehensive global insights and synergizing Transformer's potent sequence encoding with convolutional neural network's (CNN) holistic perception and spatial invariance. The incorporation of Conv-Attention within the Transformer Block streamlines model training parameters, accelerates training pace, and bolsters learning prowess. By implementing a weighted loss function, we navigate the challenge posed by skewed positive and negative samples, thereby elevating segmentation accuracy. Demonstrating its novelty, on distinct sidescan sonar data sets, we achieve exceptional mIOU scores of 89.3% and 86.5%, surpassing existing methodologies in precision. Remarkably, even amidst noise perturbation, the method maintains robust performance.
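The abstract credits part of the accuracy gain to a weighted loss function that counters the skew between positive (target) and negative (background) pixels, and reports results as mIOU. The paper's exact weighting scheme and class setup are not reproduced here; the PyTorch sketch below is only a minimal illustration of those two ideas, in which the class weights, the two-class setting, and the tensor shapes are assumptions rather than values from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical class weights favoring the rare target class over background;
# the paper uses a weighted loss but does not publish these exact values.
class_weights = torch.tensor([0.3, 0.7])
criterion = nn.CrossEntropyLoss(weight=class_weights)

logits = torch.randn(4, 2, 128, 128)          # (batch, classes, H, W) network output
labels = torch.randint(0, 2, (4, 128, 128))   # per-pixel ground-truth class indices
loss = criterion(logits, labels)              # class-weighted cross-entropy

def mean_iou(pred: torch.Tensor, target: torch.Tensor, num_classes: int = 2) -> torch.Tensor:
    """Mean intersection-over-union over classes that occur in pred or target."""
    ious = []
    for c in range(num_classes):
        inter = ((pred == c) & (target == c)).sum().float()
        union = ((pred == c) | (target == c)).sum().float()
        if union > 0:
            ious.append(inter / union)
    return torch.stack(ious).mean()

miou = mean_iou(logits.argmax(dim=1), labels)
```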
Source journal
IEEE Journal of Oceanic Engineering
Category: Engineering & Technology - Engineering, Ocean
CiteScore: 9.60
Self-citation rate: 12.20%
Articles per year: 86
Review time: 12 months
About the journal: The IEEE Journal of Oceanic Engineering (ISSN 0364-9059) is the online-only quarterly publication of the IEEE Oceanic Engineering Society (IEEE OES). The scope of the Journal is the field of interest of the IEEE OES, which encompasses all aspects of science, engineering, and technology that address research, development, and operations pertaining to all bodies of water. This includes the creation of new capabilities and technologies from concept design through prototypes, testing, and operational systems to sense, explore, understand, develop, use, and responsibly manage natural resources.