A fully convolutional network for spatiotemporal feature extraction and detection of active sonar target echoes

IF 5.5 · CAS Tier 2 (Engineering & Technology) · JCR Q1 (Engineering, Civil)
Nan Lu, Tongsheng Shen, Zailei Luo, Xionghui Li, Yongmeng Zhu
{"title":"A Fully convolutional network for spatiotemporal feature extraction and detection of active sonar target echoes","authors":"Nan Lu,&nbsp;Tongsheng Shen,&nbsp;Zailei Luo,&nbsp;Xionghui Li,&nbsp;Yongmeng Zhu","doi":"10.1016/j.oceaneng.2025.122955","DOIUrl":null,"url":null,"abstract":"<div><div>Conventional active sonar target detection methods primarily rely on echo amplitude information, often neglecting the valuable spatiotemporal structural features (ST-SF) introduced by signal processing operations such as matched filtering and beamforming. This limitation restricts their detection performance in challenging environments. To address this issue, this paper proposes a novel method termed FCN-STFED (Fully Convolutional Network for Spatiotemporal Feature Extraction and Detection). The core of the method is an encoder-decoder based fully convolutional network (FCN), which learns, in a data-driven and end-to-end manner, a complex nonlinear mapping from time-angle energy matrix, incorporating local contextual information, to detection statistics. This enables effective exploitation of the spatial structure characteristics of target echoes. The detection threshold is adaptively determined via Monte Carlo simulation according to a preset false alarm probability. Experimental results demonstrate that the proposed FCN-STFED method achieves superior performance over the conventional two-dimensional constant false alarm rate (CFAR) detector. It yields an average improvement in detection probability of approximately 21<span><math><mo>%</mo></math></span> at the same false alarm rate when processing complex multi-highlight target models under low signal-to-clutter ratio (SCR) conditions. Visualization analyses further confirm that the FCN successfully learns discriminative structural features of targets, leading to more reliable detection under low SCR conditions. This study validates the significant potential of using deep learning to exploit inherent ST-SF in the signal processing domain, offering an effective and engineering-feasible new approach for enhancing active sonar detection performance in complex environments.</div></div>","PeriodicalId":19403,"journal":{"name":"Ocean Engineering","volume":"342 ","pages":"Article 122955"},"PeriodicalIF":5.5000,"publicationDate":"2025-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Ocean Engineering","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0029801825026381","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
引用次数: 0

Abstract

Conventional active sonar target detection methods primarily rely on echo amplitude information, often neglecting the valuable spatiotemporal structural features (ST-SF) introduced by signal processing operations such as matched filtering and beamforming. This limitation restricts their detection performance in challenging environments. To address this issue, this paper proposes a novel method termed FCN-STFED (Fully Convolutional Network for Spatiotemporal Feature Extraction and Detection). The core of the method is an encoder-decoder-based fully convolutional network (FCN), which learns, in a data-driven and end-to-end manner, a complex nonlinear mapping from the time-angle energy matrix, incorporating local contextual information, to detection statistics. This enables effective exploitation of the spatial structure characteristics of target echoes. The detection threshold is adaptively determined via Monte Carlo simulation according to a preset false alarm probability. Experimental results demonstrate that the proposed FCN-STFED method achieves superior performance over the conventional two-dimensional constant false alarm rate (CFAR) detector. It yields an average improvement in detection probability of approximately 21% at the same false alarm rate when processing complex multi-highlight target models under low signal-to-clutter ratio (SCR) conditions. Visualization analyses further confirm that the FCN successfully learns discriminative structural features of targets, leading to more reliable detection under low SCR conditions. This study validates the significant potential of using deep learning to exploit inherent ST-SF in the signal processing domain, offering an effective and engineering-feasible new approach for enhancing active sonar detection performance in complex environments.
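The abstract's core idea is an encoder-decoder FCN that maps a time-angle energy matrix to per-cell detection statistics. The paper's architecture details are not given here, so the PyTorch sketch below is only an illustrative assumption: the layer counts, channel widths, and the class name `FCNDetector` are hypothetical, not the published network.

```python
# Minimal sketch of an encoder-decoder FCN that maps a time-angle energy
# matrix to a per-cell detection-statistic map. Layer counts, channel
# widths, and the class name are illustrative assumptions, not the
# architecture published in the paper.
import torch
import torch.nn as nn

class FCNDetector(nn.Module):  # hypothetical name
    def __init__(self):
        super().__init__()
        # Encoder: downsample the time-angle plane while widening channels,
        # so each feature aggregates local spatiotemporal context.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.BatchNorm2d(16), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Decoder: upsample back to the input resolution and emit one
        # detection statistic per time-angle cell. Being fully convolutional,
        # it accepts any input whose height and width are divisible by 4.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(),
            nn.Conv2d(8, 1, 1),
        )

    def forward(self, x):  # x: (batch, 1, time, angle)
        return torch.sigmoid(self.decoder(self.encoder(x)))

# Example: a 256-sample x 64-beam energy matrix in, a same-sized map of
# detection statistics out.
stats = FCNDetector()(torch.rand(1, 1, 256, 64))
print(stats.shape)  # torch.Size([1, 1, 256, 64])
```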
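The abstract also states that the detection threshold is set adaptively via Monte Carlo simulation for a preset false alarm probability. The following sketch shows one way such a procedure could look, assuming an exponential clutter-energy model and reusing the hypothetical `FCNDetector` above; both assumptions are for illustration and are not taken from the paper.

```python
# Sketch of Monte Carlo threshold selection at a preset false-alarm
# probability: score many target-free (clutter-only) energy matrices with
# the trained detector and take the (1 - Pfa) quantile of the pooled
# per-cell statistics as the decision threshold. The exponential clutter
# model is an assumption for illustration, not the paper's simulation setup.
import numpy as np
import torch

def monte_carlo_threshold(model, pfa=1e-3, n_trials=200, shape=(256, 64)):
    model.eval()
    scores = []
    with torch.no_grad():
        for _ in range(n_trials):
            # Clutter-only time-angle energy matrix (assumed exponential energies).
            clutter = torch.from_numpy(
                np.random.exponential(1.0, size=shape).astype(np.float32)
            ).unsqueeze(0).unsqueeze(0)  # shape (1, 1, time, angle)
            scores.append(model(clutter).flatten().numpy())
    # Threshold exceeded by roughly a fraction pfa of clutter-only statistics.
    return float(np.quantile(np.concatenate(scores), 1.0 - pfa))

# Usage with the FCNDetector sketched above (hypothetical):
# threshold = monte_carlo_threshold(trained_model, pfa=1e-3)
# detections = trained_model(energy_matrix) > threshold
```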
Source journal

Ocean Engineering (Engineering & Technology - Ocean Engineering)
CiteScore: 7.30
Self-citation rate: 34.00%
Articles per year: 2379
Review time: 8.1 months
Journal description: Ocean Engineering provides a medium for the publication of original research and development work in the field of ocean engineering.