Multiframe spatio-temporal attention motion-adaptive network for moving space target detection

Impact factor 2.8 · CAS Tier 3 (Earth Science) · JCR Q2 (Astronomy & Astrophysics)
Yuxi Guo, Junzhe Cao, Bindang Xue
{"title":"Multiframe spatio-temporal attention motion-adaptive network for moving space target detection","authors":"Yuxi Guo ,&nbsp;Junzhe Cao ,&nbsp;Bindang Xue","doi":"10.1016/j.asr.2025.08.024","DOIUrl":null,"url":null,"abstract":"<div><div>Space target detection based on optical observations is a fundamental approach for space situational awareness. However, in ground-based optical images, space targets often resemble stars, making single-frame differentiation challenging. The dense stellar background with high dynamic range further complicates object extraction. In real-world scenarios, varying relative motion between objects and the observation platform induces diverse apparent speeds, scales, and shapes, posing challenges for existing methods to exploit multiframe motion cues effectively, thereby limiting detection performance. To address these challenges, we propose the Multiframe Spatio-temporal Attention Motion-adaptive Network (MSAMNet) for robust small moving target detection in complex backgrounds. MSAMNet integrates Adaptive Attention Feature Enhancement (AAFE) and Spatio-Temporal Dynamic Motion-Aware (STDMA) modules to enhance spatio-temporal feature representation. AAFE generates attention maps based on channel-wise differences and local feature distributions to suppress noise and highlight target details. STDMA captures motion features across consecutive frames through exponential decay weighting and multi-scale heterogeneous kernel convolution, improving sensitivity to diverse motion patterns. Furthermore, we introduce BUAA-MSOD, the first open-source dataset of multiple moving space objects in real-world scenarios, covering diverse motion patterns and target morphologies with both mask and point-level annotations. Experimental results demonstrate that MSAMNet significantly outperforms state-of-the-art methods on BUAA-MSOD, achieving higher detection accuracy and lower false alarm rates across various space targets.</div></div>","PeriodicalId":50850,"journal":{"name":"Advances in Space Research","volume":"76 9","pages":"Pages 5383-5405"},"PeriodicalIF":2.8000,"publicationDate":"2025-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in Space Research","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0273117725009007","RegionNum":3,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ASTRONOMY & ASTROPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

Space target detection based on optical observations is a fundamental approach for space situational awareness. However, in ground-based optical images, space targets often resemble stars, making single-frame differentiation challenging. The dense stellar background with high dynamic range further complicates object extraction. In real-world scenarios, varying relative motion between objects and the observation platform induces diverse apparent speeds, scales, and shapes, posing challenges for existing methods to exploit multiframe motion cues effectively, thereby limiting detection performance. To address these challenges, we propose the Multiframe Spatio-temporal Attention Motion-adaptive Network (MSAMNet) for robust small moving target detection in complex backgrounds. MSAMNet integrates Adaptive Attention Feature Enhancement (AAFE) and Spatio-Temporal Dynamic Motion-Aware (STDMA) modules to enhance spatio-temporal feature representation. AAFE generates attention maps based on channel-wise differences and local feature distributions to suppress noise and highlight target details. STDMA captures motion features across consecutive frames through exponential decay weighting and multi-scale heterogeneous kernel convolution, improving sensitivity to diverse motion patterns. Furthermore, we introduce BUAA-MSOD, the first open-source dataset of multiple moving space objects in real-world scenarios, covering diverse motion patterns and target morphologies with both mask and point-level annotations. Experimental results demonstrate that MSAMNet significantly outperforms state-of-the-art methods on BUAA-MSOD, achieving higher detection accuracy and lower false alarm rates across various space targets.
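The abstract only names the mechanisms inside AAFE and STDMA; the full architecture is defined in the paper itself. As a rough, non-authoritative illustration, the PyTorch sketch below shows one plausible reading of three of the named ingredients: exponential-decay weighting over consecutive frames, channel-wise-difference attention, and multi-scale heterogeneous kernel convolution. All class names, tensor shapes, and the learnable decay parameter are assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class ExponentialDecayTemporalWeighting(nn.Module):
    """Hypothetical sketch: weight a stack of T consecutive frames so that
    frames closer to the reference (last) frame contribute more, via an
    exponential decay over frame distance. The decay rate is a learnable
    scalar here; the abstract only states that exponential decay weighting
    is used, not its exact form."""

    def __init__(self, init_decay: float = 0.5):
        super().__init__()
        self.log_decay = nn.Parameter(torch.tensor(float(init_decay)).log())

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (B, T, C, H, W); frame T-1 is taken as the reference frame.
        b, t, c, h, w = frames.shape
        decay = self.log_decay.exp()
        # distance of each frame from the reference frame: T-1, ..., 1, 0
        dist = torch.arange(t - 1, -1, -1, device=frames.device, dtype=frames.dtype)
        weights = torch.exp(-decay * dist)           # (T,)
        weights = weights / weights.sum()            # normalise
        weighted = frames * weights.view(1, t, 1, 1, 1)
        return weighted.sum(dim=1)                   # (B, C, H, W) fused motion cue


class ChannelDifferenceAttention(nn.Module):
    """Hypothetical sketch of channel-wise-difference attention: contrast each
    channel against the channel mean and pass the result through a small conv
    plus sigmoid to obtain a spatial attention map that highlights faint
    target details while suppressing background."""

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv2d(channels, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        diff = x - x.mean(dim=1, keepdim=True)       # channel-wise differences
        attn = torch.sigmoid(self.conv(diff))        # (B, 1, H, W) attention map
        return x * attn                              # re-weighted features


class MultiScaleHeterogeneousConv(nn.Module):
    """Hypothetical sketch of multi-scale heterogeneous kernel convolution:
    parallel branches with different kernel sizes, concatenated and fused,
    so targets with different apparent scales and streak lengths are covered."""

    def __init__(self, in_ch: int, out_ch: int, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes
        )
        self.fuse = nn.Conv2d(out_ch * len(kernel_sizes), out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fuse(torch.cat([branch(x) for branch in self.branches], dim=1))
```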
Source journal
Advances in Space Research
Category: Geoscience & Astronomy (Comprehensive Earth Science)
CiteScore: 5.20
Self-citation rate: 11.50%
Annual publications: 800
Review time: 5.8 months
Journal description: The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc. NB: Please note that manuscripts on life sciences as related to space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR). All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.