Input matrix compensated strong tracking filter for maneuvering spacecraft tracking

IF 5.8 · CAS Zone 1 (Engineering & Technology) · JCR Q1 (Engineering, Aerospace)
Peng Zhang, Nan Zhang, Hexi Baoyin, Zhaokui Wang
{"title":"Input matrix compensated strong tracking filter for maneuvering spacecraft tracking","authors":"Peng Zhang ,&nbsp;Nan Zhang ,&nbsp;Hexi Baoyin ,&nbsp;Zhaokui Wang","doi":"10.1016/j.ast.2025.110995","DOIUrl":null,"url":null,"abstract":"<div><div>Accurate tracking of maneuvering space targets is crucial, as their unpredictable movements pose significant challenges in increasingly congested space environments. A widely used method for tracking impulsive maneuvering targets is the Strong Tracking Filter (STF). While effective, this method has two primary limitations in practical applications. First, it causes significant transient tracking errors following a maneuver. Second, because the method relies on measurement residuals to detect maneuvers, it is unable to distinguish between actual maneuvers and measurement outliers, leading to misinterpretations that degrade tracking accuracy. This paper proposes two key improvements to overcome these issues. First, an input matrix compensation framework is introduced based on a residual orthogonalization criterion, which updates the covariance in a way that better reflects the physical impact of unknown maneuvers through the system’s input matrix. This modification effectively eliminates transient tracking error overshoots while maintaining tracking accuracy. Second, an auxiliary filter is introduced to handle measurement outliers, allowing for precise differentiation between outliers and maneuvers, thereby enhancing the algorithm’s robustness in the presence of outliers. Simulation results demonstrate that the proposed method outperforms the STF in terms of convergence, accuracy, and robustness, particularly in scenarios with measurement outliers.</div></div>","PeriodicalId":50955,"journal":{"name":"Aerospace Science and Technology","volume":"168 ","pages":"Article 110995"},"PeriodicalIF":5.8000,"publicationDate":"2025-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Aerospace Science and Technology","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1270963825010582","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, AEROSPACE","Score":null,"Total":0}
Citations: 0

Abstract

Accurate tracking of maneuvering space targets is crucial, as their unpredictable movements pose significant challenges in increasingly congested space environments. A widely used method for tracking impulsive maneuvering targets is the Strong Tracking Filter (STF). While effective, this method has two primary limitations in practical applications. First, it causes significant transient tracking errors following a maneuver. Second, because the method relies on measurement residuals to detect maneuvers, it is unable to distinguish between actual maneuvers and measurement outliers, leading to misinterpretations that degrade tracking accuracy. This paper proposes two key improvements to overcome these issues. First, an input matrix compensation framework is introduced based on a residual orthogonalization criterion, which updates the covariance in a way that better reflects the physical impact of unknown maneuvers through the system’s input matrix. This modification effectively eliminates transient tracking error overshoots while maintaining tracking accuracy. Second, an auxiliary filter is introduced to handle measurement outliers, allowing for precise differentiation between outliers and maneuvers, thereby enhancing the algorithm’s robustness in the presence of outliers. Simulation results demonstrate that the proposed method outperforms the STF in terms of convergence, accuracy, and robustness, particularly in scenarios with measurement outliers.
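For context, the baseline Strong Tracking Filter that the abstract refers to augments a standard Kalman filter with a suboptimal fading factor that inflates the predicted covariance whenever the measurement residuals stop being approximately orthogonal, which is how the filter reacts to unmodeled maneuvers. The sketch below is a minimal, illustrative implementation of that conventional fading-factor update for a linear system; the parameter values (beta, rho), variable names, and model are assumptions for illustration, and the paper's proposed input-matrix compensation and auxiliary outlier filter are not reproduced here.

```python
import numpy as np

class StrongTrackingKF:
    """Minimal linear Kalman filter with a suboptimal fading factor (conventional STF).

    Illustrative sketch only: lambda_k inflates the predicted covariance when the
    residuals violate the orthogonality principle, so the filter re-weights recent
    measurements after an unmodeled (e.g. impulsive) maneuver.
    """

    def __init__(self, F, H, Q, R, x0, P0, beta=5.0, rho=0.95):
        self.F, self.H, self.Q, self.R = F, H, Q, R
        self.x, self.P = x0, P0
        self.beta = beta   # softening factor (assumed value)
        self.rho = rho     # forgetting factor for the residual covariance (assumed value)
        self.V = None      # running estimate of the residual covariance

    def step(self, z):
        F, H, Q, R = self.F, self.H, self.Q, self.R

        # One-step state prediction and measurement residual
        x_pred = F @ self.x
        gamma = z - H @ x_pred

        # Recursive estimate of the residual covariance V_k
        outer = np.outer(gamma, gamma)
        self.V = outer if self.V is None else (self.rho * self.V + outer) / (1.0 + self.rho)

        # Suboptimal fading factor from the residual-orthogonality criterion
        N = self.V - H @ Q @ H.T - self.beta * R
        M = H @ F @ self.P @ F.T @ H.T
        lam = max(1.0, np.trace(N) / np.trace(M))

        # Inflated covariance prediction followed by a standard Kalman update
        P_pred = lam * (F @ self.P @ F.T) + Q
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        self.x = x_pred + K @ gamma
        self.P = (np.eye(len(self.x)) - K @ H) @ P_pred
        return self.x

# Example use on a 1-D constant-velocity model (all numbers illustrative)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-3 * np.eye(2)
R = np.array([[0.5]])
stf = StrongTrackingKF(F, H, Q, R, x0=np.zeros(2), P0=np.eye(2))
for z in [np.array([0.1]), np.array([1.2]), np.array([2.0])]:
    print(stf.step(z))
```

Because this fading factor scales the whole predicted covariance, it tends to overshoot immediately after a maneuver and it fires on outliers as well as on genuine maneuvers; those are exactly the two limitations the abstract says the proposed input-matrix-compensated formulation and auxiliary filter are designed to remove.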
Source Journal

Aerospace Science and Technology (Engineering & Technology – Engineering: Aerospace)
CiteScore: 10.30
Self-citation rate: 28.60%
Articles per year: 654
Average review time: 54 days
About the journal: Aerospace Science and Technology publishes articles of outstanding scientific quality. Each article is reviewed by two referees. The journal welcomes papers from a wide range of countries. It publishes original papers, review articles and short communications related to all fields of aerospace research, fundamental and applied, whose potential applications are clearly related to:
• The design and manufacture of aircraft, helicopters, missiles, launchers and satellites
• The control of their environment
• The study of the various systems they are involved in, as supports or as targets.
Authors are invited to submit papers on new advances in the following topics as applied to aerospace:
• Fluid dynamics
• Energetics and propulsion
• Materials and structures
• Flight mechanics
• Navigation, guidance and control
• Acoustics
• Optics
• Electromagnetism and radar
• Signal and image processing
• Information processing
• Data fusion
• Decision aid
• Human behaviour
• Robotics and intelligent systems
• Complex system engineering
Etc.