Robust Semantic Feature Extraction and Attitude Estimation of Unseen Noncooperative On-Orbit Spacecraft
Huanyu Yin; Xiaoyuan Ren; Libing Jiang; Canyu Wang; Qianwen Xiong; Zhuang Wang
IEEE Sensors Journal, vol. 25, no. 16, pp. 31858-31873, published 2025-07-10
DOI: 10.1109/JSEN.2025.3585138 (https://ieeexplore.ieee.org/document/11075951/)
Citations: 0
Abstract
Attitude estimation of noncooperative spacecraft based on a monocular camera is a crucial technique in on-orbit servicing missions. Most existing methods rely on a known 3-D model of the target or require a large number of observation images with ground-truth labels, and therefore do not apply to unseen spacecraft that lack such prior knowledge. In this article, we present a two-stage framework for semantic feature extraction of typical spacecraft components and on-orbit attitude estimation of unseen targets, which recovers the 3-D attitude from the 2-D axes of typical spacecraft components in sequential observation images. First, a spacecraft semantic feature network (SSF-Net) is designed that learns the semantic features common to typical components across different spacecraft, thereby generalizing well to unseen targets and extracting their axis features. Then, we introduce homographic adaptation, geometric constraints on the semantic features, and dynamic constraints across sequential images to suppress false positives and missed detections of the extracted features under extreme observation perspectives. Finally, an axis reconstruction algorithm based on random sample consensus (RANSAC) is proposed to estimate the attitude of unseen on-orbit spacecraft. Simulation results confirm that the proposed method effectively extracts semantic features, with average pixel and angular errors of 6.93 pixels and 1.86°, respectively, and accurately estimates the attitude of unseen spacecraft with typical component structures, achieving an average estimation error of 3.25°. Experiments also show significant advantages over classical methods and excellent robustness under degraded observation conditions.
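The final stage of the framework is RANSAC-based axis reconstruction. The abstract does not describe the implementation, so the sketch below is only an illustration of the robust-fitting idea: fitting a single 2-D component axis to noisy pixel detections with RANSAC in Python. The function name, iteration count, inlier threshold, and the least-squares refinement step are assumptions for the example, not the authors' algorithm.

```python
import numpy as np

def ransac_line_fit(points, n_iters=500, inlier_thresh=2.0, seed=None):
    """Robustly fit a 2-D line (axis) to candidate pixel detections.

    points: (N, 2) array of candidate axis pixels, possibly contaminated
    by false positives. Returns (origin, direction, inlier_mask).
    Hypothetical example; not the implementation from the paper.
    """
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best_inliers = np.zeros(len(pts), dtype=bool)

    for _ in range(n_iters):
        # Sample two distinct points to hypothesize a line.
        i, j = rng.choice(len(pts), size=2, replace=False)
        d = pts[j] - pts[i]
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d /= norm
        # Perpendicular distance of every point to the hypothesized line.
        normal = np.array([-d[1], d[0]])
        dist = np.abs((pts - pts[i]) @ normal)
        inliers = dist < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers

    # Refine with a least-squares (PCA) fit on the consensus set.
    inlier_pts = pts[best_inliers]
    origin = inlier_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(inlier_pts - origin)
    direction = vt[0]
    return origin, direction, best_inliers
```

Per-frame axis directions recovered this way would then have to be combined across the sequential views to obtain the 3-D attitude; that multi-view reconstruction step is specific to the paper and is not reproduced here.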
Journal Description:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion, processing of wave e.g., electromagnetic and acoustic; and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data, detection, estimation and classification based on sensor data)
-Sensors in Industrial Practice