{"title":"约束视图下空间目标三维重建的不确定性神经曲面","authors":"Yuandong Li;Qinglei Hu;Fei Dong;Dongyu Li;Zhenchao Ouyang","doi":"10.1109/TCSVT.2025.3551779","DOIUrl":null,"url":null,"abstract":"In asteroid exploration and orbital servicing missions with space robots, accurate 3D structural of the target is typically relied upon for planning landing trajectories and controlling movements. Unlike conventional neural radiance fields (NeRF) studies, which rely on full-view random sampling of targets that can be easily achieved on the ground, spacecraft operations present unique challenges due to the kinematic orbit constraint, the high cost of controlled motion, and limited fuel reserves. This results in limited observation of space targets. In order to obtain 3D structure under close-flybys and restricted observation, we proposed Uncertainty Neural Surfaces (UNS) model based on Bayesian uncertainty estimation. UNS enhance the precision of reconstructed target surfaces under constrained-views, providing guidance for subsequent imaging view design. Specifically, UNS introduces Bayesian estimation based surface uncertainty on neural implicit surfaces. The estimation is calculated based on the degree of self-occlusion of the target and the difference between rendered and actual colors. This approach enables uncertain estimation of 3D space and arbitrary view. Finally, extensive systematic evaluations and analyses of spacecraft model sampling in a local darkroom validate the sophistication of UNS in uncertainty estimation and surface reconstruction quality. Code is available at <uri>https://github.com/YD-96/UNS</uri>.","PeriodicalId":13082,"journal":{"name":"IEEE Transactions on Circuits and Systems for Video Technology","volume":"35 8","pages":"8045-8056"},"PeriodicalIF":11.1000,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Uncertainty Neural Surfaces for Space Target 3D Reconstruction Under Constrained Views\",\"authors\":\"Yuandong Li;Qinglei Hu;Fei Dong;Dongyu Li;Zhenchao Ouyang\",\"doi\":\"10.1109/TCSVT.2025.3551779\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In asteroid exploration and orbital servicing missions with space robots, accurate 3D structural of the target is typically relied upon for planning landing trajectories and controlling movements. Unlike conventional neural radiance fields (NeRF) studies, which rely on full-view random sampling of targets that can be easily achieved on the ground, spacecraft operations present unique challenges due to the kinematic orbit constraint, the high cost of controlled motion, and limited fuel reserves. This results in limited observation of space targets. In order to obtain 3D structure under close-flybys and restricted observation, we proposed Uncertainty Neural Surfaces (UNS) model based on Bayesian uncertainty estimation. UNS enhance the precision of reconstructed target surfaces under constrained-views, providing guidance for subsequent imaging view design. Specifically, UNS introduces Bayesian estimation based surface uncertainty on neural implicit surfaces. The estimation is calculated based on the degree of self-occlusion of the target and the difference between rendered and actual colors. This approach enables uncertain estimation of 3D space and arbitrary view. 
Finally, extensive systematic evaluations and analyses of spacecraft model sampling in a local darkroom validate the sophistication of UNS in uncertainty estimation and surface reconstruction quality. Code is available at <uri>https://github.com/YD-96/UNS</uri>.\",\"PeriodicalId\":13082,\"journal\":{\"name\":\"IEEE Transactions on Circuits and Systems for Video Technology\",\"volume\":\"35 8\",\"pages\":\"8045-8056\"},\"PeriodicalIF\":11.1000,\"publicationDate\":\"2025-03-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Circuits and Systems for Video Technology\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10929007/\",\"RegionNum\":1,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Circuits and Systems for Video Technology","FirstCategoryId":"5","ListUrlMain":"https://ieeexplore.ieee.org/document/10929007/","RegionNum":1,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Uncertainty Neural Surfaces for Space Target 3D Reconstruction Under Constrained Views
In asteroid exploration and orbital servicing missions with space robots, an accurate 3D structure of the target is typically relied upon for planning landing trajectories and controlling movements. Unlike conventional neural radiance field (NeRF) studies, which rely on full-view random sampling of targets that is easily achieved on the ground, spacecraft operations face unique challenges due to kinematic orbit constraints, the high cost of controlled motion, and limited fuel reserves. These factors restrict how space targets can be observed. To recover 3D structure under close flybys and restricted observation, we propose an Uncertainty Neural Surfaces (UNS) model based on Bayesian uncertainty estimation. UNS improves the precision of reconstructed target surfaces under constrained views and provides guidance for subsequent imaging-view design. Specifically, UNS introduces a Bayesian-estimation-based surface uncertainty on neural implicit surfaces, computed from the degree of self-occlusion of the target and the difference between rendered and observed colors. This approach enables uncertainty estimation over 3D space and from arbitrary views. Finally, extensive systematic evaluations and analyses of spacecraft-model sampling in a local darkroom validate the advantages of UNS in uncertainty estimation and surface reconstruction quality. Code is available at https://github.com/YD-96/UNS.
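The abstract only outlines the two cues that drive the surface uncertainty (self-occlusion and the rendered-vs-observed color error). As a rough, hypothetical illustration of how such cues could be fused into a per-point score, the sketch below uses a simple noisy-OR style combination; the function name, the visible-view-count proxy for self-occlusion, and the fusion rule are assumptions for illustration only, not the authors' estimator (see https://github.com/YD-96/UNS for the released code).

import numpy as np

def surface_uncertainty(color_rendered: np.ndarray,
                        color_observed: np.ndarray,
                        visible_view_count: np.ndarray,
                        total_views: int,
                        eps: float = 1e-6) -> np.ndarray:
    """Per-point uncertainty in [0, 1] from a color residual and a self-occlusion proxy.

    color_rendered / color_observed: (N, 3) RGB values in [0, 1].
    visible_view_count: (N,) number of training views in which each surface
        point is unoccluded (hypothetical proxy for the degree of self-occlusion).
    """
    # Photometric cue: a large rendered-vs-observed error suggests a poorly constrained surface.
    # Dividing by sqrt(3) normalizes the RGB residual into [0, 1].
    color_residual = np.linalg.norm(color_rendered - color_observed, axis=-1) / np.sqrt(3.0)

    # Occlusion cue: points seen from few views carry little observational evidence.
    occlusion = 1.0 - visible_view_count / max(total_views, 1)

    # Treat the two cues as independent "unreliability" probabilities and fuse them;
    # this noisy-OR fusion is one simple choice, not the paper's exact formulation.
    uncertainty = 1.0 - (1.0 - color_residual) * (1.0 - occlusion)
    return np.clip(uncertainty, eps, 1.0 - eps)

# Toy usage: three surface points observed during a 12-view constrained sweep.
rendered = np.array([[0.5, 0.5, 0.5], [0.2, 0.3, 0.4], [0.9, 0.9, 0.9]])
observed = np.array([[0.5, 0.5, 0.5], [0.6, 0.3, 0.4], [0.9, 0.8, 0.9]])
visible = np.array([12, 3, 8])
print(surface_uncertainty(rendered, observed, visible, total_views=12))

In this toy example, the second point gets the highest uncertainty because it combines a noticeable color residual with visibility in only a few views, which is the qualitative behavior the abstract describes for guiding subsequent imaging-view design.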
Journal Introduction:
The IEEE Transactions on Circuits and Systems for Video Technology (TCSVT) is dedicated to covering all aspects of video technologies from a circuits and systems perspective. We encourage submissions of general, theoretical, and application-oriented papers related to image and video acquisition, representation, presentation, and display. Additionally, we welcome contributions in areas such as processing, filtering, and transforms; analysis and synthesis; learning and understanding; compression, transmission, communication, and networking; as well as storage, retrieval, indexing, and search. Furthermore, papers focusing on hardware and software design and implementation are highly valued. Join us in advancing the field of video technology through innovative research and insights.