Kang Yan, Jun Zheng, Shangjian Chen, Jiangfeng Wang, Fengxin Jin, Ming Dai, Weiyuan Liu, Weihao Tang, Yunbo Bi
{"title":"一种提高视觉测量系统跟踪精度的组合观测位姿校正方法","authors":"Kang Yan , Jun Zheng , Shangjian Chen , Jiangfeng Wang , Fengxin Jin , Ming Dai , Weiyuan Liu , Weihao Tang , Yunbo Bi","doi":"10.1016/j.measurement.2025.117708","DOIUrl":null,"url":null,"abstract":"<div><div>Non-contact full-field 3D measurement and reconstruction using visual binocular tracking is crucial for many contemporary applications. However, accuracy limitations arise in long-range tracking due to low confidence in depth pose estimation. To address this, we propose a pose correction method using two binocular tracking units (TUs) in a perpendicular configuration. In the system, depth tracking from one unit is correlated to lateral tracking from the other unit through their relative pose. During fusion, lateral information takes precedence due to re-projection constraints, enabling the system to operate with high confidence in nearly any tracking direction. This approach not only improves tracking accuracy but also reduces spatial accuracy variations compared to conventional tracking systems. The proposed method is an optimization framework that leverages predictive models and corresponding loss functions. It estimates the combined tracking pose for each frame and the global relative poses between units using projection data from both TUs. Our system exhibits a maximum relative RMSE of 0.0074%, markedly lower than that of separate TU systems (0.0155–0.1153%). Furthermore, it achieves average reductions in spatial accuracy variation of 57.2% and 38.9% compared with individual TU systems.</div></div>","PeriodicalId":18349,"journal":{"name":"Measurement","volume":"253 ","pages":"Article 117708"},"PeriodicalIF":5.2000,"publicationDate":"2025-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A combined-observation pose correction method for enhancing tracking accuracy in visual measurement systems\",\"authors\":\"Kang Yan , Jun Zheng , Shangjian Chen , Jiangfeng Wang , Fengxin Jin , Ming Dai , Weiyuan Liu , Weihao Tang , Yunbo Bi\",\"doi\":\"10.1016/j.measurement.2025.117708\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Non-contact full-field 3D measurement and reconstruction using visual binocular tracking is crucial for many contemporary applications. However, accuracy limitations arise in long-range tracking due to low confidence in depth pose estimation. To address this, we propose a pose correction method using two binocular tracking units (TUs) in a perpendicular configuration. In the system, depth tracking from one unit is correlated to lateral tracking from the other unit through their relative pose. During fusion, lateral information takes precedence due to re-projection constraints, enabling the system to operate with high confidence in nearly any tracking direction. This approach not only improves tracking accuracy but also reduces spatial accuracy variations compared to conventional tracking systems. The proposed method is an optimization framework that leverages predictive models and corresponding loss functions. It estimates the combined tracking pose for each frame and the global relative poses between units using projection data from both TUs. Our system exhibits a maximum relative RMSE of 0.0074%, markedly lower than that of separate TU systems (0.0155–0.1153%). 
Furthermore, it achieves average reductions in spatial accuracy variation of 57.2% and 38.9% compared with individual TU systems.</div></div>\",\"PeriodicalId\":18349,\"journal\":{\"name\":\"Measurement\",\"volume\":\"253 \",\"pages\":\"Article 117708\"},\"PeriodicalIF\":5.2000,\"publicationDate\":\"2025-05-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Measurement\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S026322412501067X\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Measurement","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S026322412501067X","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
A combined-observation pose correction method for enhancing tracking accuracy in visual measurement systems
Non-contact full-field 3D measurement and reconstruction using visual binocular tracking is crucial for many contemporary applications. However, accuracy limitations arise in long-range tracking due to low confidence in depth pose estimation. To address this, we propose a pose correction method using two binocular tracking units (TUs) in a perpendicular configuration. In the system, depth tracking from one unit is correlated with lateral tracking from the other unit through their relative pose. During fusion, lateral information takes precedence due to re-projection constraints, enabling the system to operate with high confidence in nearly any tracking direction. This approach not only improves tracking accuracy but also reduces spatial accuracy variation compared to conventional tracking systems. The proposed method is an optimization framework that leverages predictive models and corresponding loss functions. It estimates the combined tracking pose for each frame and the global relative poses between units using projection data from both TUs. Our system exhibits a maximum relative RMSE of 0.0074%, markedly lower than that of separate TU systems (0.0155–0.1153%). Furthermore, it achieves average reductions in spatial accuracy variation of 57.2% and 38.9% compared with individual TU systems.
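The abstract does not include an implementation, but the core idea, letting each unit's precise lateral observations compensate for the other unit's low-confidence depth axis, can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the paper's method: the function name, the 90-degree mounting about the y-axis, the noise levels sigma_lateral and sigma_depth, and the closed-form inverse-covariance fusion all stand in for the authors' optimization framework, which instead estimates per-frame combined poses and global relative poses from projection data via predictive models and loss functions.

```python
import numpy as np


def fuse_perpendicular_observations(p_a, p_b, R_ab, t_ab,
                                    sigma_lateral=0.05, sigma_depth=0.50):
    """Fuse one target position measured by two perpendicular tracking units (TUs).

    Hypothetical sketch: each TU measures the target in its own camera frame,
    with much larger uncertainty along its optical (depth) axis than laterally.
    TU-B's frame relates to TU-A's frame by the rigid transform (R_ab, t_ab),
    i.e. p_in_A = R_ab @ p_in_B + t_ab. Because the units are mounted roughly
    perpendicular, TU-B's lateral axes cover TU-A's depth axis, so
    inverse-covariance (weighted least-squares) fusion suppresses the
    low-confidence depth component of each unit.
    """
    # Per-unit measurement covariance in its own frame: lateral axes (x, y)
    # are precise, the optical axis (z) is not.
    cov_local = np.diag([sigma_lateral**2, sigma_lateral**2, sigma_depth**2])

    # TU-A's measurement and covariance are already in the common (TU-A) frame.
    cov_a = cov_local

    # Express TU-B's measurement and covariance in TU-A's frame.
    p_b_in_a = R_ab @ p_b + t_ab
    cov_b = R_ab @ cov_local @ R_ab.T

    # Information-form fusion of the two position estimates.
    info_a, info_b = np.linalg.inv(cov_a), np.linalg.inv(cov_b)
    cov_fused = np.linalg.inv(info_a + info_b)
    p_fused = cov_fused @ (info_a @ p_a + info_b @ p_b_in_a)
    return p_fused, cov_fused


if __name__ == "__main__":
    # Toy setup: TU-B is rotated 90 degrees about the y-axis relative to TU-A,
    # so TU-B's precise lateral x-axis lies along TU-A's uncertain depth axis.
    R_ab = np.array([[0.0, 0.0, 1.0],
                     [0.0, 1.0, 0.0],
                     [-1.0, 0.0, 0.0]])
    t_ab = np.array([2.0, 0.0, 0.0])

    p_a = np.array([0.10, 0.05, 3.00])                       # depth noisy in A
    p_b = R_ab.T @ (np.array([0.10, 0.05, 2.95]) - t_ab)     # same target, seen by B

    p, cov = fuse_perpendicular_observations(p_a, p_b, R_ab, t_ab)
    print("fused position:", p)
    print("fused std devs:", np.sqrt(np.diag(cov)))
```

In this toy weighting, the fused uncertainty along TU-A's optical axis is dominated by TU-B's lateral precision and vice versa, which is the intuition behind the claim that the combined system keeps high confidence in nearly any tracking direction; the paper replaces this closed-form step with an optimization over re-projection residuals from both TUs.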
About the journal:
Contributions are invited on novel achievements in all fields of measurement and instrumentation science and technology. Authors are encouraged to submit novel material whose ultimate goal is an advancement of the state of the art in:
- measurement and metrology fundamentals
- sensors
- measurement instruments
- measurement and estimation techniques
- measurement data processing and fusion algorithms
- evaluation procedures and methodologies for plants and industrial processes
- performance analysis of systems, processes and algorithms
- mathematical models for measurement-oriented purposes
- distributed measurement systems in a connected world