Extrinsic Parameter Calibration for Camera and Optical Phased Array LiDAR

Impact Factor: 5.6 · CAS Region 2 (Engineering Technology) · JCR Q1 (Engineering, Electrical & Electronic)
Wenqiang Yue;Yunhao Fu;Xiaolong Hu;Min Tao;Peng Wang;Lei Liang;Baisong Chen;Junfeng Song;Lijun Wang
IEEE Transactions on Instrumentation and Measurement, vol. 74, pp. 1–15
Published: 2025-06-18 · DOI: 10.1109/TIM.2025.3580845 · https://ieeexplore.ieee.org/document/11040013/
Citations: 0

Abstract

In autonomous driving, the fusion of camera and light detection and ranging (LiDAR) data is critical for accurate environmental perception, with high-precision extrinsic calibration playing a pivotal role. Optical phased array (OPA) LiDAR, due to its advantages in solid-state scanning, coherent detection, immunity to mechanical fatigue and external interference, and eye safety, represents a promising direction in next-generation LiDAR technology. Conventional LiDAR-camera calibration approaches generally rely on spatial or reflectivity-based point cloud features to infer shared correspondences, followed by nonlinear optimization. However, three key challenges remain: 1) the absence of publicly available datasets for the emerging OPA LiDAR; 2) inaccuracies from sparse point clouds, foreground inflation, and bleeding points affecting feature correspondence; and 3) reliance on complex calibration targets and computationally intensive processes, reducing robustness and efficiency. To overcome these limitations, we propose, for the first time, four joint calibration methods specifically designed for OPA LiDAR. These methods utilize OPA’s directional scanning to treat each scan point as a reliable 3-D feature that can be directly matched to corresponding 2-D image features, enabling efficient global nonlinear optimization. Experimental validation demonstrates that our methods achieve higher calibration accuracy and significantly reduced computational time compared to existing state-of-the-art techniques. This offers a robust and efficient solution for future multisensor fusion systems centered around OPA LiDAR.
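The abstract's core idea is that each OPA scan point can serve as a 3-D feature directly matched to a 2-D image feature, after which the extrinsics are recovered by optimization. As an illustration of that 3-D–2-D pipeline, the sketch below recovers a camera pose [R|t] from known point correspondences with a standard linear DLT solve followed by re-orthonormalization. This is a generic textbook method and my own minimal reconstruction, not the authors' algorithm; the function names, synthetic intrinsics, and test geometry are all assumptions for the demo.

```python
import numpy as np

def project(K, R, t, X):
    """Project world points X (N,3) into pixels under intrinsics K and pose (R, t)."""
    Xc = X @ R.T + t                     # world -> camera frame
    uv = Xc @ K.T                        # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]        # perspective divide

def dlt_pose(K, X, uv):
    """Linear (DLT) estimate of extrinsics [R|t] from 3-D/2-D correspondences."""
    ones = np.ones((len(uv), 1))
    xn = np.linalg.solve(K, np.hstack([uv, ones]).T).T   # K^-1 * pixel coords
    rows = []
    for Xw, (x, y, _) in zip(X, xn):
        Xh = np.append(Xw, 1.0)
        rows.append(np.hstack([Xh, np.zeros(4), -x * Xh]))
        rows.append(np.hstack([np.zeros(4), Xh, -y * Xh]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    P = Vt[-1].reshape(3, 4)             # null vector ~ [R|t] up to scale/sign
    P /= np.linalg.norm(P[2, :3])        # fix scale: rotation rows have unit norm
    if P[2] @ np.append(X[0], 1.0) < 0:  # fix sign: depths must be positive
        P = -P
    U, _, Vt2 = np.linalg.svd(P[:, :3])  # re-orthonormalize the rotation block
    R = U @ Vt2
    return R, P[:, 3]

# synthetic check: recover a known camera-LiDAR pose from 20 scan points
rng = np.random.default_rng(0)
X = rng.uniform([-2, -2, 4], [2, 2, 8], size=(20, 3))
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
th = 0.1                                 # ground truth: 0.1 rad yaw + small offset
R_gt = np.array([[np.cos(th), 0, np.sin(th)],
                 [0, 1, 0],
                 [-np.sin(th), 0, np.cos(th)]])
t_gt = np.array([0.10, -0.05, 0.20])
uv = project(K, R_gt, t_gt, X)
R_est, t_est = dlt_pose(K, X, uv)
err = np.abs(project(K, R_est, t_est, X) - uv).max()
```

With noiseless correspondences the linear solve is exact up to numerical precision; in practice such a linear estimate would seed the global nonlinear (reprojection-error) optimization the abstract refers to.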
Source Journal
IEEE Transactions on Instrumentation and Measurement
CiteScore: 9.00
Self-citation rate: 23.20%
Annual articles: 1294
Review time: 3.9 months
Journal description: Papers are sought that address innovative solutions to the development and use of electrical and electronic instruments and equipment to measure, monitor and/or record physical phenomena for the purpose of advancing measurement science, methods, functionality and applications. The scope of these papers may encompass: (1) theory, methodology, and practice of measurement; (2) design, development and evaluation of instrumentation and measurement systems and components used in generating, acquiring, conditioning and processing signals; (3) analysis, representation, display, and preservation of the information obtained from a set of measurements; and (4) scientific and technical support to establishment and maintenance of technical standards in the field of Instrumentation and Measurement.