Jingshi Wang, Zhipeng Lin, Zhiyu Zhou, Zhi Gao, Guoqing Wang
{"title":"基于强度感知的深线配准的激光雷达和相机的外部校准","authors":"Jingshi Wang, Zhipeng Lin, Zhiyu Zhou, Zhi Gao, Guoqing Wang","doi":"10.1049/ell2.70423","DOIUrl":null,"url":null,"abstract":"<p>Accurate extrinsic calibration between the light detection and ranging (LiDAR) and camera is a critical step for sensor fusion tasks. Existing calibration methods often rely on artificial calibration targets or distinct visual textures, which may not be available in many real-world environments. In addition, conventional LiDAR systems often capture sparse point clouds, which limits feature extraction and matching in calibration tasks. In this work, we propose a novel extrinsic calibration framework that leverages intensity-aware deep line registration. Our approach first generates dense point clouds by incrementally registering consecutive LiDAR frames and voxel filtering. This dense point cloud serves as the basis for generating high-resolution intensity maps. Next, we apply deep learning-based line detection algorithms to extract robust line features from both the intensity map and the corresponding camera image. By minimising a distance-based objective function formulated with the 3D line points and 2D image lines, we estimate the extrinsic parameters through optimisation process. Experimental results show that our method achieves sub-pixel reprojection accuracy and robustness in various environments. Our calibration method is cost-effective, easy to deploy and suitable for real-time robotic applications without the need for artificial targets.</p>","PeriodicalId":11556,"journal":{"name":"Electronics Letters","volume":"61 1","pages":""},"PeriodicalIF":0.8000,"publicationDate":"2025-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ell2.70423","citationCount":"0","resultStr":"{\"title\":\"Extrinsic Calibration of LiDAR and Camera via Intensity-Aware Deep Line Registration\",\"authors\":\"Jingshi Wang, Zhipeng Lin, Zhiyu Zhou, Zhi Gao, Guoqing Wang\",\"doi\":\"10.1049/ell2.70423\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Accurate extrinsic calibration between the light detection and ranging (LiDAR) and camera is a critical step for sensor fusion tasks. Existing calibration methods often rely on artificial calibration targets or distinct visual textures, which may not be available in many real-world environments. In addition, conventional LiDAR systems often capture sparse point clouds, which limits feature extraction and matching in calibration tasks. In this work, we propose a novel extrinsic calibration framework that leverages intensity-aware deep line registration. Our approach first generates dense point clouds by incrementally registering consecutive LiDAR frames and voxel filtering. This dense point cloud serves as the basis for generating high-resolution intensity maps. Next, we apply deep learning-based line detection algorithms to extract robust line features from both the intensity map and the corresponding camera image. By minimising a distance-based objective function formulated with the 3D line points and 2D image lines, we estimate the extrinsic parameters through optimisation process. Experimental results show that our method achieves sub-pixel reprojection accuracy and robustness in various environments. 
Our calibration method is cost-effective, easy to deploy and suitable for real-time robotic applications without the need for artificial targets.</p>\",\"PeriodicalId\":11556,\"journal\":{\"name\":\"Electronics Letters\",\"volume\":\"61 1\",\"pages\":\"\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2025-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ell2.70423\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Electronics Letters\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/ell2.70423\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronics Letters","FirstCategoryId":"5","ListUrlMain":"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/ell2.70423","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Extrinsic Calibration of LiDAR and Camera via Intensity-Aware Deep Line Registration
Accurate extrinsic calibration between a light detection and ranging (LiDAR) sensor and a camera is a critical step for sensor fusion tasks. Existing calibration methods often rely on artificial calibration targets or distinct visual textures, which may not be available in many real-world environments. In addition, conventional LiDAR systems often capture sparse point clouds, which limits feature extraction and matching in calibration tasks. In this work, we propose a novel extrinsic calibration framework that leverages intensity-aware deep line registration. Our approach first generates a dense point cloud by incrementally registering consecutive LiDAR frames and applying voxel filtering. This dense point cloud serves as the basis for generating high-resolution intensity maps. Next, we apply deep learning-based line detection to extract robust line features from both the intensity map and the corresponding camera image. By minimising a distance-based objective function formulated with the 3D line points and the 2D image lines, we estimate the extrinsic parameters through an optimisation process. Experimental results show that our method achieves sub-pixel reprojection accuracy and remains robust in various environments. Our calibration method is cost-effective, easy to deploy and suitable for real-time robotic applications without the need for artificial targets.
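The core of the optimisation step is the distance-based objective between projected 3D line points and their matched 2D image lines. The sketch below is a minimal, illustrative implementation of such an objective, not the authors' code: it assumes known camera intrinsics `K`, pre-matched 3D line points and 2D line coefficients (as would be produced by the line detection and matching stage), and parameterises the extrinsics as a rotation vector plus translation.

```python
# Minimal sketch (assumed formulation, not the paper's implementation) of a
# point-to-line reprojection objective for LiDAR-camera extrinsic calibration.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def point_to_line_residuals(params, pts_3d, line_coeffs, K):
    """Signed pixel distances of projected 3D line points to their 2D lines.

    params      : (6,)   rotation vector (3) + translation (3), LiDAR -> camera
    pts_3d      : (N, 3) points sampled along 3D lines from the intensity map
    line_coeffs : (N, 3) matched image lines a*u + b*v + c = 0, with a^2 + b^2 = 1
    K           : (3, 3) camera intrinsic matrix (assumed known)
    """
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    t = params[3:]
    # Transform into the camera frame and project with a pinhole model
    # (points are assumed to lie in front of the camera).
    p_cam = pts_3d @ R.T + t
    uvw = p_cam @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]
    # Distance of each projected point to its associated image line.
    ones = np.ones((uv.shape[0], 1))
    return np.sum(np.hstack([uv, ones]) * line_coeffs, axis=1)


def calibrate_extrinsics(pts_3d, line_coeffs, K, x0=np.zeros(6)):
    """Estimate LiDAR-camera extrinsics by robust least squares over the residuals."""
    sol = least_squares(point_to_line_residuals, x0,
                        args=(pts_3d, line_coeffs, K), loss="huber")
    R = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    return R, sol.x[3:]
```

A robust loss (Huber here) is a common choice to down-weight occasional line mismatches; the paper's exact objective, line parameterisation and optimiser may differ.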
Journal introduction:
Electronics Letters is an internationally renowned peer-reviewed rapid-communication journal that publishes short original research papers every two weeks. Its broad and interdisciplinary scope covers the latest developments in all fields related to electronic engineering, including communication, biomedical, optical and device technologies. Electronics Letters also provides further insight into some of the latest developments through special features and interviews.
Scope
As a journal at the forefront of its field, Electronics Letters publishes papers covering all themes of electronic and electrical engineering. The major themes of the journal are listed below.
Antennas and Propagation
Biomedical and Bioinspired Technologies, Signal Processing and Applications
Control Engineering
Electromagnetism: Theory, Materials and Devices
Electronic Circuits and Systems
Image, Video and Vision Processing and Applications
Information, Computing and Communications
Instrumentation and Measurement
Microwave Technology
Optical Communications
Photonics and Opto-Electronics
Power Electronics, Energy and Sustainability
Radar, Sonar and Navigation
Semiconductor Technology
Signal Processing