{"title":"基于两阶段激光雷达补偿的复杂交通场景鸟瞰感知","authors":"Peichao Cong;Murong Deng;Yangang Zhu;Yixuan Xiao;Xin Zhang","doi":"10.1109/JSEN.2025.3545292","DOIUrl":null,"url":null,"abstract":"Establishing 3-D perception capabilities for self-driving cars is a key research problem. Recent research has differentially “lifted” features from multicamera images onto a 2-D ground plane to produce a bird’s-eye view (BEV) feature representation of the 3-D space around the vehicle. However, this is currently challenging due to the inability to accurately reproduce the sizes and positions of truncated objects as well as drag tails and long tails in cameras with different fields of view. In this article, we propose a BEV sensing method based on two-stage light detection and ranging (LiDAR) feature compensation. First, the initial BEV features are obtained by fusing image features with LiDAR voxel features. Second, a two-stage LiDAR feature compensation method is proposed to synthesize the point-voxel features by using voxel features and point cloud features. This method also calculates the similarity between the initial BEV features and the point-voxel features to reject and replace feature points in the image features that have insufficient similarity with the point-voxel features on a large scale. Again, through the compensated BEV features, the BEV features with time series are input into the time-domain BEV feature fusion module, to query the same vehicle’s position, size, and other physical states at different times. Finally, the features are fed into tasks such as detection and segmentation to complete the output. In this study, a comparative validation is carried out on the BEV-aware dataset nuScenes. The experimental results show the effectiveness of truncated target detection and segmentation.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 8","pages":"14342-14357"},"PeriodicalIF":4.3000,"publicationDate":"2025-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bird’s-Eye View Perception Based on Two-Stage LiDAR Compensation in Complex Traffic Scenarios\",\"authors\":\"Peichao Cong;Murong Deng;Yangang Zhu;Yixuan Xiao;Xin Zhang\",\"doi\":\"10.1109/JSEN.2025.3545292\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Establishing 3-D perception capabilities for self-driving cars is a key research problem. Recent research has differentially “lifted” features from multicamera images onto a 2-D ground plane to produce a bird’s-eye view (BEV) feature representation of the 3-D space around the vehicle. However, this is currently challenging due to the inability to accurately reproduce the sizes and positions of truncated objects as well as drag tails and long tails in cameras with different fields of view. In this article, we propose a BEV sensing method based on two-stage light detection and ranging (LiDAR) feature compensation. First, the initial BEV features are obtained by fusing image features with LiDAR voxel features. Second, a two-stage LiDAR feature compensation method is proposed to synthesize the point-voxel features by using voxel features and point cloud features. This method also calculates the similarity between the initial BEV features and the point-voxel features to reject and replace feature points in the image features that have insufficient similarity with the point-voxel features on a large scale. 
Again, through the compensated BEV features, the BEV features with time series are input into the time-domain BEV feature fusion module, to query the same vehicle’s position, size, and other physical states at different times. Finally, the features are fed into tasks such as detection and segmentation to complete the output. In this study, a comparative validation is carried out on the BEV-aware dataset nuScenes. The experimental results show the effectiveness of truncated target detection and segmentation.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"25 8\",\"pages\":\"14342-14357\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2025-03-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10909079/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10909079/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Bird’s-Eye View Perception Based on Two-Stage LiDAR Compensation in Complex Traffic Scenarios
Establishing 3-D perception capabilities for self-driving cars is a key research problem. Recent research differentiably “lifts” features from multicamera images onto a 2-D ground plane to produce a bird’s-eye view (BEV) feature representation of the 3-D space around the vehicle. However, such methods currently struggle to reproduce the sizes and positions of truncated objects accurately, and they suffer from drag-tail and long-tail artifacts across cameras with different fields of view. In this article, we propose a BEV perception method based on two-stage light detection and ranging (LiDAR) feature compensation. First, initial BEV features are obtained by fusing image features with LiDAR voxel features. Second, a two-stage LiDAR feature compensation method is proposed that synthesizes point-voxel features from voxel features and point-cloud features; it then computes the similarity between the initial BEV features and the point-voxel features and, at scale, rejects and replaces image feature points whose similarity to the point-voxel features is insufficient. Third, the compensated BEV features, together with their time series, are input into a time-domain BEV feature fusion module that queries the same vehicle’s position, size, and other physical states at different times. Finally, the features are fed into downstream tasks such as detection and segmentation to produce the output. A comparative validation is carried out on the BEV perception dataset nuScenes; the experimental results demonstrate the effectiveness of the method for truncated-target detection and segmentation.
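As a rough illustration of the second step, the following minimal sketch shows one way the similarity-gated rejection and replacement could be realized in PyTorch. The tensor shapes, the use of cosine similarity, the threshold value, and the names compensate_bev_features, bev_feats, and pv_feats are all illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def compensate_bev_features(bev_feats: torch.Tensor,
                            pv_feats: torch.Tensor,
                            threshold: float = 0.5) -> torch.Tensor:
    """Replace low-similarity BEV cells with LiDAR point-voxel features.

    bev_feats: (B, C, H, W) initial BEV features from camera-LiDAR fusion.
    pv_feats:  (B, C, H, W) point-voxel features synthesized from LiDAR voxel
               and point-cloud features.
    threshold: assumed cosine-similarity cutoff below which a BEV cell is
               judged unreliable (e.g., truncated objects, drag/long tails).
    """
    # Per-cell cosine similarity along the channel dimension: (B, H, W).
    sim = F.cosine_similarity(bev_feats, pv_feats, dim=1)
    # Mark cells whose camera-derived features disagree with the LiDAR evidence.
    replace_mask = (sim < threshold).unsqueeze(1)  # (B, 1, H, W), broadcastable
    # Reject the unreliable image-derived features and substitute LiDAR ones.
    return torch.where(replace_mask, pv_feats, bev_feats)

# Usage with random tensors standing in for real BEV and point-voxel features.
bev = torch.randn(2, 64, 128, 128)
pv = torch.randn(2, 64, 128, 128)
compensated = compensate_bev_features(bev, pv)  # (2, 64, 128, 128)

Gating on a per-cell similarity score, rather than overwriting the whole map, would preserve image features wherever the two modalities agree and fall back to LiDAR only where the camera branch is likely wrong.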
Journal Introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave (e.g., electromagnetic and acoustic) and non-wave (e.g., chemical, gravity, particle, thermal, radiative and non-radiative) sensor data; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice