PLE-SLAM: A Visual-Inertial SLAM Based on Point-Line Features and Efficient IMU Initialization
Authors: Jiaming He; Mingrui Li; Yangyang Wang; Hongyu Wang
DOI: 10.1109/JSEN.2024.3523039
Journal: IEEE Sensors Journal, vol. 25, no. 4, pp. 6801-6811 (published 2025-01-03; JCR Q1, Engineering, Electrical & Electronic; IF 4.3)
URL: https://ieeexplore.ieee.org/document/10824222/
Abstract
Cameras and IMUs are widely used in robotics to achieve accurate and robust pose estimation. However, this fusion relies heavily on sufficient visual feature observations and precise inertial state variables. This article proposes PLE-SLAM, a real-time visual-inertial simultaneous localization and mapping (SLAM) system for complex environments, which introduces line features into point-based SLAM and proposes an efficient IMU initialization method. First, we use parallel computing methods to extract point-line features and compute descriptors to ensure real-time performance. Adjacent short line segments are merged into longer segments for more stable tracking, and isolated short segments are directly eliminated. Second, to overcome rapid rotation and low-texture scenes, we estimate gyroscope bias by tightly coupling rotation preintegration with 2-D point-line observations, without requiring a 3-D point cloud or vision-only rotation estimation. Accelerometer bias and gravity direction are solved by an analytical method, which is more efficient than nonlinear optimization. To improve the system's robustness in complex environments, an improved method for dynamic feature elimination and a solution for loop detection and loop-frame pose estimation using a CNN and a GNN are integrated into the system. Experimental results on public datasets demonstrate that PLE-SLAM achieves a 20%-50% improvement in localization performance over ORB-SLAM3 and outperforms other state-of-the-art visual-inertial SLAM systems in most environments.
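The line-merging step described in the abstract (adjacent short segments fused into longer ones, isolated short segments discarded) can be sketched as follows. The collinearity test and the angle/gap thresholds here are illustrative assumptions, not the paper's exact criteria:

```python
import numpy as np

def try_merge(seg_a, seg_b, angle_thresh_deg=3.0, gap_thresh=5.0):
    """Merge two nearly collinear 2-D line segments (each a 2x2 array of
    endpoints) into one longer segment, or return None if they don't match.
    Thresholds are illustrative placeholders, not the paper's values."""
    da = seg_a[1] - seg_a[0]
    db = seg_b[1] - seg_b[0]
    # Direction similarity: angle between the two segments (orientation-free).
    cosang = abs(np.dot(da, db)) / (np.linalg.norm(da) * np.linalg.norm(db))
    if np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))) > angle_thresh_deg:
        return None
    # Adjacency: smallest distance between any endpoint pair must be small.
    gap = min(np.linalg.norm(p - q) for p in seg_a for q in seg_b)
    if gap > gap_thresh:
        return None
    # The merged segment spans the two most distant endpoints of the pair.
    pts = np.vstack([seg_a, seg_b])
    i, j = max(((i, j) for i in range(4) for j in range(i + 1, 4)),
               key=lambda ij: np.linalg.norm(pts[ij[0]] - pts[ij[1]]))
    return np.array([pts[i], pts[j]])
```

Run over all detected segment pairs, this greedily grows long segments that are easier to track across frames; segments that never merge and fall below a length threshold would be the "isolated short segments" the system discards.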
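The analytical accelerometer-bias and gravity solve mentioned in the abstract can likewise be sketched as a closed-form linear least-squares problem. The block structure `A_k [g; b_a] = d_k` below is a generic assumption standing in for the paper's actual preintegration-derived constraints:

```python
import numpy as np

def solve_gravity_and_bias(A_blocks, d_blocks, g_norm=9.81):
    """Given stacked linear constraint blocks A_k [g; b_a] = d_k collected
    from IMU preintegration terms (the block forms here are illustrative,
    not the paper's exact derivation), solve the 6-D unknown
    [gravity; accelerometer bias] in closed form, then rescale gravity to
    its known magnitude."""
    A = np.vstack(A_blocks)   # shape (3K, 6)
    d = np.hstack(d_blocks)   # shape (3K,)
    x, *_ = np.linalg.lstsq(A, d, rcond=None)
    g, b_a = x[:3], x[3:]
    g = g * (g_norm / np.linalg.norm(g))  # enforce |g| = 9.81 m/s^2
    return g, b_a
```

A single linear solve like this is why the analytical route is cheaper than iterative nonlinear optimization: the cost is one small least-squares factorization rather than repeated Jacobian evaluations.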
Journal Description:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave (e.g., electromagnetic and acoustic) and non-wave (e.g., chemical, gravity, particle, thermal, radiative and non-radiative) sensor data; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice