{"title":"基于惯性传感器融合的深度流里程计","authors":"Jeongmin Kang","doi":"10.1049/ell2.70409","DOIUrl":null,"url":null,"abstract":"<p>Robust pose estimation in GNSS-denied environments is essential for autonomous driving systems. Recent advancements in deep learning within the field of computer vision have significantly contributed to the development of visual odometry (VO). However, most VO-based approaches still suffer from scale drift errors in general road environments. This letter introduces a novel visual-inertial odometry framework for robust pose estimation. Inertial measurement unit (IMU) measurements obtained between image frames are fused with deep learning-based optical flow and depth predictions extracted from image pairs. First, measurements from the accelerometer and gyroscope are propagated through the IMU dynamic model. Next, optical flow and depth information predicted from image pairs are used in a geometric approach to recover the camera motion by optimising correspondences based on optical flow consistency. Finally, the proposed method is evaluated on the publicly available KITTI dataset, and its performance is compared with existing methods. Additionally, the impact of the network model on flow consistency, which plays a crucial role in geometry-based pose recovery, is analysed. The results demonstrate that the proposed method achieves reliable pose estimation accuracy.</p>","PeriodicalId":11556,"journal":{"name":"Electronics Letters","volume":"61 1","pages":""},"PeriodicalIF":0.8000,"publicationDate":"2025-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ell2.70409","citationCount":"0","resultStr":"{\"title\":\"Deep Depth-Flow Odometry With Inertial Sensor Fusion\",\"authors\":\"Jeongmin Kang\",\"doi\":\"10.1049/ell2.70409\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Robust pose estimation in GNSS-denied environments is essential for autonomous driving systems. Recent advancements in deep learning within the field of computer vision have significantly contributed to the development of visual odometry (VO). However, most VO-based approaches still suffer from scale drift errors in general road environments. This letter introduces a novel visual-inertial odometry framework for robust pose estimation. Inertial measurement unit (IMU) measurements obtained between image frames are fused with deep learning-based optical flow and depth predictions extracted from image pairs. First, measurements from the accelerometer and gyroscope are propagated through the IMU dynamic model. Next, optical flow and depth information predicted from image pairs are used in a geometric approach to recover the camera motion by optimising correspondences based on optical flow consistency. Finally, the proposed method is evaluated on the publicly available KITTI dataset, and its performance is compared with existing methods. Additionally, the impact of the network model on flow consistency, which plays a crucial role in geometry-based pose recovery, is analysed. 
The results demonstrate that the proposed method achieves reliable pose estimation accuracy.</p>\",\"PeriodicalId\":11556,\"journal\":{\"name\":\"Electronics Letters\",\"volume\":\"61 1\",\"pages\":\"\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2025-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ell2.70409\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Electronics Letters\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/ell2.70409\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronics Letters","FirstCategoryId":"5","ListUrlMain":"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/ell2.70409","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Deep Depth-Flow Odometry With Inertial Sensor Fusion
Robust pose estimation in GNSS-denied environments is essential for autonomous driving systems. Recent advancements in deep learning within the field of computer vision have significantly contributed to the development of visual odometry (VO). However, most VO-based approaches still suffer from scale drift errors in general road environments. This letter introduces a novel visual-inertial odometry framework for robust pose estimation. Inertial measurement unit (IMU) measurements obtained between image frames are fused with deep learning-based optical flow and depth predictions extracted from image pairs. First, measurements from the accelerometer and gyroscope are propagated through the IMU dynamic model. Next, optical flow and depth information predicted from image pairs are used in a geometric approach to recover the camera motion by optimising correspondences based on optical flow consistency. Finally, the proposed method is evaluated on the publicly available KITTI dataset, and its performance is compared with existing methods. Additionally, the impact of the network model on flow consistency, which plays a crucial role in geometry-based pose recovery, is analysed. The results demonstrate that the proposed method achieves reliable pose estimation accuracy.
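The abstract describes two stages: propagating IMU measurements between image frames through a kinematic model, and recovering the inter-frame camera motion by optimising correspondences built from predicted depth and optical flow. The sketch below illustrates both ideas in a minimal form; it is not the authors' implementation. The helper names, the pinhole intrinsics values, the use of SciPy's generic least-squares solver in place of the paper's optimiser, and the synthetic data in the usage example are all illustrative assumptions, and the actual method would additionally weight the IMU prediction against the vision residual.

```python
# Minimal sketch (not the paper's implementation) of:
#  (1) IMU propagation between image frames, and
#  (2) pose recovery from a flow-consistency (reprojection) residual
#      built from predicted depth and optical flow.
import numpy as np
from scipy.optimize import least_squares

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (assumed z-up)

def so3_exp(phi):
    """Axis-angle vector -> rotation matrix (Rodrigues' formula)."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    a = phi / theta
    A = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * A + (1 - np.cos(theta)) * (A @ A)

def propagate_imu(R, v, p, gyro, accel, dts):
    """Integrate gyroscope/accelerometer samples taken between two frames."""
    for w, a, dt in zip(gyro, accel, dts):
        R = R @ so3_exp(w * dt)          # attitude update from angular rate
        acc_world = R @ a + GRAVITY      # specific force -> world acceleration
        p = p + v * dt + 0.5 * acc_world * dt ** 2
        v = v + acc_world * dt
    return R, v, p

def flow_consistency_residual(pose, pts, depth, flow, K):
    """Error between flow-predicted and pose-predicted pixel locations."""
    R, t = so3_exp(pose[3:]), pose[:3]
    Kinv = np.linalg.inv(K)
    # Back-project frame-k pixels using the predicted depth map.
    pix_h = np.hstack([pts, np.ones((len(pts), 1))])
    P = depth[:, None] * (pix_h @ Kinv.T)
    # Transform into frame k+1 and project with the pinhole model.
    P2 = P @ R.T + t
    proj = P2 @ K.T
    proj = proj[:, :2] / proj[:, 2:3]
    # Pixels where the optical-flow prediction says they should land.
    target = pts + flow
    return (proj - target).ravel()

def recover_pose(pts, depth, flow, K, pose0=None):
    """Refine the 6-DoF inter-frame pose from flow/depth correspondences."""
    if pose0 is None:
        pose0 = np.zeros(6)  # in practice, the IMU propagation seeds this
    sol = least_squares(flow_consistency_residual, pose0,
                        args=(pts, depth, flow, K))
    return sol.x

if __name__ == "__main__":
    # Toy usage with synthetic data; intrinsics are KITTI-like placeholders.
    K = np.array([[718.0, 0.0, 607.0], [0.0, 718.0, 185.0], [0.0, 0.0, 1.0]])
    rng = np.random.default_rng(0)
    pts = rng.uniform([0, 0], [1240, 370], size=(200, 2))
    depth = rng.uniform(5.0, 50.0, size=200)
    # Synthesise an ideal flow field from a known motion, then recover it.
    true_pose = np.array([0.5, 0.0, 1.2, 0.0, 0.02, 0.0])
    flow = flow_consistency_residual(true_pose, pts, depth,
                                     np.zeros((200, 2)), K).reshape(-1, 2)
    print("recovered pose:", recover_pose(pts, depth, flow, K))
    # IMU propagation over 10 samples at 100 Hz (illustrative numbers).
    R, v, p = propagate_imu(np.eye(3), np.zeros(3), np.zeros(3),
                            gyro=[np.zeros(3)] * 10,
                            accel=[np.array([1.0, 0.0, 9.81])] * 10,
                            dts=[0.01] * 10)
    print("propagated position:", p)
```

In this toy setup the recovered pose matches the ground-truth motion used to synthesise the flow; with real network predictions, the quality of the recovered pose depends directly on the flow consistency analysed in the letter.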
Journal Introduction:
Electronics Letters is an internationally renowned peer-reviewed rapid-communication journal that publishes short original research papers every two weeks. Its broad and interdisciplinary scope covers the latest developments in all fields related to electronic engineering, including communication, biomedical, optical, and device technologies. Electronics Letters also provides further insight into some of the latest developments through special features and interviews.
Scope
As a journal at the forefront of its field, Electronics Letters publishes papers covering all themes of electronic and electrical engineering. The major themes of the journal are listed below.
Antennas and Propagation
Biomedical and Bioinspired Technologies, Signal Processing and Applications
Control Engineering
Electromagnetism: Theory, Materials and Devices
Electronic Circuits and Systems
Image, Video and Vision Processing and Applications
Information, Computing and Communications
Instrumentation and Measurement
Microwave Technology
Optical Communications
Photonics and Opto-Electronics
Power Electronics, Energy and Sustainability
Radar, Sonar and Navigation
Semiconductor Technology
Signal Processing
MIMO