{"title":"gnss环境下视觉与惯性传感器融合列车定位","authors":"Haifeng Song;Haoyu Zhang;Xiaoqing Wu;Wangzhe Li;Hairong Dong","doi":"10.1109/JSEN.2025.3597772","DOIUrl":null,"url":null,"abstract":"The accurate train positioning is essential for ensuring safety and operational efficiency in modern rail systems. Traditional methods based on trackside infrastructure or satellite signals often suffer from limited precision or high cost, especially in Global Navigation Satellite Systems (GNSS)-denied environments. To address these challenges, this article proposes a hybrid vision–inertial train positioning method that combines the visual absolute positioning with inertial measurement unit (IMU)-based relative positioning. An enhanced you only look once (YOLO)-based object detection algorithm and an end-to-end text recognition network are employed to identify and interpret railway landmarks. The absolute position of the train is then retrieved by matching recognized text with a preconstructed database. To achieve continuous and robust localization, a differential evolution Kalman filter (DE-KF) is introduced to adaptively fuse IMU data with the vision-derived observations, dynamically tuning the process noise covariance in response to environmental variation. The proposed method was validated at Beijing National Railway Experimental Center. Experimental results demonstrate that the system maintains positioning errors within 3.5 m and achieves high recognition performance, with an mAP50 of 98.0%. These findings confirm the effectiveness of the proposed fusion framework for real-time, accurate, and resource-efficient train localization.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 18","pages":"35323-35334"},"PeriodicalIF":4.3000,"publicationDate":"2025-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Vision and Inertial Sensors Fusion for Train Positioning in GNSS-Denied Environments\",\"authors\":\"Haifeng Song;Haoyu Zhang;Xiaoqing Wu;Wangzhe Li;Hairong Dong\",\"doi\":\"10.1109/JSEN.2025.3597772\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The accurate train positioning is essential for ensuring safety and operational efficiency in modern rail systems. Traditional methods based on trackside infrastructure or satellite signals often suffer from limited precision or high cost, especially in Global Navigation Satellite Systems (GNSS)-denied environments. To address these challenges, this article proposes a hybrid vision–inertial train positioning method that combines the visual absolute positioning with inertial measurement unit (IMU)-based relative positioning. An enhanced you only look once (YOLO)-based object detection algorithm and an end-to-end text recognition network are employed to identify and interpret railway landmarks. The absolute position of the train is then retrieved by matching recognized text with a preconstructed database. To achieve continuous and robust localization, a differential evolution Kalman filter (DE-KF) is introduced to adaptively fuse IMU data with the vision-derived observations, dynamically tuning the process noise covariance in response to environmental variation. The proposed method was validated at Beijing National Railway Experimental Center. Experimental results demonstrate that the system maintains positioning errors within 3.5 m and achieves high recognition performance, with an mAP50 of 98.0%. 
These findings confirm the effectiveness of the proposed fusion framework for real-time, accurate, and resource-efficient train localization.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"25 18\",\"pages\":\"35323-35334\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2025-08-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11131525/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/11131525/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Vision and Inertial Sensors Fusion for Train Positioning in GNSS-Denied Environments
Accurate train positioning is essential for ensuring safety and operational efficiency in modern rail systems. Traditional methods based on trackside infrastructure or satellite signals often suffer from limited precision or high cost, especially in Global Navigation Satellite Systems (GNSS)-denied environments. To address these challenges, this article proposes a hybrid vision–inertial train positioning method that combines visual absolute positioning with inertial measurement unit (IMU)-based relative positioning. An enhanced you only look once (YOLO)-based object detection algorithm and an end-to-end text recognition network are employed to identify and interpret railway landmarks. The absolute position of the train is then retrieved by matching the recognized text against a preconstructed database. To achieve continuous and robust localization, a differential evolution Kalman filter (DE-KF) is introduced to adaptively fuse IMU data with the vision-derived observations, dynamically tuning the process noise covariance in response to environmental variation. The proposed method was validated at the Beijing National Railway Experimental Center. Experimental results demonstrate that the system maintains positioning errors within 3.5 m and achieves high recognition performance, with an mAP50 of 98.0%. These findings confirm the effectiveness of the proposed fusion framework for real-time, accurate, and resource-efficient train localization.
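The abstract names the core fusion mechanism (a Kalman filter whose process noise covariance is re-tuned by differential evolution as vision fixes arrive) but, being an abstract, gives no implementation detail. The sketch below is only an illustrative reconstruction of that idea, not the authors' DE-KF: a one-dimensional along-track filter driven by IMU acceleration and corrected by vision-derived absolute positions, with the process noise scale re-tuned by a toy differential evolution search. The names (`AlongTrackKF`, `innovation_cost`, `tune_q_by_de`), the state layout, the innovation-based cost, and every numeric value are assumptions made for this example.

```python
# Illustrative sketch only; state model, cost function, and parameters are assumptions,
# not the paper's DE-KF implementation.
import numpy as np

class AlongTrackKF:
    """Constant-velocity along-track filter: IMU acceleration drives the
    prediction step, recognised landmarks supply absolute-position updates."""

    def __init__(self, dt, q_scale=0.1, r_vision=1.0):
        self.dt = dt
        self.x = np.zeros(2)                      # [position (m), velocity (m/s)]
        self.P = np.eye(2)                        # state covariance
        self.F = np.array([[1.0, dt],             # constant-velocity transition
                           [0.0, 1.0]])
        self.B = np.array([0.5 * dt**2, dt])      # maps IMU acceleration into the state
        self.H = np.array([[1.0, 0.0]])           # vision observes position only
        self.R = np.array([[r_vision]])           # vision measurement noise
        self.set_q(q_scale)

    def set_q(self, q_scale):
        # Discrete white-noise-acceleration process noise, scaled by q_scale,
        # the single parameter the DE search re-tunes in this sketch.
        dt = self.dt
        self.Q = q_scale * np.array([[dt**4 / 4, dt**3 / 2],
                                     [dt**3 / 2, dt**2]])

    def predict(self, accel):
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, position_obs):
        y = position_obs - self.H @ self.x                 # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return float(y[0])                                 # kept for the DE cost


def innovation_cost(q_scale, dt, accel_window, vision_window):
    """Replay a buffered window with a candidate q_scale and score it by the
    mean squared innovation (vision entries are None when no landmark is seen)."""
    kf = AlongTrackKF(dt, q_scale=q_scale)
    innovations = []
    for accel, vision in zip(accel_window, vision_window):
        kf.predict(accel)
        if vision is not None:
            innovations.append(kf.update(vision))
    return float(np.mean(np.square(innovations))) if innovations else 0.0


def tune_q_by_de(cost_fn, bounds=(1e-3, 10.0), pop_size=12, gens=30, seed=0):
    """Toy DE/rand/1 search over the scalar q_scale; crossover is trivial for a
    single parameter, so only mutation and greedy selection are used."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, pop_size)
    cost = np.array([cost_fn(q) for q in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            trial = float(np.clip(a + 0.8 * (b - c), lo, hi))   # mutation, F = 0.8
            trial_cost = cost_fn(trial)
            if trial_cost < cost[i]:                            # greedy selection
                pop[i], cost[i] = trial, trial_cost
    return float(pop[int(np.argmin(cost))])
```

In use, a short buffer of IMU and vision samples might be replayed to pick a new scale, e.g. `kf.set_q(tune_q_by_de(lambda q: innovation_cost(q, dt, accel_buf, vision_buf)))`, before the filter continues in real time; the paper's DE-KF may differ in state dimension, cost definition, and tuning schedule.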
Journal Introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion, processing of wave e.g., electromagnetic and acoustic; and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data, detection, estimation and classification based on sensor data)
-Sensors in Industrial Practice