Enhanced dynamic visual SLAM system for hospital logistics robots: Nonlinear optimal filtering, deep learning, and real-time positioning
Feng Xiao, Jie Fang, Xing Guo, Youhai Zhang, Rubing Huang
Robotics and Autonomous Systems, vol. 193, Article 105081, published 2025-05-25. DOI: 10.1016/j.robot.2025.105081
https://www.sciencedirect.com/science/article/pii/S0921889025001678
Citations: 0
Abstract
Current Simultaneous Localization and Mapping (SLAM) systems often suffer from elevated positioning errors and inaccurate map construction, and struggle to fuse multimodal sensor data in real time in dynamic environments. This paper introduces an improved SLAM system for hospital logistics robots that uses nonlinear optimal filtering and deep learning to address the challenges posed by dynamic environments. The system incorporates an Unscented Kalman Filter (UKF) for nonlinear state estimation and employs Convolutional Neural Networks (CNNs) for deep feature extraction from environmental images. Semantic edge detection is accomplished by combining Fully Convolutional Networks (FCNs) with Canny edge detection. Multimodal data fusion across vision, lidar, and inertial measurement unit (IMU) sensors is optimized with an Extended Kalman Filter (EKF) to enhance positioning accuracy. Real-time motion estimation is achieved via an event-based camera paired with an optical flow algorithm. The proposed system delivers an absolute trajectory error (ATE) as low as 0.067 m, over 90% overlap in map construction, and an average frame processing time under 90 ms, significantly surpassing other mainstream SLAM systems. These enhancements markedly improve positioning accuracy, mapping quality, and real-time performance in complex hospital environments.
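To make the EKF-based fusion step concrete, the following is a minimal illustrative sketch, not the paper's implementation: a scalar Kalman-style measurement update that sequentially fuses position readings from two sensors (standing in for the vision and lidar channels the abstract describes). All function names, measurement values, and variances here are hypothetical, and the full system would of course operate on multidimensional states with nonlinear measurement models.

```python
def fuse(estimate, variance, measurement, meas_variance):
    """One Kalman measurement update for a scalar state.

    The gain weights the new measurement by the relative
    uncertainty of the prior estimate versus the sensor.
    """
    gain = variance / (variance + meas_variance)
    new_estimate = estimate + gain * (measurement - estimate)
    new_variance = (1.0 - gain) * variance
    return new_estimate, new_variance

# Prior position estimate (e.g., from odometry), then sequential
# updates from two sensors; the noisier one influences the fused
# estimate less, and the posterior variance shrinks after each update.
x, p = 0.0, 1.0                # prior estimate and variance
x, p = fuse(x, p, 0.10, 0.04)  # "vision" measurement, higher noise
x, p = fuse(x, p, 0.12, 0.01)  # "lidar" measurement, lower noise
```

After both updates the fused estimate lies close to the more precise lidar reading, and the posterior variance is smaller than either sensor's noise variance, which is the property the EKF fusion stage exploits to tighten positioning accuracy.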
Journal description:
Robotics and Autonomous Systems carries articles describing fundamental developments in the field of robotics, with special emphasis on autonomous systems. An important goal of the journal is to extend the state of the art in both symbolic and sensory-based robot control and learning in the context of autonomous systems. The journal also carries articles on the theoretical, computational, and experimental aspects of autonomous systems, or modules of such systems.