Title: VIO-GO: optimizing event-based SLAM parameters for robust performance in high dynamic range scenarios
Authors: Saber Sakhrieh, Abhilasha Singh, Jinane Mounsef, Bilal Arain, Noel Maalouf
Journal: Frontiers in Robotics and AI, vol. 12, p. 1541017 (JCR Q2, Robotics; impact factor 3.0)
DOI: 10.3389/frobt.2025.1541017
Published: 2025-09-18 (eCollection 2025)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12490131/pdf/
Citations: 0
Abstract
This paper addresses a critical challenge in Industry 4.0 robotics by enhancing Visual Inertial Odometry (VIO) systems to operate effectively in dynamic and low-light industrial environments, which are common in sectors such as warehousing, logistics, and manufacturing. Inspired by biological sensing mechanisms, we integrate bio-inspired event cameras to improve state estimation performance in both dynamic and low-light conditions, enabling reliable localization and mapping. The proposed state estimation framework fuses events, conventional video frames, and inertial data to achieve reliable and precise localization, with specific emphasis on the real-world challenges posed by the high-speed and cluttered settings typical of Industry 4.0. Despite advancements in event-based sensing, there is a noteworthy gap in optimizing Event Simultaneous Localization and Mapping (SLAM) parameters for practical applications. To address this, we introduce a novel VIO-Gradient-based Optimization (VIO-GO) method that employs Batch Gradient Descent (BGD) for efficient parameter tuning. This automated approach determines optimal parameters for Event SLAM algorithms by using motion-compensated images to represent event data. Experimental validation on the Event Camera Dataset shows a 60% improvement in Mean Position Error (MPE) over fixed-parameter methods. Our results demonstrate that VIO-GO consistently identifies optimal parameters, enabling precise VIO performance in the complex, dynamic scenarios essential for Industry 4.0 applications. Additionally, as parameter complexity scales, VIO-GO achieves a 24% reduction in MPE when using the most comprehensive parameter set (VIO-GO8) compared to a minimal set (VIO-GO2), highlighting the method's scalability and robustness for adaptive robotic systems in challenging industrial environments.
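The core idea of VIO-GO described above can be illustrated with a minimal, hedged sketch: batch gradient descent over a SLAM parameter vector, using a numerical gradient because a SLAM pipeline's error metric has no analytic derivative. The objective `mpe` below is a hypothetical stand-in (a toy quadratic bowl) for the Mean Position Error that the paper would obtain by actually running the Event SLAM pipeline on motion-compensated event images; all names and values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mpe(params: np.ndarray) -> float:
    """Hypothetical stand-in for the Mean Position Error of one SLAM run.

    In VIO-GO this would be the measured MPE after running Event SLAM with
    the given parameters; here it is a toy quadratic so the sketch runs.
    """
    target = np.array([0.5, 2.0])  # pretend these are the optimal parameters
    return float(np.sum((params - target) ** 2))

def numerical_gradient(f, x: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """Central-difference gradient: the SLAM cost is a black box."""
    g = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        g[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return g

def bgd_tune(f, x0: np.ndarray, lr: float = 0.1, steps: int = 200) -> np.ndarray:
    """Batch Gradient Descent: update the full parameter vector each step."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * numerical_gradient(f, x)
    return x

best = bgd_tune(mpe, np.array([3.0, -1.0]))
```

On this toy objective the iterates contract toward the minimum by a constant factor per step, so `best` ends up close to the assumed optimum; with a real SLAM-in-the-loop cost, each gradient evaluation would instead require several full pipeline runs.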
Journal description:
Frontiers in Robotics and AI publishes rigorously peer-reviewed research covering all theory and applications of robotics, technology, and artificial intelligence, from biomedical to space robotics.