{"title":"KMOP-vSLAM: Dynamic Visual SLAM for RGB-D Cameras using K-means and OpenPose","authors":"Yubao Liu, J. Miura","doi":"10.1109/IEEECONF49454.2021.9382724","DOIUrl":null,"url":null,"abstract":"Although tremendous progress has been made in Simultaneous Localization and Mapping (SLAM), the scene rigidity assumption limits wide usage of visual SLAMs in the real-world environment of computer vision, smart robotics and augmented reality. To make SLAM more robust in dynamic environments, outliers on the dynamic objects, including unknown objects, need to be removed from tracking process. To address this challenge, we present a novel real-time visual SLAM system, KMOP-vSLAM, which adds the capability of unsupervised learning segmentation and human detection to reduce the drift error of tracking in indoor dynamic environments. An efficient geometric outlier detection method is proposed, using dynamic information of the previous frames as well as a novel probability model to judge moving objects with the help of geometric constraints and human detection. Outlier features belonging to moving objects are largely detected and removed from tracking. The well-known dataset, TUM, is used to evaluate tracking errors in dynamic scenes where people are walking around. Our approach yields a significantly lower trajectory error compared to state-of-the-art visual SLAMs using an RGB-D camera.","PeriodicalId":395378,"journal":{"name":"2021 IEEE/SICE International Symposium on System Integration (SII)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE/SICE International Symposium on System Integration (SII)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IEEECONF49454.2021.9382724","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8
Abstract
Although tremendous progress has been made in Simultaneous Localization and Mapping (SLAM), the scene rigidity assumption limits the wide use of visual SLAM in real-world applications such as computer vision, smart robotics, and augmented reality. To make SLAM more robust in dynamic environments, outlier features on dynamic objects, including unknown objects, need to be removed from the tracking process. To address this challenge, we present a novel real-time visual SLAM system, KMOP-vSLAM, which adds unsupervised-learning segmentation and human detection to reduce tracking drift in indoor dynamic environments. An efficient geometric outlier detection method is proposed that uses dynamic information from previous frames together with a novel probability model to judge moving objects, aided by geometric constraints and human detection. Most outlier features belonging to moving objects are thereby detected and removed from tracking. The well-known TUM dataset is used to evaluate tracking errors in dynamic scenes where people are walking around. Our approach yields a significantly lower trajectory error than state-of-the-art visual SLAM systems using an RGB-D camera.
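The abstract does not spell out the probability model, but the general recipe it describes, fusing a geometric (epipolar) constraint, a human-detection cue, and dynamic information from previous frames into a per-feature moving probability and dropping high-probability features before pose tracking, can be sketched as below. This is a minimal illustration under assumptions: the function names, fusion weights (w_geom, w_human, w_prev), and threshold are hypothetical, not the authors' formulation.

```python
# Illustrative sketch only: the paper's exact probability model is not given
# in the abstract, so the weights and threshold below are assumptions.
import numpy as np

def epipolar_error(pts_prev, pts_curr, F):
    """Distance of current-frame points to the epipolar lines induced by the
    previous frame via fundamental matrix F (a standard static-scene check)."""
    ones = np.ones((pts_prev.shape[0], 1))
    lines = (F @ np.hstack([pts_prev, ones]).T).T        # epipolar lines l = F x
    num = np.abs(np.sum(lines * np.hstack([pts_curr, ones]), axis=1))
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2) + 1e-9
    return num / den

def moving_probability(err, on_human, prev_prob,
                       sigma=1.0, w_geom=0.5, w_human=0.3, w_prev=0.2):
    """Fuse geometric evidence, a human-detection hit (0/1 per feature), and
    the previous frame's estimate into a moving probability (weights assumed)."""
    p_geom = 1.0 - np.exp(-(err / sigma) ** 2)   # large epipolar error -> likely moving
    return w_geom * p_geom + w_human * on_human + w_prev * prev_prob

def filter_static_features(pts_prev, pts_curr, F, on_human, prev_prob, thresh=0.5):
    """Keep only features judged static, so tracking sees fewer dynamic outliers."""
    err = epipolar_error(pts_prev, pts_curr, F)
    prob = moving_probability(err, on_human, prev_prob)
    keep = prob < thresh
    return pts_prev[keep], pts_curr[keep], prob
```

In this reading, the returned probabilities would be carried forward as `prev_prob` for the next frame, which is one way the "dynamic information of the previous frames" mentioned in the abstract could propagate.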