{"title":"Robust Semantic Optical Flow Visual Odometry in Dynamic Environment","authors":"Yi Zhang, Jinhong Li, Bin Xing, Xiaolin Hu","doi":"10.1109/ICDSBA51020.2020.00090","DOIUrl":null,"url":null,"abstract":"In view of the low robustness and accuracy of traditional visual odometry in dynamic environment, a dynamic feature point detection method based on semantic-optical flow fusion based on dynamic threshold is used to solve the problem that single optical flow method is sensitive to noise. In this paper, the Mask R-CNN Network is used to segment the image semantics, and the LK (Lucas Kanade) optical flow is used to track the feature points in the image, and the optical flow motion vector is calculated. On the basis of optical flow motion vector, a dynamic point judgment method based on the depth information of feature points and camera motion information is designed, and the dynamic feature points are eliminated to adapt to the dynamic environment. The experimental results on the tum data set show that the absolute trajectory error of the proposed system is improved by more than 90% compared with ORB-SLAM2, while the absolute trajectory error is improved by more than 13% compared with the same type of DS-SLAM, which improves the robustness of the visual odometry.","PeriodicalId":354742,"journal":{"name":"2020 4th Annual International Conference on Data Science and Business Analytics (ICDSBA)","volume":"144 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 4th Annual International Conference on Data Science and Business Analytics (ICDSBA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDSBA51020.2020.00090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 1
Abstract
To address the low robustness and accuracy of traditional visual odometry in dynamic environments, a dynamic feature point detection method that fuses semantic segmentation with optical flow under a dynamic threshold is proposed, overcoming the noise sensitivity of the optical flow method used alone. In this paper, Mask R-CNN is used to segment the image semantically, LK (Lucas-Kanade) optical flow is used to track feature points across frames, and the optical flow motion vectors are computed. On the basis of these motion vectors, a dynamic point judgment method that combines the depth of each feature point with the camera motion is designed, and the identified dynamic feature points are removed so that the system adapts to dynamic environments. Experimental results on the TUM dataset show that the absolute trajectory error of the proposed system is more than 90% lower than that of ORB-SLAM2 and more than 13% lower than that of the comparable DS-SLAM, improving the robustness of the visual odometry.
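The abstract describes a pipeline of LK feature tracking, flow-vector computation, and semantic-plus-motion filtering of dynamic points. The sketch below is a minimal, hedged illustration of that idea, not the authors' implementation: the function name `detect_dynamic_points`, the `semantic_mask` input (assumed to mark pixels of potentially dynamic Mask R-CNN classes), and the median-plus-MAD "dynamic threshold" rule are all assumptions; the paper's depth- and camera-motion-based judgment is simplified here to a flow-magnitude test.

```python
# Hedged sketch: flag candidate dynamic feature points by fusing a
# semantic mask with LK optical-flow motion vectors under a data-driven
# ("dynamic") threshold. Not the authors' exact method.
import cv2
import numpy as np

def detect_dynamic_points(prev_gray, curr_gray, semantic_mask, k=3.0):
    # 1. Detect feature points in the previous frame.
    pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                   qualityLevel=0.01, minDistance=7)
    if pts0 is None:
        return np.empty((0, 2)), np.empty((0,), dtype=bool)

    # 2. Track them into the current frame with pyramidal Lucas-Kanade flow.
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts0, None)
    ok = status.ravel() == 1
    p0, p1 = pts0.reshape(-1, 2)[ok], pts1.reshape(-1, 2)[ok]

    # 3. Optical-flow motion vectors and their magnitudes.
    flow = p1 - p0
    mag = np.linalg.norm(flow, axis=1)

    # 4. "Dynamic threshold": median + k * MAD of the flow magnitudes,
    #    instead of a fixed cut-off. This particular rule is an assumption,
    #    not taken from the paper (which also uses depth and camera motion).
    med = np.median(mag)
    mad = np.median(np.abs(mag - med)) + 1e-6
    flow_dynamic = mag > med + k * mad

    # 5. Semantic cue: does the tracked point fall inside the mask of a
    #    potentially dynamic object (e.g. a person segmented by Mask R-CNN)?
    xs = np.clip(p1[:, 0].astype(int), 0, semantic_mask.shape[1] - 1)
    ys = np.clip(p1[:, 1].astype(int), 0, semantic_mask.shape[0] - 1)
    in_dynamic_object = semantic_mask[ys, xs]

    # 6. Fuse the two cues: points flagged by both are treated as dynamic
    #    and would be excluded from pose estimation.
    is_dynamic = flow_dynamic & in_dynamic_object
    return p1, is_dynamic
```

In this sketch, points returned with `is_dynamic == True` would simply be dropped before feature matching and pose estimation, which is the role the abstract assigns to dynamic-point elimination.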