Augmented Reality Scout: Joint Unaided-Eye and Telescopic-Zoom System for Immersive Team Training
T. Oskiper, Mikhail Sizintsev, Vlad Branzoi, S. Samarasekera, Rakesh Kumar
2015 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), September 2015. DOI: 10.1109/ISMAR.2015.11
Abstract
In this paper we present a dual, wide-area, collaborative augmented reality (AR) system that combines standard live-view augmentation, e.g., from a helmet-mounted display, with zoomed-in view augmentation, e.g., from binoculars. The proposed advanced scouting capability allows long-range, high-precision augmentation of live unaided and zoomed-in imagery with aerial and terrain-based synthetic objects, vehicles, people, and effects. The inserted objects must appear stable in the display and must not jitter or drift as the user moves around and examines the scene. The AR insertions for the binoculars must work instantly whenever and wherever the binoculars are picked up as the user moves around. The design of both AR modules is based on two different cameras with wide and narrow field-of-view (FoV) lenses. The wide FoV provides context and enables much more robust recovery of the location and orientation of the prop in 6 degrees of freedom (DoF), whereas the narrow FoV is used for the actual augmentation and for increased tracking precision. Furthermore, the narrow camera on the unaided-eye unit and the wide camera on the binoculars are jointly used for global yaw (heading) correction. We present our navigation algorithms, which combine monocular cameras with an IMU and GPS in an Extended Kalman Filter (EKF) framework, to obtain robust, real-time pose estimation for precise augmentation and cooperative tracking.
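The central technical component described in the abstract is the EKF-based fusion of monocular vision, IMU, and GPS for pose estimation. The sketch below is a minimal illustration of that fusion structure, not the authors' implementation: it uses a simplified planar 5-state model (position, velocity, yaw) in Python/NumPy, and the class name, noise parameters, and the `update_yaw` hook standing in for the cross-device heading correction are all assumptions made for this example.

```python
# Minimal sketch of the sensor-fusion structure described in the abstract:
# an EKF that propagates pose with IMU measurements and corrects it with
# GPS position fixes and a vision-derived yaw (heading) observation.
# This is an illustrative simplification (planar motion, 5-state filter),
# NOT the paper's full 6-DoF filter; all noise values are assumptions.
import numpy as np

class PoseEKF:
    def __init__(self):
        # State: [x, y, vx, vy, yaw]
        self.x = np.zeros(5)
        self.P = np.eye(5)

    def predict(self, acc_body, yaw_rate, dt):
        """Propagate with body-frame acceleration and gyro yaw rate (IMU)."""
        x, y, vx, vy, yaw = self.x
        c, s = np.cos(yaw), np.sin(yaw)
        # Rotate body-frame acceleration into the world frame.
        ax = c * acc_body[0] - s * acc_body[1]
        ay = s * acc_body[0] + c * acc_body[1]
        self.x = np.array([x + vx * dt, y + vy * dt,
                           vx + ax * dt, vy + ay * dt,
                           yaw + yaw_rate * dt])
        # Jacobian of the motion model w.r.t. the state.
        F = np.eye(5)
        F[0, 2] = dt
        F[1, 3] = dt
        F[2, 4] = (-s * acc_body[0] - c * acc_body[1]) * dt
        F[3, 4] = ( c * acc_body[0] - s * acc_body[1]) * dt
        Q = np.diag([0.01, 0.01, 0.1, 0.1, 0.01]) * dt  # assumed process noise
        self.P = F @ self.P @ F.T + Q

    def _update(self, innov, H, R):
        """Generic EKF measurement update from a precomputed innovation."""
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innov
        self.P = (np.eye(5) - K @ H) @ self.P

    def update_gps(self, pos_xy, sigma=2.0):
        """Absolute position fix from GPS."""
        H = np.zeros((2, 5)); H[0, 0] = 1.0; H[1, 1] = 1.0
        self._update(np.asarray(pos_xy) - H @ self.x, H, np.eye(2) * sigma**2)

    def update_yaw(self, yaw_meas, sigma=0.01):
        """Global heading correction, standing in for the paper's joint use
        of the unaided-eye and binocular cameras for yaw correction."""
        H = np.zeros((1, 5)); H[0, 4] = 1.0
        # Wrap the angular innovation to (-pi, pi] before the update.
        innov = np.array([np.arctan2(np.sin(yaw_meas - self.x[4]),
                                     np.cos(yaw_meas - self.x[4]))])
        self._update(innov, H, np.array([[sigma**2]]))
```

In a setting like the paper's, the predict step would run at IMU rate while GPS fixes arrive at a few Hz; the vision-derived yaw observation matters because global heading drift is poorly observable from inertial and position measurements alone, which is consistent with the abstract's use of the two devices' cameras jointly for heading correction.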