A multi-sensor platform for wide-area tracking

C. Waechter, Manuel J. Huber, P. Keitler, M. Schlegel, G. Klinker, D. Pustka
{"title":"A multi-sensor platform for wide-area tracking","authors":"C. Waechter, Manuel J. Huber, P. Keitler, M. Schlegel, G. Klinker, D. Pustka","doi":"10.1109/ISMAR.2010.5643604","DOIUrl":null,"url":null,"abstract":"Indoor tracking scenarios still face challenges in providing continuous tracking support in wide-area workplaces. This is especially the case in Augmented Reality since such augmentations generally require exact full 6DOF pose measurements in order to continuously display 3D graphics from user-related view points. Many single sensor systems have been explored but only few of them have the capability to track reliably in wide-area environments. We introduce a mobile multi-sensor platform to overcome the shortcomings of single sensor systems. The platform is equipped with a detachable optical camera and a rigidly mounted odometric measurement system providing relative positions and orientations with respect to the ground plane. The camera is used for marker-based as well as for marker-less (feature-based) inside-out tracking as part of a hybrid approach. We explain the principle tracking technologies in our competitive/cooperative fusion approach and show possible enhancements to further developments. This inside-out approach scales well with increasing tracking range, as opposed to stationary outside-in tracking.","PeriodicalId":250608,"journal":{"name":"2010 IEEE International Symposium on Mixed and Augmented Reality","volume":"51 1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 IEEE International Symposium on Mixed and Augmented Reality","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR.2010.5643604","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 11

Abstract

Indoor tracking scenarios still face challenges in providing continuous tracking support in wide-area workplaces. This is especially the case in Augmented Reality, since such augmentations generally require exact full 6DOF pose measurements in order to continuously display 3D graphics from user-related viewpoints. Many single-sensor systems have been explored, but only a few of them are able to track reliably in wide-area environments. We introduce a mobile multi-sensor platform to overcome the shortcomings of single-sensor systems. The platform is equipped with a detachable optical camera and a rigidly mounted odometric measurement system providing relative positions and orientations with respect to the ground plane. The camera is used for marker-based as well as marker-less (feature-based) inside-out tracking as part of a hybrid approach. We explain the principal tracking technologies in our competitive/cooperative fusion approach and show possible enhancements for further development. This inside-out approach scales well with increasing tracking range, as opposed to stationary outside-in tracking.
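To make the fusion idea concrete, the following is a minimal sketch (not the authors' implementation) of how relative odometry and absolute camera-based pose fixes can be combined: odometry dead-reckons the platform pose on the ground plane between camera measurements, and an absolute marker- or feature-based camera pose bounds the accumulated drift whenever visual tracking succeeds. The planar state, function names, and the fixed blending weight are illustrative assumptions.

```python
# Hypothetical sketch of odometry/camera fusion, assuming planar motion
# (x, y, heading); the real system works with full 6DOF poses.
import math
from dataclasses import dataclass

@dataclass
class PlanarPose:
    x: float = 0.0
    y: float = 0.0
    theta: float = 0.0  # heading in radians


def propagate_odometry(pose: PlanarPose, d_forward: float, d_theta: float) -> PlanarPose:
    """Dead-reckon the platform pose from relative odometry increments."""
    theta = pose.theta + d_theta
    return PlanarPose(
        x=pose.x + d_forward * math.cos(theta),
        y=pose.y + d_forward * math.sin(theta),
        theta=theta,
    )


def fuse_camera_fix(pose: PlanarPose, camera_pose: PlanarPose, weight: float = 0.8) -> PlanarPose:
    """Pull the drifting odometry estimate toward an absolute camera-based
    fix (marker- or feature-based) using a simple weighted blend."""
    return PlanarPose(
        x=(1 - weight) * pose.x + weight * camera_pose.x,
        y=(1 - weight) * pose.y + weight * camera_pose.y,
        theta=(1 - weight) * pose.theta + weight * camera_pose.theta,
    )


if __name__ == "__main__":
    pose = PlanarPose()
    # Odometry keeps the estimate continuous between camera measurements ...
    for _ in range(10):
        pose = propagate_odometry(pose, d_forward=0.05, d_theta=0.01)
    # ... and an occasional absolute camera pose bounds the drift.
    pose = fuse_camera_fix(pose, PlanarPose(x=0.52, y=0.03, theta=0.09))
    print(pose)
```

In practice such a blend would be replaced by a proper probabilistic filter with measurement covariances, but the sketch shows the complementary roles of the two sensors: continuity from odometry, absolute registration from the camera.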