Experimental evaluation of user interfaces for visual indoor navigation

Andreas Möller, M. Kranz, Stefan Diewald, L. Roalter, Robert Huitl, T. Stockinger, Marion Koelle, Patrick Lindemann
{"title":"Experimental evaluation of user interfaces for visual indoor navigation","authors":"Andreas Möller, M. Kranz, Stefan Diewald, L. Roalter, Robert Huitl, T. Stockinger, Marion Koelle, Patrick Lindemann","doi":"10.1145/2556288.2557003","DOIUrl":null,"url":null,"abstract":"Mobile location recognition by capturing images of the environment (visual localization) is a promising technique for indoor navigation in arbitrary surroundings. However, it has barely been investigated so far how the user interface (UI) can cope with the challenges of the vision-based localization technique, such as varying quality of the query images. We implemented a novel UI for visual localization, consisting of Virtual Reality (VR) and Augmented Reality (AR) views that actively communicate and ensure localization accuracy. If necessary, the system encourages the user to point the smartphone at distinctive regions to improve localization quality. We evaluated the UI in a experimental navigation task with a prototype, informed by initial evaluation results using design mockups. We found that VR can contribute to efficient and effective indoor navigation even at unreliable location and orientation accuracy. We discuss identified challenges and share lessons learned as recommendations for future work.","PeriodicalId":20599,"journal":{"name":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2014-04-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"51","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the SIGCHI Conference on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2556288.2557003","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 51

Abstract

Mobile location recognition by capturing images of the environment (visual localization) is a promising technique for indoor navigation in arbitrary surroundings. However, it has so far barely been investigated how the user interface (UI) can cope with the challenges of the vision-based localization technique, such as varying quality of the query images. We implemented a novel UI for visual localization, consisting of Virtual Reality (VR) and Augmented Reality (AR) views that actively communicate and ensure localization accuracy. If necessary, the system encourages the user to point the smartphone at distinctive regions to improve localization quality. We evaluated the UI in an experimental navigation task with a prototype, informed by initial evaluation results using design mockups. We found that VR can contribute to efficient and effective indoor navigation even with unreliable location and orientation accuracy. We discuss identified challenges and share lessons learned as recommendations for future work.
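
To make the abstract's idea of a UI that "actively communicates and ensures localization accuracy" concrete, the following is a minimal illustrative sketch (not taken from the paper) of how such a system might switch between an AR overlay, a VR fallback view, and a re-aiming prompt based on the current localization quality. All names and thresholds (LocalizationEstimate, AR_MIN_CONFIDENCE, etc.) are hypothetical assumptions for illustration.

# Hypothetical sketch: confidence-driven choice between AR, VR, and a
# prompt asking the user to point the camera at a distinctive region.
# Thresholds and type names are illustrative, not the authors' design.

from dataclasses import dataclass
from enum import Enum, auto


class UiMode(Enum):
    AR_OVERLAY = auto()    # camera view with augmented navigation hints
    VR_VIEW = auto()       # rendered view of the estimated location
    REAIM_PROMPT = auto()  # ask the user to capture a more distinctive view


@dataclass
class LocalizationEstimate:
    confidence: float              # 0.0 (no visual match) .. 1.0 (strong match)
    orientation_error_deg: float   # estimated heading uncertainty in degrees


AR_MIN_CONFIDENCE = 0.8       # hypothetical: AR overlays need a strong match
VR_MIN_CONFIDENCE = 0.4       # hypothetical: below this, position is unreliable
MAX_ORIENTATION_ERROR = 15.0  # hypothetical: AR arrows need a reliable heading


def choose_ui_mode(est: LocalizationEstimate) -> UiMode:
    """Pick a UI mode from the current localization quality."""
    if (est.confidence >= AR_MIN_CONFIDENCE
            and est.orientation_error_deg <= MAX_ORIENTATION_ERROR):
        return UiMode.AR_OVERLAY
    if est.confidence >= VR_MIN_CONFIDENCE:
        # Position is roughly known but orientation or match quality is weak:
        # fall back to the VR view, which tolerates orientation errors better.
        return UiMode.VR_VIEW
    # Localization failed: encourage the user to re-aim at a distinctive region.
    return UiMode.REAIM_PROMPT


if __name__ == "__main__":
    print(choose_ui_mode(LocalizationEstimate(0.9, 5.0)))   # UiMode.AR_OVERLAY
    print(choose_ui_mode(LocalizationEstimate(0.5, 40.0)))  # UiMode.VR_VIEW
    print(choose_ui_mode(LocalizationEstimate(0.2, 90.0)))  # UiMode.REAIM_PROMPT

The key design point this sketch tries to capture is that the UI degrades gracefully: rather than hiding localization uncertainty, it changes presentation mode and, in the worst case, actively involves the user in improving the visual query.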