Level-of-Detail AR: Dynamically Adjusting Augmented Reality Level of Detail Based on Visual Angle

Abby Wysopal, Vivian Ross, Joyce E Passananti, K. Yu, Brandon Huynh, Tobias Höllerer
{"title":"Level-of-Detail AR: Dynamically Adjusting Augmented Reality Level of Detail Based on Visual Angle","authors":"Abby Wysopal, Vivian Ross, Joyce E Passananti, K. Yu, Brandon Huynh, Tobias Höllerer","doi":"10.1109/VR55154.2023.00022","DOIUrl":null,"url":null,"abstract":"Dynamically adjusting the content of augmented reality (AR) applications to efficiently display information best fitting the available screen estate may be important for user performance and satisfaction. Currently, there is not a common practice for dynamically adjusting the content of AR applications based on their apparent size in the user's view of the surround environment. We present a Level-of-Detail AR mechanism to improve the usability of AR applications at any relative size. Our mechanism dynamically renders textual and interactable content based on its legibility, interactability, and viewability respectively. When tested, Level-of-Detail AR functioned as intended out-of-the-box on 44 of the 45 standard user interface Unity prefabs in Microsoft's Mixed Reality Tool Kit. We additionally evaluated impact on task performance, user distance, and subjective satisfaction through a mixed-design user study with 45 participants. Statistical analysis of our results revealed significant task-dependent differences in user performance between the modes. 
User satisfaction was consistently higher for the Level-of-Detail AR condition.","PeriodicalId":346767,"journal":{"name":"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VR55154.2023.00022","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Dynamically adjusting the content of augmented reality (AR) applications to efficiently display information best fitting the available screen real estate may be important for user performance and satisfaction. Currently, there is no common practice for dynamically adjusting the content of AR applications based on their apparent size in the user's view of the surrounding environment. We present a Level-of-Detail AR mechanism to improve the usability of AR applications at any relative size. Our mechanism dynamically renders textual and interactable content based on its legibility, interactability, and viewability, respectively. When tested, Level-of-Detail AR functioned as intended out of the box on 44 of the 45 standard user interface Unity prefabs in Microsoft's Mixed Reality Toolkit. We additionally evaluated the impact on task performance, user distance, and subjective satisfaction through a mixed-design user study with 45 participants. Statistical analysis of our results revealed significant task-dependent differences in user performance between the modes. User satisfaction was consistently higher for the Level-of-Detail AR condition.
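The core idea of adjusting detail by apparent size can be illustrated with a minimal sketch: compute the visual angle an AR element subtends at the viewer's eye, then map that angle to a detail tier. The function names and the threshold values below are illustrative assumptions, not the paper's actual parameters.

```python
import math

def visual_angle_deg(object_size_m: float, distance_m: float) -> float:
    """Visual angle (degrees) subtended by an object of the given
    physical width at the given viewing distance."""
    return math.degrees(2 * math.atan(object_size_m / (2 * distance_m)))

def select_level_of_detail(angle_deg: float,
                           thresholds=(2.0, 8.0)) -> str:
    """Map a visual angle to a detail tier. Thresholds are placeholder
    values; a real system would tune them per content type
    (legibility for text, interactability for controls)."""
    low, high = thresholds
    if angle_deg < low:
        return "icon"     # too small to read: show only an icon/summary glyph
    if angle_deg < high:
        return "summary"  # partially legible: abbreviated text, large targets
    return "full"         # large enough: full text and all controls

# A 0.3 m-wide panel viewed from 4 m subtends roughly 4.3 degrees,
# so it would be rendered at the intermediate tier.
tier = select_level_of_detail(visual_angle_deg(0.3, 4.0))
```

In a Unity implementation, the distance would come from the camera-to-object vector each frame, and each tier would toggle between prefab variants rather than return a string.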