Understanding the design space of referencing in collaborative augmented reality environments

J. Chastine, Kristine S. Nagel, Ying Zhu, Luca Yearsovich
{"title":"Understanding the design space of referencing in collaborative augmented reality environments","authors":"J. Chastine, Kristine S. Nagel, Ying Zhu, Luca Yearsovich","doi":"10.1145/1268517.1268552","DOIUrl":null,"url":null,"abstract":"For collaborative environments to be successful, it is critical that participants have the ability to generate effective references. Given the heterogeneity of the objects and the myriad of possible scenarios for collaborative augmented reality environments, generating meaningful references within them can be difficult. Participants in co-located physical spaces benefit from non-verbal communication, such as eye gaze, pointing and body movement; however, when geographically separated, this form of communication must be synthesized using computer-mediated techniques. We have conducted an exploratory study using a collaborative building task of constructing both physical and virtual models to better understand inter-referential awareness -- or the ability for one participant to refer to a set of objects, and for that reference to be understood. Our contributions are not necessarily in presenting novel techniques, but in narrowing the design space for referencing in collaborative augmented reality. This study suggests collaborative reference preferences are heavily dependent on the context of the workspace.","PeriodicalId":197912,"journal":{"name":"International Genetic Improvement Workshop","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-05-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"37","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Genetic Improvement Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/1268517.1268552","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 37

Abstract

For collaborative environments to be successful, it is critical that participants have the ability to generate effective references. Given the heterogeneity of the objects and the myriad of possible scenarios for collaborative augmented reality environments, generating meaningful references within them can be difficult. Participants in co-located physical spaces benefit from non-verbal communication, such as eye gaze, pointing and body movement; however, when geographically separated, this form of communication must be synthesized using computer-mediated techniques. We have conducted an exploratory study using a collaborative building task of constructing both physical and virtual models to better understand inter-referential awareness -- or the ability for one participant to refer to a set of objects, and for that reference to be understood. Our contributions are not necessarily in presenting novel techniques, but in narrowing the design space for referencing in collaborative augmented reality. This study suggests collaborative reference preferences are heavily dependent on the context of the workspace.
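The abstract's point that remote collaborators' deictic cues (eye gaze, pointing) must be synthesized with computer-mediated techniques can be made concrete with a small sketch. The Python example below is illustrative only and is not the authors' implementation: `PointingReference`, `SceneObject`, and `resolve_reference` are hypothetical names for a message a client might broadcast and the ray-sphere test a receiving client might use to decide which shared objects to highlight as "referenced."

```python
import time
from dataclasses import dataclass, field
from typing import List

@dataclass
class SceneObject:
    """An object in the shared AR scene, approximated by a bounding sphere."""
    object_id: str
    center: tuple   # (x, y, z) in shared world coordinates
    radius: float

@dataclass
class PointingReference:
    """A synthesized deictic reference: a pointing ray from a remote participant."""
    participant_id: str
    origin: tuple      # ray origin, e.g. the participant's hand or head position
    direction: tuple   # unit vector along the pointing direction
    timestamp: float = field(default_factory=time.time)

def resolve_reference(ref: PointingReference,
                      scene: List[SceneObject]) -> List[str]:
    """Return ids of objects the pointing ray intersects, so the receiving
    client can highlight them for the remote collaborator."""
    hits = []
    ox, oy, oz = ref.origin
    dx, dy, dz = ref.direction
    for obj in scene:
        # Vector from the ray origin to the object's center.
        cx = obj.center[0] - ox
        cy = obj.center[1] - oy
        cz = obj.center[2] - oz
        # Project it onto the (normalized) ray direction.
        t = cx * dx + cy * dy + cz * dz
        if t < 0:
            continue  # object is behind the participant
        # Squared distance from the center to the closest point on the ray.
        dist_sq = (cx * cx + cy * cy + cz * cz) - t * t
        if dist_sq <= obj.radius ** 2:
            hits.append(obj.object_id)
    return hits

if __name__ == "__main__":
    scene = [SceneObject("block-a", (0.0, 0.0, 2.0), 0.25),
             SceneObject("block-b", (1.0, 0.0, 2.0), 0.25)]
    ref = PointingReference("alice", (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
    print(resolve_reference(ref, scene))  # -> ['block-a']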
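```

The same message shape could carry a gaze ray or an object-anchored annotation instead of a hand ray; the study's finding that reference preferences depend heavily on workspace context suggests a system would offer several such cues rather than fixing one.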