Comparing the Effect of Audio and Visual Notifications on Workspace Awareness Using Head-Mounted Displays for Remote Collaboration in Augmented Reality

Marina Cidota, Stephan Lukosch, Dragos Datcu, Heide Lukosch
{"title":"Comparing the Effect of Audio and Visual Notifications on Workspace Awareness Using Head-Mounted Displays for Remote Collaboration in Augmented Reality","authors":"Marina Cidota,&nbsp;Stephan Lukosch,&nbsp;Dragos Datcu,&nbsp;Heide Lukosch","doi":"10.1007/s41133-016-0003-x","DOIUrl":null,"url":null,"abstract":"<div><p>In many fields of activity, working in teams is necessary for completing tasks in a proper manner and often requires visual context-related information to be exchanged between team members. In such a collaborative environment, awareness of other people’s activity is an important feature of shared-workspace collaboration. We have developed an augmented reality framework for virtual colocation that supports visual communication between two people who are in different physical locations. We address these people as the remote user, who uses a laptop and the local user, who wears a head-mounted display with an RGB camera. The remote user can assist the local user in solving a spatial problem, by providing instructions in form of virtual objects in the view of the local user. For annotating the shared workspace, we use the state-of-the-art algorithm for localization and mapping without markers that provides “anchors” in the 3D space for placing virtual content. In this paper, we report on a user study that explores on how automatic audio and visual notifications about the remote user’s activities affect the local user’s workspace awareness. We used an existing game to research virtual colocation, addressing a spatial challenge on increasing levels of task complexity. The results of the user study show that participants clearly preferred visual notifications over audio or no notifications, no matter the level of the difficulty of the task.</p></div>","PeriodicalId":100147,"journal":{"name":"Augmented Human Research","volume":"1 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2016-10-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1007/s41133-016-0003-x","citationCount":"24","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Augmented Human Research","FirstCategoryId":"1085","ListUrlMain":"https://link.springer.com/article/10.1007/s41133-016-0003-x","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 24

Abstract

In many fields of activity, working in teams is necessary for completing tasks properly and often requires visual, context-related information to be exchanged between team members. In such a collaborative environment, awareness of other people's activity is an important feature of shared-workspace collaboration. We have developed an augmented reality framework for virtual co-location that supports visual communication between two people who are in different physical locations. We refer to these people as the remote user, who uses a laptop, and the local user, who wears a head-mounted display with an RGB camera. The remote user can assist the local user in solving a spatial problem by providing instructions in the form of virtual objects placed in the local user's view. For annotating the shared workspace, we use a state-of-the-art markerless localization and mapping algorithm that provides "anchors" in 3D space for placing virtual content. In this paper, we report on a user study that explores how automatic audio and visual notifications about the remote user's activities affect the local user's workspace awareness. We used an existing game to research virtual co-location, addressing a spatial challenge at increasing levels of task complexity. The results of the user study show that participants clearly preferred visual notifications over audio or no notifications, regardless of the level of task difficulty.
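The abstract does not give implementation details for the notification mechanism. The following is a minimal, hypothetical sketch (in Python, with invented names such as AwarenessNotifier and AnnotationEvent) of how a local-user client might dispatch an audio cue, a visual cue, or no cue when the remote user places a virtual annotation at a tracked anchor; it is illustrative only and not the authors' implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto


class NotificationMode(Enum):
    NONE = auto()
    AUDIO = auto()
    VISUAL = auto()


@dataclass
class AnnotationEvent:
    """A remote user's action on the shared workspace (hypothetical schema)."""
    anchor_id: str    # 3D anchor the virtual object is attached to
    object_type: str  # e.g. "arrow", "circle", "text"


class AwarenessNotifier:
    """Cues the local user whenever the remote user annotates the workspace."""

    def __init__(self, mode: NotificationMode):
        self.mode = mode

    def on_remote_annotation(self, event: AnnotationEvent) -> None:
        if self.mode is NotificationMode.AUDIO:
            self._play_sound()
        elif self.mode is NotificationMode.VISUAL:
            self._flash_icon(event)
        # NotificationMode.NONE: the virtual object simply appears in view

    def _play_sound(self) -> None:
        # Placeholder for an audio chime on the head-mounted display
        print("[audio] chime: remote user added an annotation")

    def _flash_icon(self, event: AnnotationEvent) -> None:
        # Placeholder for a HUD indicator rendered near the new annotation
        print(f"[visual] icon near anchor {event.anchor_id}: "
              f"remote user placed a {event.object_type}")


if __name__ == "__main__":
    notifier = AwarenessNotifier(NotificationMode.VISUAL)
    notifier.on_remote_annotation(AnnotationEvent(anchor_id="a42", object_type="arrow"))
```

In the user study, the three notification modes (none, audio, visual) correspond to the conditions compared across task-difficulty levels.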
