2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct): Latest Publications

VR Collaboration in Large Companies: An Interview Study on the Role of Avatars
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00037
Natalie Hube, Katrin Angerbauer, Daniel Pohlandt, Kresimir Vidackovic, M. Sedlmair
Abstract: Collaboration is essential in companies and often requires physical presence, so Virtual Reality (VR) systems are increasingly used to work together remotely. To support social interaction, human representations in the form of avatars are used in collaborative virtual environment (CVE) tools. Up to now, however, avatar representations are often limited in design and functionality, which may hinder effective collaboration. In our interview study, we explored the status quo of VR collaboration in a large automotive company, with a special focus on the role of avatars. We collected interview data from 21 participants, from which we identified challenges of the avatar representations currently used in our setting. Based on these findings, we discuss design suggestions for avatars in a company setting that aim to improve social interaction. In contrast to state-of-the-art research, we found that users in the context of a large automotive company have different needs with respect to avatar representations.
Citations: 4
VRSmartphoneSketch: Augmenting VR Controller With A Smartphone For Mid-air Sketching
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00082
Shouxia Wang, Li Zhang, Jingjing Kang, Shuxia Wang, Weiping He
Abstract: We propose VRSmartphoneSketch, a mid-air sketching system that combines a smartphone with VR controllers to allow hybrid 2D and 3D input. We conducted a user study with 12 participants to explore the utility of the hybrid input of the bundled smartphone and controller when no drawing surface is available. The results show that the proposed system did not significantly improve stroke accuracy or user experience compared with the benchmark controller-only drawing condition. However, users gave positive feedback on the stability and sense of control over strokes afforded by the smartphone's touchscreen.
Citations: 2
Focus Group on Social Virtual Reality in Social Virtual Reality: Effects on Emotion and Self-Awareness
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00099
Patricia Manyuru, Chelsea Dobbins, Benjamin Matthews, O. Baumann, Arindam Dey
Abstract: Social Virtual Reality (VR) platforms enable multiple users to be present together in the same virtual environment (VE) and interact with each other in this space. These platforms are used in application areas including teaching and learning, conferences, and meetings. To improve engagement, safety, and the overall positive experience on such platforms, it is important to understand the effect they have on users' emotional states and self-awareness while in the VE. In this work, we present a focus group study in which we discussed users' opinions about social VR; the focus group itself was run on a social VR platform created in Hubs by Mozilla. Our primary goal was to investigate users' emotional states and self-awareness while using this platform. We measured these effects using the Positive and Negative Affect Schedule (PANAS) and a Self-Assessment Questionnaire (SAQ). The experiment involved 12 adult participants, volunteers from around the world with previous VR experience.
Citations: 0
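As background on the measure used in the entry above: PANAS scores two ten-item subscales (positive and negative affect), each item rated 1 to 5, so each subscale ranges from 10 to 50. A minimal scoring sketch, independent of the paper's own implementation:

```python
def score_panas(responses):
    """Score the 20-item PANAS: responses maps item name -> rating (1-5).

    Each subscale sums ten items, so scores range from 10 (low affect)
    to 50 (high affect)."""
    positive_items = ["interested", "excited", "strong", "enthusiastic",
                      "proud", "alert", "inspired", "determined",
                      "attentive", "active"]
    negative_items = ["distressed", "upset", "guilty", "scared", "hostile",
                      "irritable", "ashamed", "nervous", "jittery", "afraid"]
    for item, rating in responses.items():
        assert 1 <= rating <= 5, f"rating out of range for {item!r}"
    return {
        "positive_affect": sum(responses[i] for i in positive_items),
        "negative_affect": sum(responses[i] for i in negative_items),
    }
```

A respondent answering 3 ("moderately") on every item would score 30 on each subscale, the midpoint of both scales.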
Using Context and Physiological Cues to Improve Emotion Recognition in Virtual Reality
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00105
Kunal Gupta
Abstract: Immersive Virtual Reality (VR) can create compelling context-specific emotional experiences, but very few studies explore the importance of emotion-relevant contextual cues in VR. In this thesis, we will investigate how to use combined contextual and physiological cues to improve emotion recognition in VR, with a view to enhancing shared VR experiences. We will explore how relatively low-cost sensors can capture physiological information and how this can be combined with user and environmental context cues to measure emotion more accurately. The main novelty of the research is that it will demonstrate the first Personalized Real-time Emotion-Adaptive Context-Aware VR (PerAffectly VR) system and provide significant insight into how to create, measure, and share emotional VR experiences. The research will be useful in multiple domains such as tourism, training, entertainment, and gaming. It will also enable VR interfaces that automatically adapt to the user's emotional needs and provide a better user experience.
Citations: 0
Multi-Drone Collaborative Trajectory Optimization for Large-Scale Aerial 3D Scanning
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00034
Fangping Chen, Yuheng Lu, Binbin Cai, Xiaodong Xie
Abstract: Reconstruction and mapping of outdoor urban environments are critical to a wide variety of applications, ranging from large-scale city-level 3D content creation for augmented and virtual reality to digital twin construction for smart cities and autonomous driving. Large-scale city-level 3D models will become another important medium after images and videos. We propose an autonomous approach that reconstructs a voxel model of the scene in real time and estimates the best set of viewing angles according to the precision requirement. These task views are assigned to the drones based on Optimal Mass Transport (OMT) optimization, and multi-level pipelining from chip design is applied to parallelize exploration and data acquisition. Our method includes: (1) real-time perception and reconstruction of the scene voxel model with obstacle avoidance; (2) determining the best observation and viewing angles of the scene geometry through global and local optimization; (3) assigning task views to the drones and planning paths based on OMT optimization, iterating continuously as new exploration results arrive; and (4) expediting exploration and data acquisition in parallel through a multi-stage pipeline to improve efficiency. Our method schedules drone routes in real time according to the scene and its optimal acquisition viewpoints, avoiding the model voids and loss of accuracy caused by traditional aerial 3D scanning, which flies fixed sweep routes regardless of the object, and lays a solid foundation for turning real-scene 3D models directly into usable 3D data sources for AR and VR. We evaluate the effectiveness of our method on several groups of large-scale city-level data. The results show that reconstruction accuracy and efficiency are greatly improved.
Citations: 3
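Step (3) in the abstract above is, at its core, an assignment of task views to drones that minimizes a total cost; the paper formulates it with Optimal Mass Transport. As an illustration of the underlying problem only (not the paper's solver), here is a brute-force assignment using travel distance as the cost; the positions and the exhaustive search are illustrative stand-ins:

```python
from itertools import permutations
import math

def assign_views(drones, views):
    """Assign one task view to each drone, minimizing total travel distance.

    drones, views: equal-length lists of (x, y, z) positions.
    Returns a list where result[i] is the view index assigned to drone i.
    Brute-force over permutations is only viable for a handful of drones;
    the paper's OMT formulation scales much further.
    """
    n = len(drones)
    cost = [[math.dist(d, v) for v in views] for d in drones]
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best)
```

For realistic fleet and view counts, the permutation search would be replaced by an OMT or Hungarian-algorithm solver; the sketch only shows the cost structure being minimized.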
Watch-Your-Skiing: Visualizations for VR Skiing using Real-time Body Tracking
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00088
Xuan Zhang, Erwin Wu, H. Koike
Abstract: Correcting one's body posture is necessary when acquiring specific skills, especially in sports such as skiing or gymnastics. However, it is difficult to observe one's own posture objectively, which is why a trainer is usually required. In this paper, we introduce a VR ski training system that uses full-body motion capture to provide real-time feedback to the user. Two different types of visual cues were developed and qualitatively compared in a user study. The system opens up the opportunity to learn alpine skiing on one's own and has the potential to be applied to other sports and skill acquisition.
Citations: 2
Context-Aware Markerless Augmented Reality for Shared Educational Spaces
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00110
T. Scargill
Abstract: For markerless augmented reality (AR) to reach its full potential in educational settings, it must be able to adapt to the wide range of environments, devices, and user cognitive states that affect learning outcomes. Shared educational spaces such as classrooms, art galleries, museums, teaching hospitals, and wildlife centers present enticing opportunities in terms of specialized AR applications, existing infrastructure to leverage, and large numbers of AR sessions from which to gather data. In this work, we make the challenging concept of context-aware AR feasible through a 'local expert', which learns the optimal configuration of AR algorithms and virtual content for the specific educational space and use case in which it is deployed. To compute insights from multiple AR devices, enable timely responses to fast-changing user cognitive states, and ensure the security of sensitive user data, we propose an edge-computing architecture in which storage and computation related to the local expert are performed on a server on the same local area network as the mobile AR devices.
Citations: 0
Gaze-Adaptive Subtitles Considering the Balance among Vertical/Horizontal and Depth of Eye Movement
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00035
Yusuke Shimizu, Ayumi Ohnishi, T. Terada, M. Tsukamoto
Abstract: Subtitles (captions displayed on the screen) are important in 3D content such as virtual reality (VR) and 3D movies to help users understand the content. However, no optimal display method or framework for subtitles has been established for 3D content, because 3D adds a depth factor. To determine how to place text in 3D content, we propose four methods of moving subtitles dynamically, considering the balance between the vertical/horizontal and depth components of gaze shift. These methods reduce the difference in depth or distance between the gaze position and the subtitles. We also evaluate text readability and participant fatigue. The results show that aligning the text horizontally and vertically with eye movements improves visibility and readability, and that eyestrain is related to the distance between the object and the subtitles. This evaluation provides basic knowledge for presenting text in 3D content.
Citations: 0
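One way to picture the idea in the abstract above, keeping subtitles close to the gaze both laterally and in depth, is a rule that snaps the subtitle onto the gaze ray once the angular or relative depth offset grows too large. The thresholds and geometry below are illustrative assumptions, not values or methods from the paper:

```python
import math

def subtitle_target(gaze_dir, gaze_depth, sub_pos,
                    max_angle_deg=10.0, max_depth_ratio=0.2):
    """Decide whether to move a subtitle toward the user's gaze.

    gaze_dir: unit gaze direction in view space (eye at the origin).
    gaze_depth: current fixation distance in meters.
    sub_pos: current subtitle position in view space.
    Returns the new subtitle position: snapped onto the gaze ray at the
    fixation depth if either the angular offset or the relative depth
    mismatch exceeds its (illustrative) threshold, else unchanged.
    """
    sub_depth = math.sqrt(sum(c * c for c in sub_pos))
    sub_dir = tuple(c / sub_depth for c in sub_pos)
    cos_angle = sum(g * s for g, s in zip(gaze_dir, sub_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    depth_off = abs(sub_depth - gaze_depth) / gaze_depth
    if angle > max_angle_deg or depth_off > max_depth_ratio:
        return tuple(g * gaze_depth for g in gaze_dir)  # snap to gaze ray
    return sub_pos
```

A subtitle slightly behind the fixation point stays put, while one 45 degrees off to the side is pulled back onto the gaze ray, which matches the paper's goal of reducing the gaze-to-subtitle offset in both direction and depth.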
Vision-based Acoustic Information Retrieval for Interactive Sound Rendering
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00115
M. Colombo
Abstract: The planned thesis work adopts computer vision techniques to decompose complex scenes, recognizing the acoustic characteristics of a space and determining the physical and structural features of complex scenes. The experiments presented demonstrate applications of scene-understanding techniques to game scenes and virtual reconstructions of real spaces, determining the acoustic properties of scene geometry to automate realistic sound rendering. They also assess the current state of automatic acoustic material recognition for virtual environments and propose a novel evaluation framework that tests objective and subjective accuracy against measurements from real environments. Proof-of-concept systems have been tested on state-of-the-art acoustic renderers to demonstrate their efficiency in offline procedures. Current directions aim at end-to-end pipelines for interactive, real-time applications, with the ambition of using computer vision to understand the acoustic space even under the dynamic geometry typical of Augmented Reality platforms, where the acoustic space is constantly updated from the surrounding real world.
Citations: 1
Augmented Reality Interface for Sailing Navigation: a User Study for Wind Representation
Pub Date: 2021-10-01 · DOI: 10.1109/ISMAR-Adjunct54149.2021.00060
Francesco Laera, V. Manghisi, Alessandro Evangelista, M. Foglia, M. Fiorentino
Abstract: This paper presents a novel Augmented Reality (AR) interface for head-mounted displays (HMDs), specifically for sailing navigation. Compared to the literature and available commercial solutions, its novelty is the use of boat-referenced 3D graphics, representing wind direction, wind intensity, and heading with monochrome green elements. We carried out a user validation study: we implemented a virtual simulator including a sailboat, the marine environment (sea, sky, marine traffic, and sounds), and the presented interface as an AR overlay. We evaluated the effectiveness of the interface's wind representation through an online questionnaire based on a video simulation, asking users to imagine it as the output of an AR visualization. We defined one test scenario with wind variations and a distracting element (a crossing vessel). 75 sailors (59% experts, with more than 50 sailing days per year) completed the questionnaire; most (63%) considered the video effective at simulating the AR interface, and 75% were ready to wear an AR device while sailing. The usability (SUS questionnaire) and user experience (UEQ) results were also positive.
Citations: 2