2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR): Latest Articles

Half Title Page
Pub Date: 2023-03-01 | DOI: 10.1109/vr55154.2023.00001
Citations: 0
Animation Fidelity in Self-Avatars: Impact on User Performance and Sense of Agency
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00044
Haoran Yun, J. L. Ponton, C. Andújar, N. Pelechano
Abstract: The use of self-avatars is gaining popularity thanks to affordable VR headsets. Unfortunately, mainstream VR devices often use a small number of trackers and provide low-accuracy animations. Previous studies have shown that the Sense of Embodiment, and in particular the Sense of Agency, depends on the extent to which the avatar's movements mimic the user's movements. However, few works study this effect for tasks requiring precise interaction with the environment, i.e., tasks that require accurate manipulation, precise foot stepping, or correct body poses. In these cases, users are likely to notice inconsistencies between their self-avatars and their actual pose. In this paper, we study the impact of the animation fidelity of the user avatar on a variety of tasks that focus on arm movement, leg movement, and body posture. We compare three different animation techniques: two of them using Inverse Kinematics to reconstruct the pose from sparse input (6 trackers), and a third one using a professional motion capture system with 17 inertial sensors. We evaluate these animation techniques both quantitatively (completion time, unintentional collisions, pose accuracy) and qualitatively (Sense of Embodiment). Our results show that animation quality affects the Sense of Embodiment. Inertial-based MoCap performs significantly better in mimicking body poses. Surprisingly, IK-based solutions using fewer sensors outperformed MoCap in tasks requiring accurate positioning, which we attribute to MoCap's higher latency and positional drift, which cause errors at the end-effectors that are more noticeable in contact areas such as the feet.
Citations: 4
Empirically Evaluating the Effects of Eye Height and Self-Avatars on Dynamic Passability Affordances in Virtual Reality
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00046
Ayush Bhargava, Roshan Venkatakrishnan, R. Venkatakrishnan, Hannah M. Solini, Kathryn M. Lucaites, Andrew C. Robb, C. Pagano, Sabarish V. Babu
Abstract: Over the past two decades, self-avatars have been shown to affect the perception both of oneself and of environmental properties, including the sizes and distances of elements in immersive virtual environments. However, virtual avatars that accurately match the body proportions of their users remain inaccessible to the general public. As such, most virtual experiences that represent the user employ a generic avatar that does not fit the proportions of the user's body. This can negatively affect judgments involving affordances such as passability and maneuverability, which pertain to the relationship between the properties of environmental elements and the properties of the user, providing information about the actions that can be enacted. This is especially true when the task requires the user to maneuver around moving objects, as in games. It is therefore necessary to understand how differently sized self-avatars affect the perception of affordances in dynamic virtual environments. To better understand this, we conducted an experiment investigating how a self-avatar that is the same size as, 20% shorter than, or 20% taller than the user's own body affects passability judgments in a dynamic virtual environment. Our results suggest that the presence of self-avatars results in better-regulated and safer road-crossing behavior, and helps participants synchronize self-motion to external stimuli more quickly than in the absence of self-avatars.
Citations: 0
Volumetric Avatar Reconstruction with Spatio-Temporally Offset RGBD Cameras
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00023
Gareth Rendle, A. Kreskowski, Bernd Froehlich
Abstract: RGBD cameras can capture users and their actions in the real world for reconstruction of photo-realistic volumetric avatars that allow rich interaction between spatially distributed telepresence parties in virtual environments. In this paper, we present and evaluate a system design that enables volumetric avatar reconstruction at increased frame rates. We demonstrate that we can overcome the limited capture frame rate of commodity RGBD cameras such as the Azure Kinect by dividing a set of cameras into two spatio-temporally offset reconstruction groups and implementing a real-time reconstruction pipeline that fuses the temporally offset RGBD image streams. Comparisons of our proposed system against capture configurations possible with the same number of RGBD cameras indicate that it is beneficial to use a combination of spatially and temporally offset RGBD cameras, allowing increased reconstruction frame rates and scene coverage while producing temporally consistent volumetric avatars.
Citations: 0
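The temporal-offset idea behind this entry can be illustrated with a toy timing model (not the authors' pipeline; the 30 FPS camera rate and half-period offset are assumptions for illustration): two camera groups capturing at the same rate, with one group phase-shifted by half a frame period, produce a fused input stream at double the per-group rate.

```python
# Illustrative sketch, not the paper's implementation: two RGBD camera
# groups at an assumed 30 FPS, with group B offset by half a frame period,
# yield fused reconstruction inputs at an effective 60 FPS.

def capture_timestamps(fps: float, n_frames: int, phase_offset: float = 0.0):
    """Timestamps (seconds) of one camera group's frames."""
    period = 1.0 / fps
    return [phase_offset + i * period for i in range(n_frames)]

def fused_stream(fps: float, n_frames: int):
    """Merge two groups whose captures are offset by half a period."""
    group_a = capture_timestamps(fps, n_frames)             # t = 0, 1/30, 2/30, ...
    group_b = capture_timestamps(fps, n_frames, 0.5 / fps)  # shifted by half a period
    return sorted(group_a + group_b)

stream = fused_stream(30.0, 4)
# Consecutive fused frames arrive every 1/60 s: doubled reconstruction rate.
intervals = [b - a for a, b in zip(stream, stream[1:])]
```

The same accounting explains the trade-off the paper evaluates: the offset buys frame rate, while spatial placement of the two groups still governs scene coverage.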
IEEE VR 2023 Organizing Committee
Pub Date: 2023-03-01 | DOI: 10.1109/vr55154.2023.00008
Citations: 0
Optimizing Product Placement for Virtual Stores
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00049
Wei Liang, Luhui Wang, Xinzhe Yu, Changyang Li, Rawan Alghofaili, Yining Lang, L. Yu
Abstract: The recent popularity of consumer-grade virtual reality devices has enabled users to experience immersive shopping in virtual environments. As in a real-world store, the placement of products in a virtual store should appeal to shoppers, yet such placements can be time-consuming, tedious, and non-trivial to create manually. This work therefore introduces a novel approach for automatically optimizing product placement in virtual stores. Our approach considers product exposure and spatial constraints, applying an optimizer to search for optimal product placement solutions. We conducted qualitative scene-rationality and quantitative product-exposure experiments to validate our approach with users. The results show that the proposed approach can synthesize reasonable product placements and increase product exposure for different virtual stores.
Citations: 0
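The general shape of an exposure-driven placement search can be sketched in a few lines (a hedged toy, not the paper's optimizer; the slot weights, product priorities, and one-product-per-slot constraint are all hypothetical): score each assignment of products to shelf slots by exposure and keep the best.

```python
# Hedged sketch of exposure-maximizing placement; all names and weights
# below are invented for illustration, not taken from the paper.
import itertools

slot_exposure = {"eye_level": 1.0, "mid": 0.6, "bottom": 0.3}  # exposure weight per slot
product_priority = {"soda": 3.0, "chips": 2.0, "soap": 1.0}    # desired exposure per product

def placement_score(assignment):
    """Total exposure: sum of slot weight x product priority."""
    return sum(slot_exposure[s] * product_priority[p] for p, s in assignment.items())

def best_placement():
    """Exhaustive search over one-to-one product-to-slot assignments."""
    best, best_score = None, float("-inf")
    for perm in itertools.permutations(slot_exposure):
        assignment = dict(zip(product_priority, perm))
        score = placement_score(assignment)
        if score > best_score:
            best, best_score = assignment, score
    return best, best_score

assignment, score = best_placement()
# Higher-priority products land in higher-exposure slots.
```

Real store layouts make exhaustive search infeasible, which is presumably why the paper applies an optimizer over exposure and spatial-constraint terms rather than enumerating placements.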
A Compact Photochromic Occlusion Capable See-through Display with Holographic Lenses
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00039
Chun Wei Ooi, Yuichi Hiroi, Yuta Itoh
Abstract: Occlusion is a crucial visual element in optical see-through (OST) augmented reality; however, implementing occlusion in OST displays while addressing various design trade-offs is a difficult problem. In contrast to the traditional method of using spatial light modulators (SLMs) for the occlusion mask, using photochromic materials as occlusion masks can effectively eliminate diffraction artifacts in see-through views due to the lack of electronic pixels, thus providing superior see-through image quality. However, this design requires UV illumination to activate the photochromic material, which traditionally requires multiple SLMs, resulting in a larger form factor for the system. This paper presents a compact photochromic occlusion-capable OST design using multilayer, wavelength-dependent holographic optical lenses (HOLs). Our approach employs a single digital micromirror device (DMD) to form both the occlusion mask with UV light and a virtual image with visible light in a time-multiplexed manner. We demonstrate our proof-of-concept system on a bench-top setup and assess the appearance and contrast of the displayed images. We also suggest potential improvements to the current prototype to encourage the community to explore this occlusion approach.
Citations: 1
Redirected Walking Based on Historical User Walking Data
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00021
Cheng-Wei Fan, Sen-Zhe Xu, Peng Yu, Fang-Lue Zhang, Songhai Zhang
Abstract: With redirected walking (RDW) technology, people can explore large virtual worlds in smaller physical spaces. RDW controls the trajectory of the user's walking in the physical space through subtle adjustments, so as to minimize collisions between the user and the physical space. Previous predictive algorithms place constraints on the user's path according to the spatial layout of the virtual environment and work well when applicable, while reactive algorithms are more general for scenarios involving free exploration or unconstrained movement. However, even in relatively free environments, we can predict the user's walking to a certain extent by analyzing the user's historical walking data, which can inform the decision-making of reactive algorithms. This paper proposes a novel RDW method that improves real-time unrestricted RDW by analyzing and utilizing the user's historical walking data. In this method, the physical space is discretized by considering the user's location and orientation in the physical space. Using a weighted directed graph built from the user's historical walking data, we dynamically update the scores of different reachable poses in the physical space during the user's walking. We rank the scores and choose the optimal target position and orientation to guide the user to the best pose. Since simulation experiments have been shown to be effective in many previous RDW studies, we also provide a method to simulate user walking trajectories and generate a dataset. Experiments show that our method outperforms multiple state-of-the-art methods in various environments of different sizes and spatial layouts.
Citations: 0
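The discretize-and-score idea in this abstract can be sketched as follows (a minimal illustration under assumed details; the cell size, angle buckets, and incoming-edge-weight score are hypothetical choices, not the paper's exact formulation): past trajectories are binned into (x, y, orientation) pose cells, observed cell-to-cell transitions become weighted directed edges, and candidate target poses are ranked by those weights.

```python
# Minimal sketch of scoring reachable poses from historical walking data.
# Cell size (0.5 m), 45-degree orientation buckets, and the scoring rule
# (sum of incoming edge weights) are illustrative assumptions.
from collections import Counter

def discretize(x, y, theta_deg, cell=0.5, angle_bucket=45):
    """Map a continuous pose to a discrete (i, j, k) cell."""
    return (int(x // cell), int(y // cell),
            int(theta_deg // angle_bucket) % (360 // angle_bucket))

def build_transition_graph(trajectories):
    """Edge weight = how often one pose cell followed another."""
    edges = Counter()
    for traj in trajectories:
        cells = [discretize(*pose) for pose in traj]
        for a, b in zip(cells, cells[1:]):
            if a != b:
                edges[(a, b)] += 1
    return edges

def score_pose(edges, pose_cell):
    """Score a candidate target by the weight of edges entering its cell."""
    return sum(w for (a, b), w in edges.items() if b == pose_cell)

def best_target(edges, candidates):
    """Pick the candidate pose the user historically moved through most."""
    return max(candidates, key=lambda p: score_pose(edges, discretize(*p)))

# Two short trajectories walking the same corridor eastward.
history = [
    [(0.1, 0.1, 0.0), (0.6, 0.1, 0.0), (1.1, 0.1, 0.0)],
    [(0.1, 0.2, 0.0), (0.6, 0.2, 0.0), (1.1, 0.2, 0.0)],
]
edges = build_transition_graph(history)
target = best_target(edges, [(1.1, 0.1, 0.0), (3.1, 3.1, 0.0)])
```

In the paper these scores are updated dynamically during walking and feed a reactive steering decision; the static ranking above only shows the graph-scoring core.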
Measuring the Effect of Stereo Deficiencies on Peripersonal Space Pointing
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00063
Anil Ufuk Batmaz, M. H. Mughrabi, M. Sarac, Mayra Donaji Barrera Machuca, W. Stuerzlinger
Abstract: State-of-the-art Virtual Reality (VR) and Augmented Reality (AR) headsets rely on single-focal stereo displays. For objects away from the focal plane, such displays create a vergence-accommodation conflict (VAC), potentially degrading user interaction performance. In this paper, we study how the VAC affects pointing at targets within arm's reach with virtual-hand and raycasting interaction in current stereo display systems. We use a previously proposed experimental methodology that extends the ISO 9241-411:2015 multi-directional selection task to enable fair comparisons between selecting targets in different display conditions. We conducted a user study with eighteen participants, and the results indicate that participants were faster and had higher throughput in the constant-VAC condition with the virtual hand. We hope that our results enable designers to choose more efficient interaction methods in virtual environments.
Citations: 2
Exploring Neural Biomarkers in Young Adults Resistant to VR Motion Sickness: A Pilot Study of EEG
Pub Date: 2023-03-01 | DOI: 10.1109/VR55154.2023.00048
Gang Li, Katharina Margareta Theresa Pöhlmann, Mark Mcgill, C. Chen, S. Brewster, F. Pollick
Abstract: VR (Virtual Reality) Motion Sickness (VRMS) refers to purely visually induced motion sickness. Not everyone is susceptible to VRMS, but when it is experienced, nausea will often lead users to withdraw from ongoing VR applications. VRMS represents a serious challenge in the field of VR ergonomics and human factors. As other neuro-ergonomics researchers have done before, this paper treats VRMS as a brain-state problem, since various etiologies of VRMS support the claim that VRMS is caused by disagreement between the vestibular and visual sensory inputs. What sets this work apart from the existing literature, however, is that it explores anti-VRMS brain patterns via electroencephalogram (EEG) in VRMS-resistant individuals. Based on existing datasets from a previous study, we found enhanced theta activity in the left parietal cortex in VRMS-resistant individuals (N = 10) compared to VRMS-susceptible individuals (N = 10). Even though the sample size per se is not large, this finding achieved a medium effect size. This finding offers new hypotheses regarding how to reduce VRMS by enhancing brain function per se (e.g., via non-invasive transcranial electrostimulation techniques) without the need to redesign existing VR content.
Citations: 0