2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW): Latest Publications

Evaluating VR Sickness in VR Locomotion Techniques
T. V. Gemert, Joanna Bergström
{"title":"Evaluating VR Sickness in VR Locomotion Techniques","authors":"T. V. Gemert, Joanna Bergström","doi":"10.1109/VRW52623.2021.00078","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00078","url":null,"abstract":"VR Sickness is a form of motion sickness in Virtual Reality that affects 25-60% of the population. It is typically caused by exposure to mismatches between real and virtual motion, which happens in most VR Locomotion techniques. Hence, VR Locomotion and VR Sickness are intimately related, but this relationship is not reflected in the state of VR Sickness assessment. In this work we highlight the importance of understanding and quantifying VR Sickness in VR locomotion research. We discuss the most important factors and measures of VR to develop VR Sickness as a meaningful metric for VR Locomotion.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116506136","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
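The abstract above argues for treating VR sickness as a quantifiable metric in locomotion research. The most widely used instrument for that is the Simulator Sickness Questionnaire (SSQ); the paper does not prescribe it, so the following is only a minimal scoring sketch, assuming the standard Kennedy et al. weights (9.54 nausea, 7.58 oculomotor, 13.92 disorientation, 3.74 total) and an illustrative item-to-subscale mapping.

```python
# Simulator Sickness Questionnaire (SSQ) scoring sketch.
# Weights follow Kennedy et al. (1993); the item lists below are illustrative
# and are not taken from the paper summarized above.

NAUSEA_ITEMS = ["general_discomfort", "increased_salivation", "sweating",
                "nausea", "difficulty_concentrating", "stomach_awareness", "burping"]
OCULOMOTOR_ITEMS = ["general_discomfort", "fatigue", "headache", "eyestrain",
                    "difficulty_focusing", "difficulty_concentrating", "blurred_vision"]
DISORIENTATION_ITEMS = ["difficulty_focusing", "nausea", "fullness_of_head",
                        "blurred_vision", "dizzy_eyes_open", "dizzy_eyes_closed", "vertigo"]

def ssq_scores(responses: dict) -> dict:
    """responses maps each symptom name to a 0-3 severity rating."""
    n_raw = sum(responses.get(item, 0) for item in NAUSEA_ITEMS)
    o_raw = sum(responses.get(item, 0) for item in OCULOMOTOR_ITEMS)
    d_raw = sum(responses.get(item, 0) for item in DISORIENTATION_ITEMS)
    return {
        "nausea": n_raw * 9.54,
        "oculomotor": o_raw * 7.58,
        "disorientation": d_raw * 13.92,
        "total": (n_raw + o_raw + d_raw) * 3.74,
    }

# Example: a participant reporting moderate nausea and slight dizziness/headache.
print(ssq_scores({"nausea": 2, "dizzy_eyes_open": 1, "headache": 1}))
```

Logging such scores per locomotion technique and per exposure duration is one way to make sickness comparable across techniques, which is the direction the position paper advocates.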
An Overview of Group Navigation in Multi-User Virtual Reality
Tim Weissker, Pauline Bimberg, B. Fröhlich
{"title":"An Overview of Group Navigation in Multi-User Virtual Reality","authors":"Tim Weissker, Pauline Bimberg, B. Fröhlich","doi":"10.1109/VRW52623.2021.00073","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00073","url":null,"abstract":"Group navigation techniques can allow both collocated and distributed collaborators to explore a shared virtual environment together. In this paper, we review the different facets, the resulting challenges, and previous implementations of group navigation in the literature and derive four broad and non-exclusive topic areas for future research on the subject. Our overarching goal is to underline the importance of optimizing navigation processes for groups and to increase the awareness of group navigation techniques as a relevant solution approach in this regard.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116675617","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
Analyzing Visual Perception and Predicting Locomotion using Virtual Reality and Eye Tracking
Niklas Stein
{"title":"Analyzing Visual Perception and Predicting Locomotion using Virtual Reality and Eye Tracking","authors":"Niklas Stein","doi":"10.1109/VRW52623.2021.00246","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00246","url":null,"abstract":"Locomotion and vison are closely linked. When users explore virtual environments by walking they rely on stable visible landmarks to plan and execute their next movement. In my research I am developing novel methods to predict locomotion paths of human subjects for the immediate future, i.e. the next few seconds. I aim to connect different types of behavioral data (eye, hand, feet and head tracking) and test their reliability and validity for predicting walking behavior in virtual reality. Such a prediction will be very valuable for natural interaction, for example in redirected walking schemes.My approach begins with an evaluation of the quality of data gathered with current tracking methods. Informative experimental conditions need to be developed to find meaningful patterns in natural walking. Next, raw tracked data of different modalities need to be connected with each other and aggregated in a useful way. Thereafter, possible valid predictors need to be developed and compared to already functioning predicting algorithms (e.g. [2],[6],[12]). As a final goal, all valid predictors shall be used to create a prediction algorithm returning the most likely future path when exploring virtual environments.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115039853","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
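The thesis work above aims to predict a walker's path a few seconds ahead from multimodal tracking data. As a point of comparison rather than the author's method, here is a minimal constant-velocity baseline that extrapolates future head positions from recent samples; any learned predictor would need to beat such a baseline.

```python
import numpy as np

def predict_path(positions: np.ndarray, timestamps: np.ndarray,
                 horizon_s: float = 2.0, steps: int = 20) -> np.ndarray:
    """Linearly extrapolate future 2D head positions on the floor plane.

    positions: (N, 2) array of recent head positions.
    timestamps: (N,) array of sample times in seconds.
    Returns (steps, 2) predicted positions up to horizon_s ahead.
    This is a constant-velocity baseline, not the predictor proposed above.
    """
    # Least-squares fit of a straight line to each coordinate over time.
    vx, x0 = np.polyfit(timestamps, positions[:, 0], 1)
    vy, y0 = np.polyfit(timestamps, positions[:, 1], 1)
    future_t = timestamps[-1] + np.linspace(0.0, horizon_s, steps)
    return np.column_stack((vx * future_t + x0, vy * future_t + y0))

# Example: half a second of samples at ~10 Hz, walking along +x at 1.2 m/s.
t = np.linspace(0.0, 0.5, 6)
p = np.column_stack((t * 1.2, np.zeros_like(t)))
print(predict_path(p, t)[-1])  # ~[3.0, 0.0], i.e. the position expected at t = 2.5 s
```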
CeVRicale: A VR app for Cervical Rehabilitation
Arnaldo Cesco, Francesco Ballardin, G. Marfia
{"title":"CeVRicale: A VR app for Cervical Rehabilitation","authors":"Arnaldo Cesco, Francesco Ballardin, G. Marfia","doi":"10.1109/VRW52623.2021.00203","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00203","url":null,"abstract":"We propose CeVRicale, a cervical rehabilitation application based on the use of virtual reality (VR). CeVRicale is smartphone-based, thus it may be available to larger shares of population when compared to those applications that are implemented for head mounted displays such as HTC Vive or Oculus Rift. The app exploits a smartphone’s sensor to track head movements in five exergames inspired by rehabilitation exercises. This project is the first step in a study to evaluate the effectiveness and efficiency of a low cost VR application in the treatment of cervical musculoskeletal disorders.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124550958","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
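The abstract says the phone's sensors track head movement in five exergames but gives no implementation details, so the following range-of-motion check is purely hypothetical: it assumes an exergame that rewards reaching a target angular range in neck flexion/extension (pitch) and rotation (yaw).

```python
def range_of_motion(samples, target_deg=40.0):
    """Hypothetical exergame check, not CeVRicale's actual logic.

    samples: list of (pitch_deg, yaw_deg) orientation readings from the phone
    mounted in a headset. Returns the achieved range on each axis and whether
    an assumed target range was reached.
    """
    pitches = [p for p, _ in samples]
    yaws = [y for _, y in samples]
    pitch_range = max(pitches) - min(pitches)
    yaw_range = max(yaws) - min(yaws)
    return {
        "pitch_range_deg": pitch_range,
        "yaw_range_deg": yaw_range,
        "pitch_target_met": pitch_range >= target_deg,
        "yaw_target_met": yaw_range >= target_deg,
    }

# Example: the user flexes the neck about 35 degrees and rotates about 50 degrees.
print(range_of_motion([(0, -25), (-20, 0), (15, 25)]))
```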
Requirements Gathering for VR Simulators for Training: Lessons Learned for Globally Dispersed Teams
Vivian Gómez, Kelly Peñaranda, P. Figueroa
{"title":"Requirements Gathering for VR Simulators for Training: Lessons Learned for Globally Dispersed Teams","authors":"Vivian Gómez, Kelly Peñaranda, P. Figueroa","doi":"10.1109/VRW52623.2021.00108","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00108","url":null,"abstract":"We report an empirical study on the use of current VR technologies for requirements gathering in the field of simulation and training. We used synchronous and asynchronous traditional techniques plus collaborative virtual environments such as MozillaHubs and AltspaceVR. Our results show that requirements gathering in VR makes a difference in the process of requirements identification. We report advantages and shortcomings that can be useful for future practitioners. For example, we found that VR sessions allowed for better identification of dimensions and sizes. VR sessions for requirements gathering could also benefit from better pointers and better sound.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"80 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125271911","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Analysis of Positional Tracking Space Usage when using Teleportation
Aniruddha Prithul, Eelke Folmer
{"title":"Analysis of Positional Tracking Space Usage when using Teleportation","authors":"Aniruddha Prithul, Eelke Folmer","doi":"10.1109/VRW52623.2021.00122","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00122","url":null,"abstract":"Teleportation is a widely used virtual locomotion technique that allows users to navigate beyond the confines of available tracking space with a low possibility of inducing VR sickness. Because teleportation requires little physical effort and lets users traverse large distances instantly, a risk is that over time users might only use teleportation and abandon walking input. This paper provides insight into this risk by presenting results from a study that analyzes tracking space usage of three popular commercially available VR games that rely on teleportation. Our study confirms that positional tracking usage is limited by the use of teleportation.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123650368","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
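The study above quantifies how much of the positional tracking space users cover while relying on teleportation. The exact metric is not stated in the abstract; one plausible measure, sketched below under that assumption, is the area of the convex hull of logged head positions relative to the available play area.

```python
import numpy as np
from scipy.spatial import ConvexHull

def tracking_space_usage(head_positions_xz: np.ndarray, play_area_m2: float) -> float:
    """Fraction of the play area covered by the convex hull of the user's
    head positions projected onto the floor plane (x, z)."""
    hull = ConvexHull(head_positions_xz)
    # For 2D input, ConvexHull.volume is the enclosed area (ConvexHull.area is the perimeter).
    return hull.volume / play_area_m2

# Example: a teleportation session where the user stays within roughly
# 1 m x 1 m of a 2 m x 2 m play area.
rng = np.random.default_rng(0)
positions = rng.uniform(-0.5, 0.5, size=(500, 2))
print(f"{tracking_space_usage(positions, play_area_m2=4.0):.0%}")  # roughly 25%
```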
RED: A Real-Time Datalogging Toolkit for Remote Experiments
Sam Adeniyi, Evan Suma Rosenberg, Jerald Thomas
{"title":"RED: A Real-Time Datalogging Toolkit for Remote Experiments","authors":"Sam Adeniyi, Evan Suma Rosenberg, Jerald Thomas","doi":"10.1109/VRW52623.2021.00183","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00183","url":null,"abstract":"The ability to conduct experiments on virtual reality systems has become increasingly compelling as the world continues to migrate towards remote research, affecting the feasibility of conducting in-person studies with human participants. The Remote Experiment Datalogger (RED) Toolkit is an open-source library designed to simplify the administration of remote experiments requiring continuous real-time data collection. Our design consists of a REST server, implemented using the Flask framework, and a client API for transparent integration with multiple game engines. We foresee the RED Toolkit serving as a building block for the handling of future remote experiments across a multitude of circumstances.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130625728","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
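RED is described as a Flask-based REST server with a thin client API for game engines. The toolkit's actual routes and payload fields are not given in the abstract, so the endpoint below is a hypothetical minimal sketch of continuous datalogging in that style, not RED's real API.

```python
# Hypothetical minimal datalogging endpoint in the spirit of RED;
# the /log route and the JSON field names are assumptions, not the toolkit's API.
import csv
import time
from flask import Flask, request, jsonify

app = Flask(__name__)
LOG_PATH = "session_log.csv"

@app.route("/log", methods=["POST"])
def log_sample():
    # e.g. {"participant": "P01", "event": "teleport", "x": 1.2, "y": 1.6, "z": -0.4}
    sample = request.get_json(force=True)
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([time.time(),
                                sample.get("participant"),
                                sample.get("event"),
                                sample.get("x"),
                                sample.get("y"),
                                sample.get("z")])
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

A game-engine client would then POST one JSON sample per event (or a batch per frame) to http://<server>:5000/log over the participant's home network.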
Matching 2D Image Patches and 3D Point Cloud Volumes by Learning Local Cross-domain Feature Descriptors
Weiquan Liu, Baiqi Lai, Cheng Wang, Xuesheng Bian, Chenglu Wen, Ming Cheng, Yu Zang, Yan Xia, Jonathan Li
{"title":"Matching 2D Image Patches and 3D Point Cloud Volumes by Learning Local Cross-domain Feature Descriptors","authors":"Weiquan Liu, Baiqi Lai, Cheng Wang, Xuesheng Bian, Chenglu Wen, Ming Cheng, Yu Zang, Yan Xia, Jonathan Li","doi":"10.1109/VRW52623.2021.00140","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00140","url":null,"abstract":"Establishing the relationship of 2D images and 3D point clouds is a solution to establish the spatial relationship between 2D and 3D space, i.e. AR virtual-real registration. In this paper, we propose a network, 2D3D-GAN-Net, to learn the local invariant cross-domain feature descriptors of 2D image patches and 3D point cloud volumes. Then, the learned local invariant cross-domain feature descriptors are used for matching 2D images and 3D point clouds. The Generative Adversarial Networks (GAN) is embedded into the 2D3D-GANNet, which is used to distinguish the source of the learned feature descriptors, facilitating the extraction of invariant local cross-domain feature descriptors. Experiments show that the local cross-domain feature descriptors learned by 2D3D-GAN-Net are robust, and can be used for cross-dimensional retrieval on the 2D image patches and 3D point cloud volumes dataset. In addition, the learned 3D feature descriptors are used to register the point cloud for demonstrating the robustness of learned local cross-domain feature descriptors.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"412 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126588254","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
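Once 2D3D-GAN-Net maps image patches and point cloud volumes into a shared descriptor space, cross-domain matching reduces to nearest-neighbour search over those descriptors. The sketch below shows only that matching step; the descriptor dimensionality and the ratio-test threshold are assumptions, not values from the paper.

```python
import numpy as np

def match_descriptors(desc_2d: np.ndarray, desc_3d: np.ndarray, ratio: float = 0.8):
    """Match each 2D-patch descriptor to its nearest 3D-volume descriptor.

    desc_2d: (N, D) descriptors of image patches.
    desc_3d: (M, D) descriptors of point cloud volumes in the same learned space.
    Applies Lowe's ratio test to reject ambiguous matches.
    Returns a list of (patch_index, volume_index) pairs.
    """
    matches = []
    for i, d in enumerate(desc_2d):
        dists = np.linalg.norm(desc_3d - d, axis=1)
        nearest, second = np.argsort(dists)[:2]
        if dists[nearest] < ratio * dists[second]:
            matches.append((i, int(nearest)))
    return matches

# Example with random 128-D descriptors (the dimension is an assumption):
# noisy copies of the first ten 3D descriptors should match indices 0..9.
rng = np.random.default_rng(1)
d3 = rng.normal(size=(50, 128))
d2 = d3[:10] + 0.05 * rng.normal(size=(10, 128))
print(match_descriptors(d2, d3))
```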
A Shared Haptic Virtual Environment for Dental Surgical Skill Training
Maximilian Kaluschke, Myat Su Yin, P. Haddawy, N. Srimaneekarn, P. Saikaew, G. Zachmann
{"title":"A Shared Haptic Virtual Environment for Dental Surgical Skill Training","authors":"Maximilian Kaluschke, Myat Su Yin, P. Haddawy, N. Srimaneekarn, P. Saikaew, G. Zachmann","doi":"10.1109/VRW52623.2021.00069","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00069","url":null,"abstract":"Online learning has become an effective approach to reach students who may not be able to travel to university campuses for various reasons. Its use has also dramatically increased during the current COVID-19 pandemic with social distancing and lockdown requirements. But online education has thus far been primarily limited to teaching of knowledge and cognitive skills. There is yet almost no use of online education for teaching of physical clinical skills.In this paper, we present a shared haptic virtual environment for dental surgical skill training. The system provides the teacher and student with a shared environment containing a virtual dental station with patient, a dental drill controlled by a haptic device, and a drillable tooth. It also provides automated scoring of procedure outcomes. We discuss a number of optimizations used in order to provide the high-fidelity simulation and real-time performance needed for training of high-precision clinical skills. Since tactile, in particular kinaesthetic, sense is essential in carrying out many dental procedures, an important question is how to best teach this in a virtual environment. In order to support exploring this, our system includes three modes for transmitting haptic sensations from the user performing the procedure to the user observing.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"112 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127725300","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
Text Selection in AR-HMD Using a Smartphone as an Input Device
Rajkumar Darbar, Joan Odicio-Vilchez, Thibault Lainé, Arnaud Prouzeau, M. Hachet
{"title":"Text Selection in AR-HMD Using a Smartphone as an Input Device","authors":"Rajkumar Darbar, Joan Odicio-Vilchez, Thibault Lainé, Arnaud Prouzeau, M. Hachet","doi":"10.1109/VRW52623.2021.00145","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00145","url":null,"abstract":"Text selection is a common task while reading a PDF file or browsing the web. Efficient text selection techniques exist on desktops and touch devices, but are still under-explored for Augmented Reality Head Mounted Display (AR-HMD). Performing text selection in AR commonly uses hand-tracking, voice commands, and eye/head-gaze, which are cumbersome and lack precision. In this poster paper, we explore the use of a smartphone as an input device to support text selection in AR-HMD because of its availability, familiarity, and social acceptability. As an initial attempt, we propose four eyes-free, uni-manual text selection techniques for AR-HMD, all using a smartphone - continuous touch, discrete touch, spatial movement, and raycasting.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121499662","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
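Among the four proposed techniques, raycasting is the most geometric: the smartphone's pose defines a ray, and the character it hits on a virtual text panel becomes the selection cursor. The sketch below shows that ray-panel intersection only; the panel layout and character metrics are hypothetical, not the paper's implementation.

```python
import numpy as np

def ray_hit_character(origin, direction, panel_origin, panel_normal,
                      panel_right, panel_up, char_w=0.02, line_h=0.05):
    """Intersect a smartphone-defined ray with a flat text panel and return
    the (line, column) of the character hit, or None if the ray misses.
    Panel placement, character width and line height are assumed values."""
    direction = direction / np.linalg.norm(direction)
    denom = np.dot(panel_normal, direction)
    if abs(denom) < 1e-6:                      # ray parallel to the panel
        return None
    t = np.dot(panel_normal, panel_origin - origin) / denom
    if t < 0:                                  # panel is behind the phone
        return None
    hit = origin + t * direction
    local = hit - panel_origin                 # offset within the panel plane
    col = int(np.dot(local, panel_right) // char_w)
    line = int(-np.dot(local, panel_up) // line_h)
    return (line, col) if line >= 0 and col >= 0 else None

# Phone held at chest height, panel 1.5 m in front, pointing slightly right and down.
print(ray_hit_character(origin=np.array([0.0, 1.6, 0.0]),
                        direction=np.array([0.1, -0.05, -1.0]),
                        panel_origin=np.array([0.0, 1.7, -1.5]),
                        panel_normal=np.array([0.0, 0.0, 1.0]),
                        panel_right=np.array([1.0, 0.0, 0.0]),
                        panel_up=np.array([0.0, 1.0, 0.0])))  # -> (3, 7)
```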