{"title":"Exploring Bi-Directional Pinpointing Techniques for Cross-Reality Collaboration","authors":"Priyanka Pazhayedath, Pedro Belchior, Rafael Prates, Filipe Silveira, D. Lopes, Robbe Cools, Augusto Esteves, A. Simeone","doi":"10.1109/VRW52623.2021.00055","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00055","url":null,"abstract":"Virtual Reality (VR) technology enables users to immerse themselves in artificial worlds. However, it isolates users from the outside world and impedes them from collaborating with other users who might be outside of the VR experience, and vice-versa. We implemented two systems in which we explore how such an external user in the real world can interact across realities with a user immersed in virtual reality, either locally or remotely, in order to share pinpoint locations. In the first system, we investigate three cross-reality techniques for the external user to draw the attention of their VR counterpart to specific objects present in the virtual environment (Voice, Highlight, and Arrow). Participants performed best with, and preferred, the Arrow technique, followed by the Highlight technique. In the second system, we expand on these two techniques to explore an even starker cross-reality interaction between users in VR and users interacting via a tablet computer to direct each other to pinpoint objects in the scene. We adapted the previous two techniques and implemented two others (Vision cone, Pointing) that support bi-directional communication between users. 
When it comes to bi-directional pinpointing, VR users still showed a preference for the Arrow technique (now described as Pointing in Giant mode), while mobile users were split between the Vision cone and the Highlight techniques.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"176 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134624312","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Analyzing Visual Perception and Predicting Locomotion using Virtual Reality and Eye Tracking","authors":"Niklas Stein","doi":"10.1109/VRW52623.2021.00246","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00246","url":null,"abstract":"Locomotion and vision are closely linked. When users explore virtual environments by walking, they rely on stable visible landmarks to plan and execute their next movement. In my research I am developing novel methods to predict the locomotion paths of human subjects for the immediate future, i.e. the next few seconds. I aim to connect different types of behavioral data (eye, hand, feet and head tracking) and test their reliability and validity for predicting walking behavior in virtual reality. Such a prediction would be very valuable for natural interaction, for example in redirected walking schemes. My approach begins with an evaluation of the quality of data gathered with current tracking methods. Informative experimental conditions need to be developed to find meaningful patterns in natural walking. Next, raw tracking data of different modalities need to be connected with each other and aggregated in a useful way. Thereafter, possible valid predictors need to be developed and compared to existing prediction algorithms (e.g. [2],[6],[12]). 
As a final goal, all valid predictors shall be used to create a prediction algorithm returning the most likely future path when exploring virtual environments.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115039853","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Overview of Group Navigation in Multi-User Virtual Reality","authors":"Tim Weissker, Pauline Bimberg, B. Fröhlich","doi":"10.1109/VRW52623.2021.00073","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00073","url":null,"abstract":"Group navigation techniques can allow both collocated and distributed collaborators to explore a shared virtual environment together. In this paper, we review the different facets, the resulting challenges, and previous implementations of group navigation in the literature and derive four broad and non-exclusive topic areas for future research on the subject. Our overarching goal is to underline the importance of optimizing navigation processes for groups and to increase the awareness of group navigation techniques as a relevant solution approach in this regard.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116675617","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"[DC] Embodying an avatar with an asymmetrical lower body to modulate the dynamic characteristics of gait initiation","authors":"Valentin Vallageas, R. Aissaoui, David R. Labbé","doi":"10.1109/VRW52623.2021.00245","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00245","url":null,"abstract":"Virtual reality (VR) enables the user to perceive body ownership towards a virtual body. This illusion is induced through a first-person perspective (1PP) and synchronous movement with the real body. Previous studies have shown that pronounced differences between the real and the virtual body lead to changes in the user’s behavior. It has also been shown that modifying the body image can affect the user’s movements. Nevertheless, the state of the art does not address the kinetic and kinematic impacts of deforming a single virtual lower limb. Therefore, this paper presents a methodology exploring the impact of a self-avatar with an asymmetrical lower body (one limb longer or larger than the other) on the dynamic characteristics of the user during a gait initiation task.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"140 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117050891","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effects of Immersion and Visual Angle on Brand Placement Effectiveness","authors":"S. Oberdörfer, Samantha Straka, Marc Erich Latoschik","doi":"10.1109/VRW52623.2021.00102","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00102","url":null,"abstract":"Typical inherent properties of immersive Virtual Reality (VR) such as felt presence might have an impact on how well brand placements are remembered. In this study, we exposed participants to brand placements in four conditions of varying degrees of immersion and visual angle on the stimulus. Placements appeared either as a poster or as a puzzle. We measured the recall and recognition of these placements. Our study revealed that neither immersion nor the visual angle had a significant impact on memory for brand placements.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"112 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115751771","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Analysis of Positional Tracking Space Usage when using Teleportation","authors":"Aniruddha Prithul, Eelke Folmer","doi":"10.1109/VRW52623.2021.00122","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00122","url":null,"abstract":"Teleportation is a widely used virtual locomotion technique that allows users to navigate beyond the confines of available tracking space with a low possibility of inducing VR sickness. Because teleportation requires little physical effort and lets users traverse large distances instantly, a risk is that over time users might only use teleportation and abandon walking input. This paper provides insight into this risk by presenting results from a study that analyzes tracking space usage of three popular commercially available VR games that rely on teleportation. Our study confirms that positional tracking usage is limited by the use of teleportation.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123650368","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interactive Narrative Facial Expression Animation Generation by Intuitive Curve Drawing","authors":"Yanxiang Zhang, Y. Ling","doi":"10.1109/VRW52623.2021.00088","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00088","url":null,"abstract":"This paper presents an interactive facial expression animation generation system based on traditional montage techniques and generative narrative concepts. It allows users to produce narrative facial expression animations by interactively and intuitively drawing a plot curve over pre-recorded facial expression animation clips, thus providing users with high flexibility and creativity through engaging interaction.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121684473","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Requirements Gathering for VR Simulators for Training: Lessons Learned for Globally Dispersed Teams","authors":"Vivian Gómez, Kelly Peñaranda, P. Figueroa","doi":"10.1109/VRW52623.2021.00108","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00108","url":null,"abstract":"We report an empirical study on the use of current VR technologies for requirements gathering in the field of simulation and training. We used synchronous and asynchronous traditional techniques plus collaborative virtual environments such as MozillaHubs and AltspaceVR. Our results show that requirements gathering in VR makes a difference in the process of requirements identification. We report advantages and shortcomings that can be useful for future practitioners. For example, we found that VR sessions allowed for better identification of dimensions and sizes. VR sessions for requirements gathering could also benefit from better pointers and better sound.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"80 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125271911","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"CeVRicale: A VR app for Cervical Rehabilitation","authors":"Arnaldo Cesco, Francesco Ballardin, G. Marfia","doi":"10.1109/VRW52623.2021.00203","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00203","url":null,"abstract":"We propose CeVRicale, a cervical rehabilitation application based on the use of virtual reality (VR). CeVRicale is smartphone-based, and thus may be accessible to a larger share of the population than applications implemented for head-mounted displays such as the HTC Vive or Oculus Rift. The app exploits a smartphone’s sensors to track head movements in five exergames inspired by rehabilitation exercises. This project is the first step in a study to evaluate the effectiveness and efficiency of a low-cost VR application in the treatment of cervical musculoskeletal disorders.","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124550958","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MyChanges: Tools for the co-designing of housing transformations","authors":"S. Eloy, Micaela Raposo, F. Costa, P. Vermaas","doi":"10.1109/VRW52623.2021.00265","DOIUrl":"https://doi.org/10.1109/VRW52623.2021.00265","url":null,"abstract":"MyChanges is a prototype tool for generating and visualizing architectonic modifications of existing housing in co-design projects with inhabitants. Our hypothesis is that giving inhabitants design solutions that fit their individual needs and aspirations will increase their satisfaction with their house. To arrive at architectonically responsible house transformations, we used a shape grammar system to define the possible modifications [1]. To empower inhabitants to understand and explore these modifications to their housing and increase the potential of their participation [2], we developed a mockup tool that comprises two main parts: shape generation and visualization. The shape generation component is currently a mockup simulation that reproduces some of the generation possibilities of the grammar. For visualizing the outcomes, we developed three different options: i) a semi-immersive visualization where the user utilizes a smartphone to see a 360º render of the site; ii) a fully immersive visualization developed with Unity in which the user, wearing a Head Mounted Display, can freely navigate through the final design; iii) a non-immersive screen-based visualization where the user, with a tablet device, views a static image of the final design. 
Interviews and tests with real inhabitants (n=12) were performed to assess users’ responses to the potential of the tools; preliminary conclusions show that a tool like MyChanges would find acceptance among inhabitants [3].","PeriodicalId":256204,"journal":{"name":"2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121109088","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}