{"title":"Effects of Latency Jitter on Simulator Sickness in a Search Task","authors":"Jan-Philipp Stauffert, Florian Niebling, Marc Erich Latoschik","doi":"10.1109/VR.2018.8446195","DOIUrl":"https://doi.org/10.1109/VR.2018.8446195","abstract":"Low latency is a fundamental requirement for Virtual Reality (VR) systems to reduce the potential risks of cybersickness and to increase effectiveness, efficiency and user experience. In contrast to the effects of uniform latency degradation, the influence of latency jitter on user experience in VR is not well researched, although today's consumer VR systems are vulnerable in this respect. In this work we report on the impact of latency jitter on cybersickness in HMD-based VR environments. Test subjects were given a search task in Virtual Reality that provokes both head rotation and translation. One group experienced artificially added latency jitter in the tracking data of their head-mounted display. The introduced jitter pattern was a replication of real-world latency behavior extracted and analyzed from an existing example VR system. The effects of the introduced latency jitter were measured with the self-reported simulator sickness questionnaire (SSQ) and with physiological measurements. We found a significant increase in self-reported simulator sickness. We therefore argue that measuring and controlling latency based on average values taken at a few time intervals is not enough to ensure the required timeliness; latency jitter needs to be considered when designing experiences for Virtual Reality.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
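The manipulation described in this abstract, replaying a recorded real-world jitter trace on top of a head-tracking stream, can be sketched as follows. This is a hypothetical illustration, not the authors' code; the sample rate, base latency, and jitter values are invented for the example.

```python
# Hypothetical sketch: inject a replayed latency-jitter trace into a stream of
# head-tracking samples. Each sample is delivered after its nominal base latency
# plus a jitter value replayed from a recorded trace (values in milliseconds).

def inject_jitter(samples, jitter_trace_ms, base_latency_ms=20.0):
    """samples: list of (timestamp_ms, pose); returns (delivery_ms, pose) pairs."""
    delayed = []
    for i, (t_ms, pose) in enumerate(samples):
        jitter = jitter_trace_ms[i % len(jitter_trace_ms)]  # replay recorded trace
        delayed.append((t_ms + base_latency_ms + jitter, pose))
    # A jitter spike larger than the sample interval reorders deliveries; sorting
    # models the consumer seeing samples in arrival order.
    delayed.sort(key=lambda x: x[0])
    return delayed

samples = [(i * 11.1, f"pose{i}") for i in range(5)]  # ~90 Hz tracker (assumed)
trace = [0.0, 2.5, 40.0, 1.0, 3.0]                    # spiky jitter trace (invented)
print(inject_jitter(samples, trace))
```

Note how the 40 ms spike makes pose2 arrive after pose3 and pose4, the kind of non-uniform timing behavior the paper argues average-latency measurements fail to capture.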
{"title":"Soft Hand Simulation for Smooth and Robust Natural Interaction","authors":"M. Verschoor, Daniel Lobo, M. Otaduy","doi":"10.1109/VR.2018.8447555","DOIUrl":"https://doi.org/10.1109/VR.2018.8447555","abstract":"Natural hand-based interaction should feature hand motion that adapts smoothly to the tracked user's motion, reacts robustly to contact with objects in a virtual environment, and enables dexterous manipulation of these objects. In our work, we enable all these properties thanks to an efficient soft hand simulation model. This model integrates an articulated skeleton, nonlinear soft tissue and frictional contact to provide the realism necessary for natural interaction. Robust and smooth interaction is made possible by simulating in a single energy minimization framework all the mechanical energy exchanges among elements of the hand: coupling between the hand's skeleton and the user's motion, constraints at skeletal joints, nonlinear soft skin deformation, coupling between the hand's skeleton and the soft skin, frictional contact between the skin and virtual objects, and coupling between a grasped object and other virtual objects. We have focused on modeling all elements of the hand that provide realism and natural interaction, while ensuring minimal and bounded computational cost, which is key to smooth and robust interaction. As a result, we deliver hand simulation as an asset that can be connected to diverse input tracking devices and seamlessly integrated into game engines for fast deployment in VR applications.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
{"title":"Encounter-Type Haptic Interfaces for Virtual Reality Musical Instruments","authors":"Alberto Boem, Hiroo Iwata","doi":"10.1109/VR.2018.8446549","DOIUrl":"https://doi.org/10.1109/VR.2018.8446549","abstract":"This paper summarizes the author's interest in haptic interfaces for Virtual Reality Musical Instruments. The current research focuses on finding interfaces that can improve physical interaction and presence with such instruments. Musical expression is a topic rarely addressed in the field of Virtual Reality. Over the years, the author has explored different systems and concepts while searching for a Ph.D. thesis topic. These include the development and evaluation of deformable input surfaces and shape-changing interfaces. The results from these implementations led us to investigate encounter-type haptics, a method that has never received proper consideration in the design of virtual musical instruments. This represents the current stage of our research; the exact direction of the Ph.D. thesis is still being determined. In this paper, we describe the background and motivation behind this research, together with the research hypotheses developed so far.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
{"title":"The Effect of Immersion on Emotional Responses to Film Viewing in a Virtual Environment","authors":"Aelee Kim, Minha Chang, Yeseul Choi, Sohyeon Jeon, Kyoungmin Lee","doi":"10.1109/VR.2018.8446046","DOIUrl":"https://doi.org/10.1109/VR.2018.8446046","abstract":"In this study, we explore how immersion affects people's emotional responses in a virtual environment. The primary goals of this study are to analyze the possible use of virtual reality (VR) as an affective medium and to investigate the relationship between immersion and emotion. To this end, we compared two viewing conditions (HMD vs. No-HMD) and applied two types of emotional content (horror and empathy) to examine whether the level of immersion could influence emotional responses. The results showed that viewers who watched the horror movie using an HMD felt more scared than those in the No-HMD condition. However, there were no significant emotional differences between the HMD and No-HMD conditions for the groups exposed to the empathy content. These results suggest that the effect of an immersive viewing experience on emotional responses in VR is deeply related to the degree of arousal and strong perceptual cues. The horror movie used in this study included the intense visual and audio stimuli found in the typical horror film format. In contrast, viewers experienced less stimulating perceptual input when watching the empathetic movie. In conclusion, VR elicits a more immersive experience and greater emotional responses to the horror film. This study confirms the efficacy of VR as an emotional amplifier and demonstrates an important association between immersion and emotion in VR.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
{"title":"Analysis of Proximity-Based Multimodal Feedback for 3D Selection in Immersive Virtual Environments","authors":"Oscar Ariza, G. Bruder, Nicholas Katzakis, Frank Steinicke","doi":"10.1109/VR.2018.8446317","DOIUrl":"https://doi.org/10.1109/VR.2018.8446317","abstract":"Interaction tasks in virtual reality (VR) such as three-dimensional (3D) selection or manipulation of objects often suffer from reduced performance because the feedback provided by VR systems is missing or differs from that of corresponding real-world interactions. Vibrotactile and auditory feedback have been suggested as additional perceptual cues complementing the visual channel to improve interaction in VR. However, it has rarely been shown that multimodal feedback improves performance or reduces errors during 3D object selection. Little research has been conducted in the area of proximity-based multimodal feedback, in which stimulus intensities depend on spatiotemporal relations between the input device and the virtual target object. In this paper, we analyze the effects of unimodal and bimodal feedback provided through the visual, auditory and tactile modalities while users perform 3D object selections in virtual environments (VEs), comparing both binary and continuous proximity-based feedback. We conducted a Fitts' Law experiment and evaluated the different feedback approaches. The results show that the feedback types affect the ballistic and correction phases of the selection movement and significantly influence user performance.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
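The Fitts' Law methodology this abstract relies on can be illustrated with the standard Shannon formulation used in ISO 9241-9-style selection experiments. This is a generic sketch, not the authors' analysis code; the distance, width, and movement-time values below are invented for the example.

```python
import math

# Shannon formulation of Fitts' law: index of difficulty (ID) quantifies how
# hard a selection is, and throughput normalizes performance across conditions.

def index_of_difficulty(distance, width):
    """ID in bits: log2(D / W + 1), with D = target distance, W = target width."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Throughput in bits/s: ID / MT, a common performance measure."""
    return index_of_difficulty(distance, width) / movement_time_s

# Example (invented values): a 0.75 m reach to a 0.25 m target in 0.5 s.
print(index_of_difficulty(0.75, 0.25))  # 2.0 bits
print(throughput(0.75, 0.25, 0.5))      # 4.0 bits/s
```

Comparing throughput across feedback conditions at matched ID is what allows an experiment like this one to separate feedback effects from task difficulty.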
{"title":"The Effect of Haptic Prediction Accuracy on Presence","authors":"Dominik Gall, Marc Erich Latoschik","doi":"10.1109/VR.2018.8446153","DOIUrl":"https://doi.org/10.1109/VR.2018.8446153","abstract":"This paper reports on the effect of the visually-anchored prediction accuracy of haptic information on the perceived presence of virtual environments. We designed an experiment which explicitly prevented confounding factors potentially introduced by virtual body ownership and/or agency. The experimental design consisted of two main conditions defining congruent vs. incongruent visual and haptic cues. Presence was measured during as well as after exposure. A distance estimation task based solely on motor action and the visually-anchored spatial model of the environment was executed to control for perceptual binding. Fifty-six healthy volunteers were randomly assigned to one of two groups in a single-blind mixed-group design study. The study revealed increased presence for high prediction accuracy and decreased presence for low prediction accuracy, while perceptual binding still occurred. The observed effect sizes were in the medium range. The results indicate a significant correlation between the prediction accuracy of haptic information and the perceived realness and presence of a virtual environment, which gives rise to a discussion of models for the dissociative symptom of derealisation.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
{"title":"Evaluation of Optical See-Through Head-Mounted Displays in Training for Critical Care and Trauma","authors":"Ehsan Azimi, A. Winkler, E. Tucker, Long Qian, Manyu Sharma, J. Doswell, Nassir Navab, P. Kazanzides","doi":"10.1109/VR.2018.8446583","DOIUrl":"https://doi.org/10.1109/VR.2018.8446583","abstract":"One major cause of preventable death is a lack of proper skills for providing critical care. Conventional training for advanced emergency medical procedures is often limited to a verbal block of instructions and/or an instructional video. In this study, we evaluate the benefits of using an optical see-through head-mounted display (OST-HMD) for training caregivers in an emergency medical environment. A rich user interface was implemented that provides 3D visual aids including images, text and tracked 3D overlays for each task. A user study with 20 participants was conducted for two medical tasks, where each subject received conventional training for one task and HMD training for the other task. Our results indicate that using a mixed reality HMD is more engaging, improves the time-on-task, and increases the confidence level of users.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
{"title":"Climb, Fly, Stack: Design of Tangible and Gesture-Based Interfaces for Natural and Efficient Interaction","authors":"Alexandre Audinot, E. Goga, Vincent Goupil, Carl-Johan Jorqensen, Adrien Reuzeau, F. Argelaguet","doi":"10.1109/VR.2018.8446244","DOIUrl":"https://doi.org/10.1109/VR.2018.8446244","abstract":"This paper describes three novel 3D interaction metaphors conceived to fulfill the three tasks proposed in the current edition of the IEEE VR 3DUI Contest. We propose the VladdeR, a tangible interface for Virtual laddeR climbing; the FPDrone, a first-person drone flying interface; and the Dice Cup, a tangible interface for virtual object stacking. All three interactions take advantage of body proprioception and prior knowledge of real-life interactions without the need for complex interaction mechanics: climbing a tangible ladder through arm and leg motions, controlling a drone as a child flies an imaginary plane by extending the arms, or stacking objects as one would grab and stack dice with a dice cup.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
{"title":"3DUI Contest 2018 - Team NaN","authors":"Christian Hirt, Anh Nguyen, Markus Zank","doi":"10.1109/VR.2018.8446051","DOIUrl":"https://doi.org/10.1109/VR.2018.8446051","abstract":"For the contest held at IEEE VR 2018, 3DUIs had to be designed for three tasks: climbing a ladder, flying a drone in first-person perspective, and stacking objects. The goal of our design is to find easy yet intuitive solutions to the given tasks. For the ladder task, a 3DUI is implemented that closely mimics a real ladder ascent or descent, using only the controllers to interact with the environment. For drone flying, we propose a solution that uses head orientation to control the drone. For the third task, we propose a zero-gravity temporary storage space that is used to prepare the final stack, which can be released with a trigger press.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}
{"title":"Influences on the Elicitation of Interpersonal Space with Virtual Humans","authors":"D. Krum, Sin-Hwa Kang, Thai-Binh Phan","doi":"10.1109/VR.2018.8446235","DOIUrl":"https://doi.org/10.1109/VR.2018.8446235","abstract":"The emergence of low-cost virtual and augmented reality systems has encouraged the development of immersive training applications for medical, military, and many other fields. Many of the training scenarios for these fields may require the presentation of realistic interactions with virtual humans. It is thus vital to determine the critical factors of fidelity required in those interactions to elicit naturalistic behavior on the part of trainees. Negative training may occur if trainees are inadvertently influenced to react in ways that are unexpected and unnatural, hindering proper learning and the transfer of skills and knowledge back into real-world contexts. In this research, we examined whether haptic priming (presenting an illusion of virtual human touch at the beginning of the virtual experience) and different locomotion techniques (either joystick or physical walking) might affect proxemic behavior in human users. The results of our study suggest that locomotion techniques can alter proxemic behavior in significant ways. Haptic priming did not appear to impact proxemic behavior, but did increase rapport and other subjective social measures. The results suggest that designers and developers of immersive training systems should carefully consider the impact of even simple design and fidelity choices on trainee reactions in social interactions.","journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)"},"publicationDate":"2018-03-18"}