{"title":"Immersive Job Taste: a Concept of Demonstrating Workplaces with Virtual Reality","authors":"Mikhail Fominykh, E. Prasolova-Førland","doi":"10.1109/VR.2019.8798356","DOIUrl":"https://doi.org/10.1109/VR.2019.8798356","url":null,"abstract":"This paper presents a new concept of ‘Immersive Job Taste’ – an interactive virtual reality demonstration of a workplace that aims to give a feeling of going through an average workday of a professional, with elements of basic training. The main target audiences of Job Taste simulations are young job seekers who can be aided in selecting a career path at school or a welfare center when choosing their first or a new occupation, often after a period of unemployment. The design methodology behind the Immersive Job Taste concept includes the presentation of a workplace, typical tasks, feedback on performance, and advice on applying for jobs in the specific industry. We developed several scenarios and applied different virtual and augmented reality concepts to build prototypes for different types of devices. The prototypes were evaluated by several groups of primary users and experts. The results indicate a generally very positive attitude towards the concept. In this paper, we discuss the potential impact of applying the concept and directions for future work.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115340782","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hybrid Mono-Stereo Rendering in Virtual Reality","authors":"Laura Fink, Nora Hensel, Daniela Markov-Vetter, C. Weber, O. Staadt, Marc Stamminger","doi":"10.1109/VR.2019.8798283","DOIUrl":"https://doi.org/10.1109/VR.2019.8798283","url":null,"abstract":"Rendering for Head-Mounted Displays (HMDs) doubles the computational effort, since serving human stereopsis requires creating one image for the left eye and one for the right. The difference in this image pair, called binocular disparity, is an important cue for depth perception and the spatial arrangement of surrounding objects. Findings on the human visual system (HVS) have shown that binocular disparities are especially significant in the near range of an observer. However, as distance increases, the disparity converges to a simple geometric shift, and its importance as a depth cue declines exponentially. In this paper, we exploit this knowledge about human perception by rendering objects fully stereoscopically only up to a chosen distance and monoscopically from there on. By doing so, we obtain three distinct images, which are synthesized into a new hybrid stereoscopic image pair that reasonably approximates a conventionally rendered stereoscopic image pair. The method has the potential to reduce the number of rendered primitives by nearly 50% and thus significantly lower frame times. Besides a detailed analysis of the introduced formal error and of how to deal with occurring artifacts, we evaluated the perceived quality of the VR experience in a comprehensive user study with nearly 50 participants. The results show that the perceived difference in quality between the shown image pairs was generally small. An in-depth analysis is given of how the participants reached their decisions and how they subjectively rated their VR experience.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130840104","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sports Training System for Visualizing Bird's-Eye View from First-Person View","authors":"Shunki Shimizu, Kaoru Sumi","doi":"10.1109/VR.2019.8798227","DOIUrl":"https://doi.org/10.1109/VR.2019.8798227","url":null,"abstract":"In ball games, it is important that players are able to estimate the positions of the other players from a bird's-eye view based on the information obtained from their first-person view. We have developed a training system for improving this ability. The user wears a head-mounted display and can simulate ball games in 360° from the first-person view. The system allows the user to rearrange all players and the ball from the bird's-eye view. The user can then track the other players from the first-person viewpoint and perform actions specific to the ball game, such as passing, receiving the ball, and (as a defense player) following offense players.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128145025","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Jumping Further: Forward Jumps in a Gravity-reduced Immersive Virtual Environment","authors":"Hyeongyeop Kang, Geonsun Lee, D. Kang, O. Kwon, Jun Yeup Cho, Ho-Jung Choi, JungHyun Han","doi":"10.1109/VR.2019.8798251","DOIUrl":"https://doi.org/10.1109/VR.2019.8798251","url":null,"abstract":"In a cable-driven suspension system developed to simulate the reduced gravity of lunar or Martian surfaces, we propose to manipulate/reduce the physical cues of forward jumps so as to overcome the limited workspace problem. The physical cues should be manipulated in a way that the discrepancy from the visual cues provided through the HMD is not noticeable by users. We identified the extent to which forward jumps can be manipulated naturally. We combined it with visual gains, which can scale visual cues without being noticed by users. The test results obtained in a prototype application show that we can use both trajectory manipulation and visual gains to overcome the spatial limit. We also investigated the user experiences when making significantly high and far jumps. The results will be helpful in designing astronaut-training systems and various VR entertainment content.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121498781","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"TacTiles: Dual-Mode Low-Power Electromagnetic Actuators for Rendering Continuous Contact and Spatial Haptic Patterns in VR","authors":"Velko Vechev, J. Zárate, David Lindlbauer, R. Hinchet, H. Shea, Otmar Hilliges","doi":"10.1109/VR.2019.8797921","DOIUrl":"https://doi.org/10.1109/VR.2019.8797921","url":null,"abstract":"We introduce TacTiles, light (1.8 g), low-power (130 mW), and small form-factor (1 cm³) electromagnetic actuators that can form a flexible haptic array to provide localized tactile feedback. Our novel hardware design uses a custom 8-layer PCB, dampening materials, and asymmetric latching, enabling two distinct modes of actuation: contact and pulse mode. We leverage these modes in Virtual Reality (VR) to render continuous contact with objects and the exploration of object surfaces and volumes with spatial haptic patterns. Results from a series of experiments show that users are able to localize feedback, discriminate between modes with high accuracy, and differentiate objects from haptic surfaces and volumes even without looking at them.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121857755","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Shadowless Projector: Suppressing Shadows in Projection Mapping with Micro Mirror Array Plate","authors":"Kosuke Hiratani, D. Iwai, Parinya Punpongsanon, Kosuke Sato","doi":"10.1109/VR.2019.8798245","DOIUrl":"https://doi.org/10.1109/VR.2019.8798245","url":null,"abstract":"Shadowless Projector is a projection mapping system in which a shadow (more specifically, an umbra) does not degrade the projected result. Typical shadow removal techniques use multiple overlapping projectors. In this paper, we propose a shadowless projection method with a single projector. Inspired by surgical light systems, which do not cast shadows on patients' bodies in clinical practice, we apply a special optical system consisting of methodically positioned vertical mirrors. Because this optical system works as a large-aperture lens, it is impossible for a small object such as a hand to block all projected rays. Consequently, only a penumbra is cast, which leads to shadowless projection.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116993867","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Advancing Ethical Decision Making in Virtual Reality","authors":"Sinhwa Kang, Jake Chanenson, P. Ghate, Peter Cowal, M. Weaver, D. Krum","doi":"10.1109/VR.2019.8798151","DOIUrl":"https://doi.org/10.1109/VR.2019.8798151","url":null,"abstract":"Virtual reality (VR) has been widely utilized for training and education purposes because of pedagogical, safety, and economic benefits. The investigation of moral judgment is a particularly interesting VR application, related to training. For this study, we designed a within-subject experiment manipulating the role of study participants in a Trolley Dilemma scenario: either victim or driver. We conducted a pilot study with four participants and describe preliminary results and implications in this poster.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"3 3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123730757","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Systematic Evaluation of Multi-Sensor Array Configurations for SLAM Tracking with Agile Movements","authors":"Brian M. Williamson, E. Taranta, Pat Garrity, R. Sottilare, J. Laviola","doi":"10.1109/VR.2019.8798007","DOIUrl":"https://doi.org/10.1109/VR.2019.8798007","url":null,"abstract":"Accurate tracking of a user in a marker-less environment can be difficult, even more so when agile head or hand movements are expected. When relying on feature detection as part of a SLAM algorithm the issue arises that a large rotational delta causes previously tracked features to become lost. One approach to overcome this problem is with multiple sensors increasing the horizontal field of view. In this paper, we perform a systematic evaluation of tracking accuracy by recording several agile movements and providing different camera configurations to evaluate against. We begin with four sensors in a square configuration and test the resulting output from a chosen SLAM algorithm. We then systematically remove a camera from the feed covering all permutations to determine the level of accuracy and tracking loss. We cover some of the lessons learned in this preliminary experiment and how it may guide researchers in tracking extremely agile movements.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"82 11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123421031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spatial Presence in Real and Remote Immersive Environments","authors":"Nawel Khenak, J. Vézien, David Thery, P. Bourdot","doi":"10.1109/VR.2019.8797801","DOIUrl":"https://doi.org/10.1109/VR.2019.8797801","url":null,"abstract":"This paper presents an experiment assessing the feeling of spatial presence in both real and remote environments (the so-called “natural presence” and “telepresence”, respectively). Twenty-eight (28) participants performed a 3D-pointing task while being located in a real office and in the same office remotely rendered through an HMD. Spatial presence was evaluated by means of the ITC-SOPI questionnaire and an analysis of users' behaviour (head trajectories during the task). The analysis also included the effect of different levels of immersion - visual-only versus visual-and-audio rendering - in such environments. The results show a higher sense of spatial presence for the remote condition, regardless of the degree of immersion, and for the “visual and audio” condition, regardless of the environment. Additionally, trajectory analysis of users' heads reveals that participants behaved similarly in both environments.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128883562","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Effects of Immersion on Harm-inducing Factors in Virtual Slot Machines","authors":"David Heidrich, S. Oberdörfer, Marc Erich Latoschik","doi":"10.1109/VR.2019.8798021","DOIUrl":"https://doi.org/10.1109/VR.2019.8798021","url":null,"abstract":"Slot machines are one of the games most played by pathological gamblers. New technologies, e.g. immersive Virtual Reality (VR), offer more possibilities to exploit erroneous beliefs in the context of gambling. However, the risk potential of VR-based gambling has not yet been researched. A higher immersion might increase harmful aspects, thus making VR realizations more dangerous. Measuring harm-inducing factors reveals the risk potential of virtual gambling. In a user study, we analyze a slot machine realized as a desktop 3D version and as an immersive VR version. Both versions are compared with respect to their effects on dissociation, urge to gamble, dark flow, and illusion of control. Our study shows significantly higher values of dissociation, dark flow, and urge to gamble in the VR version. Presence significantly correlates with all measured harm-inducing factors. We demonstrate that VR-based gambling has a higher risk potential. This underscores the importance of regulating VR-based gambling.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124806609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}