{"title":"Locomotion with Virtual Agents in the Realm of Social Virtual Reality","authors":"A. Bönsch","doi":"10.1109/VR.2018.8446617","DOIUrl":"https://doi.org/10.1109/VR.2018.8446617","url":null,"abstract":"My research focuses on social locomotion of computer-controlled, human-like, virtual agents in virtual reality applications. Two main areas are covered in the literature: a) user-agent-dynamics in, e.g., pedestrian scenarios and b) pure inter-agent-dynamics. However, joint locomotion of a social group consisting of a user and one to several virtual agents has not been investigated yet. I intend to close this gap by contributing an algorithmic model of an agent's behavior during social locomotion. In addition, I plan to evaluate the effects of the resulting agent's locomotion patterns on a user's perceived degree of immersion, comfort, as well as social presence.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116992697","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"BrightView: Increasing Perceived Brightness of Optical See-Through Head-Mounted Displays Through Unnoticeable Incident Light Reduction","authors":"Shohei Mori, Sei Ikeda, Alexander Plopski, C. Sandor","doi":"10.1109/VR.2018.8446441","DOIUrl":"https://doi.org/10.1109/VR.2018.8446441","url":null,"abstract":"Optical See-Through Head-Mounted Displays (OST-HMDs) lose the visibility of virtual contents under bright environment illumination due to their see-through nature. We demonstrate how a liquid crystal (LC) filter attached to an OST-HMD can be used to dynamically increase the perceived brightness of virtual content without impacting the perceived brightness of the real scene. We present a prototype OST-HMD that continuously adjusts the opacity of the LC filter to attenuate the environment light without users becoming aware of the change. Consequently, virtual content appears to be brighter. The proposed approach is evaluated in psychophysical experiments in three scenes, with 16, 31, and 31 participants, respectively. The participants were asked to compare the magnitude of brightness changes of both real and virtual objects, before and after dimming the LC filter over a period of 5, 10, and 20 seconds. The results showed that the participants felt increases in the brightness of virtual objects while they were less conscious of reductions of the real scene luminance. These results provide evidence for the effectiveness of our display design. Our design can be applied to a wide range of OST-HMDs to improve the brightness and hence realism of virtual content in augmented reality applications.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121657013","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-Time Control Operation Support of Unstable System by Visual Feedback","authors":"Tomohiro Ichiyama, Atsushi Matsubayashi, Yasutoshi Makino, H. Shinoda","doi":"10.1109/VR.2018.8446622","DOIUrl":"https://doi.org/10.1109/VR.2018.8446622","url":null,"abstract":"In this paper, we show that an inverted pendulum can be stabilized manually even when a user does not know the physical characteristics and the current state of the pendulum. We display two markers: one indicates current position of the base of the pendulum and the other indicates the target position where the base should be located 0.3 seconds later. Subjects can stabilize the pendulum for a significantly longer time than seeing the real pendulum directly, just by chasing the target marker.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"17 3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126633309","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Deep Localization on Panoramic Images","authors":"Atsutoshi Hanasaki, Hideaki Uchiyama, Atsushi Shimada, Rin-ichiro Taniquch","doi":"10.1109/VR.2018.8446048","DOIUrl":"https://doi.org/10.1109/VR.2018.8446048","url":null,"abstract":"Sensor pose estimation is an essential technology for various applications. For instance, it can be used not only to display immersive contents according user movements in Virtual Reality (VR) and but also to superimpose computer-generated objects onto images from a camera in Augmented Reality (AR). As a technical term definition, camera localization with respect to a pre-created map database is specifically referred to as image based localization, memory based localization, or camera relocalization.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126911902","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using EEG to Decode Subjective Levels of Emotional Arousal During an Immersive VR Roller Coaster Ride","authors":"Felix Klotzsche, A. Mariola, Simon M. Hofmann, V. Nikulin, A. Villringer, Michael Gaebler","doi":"10.1109/VR.2018.8446275","DOIUrl":"https://doi.org/10.1109/VR.2018.8446275","url":null,"abstract":"Emotional arousal is a key component of a user's experience in immersive virtual reality (VR). Subjective and highly dynamic in nature, emotional arousal involves the whole body and particularly the brain. However, it has been difficult to relate subjective emotional arousal to an objective, neurophysiological marker-especially in naturalistic settings. We tested the association between continuously changing states of emotional arousal and oscillatory power in the brain during a VR roller coaster experience. We used novel spatial filtering approaches to predict self-reported emotional arousal from the electroencephalogram (EEG) signal of 38 participants. Periods of high vs. low emotional arousal could be classified with accuracies significantly above chance level. Our results are consistent with prior findings regarding emotional arousal in less naturalistic settings. We demonstrate a new approach to decode states of subjective emotional arousal from continuous EEG data in an immersive VR experience.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"71 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125999537","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An AR-Guided System for Fast Image-Based Modeling of Indoor Scenes","authors":"Daniel Andersen, V. Popescu","doi":"10.1109/VR.2018.8446560","DOIUrl":"https://doi.org/10.1109/VR.2018.8446560","url":null,"abstract":"We present a system that enables a novice user to acquire a large indoor scene in minutes as a collection of images that are sufficient for five degrees-of-freedom virtual navigation by image morphing. The user walks through the scene wearing an augmented reality head-mounted display (AR HMD) enhanced with a panoramic video camera. The AR HMD visualizes a 2D grid partitioning of a dynamically generated floor plan, which guides the user to acquire a panorama from each grid cell. The panoramas are registered offline using both AR HMD tracking data and structure-from - motion tools. Feature correspondences are established between neighboring panoramas. The resulting panoramas and correspondences support interactive rendering via image morphing with any view direction and from any viewpoint on the acquisition plane.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"95 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127100576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effect of Electrical Stimulation Haptic Feedback on Perceptions of Softness-Hardness and Stickiness While Touching a Virtual Object","authors":"Vibol Yem, Kevin Vu, Yuki Kon, H. Kajimoto","doi":"10.1109/VR.2018.8446403","DOIUrl":"https://doi.org/10.1109/VR.2018.8446403","url":null,"abstract":"With the advantages of small size and light weight, electrical stimulation devices have been investigated for providing haptic feedback in relation to virtual objects. Electrical stimulation devices can directly activate sensory receptors to produce a reaction force or touch sensations. In the current study, we tested a new method of electrically inducing force sensation in the fingertip, presenting haptic feedback designed to alter perceptions of softness, hardness and stickiness. We developed a 3D virtual reality system combined with finger-motion capture and electrical stimulation devices. We conducted two experiments to evaluate our electrical stimulation method and analyzed the effects of electrical stimulation on perception. The first experiment confirmed that participants could distinguish between the directions of the illusory force sensation, reporting whether the stimulation flexed their index finger forward or extended it backward. The second experiment examined the effects of the electric current itself on the intensity of their perception of the softness, hardness and stickiness of a virtual object.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130644037","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"User Preference for SharpView-Enhanced Virtual Text During Non-Fixated Viewing","authors":"Trey Cook, Nate Phillips, Kristen Massey, Alexander Plopski, C. Sandor","doi":"10.1109/VR.2018.8446058","DOIUrl":"https://doi.org/10.1109/VR.2018.8446058","url":null,"abstract":"For optical see-through head-mounted displays, the mismatch between a display's focal length and the real world scene inadvertently prevents users from simultaneously focusing on the presented virtual content and the scene. It has been shown that it is possible to ameliorate the out-of-focus blur for images with a known focus distance, by applying an algorithm called Sharp View. However, it remains unclear if Sharp View also improves the readability and clarity of text rendered on the display. In this study, we investigate whether users reported increased text clarity when Sharp View was applied to a text label, and how the focal demand of the display, the focal distance to real world content, and gaze condition affect the result. Our results indicate that, in non-fixated viewing, there is a significant user preference for Sharp View-enhanced text strings.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132249149","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Immersive Virtual Fieldwork: Advances for the Petroleum Industry","authors":"L. G. D. Silveira, M. Veronez, Gabriel Lanzer Kannenberg, Demetrius Nunes Alves, C. Cazarin, L. Santana, Jean Luca de Fraga, Leonardo Campos Inocencio, L. V. D. Souza, Fernando P. Marson, F. Bordin, F. M. Tognoli","doi":"10.1109/VR.2018.8446511","DOIUrl":"https://doi.org/10.1109/VR.2018.8446511","url":null,"abstract":"Laser scanning and photogrammetry techniques have been broadly adopted by Oil&Gas industry for modeling petroleum reservoir analogues. Beyond the benefits of digital data itself, computer systems employed by geoscientists for interpretation and modeling tasks provide high quality rendering, point clouds surface meshes and photo-realistic textured models. But these systems, commonly, have used 2-D display, the 3-D models and information are projected on the screen, providing a limited visualization and restrictive toolset for interpretation. This work proposes to break this paradigm by developing a fully immersive system capable to virtually teleport the geoscientists to the fieldwork and provide a complete toolset for the outcrop's interpretation. Besides, the system has been evaluated and validated by geologists with different skills and it has emerged as an useful and attractive toolset for Oil&Gas industry.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132958302","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using Cybersickness Indicators to Adapt Navigation in Virtual Reality: A Pre-Study","authors":"Jérémy Plouzeau, J. Chardonnet, F. Mérienne","doi":"10.1109/VR.2018.8446192","DOIUrl":"https://doi.org/10.1109/VR.2018.8446192","url":null,"abstract":"We propose an innovative method to navigate in a virtual environment by adapting the acceleration parameters to users in real time, in order to reduce cybersickness. Indeed, navigation parameters for most navigation interfaces are still determined by rate-control devices. Inappropriate parameter settings may lead to strong sickness, making the application unusable. Past research found that especially accelerations should not be set too high. Here, we define the accelerations as a function of a cybersickness indicator: the Electro-Dermal Activity (EDA). A pre-study was conducted to test the effectiveness of our approach and showed promising results where cybersickness tends to decrease with our adaptive navigation method.","PeriodicalId":355048,"journal":{"name":"2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133795469","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}