{"title":"Being them: presence of using non-human avatars in immersive virtual environment","authors":"Dong-Yong Lee, Yong-Hun Cho, In-Kwon Lee","doi":"10.1145/3281505.3283384","DOIUrl":"https://doi.org/10.1145/3281505.3283384","url":null,"abstract":"This work examines the differences of the effects between using humanoid and non-humanoid avatars on the user's Illusion of Virtual Body Ownership (IVBO) and experience. We used three kinds of avatars: bipedalism group (human), quadrupedalism group (wolf), and serpentine motion group (snake). The result shows that using non-humanoid avatars feel more sense of change of their body. Users feel more proficient when using the humanoid avatar, but are more pleased with the non-humanoid avatars.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"58 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122083887","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An evaluation of pupillary light response models for 2D screens and VR HMDs","authors":"Brendan David-John, Pallavi Raiturkar, Arunava Banerjee, Eakta Jain","doi":"10.1145/3281505.3281538","DOIUrl":"https://doi.org/10.1145/3281505.3281538","url":null,"abstract":"Pupil diameter changes have been shown to be indicative of user engagement and cognitive load for various tasks and environments. However, it is still not the preferred physiological measure for applied settings. This reluctance to leverage the pupil as an index of user engagement stems from the problem that in scenarios where scene brightness cannot be controlled, the pupil light response confounds the cognitive-emotional response. What if we could predict the light response of an individual's pupil, thus creating the opportunity to factor it out of the measurement? In this work, we lay the groundwork for this research by evaluating three models of pupillary light response in 2D, and in a virtual reality (VR) environment. Our results show that either a linear or an exponential model can be fit to an individual participant with an easy-to-use calibration procedure. This work opens several new research directions in VR relating to performance analysis and inspires the use of eye tracking beyond gaze as a pointer and foveated rendering.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117332124","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"VirtualHaus: a collaborative mixed reality application with tangible interface","authors":"Jean-Philippe Farrugia","doi":"10.1145/3281505.3281568","DOIUrl":"https://doi.org/10.1145/3281505.3281568","url":null,"abstract":"We present VirtualHaus, a collaborative mixed reality application allowing two participants to recreate Mozart's apartment as it used to be by interactively placing furniture. Each participant has a different role and therefore uses a different application: the visitor uses an immersive virtual reality application, while the supervisor uses an augmented reality application. The two applications are wirelessly synchronised and display the same information with distinct viewpoints and tools.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129638603","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tracking projection mosaicing by synchronized high-speed optical axis control","authors":"Masashi Nitta, Tomohiro Sueishi, M. Ishikawa","doi":"10.1145/3281505.3281535","DOIUrl":"https://doi.org/10.1145/3281505.3281535","url":null,"abstract":"Projectors, as information display devices, have improved substantially and to achieve both the wide range and high resolution is desired for the dynamic human gaze. However, a fixed projector has a trade-off between the angle of projection and a resolution with limited pixels. Conventional methods with dynamic optical axis control lack the potential speed of the devices. We propose a tracking projection mosaicing with a high-speed projector and a high-speed optical axis controller for a randomly moving position, such as the gaze. We also propose a synchronization strategy by queuing and alternating operations to reduce motion-based artifacts, which realize a high-quality static image projection during the dynamic optical axis control. We have experimentally validated the geometric and temporal consistency of the proposed synchronization method and have attempted a demonstration of the tracking projection mosaicing for the dynamically moving bright spot of a laser pointer.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127526346","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Am I in the theater?: usability study of live performance based virtual reality","authors":"Linjia He, Hongsong Li, Tong Xue, Deyuan Sun, Shoulun Zhu, Gangyi Ding","doi":"10.1145/3281505.3281508","DOIUrl":"https://doi.org/10.1145/3281505.3281508","url":null,"abstract":"Duplicating the audience experience of an art performance with VR technology is a promising VR application, which is considered to provide better viewer experience than the conventional video. As various forms of art performances are recorded by the panoramic camera and broadcasted on the Internet, the impact of this new VR-based media to the viewers needs to be systematically studied. In this work, a two-level usability framework is proposed, which combines the traditional concepts of presence and the quality evaluation of art performances, aiming to systematically study the usability of such VR application. Both the conventional video and the panoramic video of a theatre performance were captured simultaneously, and were replayed to two groups of viewers in a cinematic setup and through an HMD respectively. The psychological measurement methods, including the questionnaire and the interview, as well as the psychophysical measurement methods, including the EEG and the motion capture techniques were both used in the study. The results show that the such VR application duplicates the live performance better by providing a higher sense of presence, higher engagement levels, and stronger desire to see live performance. For visual intensive performance contents, the new VR-based media can provide a better user experience. The future development of the new media forms based on the panoramic video technique could benefit from this work.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127907870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Investigating different modalities of directional cues for multi-task visual-searching scenario in virtual reality","authors":"Taizhou Chen, Yi-Shiun Wu, Kening Zhu","doi":"10.1145/3281505.3281516","DOIUrl":"https://doi.org/10.1145/3281505.3281516","url":null,"abstract":"In this study, we investigated and compared the effectiveness of visual, auditory, and vibrotactile directional cues on multiple simultaneous visual-searching tasks in an immersive virtual environment. Effectiveness was determined by the task-completion time, the range of head movement, the accuracy of the identification task, and the perceived workload. Our experiment showed that the on-head vibrotactile display can effectively guide users towards virtual visual targets, without affecting their performance on the other simultaneous tasks, in the immersive VR environment. These results can be applied to numerous applications (e.g. gaming, driving, and piloting) in which there are usually multiple simultaneous tasks, and the user experience and performance could be vulnerable.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"14 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124734700","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"JamGesture","authors":"Souta Mizuno, Tetsuro Kitahara, Shun Shiramatsu, Shugo Ichinose","doi":"10.1145/3281505.3283380","DOIUrl":"https://doi.org/10.1145/3281505.3283380","url":null,"abstract":"The physical gestures promote musical comprehension because they can provide visual information of musical performance for others. Melodic outlines especially have a high affinity with intuitive physical gestures. We propose methods for recognizing physical gestures using motion sensor cameras and smartphone sensors, and we have developed an improvisation support system, JamGesture, by integrating a method for recognizing physical gestures using smartphones and JamSketch, a system for melody generation based on melodic outlines. JamGesture enables users to improvise music by using the input from their intuitive physical gestures with the melody-generation function of JamSketch.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"100 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121114881","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Haptic around: multiple tactile sensations for immersive environment and interaction in virtual reality","authors":"Ping-Hsuan Han, Yang-Sheng Chen, Kong-Chang Lee, Hao-Cheng Wang, Chiao-En Hsieh, Jui-Chun Hsiao, C. Chou, Y. Hung","doi":"10.1145/3281505.3281507","DOIUrl":"https://doi.org/10.1145/3281505.3281507","url":null,"abstract":"In this paper, we present Haptic Around, a hybrid-haptic feedback system, which utilizes fan, hot air blower, mist creator and heat light to recreate multiple tactile sensations in virtual reality for enhancing the immersive environment and interaction. This system consists of a steerable haptic device rigged on the top of the user head and a handheld device also with haptics feedbacks to simultaneously provide tactile sensations to the users in a 2m x 2m space. The steerable haptic device can enhance the immersive environment for providing full body experience, such as heat in the desert or cold in the snow mountain. Additionally, the handheld device can enhance the immersive interaction for providing partial body experience, such as heating the iron or quenching the hot iron. With our system, the users can perceive visual, auditory and haptic when they are moving around in virtual space and interacting with virtual object. In our study, the result has shown the potential of the hybrid-haptic feedback system, which the participants rated the enjoyment, realism, quality, immersion higher than the other.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121087275","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"User-centric classification of virtual reality locomotion","authors":"J. Albert, Kelvin Sung","doi":"10.1145/3281505.3283376","DOIUrl":"https://doi.org/10.1145/3281505.3283376","url":null,"abstract":"Traveling in a virtual world, while confined in the real world requires a virtual reality locomotion (VRL) method. VRL remains an issue because of three fundamental challenges, sickness, presence, and fatigue. We propose a User-Centric Classification (UCC) of VRL methods based on a method's ability to address these challenges. UCC provides a framework to discuss and compare different VRL methods and to examine performance trade-offs. We designed and implemented a testbed to study several VRL methods, and initial results demonstrated the effectiveness of the UCC framework [1].","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116449884","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Extending recreational environments with a landscape-superimposed display using mixed reality","authors":"Mamoru Hatanaka, R. Hamakawa","doi":"10.1145/3281505.3283394","DOIUrl":"https://doi.org/10.1145/3281505.3283394","url":null,"abstract":"Herein, we describe a system that extends recreational experiences by overlaying a virtual landscape of a remote place over the currently experienced real landscape using mixed reality (MR) technology and displaying avatars of other users. There are many recreational activities that can be performed outdoors. However, such activities usually involve some traveling costs, preparation time, and require schedule adjustments. To reduce the impact of these factors, we implemented a system that extends recreational environments, thereby allowing free movement through the manipulation of the visual information using MR.","PeriodicalId":138249,"journal":{"name":"Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128530968","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}