{"title":"Localized scent presentation to a walking person by using scent projectors","authors":"Koji Murai, T. Serizawa, Y. Yanagida","doi":"10.1109/ISVRI.2011.5759604","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759604","url":null,"abstract":"We have developed a system to provide scents locally, both in space and time, to people who are walking down an aisle. The system is based on a scent delivery technique that uses scent projectors that launch vortex rings containing scented air. The walking trajectory of a person who approaches the system is detected by a time-of-flight-based range imaging camera, and the system generates a small volume of scented air just in front of the target person by causing two vortex rings to collide with each other.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133001588","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Image-based stereo background modeling for CAVE system","authors":"Hasup Lee, Y. Tateyama, T. Ogi","doi":"10.1109/ISVRI.2011.5759646","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759646","url":null,"abstract":"In this paper, we make 3D stereo backgrounds for a CAVE system from real world for users to feel more realistic. We take panorama photographs with the 3D sweep panorama™ functions and generate equirectangular panorama format image by stitching them manually. Finally 3D stereo background of CAVE is constructed using this panorama format image. We propose a generation method of 3D equirectangular panorama format image and background modeling of CAVE using these images.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124272945","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Verification of out of body sensations, attribution and localization by interaction with oneself","authors":"Kouichi Watanabe, S. Tachi","doi":"10.1109/ISVRI.2011.5759611","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759611","url":null,"abstract":"To enhance the realistic sensation and the human presence in a telepresence or a telexistence system, it is important not only to match the audio-visual sensation of the operator with the robot but also to match the embodiment of the operator with the robot by reflecting the somatic sensation of the operator. Presently, however, in telexistence, the key factors for matching the embodiment of the operator with the robot are unclear and lack established evaluation methods. In this paper, we experiment with out-of-body sensations in a telexistence system on the basis of the rubber hand illusion. We construct a system to self-interact with and evaluate the stimulation influence of the self-attribution and self-localization of the body.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124309558","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An adaptive kalman filter for three dimensional attitude tracking","authors":"Xiaoming Hu, Qin Li, Changyu He, Yue Liu","doi":"10.1109/ISVRI.2011.5759620","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759620","url":null,"abstract":"An adaptive kalman filter for three dimensional attitude tracking is presented in this paper. Such filter can be used in the low cost system with only a triaxis accelerometer and a triaxis magnetometer where dynamic attitude tracking is needed. An adaptive kalman filter for three dimensional attitude tracking which dynamically updates the measurement variance according to the mode of acceleration is proposed. The proposed filter is implemented and evaluated on a homemade attitude tracking module. Experimental results show that this method can be used for low-cost real time human machine interface.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121823693","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-time, directable smoke simulation","authors":"Yong Tang, Ping Li, Jin Li, Songqing Zhai, Kunqi Ma, M.Y. Lv","doi":"10.1109/ISVRI.2011.5759668","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759668","url":null,"abstract":"Pursuing the true real-time has long been a problem in smoke animation. In this paper, we build upon work by solving the nondimentional Navier-Stokes equations, utilize the MacCormack method to solve the advection item for lowering numerical dissipation inherent in the semi-Lagrangian scheme, and compensate the high-detailed smoke turbulence on 2D slices over modern GPUs to shorten the simulation time. Besides, an especial rendering technique called view-aligned slices rendering is also proposed, which can effectively capture the smoke details generated by our methods and greatly reduce the time cost. Moreover, our model could also correctly handle the interaction of smoke with winds and stationary objects.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129910597","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"High viscosity fluid simulation using particle-based method","authors":"Yuanzhang Chang, K. Bao, Jian Zhu, E. Wu","doi":"10.1109/ISVRI.2011.5759632","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759632","url":null,"abstract":"We present a new particle-based method for high viscosity fluid simulation. In the method, a new elastic stress term, which is derived from a modified form of the Hooke's law, is included in the traditional Navier-Stokes equation to simulate the movements of the high viscosity fluids. Benefiting from the Lagrangian nature of Smoothed Particle Hydrodynamics method, large flow deformation can be well handled easily and naturally. In addition, in order to eliminate the particle deficiency problem near the boundary, ghost particles are employed to enforce the solid boundary condition. Compared with Finite Element Methods with complicated and time-consuming remeshing operations, our method is much more straightforward to implement. Moreover, our method doesn't need to store and compare to an initial rest state. The experimental results show that the proposed method is effective and efficient to handle the movements of highly viscous flows, and a large variety of different kinds of fluid behaviors can be well simulated by adjusting just one parameter.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129394509","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"RePro3D: full-parallax 3D display with haptic feedback using retro-reflective projection technology","authors":"Takumi Yoshida, K. Shimizu, Tadatoshi Kurogi, Sho Kamuro, K. Minamizawa, Hideaki Nii, S. Tachi","doi":"10.1109/ISVRI.2011.5759601","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759601","url":null,"abstract":"We propose a novel full-parallax three-dimensional (3D) display system-RePro3D-that is suitable for interactive 3D applications with haptic feedback.Our approach is based on the retro-reflective projection technology in which several images projected from a projector array are displayed on a retro-reflective screen. When viewers view the screen through a half mirror, they see, without the aid of glasses, a 3D image superimposed in real space. RePro3D has a sensor function that recognizes user input; therefore, it can support some interactive features such as manipulation of 3D objects. In addition, a wearable haptic device, which is a part of our system, provides the user with a sensation of having touched the 3D image. In this paper, we describe the optical system of the high-density projector array used in RePro3D. Then, we describe the development of a prototype of RePro3D. The prototype is used to demonstrate that our system displays full-parallax images superimposed in real space from 42 different viewpoints. The proposed system enables a user to physically interact with the 3D image with haptic feedback.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"193 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129964414","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A real-time motion capture framework for synchronized neural decoding","authors":"Guangming Lu, Yi Li, Shuai Jin, Yang Zheng, Weidong Chen, Xiaoxiang Zheng","doi":"10.1109/ISVRI.2011.5759656","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759656","url":null,"abstract":"Neural decoding is an active research area concerned with how sensory and other information is represented in the brain by networks of neurons. An important step in neural decoding research is to collect the subject's motion and neural activities synchronously, which requires a real-time motion capture system with high accuracy. In this paper, we propose a practical motion capture framework with the capability of processing motion data and output character animation in real-time. We use a two-stage coarse-to-fine method to preprocess the raw motion capture data. We employ Kalman filter to coarsely estimate the positions of missing markers and filter out the possible noisy markers. The positions of the missing markers are refined with the relationship between the current frame and similar frames in motion templates. We operate the motion data in PCA space to reduce computational complexity. We present the results for our approach as applied to capturing human hand motions, which demonstrates the accuracy and usefulness of our real-time motion capture framework.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130859892","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual police: Acquiring knowledge-in-use in virtual training environments","authors":"Johanna Bertram, Johannes Moskaliuk, U. Cress","doi":"10.1109/ISVRI.2011.5759669","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759669","url":null,"abstract":"Police officers are often confronted with different unexpected and untrained scenarios and have to respond adequately. To prepare officers for situations that cannot be trained in reality because of high costs, danger, time or effort involved, virtual training seems to be the obvious choice. This paper explicates a theory-driven design process of a virtual training environment and its application in a German state police department.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129648770","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The framework and implementation of Virtual Network Marathon","authors":"Mingmin Zhang, Mingliang Xu, Yong Liu, Gaoqi He, Lizhen Han, Pei Lv, Yongqing Li","doi":"10.1109/ISVRI.2011.5759623","DOIUrl":"https://doi.org/10.1109/ISVRI.2011.5759623","url":null,"abstract":"In this paper, we present an exergame called VNM (Virtual Network Marathon). The VNM employs devised treadmills for immersive virtual running in local network or on Internet, which are embedded with various sensors for collecting the body performance data and connected with computers or Set-Top Box (STB) to synchronously control the player's avatar in the virtual game environment. VNM is implemented on a novel ISCAL model (Immersion, Scientificalness, Competitiveness, Adaptability and Learning) model for exergames design, The exercise time and intensity strictly conform to the physical exercise guidelines of the ACSM (American College of Sports Medicine), and a novel demonstration-based non-player modeling technique is employed to simulate the marathon race crowd. The VNM also allows players to learn the Chinese culture and the Olympic knowledge in the tour mode during or after the running exercise. The user study of VNM indicates that the proposed ISCAL model can be successfully applied in the exergame design.","PeriodicalId":197131,"journal":{"name":"2011 IEEE International Symposium on VR Innovation","volume":"277 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2011-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121739263","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}