{"title":"Visual-olfactory display using olfactory sensory map","authors":"A. Nambu, Takuji Narumi, Kunihiro Nishimura, T. Tanikawa, M. Hirose","doi":"10.1109/VR.2010.5444817","DOIUrl":"https://doi.org/10.1109/VR.2010.5444817","url":null,"abstract":"Existing olfactory displays can present only sets of scents prepared beforehand, because a set of \"primary odors\" has not been found. In this paper, we focus on developing an olfactory display that uses cross-modality to represent more scent patterns than the number of scents prepared. Owing to the cross-modal effect between vision and olfaction, humans tend to perceive visual rather than olfactory stimulation as scent. First, we asked subjects to smell various aroma chemicals and evaluate their similarity. Based on these similarity data, we built a smell-distance map. Next, we selected a few aroma chemicals from the map and implemented a visual-olfactory display. We then conducted an assessment experiment: we presented various pictures together with the selected aroma chemicals and asked subjects what kind of scent they perceived. In this experiment, subjects perceived more scent patterns than the number of selected aroma chemicals, and they perceived the visual stimulation from the pictures, rather than the olfactory stimulation from the aroma chemicals, as scent. In particular, we find that the visual effect on olfactory sensation is stronger when the distance on the map between the picture and the aroma chemical is small than when it is large.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122791041","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Astrojumper: Designing a virtual reality exergame to motivate children with autism to exercise","authors":"Samantha L. Finkelstein, A. Nickel, T. Barnes, Evan A. Suma","doi":"10.1109/VR.2010.5444770","DOIUrl":"https://doi.org/10.1109/VR.2010.5444770","url":null,"abstract":"Children with autism show substantial benefits from rigorous physical activity; however, it is often difficult to motivate these individuals to exercise due to their usually sedentary lifestyles. To address this motivation problem, we have developed Astrojumper, a stereoscopic virtual reality exergame designed to fit the needs of children with autism. During the game, virtual space-themed objects fly toward the user, who must use their own physical movements to avoid collisions. Preliminary playtesting of Astrojumper with neurotypical participants has been positive, and we plan to run an extensive evaluation assessing the psychological and physiological effects of the system on children with and without autism.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123064999","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluation of training effect of tooth scaling simulator by measurement of operation force","authors":"Nobuyoshi Hashimoto, H. Kato, K. Matsui","doi":"10.1109/VR.2010.5444768","DOIUrl":"https://doi.org/10.1109/VR.2010.5444768","url":null,"abstract":"Apprentice dentists and dental hygienists must be trained in tooth scaling. However, most training schools face several problems: a lack of cooperative patients with calculi, and great variability among calculi. In previous work, the authors developed a simulator for this training using a PHANToM and a video see-through HMD. In this paper, the simulator's visual display is improved with stereoscopic rendering. Training effectiveness is then evaluated based on the difference between the operation forces exerted by veterans and by beginners. The results show that the developed simulator is effective and that its teaching function contributes to the training effect, whereas stereo vision without the teaching function contributes little.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131453995","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual equine assisted therapy","authors":"Fraser Anderson, M. Annett, W. Bischof, P. Boulanger","doi":"10.1109/VR.2010.5444776","DOIUrl":"https://doi.org/10.1109/VR.2010.5444776","url":null,"abstract":"People with a wide spectrum of disabilities, ranging from spinal injuries to autism, have benefited from equine assisted therapy (EAT). Using EAT, therapy patients have improved both physically and psychologically (e.g., demonstrating increased attention, motivation, and communication skills). There are still many open questions regarding this therapy and the reasons for its success. Many of these questions have remained unanswered due in large part to the uncontrolled nature of EAT. The Virtual Equine Assisted Therapy (VEAT) Project integrates a robotic platform with virtual reality technologies to provide a safe, controlled environment through which various aspects of EAT can be isolated and studied. The system incorporates realistic equine motions with visual, auditory, olfactory, and somatosensory stimuli to provide highly immersive experiences to patients.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"07 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116136721","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Can you stand on virtual grounds? A study on postural affordances in virtual reality","authors":"Tony Regia-Corte, M. Marchal, A. Lécuyer","doi":"10.1109/VR.2010.5444789","DOIUrl":"https://doi.org/10.1109/VR.2010.5444789","url":null,"abstract":"The concept of affordance, introduced by the psychologist James Gibson, can be defined as the functional utility of an object, a surface or an event. The purpose of this article was to evaluate the perception of affordances in virtual environments (VE). In order to test this perception, we considered the affordances for standing on a virtual slanted surface. The participants were asked to judge whether a virtual slanted surface supported upright stance. The perception was investigated by manipulating the texture of the slanted surface (Wooden texture vs. Ice texture). Results showed an effect of the texture: the perceptual boundary (or critical angle) with the Ice texture was significantly lower than with the Wooden texture. These results reveal that perception of affordances for standing on a slanted surface in virtual reality is possible and comparable to previous studies conducted in real environments.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115143928","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmenting virtual worlds with real-life data from mobile devices","authors":"Heikki Laaki, Karel Kaurila, Kalle Ots, Vik Nuckchady, P. Belimpasakis","doi":"10.1109/VR.2010.5444764","DOIUrl":"https://doi.org/10.1109/VR.2010.5444764","url":null,"abstract":"Virtual worlds have typically been isolated from the real environment and treated as separate, parallel worlds. In this paper we present a scenario in which context data collected from mobile devices is used to augment virtual worlds with real-life data. Life-logging elements are used to control an avatar in a virtual world as a way to replay experiences. We present the prototype system implemented as a proof of concept.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126333008","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interaction capture in immersive virtual environments via an intelligent floor surface","authors":"Y. Visell, Alvin W. Law, J. Ip, Severin Smith, J. Cooperstock","doi":"10.1109/VR.2010.5444748","DOIUrl":"https://doi.org/10.1109/VR.2010.5444748","url":null,"abstract":"We present techniques to enable users to interact on foot with simulated natural ground surfaces, such as soil or ice, in immersive virtual environments. Position and force estimates from in-floor force sensors are used to synthesize plausible auditory and vibrotactile feedback in response. Relevant rendering techniques are discussed in the context of walking on a virtual frozen pond.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129293935","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Performance evaluation method for mobile computer vision systems using augmented reality","authors":"Jonas Nilsson, A. Ödblom, J. Fredriksson, Adeel Zafar, Fahim Ahmed","doi":"10.1109/VR.2010.5444821","DOIUrl":"https://doi.org/10.1109/VR.2010.5444821","url":null,"abstract":"This paper describes a framework that uses augmented reality to evaluate the performance of mobile computer vision systems. Computer vision systems use primarily image data to interpret the surrounding world, e.g., to detect, classify, and track objects. The performance of mobile computer vision systems acting in unknown environments is inherently difficult to evaluate since obtaining ground truth data is often problematic. The proposed novel framework exploits the possibility of adding virtual agents to a real data sequence collected in an unknown environment, making it possible to efficiently create augmented data sequences, including ground truth, for performance evaluation. Varying the content of the data sequence by adding different virtual agents is straightforward, making the framework very flexible. The method has been implemented and tested on a pedestrian detection system used for automotive collision avoidance. Preliminary results show that the method has the potential to replace and complement physical testing, for instance by creating collision scenarios that are difficult to test in reality.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114062938","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Single-pass 3D lens rendering and spatiotemporal “Time Warp” example","authors":"Jan-Phillip Tiesel, C. Borst, Kaushik Das, E. Habib","doi":"10.1109/VR.2010.5444782","DOIUrl":"https://doi.org/10.1109/VR.2010.5444782","url":null,"abstract":"This paper extends 3D lens techniques. Interactive 3D lenses, often called volumetric lenses, provide users with alternative views of datasets within spatially bounded regions of interest (focus) while maintaining the surrounding overview (context). In contrast to previous multi-pass rendering work, we discuss the strengths, limitations, and performance cost of a single-pass technique. For a substantial range of effects, it supports several composable lenses at interactive frame rates, with no performance loss as lens intersections or manipulations increase. Other cases, for which this performance cannot be achieved, are also discussed. Finally, we illustrate possible applications of our lens system, especially new Time Warp lenses for exploring time-varying datasets in interactive VR.","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122888887","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lightweight bleeding and smoke effect for surgical simulators","authors":"Tansel Halic, S. De","doi":"10.1109/VR.2010.5444771","DOIUrl":"https://doi.org/10.1109/VR.2010.5444771","url":null,"abstract":"A fully functional Virtual Reality (VR) surgical simulator often consumes maximum CPU capacity on computationally intensive tasks such as collision detection and response, physics simulation, fluid simulation, and haptics. To maintain interactive rates, these major components are given high priority in sharing the limited computational resources. As a result, smoke and bleeding models, which are less significant than the major modules, often produce poor and unrealistic effects. In this work, a GPU-based method was designed and implemented for creating realistic smoke and bleeding effects in a VR surgical simulator. The techniques were employed in a Laparoscopic Adjustable Gastric Banding (LAGB) simulator [1][10].","PeriodicalId":151060,"journal":{"name":"2010 IEEE Virtual Reality Conference (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2010-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129227114","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}