{"title":"Quantification of Contrast Sensitivity and Color Perception using Head-worn Augmented Reality Displays","authors":"M. Livingston, Jane H. Barrow, Ciara M. Sibley","doi":"10.1109/VR.2009.4811009","DOIUrl":"https://doi.org/10.1109/VR.2009.4811009","url":null,"abstract":"Augmented reality (AR) displays often reduce the visual capabilities of the user. This reduction can be measured both objectively and through user studies. We acquired objective measurements with a color meter and conducted two user studies for each of two key measurements. First was the combined effect of resolution and display contrast, which equate to the visual acuity and apparent brightness. The combined effect may be captured by the contrast sensitivity function and measured through analogs of optometric exams. We expanded the number of commercial devices tested in previous studies, including higher resolution and video-overlay AR displays. We found patterns of reduced contrast sensitivity similar to previous work; however, we saw that all displays enabled users to achieve the maximum possible acuity with at least moderate levels of contrast. The second measurement was the perception of color. Objective measurements showed a distortion of color, notably in the blue region of color space. We devised a color matching task to quantify the distortion of color perception, finding that the displays themselves were poor at showing colors in the blue region of color space and that the perceptual distortion of such colors was even greater than the objective distortion. We noted significantly different distortions and variability between displays.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130128433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real Walking Increases Simulator Sickness in Navigationally Complex Virtual Environments","authors":"Evan A. Suma, Samantha L. Finkelstein, Myra Reid, Amy Banic, L. Hodges","doi":"10.1109/VR.2009.4811037","DOIUrl":"https://doi.org/10.1109/VR.2009.4811037","url":null,"abstract":"We report on a study in which we investigate the effects of travel technique on simulator sickness in a real and virtual environment. Participants explored either a real maze or a virtual maze using either natural walking or simulated walking. Reported scores for measures of overall simulator sickness, disorientation, nausea, and oculomotor discomfort were all higher in the natural walking condition than in either the simulated walking or real-world conditions. This indicates that simulated walking is a better choice for reducing simulator sickness during tasks that require a navigationally complex environment and extended periods of time.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128348323","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual Reality-Based Multi-View Visualization of Time-Dependent Simulation Data","authors":"B. Hentschel, M. Wolter, T. Kuhlen","doi":"10.1109/VR.2009.4811041","DOIUrl":"https://doi.org/10.1109/VR.2009.4811041","url":null,"abstract":"The analysis of time-dependent simulation data is a demanding task, both in terms of computing power and time. Interactive analysis using multiple linked views has been shown to be one possible solution to this problem. However, there are two significant shortcomings when limited to a standard desktop-based setup: First, complex spatial relationships are hard to understand using only 2D projections of the data. Second, the size of today's simulation runs is too large to be handled even by powerful workstations. We describe a system for the interactive analysis of large, time-dependent data in virtual environments. Based on the techniques of multiple linked views and brushing, our approach allows the user to quickly formulate, visualize and assess hypotheses about the data. To enable an interactive exploration even in the face of multi-gigabyte data sets, we distribute the workload to a multi-processor parallel machine and a rendering client.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"97 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127437601","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Selection Method of Odor Components for Olfactory Display Using Mass Spectrum Database","authors":"T. Nakamoto, Keisuke Murakami","doi":"10.1109/VR.2009.4811016","DOIUrl":"https://doi.org/10.1109/VR.2009.4811016","url":null,"abstract":"A variety of smells can be realized by blending multiple odor components using an olfactory display. Since a set of odor components that covers the entire range of smells is not yet known, we studied a method of selecting odor components using a large-scale mass spectrum database. Basis vectors corresponding to odor components were extracted by the NMF (nonnegative matrix factorization) method. Then, the recipe of the target odor was obtained using the nonnegative least-squares method. The basis vectors were successfully obtained from 10,000 compounds within a tolerable error. Moreover, the mass spectra of 104 odors composed of 322 compounds could be approximated using 32-50 basis vectors.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125176459","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
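The Nakamoto and Murakami abstract above describes a two-step pipeline: extract nonnegative basis vectors from a mass-spectrum matrix via NMF, then obtain a blend recipe for a target odor via nonnegative least squares. A minimal sketch of that pipeline, using toy random data in place of the paper's 10,000-compound database (all matrix sizes, variable names, and the multiplicative-update NMF variant are assumptions, not details from the paper):

```python
# Sketch of NMF basis extraction + NNLS recipe solving (assumed toy data).
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
# Toy stand-in for a mass-spectrum database: 200 compounds x 64 m/z bins.
spectra = rng.random((200, 64))

def nmf(V, k, iters=200, eps=1e-9):
    """Factor V ~= W @ H with W, H >= 0 via Lee-Seung multiplicative updates."""
    n, m = V.shape
    W = rng.random((n, k)) + eps   # compound loadings
    H = rng.random((k, m)) + eps   # basis spectra ("odor components")
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Step 1: extract k nonnegative basis vectors from the spectrum matrix.
W, H = nmf(spectra, k=16)

# Step 2: recipe for a target odor = nonnegative mixing weights over the basis.
target = spectra[0]
recipe, residual = nnls(H.T, target)   # min ||H^T x - target||_2 s.t. x >= 0
print(recipe.shape)                    # one weight per odor component: (16,)
```

The recipe vector is directly usable as blend ratios because NNLS forbids negative weights, matching the physical constraint that an olfactory display can only add odor components, never subtract them.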
{"title":"A Virtual Iraq System for the Treatment of Combat-Related Posttraumatic Stress Disorder","authors":"S. Yeh, Brad Newman, Matt Liewer, J. Pair, Anton Treskunov, G. Reger, B. Rothbaum, J. Difede, Josh Spitalnick, R. McLay, T. Parsons, A. Rizzo","doi":"10.1109/VR.2009.4811017","DOIUrl":"https://doi.org/10.1109/VR.2009.4811017","url":null,"abstract":"Posttraumatic Stress Disorder (PTSD) is reported to be caused by traumatic events that are outside the range of usual human experience including (but not limited to) military combat, violent personal assault, being kidnapped or taken hostage and terrorist attacks. Initial data suggest that at least 1 out of 5 Iraq War veterans exhibit symptoms of depression, anxiety and PTSD. Virtual Reality (VR) delivered exposure therapy for PTSD has been previously used with reports of positive outcomes. The current paper is a follow-up to a paper presented at IEEE VR2006; it presents the rationale and description of a VR PTSD therapy application (Virtual Iraq) and the findings from its use with active duty service members since the VR2006 presentation. Virtual Iraq consists of a series of customizable virtual scenarios designed to represent relevant Middle Eastern VR contexts for exposure therapy, including a city and desert road convoy environment. User-centered design feedback needed to iteratively evolve the system was gathered from returning Iraq War veterans in the USA and from a system deployed in Iraq and tested by an Army Combat Stress Control Team. Results from an open clinical trial using Virtual Iraq at the Naval Medical Center-San Diego with 20 treatment completers indicate that 16 no longer met PTSD diagnostic criteria at post-treatment, with only one not maintaining treatment gains at 3 month follow-up.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122149285","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Unified Calibration Method with a Parametric Approach for Wide-Field-of-View Multiprojector Displays","authors":"M. Ogata, Hiroyuki Wada, J. Baar, R. Raskar","doi":"10.1109/VR.2009.4811032","DOIUrl":"https://doi.org/10.1109/VR.2009.4811032","url":null,"abstract":"In this paper, we describe techniques for supporting a wide-field-of-view multiprojector curved screen display system. Our main contribution is in achieving automatic geometric calibration and efficient rendering for seamless displays, which is effective even in the presence of panoramic surround screens with the multiview calibration method without polygonal representation of the display surface. We show several prototype systems that use a stereo camera for capturing and a new rendering method for quadric curved screens. Previous approaches have required a calibration camera at the sweet spot. Due to parameterized representation, however, our unified calibration method is independent of the orientation and field of view of the calibration camera. This method can simplify the tedious and complicated installation process as well as the maintenance of large multiprojector displays in planetariums, virtual reality systems, and other visualization venues.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"118 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133340230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring Co-presence for Next Generation Technical Support","authors":"Sinem Güven, Mark Podlaseck, G. Pingali","doi":"10.1109/VR.2009.4811006","DOIUrl":"https://doi.org/10.1109/VR.2009.4811006","url":null,"abstract":"In many technical support systems, live support agents help end-users resolve issues with computer software/hardware and appliances/gadgets in the real world. The dominant mode of communication in such systems is still the telephone, while instant messaging, video communication, and remote take-over have emerged as additional modalities in recent years. In contrast to these, 3D avatar based visual co-presence offers a unique combination of gestural interaction, shared reality, agent multitasking, and anonymity. Our paper argues that such 3D co-presence is viable in computer chat/remote help sessions, as well as real world support over camera equipped devices, offering an attractive alternative and enhancement to today's support modalities.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116177620","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual Reality Training Embedded in Neurosurgical Microscope","authors":"A. Mauro, J. Raczkowsky, M. Halatsch, H. Wörn","doi":"10.1109/VR.2009.4811031","DOIUrl":"https://doi.org/10.1109/VR.2009.4811031","url":null,"abstract":"In this paper, we present the first virtual reality training system for neurosurgical interventions based on a real surgical microscope and a haptic interface, for improved visual and ergonomic realism. Its main purpose is the realistic simulation of the palpation of low-grade glioma. The ability of a surgeon to feel the difference in consistency between tumor cells and normal brain parenchyma requires considerable experience and is a key factor for a successful intervention. The simulation takes advantage of accurate tissue modeling, a force feedback device and the rendering of the virtual scene directly to the oculars of the operating microscope.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"138 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116407505","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Subjective Perception and Objective Measurements in Perceiving Object Softness for VR Surgical Systems","authors":"A. Widmer, Yaoping Hu","doi":"10.1109/VR.2009.4811048","DOIUrl":"https://doi.org/10.1109/VR.2009.4811048","url":null,"abstract":"A critical issue of virtual reality (VR) surgical systems is to correctly represent both haptic and visual information for distinguishing the softness of organs/tissues. We investigated the relationship between subjective perception of object softness and objective measurements of haptic and visual information. On a co-location VR setup, human subjects pressed deformable balls (simulating organs/tissues) under the conditions of both haptic and visual information available and only haptic (or visual) information available. We recorded and analyzed the subject's selection (subjective perception) of the harder object between two balls and objective measurements of maximum force (haptic) and pressing depth (visual). The results preliminarily indicated that subjective perception behaves differently from objective measurements in perceiving object softness. This has implications for creating accurate simulation in VR surgical systems.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132444427","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Immersive Training for Two-Person Radiological Surveys","authors":"Steven Koepnick, D. Norpchen, W. Sherman, D. Coming","doi":"10.1109/VR.2009.4811018","DOIUrl":"https://doi.org/10.1109/VR.2009.4811018","url":null,"abstract":"Civil Support Teams (CST) must be ready to respond to a variety of potential situations involving dangerous materials. For many of these materials, standard real-world training methods can be successfully employed. Training involving radiological agents, however, poses a greater challenge than for other agents due to a lack of materials that can suitably mimic the situation without the danger of the real material. To address the need of providing a good training system for learning how to behave when responding to a radiological threat, we have developed a CST immersive training system. Our system simulates a radiological threat in a virtual environment and allows users to practice surveying the threat using virtual representations of the world and necessary equipment. We developed novel multi-user interaction techniques to enable simultaneous training for two CST members. The 92nd CST tested the system and provided feedback throughout the development process. The team learned to use the system with little coaching, quickly learned to navigate and interact via wand controls, and ultimately performed a successful demonstration of a radiological survey using our system for their superior officers.","PeriodicalId":433266,"journal":{"name":"2009 IEEE Virtual Reality Conference","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2009-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126838433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}