{"title":"Perceiving And Acting In Virtual Environments","authors":"L. Hettinger","doi":"10.1109/VRAIS.1998.658509","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658509","url":null,"abstract":"Grigore C. Burdea, Ph.D. Virtual Reality is revolutionizing Medicine, from diagnosis to surgery and rehabilitation. Medical VR requires realistic organ modeling, dedicated or generic interfaces, use of programming toolkits, and extensive human factor tests to determine outcome. Several research projects around the world will be presented, including early clinical study results. Dr. Grigore Burdea is Associate Professor of Electrical and Computer Engineering at Rutgers. His research interests are in force feedback for virtual reality and its applications in Medicine. He has been Principal or Co-Investigator on projects ranging from hand rehabilitation in VR to training in palpation of virtual malignancies. He authored the books “Virtual Reality Technology” and “Force and Touch Feedback for Virtual Reality” (John Wiley & Sons), and co-edited the book “Computer-Aided Surgery” (MIT Press). Workshop: Interfaces for Wearable Computers Authors: Mark Billinghurst and Thad Starner Abstract: “If, as it is said to be not unlikely in the near future, the principle of sight is applied to the telephone as well as that of sound, earth will be truthfully a paradise, and distance will lose its enchantment by being abolished altogether.” Arthur Strand, 1898 The goal of this workshop is to develop and exchange ideas on how virtual reality techniques can be used to develop intuitive interfaces for wearable computers, particularly collaborative interfaces. It will also aim to uncover promising areas for future wearable interface research and provide a forum for participants to evaluate current interfaces. Pre-workshop Activities: A pre-conference electronic mailing list will be created that will enable participants to begin discussion prior to the workshop. This will enable the development of several common themes that will be explored at the workshop. Attendees will also be encouraged to develop demonstrations of their wearable computers to show at the workshop. Mark Billinghurst is a final year doctoral student at the Human Interface Technology Laboratory (HIT Lab) at the University of Washington, where he co-manages their wearable computing effort. He organized the VRAIS 1996 and VRST 1996 tutorials on Multimodal Interfaces. Past projects he has been involved in include voice and gestural interfaces, evaluation of VR interaction techniques, intelligent virtual interfaces, and collaborative augmented reality environments. His current work involves using VR techniques to develop interfaces for wearable computers. Thad Starner is a doctoral student at the MIT Media Laboratory, where he co-founded the wearable computing project. He helped organize the 1996 Boeing Wearable Computing workshop, the CHI Wearable Computers Workshop and the recent successful IEEE International Symposium on Wearable Computing.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127346843","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Creating A Virtual Bridge To Reality: The Latest Uses Of Virtual Reality For Mental Health","authors":"Dorothy C. Strickland","doi":"10.1109/VRAIS.1998.658497","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658497","url":null,"abstract":"","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134244487","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Vestibular cues and virtual environments","authors":"L. Harris, M. Jenkin, D. Zikovitz","doi":"10.1109/VRAIS.1998.658469","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658469","url":null,"abstract":"The vast majority of virtual environments concentrate on constructing a realistic visual simulation while ignoring non-visual environmental cues. Although these missing cues can to some extent be ignored by an operator, the lack of appropriate cues may contribute to cybersickness and may affect operator performance. We examine the role of vestibular cues to self-motion on an operator's sense of self-motion within a virtual environment. We show that the presence of vestibular cues has a very significant effect on an operator's estimate of self-motion. The addition of vestibular cues, however, is not always beneficial.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115468564","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The NICE project: learning together in a virtual world","authors":"Andrew E. Johnson, Maria Roussos, J. Leigh, C. Vasilakis, C. Barnes, T. Moher","doi":"10.1109/VRAIS.1998.658487","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658487","url":null,"abstract":"This paper describes the NICE project, an immersive learning environment for children implemented in the CAVE and related multi-user virtual reality (VR) technologies. The NICE project provides an engaging setting where children construct and cultivate simple virtual ecosystems, collaborate via networks with other remotely-located children, and create stories from their interactions in the real and virtual world.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122677468","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Constructing 3D natural scene from video sequences with vibrated motions","authors":"Zhigang Zhu, Guangyou Xu, X. Lin","doi":"10.1109/VRAIS.1998.658453","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658453","url":null,"abstract":"This paper presents a systematic approach to automatically construct 3D natural scenes from video sequences. Dense layered depth maps are derived from image sequences captured by a vibrated camera with only approximately known motion. The approach consists of (1) image stabilization by motion filtering and (2) depth estimation by spatio-temporal texture analysis. The two-stage method not only generalizes the so-called panoramic image method and epipolar plane image method to handle image sequence vibrations due to uncontrollable camera fluctuations, but also bypasses the feature extraction and matching problems encountered in stereo or visual motion. Our approach allows automatic modeling of the real environment for inclusion in VR representations.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121472930","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Volume-based tumor neurosurgery planning in the Virtual Workbench","authors":"C. G. Guan, L. Serra, R. Kockro, N. Hern, W. Nowinski, Chumpon Chan","doi":"10.1109/VRAIS.1998.658486","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658486","url":null,"abstract":"We present a virtual reality application for neurosurgical pre-operative planning which is undergoing clinical evaluation at the Singapore General Hospital. The application, based on the ISS Virtual Workbench, lets the neurosurgeon study the brain pathology, blood vessels, skull and the surrounding tissue using real-time volumetric rendering of the patient data. With this information, the surgeon can plan the best approach for surgery. At the moment, seven cases have been planned. The system features measuring markers, multi-modal fusion of a patient's data, different visualization modes, tissue enhancement through manipulation of colour look-up tables, cloning of regions of interest, and interactive pathology outlining.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127858773","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Aiding orientation performance in virtual environments with proprioceptive feedback","authors":"N. H. Bakker, P. Werkhoven, P. O. Passenier","doi":"10.1109/VRAIS.1998.658419","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658419","url":null,"abstract":"In most applications of virtual environments (VEs), like training and design evaluation, a good sense of orientation is needed in the VE. Orientation performance when moving around in the real world relies on visual as well as proprioceptive feedback. However, the navigation metaphors which are used to move around the VE often lack proprioceptive feedback. Furthermore, the visual feedback in a VE is often relatively poor compared to the visual feedback available in the real world. Therefore, we have quantified the influence of visual and proprioceptive feedback on orientation performance in VEs. Subjects were immersed in a virtual forest and were asked to turn specific angles using three navigation metaphors, differing in the kind of proprioceptive feedback which is provided (no proprioceptive feedback, vestibular feedback, and vestibular and kinesthetic feedback). The results indicate that the most accurate turn performance is found when kinesthetic feedback is present, in a condition where subjects use their legs to turn around. This indicates that incorporating this kind of feedback in navigation metaphors is quite beneficial. Orientation on only the visual component is most inaccurate, leading to progressively larger undershoots for larger angles.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134212870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Crashing in cyberspace-evaluating structural behaviour of car bodies in a virtual environment","authors":"M. Schulz, T. Ertl, Thomas Reuding","doi":"10.1109/VRAIS.1998.658485","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658485","url":null,"abstract":"The use of virtual prototypes generated from engineering simulations can be crucial to the efficient development of innovative products. Performance predictions and functional evaluations of a design are possible long before results of real prototype tests are available. With the rise in model complexity, data quantity, computing performance and accuracy, we increasingly find ourselves lacking the tools, methods and metaphors to deal with the information that is being generated. We present new results of on-going research at the University of Erlangen and BMW in the development of a virtual environment for car-body engineering applications, as illustrated by examples from acoustics, vibration and impact dynamics.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1998-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116146078","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MediSim: a prototype VR system for training medical first responders","authors":"S. Stansfield, D. Shawver, A. Sobel","doi":"10.1109/VRAIS.1998.658490","DOIUrl":"https://doi.org/10.1109/VRAIS.1998.658490","url":null,"abstract":"This paper presents a prototype virtual reality (VR) system for training medical first responders. The initial application is to battlefield medicine and focuses on the training of medical corpsmen and other front-line personnel who might be called upon to provide emergency triage on the battlefield. The system is built upon Sandia's multi-user, distributed VR platform and provides an interactive, immersive simulation capability. The user is represented by an Avatar and is able to manipulate his virtual instruments and carry out medical procedures. A dynamic casualty simulation provides realistic cues to the patient's condition (e.g. changing blood pressure and pulse) and responds to the actions of the trainee (e.g. a change in the color of a patient's skin may result from a check of the capillary refill rate). The current casualty simulation is of an injury resulting in a tension pneumothorax. This casualty model was developed by the University of Pennsylvania and integrated into the Sandia MediSim system.","PeriodicalId":105542,"journal":{"name":"Proceedings. IEEE 1998 Virtual Reality Annual International Symposium (Cat. No.98CB36180)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122610835","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}