{"title":"Immersive environment technologies for planetary exploration","authors":"J. Wright, F. Hartman, B. Cooper","doi":"10.1109/VR.2001.913785","DOIUrl":"https://doi.org/10.1109/VR.2001.913785","url":null,"abstract":"Immersive environments are successfully being used to support mission operations at the Jet Propulsion Laboratory. This technology contributed to the Mars Pathfinder Mission in planning sorties for the Sojourner rover. Results and operational experiences with these tools are being incorporated into the development of the second generation of mission planning tools. NASA's current plan includes two rovers being deployed to Mars in 2003 and early 2004. The next generation Rover Control Workstation utilizes existing technologies and more to provide a multimodal, collaborative, partially immersive environment. This system includes tools for planning long range sorties for highly autonomous rovers, tools for building the three-dimensional (3D) models of the terrain being explored, and advanced tools for visualizing telemetry from remote spacecraft and landers. These tools comprise a system for immersing the operator in the environment of another planet, body, or space to make the mission planning function more intuitive and effective.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"3088 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127468253","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Simulator sickness and presence in a high FOV virtual environment","authors":"A. Seay, D. Krum, L. Hodges, W. Ribarsky","doi":"10.1109/VR.2001.913806","DOIUrl":"https://doi.org/10.1109/VR.2001.913806","url":null,"abstract":"As part of a process to identify potential simulator sickness issues with our NAVE (Non-expensive Automatic Virtual Environment), a new virtual environment display system developed at Georgia Tech, we have conducted a study to address the experience of simulator sickness and presence under different display and user role configurations. The NAVE has three 8 feet by 6 feet screens. The two side screens are positioned at 120 degree angles to the central screen to give a three-sided display area that is sixteen feet wide and approximately seven feet deep. This allows for two different field of view configurations, a one-screen set-up that provides a 60 degree FOV, and a three-screen set-up that provides a 180 degree FOV. Users are seated in front of the center screen and navigate with a joystick. The virtual environment used in this study could be displayed in the NAVE stereo-visually or in mono, providing us with a second experimental factor, display fidelity. Lastly, we were interested in determining whether or not the user's role in the environment would affect their experience of simulator sickness and presence as suggested by Stanney and Kennedy (1997). 
This provided us with our third factor user role, with its two levels, driver and passenger.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"112 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122855283","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Passive force display using ER brakes and its control experiments","authors":"M. Sakaguchi, J. Furusho, N. Takesue","doi":"10.1109/VR.2001.913764","DOIUrl":"https://doi.org/10.1109/VR.2001.913764","url":null,"abstract":"Force information is often required for tele-operation systems and virtual reality. Conventional force displays are active systems with actuators. This, however, is inherently active, so that it may become a danger. Consequently, passive force display is an effective method for assuring safety. The authors developed a brake using ER (electrorheological) fluid and passive force display using ER brakes. They discuss two degree of freedom passive force display and basic control experiments.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128233300","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interaction, navigation, and visualization props in complex virtual environments using image based rendering techniques","authors":"S. Stoev, I. Peter, W. Straßer","doi":"10.1109/VR.2001.913802","DOIUrl":"https://doi.org/10.1109/VR.2001.913802","url":null,"abstract":"Present an image-based technique for accelerated rendering of world-in-miniature (WIM)-based and multiple-viewpoint-based props. The WIM technique offers an intuitive and useful tool for navigation and manipulation within virtual environments. The multiple-viewpoint technique is often applied for enhancing visualization and navigation. Unfortunately, both approaches require multiple rendering of the viewed data. This can significantly deteriorate the frame rate and negatively influence the interaction. The proposed approach circumvents this drawback, while still providing the features of these tools.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121435932","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visualization for genome function analysis","authors":"M. Kano, Kunihiro Nishimura, K. Hirota, M. Hirose, H. Aburatani, T. Hamakubo, T. Kodama","doi":"10.1109/VR.2001.913807","DOIUrl":"https://doi.org/10.1109/VR.2001.913807","url":null,"abstract":"In this paper we discuss application possibilities of virtual reality technology such as immersive projection technology to the field of genome science. The prototype of the visualization environment is implemented and used in analysis to elucidate the important genes in categorizing liver cancer.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"144 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115758537","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interactive texturing by polyhedron decomposition","authors":"Volker Leeb, L. Auer, A. Radetzky","doi":"10.1109/VR.2001.913783","DOIUrl":"https://doi.org/10.1109/VR.2001.913783","url":null,"abstract":"For visualization in virtual reality, two topics are of major importance: real-time rendering and realism. To meet these requirements, modern graphics hardware has to be applied wherever possible. A commonly used method to improve realism without decreasing the rendering speed is texturing. Today, fast texture mapping algorithms are even integrated in low-cost hardware. However, high-resolving and non-distorting texturing is very difficult and sometimes even impossible for non-convex complex polyhedra. Nevertheless, the realism of many virtual reality applications could be improved by using non-distorted textures. Especially in surgical simulation, each anatomical detail has to be placed correctly on very complex models of human organs. For this, a new method has been developed, allowing the interactive placement of high-resolution bitmaps to any desired position on the model's surface. In addition, the visualization quality can be improved by using an antialiasing filter. This method, called arbitrary texture placement, utilizes polyhedron decomposition to split one object of complex shape into N triangles. Treating each triangle of the surface as independent object it is possible to assign them an unique part of the texture space where color information can be stored. If a bitmap is applied to the polyhedron's surface, the involved triangles are determined and the pieces of the bitmap inside the triangle are stored at the corresponding areas in the texture space. 
An example shows the appliance of the method to an anatomical model of the ventricular system in the human brain, used for simulating minimally invasive neurosurgery.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"98 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134579695","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distributed virtual reality using Octopus","authors":"P. Hartling, C. Just, C. Cruz-Neira","doi":"10.1109/VR.2001.913770","DOIUrl":"https://doi.org/10.1109/VR.2001.913770","url":null,"abstract":"With the widespread popularity of the Internet and advances in distributed computing and in virtual reality, more flexibility is needed in the development and use of collaborative virtual environments. In this paper, we present Octopus, a cross-platform, object-oriented API for constructing shared virtual worlds. The list of goals for Octopus, a description of its design and a detailed discussion of its implementation are provided. The design description gives explanations of the three components of Octopus: the core that handles networking and data sharing, the interface for implementing user representations in the virtual space (avatars), and the actual implementations of the avatars.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117347478","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effects of viewing and orientation on path following in a medical teleoperation environment","authors":"P. Passmore, C. F. Nielsen, W. J. Cosh, A. Darzi","doi":"10.1109/VR.2001.913788","DOIUrl":"https://doi.org/10.1109/VR.2001.913788","url":null,"abstract":"The use of virtual and augmented reality techniques in medicine is rapidly increasing particularly in the area of minimal access surgery. Such surgery is a form of teleoperation in which accurate perception of depth and orientation, navigation, and interaction with the operative space are vital. Virtual and augmented reality techniques will allow us to produce new views of the operative site and introduce extra information into the scene such as safe paths for instruments to follow etc. A path following task is developed and human factors issues are addressed by varying viewing conditions (standard mono, stereo, multiple views and tool-linked view), presence or absence of haptic feedback, and orientation of the task. The results show that performance is improved with haptic feedback, but not by the various viewing conditions and is significantly worse for side aligned orientations.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123674340","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Building a worldwide photorealistic virtual environment by switching between subenvironments","authors":"T. Tanikawa, M. Hirose","doi":"10.1109/VR.2001.913808","DOIUrl":"https://doi.org/10.1109/VR.2001.913808","url":null,"abstract":"A worldwide photorealistic environment includes huge amounts of datasets of different types and scales. It is almost impossible to describe such an environment using a single methodology. To manage different types and scales of these datasets, we build and handle virtual subenvironments of different types and scales separately. In addition, we propose an image-based combination method to switch between these local environments smoothly according to the user's viewpoint. We implemented this prototype system in ImmersivProjection Technology (IPT) and demonstrated the efficiency of the method.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129286596","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Digital stereoscopic video system with embedded high resolution images","authors":"K. Goshi, K. Matsunaga, Hiroki Nagata, K. Shidoji, Hayato Matsugashita","doi":"10.1109/VR.2001.913786","DOIUrl":"https://doi.org/10.1109/VR.2001.913786","url":null,"abstract":"We developed a stereoscopic video system, which has high-resolution images for central vision and it is called the Q system. The Q system uses a compound image that is a wide-angle image with an embedded high-resolution image. However, the Q system could not be used under situations where many robots work at the same time. This is because it needs four channels of video signals and the available channels could be limited under such situations. Thus, we have developed a digital Q system. The system can be used under such restricted situations, because the required data transfer rate is adjustable by changing the compression rates for a high-resolution image and a wide-angle image. In addition, an experiment confirmed that even though the systems used the same data transfer rate, digital Q system could make teleoperation more efficient and more precise than a conventional stereoscopic video system.","PeriodicalId":445662,"journal":{"name":"Proceedings IEEE Virtual Reality 2001","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2001-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131442766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}