{"title":"Enveloping users and computers in a collaborative 3D augmented reality","authors":"A. Butz, Tobias Höllerer, Steven K. Feiner, B. MacIntyre, Clifford Beshers","doi":"10.1109/IWAR.1999.803804","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803804","url":null,"abstract":"We present EMMIE (Environment Management for Multiuser Information Environments), a prototype experimental user interface to a collaborative augmented environment. Users share a 3D virtual space and manipulate virtual objects that represent information to be discussed. We refer to EMMIE as a hybrid user interface because it combines a variety of different technologies and techniques, including virtual elements such as 3D widgets, and physical objects such as tracked displays and input devices. See-through head-worn displays overlay the virtual environment on the physical environment, visualizing the pervasive \"virtual ether\" within which all interaction occurs. Our prototype includes additional 2D and 3D displays, ranging from palm-sized to wall-sized, allowing the most appropriate one to be used for any task. Objects can be moved among displays (including across dimensionalities) through drag-and-drop. In analogy to 2D window managers, we describe a prototype implementation of a shared 3D environment manager that is distributed across displays, machines, and operating systems. We also discuss two methods we are exploring for handling information privacy in such an environment.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116754010","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Merging visible and invisible: two Camera-Augmented Mobile C-arm (CAMC) applications","authors":"N. Navab, A. Bani-Kashemi, Matthias Mitschke","doi":"10.1109/IWAR.1999.803814","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803814","url":null,"abstract":"This paper presents the basic concept of CAMC and some of its applications. A CCD camera is attached to a mobile C-arm fluoroscopy X-ray system. Both optical and X-ray imaging systems are calibrated in the same coordinate system in an off-line process. The new system is able to provide X-ray and optical images simultaneously. The CAMC framework has great potential for medical augmented reality. We briefly introduce two new CAMC applications to the augmented reality research community. The first application aims at merging video images with a pre-computed tomographic reconstruction of the 3D volume of interest. This is a logical continuation of our work on 3D reconstruction using a CAMC (1999). The second approach is a totally new CAMC design where, using a double mirror system and an appropriate calibration procedure, the X-ray and optical images are merged in real-time. This new system enables the user to see an optical image, an X-ray image, or an augmented image where both visible and invisible are combined in real-time. The paper is organized in two independent sections describing each of the above. Experimental results are provided alongside the methods and apparatus described in each section.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130837947","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Integrating virtual and augmented realities in an outdoor application","authors":"W. Piekarski, B. Gunther, B. Thomas","doi":"10.1109/IWAR.1999.803805","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803805","url":null,"abstract":"This paper explores interconnecting outdoor AR systems with a VR system to achieve collaboration in both domains simultaneously. We envisage multiple mobile users of wearable AR systems interacting with a stationary VR facility via a wireless network. An application in simulated combat training is described, where the AR users are soldiers with wearable computers, and the VR system is located at a command and control centre. For soldiers, AR provides enhanced information about the battlefield environment, which may include the positions and attributes of simulated entities for the purpose of training outdoors at low cost. At the same time a complete picture of the battlefield, including real and simulated troops and vehicles, is available via the VR system. As soldiers move about, their GPS and digital compass hardware provide the remote VR user and other AR users with the means to track their position in real-time. We describe a working system based on our modular Tinmith-II wearable computer, which interacts with a combat simulator to create a synthetic battle environment for safe training and monitoring.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115843033","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Building a hybrid tracking system: integration of optical and magnetic tracking","authors":"T. Auer, A. Pinz","doi":"10.1109/IWAR.1999.803802","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803802","url":null,"abstract":"Multi-user augmented reality systems require excellent registration in order to allow cooperation among the users. In order to satisfy this need we have built a hybrid tracking system integrating magnetic and optical tracking. Prediction from the magnetic tracker allows for rather small search areas in the optical tracking subsystem. Our hybrid tracking system is faster than a standalone optical tracking system and outperforms a magnetic system in terms of accuracy and jitter.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"243 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124676586","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Marker tracking and HMD calibration for a video-based augmented reality conferencing system","authors":"H. Kato, M. Billinghurst","doi":"10.1109/IWAR.1999.803809","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803809","url":null,"abstract":"We describe an augmented reality conferencing system which uses the overlay of virtual images on the real world. Remote collaborators are represented on virtual monitors which can be freely positioned about a user in space. Users can collaboratively view and interact with virtual objects using a shared virtual whiteboard. This is possible through precise virtual image registration using fast and accurate computer vision techniques and head mounted display (HMD) calibration. We propose a method for tracking fiducial markers and a calibration method for an optical see-through HMD based on the marker tracking.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115137417","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A feasible low-power augmented-reality terminal","authors":"J. Pouwelse, K. Langendoen, H. Sips","doi":"10.1109/IWAR.1999.803806","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803806","url":null,"abstract":"This paper studies the requirements for a truly wearable augmented-reality (AR) terminal. The requirements translate into a generic hardware architecture consisting of programmable modules communicating through a central interconnect. Careful selection of low-power components shows that it is feasible to construct an AR terminal that weighs about 2 kg and roughly dissipates 26 W. With state-of-the-art batteries and a 50% average resource utilization, the terminal can operate for about 10 hours.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128651916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The importance of being mobile: some social consequences of wearable augmented reality systems","authors":"Steven K. Feiner","doi":"10.1109/IWAR.1999.803815","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803815","url":null,"abstract":"What are the consequences of mobility for augmented reality? This paper explores some of the issues that I believe will be raised by the development and future commonplace adoption of mobile, wearable, augmented reality systems. These include: social influences on tracking accuracy, the importance of appearance and comfort, an increase in collaborative applications, integration with other devices, and implications for personal privacy.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"127 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114291167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Vision-based pose computation: robust and accurate augmented reality tracking","authors":"Jun Park, Bolan Jiang, U. Neumann","doi":"10.1109/IWAR.1999.803801","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803801","url":null,"abstract":"Vision-based tracking systems have advantages for augmented reality (AR) applications. Their registration can be very accurate, and there is no delay between the motions of real and virtual scene elements. However, vision-based tracking often suffers from limited range, intermittent errors, and dropouts. These shortcomings are due to the need to see multiple calibrated features or fiducials in each frame. To address these shortcomings, features in the scene can be dynamically calibrated and pose calculations can be made robust to noise and numerical instability. In this paper, we survey classic vision-based pose computations and present two methods that offer increased robustness and accuracy in the context of real-time AR tracking.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125097881","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Photometric image-based rendering for virtual lighting image synthesis","authors":"Y. Mukaigawa, Sadahiko Mihashi, Takeshi Shakunaga","doi":"10.1109/IWAR.1999.803812","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803812","url":null,"abstract":"A concept named Photometric Image-Based Rendering (PIBR) is introduced for a seamless augmented reality. PIBR is defined as image-based rendering which covers appearance changes caused by lighting condition changes, while Geometric Image-Based Rendering (GIBR) is defined as image-based rendering which covers appearance changes caused by viewpoint changes. PIBR can be applied to image synthesis to keep photometric consistency between virtual objects and real scenes in arbitrary lighting conditions. We analyze conventional IBR algorithms, and formalize PIBR within the whole IBR framework. A specific algorithm is also presented for realizing PIBR. The photometric linearization makes a controllable framework for PIBR, which consists of four processes: (1) separation of environmental illumination effects, (2) estimation of lighting directions, (3) separation of specular reflections and cast-shadows, and (4) linearization of self-shadows. After the photometric linearization of input images, we can synthesize any realistic images which include not only diffuse reflections but also self-shadows, cast-shadows and specular reflections. Experimental results show that realistic images can be successfully synthesized while keeping photometric consistency.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130142797","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An adaptive estimator for registration in augmented reality","authors":"Lin Chai, Khoi Nguyen, Bill Hoff, T. Vincent","doi":"10.1109/IWAR.1999.803803","DOIUrl":"https://doi.org/10.1109/IWAR.1999.803803","url":null,"abstract":"In augmented reality (AR) systems using head-mounted displays (HMD), it is important to accurately sense the position and orientation (pose) of the user's head with respect to the world, in order that graphical overlays are drawn correctly aligned with real world objects. It is desired to maintain registration dynamically (while the person is moving their head) so that the graphical objects will not appear to lag behind, or swim around, the corresponding real objects. We present an adaptive method for achieving dynamic registration which accounts for variations in the magnitude of the user's head motion, based on a multiple model approach. This approach uses the extended Kalman filter to smooth sensor data and estimate position and orientation.","PeriodicalId":435326,"journal":{"name":"Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125842443","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}