{"title":"A real-time tracker for markerless augmented reality","authors":"Andrew I. Comport, É. Marchand, F. Chaumette","doi":"10.1109/ISMAR.2003.1240686","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240686","url":null,"abstract":"Augmented reality has now progressed to the point where real-time applications are required and being considered. At the same time it is important that synthetic elements are rendered and aligned in the scene in an accurate and visually acceptable way. In order to address these issues a real-time, robust and efficient 3D model-based tracking algorithm is proposed for a 'video see through' monocular vision system. The tracking of objects in the scene amounts to calculating the pose between the camera and the objects. Virtual objects can then be projected into the scene using the pose. Here, non-linear pose computation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different features including lines, circles, cylinders and spheres. A local moving edge tracker is used in order to provide real-time tracking of points normal to the object contours. A method is proposed for combining local position uncertainty and global pose uncertainty in an efficient and accurate way by propagating uncertainty. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively re-weighted least squares implementation. The method presented in this paper has been validated on several complex image sequences including outdoor environments. Results show the method to be robust to occlusion, changes in illumination and mistracking.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127232169","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Stereo depth assessment experiment for microscope-based surgery","authors":"R. Lapeer, A. Tan, A. Linney, G. Alusi","doi":"10.1109/ISMAR.2003.1240716","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240716","url":null,"abstract":"We present experimental data on the use of autostereoscopic displays as complementary visualization aids to the surgical stereo microscope for augmented reality surgery. Five experts in the use of the microscope, and five non-experts, performed a depth experiment to assess stereo cues as provided by two autostereoscopic displays (DTI 2015XLS Virtual Window and SHARP micro-optic twin), the surgical microscope and the \"naked\" eye.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127097296","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DART: the Designer's Augmented Reality Toolkit","authors":"B. MacIntyre, Maribeth Gandy Coleman, J. Bolter, Steven W. Dow, B. Hannigan","doi":"10.1109/ISMAR.2003.1240744","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240744","url":null,"abstract":"This demonstration highlights the Designer's Augmented Reality Toolkit (DART), a system that allows users to easily create augmented reality (AR) experiences. Over the past year our research has focused on the creation of this toolkit that can be used by technologists, designers, and students alike to rapidly prototype AR applications. Current approaches to AR development involve extensive programming and content creation as well as knowledge of technical topics involving cameras, trackers, and 3D geometry. The result is that it is very difficult even for technologists to create AR experiences. Our goal was to eliminate these obstacles that prevent such users from being able to experiment with AR. The DART system is based on the Macromedia Director multimedia-programming environment, the de facto standard for multimedia content creation. DART uses the familiar Director paradigms of a score, sprites and behaviors to allow a user to visually create complex AR applications. DART also provides low-level support for the management of trackers, sensors, and camera via a Director plug-in Xtra. This demonstration will show the wide range of AR and other types of multimedia applications that can be created with DART, and visitors will have the opportunity to use DART to create their own experiences.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127768879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A tracker alignment framework for augmented reality","authors":"Y. Baillot, S. Julier, Dennis G. Brown, M. Livingston","doi":"10.1109/ISMAR.2003.1240697","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240697","url":null,"abstract":"To achieve accurate registration, the transformations which locate the tracking system components with respect to the environment must be known. These transformations relate the base of the tracking system to the virtual world and the tracking system's sensor to the graphics display. In this paper we present a unified, general calibration method for calculating these transformations. A user is asked to align the display with objects in the real world. Using this method, the sensor to display and tracker base to world transformations can be determined with as few as three measurements.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115824174","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hybrid indoor and outdoor tracking for mobile 3D mixed reality","authors":"W. Piekarski, Ben Avery, B. Thomas, P. Malbezin","doi":"10.1109/ISMAR.2003.1240713","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240713","url":null,"abstract":"This paper describes a new hybrid tracking system that integrates standard outdoor augmented reality trackers with a low cost indoor tracker based on the use of fiducial markers. We use multiple cameras, separate orientation sensing, and scene graph integration to improve on similar previous systems. This hybrid tracker allows applications to operate within large indoor and outdoor environments with minimal scaling costs. Tracking is performed using GPS style world coordinates both indoors and outdoors, and has been integrated into our existing mobile 3D modelling applications.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117304121","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Personal positioning based on walking locomotion analysis with self-contained sensors and a wearable camera","authors":"M. Kourogi, T. Kurata","doi":"10.1109/ISMAR.2003.1240693","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240693","url":null,"abstract":"In this paper, we propose a method of personal positioning for a wearable augmented reality (AR) system that allows a user to freely move around indoors and outdoors. The user is equipped with self-contained sensors, a wearable camera, an inertial head tracker and display. The method is based on sensor fusion of estimates for relative displacement caused by human walking locomotion and estimates for absolute position and orientation within a Kalman filtering framework. The former is based on intensive analysis of human walking behavior using self-contained sensors. The latter is based on image matching of video frames from a wearable camera with an image database that was prepared beforehand.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114790827","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A mixed reality system with visual and tangible interaction capability: application to evaluating automobile interior design","authors":"Toshikazu Ohshima, Tsuyoshi Kuroki, Hiroyuki Yamamoto, H. Tamura","doi":"10.1109/ISMAR.2003.1240722","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240722","url":null,"abstract":"This paper presents a mixed reality (MR) system with tangible interface as well as visual fusion in an MR space. The sense of touch is given by physical objects on which computer generated imagery is accurately registered and superimposed. The proposed approach is especially useful in industrial design where digital mockups and physical mockups are thoroughly utilized.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"140 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121969300","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interactive mediated reality","authors":"R. Grasset, Laurence Boissieux, J. Gascuel, D. Schmalstieg","doi":"10.1109/ISMAR.2003.1240731","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240731","url":null,"abstract":"Mediated reality describes the concept of filtering or vision of reality, typically using a head-worn video mixing display. In this paper, we propose a generalized concept and new tools for interactively mediated reality. We present also our first prototype system for painting, grabbing and gluing together real and virtual elements.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121794065","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Graphic shadow: augmenting your shadow on the floor","authors":"Hiroshi Kato, T. Naemura, H. Harashima","doi":"10.1109/ISMAR.2003.1240733","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240733","url":null,"abstract":"This paper proposes real-time interactive systems illustrating your shadows cast on the floor, which we name \"graphic shadow.\" They will make you experience an exciting space of interaction performing an illusion on your own shadows.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131390248","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ARWin - a desktop augmented reality Window Manager","authors":"S. DiVerdi, D. Nurmi, Tobias Höllerer","doi":"10.1109/ISMAR.2003.1240729","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240729","url":null,"abstract":"We present ARWin, a single user 3D augmented reality desktop. We explain our design considerations and system architecture and discuss a variety of applications and interaction techniques designed to take advantage of this new platform.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"345 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134263194","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}