{"title":"Model-based tracking with stereovision for AR","authors":"Hesam Najafi, G. Klinker","doi":"10.1109/ISMAR.2003.1240736","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240736","url":null,"abstract":"This demo shows a robust model-based tracker using stereovision. The combined use of a 3D model with stereoscopic analysis allows accurate pose estimation in the presence of partial occlusions by non rigid objects like the hands of the user. Furthermore, using a second camera improves the stability of tracking and also simplifies the algorithm.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"60 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130823034","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Marker-less vision based tracking for mobile augmented reality","authors":"Daniel Beier, R. Billert, B. Brüderlin, Dirk Stichling, B. Kleinjohann","doi":"10.1109/ISMAR.2003.1240709","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240709","url":null,"abstract":"In this paper an object recognition and tracking approach for the mobile, marker-less and PDA-based augmented reality system AR-PDA is described. For object recognition and localization 2D features are extracted from images and compared with a priori known 3D models. The system consists of a 2D graph matching, 3D hypothesis generation and validation and an additional texture based validation step.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121360806","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fully automated and stable registration for augmented reality applications","authors":"V. Lepetit, L. Vacchetti, D. Thalmann, P. Fua","doi":"10.1109/ISMAR.2003.1240692","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240692","url":null,"abstract":"We present a fully automated approach to camera registration for augmented reality systems. It relies on purely passive vision techniques to solve the initialization and real-time tracking problems, given a rough CAD model of parts of the real scene. It does not require a controlled environment, for example placing markers. It handles arbitrarily complex models, occlusions, large camera displacements and drastic aspect changes. This is made possible by two major contributions: the first one is a fast recognition method that detects the known part of the scene, registers the camera with respect to it, and initializes a real-time tracker, which is the second contribution. Our tracker eliminates drift and jitter by merging the information from preceding frames in a traditional recursive tracking fashion with that of a very limited number of key-frames created off-line. In the rare instances where it fails, for example because of large occlusion, it detects the failure and reinvokes the initialization procedure. We present experimental results on several different kinds of objects and scenes.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125366920","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Case studies in application of augmented reality in future media production","authors":"A. Woolard, V. Lalioti, N. Hedley, N. Carrigan, M. Hammond, Joey Julien","doi":"10.1109/ISMAR.2003.1240727","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240727","url":null,"abstract":"In this application-based poster, we describe three case studies about potential application of augmented reality (AR) in the broadcasting and entertainment industry. The poster covers the potential impact on BBC's principal objectives to \"entertain, educate and inform\" in a variety of environments such as broadcast studios, classrooms and in the home.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125573096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A LCD cube transporting high dynamic range light environments","authors":"Kosuke Nakayama, Kosuke Sato","doi":"10.1109/ISMAR.2003.1240741","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240741","url":null,"abstract":"In the MR world, it needs to overlay virtual objects onto real scene with harmonizing the photometric consistency. We developed a new illumination device \"LCD cube\" for keeping photometric consistency. In the demonstration, we plan to present the method to display high dynamic range light environments by \"LCD cube\" for image-based lighting. In addition, we performed light reproduction in real-time.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125047495","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The augmented composer project: the music table","authors":"R. Berry, Mao Makino, Naoto Hikawa, Masami Suzuki","doi":"10.1109/ISMAR.2003.1240749","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240749","url":null,"abstract":"The music table enables the composition of musical patterns by arranging cards on a tabletop. An overhead camera allows the computer to track the movements and positions of the cards and to provide immediate feedback in the form of music and on-screen computer generated images. Musical structure is experienced as a tangible space enriched with physical and visual cues about the music produced.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"126 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123766431","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Consistent illumination within optical see-through augmented environments","authors":"O. Bimber, Anselm Grundhöfer, Gordon Wetzstein, Sebastian Knödel","doi":"10.1109/ISMAR.2003.1240703","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240703","url":null,"abstract":"We present techniques which create a consistent illumination between real and virtual objects inside an application specific optical see-through display: the virtual showcase. We use projectors and cameras to capture reflectance information from diffuse real objects and to illuminate them under new synthetic lighting conditions. Matching direct and indirect lighting effects, such as shading, shadows, reflections and color bleeding can be approximated at interactive rates in such a controlled mixed environment.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133361075","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Immersive evaluation of virtualized soccer match at real stadium model","authors":"Naho Inamoto, H. Saito","doi":"10.1109/ISMAR.2003.1240702","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240702","url":null,"abstract":"This paper presents a novel observation system for immersive soccer match taken by multiple video cameras at a real stadium. The user sees the soccer field model in front of his/her eyes from the viewpoint through head-mounted display, while the images of players and a soccer ball are also rendered onto the display. For geometric registration between the soccer field model in the real world and the dynamic soccer scene in the rendered images, the viewpoint position of the user is computed by using only natural feature lines in the HMD camera image. Since it is difficult to strongly calibrate the HMD camera and the multiple cameras that capture the real soccer scene, we employ projective geometry for the registration and the rendering. For demonstrating the efficacy of the proposed system, video images of soccer matches taken at real stadium are rendered onto the HMD camera images of a tabletop stadium model. This is a completely new challenge to apply augmented reality to a dynamic event in a large-space.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123629819","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}