{"title":"Computer vision based head tracking from re-configurable 2D markers for AR","authors":"Yong Liu, M. Störring, T. Moeslund, C. Madsen, E. Granum","doi":"10.1109/ISMAR.2003.1240712","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240712","url":null,"abstract":"This paper presents a computer vision based head tracking system for augmented reality. A camera is attached to a head mounted display and used to track markers in the user's field of view. These markers are movable on a table and used as an interface to virtual objects. Furthermore, they are used as landmarks to track the user's viewpoint and viewing direction by a homography based camera pose estimation algorithm. By integrating this computer vision tracker with a commercially available InterSense tracker, we take advantage of the former's small jitter without losing the latter's tracking speed. For static and slow head motion the system has less than 0.3 mm RMS of position jitter and 0.165 degrees RMS of orientation jitter.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"117 ","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113984836","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robot vision-based registration utilizing bird's-eye view with user's view","authors":"K. Satoh, Shinji Uchiyama, Hiroyuki Yamamoto, H. Tamura","doi":"10.1109/ISMAR.2003.1240687","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240687","url":null,"abstract":"This paper describes new vision-based registration methods utilizing not only cameras on a user's head-mounted display but also a bird's-eye view camera that observes the user from an objective viewpoint. Two new methods, the line constraint method (LCM) and global error minimization method (GEM), are proposed. The former method reduces the number of unknown parameters concerning the user's viewpoint by restricting it to be on the line of sight from the bird's-eye view. The other method minimizes the sum of errors, which is the sum of the distance between the fiducials on the view and the calculated positions of them based on the current viewing parameters, for both the user's view and the bird's-eye view. The methods proposed here reduce the number of points that should be observed from the user's viewpoint for registration, thus improving the stability. In addition to theoretical discussions, this paper demonstrates the effectiveness of our methods by experiments in comparison with methods that use only a user's view camera or a bird's-eye view camera.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"155 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134229390","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmented reality live-action compositing","authors":"T. Pintaric","doi":"10.1109/ISMAR.2003.1240748","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240748","url":null,"abstract":"This report describes a system that performs live-action compositing of physical and virtual objects to a panoramic background image in real-time at interactive rates. A static camera is directed towards a 40 cm³ miniature stage, whose backdrop has been colored in chromatic green. Users can add virtual objects and manipulate their parameters within the scene by using a proxy device that consists of a small rod attached to a fiducial marker. Our system runs on commodity hardware such as a notebook equipped with a firewire video camera. The necessary chroma-keying and adaptive difference-matting algorithms have been implemented on a GPU using fragment shading.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129382644","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An object-oriented software architecture for 3D mixed reality applications","authors":"W. Piekarski, B. Thomas","doi":"10.1109/ISMAR.2003.1240708","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240708","url":null,"abstract":"This paper presents a new software architecture for 3D mixed reality applications, named Tinmith-evo5. Currently there are a limited number of existing toolkits for the development of 3D mixed reality applications, each optimized for a particular feature to the detriment of others. Complex interactive user interfaces and applications require extensive supporting infrastructure, and can be hampered by inadequate support. The Tinmith-evo5 architecture is optimised to develop mobile augmented reality and other interactive 3D applications on portable platforms with limited resources. This architecture is implemented in C++ with an object-oriented data flow design, an object store based on the Unix file system model, and uses other ideas from existing previous work.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"6 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116796315","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"IP network designer: interface for IP network simulation","authors":"Kazue Kobayashi, Mitsunori Hirano, A. Narita, H. Ishii","doi":"10.1109/ISMAR.2003.1240743","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240743","url":null,"abstract":"In this demonstration, we present IP network designer: interfaces for IP network simulation. The IP network designer consists of two subsystems: the IP network design workbench and a 3D simulator. IP network design workbench is intended to support a collaborative design and simulation of an IP network by a group of network designers and their customers. This system is based on a tangible user interface platform called \"sensetable\" and allows users to directly manipulate network topologies. Users can control parameters of nodes and links using physical pucks on the sensing table and simultaneously see the simulation results projected onto the table. 3D simulator provides a 3D view of simulation results. Users can see traffic packets flow as if they are inside the network. This system allows users to understand network behavior intuitively.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124922342","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards a usable stereoscopic augmented reality interface for the manipulation of virtual cursors","authors":"J. Sands, Shaun W. Lawson","doi":"10.1109/ISMAR.2003.1240720","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240720","url":null,"abstract":"The combination of augmented reality (AR) systems and stereoscopic display devices has created a powerful tool with which to \"supplement\" our view. One application for such systems is the augmented teleoperation of unmanned vehicles deployed in remote or hazardous environments. Research in this area has highlighted the need for accurate 3D measurement techniques - such as through the use of virtual cursors. This paper describes our work in the development of an AR interface designed to assist the accurate selection of position in 3D space. We describe some preliminary experimental work using virtual cursors before discussing how we believe depth cues can be utilized to allow a user to make a more informed judgment of depth in unprepared environments. It is expected that the guidelines outlined in this report will be used as a benchmark for the development of further 3D AR cursors.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124156606","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using augmented reality for visualizing complex graphs in three dimensions","authors":"Daniel Belcher, M. Billinghurst, S. Hayes, R. Stiles","doi":"10.1109/ISMAR.2003.1240691","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240691","url":null,"abstract":"In this paper we explore the effect of using augmented reality (AR) for three-dimensional graph link analysis. Two experiments were conducted. The first was designed to compare a tangible AR interface to a desktop-based interface. Different modes of viewing network graphs were presented using a variety of interfaces. The results of the first experiment show that a tangible AR interface is well suited to link analysis. The second experiment was designed to test the effect of stereographic viewing on graph comprehension. The results show that stereographic viewing has little effect on comprehension and performance. These experiments add support to the work of Ware and Frank, whose studies showed that depth and motion cues provide huge gains in spatial comprehension and accuracy in link analysis.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"147 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127255106","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Authoring of a mixed reality assembly instructor for hierarchical structures","authors":"J. Zauner, M. Haller, Alexander Brandl, W. Hartmann","doi":"10.1109/ISMAR.2003.1240707","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240707","url":null,"abstract":"Mixed reality is a very useful and powerful instrument for the visualization of processes, including the assembly process. A Mixed Reality based step-by-step furniture assembly application is introduced. On the one hand context related actions are given to the user to install elements. On the other hand an intuitive way for authors to create new MR based assembly instructions is provided. Our goal is to provide a powerful, flexible and easy-to-use authoring wizard for assembly experts, allowing them to author their new assembly instructor for hierarchical structures. This minimizes the costs for the creation of new mixed reality assembly instructors.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127367495","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The CREATE project: mixed reality for design, education, and cultural heritage with a constructivist approach","authors":"C. Loscos, Hila Ritter Widenfeld, M. Roussou, Alexandre Meyer, F. Tecchia, G. Drettakis, Emmanuel Gallo, A. R. Martinez, N. Tsingos, Y. Chrysanthou, L. Robert, M. Bergamasco, A. Dettori, S. Soubra","doi":"10.1109/ISMAR.2003.1240721","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240721","url":null,"abstract":"The global scope of the CREATE project is to develop a mixed-reality framework that enables highly interactive real-time construction and manipulation of photo-realistic, virtual worlds based on real data sources. This framework will be tested and applied to cultural heritage content in an educational context, as well as to the design and review of architectural/urban planning settings. The evaluation of the project is based on a human-centered, constructivist approach to working and learning, with special attention paid to the evaluation of the resulting mixed reality experience. Through this approach, participants in an activity \"construct\" their own knowledge by testing ideas and concepts based on their prior knowledge and experience, applying these to a new situation, and integrating the new knowledge gained with pre-existing intellectual constructs. The CREATE project uses a high degree of interactivity, and includes provision for other senses (haptics and sound). The applications developed in CREATE are designed to run on different platforms, and the target systems are SGI and PC driven, with immersive stereo-displays such as a workbench, a ReaCTor (CAVE-like environment), and a wide projection screen.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124492183","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inertial and magnetic sensing of human movement near ferromagnetic materials","authors":"D. Roetenberg, H. Luinge, P. Veltink","doi":"10.1109/ISMAR.2003.1240714","DOIUrl":"https://doi.org/10.1109/ISMAR.2003.1240714","url":null,"abstract":"This paper describes a Kalman filter design to estimate orientation of human body segments by fusing gyroscope, accelerometer and magnetometer signals. Ferromagnetic materials near the sensor disturb the local magnetic field and therefore the orientation estimation. The magnetic disturbance can be detected by looking at the total magnetic density and a magnetic disturbance vector can be calculated. Results show the capability of this filter to correct for magnetic disturbances.","PeriodicalId":296266,"journal":{"name":"The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings.","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121542693","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}