{"title":"User evaluations on form factors of tangible magic lenses","authors":"Ji-Young Oh, H. Hua","doi":"10.1109/ISMAR.2006.297790","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297790","url":null,"abstract":"Magic Lens is a small inset window embedded in a large context display, which provides an alternative view to the region of interest selected from the context view. This metaphor is used for 3D visualization in our Augmented Virtual Environment infrastructure, SCAPE (Stereoscopic Collaboration in Augmented and Projective Environments), which is composed of an immersive room display for a high level of detail, life-size virtual world and a workbench display for simplified god-like view to the world. A tangible Magic Lens is used on the workbench display to allow direct and intuitive selection of continuous levels of detail, bridging the gap between the two extreme levels of detail in SCAPE. This paper presents our first step to the user evaluations of tangible Magic Lens. We conducted two sets of user evaluations, one mainly testing the lens aspect ratio, and another for the lens size. For both of the tests, two types of tasks are conducted: information gathering and relating the detailed information with the context. We found that the aspect ratio of a lens plays more important role in user preference for smaller lenses than for larger ones. Meanwhile, the size of a lens is the most important factor that affects the user performance in the two types of tasks.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129075876","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Texture generation over the marker area","authors":"S. Siltanen","doi":"10.1109/ISMAR.2006.297831","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297831","url":null,"abstract":"In this paper, we present a method for generating a texture for hiding a marker in augmented reality applications. The texture is generated using the neighbourhood of the detected marker area in the image, which enables photorealistic results. The method presented here shows clear potential for real time use.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131934344","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmented reality as a comparison tool in automotive industry","authors":"Stefan Nölle, G. Klinker","doi":"10.1109/ISMAR.2006.297829","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297829","url":null,"abstract":"Augmented reality (AR) can be used in the automotive industry to compare real parts of a car with their associated construction data. The real parts have to be checked whether they correspond to the latest version of the design and whether they have been manufactured with appropriate precision. With AR, CAD construction data can be superimposed on the real parts striving for maximum correspondence. The real and the virtual part should both be visible at the same time and in the same place. Therefore, for this kind of overlay, a special method of augmentation is needed. We present and discuss some visualization schemes.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"150 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132081722","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An event architecture for distributed interactive multisensory rendering","authors":"T. Edmunds, D. Pai","doi":"10.1109/ISMAR.2006.297814","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297814","url":null,"abstract":"We describe an architecture for coping with latency and asynchrony of multisensory events in interactive virtual environments. We propose to decompose multisensory interactions into a series of discrete, perceptually significant events, and structure the application architecture within this event-based context. We analyze the sources of latency, and develop a framework for event prediction and scheduling. Our framework decouples synchronization from latency, and uses prediction to reduce latency when possible. We evaluate the performance of the architecture using vision-based motion sensing and multisensory rendering using haptics, sounds, and graphics. The architecture makes it easy to achieve good performance using commodity off-the-shelf hardware.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127413102","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Markerless augmented reality for cubic panorama sequences","authors":"Christopher R. Warrington, G. Roth, E. Dubois","doi":"10.1109/ISMAR.2006.297832","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297832","url":null,"abstract":"This paper presents a system for introducing augmented reality (AR) enhancements into an image-based cubic panorama sequence. Panoramic cameras, such as the Point Gray Research Ladybug allow rapid capture and generation of panoramic sequences for users to navigate and view. Our AR system provides the ability for authors to add virtual content into the panoramic sequences. First, a user manually selects a planar region over which to add the content. Then the system automatically finds a matching planar region in all the panoramas, allowing the virtual content to propagate. No preconditioning of the imaged scene through the addition of physical markers is necessary. Instead, 3-D position information is obtained by matching interest-point features across the panoramic sequence. This paper presents an application of augmented reality algorithms to the unique case of pre-captured panoramic sequences.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121528900","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Interactive laser-projection for programming industrial robots","authors":"M. Zäh, W. Vogl","doi":"10.1109/ISMAR.2006.297803","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297803","url":null,"abstract":"A method for intuitive and efficient programming of industrial robots based on Augmented Reality (AR) is presented, in which tool trajectories and target coordinates are interactively visualized and manipulated in the robot's environment by means of laser projection. The virtual information relevant for programming, such as trajectories and target coordinates, is projected into the robot's environment and can be manipulated interactively. For an intuitive and efficient user input to the system, spatial interaction techniques have been developed, which enable the user to virtually draw the desired motion paths for processing a work piece surface, directly onto the respective object. The discussed method has been implemented in an integrated AR-user interface and has been initially evaluated in an experimental programming scenario. The obtained results indicate that it enables significantly faster and easier programming of processing tasks compared to currently available shop-floor programming methods.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130289526","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Quantification of visual capabilities using augmented reality displays","authors":"M. Livingston","doi":"10.1109/ISMAR.2006.297788","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297788","url":null,"abstract":"In order to be able to perceive and recognize objects or surface properties of objects, one must be able to resolve the features. These perceptual tasks can be difficult for both graphical representations and real objects in augmented reality (AR) displays. This paper presents the results of objective measurements and two user studies. The first evaluation explores visual acuity and contrast sensitivity; the second explores color perception. Both experiments test users' capabilities with their natural vision against their capabilities using commercially-available AR displays. The limited graphical resolution, reduced brightness, and uncontrollable visual context of the merged environment demonstrably reduce users' visual capabilities. The paper concludes by discussing the implications for display design and AR applications, as well as outlining possible extensions to the current studies.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128528523","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Going out: robust model-based tracking for outdoor augmented reality","authors":"Gerhard Reitmayr, T. Drummond","doi":"10.1109/ISMAR.2006.297801","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297801","url":null,"abstract":"This paper presents a model-based hybrid tracking system for outdoor augmented reality in urban environments enabling accurate, realtime overlays for a handheld device. The system combines several well-known approaches to provide a robust experience that surpasses each of the individual components alone: an edge-based tracker for accurate localisation, gyroscope measurements to deal with fast motions, measurements of gravity and magnetic field to avoid drift, and a back store of reference frames with online frame selection to re-initialize automatically after dynamic occlusions or failures. A novel edge-based tracker dispenses with the conventional edge model, and uses instead a coarse, but textured, 3D model. This yields several advantages: scale-based detail culling is automatic, appearance-based edge signatures can be used to improve matching and the models needed are more commonly available. The accuracy and robustness of the resulting system is demonstrated with comparisons to map-based ground truth data.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133505788","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mobile pointing and input system using active marker","authors":"Youngjin Hong, Sanggoog Lee, Yongbeom Lee, Sang Ryong Kim","doi":"10.1109/ISMAR.2006.297822","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297822","url":null,"abstract":"We present a mobile pointing and input system for mobile phones with minimal hardware parts that allows users who wear an eyeglass display to view and interact with a large virtual image. To use the mobile phone as a pointing and input device, a fiducial marker-based tracking system developed for augmented reality is adapted for our system. The brightness and shape of active marker could be changed according to the state of operating environment and/or the user's intentions in order to increase the detection rate of the marker.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125744020","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Implementation of god-like interaction techniques for supporting collaboration between outdoor AR and indoor tabletop users","authors":"A. Stafford, W. Piekarski, B. Thomas","doi":"10.1109/ISMAR.2006.297809","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297809","url":null,"abstract":"This paper presents a new interaction metaphor we have termed \"god-like interaction\". This is a metaphor for improved communication of situational and navigational information between outdoor users, equipped with mobile augmented reality systems, and indoor users, equipped with tabletop projector display systems. Physical objects are captured by a series of cameras viewing a table surface indoors, the data is sent over a wireless network, and is then reconstructed at a real-world location for outdoor augmented reality users. Our novel god-like interaction metaphor allows users to communicate information using physical props as well as natural gestures. We have constructed a system that implements our god-like interaction metaphor as well as a series of novel applications to facilitate collaboration between indoor and outdoor users. We have extended a well-known video based rendering algorithm to make it suitable for use on outdoor wireless networks of limited bandwidth. This paper also describes the limitations and lessons learned during the design and construction of the hardware that supports this research.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127140792","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}