{"title":"Fast annotation and modeling with a single-point laser range finder","authors":"Jason Wither, Christopher Coffin, Jonathan Ventura, Tobias Höllerer","doi":"10.1109/ISMAR.2008.4637326","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637326","url":null,"abstract":"This paper presents methodology for integrating a small, single-point laser range finder into a wearable augmented reality system. We first present a way of creating object-aligned annotations with very little user effort. Second, we describe techniques to segment and pop-up foreground objects. Finally, we introduce a method using the laser range finder to incrementally build 3D panoramas from a fixed observer's location. To build a 3D panorama semi-automatically, we track the system's orientation and use the sparse range data acquired as the user looks around in conjunction with real-time image processing to construct geometry around the user's position. Using full 3D panoramic geometry, it is possible for new virtual objects to be placed in the scene with proper lighting and occlusion by real world objects, which increases the expressivity of the AR experience.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"107 11","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132477134","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A differential GPS carrier phase technique for precision outdoor AR tracking","authors":"W. T. Fong, S. Ong, A. Nee","doi":"10.1109/ISMAR.2008.4637319","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637319","url":null,"abstract":"This paper presents a differential GPS carrier phase technique for 3D outdoor position tracking in mobile augmented reality (AR) applications. It has good positioning accuracy, low drift and jitter, and low computation requirement. It eliminates the resolution of integer ambiguities. The position from an initial point is tracked by accumulating the displacement in each time step, which is determined using Differential Single Difference. Preliminary results using low cost GPS receivers show that the position error is 10 cm, and the drift is 0.001 m/s, which can be compensated using linear models. Stable and accurate augmentations in outdoor scenes are demonstrated.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"140 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133266510","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The effect of registration error on tracking distant augmented objects","authors":"M. Livingston, Zhuming Ai","doi":"10.1109/ISMAR.2008.4637329","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637329","url":null,"abstract":"We conducted a user study of the effect of registration error on performance of tracking distant objects in augmented reality. Categorizing error by types that are often used as specifications, we hoped to derive some insight into the ability of users to tolerate noise, latency, and orientation error. We used measurements from actual systems to derive the parameter settings. We expected all three errors to influence users' ability to perform the task correctly and the precision with which they performed the task. We found that high latency had a negative impact on both performance and response time. While noise consistently interacted with the other variables, and orientation error increased user error, the differences between “high” and “low” amounts were smaller than we expected. Results of users' subjective rankings of these three categories of error were surprisingly mixed. Users believed noise was the most detrimental, though statistical analysis of performance refuted this belief. We interpret the results and draw insights for system design.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114739065","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"3D fiducials for scalable AR visual tracking","authors":"J. Steinbis, W. Hoff, T. Vincent","doi":"10.1109/ISMAR.2008.4637357","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637357","url":null,"abstract":"A new vision and inertial pose estimation system was implemented for real-time handheld augmented reality (AR). A sparse set of 3D cone fiducials are utilized for scalable indoor/outdoor tracking, as opposed to traditional planar patterns. The cones are easy to segment and have a large working volume which makes them more suitable for many applications. The pose estimation system receives measurements from the camera and IMU at 30 Hz and 100 Hz respectively. With a dual-core workstation, all measurements can be processed in real-time to update the pose of virtual graphics within the AR display.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"125 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121965234","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Efficiency of techniques for mixed-space collaborative navigation","authors":"A. Stafford, B. Thomas, W. Piekarski","doi":"10.1109/ISMAR.2008.4637356","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637356","url":null,"abstract":"This paper describes the results of a study conducted to determine the efficiency of visual cues for a collaborative navigation task in a mixed-space environment. The task required a user with an exocentric view of a virtual room to navigate a fully immersed user with an egocentric view to an exit. The study compares natural hand-based gestures, a mouse-based interface and an audio only technique to determine their relative efficiency on task completion times. The results show that visual cue-based collaborative navigation techniques are significantly more efficient than an audio-only technique.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121526096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"User evaluation of see-through vision for mobile outdoor augmented reality","authors":"Ben Avery, B. Thomas, W. Piekarski","doi":"10.1109/ISMAR.2008.4637327","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637327","url":null,"abstract":"We have developed a system built on our mobile AR platform that provides users with see-through vision, allowing visualization of occluded objects textured with real-time video information. We present a user study that evaluates the user's ability to view this information and understand the appearance of an outdoor area occluded by a building while using a mobile AR computer. This understanding was compared against a second group of users who watched video footage of the same outdoor area on a regular computer monitor. The comparison found an increased accuracy in locating specific points from the scene for the outdoor AR participants. The outdoor participants also displayed more accurate results, and showed better speed improvement than the indoor group when viewing more than one video simultaneously.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"124 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130348263","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR","authors":"Feng Zhou, H. Duh, M. Billinghurst","doi":"10.1109/ismar.2008.4637362","DOIUrl":"https://doi.org/10.1109/ismar.2008.4637362","url":null,"abstract":"Although Augmented Reality technology was first developed over forty years ago, there has been little survey work giving an overview of recent research in the field. This paper reviews the ten-year development of the work presented at the ISMAR conference and its predecessors with a particular focus on tracking, interaction and display research. It provides a roadmap for future augmented reality research which will be of great value to this relatively young field, and also for helping researchers decide which topics should be explored when they are beginning their own studies in the area.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127672506","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Haptically extended augmented prototyping","authors":"Mariza Dima, D. Arvind, John R. Lee, Mark Wright","doi":"10.1109/ISMAR.2008.4637350","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637350","url":null,"abstract":"This project presents a new display concept, which brings together haptics, augmented and mixed reality and tangible computing within the context of an intuitive conceptual design environment. The project extends the paradigm of augmented prototyping by allowing modelling of virtual geometry on the physical prototype, which can be touched by means of a haptic device. Wireless tracking of the physical prototype is achieved in three different ways, by attaching to it a 'Speck', a tracker, and a Nintendo Wii Remote, and it provides continuous tangible interaction. The physical prototype becomes a tangible interface augmented with mixed reality and with a novel 3D haptic design system.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126512902","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multiple 3D Object tracking for augmented reality","authors":"Youngmin Park, V. Lepetit, Woontack Woo","doi":"10.1109/ISMAR.2008.4637336","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637336","url":null,"abstract":"We present a method that is able to track several 3D objects simultaneously, robustly, and accurately in real-time. While many applications need to consider more than one object in practice, the existing methods for single object tracking do not scale well with the number of objects, and a proper way to deal with several objects is required. Our method combines object detection and tracking: Frame-to-frame tracking is less computationally demanding but is prone to fail, while detection is more robust but slower. We show how to combine them to take the advantages of the two approaches, and demonstrate our method on several real sequences.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131853833","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual redlining for civil engineering in real environments","authors":"Gerhard Schall, Erick Méndez, D. Schmalstieg","doi":"10.1109/ISMAR.2008.4637332","DOIUrl":"https://doi.org/10.1109/ISMAR.2008.4637332","url":null,"abstract":"Field workers of utility companies are regularly engaged in outdoor tasks such as network planning and inspection of underground infrastructure. Redlining is the term used for manually annotating either printed paper maps or a 2D geographic information system on a notebook computer taken to the field. Either of these approaches requires finding the physical location to be annotated on the physical or digital map. In this paper, we describe a mobile Augmented Reality (AR) system capable of supporting virtual redlining. The AR visualization delivered by the system is constructed from data directly extracted from a GIS used in day-to-day production by utility companies. We also report on encouraging trials and interviews performed with professional field workers from the utility sector.","PeriodicalId":168134,"journal":{"name":"2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"80 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123948936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}