Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99): Latest Publications

A method for calibrating see-through head-mounted displays for AR
E. McGarrity, M. Tuceryan
DOI: 10.1109/IWAR.1999.803808 (https://doi.org/10.1109/IWAR.1999.803808)
Abstract: In order to have a working augmented reality (AR) system, the see-through system must be calibrated such that the internal models of objects match their physical counterparts. By match, we mean they should have the same position, orientation, and size information, as well as any intrinsic parameters (such as focal lengths in the case of cameras) that their physical counterparts have. To this end, a procedure must be developed which estimates the parameters of these internal models. This calibration method must be both accurate and simple to use. This paper reports on our efforts to implement a calibration method for a see-through head-mounted display. We use a dynamic system in which a user interactively modifies the camera parameters until the image of a calibration object matches the image of a corresponding physical object. The calibration method is dynamic in the sense that we do not require the user's head to be immobilized.
Published: 1999-10-20 | Citations: 48
Calibration propagation for image augmentation
D. Stricker, Nassir Navab
DOI: 10.1109/IWAR.1999.803810 (https://doi.org/10.1109/IWAR.1999.803810)
Abstract: Calibration is the first step in image augmentation. Classical approaches compute the projection matrix given 3D points of the scene and their 2D image correspondences. Different auto-calibration algorithms have recently been developed by the computer vision community. They do not use 3D-2D correspondences, but need many 2D-2D correspondences over a long sequence of images to provide stable results. We propose a calibration propagation procedure which sits between the two previous approaches. Starting from one calibrated image, the unknown camera parameters and position are computed for a second image. In particular, the paper presents a method for extracting the focal length and the 3D structure, while the other camera intrinsic parameters remain invariant. In practice, for many professional cameras, the principal point is approximately at the center of the image and the aspect ratio is given by the camera specification. Calibration propagation is relevant to augmented reality applications, e.g. a video see-through HMD with zooming capability, since it enables image augmentation for a number of camera views with changing intrinsic parameters. We present results on synthetic images showing the theoretical validity and performance of the method. We then use real data to demonstrate the potential of this approach for image augmentation applications in industrial maintenance assistance and architectural design.
Published: 1999-10-20 | Citations: 5
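The "classical approaches" this abstract contrasts itself with recover the 3x4 projection matrix directly from 3D-2D correspondences. The papers themselves give no code; the following is a minimal sketch of that baseline using the standard Direct Linear Transform, with hypothetical function names and NumPy assumed:

```python
import numpy as np

def calibrate_dlt(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P from >= 6 noise-free 3D-2D
    correspondences via the Direct Linear Transform: each pair contributes
    two rows to a homogeneous system A p = 0, solved by SVD."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        Xh = [X, Y, Z, 1.0]
        A.append(Xh + [0.0] * 4 + [-u * c for c in Xh])
        A.append([0.0] * 4 + Xh + [-v * c for c in Xh])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)   # null vector of A, reshaped, defined up to scale
    return P / P[-1, -1]       # normalize for readability (assumes P[2,3] != 0)

def project(P, point_3d):
    """Apply P to a 3D point and dehomogenize to pixel coordinates."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]
```

With exact correspondences the recovered matrix reproduces the input projections; with real, noisy data one would normalize the coordinates first and refine the estimate nonlinearly.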
Table-top spatially-augmented reality: bringing physical models to life with projected imagery
R. Raskar, G. Welch, Wei-Chao Chen
DOI: 10.1109/IWAR.1999.803807 (https://doi.org/10.1109/IWAR.1999.803807)
Abstract: Despite the availability of high-quality graphics systems, architects and designers still build scaled physical models of buildings and products. These physical models have many advantages; however, they are typically static in structure and surface characteristics. They are inherently lifeless. In contrast, high-quality graphics systems are tremendously flexible, allowing viewers to see alternative structures, facades, textures, cut-away views, and even dynamic effects such as changing lighting, moving automobiles, people, etc. We introduce a combination of these approaches that builds on our previously published projector-based spatially-augmented reality techniques. The basic idea is to aim multiple ceiling-mounted light projectors inward to graphically augment table-top scaled physical models of buildings or products. This approach promises to provide very compelling hybrid visualizations that afford the benefits of both traditional physical models and modern computer graphics, effectively "bringing to life" table-top physical models.
Published: 1999-10-20 | Citations: 121
Scene augmentation via the fusion of industrial drawings and uncalibrated images with a view to marker-less calibration
Nassir Navab, B. Bascle, M. Appel, Echeyde Cubillo
DOI: 10.1109/IWAR.1999.803813 (https://doi.org/10.1109/IWAR.1999.803813)
Abstract: The application presented augments uncalibrated images of factories with industrial drawings. Industrial drawings are among the most important documents used during the lifetime of industrial environments. They are the only common documents used during design, installation, monitoring and control, maintenance, update and finally dismantling of industrial units. Leading traditional industries towards the full use of virtual and augmented reality technology is impossible unless industrial drawings are integrated into our systems. We provide the missing link between industrial drawings and digital images of industrial sites. On one hand, this could enable us to calibrate cameras and build a 3D model of the scene without using any calibration markers. On the other hand, it brings industrial drawings, floor maps, images and 3D models into one unified framework. This provides a solid foundation for building efficient enhanced virtual industrial environments. The augmented scene is obtained by perspective warping of an industrial drawing of the factory onto its floor, wherever the floor is visible. The visibility of the floor is determined using probabilistic reasoning over a set of clues, including (1) floor color/intensity and (2) image warping and differencing between an uncalibrated stereoscopic image pair using the ground-plane homography. Experimental results illustrate the approach.
Published: 1999-10-20 | Citations: 57
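The perspective warp this abstract describes rests on a ground-plane homography: a 3x3 matrix mapping drawing coordinates to image coordinates, determined by four point correspondences. The paper does not publish code; the following is a minimal sketch of that underlying step (hypothetical function names, NumPy assumed), not the authors' implementation:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H mapping four (or more) src points to
    dst points, up to scale, via the Direct Linear Transform and SVD."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1.0, 0.0, 0.0, 0.0, -u * x, -u * y, -u])
        A.append([0.0, 0.0, 0.0, x, y, 1.0, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize (assumes H[2,2] != 0)

def apply_h(H, pt):
    """Map a 2D point through H and dehomogenize."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]
```

In practice one would hand `H` to an image-warping routine (e.g. OpenCV's `cv2.warpPerspective`) to paste the drawing onto the visible floor pixels.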
Registration with a zoom lens camera for augmented reality applications
Gilles Simon, M. Berger
DOI: 10.1109/IWAR.1999.803811 (https://doi.org/10.1109/IWAR.1999.803811)
Abstract: We focus on the problem of adding computer-generated objects to video sequences that have been shot with a zoom lens camera. While numerous papers have been devoted to registration with fixed focal length, little attention has been paid to zoom lens cameras. We propose an efficient two-stage algorithm for handling zoom changes, which are likely to occur in a video sequence. We first attempt to partition the video into camera motions and zoom variations. Then, classical registration methods are used on the image frames labeled camera motion while keeping the internal parameters constant, whereas only the zoom parameters are updated for the frames labeled zoom variations. Results are presented demonstrating registration on various sequences. Augmented video sequences are also shown.
Published: 1999-10-20 | Citations: 14
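The partition step above hinges on telling zoom variation apart from camera motion between frames. The paper's own criterion is not reproduced here; one crude heuristic is that, with image coordinates taken relative to the principal point, a zoom looks like a pure scaling of matched feature points while a pan looks like a shift. A sketch under those assumptions, with made-up thresholds:

```python
import numpy as np

def label_frame_pair(pts_a, pts_b, scale_tol=0.02, trans_tol=2.0):
    """Crude zoom-vs-motion test for one frame pair: least-squares fit of
    b ~= s * a + t over matched points (coordinates relative to the
    principal point). A clear scale change with negligible translation
    suggests a zoom variation; anything else is treated as camera motion."""
    a = np.asarray(pts_a, float)
    b = np.asarray(pts_b, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    a0, b0 = a - ca, b - cb
    s = (a0 * b0).sum() / (a0 * a0).sum()   # least-squares isotropic scale
    t = cb - s * ca                          # residual translation
    if abs(s - 1.0) > scale_tol and np.linalg.norm(t) < trans_tol:
        return "zoom"
    return "motion"
```

A real system would fit this robustly (RANSAC over tracked features) and smooth the labels over time rather than deciding per pair.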