Latest publications from the 2006 IEEE/ACM International Symposium on Mixed and Augmented Reality

Visualization of sensor data using mobile phone augmented reality
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297820
Ann-Sofie Gunnarsson, Malinda Rauhala, Anders Henrysson, A. Ynnerman
{"title":"Visualization of sensor data using mobile phone augmented reality","authors":"Ann-Sofie Gunnarsson, Malinda Rauhala, Anders Henrysson, A. Ynnerman","doi":"10.1109/ISMAR.2006.297820","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297820","url":null,"abstract":"We have developed a prototype system for visual inspection of hidden structures using a mobile phone wireless ZigBee sensor network. Data collected from an embedded wireless sensor matrix is used to synthesize graphics in real-time. Combining this with augmented reality technology on a mobile phone yields a novel approach to on-site inspection of a broad range of elements and their current internal states.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"82 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134304761","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 12
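The abstract above gives few implementation details; as a minimal sketch of one plausible building block, the Python snippet below bilinearly interpolates a coarse sensor matrix into a heatmap texture that an AR layer could overlay on the hidden structure. The array shapes, value ranges, and color mapping are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def sensor_matrix_to_heatmap(readings, out_h=240, out_w=320):
    """Bilinearly upsample a coarse sensor matrix (values assumed in [0, 1])
    into a per-pixel RGBA heatmap suitable for texturing onto an AR overlay."""
    rows, cols = readings.shape
    # Sample positions of the output grid inside the sensor grid.
    ys = np.linspace(0, rows - 1, out_h)
    xs = np.linspace(0, cols - 1, out_w)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, rows - 1); x1 = np.minimum(x0 + 1, cols - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    # Bilinear interpolation between the four surrounding sensors.
    v = (readings[np.ix_(y0, x0)] * (1 - wy) * (1 - wx)
         + readings[np.ix_(y0, x1)] * (1 - wy) * wx
         + readings[np.ix_(y1, x0)] * wy * (1 - wx)
         + readings[np.ix_(y1, x1)] * wy * wx)
    # Map low values to blue, high values to red; alpha is a fixed overlay blend.
    rgba = np.zeros((out_h, out_w, 4), dtype=np.float32)
    rgba[..., 0] = v            # red channel
    rgba[..., 2] = 1.0 - v      # blue channel
    rgba[..., 3] = 0.5          # constant overlay transparency
    return rgba

# Example: a 4x5 grid of simulated moisture readings.
heatmap = sensor_matrix_to_heatmap(np.random.rand(4, 5))
```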
Mobile augmented reality interaction techniques for authoring situated media on-site
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297821
Sinem Güven, Steven K. Feiner, Ohan Oda
{"title":"Mobile augmented reality interaction techniques for authoring situated media on-site","authors":"Sinem Güven, Steven K. Feiner, Ohan Oda","doi":"10.1109/ISMAR.2006.297821","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297821","url":null,"abstract":"We present a set of mobile augmented reality interaction techniques for authoring situated media: multimedia and hypermedia that are embedded within the physical environment. Our techniques are designed for use with a tracked hand-held tablet display with an attached camera, and rely on \"freezing\" the frame for later editing.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130949952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 60
Mixed reality pre-visualization and camera-work authoring in filmmaking
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297823
Ryosuke Ichikari, Keisuke Kawano, Asako Kimura, F. Shibata, H. Tamura
{"title":"Mixed reality pre-visualization and camera-work authoring in filmmaking","authors":"Ryosuke Ichikari, Keisuke Kawano, Asako Kimura, F. Shibata, H. Tamura","doi":"10.1109/ISMAR.2006.297823","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297823","url":null,"abstract":"In this paper, we introduce the outline of \"The MR-PreViz Project\" performed in Japan. In the pre-production process of filmmaking, Pre Viz, pre-visualizing the desired scene by CGI, is used as a new technique. As its advanced approach, we propose MR-PreViz that utilized mixed reality technology in current PreViz. MR-PreViz makes it possible to merge the real background and the computer-generated humans and creatures in open set or at outdoor location. The user can consider the camerawork and camera blocking efficiently by using MR-PreViz. This paper introduces the outline of the MR-PreViz project, the design of hardware configuration, camera-work authoring method and the results of prototyping.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122288047","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 18
Photometric inconsistency on a mixed-reality face
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297804
Masayuki Takemura, I. Kitahara, Y. Ohta
{"title":"Photometric inconsistency on a mixed-reality face","authors":"Masayuki Takemura, I. Kitahara, Y. Ohta","doi":"10.1109/ISMAR.2006.297804","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297804","url":null,"abstract":"A mixed-reality face (MR face) is a mosaic face with real and virtual facial parts, presented by overlaying a virtual facial part on a real face using mixed-reality techniques. An MR face is an effective means to improve communication in mixed-reality space by restoring the eye expressions lost when wearing HMDs. Photometric registration between the real and virtual parts is important because our eyes are very sensitive, even to small changes in human faces. However, efforts to achieve perfect 'physical' photometric registration on an MR face are not feasible in an ordinary MR space. Therefore, it is essential to clarify the sensitivity of our eyes to the photometric inconsistencies on an MR face, and to concentrate on resolving them. In this paper, we first present the results of a systematic experiment that evaluated our sensitivity to the photometric inconsistencies on an MR face. Then, a technique to resolve the inconsistency and an experimental system to demonstrate the effectiveness of an MR face are described.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115242854","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
Support system for guitar playing using augmented reality display
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297825
Y. Motokawa, H. Saito
{"title":"Support system for guitar playing using augmented reality display","authors":"Y. Motokawa, H. Saito","doi":"10.1109/ISMAR.2006.297825","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297825","url":null,"abstract":"Learning to play the guitar is difficult. We proposed a system that assists people learning to play the guitar using augmented reality. This system shows a learner how to correctly hold the strings by overlaying a virtual hand model and lines onto a real guitar. The player learning to play the guitar can easily understand the required position by overlapping their hand on a visual guide. An important issue for this system to address is the accurate registration between the visual guide and the guitar, therefore we need to track the pose and the position of the guitar. We also proposed a method to track the guitar with a visual marker and natural features of the guitar. Since we used marker information and edge information as natural features, the system could continually track the guitar. Accordingly, our system can constantly display visual guides at the required position to enable a player to learn to play the guitar in a natural manner.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130296396","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 77
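The key technical step named in the abstract above is registering a visual guide to the tracked guitar. The sketch below illustrates a generic marker-based version of that flow with OpenCV: estimate the guitar pose from four known marker corners, then project fingertip guide points into the camera image. The marker geometry, camera intrinsics, and guide coordinates are invented for illustration and are not from the paper, which also fuses edge features.

```python
import numpy as np
import cv2

# Assumed camera intrinsics (fx, fy, cx, cy) and no lens distortion.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# 3D corners of a square fiducial marker attached to the guitar body,
# expressed in the guitar coordinate frame (metres, illustrative values).
marker_3d = np.array([[0.00, 0.00, 0.0],
                      [0.05, 0.00, 0.0],
                      [0.05, 0.05, 0.0],
                      [0.00, 0.05, 0.0]])

# Fingertip guide positions on the fretboard in the same guitar frame
# (purely hypothetical coordinates for an example chord).
guide_3d = np.array([[-0.30, 0.02, 0.01],
                     [-0.33, 0.03, 0.01],
                     [-0.36, 0.01, 0.01]])

def project_guides(marker_corners_2d):
    """Given detected 2D marker corners (4x2 float array), recover the guitar
    pose and return the 2D image positions where fingertip guides are drawn."""
    ok, rvec, tvec = cv2.solvePnP(marker_3d, marker_corners_2d, K, dist)
    if not ok:
        return None
    pts_2d, _ = cv2.projectPoints(guide_3d, rvec, tvec, K, dist)
    return pts_2d.reshape(-1, 2)
```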
Enhanced visual realism by incorporating camera image effects
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297815
J. Fischer, D. Bartz, W. Straßer
{"title":"Enhanced visual realism by incorporating camera image effects","authors":"J. Fischer, D. Bartz, W. Straßer","doi":"10.1109/ISMAR.2006.297815","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297815","url":null,"abstract":"In video see-through augmented reality (AR), virtual objects are overlaid over digital video images. One particular problem of this image mixing process is that the visual appearance of the computer graphics differs strongly from the real background image. The reason for this is that typical AR systems use fast but simple real-time rendering techniques for displaying virtual objects. In this paper, methods for reducing the impact of three effects which make virtual and real objects easily distinguishable are presented. The first effect is camera image noise, which is contained in the data delivered by the image sensor used for capturing the real scene. The second effect considered is edge aliasing, which makes distinguishing virtual objects from real objects simple. Finally, we consider motion blur, which is caused by the temporal integration of color intensities in the image sensor during fast movements of the camera or observed objects. In this paper, we present a system for generating a realistic simulation of image noise based on a new camera calibration step. Additionally, a rendering algorithm is introduced, which performs a smooth blending between the camera image and virtual objects at their boundary in order to reduce aliasing. Lastly, a rendering method is presented, which produces motion blur according to the current camera movement. The implementation of the new rendering techniques utilizes the programmability of modern graphics processing units (GPUs) and delivers real-time frame rates.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125403452","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 24
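As a rough illustration of two of the three effects discussed above (sensor noise and boundary blending), the sketch below post-processes a composited frame in NumPy/SciPy. The paper's calibrated noise model, motion-blur rendering, and GPU shader implementation are not reproduced; the noise level and blend width here are assumed values.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt, gaussian_filter

def composite_with_effects(camera, virtual, mask, noise_sigma=2.0, blend_px=2.0):
    """camera, virtual: HxWx3 float images in [0, 255]; mask: HxW bool,
    True where virtual geometry was rendered. Adds synthetic sensor-like
    noise to the virtual pixels and feathers the virtual/real boundary."""
    # 1. Per-pixel Gaussian noise so rendered pixels look as noisy as camera
    #    pixels (the paper calibrates this per camera; sigma is an assumption).
    noisy_virtual = virtual + np.random.normal(0.0, noise_sigma, virtual.shape)
    # 2. Feather the mask: alpha ramps up over ~blend_px pixels inside the
    #    virtual region, softening the otherwise aliased silhouette edge.
    dist_inside = distance_transform_edt(mask)
    alpha = np.clip(dist_inside / blend_px, 0.0, 1.0)
    alpha = gaussian_filter(alpha, sigma=0.5)[..., None]
    out = alpha * noisy_virtual + (1.0 - alpha) * camera
    return np.clip(out, 0.0, 255.0)
```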
Automatic online walls detection for immediate use in AR tasks
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297792
Gilles Simon
{"title":"Automatic online walls detection for immediate use in AR tasks","authors":"Gilles Simon","doi":"10.1109/ISMAR.2006.297792","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297792","url":null,"abstract":"This paper proposes a method to automatically detect and reconstruct planar surfaces for immediate use in AR tasks. Traditional methods for plane detection are typically based on the comparison of transfer errors of a homography, which make them sensitive to the choice of a discrimination threshold. We propose a very different approach: the image is divided into a grid and rectangles that belong to the same planar surface are clustered around the local maxima of a Hough transform. As a result, we simultaneously get clusters of coplanar rectangles and the image of their intersection line with a reference plane, which easily leads to their 3D position and orientation. Results are shown on both synthetic and real data.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115144570","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 21
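The clustering step described above can be illustrated with a toy Hough accumulator: assuming each grid rectangle has already produced a candidate intersection line with the reference plane, parameterized as (theta, rho), coplanar rectangles vote into nearby cells and are grouped at the accumulator's local maxima. The discretization, the parameterization, and the way candidate lines are obtained are simplifications for illustration, not the paper's exact formulation.

```python
import numpy as np

def cluster_rectangles(candidate_lines, n_theta=36, n_rho=20, rho_max=500.0):
    """candidate_lines: list of (rect_id, theta, rho) with theta in [0, pi)
    and rho in [0, rho_max), one line hypothesis per grid rectangle. Returns
    accumulator peaks together with the rectangle ids that voted for them."""
    acc = np.zeros((n_theta, n_rho), dtype=int)
    votes = {}
    for rect_id, theta, rho in candidate_lines:
        i = int(theta / np.pi * n_theta) % n_theta
        j = min(int(rho / rho_max * n_rho), n_rho - 1)
        acc[i, j] += 1
        votes.setdefault((i, j), []).append(rect_id)
    # Keep cells that are local maxima with at least two supporting rectangles;
    # each surviving cell corresponds to one detected planar surface.
    clusters = []
    for (i, j), ids in votes.items():
        neighborhood = acc[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        if acc[i, j] >= 2 and acc[i, j] == neighborhood.max():
            clusters.append(((i, j), ids))
    return clusters
```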
Predicting and estimating the accuracy of n-occular optical tracking systems
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297793
M. Bauer, M. Schlegel, D. Pustka, Nassir Navab, G. Klinker
{"title":"Predicting and estimating the accuracy of n-occular optical tracking systems","authors":"M. Bauer, M. Schlegel, D. Pustka, Nassir Navab, G. Klinker","doi":"10.1109/ISMAR.2006.297793","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297793","url":null,"abstract":"Marker-based optical tracking systems are widely used in augmented reality, medical navigation and industrial applications. We propose a model for the prediction of the target registration error (TRE) in these kinds of tracking systems by estimating the fiducial location error (FLE) from two-dimensional errors on the image plane and propagating that error to a given point of interest. We have designed a set of experiments in order to estimate the actual parameters of the model for any given tracking system. We present the results of a study which we used to demonstrate the effect of different sources of error. The method is applied to real applications to show the usefulness for any kind of augmented reality system. We also present a set of tools that can be used to visualize the accuracy at design time.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"142 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115796036","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 33
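The paper derives an analytical propagation of fiducial location error (FLE) to target registration error (TRE). As a rough cross-check of the same idea, the sketch below estimates TRE by Monte Carlo simulation: perturb the fiducials of a rigid marker target with isotropic noise, re-fit the rigid pose, and measure the displacement at a point of interest. The isotropic FLE assumption and the specific target geometry are illustrative, not the paper's image-plane error model.

```python
import numpy as np

def fit_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst
    (Kabsch/Umeyama without scaling)."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s

def monte_carlo_tre(fiducials, target, fle_sigma, n_trials=10000):
    """RMS target registration error at `target` when each fiducial is
    perturbed by isotropic Gaussian noise with standard deviation fle_sigma."""
    errs = np.empty(n_trials)
    for k in range(n_trials):
        noisy = fiducials + np.random.normal(0.0, fle_sigma, fiducials.shape)
        R, t = fit_rigid(fiducials, noisy)
        errs[k] = np.linalg.norm(R @ target + t - target)
    return np.sqrt(np.mean(errs ** 2))

# Example: a 4-fiducial target, FLE of 0.2 mm, point of interest 150 mm away.
fiducials = np.array([[0, 0, 0], [60, 0, 0], [0, 60, 0], [60, 60, 0]], float)
print(monte_carlo_tre(fiducials, np.array([150.0, 30.0, 0.0]), fle_sigma=0.2))
```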
Online camera pose estimation in partially known and dynamic scenes
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297795
G. Bleser, H. Wuest, D. Stricker
{"title":"Online camera pose estimation in partially known and dynamic scenes","authors":"G. Bleser, H. Wuest, D. Stricker","doi":"10.1109/ISMAR.2006.297795","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297795","url":null,"abstract":"One of the key requirements of augmented reality systems is a robust real-time camera pose estimation. In this paper we present a robust approach, which does neither depend on offline pre-processing steps nor on pre-knowledge of the entire target scene. The connection between the real and the virtual world is made by a given CAD model of one object in the scene. However, the model is only needed for initialization. A line model is created out of the object rendered from a given camera pose and registrated onto the image gradient for finding the initial pose. In the tracking phase, the camera is not restricted to the modeled part of the scene anymore. The scene structure is recovered automatically during tracking. Point features are detected in the images and tracked from frame to frame using a brightness invariant template matching algorithm. Several template patches are extracted from different levels of an image pyramid and are used to make the 2D feature tracking capable for large changes in scale. Occlusion is detected already on the 2D feature tracking level. The features' 3D locations are roughly initialized by linear triangulation and then refined recursively over time using techniques of the Extended Kalman Filter framework. A quality manager handles the influence of a feature on the estimation of the camera pose. As structure and pose recovery are always performed under uncertainty, statistical methods for estimating and propagating uncertainty have been incorporated consequently into both processes. Finally, validation results on synthetic as well as on real video sequences are presented.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"663 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115833187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 131
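Among the components the abstract above lists, the linear triangulation used to initialize a feature's 3D location is the most self-contained; a minimal DLT sketch is given below. The projection matrices and pixel measurements are placeholders, and the paper's recursive EKF refinement and uncertainty propagation are not shown.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature.
    P1, P2: 3x4 projection matrices K[R|t] of two frames.
    x1, x2: (u, v) pixel measurements of the same feature in both frames.
    Returns the homogeneous least-squares 3D point, i.e. the rough
    initialization that a recursive filter would later refine."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Illustrative example: identity camera and a camera translated along x.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])
X_true = np.array([0.2, -0.1, 2.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate_dlt(P1, P2, x1, x2))   # ~ [0.2, -0.1, 2.0]
```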
Visualizing and navigating complex situated hypermedia in augmented and virtual reality
2006 IEEE/ACM International Symposium on Mixed and Augmented Reality Pub Date : 2006-10-22 DOI: 10.1109/ISMAR.2006.297807
Sinem Güven, Steven K. Feiner
{"title":"Visualizing and navigating complex situated hypermedia in augmented and virtual reality","authors":"Sinem Güven, Steven K. Feiner","doi":"10.1109/ISMAR.2006.297807","DOIUrl":"https://doi.org/10.1109/ISMAR.2006.297807","url":null,"abstract":"We present a set of techniques that enable mobile users to visualize and navigate complex hypermedia structures embedded in the real world, through augmented reality or virtual reality. Situating hypermedia in the 3D physical environment makes it possible to represent information about users' surroundings in context. However, it requires addressing a new set of problems beyond those of visualizing hypermedia on a 2D display: Nodes and links can potentially be distributed across large distances, and may be occluded by other objects, both real and virtual. Our techniques address these issues by enabling mobile users to select and manipulate portions of the hypermedia structure by tilting, lifting and shifting them, to view more clearly links and nodes that would otherwise be occluded or ambiguously connected.","PeriodicalId":332844,"journal":{"name":"2006 IEEE/ACM International Symposium on Mixed and Augmented Reality","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2006-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116923776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 19