Latest Articles from the 2015 IEEE International Symposium on Mixed and Augmented Reality

[POSTER] Remote Mixed Reality System Supporting Interactions with Virtualized Objects
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.22
Peng Yang, I. Kitahara, Y. Ohta
Abstract: Mixed Reality (MR) can merge real and virtual worlds seamlessly. This paper proposes a method to realize smooth collaboration using remote MR, which makes it possible for geographically distributed users to share the same objects and communicate in real time as if they were in the same place. We consider a situation in which users at local and remote sites perform collaborative work, and the real objects to be operated exist only at the local site, so the objects must be shared between the two sites. However, prior studies have shown that sharing real objects by duplicating them is either too costly or unrealistic. We therefore propose sharing the objects by virtualizing them with Computer Vision (CV) techniques and then rendering the virtualized objects in MR. Our remote collaborative work system creates a smoother user experience for remote users working with virtualized objects. Experiments confirmed the effectiveness of our approach.
Citations: 7
[POSTER] Hands-Free AR Work Support System Monitoring Work Progress with Point-cloud Data Processing
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.50
H. Sagawa, H. Nagayoshi, Harumi Kiyomizu, T. Kurihara
Abstract: We present a hands-free AR work support system that provides work instructions to workers without interrupting normal work procedures. The system estimates work progress by monitoring the status of work objects solely on the basis of 3D data captured by a depth sensor mounted on a helmet, and it selects appropriate information to display on a head-mounted display (HMD) according to the estimated progress. We describe a prototype of the proposed system and the results of preliminary experiments carried out to evaluate its accuracy and performance.
Citations: 5
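The progress-estimation idea above — judging how far a task has advanced from depth data alone — can be sketched as a point-cloud coverage test. This is a hypothetical simplification, not the authors' algorithm: the function `work_progress`, the brute-force nearest-neighbour matching, and the tolerance value are all illustrative assumptions.

```python
import numpy as np

def work_progress(reference, captured, tol=0.01):
    """Estimate progress as the fraction of reference points (the finished
    state) that are matched, within `tol` metres, by a captured point.
    A toy stand-in for the paper's point-cloud progress monitoring."""
    # Brute-force pairwise distances; fine for small clouds.
    d = np.linalg.norm(reference[:, None, :] - captured[None, :, :], axis=2)
    return float(np.mean(d.min(axis=1) < tol))
```

A real system would replace the brute-force distance matrix with a k-d tree and add outlier handling for sensor noise.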
RGBDX: First Design and Experimental Validation of a Mirror-Based RGBD X-ray Imaging System
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.17
S. Habert, J. Gardiazabal, P. Fallavollita, Nassir Navab
Abstract: This paper presents the first design of a mirror-based RGBD X-ray imaging system and includes an evaluation of the depth errors induced by the mirror when it is used in combination with an infrared pattern-emission RGBD camera. Our evaluation consisted of three experiments. The first demonstrated almost no difference in the camera's depth measurements with and without the mirror. The final two demonstrated that the mirror induced no relative or location-specific errors, showing the feasibility of the RGBDX-ray imaging system. Lastly, we showcase the potential of the RGBDX-ray system for a visualization application in which an X-ray image is fused with the 3D reconstruction of the surgical scene from the RGBD camera, using automatic C-arm pose estimation.
Citations: 15
Simultaneous Direct and Augmented View Distortion Calibration of Optical See-Through Head-Mounted Displays
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.14
Yuta Itoh, G. Klinker
Abstract: In Augmented Reality (AR) with an Optical See-Through Head-Mounted Display (OST-HMD), the spatial calibration between a user's eye and the display screen is crucial for seamless AR experiences. A successful calibration hinges on proper modeling of the display system, which is conceptually broken down into an eye part and an HMD part. This paper breaks the HMD part down further to investigate optical aberration issues. The display optics cause two distinct aberrations that degrade calibration quality: the distortion of incoming light from the physical world, and the distortion of light from the HMD's image source. While methods exist for correcting either distortion independently, to our knowledge no method corrects both simultaneously. This paper proposes a calibration method that corrects both distortions simultaneously for an arbitrary eye position in a given OST-HMD system. We extend a light-field (LF) correction approach [8] originally designed for the former distortion. Our method is camera-based and has an offline learning step and an online correction step. We verify the method in exemplary calibrations of two different OST-HMDs: a professional and a consumer model. The results show that our method significantly improves calibration quality over a conventional method, with accuracy comparable to 20/50 visual acuity. They also indicate that the quality improves only when both distortions are corrected simultaneously.
Citations: 16
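The core operation the calibration performs — undoing a distortion so that rendered content lines up for a given eye position — can be illustrated with a generic radial model. This is only a stand-in under stated assumptions: the paper uses a richer light-field model, and `radial_undistort`, `correct_both`, and the single coefficient per stage are hypothetical.

```python
import numpy as np

def radial_undistort(p, k1):
    """Invert a one-parameter radial distortion p = q * (1 + k1 * |q|^2)
    by fixed-point iteration (converges for small k1)."""
    q = p.copy()
    for _ in range(10):
        r2 = np.sum(q**2, axis=-1, keepdims=True)
        q = p / (1.0 + k1 * r2)
    return q

def correct_both(p, k_world, k_display):
    """Correct the world-light distortion, then the display distortion,
    mirroring the idea of handling both stages in one calibrated pipeline."""
    return radial_undistort(radial_undistort(p, k_world), k_display)
```

Treating the two distortions as separately parameterized stages of one pipeline is the part that matches the paper's framing; the radial model itself is just the simplest concrete choice.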
[POSTER] Fusion of Vision and Inertial Sensing for Accurate and Efficient Pose Tracking on Smartphones
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.23
Xin Yang, Xun Si, Tangli Xue, K. Cheng
Abstract: This paper aims at accurate and efficient pose tracking of planar targets on modern smartphones. Existing methods, relying either on visual features or on motion sensing with built-in inertial sensors, are either too computationally expensive to achieve real-time performance on a smartphone or too noisy to achieve sufficient tracking accuracy. We present a hybrid tracking method that achieves real-time performance with high accuracy. Built on the framework of a state-of-the-art visual feature tracking algorithm [5], which ensures accurate and reliable pose tracking, the hybrid method significantly reduces computational cost with the assistance of the phone's built-in inertial sensors. However, noise in the inertial sensors and abrupt errors in feature tracking caused by severe motion blur can destabilize the hybrid tracking system. To address this, we employ an adaptive Kalman filter with abrupt-error detection to robustly fuse the inertial and feature tracking results. We evaluated the method on a dataset of 16 video clips with synchronized inertial sensing data. Experimental results demonstrated its superior performance and accuracy on smartphones compared to a state-of-the-art vision tracking method [5]. The dataset will be made publicly available with the publication of this paper.
Citations: 5
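The fusion scheme this abstract describes — a Kalman filter that detects and rejects abrupt vision errors — can be sketched in one dimension. This is a hypothetical scalar toy, not the paper's filter: `fuse`, the noise values, and the 3-sigma innovation gate are illustrative assumptions.

```python
def fuse(x, P, u, z, Q=1e-3, R=1e-2, gate=3.0):
    """One predict/update step of a 1-D Kalman filter with innovation
    gating. x/P: pose estimate and variance; u: inertial motion
    increment; z: vision-based pose measurement."""
    # Predict with the (cheap) inertial increment.
    x_pred, P_pred = x + u, P + Q
    # Gate: reject measurements whose innovation is improbable, e.g.
    # abrupt feature-tracking errors under severe motion blur.
    nu = z - x_pred
    S = P_pred + R
    if nu**2 > gate**2 * S:
        return x_pred, P_pred        # keep the inertial prediction
    K = P_pred / S                   # Kalman gain
    return x_pred + K * nu, (1 - K) * P_pred
```

An "adaptive" variant would additionally rescale Q and R from recent innovation statistics rather than keeping them fixed.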
[POSTER] A Particle Filter Approach to Outdoor Localization Using Image-Based Rendering
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.39
Christian Poglitsch, Clemens Arth, D. Schmalstieg, Jonathan Ventura
Abstract: We propose an outdoor localization system based on a particle filter. In our approach, a textured, geo-registered model of the outdoor environment serves as a reference for estimating the pose of a smartphone. The device position and orientation obtained from a Global Positioning System (GPS) receiver and an inertial measurement unit (IMU) are used as a first estimate of the true pose. Multiple pose hypotheses are then randomly distributed about the GPS/IMU measurement and used to produce renderings of the virtual model. With vision-based methods, the rendered images are compared with the image received from the smartphone, and the matching scores are used to update the particle filter. Our system improves the camera pose estimate in real time without user assistance.
Citations: 3
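The update loop the abstract describes — weight pose hypotheses by how well their renderings match the camera image, then resample — is the standard particle-filter correction step. A minimal sketch, with `score` standing in for the rendering comparison (the function names, the fixed seed, and the choice of systematic resampling are assumptions of this sketch, not details from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_update(particles, weights, score):
    """One particle-filter correction: re-weight pose hypotheses by a
    matching score and resample to concentrate particles on good poses."""
    w = weights * np.array([score(p) for p in particles])
    w /= w.sum()
    # Systematic resampling keeps the particle count fixed.
    u = (rng.random() + np.arange(len(w))) / len(w)
    idx = np.minimum(np.searchsorted(np.cumsum(w), u), len(w) - 1)
    return particles[idx], np.full(len(w), 1.0 / len(w))
```

In the paper's setting a particle would be a full 6-DoF pose initialized around the GPS/IMU estimate, and `score` would render the geo-registered model at that pose and compare the rendering with the camera frame.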
[POSTER] Interactive Visualizations for Monoscopic Eyewear to Assist in Manually Orienting Objects in 3D
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.54
Carmine Elvezio, Mengu Sukan, Steven K. Feiner, B. Tversky
Abstract: Assembly or repair tasks often require objects to be held in specific orientations to view them or fit them together. Research has addressed the use of AR to assist in these tasks, delivered as registered overlaid graphics on stereoscopic head-worn displays. In contrast, we are interested in using monoscopic head-worn displays, such as Google Glass. To accommodate their small monoscopic field of view, off center from the user's line of sight, we are exploring alternatives to registered overlays. We describe four interactive rotation-guidance visualizations for tracked objects intended for these displays.
Citations: 7
[POSTER] Toward Enhancing Robustness of DR System: Ranking Model for Background Inpainting
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.53
Mariko Isogawa, Dan Mikami, Kosuke Takahashi, Akira Kojima
Abstract: A method for blindly predicting inpainted image quality is proposed to enhance the robustness of diminished reality (DR), which uses inpainting to remove unwanted objects by replacing them with background textures in real time. The method maps inpainted image features to subjective image quality scores without the need for reference images, enabling more complex background textures to be applied to DR.
Citations: 1
[POSTER] Design Guidelines for Generating Augmented Reality Instructions
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.36
Cledja Rolim, D. Schmalstieg, Denis Kalkofen, V. Teichrieb
Abstract: Most work on instructions in Augmented Reality (AR) does not follow established patterns or design rules; each approach defines its own method for conveying instructions. This work describes our initial results and experiences toward defining design guidelines for AR instructions. The guidelines were derived from a survey of the most common visualization techniques and instruction types applied in AR. We studied how 2D and 3D instructions can be applied in the AR context.
Citations: 10
[POSTER] Overlaying Navigation Signs on a Road Surface Using a Head-Up Display
2015 IEEE International Symposium on Mixed and Augmented Reality | Pub Date: 2015-09-29 | DOI: 10.1109/ISMAR.2015.48
Kaho Ueno, T. Komuro
Abstract: In this paper, we propose a method for overlaying navigation signs on a road surface and displaying them on a head-up display (HUD). Accurate overlay is achieved by measuring 3D data of the surface in real time with a depth camera. In addition, the effect of head movement is reduced by performing face tracking with a camera placed in front of the HUD and by correcting the distortion of projected images according to the driver's viewpoint position. Using an experimental system, we displayed a navigation sign and confirmed that it is overlaid on the surface and appears fixed to the surface in real space.
Citations: 3