2015 IEEE International Symposium on Mixed and Augmented Reality — Latest Publications

[POSTER] A Step Closer To Reality: Closed Loop Dynamic Registration Correction in SAR
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.34
Hemal Naik, Federico Tombari, Christoph Resch, P. Keitler, Nassir Navab
Abstract: In Spatial Augmented Reality (SAR) applications, real-world objects are augmented with virtual content by means of a calibrated camera-projector system. A computer-generated model (CAD) of the real object is used to plan the positions where the virtual content is to be projected. The real object often deviates from its CAD model, which results in misregistered augmentations. We propose a new method to dynamically correct the planned augmentation by compensating for unknown deviations in the object geometry. We use a closed-loop approach in which the projected features are detected in the camera image and deployed as feedback. As a result, the registration misalignment is identified and the augmentations are corrected in the areas affected by the deviation. Our work focuses on SAR applications in the industrial domain, where this problem is omnipresent. We show that our method is effective and beneficial for multiple industrial applications.
Citations: 5

[POSTER] Deformation Estimation of Elastic Bodies Using Multiple Silhouette Images for Endoscopic Image Augmentation
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.49
Akira Saito, M. Nakao, Yuuki Uranishi, T. Matsuda
Abstract: This study proposes a method to estimate elastic deformation using silhouettes obtained from multiple endoscopic images. Our method can estimate the intraoperative deformation of organs using a volumetric mesh model reconstructed from preoperative CT data. We use the silhouette information of elastic bodies not to model the shape but to estimate local displacements. The model shape is updated to satisfy the silhouette constraint while preserving the shape as much as possible. Experimental results showed that the proposed method could estimate the deformation with root-mean-square (RMS) errors of 5.0–10 mm.
Citations: 20

Auditory and Visio-Temporal Distance Coding for 3-Dimensional Perception in Medical Augmented Reality
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.16
F. Bork, B. Fuerst, Anja-Katharina Schneider, Francisco Pinto, C. Graumann, Nassir Navab
Abstract: Image-guided medical interventions increasingly rely on Augmented Reality (AR) visualization to enable surgical navigation. Current systems use 2-D monitors to present the view from external cameras, which does not provide an ideal perception of the 3-D position of the region of interest. Despite this problem, most research targets the direct overlay of diagnostic imaging data, and only a few studies attempt to improve the perception of occluded structures in external camera views. This paper focuses on improving the 3-D perception of an augmented external camera view by combining auditory and visual stimuli in a dynamic multi-sensory AR environment for medical applications. Our approach is based on Temporal Distance Coding (TDC) and an active surgical tool used to interact with occluded virtual objects of interest in the scene in order to gain an improved perception of their 3-D location. Users performed a simulated needle biopsy by targeting virtual lesions rendered inside a patient phantom. Experimental results demonstrate that our TDC-based visualization technique significantly improves localization accuracy, while the addition of auditory feedback increases intuitiveness and speeds completion of the task.
Citations: 27

[POSTER] Improved SPAAM Robustness through Stereo Calibration
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.64
Kenneth R. Moser, J. Swan
Abstract: We investigate methods for improving the robustness and consistency of the Single Point Active Alignment Method (SPAAM) calibration procedure for optical see-through (OST) head-mounted displays (HMDs). Our investigation focuses on two variants of SPAAM. The first uses a standard monocular alignment strategy to calibrate the left and right eye separately, while the second leverages stereoscopic cues available from binocular HMDs to calibrate both eyes simultaneously. We compare results from repeated calibrations between methods using eye-location estimates and interpupillary distance (IPD) measures. Our findings indicate that the stereo SPAAM method produces more accurate and consistent results during calibration than the monocular variant.
Citations: 4

[POSTER] Tracking and Mapping with a Swarm of Heterogeneous Clients
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.40
Philipp Fleck, Clemens Arth, Christian Pirchheim, D. Schmalstieg
Abstract: In this work, we propose a multi-user system for tracking and mapping that accommodates mobile clients with different capabilities, mediated by a server capable of providing real-time structure from motion. Clients share their observations of the scene according to their individual capabilities. This can involve only keyframe tracking, but also mapping and map densification if more computational resources are available. Our contribution is a system architecture that lets heterogeneous clients contribute to a collaborative mapping effort without prescribing fixed capabilities for the client devices. We investigate the implications that the clients' capabilities have on the collaborative reconstruction effort and its use for AR applications.
Citations: 2

[POSTER] Pseudo Printed Fabrics through Projection Mapping
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.51
Yuichiro Fujimoto, Goshiro Yamamoto, Takafumi Taketomi, C. Sandor, H. Kato
Abstract: Projection-based Augmented Reality commonly projects onto rigid objects, while only a few systems project onto deformable objects. In this paper, we present Pseudo Printed Fabrics (PPF), which enables projection onto a deforming piece of cloth. This can be applied to previewing a cloth design while manipulating its shape. We support challenging manipulations, including heavy occlusions and stretching of the cloth. In previous work, we developed a similar system based on a novel marker pattern; PPF extends it in two important aspects. First, we improved performance by two orders of magnitude to achieve interactive frame rates. Second, we developed a new interpolation algorithm to maintain registration during challenging manipulations. We believe that PPF can be applied to domains including virtual try-on and fashion design.
Citations: 3

[POSTER] Remote Welding Robot Manipulation Using Multi-view Images
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.38
Yuichi Hiroi, Kei Obata, Katsuhiro Suzuki, Naoto Ienaga, M. Sugimoto, H. Saito, Tadashi Takamaru
Abstract: This paper proposes a remote welding robot manipulation system using multi-view images. After an operator specifies a two-dimensional path on the images, the system transforms it into a three-dimensional path and displays the movement of the robot by overlaying graphics on the images. The accuracy of our system is sufficient for welding objects when combined with a sensor in the robot. The system allows a non-expert operator to weld objects remotely and intuitively, without the need to create a 3D model of the processed object beforehand.
Citations: 3

[POSTER] Geometric Mapping for Color Compensation Using Scene Adaptive Patches
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.67
Jong Hun Lee, Yong Hwi Kim, Yong Yi Lee, Kwan H. Lee
Abstract: The SAR technique using a projector-camera system allows various effects to be produced on a real scene without physical reconstruction. In order to project content onto a textured scene without color imperfections, geometric and radiometric compensation of the projection image must be performed as preprocessing. In this paper, we present a new geometric mapping method for color compensation in a projector-camera system. We capture the scene and segment it into adaptive patches according to the scene structure using SLIC segmentation. A piecewise polynomial function is evaluated for each patch to find pixel-to-pixel correspondences between the measured and projection images. Finally, color compensation is performed using a color mixing matrix. Experimental results show that our geometric mapping method establishes accurate correspondences and that the color compensation alleviates the color imperfections caused by the texture of a general scene.
Citations: 3

[POSTER] Avatar-Mediated Contact Interaction between Remote Users for Social Telepresence
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-29. DOI: 10.1109/ISMAR.2015.61
Jihye Oh, Yeonjoon Kim, Taeil Jin, Sukwon Lee, Youjin Lee, Sung-Hee Lee
Abstract: Social touch such as a handshake increases the sense of coexistence and closeness between remote users in a social telepresence environment, but creating such coordinated contact movements with a distant person is extremely difficult given only visual feedback, without haptic feedback. This paper presents a method to enable hand-contact interaction between remote users in an avatar-mediated telepresence environment. The key idea is that, while the avatar directly follows its owner's motion under normal conditions, it adjusts its pose to maintain contact with the other user when the two users attempt contact interaction. To this end, we develop classifiers to recognize the users' intention for contact interaction: a contact classifier identifies whether the users are trying to initiate contact when they are not in contact, and a separation classifier identifies whether two users in contact are attempting to break contact. The classifiers are trained on a set of geometric distance features. During the contact phase, inverse kinematics is solved to determine the pose of the avatar's arm so as to initiate and maintain natural contact with the other user's hand. Our system is unique in that two remote users can perform real-time hand-contact interaction in a social telepresence environment.
Citations: 0

[POSTER] Augmented Reality for Radiation Awareness
2015 IEEE International Symposium on Mixed and Augmented Reality. Pub Date: 2015-09-01. DOI: 10.1109/ISMAR.2015.21
Nicola Leucht, S. Habert, P. Wucherer, S. Weidert, Nassir Navab, P. Fallavollita
Abstract: C-arm fluoroscopes are frequently used for intraoperative guidance during surgery. Unfortunately, due to X-ray emission and scattering, increased radiation exposure occurs in the operating theatre. The objective of this work is to sensitize surgeons to their radiation exposure, enable them to track their exposure over time, and help them choose the best position relative to the C-arm gantry during surgery. We simulate the amount of radiation that reaches the surgeon using the Geant4 software, a toolkit developed at CERN. Using a flexible setup in which two RGB-D cameras are mounted on the mobile C-arm, the scene is captured and modeled. After simulating particles with specific energies, the dose at the surgeon's position, determined by the depth cameras, can be measured. Validation was performed by comparing the simulation results to both theoretical values from the C-arm's user manual and real measurements made with a QUART didoSVM dosimeter; the average errors were 16.46% and 16.39%, respectively. The proposed flexible setup, offering high simulation precision without calibration against measured dosimeter values, has great potential to be used and integrated intraoperatively for dose measurement.
Citations: 5