2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct): Latest Articles

Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-Through Based Augmented Reality
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-10-07 | DOI: 10.1109/ISMAR-Adjunct.2016.0109
J. Yu, G. Kim
{"title":"Blurry (Sticky) Finger: Proprioceptive Pointing and Selection of Distant Objects for Optical See-Through Based Augmented Reality","authors":"J. Yu, G. Kim","doi":"10.1109/ISMAR-Adjunct.2016.0109","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0109","url":null,"abstract":"We demonstrate “Blurry (Sticky) Finger” in which one uses the unfocused blurred finger, sense of proprioception, to aim, point and directly select a distant object in the real world with both eyes open. We showcase two demo applications. The first illustrates the accuracy and usability of the proposed method with the target objects lying at a fixed depth on a monitor. The second is an AR based object inquiry system, a more practical application. The user aims and encircles a real 3D object whose image is captured with the eye-to-camera offset compensated. The image is searched through the data base with the result augmented on an OST display.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"91 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126129912","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
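The "eye-to-camera offset compensated" capture mentioned in the abstract can be read as a standard parallax correction: a viewpoint translated by a baseline b sees a point at depth Z shifted by f·b/Z pixels in the image. A minimal sketch of that relation; the focal length, offset, and depth values below are illustrative assumptions, not figures from the paper:

```python
import numpy as np

def parallax_shift_px(focal_px: float, offset_m: np.ndarray, depth_m: float) -> np.ndarray:
    """Pixel shift of a point at depth depth_m when the viewpoint is
    translated by offset_m (the eye-to-camera baseline), pinhole model."""
    # Stereo-disparity relation, applied per image axis: shift = f * b / Z.
    return focal_px * offset_m / depth_m

# Illustrative values: 3 cm horizontal and 1 cm vertical eye-to-camera offset,
# target object 1.5 m away, focal length 1000 px.
shift = parallax_shift_px(1000.0, np.array([0.03, 0.01]), 1.5)
print(shift)  # [20.  6.67] px: shift the selection region by this amount
```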
PoLAR: A Portable Library for Augmented Reality
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-19 | DOI: 10.1109/ISMAR-Adjunct.2016.0081
P. Petitprez, E. Kerrien, Pierre-Frédéric Villard
{"title":"PoLAR: A Portable Library for Augmented Reality","authors":"P. Petitprez, E. Kerrien, Pierre-Frédéric Villard","doi":"10.1109/ISMAR-Adjunct.2016.0081","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0081","url":null,"abstract":"We present here a novel cross-platform library to facilitate research and development applications dealing with augmented reality (AR). Features include 2D and 3D objects visualization and interaction, camera flow and image manipulation, and soft-body deformation. Our aim is to provide computer vision specialists' with tools to facilitate AR application development by providing easy and state of the art access to GUI creation, visualization and hardware management.We demonstrate both the simplicity and the efficiency of coding AR applications through three detailed examples. PoLAR can be downloaded at http://polar.inria.fr and is distributed under the GPL licence.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126486715","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
Designing AR Systems to Explore Point-of-View, Bias, and Trans-cultural Conflict
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0069
Maribeth Gandy Coleman, L. Levy, Scott L. Robertson, Jeremy Johnson, Jeff Wilson, Tony Lemieux, Susan Tamasi, Darlene Mashman, Michele Sumler, Laureen L. Hill
{"title":"Designing AR Systems to Explore Point-of-View, Bias, and Trans-cultural Conflict","authors":"Maribeth Gandy Coleman, L. Levy, Scott L. Robertson, Jeremy Johnson, Jeff Wilson, Tony Lemieux, Susan Tamasi, Darlene Mashman, Michele Sumler, Laureen L. Hill","doi":"10.1109/ISMAR-Adjunct.2016.0069","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0069","url":null,"abstract":"Over ten years ago, we created a novel dramatic augmented reality (AR) experience exploring bias and point-of-view (PoV) based upon the classic film “Twelve Angry Men,” which allowed a user to experience a dramatic jury room deliberation from the PoV of each of four different characters. Recently, informed by this previous work, we have created a new AR platform for engaging users in different PoVs, exposing forms of biases, and studying cultural conflicts. We are currently using this system for training and assessment in two domains: healthcare and psychological studies of terrorism. In this paper we present the requirements we have identified for this type of user experience, the co-design of both AR environments with domain experts, and the results of an initial user study of technology acceptance that yielded positive feedback from participants.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120962527","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Scalable Mobile Image Recognition for Real-Time Video Annotation
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0110
Philipp Fleck, Clemens Arth, D. Schmalstieg
{"title":"Scalable Mobile Image Recognition for Real-Time Video Annotation","authors":"Philipp Fleck, Clemens Arth, D. Schmalstieg","doi":"10.1109/ISMAR-Adjunct.2016.0110","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0110","url":null,"abstract":"Traditional AR frameworks for gaming and advertising focus on tracking 2D static targets. This limits the plausible use of this solutions to certain application cases like brochures or posters, but deprives their use for dynamically changing 2D targets, such as video walls or electronic billboards used in advertising.In this demo, we show how to use a rapid, fully mobile image recognition system to introduce AR in videos playing on TV sets or other dynamic screens, without the need to alter or modify the content for trackability. Our approach uses a scalable and fully mobile concept, which requires a database with a very small memory footprint on mobiles for a video or even a collection of videos.The feasibility of the approach is demonstrated on over 16 hours of video from a popular TV series, indexing into the video and giving accurate time codes and full 6DOF tracking for AR augmentations.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125547468","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
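The authors build a small-footprint on-device database for each video; as a stand-in for their (unspecified) binary-descriptor pipeline, the sketch below illustrates the same index-then-lookup idea using a 64-bit perceptual difference hash per sampled frame, so the index costs one integer key per frame. The hash choice and all names here are assumptions for illustration, not the paper's method:

```python
import numpy as np

def dhash(gray: np.ndarray, size: int = 8) -> int:
    """64-bit difference hash: a compact perceptual fingerprint of a frame."""
    h, w = gray.shape
    ys = np.linspace(0, h, size + 1, dtype=int)
    xs = np.linspace(0, w, size + 2, dtype=int)
    # Block-average down to (size, size + 1), then compare horizontal neighbours.
    small = np.array([[gray[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                       for j in range(size + 1)] for i in range(size)])
    bits = (small[:, 1:] > small[:, :-1]).ravel()
    return int("".join("1" if b else "0" for b in bits), 2)

def build_index(frames, fps: float = 1.0) -> dict:
    """Map fingerprint -> timecode (seconds) for frames sampled at fps."""
    return {dhash(f): i / fps for i, f in enumerate(frames)}

def lookup(index: dict, frame: np.ndarray, max_bits: int = 10):
    """Timecode of the closest stored frame, within max_bits Hamming distance."""
    h = dhash(frame)
    best = min(index, key=lambda k: bin(k ^ h).count("1"))
    return index[best] if bin(best ^ h).count("1") <= max_bits else None
```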
An Evaluation of Information Connection in Augmented Reality for 3D Scenes with Occlusion
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0083
Ralf Dauenhauer, Tobias Müller
{"title":"An Evaluation of Information Connection in Augmented Reality for 3D Scenes with Occlusion","authors":"Ralf Dauenhauer, Tobias Müller","doi":"10.1109/ISMAR-Adjunct.2016.0083","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0083","url":null,"abstract":"Most augmented reality applications connect virtual information to anchors, i.e. physical places or objects, by using spatial overlays or proximity. However, for industrial use cases this is not always feasible because specific parts must remain fully visible in order to meet work or security requirements. In these situations virtual information must be displayed at alternative positions while connections to anchors must still be clearly recognizable. In our previous research we were the first to show that for simple scenes connection lines are most suitable for this. To extend these results to more complex environments, we conducted an experiment on the effects of visual interruptions in connection lines and incorrect occlusion. Completion time and subjective mental effort for search tasks were used as measures. Our findings confirm that also in 3D scenes with partial occlusion connection lines are preferable to connect virtual information with anchors if an assignment via overlay or close proximity is not feasible. The results further imply that neither incorrectly used depth cues nor missing parts of connection lines make a significant difference concerning completion time or subjective mental effort. For designers of industrial augmented reality applications this means that they can choose either visualization based on their needs.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122053875","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
DualCAD: Integrating Augmented Reality with a Desktop GUI and Smartphone Interaction
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0030
Alexandre Millette, Michael J. McGuffin
{"title":"DualCAD: Integrating Augmented Reality with a Desktop GUI and Smartphone Interaction","authors":"Alexandre Millette, Michael J. McGuffin","doi":"10.1109/ISMAR-Adjunct.2016.0030","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0030","url":null,"abstract":"Head-Mounted Displays (HMDs) combined with 3-or-more Degree-of-Freedom (DoF) input enable rapid manipulation of stereoscopic 3D content. However, such input is typically performed with hands in midair and therefore lacks precision and stability. Also, recent consumer-grade HMDs suffer from limited angular resolution and/or limited field-of-view as compared to a desktop monitor. We present the DualCAD system that implements two solutions to these problems. First, the user may freely switch at runtime between an augmented reality HMD mode, and a traditional desktop mode with precise 2D mouse input and an external desktop monitor. Second, while in the augmented reality HMD mode, the user holds a smartphone in their non-dominant hand that is tracked with 6 DoF, allowing it to be used as a complementary high-resolution display as well as an alternative input device for stylus or multitouch input. Two novel bimanual interaction techniques that leverage the properties of the smartphone are presented. We also report initial user feedback.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128579579","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 50
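Using a 6-DoF-tracked smartphone as a complementary display reduces, at its core, to rendering the shared scene through the inverse of the phone's tracked pose. A generic numpy sketch of that step (not DualCAD's actual code; the pose values are illustrative):

```python
import numpy as np

def view_matrix(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """World-to-phone view matrix from a tracked 6-DoF pose (R, t).

    R (3x3) and t (3,) place the phone in world coordinates; rendering the
    shared scene from the phone's viewpoint inverts that rigid transform."""
    V = np.eye(4)
    V[:3, :3] = R.T          # inverse rotation
    V[:3, 3] = -R.T @ t      # inverse translation
    return V

# Example: phone held 40 cm in front of the user, rotated 30 deg about Y.
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
R = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])
t = np.array([0.0, 0.0, 0.4])
print(view_matrix(R, t))
```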
Simultaneous Pose Estimation and Augmentation of Elastic Surfaces from a Moving Monocular Camera
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0076
Nazim Haouchine, M. Berger, S. Cotin
{"title":"Simultaneous Pose Estimation and Augmentation of Elastic Surfaces from a Moving Monocular Camera","authors":"Nazim Haouchine, M. Berger, S. Cotin","doi":"10.1109/ISMAR-Adjunct.2016.0076","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0076","url":null,"abstract":"We present in this paper an original method to estimate the pose of a monocular camera while simultaneously modeling and capturing the elastic deformation of the object to be augmented. Our method tackles a challenging problem where ambiguities between rigid motion and non-rigid deformation are present. This issue represents a major lock for the establishment of an efficient surgical augmented reality where endoscopic camera moves and organs deform. Using an underlying physical model to estimate the low stressed regions our algorithm separates the rigid body motion from the elastic deformations using polar decomposition of the strain tensor. Following this decomposition, a constrained minimization, that encodes both the optical and the physical constraints, is resolved at each frame. Results on real and simulated data are exposed to show the effectiveness of our approach.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126693862","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
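The separation step named in the abstract, polar decomposition, factors a local deformation gradient F into a rigid rotation R and a symmetric stretch S, with F = RS. A minimal numpy sketch of the standard SVD route; the paper embeds this inside a constrained per-frame minimization, which is not reproduced here:

```python
import numpy as np

def polar_decompose(F: np.ndarray):
    """Split a deformation gradient F into rotation R and stretch S (F = R @ S)."""
    U, sig, Vt = np.linalg.svd(F)
    # A physical deformation has det(F) > 0; guard against reflections anyway.
    if np.linalg.det(U @ Vt) < 0:
        U[:, -1] *= -1
        sig = sig.copy()
        sig[-1] = -sig[-1]
    R = U @ Vt                          # pure rotation (rigid part)
    S = Vt.T @ np.diag(sig) @ Vt        # symmetric stretch (elastic part)
    return R, S

# Sanity check: 10-degree rotation composed with a 5% stretch along x.
theta = np.deg2rad(10.0)
Rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
F = Rot @ np.diag([1.05, 1.0])
R, S = polar_decompose(F)
print(np.allclose(R, Rot), np.allclose(R @ S, F))  # True True
```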
A Systematic Review of Usability Studies in Augmented Reality between 2005 and 2014
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0036
Arindam Dey, M. Billinghurst, R. Lindeman, J. Swan II
{"title":"A Systematic Review of Usability Studies in Augmented Reality between 2005 and 2014","authors":"Arindam Dey, M. Billinghurst, R. Lindeman, J. Swan II","doi":"10.1109/ISMAR-Adjunct.2016.0036","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0036","url":null,"abstract":"Augmented Reality (AR) interfaces have been studied extensively over the last few decades, with a growing number of user-based experiments. In this paper, we systematically review most AR papers published between 2005 and 2014 that include user studies. A total of 291 papers have been reviewed and classified based on their application areas. The primary contribution of the review is to present the broad landscape of user-based AR research, and to provide a high-level view of how that landscape has changed. We also identify areas where there have been few user studies, and opportunities for future research. This poster describes the methodology of the review and the classifications of AR research that have emerged.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124191055","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 38
IoT Platform-based iAR: a Prototype for Plant O&M Applications
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0063
Jung Min Lee, Kyung-Ho Lee, B. Nam, Yuepeng Wu
{"title":"IoT Platform-based iAR: a Prototype for Plant O&M Applications","authors":"Jung Min Lee, Kyung-Ho Lee, B. Nam, Yuepeng Wu","doi":"10.1109/ISMAR-Adjunct.2016.0063","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0063","url":null,"abstract":"In this paper, we present an iAR prototype designed for offshore/onshore plant's piping O&M. We have defined terms of iAR as “integrated-AR” and “intelligent-AR” in previous paper. That is, AR technology or application should be integrated with CAD system (or CAD oriented system) and be able to support engineering knowledge to be the successful industry application. The system uses “Intergraph Smart™ 3D for Plant” to export piping model data and “PTC ThingWorx” as IoT platform. The architecture of our prototype involves two main modules: converting pipe drawing data to 3D model and connecting sensor data to 3D model. The first module includes parametric piping symbol data generator, which can directly generate 3D geometry model from isometric piping drawing files. In the second module, we developed fast sensor data connecting methods for connecting sensor device with 3D model. By combining these processing modules, the proposed system is able to successfully apply to plant's O&M (operation and maintenance).","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134066027","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
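For the sensor-connection module, ThingWorx exposes a Thing's properties over its REST API at /Thingworx/Things/&lt;thing&gt;/Properties/&lt;property&gt;. A minimal read sketch under that assumption; the server URL, app key, and thing/property names are placeholders, and the exact response handling may differ by ThingWorx version:

```python
import requests

THINGWORX = "https://thingworx.example.com"         # placeholder server
APP_KEY = "00000000-0000-0000-0000-000000000000"    # placeholder credential

def read_sensor(thing: str, prop: str):
    """Read one property value from a ThingWorx Thing via its REST API."""
    url = f"{THINGWORX}/Thingworx/Things/{thing}/Properties/{prop}"
    r = requests.get(url, headers={"appKey": APP_KEY,
                                   "Accept": "application/json"}, timeout=5)
    r.raise_for_status()
    # ThingWorx returns an InfoTable; the value sits in the first row.
    return r.json()["rows"][0][prop]

# E.g., bind a reading to a pipe's 3D model by its tag (names are assumed):
pressure = read_sensor("Pipe_P101", "pressure")
```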
On Combining a Semi-Calibrated Stereo Camera and Massive Parallelism for Fast Plane Extraction
2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct) | Pub Date: 2016-09-01 | DOI: 10.1109/ISMAR-Adjunct.2016.0084
R. Lima, J. Martínez-Carranza, A. Morales-Reyes, W. Mayol-Cuevas
{"title":"On Combining a Semi-Calibrated Stereo Camera and Massive Parallelism for Fast Plane Extraction","authors":"R. Lima, J. Martínez-Carranza, A. Morales-Reyes, W. Mayol-Cuevas","doi":"10.1109/ISMAR-Adjunct.2016.0084","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct.2016.0084","url":null,"abstract":"We present a novel methodology that combines stereo vision and parallel processing, based on GPU and the use of binary descriptors, for fast plane extraction. Typical stereo algorithms require an image rectification stage that has to run in a frame-to-frame basis, increasing the computational burden and with the possibility of compromising high frame rate operation. Hence, we propose to use a semi-calibrated stereo approach, meaning that only calibration of extrinsic parameters of the stereo rig is carried out, thus avoiding a rectification process of the frames captured by the stereo camera. For the latter, we rely on feature matching of salient points detected on the stereo images, from which image correspondences are obtained. These correspondences are triangulated in order to generate a point cloud that is passed to a plane fitting module. As feature matching is a heavy task, we present a novel GPU architecture to accelerate such process, thus achieving real-time performance of up to 50 fps for the whole process. To demonstrate our approach, we also present an augmented reality application that exploits the planes extracted with our proposed methodology.","PeriodicalId":171967,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)","volume":"515 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132931252","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
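The abstract leaves the plane-fitting module unspecified; a common choice for extracting a dominant plane from a triangulated point cloud is RANSAC, sketched below as plain sequential CPU code (the paper's actual contribution, GPU-accelerated matching, is not reproduced here):

```python
import numpy as np

def ransac_plane(points: np.ndarray, iters: int = 200, tol: float = 0.01):
    """Fit a dominant plane n.p + d = 0 to an (N, 3) triangulated point cloud."""
    rng = np.random.default_rng(0)
    best_n, best_d, best_inliers = None, None, None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                          # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < tol   # point-to-plane distance test
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_n, best_d, best_inliers = n, d, inliers
    return best_n, best_d, best_inliers
```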