Edge Snapping-Based Depth Enhancement for Dynamic Occlusion Handling in Augmented Reality

Chao Du, Yen-Lin Chen, Mao Ye, Liu Ren
{"title":"增强现实中基于边缘快照的动态遮挡处理深度增强","authors":"Chao Du, Yen-Lin Chen, Mao Ye, Liu Ren","doi":"10.1109/ISMAR.2016.17","DOIUrl":null,"url":null,"abstract":"Dynamic occlusion handling is critical for correct depth perception in Augmented Reality (AR) applications. Consequently it is a key component to ensure realistic and immersive AR experiences. Existing solutions to tackle this challenge typically suffer from various limitations, e.g. assumption of a static scene or high computational complexity. In this work, we propose an algorithm for depth map enhancement for dynamic occlusion handling in AR applications. The key of our algorithm is an edge snapping approach, formulated as discrete optimization, that improves the consistency of object boundaries between RGB and depth data. The optimization problem is solved efficiently via dynamic programming and our system runs in near real-time on the tablet platform. Experimental evaluations demonstrate that our approach largely improves the raw sensor data and is particularly suitable compared to several related approaches in terms of both speed and quality. Furthermore, we demonstrate visually pleasing dynamic occlusion effects for multiple AR use cases based on our edge snapping results.","PeriodicalId":146808,"journal":{"name":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)","volume":"85 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"32","resultStr":"{\"title\":\"Edge Snapping-Based Depth Enhancement for Dynamic Occlusion Handling in Augmented Reality\",\"authors\":\"Chao Du, Yen-Lin Chen, Mao Ye, Liu Ren\",\"doi\":\"10.1109/ISMAR.2016.17\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Dynamic occlusion handling is critical for correct depth perception in Augmented Reality (AR) applications. Consequently it is a key component to ensure realistic and immersive AR experiences. Existing solutions to tackle this challenge typically suffer from various limitations, e.g. assumption of a static scene or high computational complexity. In this work, we propose an algorithm for depth map enhancement for dynamic occlusion handling in AR applications. The key of our algorithm is an edge snapping approach, formulated as discrete optimization, that improves the consistency of object boundaries between RGB and depth data. The optimization problem is solved efficiently via dynamic programming and our system runs in near real-time on the tablet platform. Experimental evaluations demonstrate that our approach largely improves the raw sensor data and is particularly suitable compared to several related approaches in terms of both speed and quality. 
Furthermore, we demonstrate visually pleasing dynamic occlusion effects for multiple AR use cases based on our edge snapping results.\",\"PeriodicalId\":146808,\"journal\":{\"name\":\"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)\",\"volume\":\"85 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"32\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISMAR.2016.17\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR.2016.17","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 32

Abstract

Dynamic occlusion handling is critical for correct depth perception in Augmented Reality (AR) applications. Consequently it is a key component to ensure realistic and immersive AR experiences. Existing solutions to tackle this challenge typically suffer from various limitations, e.g. assumption of a static scene or high computational complexity. In this work, we propose an algorithm for depth map enhancement for dynamic occlusion handling in AR applications. The key of our algorithm is an edge snapping approach, formulated as discrete optimization, that improves the consistency of object boundaries between RGB and depth data. The optimization problem is solved efficiently via dynamic programming and our system runs in near real-time on the tablet platform. Experimental evaluations demonstrate that our approach largely improves the raw sensor data and is particularly suitable compared to several related approaches in terms of both speed and quality. Furthermore, we demonstrate visually pleasing dynamic occlusion effects for multiple AR use cases based on our edge snapping results.
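The abstract states only that edge snapping is posed as a discrete optimization over object boundaries and solved with dynamic programming; the exact cost terms are not given here. The Python sketch below is a generic illustration of that pattern under assumed costs (an image-gradient unary term and an offset-smoothness pairwise term, optimized with a Viterbi-style pass over candidate offsets along the contour normals); the function name, cost terms, and parameters are hypothetical and are not the paper's actual formulation.

```python
import numpy as np

def snap_depth_edge(rgb_grad, contour, normals, search_radius=5, smooth_weight=1.0):
    """Illustrative edge snapping via dynamic programming (assumed costs, not the paper's).

    For each sample along a coarse depth-edge contour, pick an offset along the
    local normal so that the snapped boundary lies on strong RGB gradients
    (low unary cost) while neighbouring samples keep similar offsets
    (pairwise smoothness cost).

    rgb_grad : 2D array, gradient magnitude of the color image.
    contour  : (N, 2) array of (row, col) points from the raw depth edge.
    normals  : (N, 2) array of unit normals at those points.
    """
    contour = np.asarray(contour, dtype=float)
    normals = np.asarray(normals, dtype=float)
    offsets = np.arange(-search_radius, search_radius + 1)  # candidate labels
    n_pts, n_lab = len(contour), len(offsets)
    h, w = rgb_grad.shape

    # Unary cost: prefer candidate positions with strong image gradients.
    unary = np.zeros((n_pts, n_lab))
    for i in range(n_pts):
        for k, d in enumerate(offsets):
            r, c = np.round(contour[i] + d * normals[i]).astype(int)
            r, c = np.clip(r, 0, h - 1), np.clip(c, 0, w - 1)
            unary[i, k] = -rgb_grad[r, c]

    # Pairwise cost: penalize offset jumps between consecutive contour samples.
    pairwise = smooth_weight * np.abs(offsets[:, None] - offsets[None, :])

    # Viterbi-style dynamic programming over the chain of contour samples.
    cost = unary[0].copy()
    back = np.zeros((n_pts, n_lab), dtype=int)
    for i in range(1, n_pts):
        total = cost[:, None] + pairwise            # (prev label, cur label)
        back[i] = np.argmin(total, axis=0)
        cost = total[back[i], np.arange(n_lab)] + unary[i]

    # Backtrack the optimal label sequence and return the snapped boundary points.
    labels = np.zeros(n_pts, dtype=int)
    labels[-1] = int(np.argmin(cost))
    for i in range(n_pts - 1, 0, -1):
        labels[i - 1] = back[i, labels[i]]
    return contour + offsets[labels, None] * normals
```

In a full occlusion-handling pipeline, a snapped boundary of this kind would be used to clean up the depth map near object edges; the enhanced real-scene depth can then be compared per pixel against the virtual object's depth buffer, so that virtual content is drawn only where it is nearer to the camera than the real scene.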