Fast 3D point-cloud segmentation for interactive surfaces

E. M. Mthunzi, Christopher Getschmann, Florian Echtler
DOI: 10.1145/3447932.3491141 (https://doi.org/10.1145/3447932.3491141)
Published in: Companion Proceedings of the 2021 Conference on Interactive Surfaces and Spaces
Publication date: 2021-11-14
Citation count: 0

Abstract

Easily accessible depth sensors have enabled the use of point-cloud data to augment tabletop surfaces in everyday environments. However, point-cloud operations are computationally expensive and challenging to perform in real time, particularly when targeting embedded systems without a dedicated GPU. In this paper, we propose mitigating these high computational costs by segmenting candidate interaction regions in near real-time. We contribute an open-source solution for variable depth cameras on CPU-based architectures. For validation, we employ Microsoft's Azure Kinect and report the achieved performance. Our initial findings show that our approach takes under to segment candidate interaction regions on a tabletop surface and reduces the data volume by up to 70%. We conclude by contrasting the performance of our solution against a model-fitting approach implemented in the SurfaceStreams toolkit. Within our test scenario, our approach outperforms the RANSAC-based strategy, segmenting a tabletop's interaction region up to 94% faster. Our results show promise for point-cloud-based approaches, even when targeting embedded solutions with limited resources.
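The baseline contrasted in the abstract is a model-fitting (RANSAC plane-fitting) strategy: fit the dominant plane of the tabletop, then treat the off-plane points as the candidate interaction region. The following is a minimal NumPy sketch of that general idea, under stated assumptions — it is not the paper's method nor the SurfaceStreams implementation, and every function name, parameter, and threshold here is illustrative:

```python
import numpy as np

def ransac_plane(points, n_iters=200, threshold=0.01, rng=None):
    """Fit a dominant plane n·x + d = 0 to a point cloud with RANSAC.

    Returns (normal, d, inlier_mask). Illustrative sketch, not a tuned
    implementation: fixed iteration count, single distance threshold.
    """
    rng = np.random.default_rng(rng)
    best_plane, best_mask, best_count = None, None, -1
    for _ in range(n_iters):
        # Sample 3 distinct points and derive the candidate plane.
        idx = rng.choice(len(points), size=3, replace=False)
        p0, p1, p2 = points[idx]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample, skip
            continue
        normal /= norm
        d = -normal @ p0
        # Count inliers within the point-to-plane distance threshold.
        mask = np.abs(points @ normal + d) < threshold
        count = mask.sum()
        if count > best_count:
            best_plane, best_mask, best_count = (normal, d), mask, count
    return best_plane[0], best_plane[1], best_mask

# Synthetic scene: a noisy z≈0 "tabletop" plus points above it
# standing in for hands/objects in the interaction region.
rng = np.random.default_rng(0)
table = np.column_stack([rng.uniform(-1, 1, (500, 2)),
                         rng.normal(0, 0.002, 500)])
objects = rng.uniform([-0.2, -0.2, 0.05], [0.2, 0.2, 0.3], (100, 3))
cloud = np.vstack([table, objects])

normal, d, inliers = ransac_plane(cloud, rng=0)
candidates = cloud[~inliers]   # off-plane points = candidate interaction region
```

Note that every RANSAC iteration touches the whole cloud when counting inliers, which is the computational cost the paper's CPU-oriented segmentation is reported to avoid in its test scenario.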