Real-Time Segmentation of Depth Map Frames on Mobile Devices

F. Moldoveanu, A. Ciobanu, A. Morar, A. Moldoveanu, V. Asavei
DOI: 10.1109/CSCS.2019.00052
Published in: 2019 22nd International Conference on Control Systems and Computer Science (CSCS), May 2019

Abstract

This paper presents a real-time method for segmenting depth map video frames on devices with limited computational capabilities. The depth maps are segmented based on edges extracted with an edge detector kernel, dividing the entire scene into regions separated by those edges. Furthermore, to provide spatio-temporal consistency of the data, an association step matches the regions in the current frame with those in previous frames. The proposed method may be used in scenarios where the objects in the 3D scene are placed at different distances from the acquisition device. The method was tested on a Raspberry Pi 2 board and on a series of Apple mobile devices (iPhone 6s, iPhone 5s, and iPad Mini), using depth data acquired with the Microsoft Kinect v2 sensor at a resolution of 512x424 pixels. Since the Raspberry Pi and the Apple devices do not fully support the Kinect v2 sensor, the tests used a recorded dataset. We also tested the performance of the algorithm on the Apple devices using a compatible depth sensor developed by Occipital (Structure Sensor).
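The abstract does not disclose implementation details, so the following is only an illustrative sketch of the general approach it describes: a gradient-threshold edge test standing in for the (unspecified) edge detector kernel, flood-fill labeling of the regions between edges, and an overlap-based matcher standing in for the association step. All function names, the threshold value, and the matching rule here are assumptions, not the authors' method.

```python
# Illustrative sketch of edge-based depth map segmentation with
# frame-to-frame region association. Depth values are plain integers
# (e.g. millimetres); pure Python, no external dependencies.

def detect_edges(depth, threshold=50):
    """Mark a pixel as an edge when the depth jump to any 4-neighbor
    exceeds `threshold` (threshold is an assumed parameter)."""
    h, w = len(depth), len(depth[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    if abs(depth[y][x] - depth[ny][nx]) > threshold:
                        edges[y][x] = True
    return edges

def label_regions(edges):
    """Flood-fill the non-edge pixels into connected regions;
    label 0 is reserved for edge pixels."""
    h, w = len(edges), len(edges[0])
    labels = [[0] * w for _ in range(h)]
    next_label = 1
    for sy in range(h):
        for sx in range(w):
            if edges[sy][sx] or labels[sy][sx]:
                continue
            stack = [(sy, sx)]
            labels[sy][sx] = next_label
            while stack:
                y, x = stack.pop()
                for dy, dx in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and not edges[ny][nx] and not labels[ny][nx]):
                        labels[ny][nx] = next_label
                        stack.append((ny, nx))
            next_label += 1
    return labels

def associate(prev_labels, cur_labels):
    """Match each current-frame region to the previous-frame region with
    the largest pixel overlap (a simple stand-in for the paper's
    association step, whose actual criterion is not given)."""
    overlap = {}  # (current label, previous label) -> shared pixel count
    for prev_row, cur_row in zip(prev_labels, cur_labels):
        for p, c in zip(prev_row, cur_row):
            if p and c:
                overlap[(c, p)] = overlap.get((c, p), 0) + 1
    best = {}  # current label -> (count, previous label)
    for (c, p), n in overlap.items():
        if c not in best or n > best[c][0]:
            best[c] = (n, p)
    return {c: p for c, (n, p) in best.items()}

# Toy frame: two flat surfaces at different depths, separated by a
# large depth discontinuity between columns 2 and 3.
frame = [
    [800, 800, 800, 1500, 1500],
    [800, 800, 800, 1500, 1500],
    [800, 800, 800, 1500, 1500],
]
regions = label_regions(detect_edges(frame))
```

On this toy frame the two pixels on either side of the depth jump become edges, leaving two labeled regions; running `associate(regions, regions)` on a static scene maps each region to itself, which is the consistency property the association step is meant to preserve across frames.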