Preliminary Evaluation of a Framework for Overhead Skeleton Tracking in Factory Environments using Kinect

M. M. Marinho, Yuki Yatsushima, T. Maekawa, Y. Namioka
DOI: 10.1145/3134230.3134232
Published in: Proceedings of the 4th International Workshop on Sensor-based Activity Recognition and Interaction
Publication date: 2017-09-21
Citations: 2

Abstract

This paper presents a preliminary evaluation of a framework that allows an overhead RGBD camera to segment and track workers' skeletons in an unstructured factory environment. The default Kinect skeleton-tracking algorithm was developed using front-view artificial depth images generated from a 3D model of a person in an empty room. The proposed framework is inspired by this concept and works by capturing motion data of a worker performing a real factory task. That motion data is matched to a 3D model of the worker. In a novel approach, the largest elements in the workspace (e.g., desks, racks) are modeled with simple shapes, and the artificial depth images are generated in a "simplified workspace" rather than an "empty workspace". Preliminary experiments show that adding the simplified models during training can, ceteris paribus, increase segmentation accuracy by over 3 times and recall by about 1.5 times when the workspace is highly cluttered. Evaluation uses real depth images obtained in a factory environment, with manually segmented images as ground truth.
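The core idea of the "simplified workspace" can be illustrated with a small sketch. The paper does not publish its rendering pipeline, so the functions and names below are hypothetical: they only show how artificial depth images of a person model could be z-buffer composited with simple box shapes standing in for large furniture, so that training data contains realistic occlusions rather than an empty room.

```python
import numpy as np

def box_depth(shape, top_left, bottom_right, depth_m):
    """Depth map of a simple axis-aligned box face at constant depth,
    a crude stand-in for large workspace elements (desks, racks)."""
    d = np.full(shape, np.inf)
    r0, c0 = top_left
    r1, c1 = bottom_right
    d[r0:r1, c0:c1] = depth_m
    return d

def composite_depth(person_depth, furniture_depths):
    """Z-buffer composite: every pixel keeps the nearest surface, so
    furniture can occlude the person as it would in a real overhead view."""
    scene = person_depth.copy()
    for f in furniture_depths:
        scene = np.minimum(scene, f)
    return scene

# Toy example: a person patch at 2.0 m, a desk covering the left half at 1.5 m.
H, W = 4, 6
person = np.full((H, W), np.inf)
person[1:3, 2:4] = 2.0                      # person occupies columns 2-3
desk = box_depth((H, W), (0, 0), (4, 3), 1.5)  # desk covers columns 0-2

scene = composite_depth(person, [desk])
# Pixels where the person is the nearest surface become "worker" labels
# for training the segmentation stage.
worker_mask = np.isfinite(person) & (scene == person)
```

In this toy scene the desk hides the left half of the person (pixel (1, 2) reads the desk depth 1.5, not 2.0), while the right half remains visible, which is exactly the kind of partial occlusion an empty-room training set never produces.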