Real-time 3D hand posture estimation based on 2D appearance retrieval using monocular camera

N. Shimada, Kousuke Kimura, Y. Shirai
{"title":"Real-time 3D hand posture estimation based on 2D appearance retrieval using monocular camera","authors":"N. Shimada, Kousuke Kimura, Y. Shirai","doi":"10.1109/RATFG.2001.938906","DOIUrl":null,"url":null,"abstract":"This paper proposes a system for estimating arbitrary 3D human hand postures in real-time. It can accept not only pre-determined hand signs but also arbitrary postures and it works in a monocular camera environment. The estimation is based on a 2D image retrieval. More than 16,000 possible hand appearances are first generated from a given 3D shape model by rotating model joints and stored in an appearance database. Every appearance is tagged with its own joint angles which are used when the appearance was generated. By retrieving the appearance in the database well-matching to the input image contour, the joint angles of the input shape can be rapidly obtained. The search area is reduced by using an adjacency map in the database. To prevent tracking failures, a fixed number of the well-matching appearances are saved at every frame. After the multiple neighborhoods of the saved appearances are merged, the unified neighborhood is searched for the estimate efficiently by beam search. The posture estimates result from experimental examples are shown.","PeriodicalId":355094,"journal":{"name":"Proceedings IEEE ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2001-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"117","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings IEEE ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/RATFG.2001.938906","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 117

Abstract

This paper proposes a system for estimating arbitrary 3D human hand postures in real time. It accepts not only pre-determined hand signs but also arbitrary postures, and it works with a monocular camera. The estimation is based on 2D image retrieval. More than 16,000 possible hand appearances are first generated from a given 3D shape model by rotating the model joints and are stored in an appearance database. Each appearance is tagged with the joint angles used to generate it. By retrieving the database appearance that best matches the input image contour, the joint angles of the input shape can be obtained rapidly. The search area is reduced by using an adjacency map in the database. To prevent tracking failures, a fixed number of well-matching appearances are saved at every frame. The neighborhoods of the saved appearances are merged, and the unified neighborhood is then searched efficiently for the estimate by beam search. Posture estimation results on experimental examples are shown.
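The abstract describes two algorithmic stages: an offline build of a joint-angle-indexed appearance database with an adjacency map, and an online per-frame beam search over the merged neighborhoods of the previous frame's saved hypotheses. The Python sketch below is a minimal illustration under stated assumptions: `render_contour` and `contour_distance` are hypothetical placeholders standing in for the paper's 3D-model projection and contour matching, and the grid resolution, beam width, and search depth are illustrative values, not those used in the paper.

```python
import math
from itertools import product
from typing import Dict, List, Tuple

JointAngles = Tuple[float, ...]
Key = Tuple[int, ...]  # quantized joint-angle indices, one per joint

def render_contour(angles: JointAngles) -> List[float]:
    # Placeholder for projecting the 3D hand model at the given joint
    # angles; any fixed-length descriptor of the silhouette works here.
    return [math.sin(a) for a in angles]

def contour_distance(a: List[float], b: List[float]) -> float:
    # Placeholder matching cost between two contour descriptors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def build_database(grids: List[List[float]]):
    """Enumerate quantized joint-angle combinations, render each
    appearance, and link appearances that differ by one grid step
    in a single joint (the adjacency map)."""
    keys = list(product(*[range(len(g)) for g in grids]))
    contours: Dict[Key, List[float]] = {}
    angles_of: Dict[Key, JointAngles] = {}
    for key in keys:
        angles = tuple(g[i] for g, i in zip(grids, key))
        angles_of[key] = angles
        contours[key] = render_contour(angles)
    adj: Dict[Key, List[Key]] = {k: [] for k in keys}
    for key in keys:
        for j in range(len(key)):
            for d in (-1, 1):
                nb = key[:j] + (key[j] + d,) + key[j + 1:]
                if nb in contours:
                    adj[key].append(nb)
    return contours, angles_of, adj

def beam_search_frame(contours, adj, saved: List[Key],
                      input_contour: List[float],
                      beam_width: int = 5, depth: int = 2) -> List[Key]:
    """One tracking step: merge the neighborhoods of the appearances
    saved from the previous frame, then beam-search the unified
    neighborhood for the appearances best matching the input contour."""
    frontier = set(saved)
    best: List[Key] = list(saved)
    for _ in range(depth):
        # expand the beam through the adjacency map, then prune
        frontier |= {nb for k in best for nb in adj[k]}
        scored = sorted(frontier,
                        key=lambda k: contour_distance(contours[k],
                                                       input_contour))
        best = scored[:beam_width]
    return best  # multiple hypotheses are kept to prevent tracking failure

if __name__ == "__main__":
    grids = [[0.0, 0.3, 0.6]] * 4          # 4 joints, 3 quantized angles each
    contours, angles_of, adj = build_database(grids)
    saved = [(0, 0, 0, 0)]                  # hypotheses from the previous frame
    observed = render_contour((0.3, 0.3, 0.0, 0.6))
    saved = beam_search_frame(contours, adj, saved, observed)
    print([angles_of[k] for k in saved])    # recovered joint-angle estimates
```

Keeping a fixed number of well-matching hypotheses per frame, rather than a single best match, is what lets the search recover when the top match at one frame is wrong; the adjacency map restricts each frame's search to postures reachable from those hypotheses.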