Puppeteer: Exploring Intuitive Hand Gestures and Upper-Body Postures for Manipulating Human Avatar Actions

Ching-Wen Hung, Ruei-Che Chang, Hong-Sheng Chen, Chung-Han Liang, Liwei Chan, Bing-Yu Chen
DOI: 10.1145/3562939.3565609
Published in: Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 2022-11-29
Citations: 2

Abstract

Body-controlled avatars offer an intuitive way to control virtual avatars in real time, but they require a larger physical space and more user effort. In contrast, hand-controlled avatars allow more dexterous, less fatiguing manipulation within a close-range space, but provide fewer sensory cues than body-based control. This paper investigates the differences between the two manipulation styles and explores the possibility of combining them. We first performed a formative study to understand when and how users prefer using their hands or bodies to represent avatars' actions in currently popular video games. Based on a survey of top video games, we decided to focus on the motions of human avatars. We found that players used their bodies to represent avatar actions but switched to their hands when the actions were too unrealistic or exaggerated to mimic with the body (e.g., flying in the sky, rolling over quickly). Hand gestures also provide an alternative to lower-body motions when players want to sit while gaming and do not want to spend extensive effort moving their avatars. Hence, we focused on the design of hand gestures and upper-body postures. We present Puppeteer, a prototype input system that lets players directly control their avatars through intuitive hand gestures and upper-body postures. We selected 17 avatar actions identified in the formative study and conducted a gesture elicitation study in which 12 participants designed the hand gestures and upper-body postures that best represent each action. We then implemented a prototype that uses the MediaPipe framework to detect keypoints and a self-trained model to recognize the 17 hand gestures and 17 upper-body postures. Finally, three applications demonstrate the interactions enabled by Puppeteer.
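The abstract describes a pipeline in which MediaPipe supplies 2D keypoints and a trained model maps them to one of the 17 gestures or postures. The paper does not specify the recognizer, so the sketch below is a hypothetical stand-in: a minimal nearest-template classifier over normalized keypoint coordinates, illustrating how detected landmarks could be made position- and scale-invariant before matching. The function names and template format are assumptions for illustration, not the authors' implementation.

```python
import math

def normalize(landmarks):
    """Make a list of (x, y) keypoints position- and scale-invariant.

    Translates so the first point (e.g., the wrist) sits at the origin,
    then divides by the largest distance from the origin.
    """
    x0, y0 = landmarks[0]
    shifted = [(x - x0, y - y0) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def classify(landmarks, templates):
    """Return the template name whose normalized keypoints are closest.

    `templates` maps a gesture name to a reference keypoint list of the
    same length; distance is the mean per-point Euclidean distance.
    """
    probe = normalize(landmarks)

    def dist(name):
        ref = normalize(templates[name])
        return sum(math.hypot(px - rx, py - ry)
                   for (px, py), (rx, ry) in zip(probe, ref)) / len(probe)

    return min(templates, key=dist)

# Toy templates with 3 points each; a real system would use MediaPipe's
# 21 hand landmarks (or 33 pose landmarks) per gesture.
templates = {
    "point": [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)],  # straight line
    "bend":  [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)],  # right angle
}
print(classify([(5.0, 5.0), (7.0, 5.0), (9.0, 5.0)], templates))  # → point
```

A learned model (as in the paper) would replace `classify` with a network trained on the elicited gesture set, but the same normalization step is a common preprocessing choice for keypoint-based recognizers.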