FilterJoint: Toward an Understanding of Whole-Body Gesture Articulation

Aishat Aloba, Julia Woodward, Lisa Anthony
{"title":"FilterJoint:迈向对全身手势表达的理解","authors":"Aishat Aloba, Julia Woodward, Lisa Anthony","doi":"10.1145/3382507.3418822","DOIUrl":null,"url":null,"abstract":"Classification accuracy of whole-body gestures can be improved by selecting gestures that have few conflicts (i.e., confusions or misclassifications). To identify such gestures, an understanding of the nuances of how users articulate whole-body gestures can help, especially when conflicts may be due to confusion among seemingly dissimilar gestures. To the best of our knowledge, such an understanding is currently missing in the literature. As a first step to enable this understanding, we designed a method that facilitates investigation of variations in how users move their body parts as they perform a motion. This method, which we call filterJoint, selects the key body parts that are actively moving during the performance of a motion. The paths along which these body parts move in space over time can then be analyzed to make inferences about how users articulate whole-body gestures. We present two case studies to show how the filterJoint method enables a deeper understanding of whole-body gesture articulation, and we highlight implications for the selection of whole-body gesture sets as a result of these insights.","PeriodicalId":402394,"journal":{"name":"Proceedings of the 2020 International Conference on Multimodal Interaction","volume":"109 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"FilterJoint: Toward an Understanding of Whole-Body Gesture Articulation\",\"authors\":\"Aishat Aloba, Julia Woodward, Lisa Anthony\",\"doi\":\"10.1145/3382507.3418822\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Classification accuracy of whole-body gestures can be improved by selecting gestures that have few conflicts (i.e., confusions or misclassifications). 
To identify such gestures, an understanding of the nuances of how users articulate whole-body gestures can help, especially when conflicts may be due to confusion among seemingly dissimilar gestures. To the best of our knowledge, such an understanding is currently missing in the literature. As a first step to enable this understanding, we designed a method that facilitates investigation of variations in how users move their body parts as they perform a motion. This method, which we call filterJoint, selects the key body parts that are actively moving during the performance of a motion. The paths along which these body parts move in space over time can then be analyzed to make inferences about how users articulate whole-body gestures. We present two case studies to show how the filterJoint method enables a deeper understanding of whole-body gesture articulation, and we highlight implications for the selection of whole-body gesture sets as a result of these insights.\",\"PeriodicalId\":402394,\"journal\":{\"name\":\"Proceedings of the 2020 International Conference on Multimodal Interaction\",\"volume\":\"109 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-10-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2020 International Conference on Multimodal Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3382507.3418822\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2020 International Conference on Multimodal 
Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3382507.3418822","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Classification accuracy of whole-body gestures can be improved by selecting gestures that have few conflicts (i.e., confusions or misclassifications). To identify such gestures, an understanding of the nuances of how users articulate whole-body gestures can help, especially when conflicts may be due to confusion among seemingly dissimilar gestures. To the best of our knowledge, such an understanding is currently missing in the literature. As a first step to enable this understanding, we designed a method that facilitates investigation of variations in how users move their body parts as they perform a motion. This method, which we call filterJoint, selects the key body parts that are actively moving during the performance of a motion. The paths along which these body parts move in space over time can then be analyzed to make inferences about how users articulate whole-body gestures. We present two case studies to show how the filterJoint method enables a deeper understanding of whole-body gesture articulation, and we highlight implications for the selection of whole-body gesture sets as a result of these insights.
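The abstract describes filterJoint as selecting "the key body parts that are actively moving during the performance of a motion." As a rough illustration only (the paper's actual selection criterion is given in the full text, not here), one plausible way to identify actively moving joints is to compare each joint's cumulative path length against that of the most active joint; the function name and threshold below are hypothetical:

```python
import numpy as np

def filter_joints(trajectories, threshold=0.5):
    """Select joints whose cumulative path length is at least a
    given fraction of the most active joint's path length.

    trajectories: dict mapping joint name -> (T, 3) array of 3D
    positions over T frames.
    """
    # Cumulative path length: sum of per-frame displacement magnitudes.
    path_lengths = {
        joint: np.linalg.norm(np.diff(pos, axis=0), axis=1).sum()
        for joint, pos in trajectories.items()
    }
    max_len = max(path_lengths.values())
    # Keep joints that move at least `threshold` times as far as
    # the most active joint.
    return [j for j, length in path_lengths.items()
            if length >= threshold * max_len]
```

For example, a hand tracing a circle would be retained, while a nearly stationary hip would be filtered out, leaving only the joints whose paths are worth analyzing for articulation patterns.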