Touch Gesture Recognition-Based Physical Human–Robot Interaction for Collaborative Tasks

IF 5.0 | CAS Region 3 (Computer Science) | JCR Q1 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
Dawoon Jung;Chengyan Gu;Junmin Park;Joono Cheong
{"title":"基于触摸手势识别的人机协作任务物理交互","authors":"Dawoon Jung;Chengyan Gu;Junmin Park;Joono Cheong","doi":"10.1109/TCDS.2024.3466553","DOIUrl":null,"url":null,"abstract":"Human–robot collaboration (HRC) has recently attracted increasing attention as a vital component of next-generation automated manufacturing and assembly tasks, yet physical human–robot interaction (pHRI)—which is an inevitable component of collaboration—is often limited to rudimentary touches. This article therefore proposes a deep-learning-based pHRI method that utilizes predefined types of human touch gestures as intuitive communicative signs for collaborative tasks. To this end, a touch gesture network model is first designed upon the framework of the gated recurrent unit (GRU) network, which accepts a set of ground-truth dynamic responses (energy change, generalized momentum, and external joint torque) of robot manipulators under the action of known types of touch gestures and learns to predict the five representative touch gesture types and the corresponding link toward a random touch gesture input. After training the GRU-based touch gesture model using a collected dataset of dynamic responses of a robot manipulator, a total of 35 outputs (five gesture types with seven links each) is recognized with 96.94% accuracy. The experimental results of recognition accuracy correlated with the touch gesture types, and their strength results are shown to validate the performance and disclose the characteristics of the proposed touch gesture model. An example of an IKEA chair assembly task is also presented to demonstrate a collaborative task using the proposed touch gestures. By developing the proposed pHRI method and demonstrating its applicability, we expect that this method can help position physical interaction as one of the key modalities for communication in real-world HRC applications.","PeriodicalId":54300,"journal":{"name":"IEEE Transactions on Cognitive and Developmental Systems","volume":"17 2","pages":"421-435"},"PeriodicalIF":5.0000,"publicationDate":"2024-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Touch Gesture Recognition-Based Physical Human–Robot Interaction for Collaborative Tasks\",\"authors\":\"Dawoon Jung;Chengyan Gu;Junmin Park;Joono Cheong\",\"doi\":\"10.1109/TCDS.2024.3466553\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Human–robot collaboration (HRC) has recently attracted increasing attention as a vital component of next-generation automated manufacturing and assembly tasks, yet physical human–robot interaction (pHRI)—which is an inevitable component of collaboration—is often limited to rudimentary touches. This article therefore proposes a deep-learning-based pHRI method that utilizes predefined types of human touch gestures as intuitive communicative signs for collaborative tasks. To this end, a touch gesture network model is first designed upon the framework of the gated recurrent unit (GRU) network, which accepts a set of ground-truth dynamic responses (energy change, generalized momentum, and external joint torque) of robot manipulators under the action of known types of touch gestures and learns to predict the five representative touch gesture types and the corresponding link toward a random touch gesture input. 
After training the GRU-based touch gesture model using a collected dataset of dynamic responses of a robot manipulator, a total of 35 outputs (five gesture types with seven links each) is recognized with 96.94% accuracy. The experimental results of recognition accuracy correlated with the touch gesture types, and their strength results are shown to validate the performance and disclose the characteristics of the proposed touch gesture model. An example of an IKEA chair assembly task is also presented to demonstrate a collaborative task using the proposed touch gestures. By developing the proposed pHRI method and demonstrating its applicability, we expect that this method can help position physical interaction as one of the key modalities for communication in real-world HRC applications.\",\"PeriodicalId\":54300,\"journal\":{\"name\":\"IEEE Transactions on Cognitive and Developmental Systems\",\"volume\":\"17 2\",\"pages\":\"421-435\"},\"PeriodicalIF\":5.0000,\"publicationDate\":\"2024-09-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Cognitive and Developmental Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10693288/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Cognitive and Developmental Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10693288/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Human–robot collaboration (HRC) has recently attracted increasing attention as a vital component of next-generation automated manufacturing and assembly tasks, yet physical human–robot interaction (pHRI), an inevitable component of collaboration, is often limited to rudimentary touches. This article therefore proposes a deep-learning-based pHRI method that uses predefined types of human touch gestures as intuitive communicative signs for collaborative tasks. To this end, a touch gesture network model is first designed on the framework of the gated recurrent unit (GRU) network; it accepts a set of ground-truth dynamic responses (energy change, generalized momentum, and external joint torque) of a robot manipulator under known types of touch gestures and learns to predict, for a random touch gesture input, the five representative touch gesture types and the corresponding link. After the GRU-based touch gesture model is trained on a collected dataset of the manipulator's dynamic responses, a total of 35 outputs (five gesture types for each of seven links) are recognized with 96.94% accuracy. Experimental results on recognition accuracy as a function of touch gesture type and strength are presented to validate the performance and reveal the characteristics of the proposed model. An IKEA chair assembly task is also presented to demonstrate a collaborative task that uses the proposed touch gestures. By developing the proposed pHRI method and demonstrating its applicability, we expect it to help position physical interaction as one of the key communication modalities in real-world HRC applications.
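
The abstract describes the model only at a high level: a GRU network whose inputs are time series of the manipulator's dynamic responses (energy change, generalized momentum, and external joint torque) and whose output is one of 35 gesture/link combinations. As a concrete illustration, below is a minimal sketch of such a classifier in PyTorch. The per-step feature layout, hidden size, layer count, window length, and class ordering are all assumptions for illustration, not values reported in the paper.

```python
import torch
import torch.nn as nn

N_LINKS = 7                       # seven-link manipulator (from the abstract)
N_GESTURES = 5                    # five representative touch gesture types (from the abstract)
N_CLASSES = N_GESTURES * N_LINKS  # 35 combined gesture/link outputs (from the abstract)

# Per-time-step feature layout is an assumption: energy change (1 value) +
# generalized momentum (7 values) + external joint torque (7 values) = 15.
N_FEATURES = 1 + N_LINKS + N_LINKS

class TouchGestureGRU(nn.Module):
    """Hypothetical GRU classifier over windows of dynamic-response signals."""
    def __init__(self, hidden_size: int = 128, num_layers: int = 2):
        super().__init__()
        self.gru = nn.GRU(N_FEATURES, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, N_CLASSES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, N_FEATURES), a window of sampled dynamic responses
        _, h_n = self.gru(x)        # h_n: (num_layers, batch, hidden_size)
        return self.head(h_n[-1])   # logits over the 35 gesture/link classes

# Usage on a random batch (4 windows of 200 time steps each):
model = TouchGestureGRU()
logits = model(torch.randn(4, 200, N_FEATURES))
pred = logits.argmax(dim=1)
gesture_type = pred // N_LINKS    # assumes gesture-major class ordering
touched_link = pred % N_LINKS
```

Recovering (gesture, link) from the flat class index assumes the 35 classes are ordered gesture-major; the paper states only that the outputs cover all five-by-seven combinations.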
Source Journal
CiteScore: 7.20
Self-citation rate: 10.00%
Annual articles published: 170
Journal description: The IEEE Transactions on Cognitive and Developmental Systems (TCDS) focuses on advances in the study of development and cognition in natural (humans, animals) and artificial (robots, agents) systems. It welcomes contributions from multiple related disciplines including cognitive systems, cognitive robotics, developmental and epigenetic robotics, autonomous and evolutionary robotics, social structures, multi-agent and artificial life systems, computational neuroscience, and developmental psychology. Articles on theoretical, computational, application-oriented, and experimental studies as well as reviews in these areas are considered.