A Multi-sensor Gesture Interaction System for Human-robot Cooperation

Jiahui Yu, Min Li, Xin-Li Zhang, T. Zhang, Xianzhong Zhou
{"title":"A Multi-sensor Gesture Interaction System for Human-robot Cooperation","authors":"Jiahui Yu, Min Li, Xin-Li Zhang, T. Zhang, Xianzhong Zhou","doi":"10.1109/ICNSC52481.2021.9702166","DOIUrl":null,"url":null,"abstract":"Gestures are considered as a natural expression of the human body and are used to communicate with other people. The gesture-based human-robot interaction is natural, convenient, and applicable, and can be applied to complex interactive scenarios. In this paper, considering the real-time nature of the human-robot cooperation(HRC) system and the variability of the interaction range, we combine Kinect V2.0 (far-range sensor) and Leap Motion (short-range and high precision sensor), and propose a real-time multi-sensor gesture interaction system. Firstly, a reasonable layout of two sensors is discussed to realize far-range perception of natural gesture interaction. Then, nine gestures are defined that are easy for users to remember and operate. At the same time, a gesture interactive mechanism is proposed that can automatically switch two sensors according to the distance of the operator’s position. It can better improve the defects such as occlusion and confusion in the process of gesture controlling, and solve the minimum distance constraint of Kinect. Finally, the interactive experiment proves the stability and accuracy of the system.","PeriodicalId":129062,"journal":{"name":"2021 IEEE International Conference on Networking, Sensing and Control (ICNSC)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Networking, Sensing and Control (ICNSC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNSC52481.2021.9702166","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

Gestures are a natural form of human bodily expression and are widely used to communicate with other people. Gesture-based human-robot interaction is natural, convenient, and practical, and can be applied to complex interactive scenarios. In this paper, considering the real-time requirements of a human-robot cooperation (HRC) system and the variability of the interaction range, we combine Kinect V2.0 (a far-range sensor) and Leap Motion (a short-range, high-precision sensor) and propose a real-time multi-sensor gesture interaction system. First, a reasonable layout of the two sensors is discussed to realize far-range perception for natural gesture interaction. Then, nine gestures that are easy for users to remember and perform are defined. In addition, a gesture interaction mechanism is proposed that automatically switches between the two sensors according to the operator's distance. This mitigates defects such as occlusion and confusion during gesture control and overcomes the minimum-distance constraint of the Kinect. Finally, interactive experiments demonstrate the stability and accuracy of the system.
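
The abstract describes switching between the two sensors based on the operator's distance. The sketch below is a minimal illustration of how such a distance-based switching policy could look; it is not the authors' implementation, and the threshold values, hysteresis margin, and class names are assumptions made for illustration only.

```python
# Minimal illustrative sketch of a distance-based sensor-switching policy.
# NOT the paper's implementation: the threshold values, hysteresis margin,
# and the SensorChoice names are assumptions made for illustration only.

from enum import Enum


class SensorChoice(Enum):
    LEAP_MOTION = "leap_motion"   # short-range, high-precision hand tracking
    KINECT_V2 = "kinect_v2"       # far-range skeletal tracking


class SensorSwitcher:
    """Selects the active sensor from the operator's distance.

    Kinect V2 cannot track reliably below roughly 0.5 m, so close-range
    interaction is delegated to Leap Motion; a hysteresis band prevents
    rapid toggling when the operator stands near the switching distance.
    """

    def __init__(self, switch_distance_m: float = 0.8, hysteresis_m: float = 0.1):
        self.switch_distance_m = switch_distance_m   # assumed threshold
        self.hysteresis_m = hysteresis_m             # assumed margin
        self.active = SensorChoice.KINECT_V2         # default: far-range sensor

    def update(self, operator_distance_m: float) -> SensorChoice:
        """Return the sensor to use for the current frame."""
        if self.active is SensorChoice.KINECT_V2:
            # Switch to Leap Motion only once the operator is clearly close.
            if operator_distance_m < self.switch_distance_m - self.hysteresis_m:
                self.active = SensorChoice.LEAP_MOTION
        else:
            # Switch back to Kinect only once the operator is clearly far away.
            if operator_distance_m > self.switch_distance_m + self.hysteresis_m:
                self.active = SensorChoice.KINECT_V2
        return self.active


if __name__ == "__main__":
    switcher = SensorSwitcher()
    for d in (2.0, 1.2, 0.6, 0.75, 1.0):
        print(f"{d:.2f} m -> {switcher.update(d).value}")
```

The hysteresis band is one simple way to avoid the occlusion and confusion the abstract mentions when the operator hovers near the boundary between the two sensors' working ranges.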