A Study on Robot-Human System with Consideration of Individual Preferences : 2nd Report, Multimodal Human-Machine Interface for Object-Handing Robot System

M. Jindai, S. Shibata, Tomonori Yamamoto, Tomio Watanabe
{"title":"A Study on Robot-Human System with Consideration of Individual Preferences : 2nd Report, Multimodal Human-Machine Interface for Object-Handing Robot System","authors":"M. Jindai, S. Shibata, Tomonori Yamamoto, Tomio Watanabe","doi":"10.1299/jsmec.49.1033","DOIUrl":null,"url":null,"abstract":"In this study, we propose an object-handing robot system with a multimodal human-machine interface which is composed of speech recognition and image processing units. Using this multimodal human-machine interface, the cooperator can order the object-handing robot system using voice commands and hand gestures. In this robot system, the motion parameters of the robot, which are maximum velocity, velocity profile peak and handing position, can be adjusted by the voice commands or the hand gestures in order to realize the most appropriate motion of the robot. Furthermore, the cooperator can order the handing of objects using voice commands along with hand gestures. In these voice commands, the cooperator can use adverbs. This permits the cooperator to realize efficient adjustments, because the adjustment value of each motion parameters is determined by adverbs. 
In particular, adjustment values corresponding to adverbs are estimated by fuzzy inference in order to take into consideration the ambiguities of human speech.","PeriodicalId":151961,"journal":{"name":"Jsme International Journal Series C-mechanical Systems Machine Elements and Manufacturing","volume":"65 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Jsme International Journal Series C-mechanical Systems Machine Elements and Manufacturing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1299/jsmec.49.1033","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

In this study, we propose an object-handing robot system with a multimodal human-machine interface composed of speech recognition and image processing units. Using this interface, the cooperator can command the object-handing robot system with voice commands and hand gestures. The robot's motion parameters (maximum velocity, velocity profile peak, and handing position) can be adjusted by voice commands or hand gestures in order to realize the most appropriate motion of the robot. Furthermore, the cooperator can order the handing of objects using voice commands along with hand gestures. In these voice commands, the cooperator can use adverbs. This permits efficient adjustments, because the adjustment value of each motion parameter is determined by the adverb. In particular, the adjustment values corresponding to adverbs are estimated by fuzzy inference in order to account for the ambiguity of human speech.
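The adverb-to-adjustment mechanism described above can be sketched with a minimal fuzzy-inference example. This is an illustrative assumption, not the paper's actual implementation: the adverb vocabulary, the triangular membership functions, and the normalized adjustment axis are all hypothetical, and the defuzzification shown here is a simple centroid computation.

```python
# Sketch: mapping an adverb in a voice command to a crisp adjustment
# value via fuzzy inference. All fuzzy sets below are illustrative
# assumptions, not the membership functions used in the paper.

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c], peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy sets over a normalized adjustment axis [0, 1]:
# each adverb names a fuzzy set (a, b, c) of relative adjustment size.
ADVERB_SETS = {
    "slightly": (0.0, 0.1, 0.3),
    "somewhat": (0.1, 0.3, 0.5),
    "more":     (0.3, 0.5, 0.7),
    "much":     (0.5, 0.8, 1.0),
}

def adjustment_value(adverb, samples=101):
    """Defuzzify the adverb's fuzzy set by centroid, yielding a
    crisp adjustment factor in [0, 1]."""
    a, b, c = ADVERB_SETS[adverb]
    xs = [i / (samples - 1) for i in range(samples)]
    mus = [triangular(x, a, b, c) for x in xs]
    num = sum(x * m for x, m in zip(xs, mus))
    den = sum(mus)
    return num / den if den else 0.0

# Example: scaling a motion parameter such as maximum velocity in
# response to "move much faster" (current value is illustrative).
v_max = 0.4  # m/s
new_v_max = v_max * (1.0 + adjustment_value("much"))
```

Because each adverb is a fuzzy set rather than a fixed number, graded commands like "slightly faster" and "much faster" produce smoothly different adjustment values, which is the ambiguity-handling role fuzzy inference plays in the abstract.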