Vision-EMG Fusion Method for Real-time Grasping Pattern Classification System

Dulanjana M. Perera, D. Madusanka
DOI: 10.1109/MERCon52712.2021.9525702
Published in: 2021 Moratuwa Engineering Research Conference (MERCon), 2021-07-27, pp. 585-590
Citations: 1

Abstract

Although recently developed electromyography-based (EMG) prosthetic hands can classify a significant number of wrist motions, classifying 5-6 grasping patterns in real time remains a challenging task. Combining EMG with vision has addressed this problem to some extent but has not achieved significant real-time performance. In this paper, we propose a fusion method that improves the real-time prediction accuracy of the EMG system by merging in a probability matrix representing how often each of the six grasping patterns is used for the targeted object. The YOLO object detection system retrieves the probability matrix for the identified object, which is then used to correct errors in the EMG classification system. Experiments revealed that the optimized ANN model outperformed KNN, LDA, NB, and DT classifiers, achieving the highest mean True Positive Rate (mTPR) of 69.34% (21.54) in real time across all six grasping patterns. Furthermore, the proposed feature set (age, gender, and handedness of the user) increased the mTPR of the ANN by 16.05% (2.70). The proposed system takes 393.89 ms (178.23 ms) to produce a prediction, so the user does not perceive a delay between intention and execution. The system also allows users to apply multiple grasping patterns to an object.
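The abstract does not specify the exact fusion rule, but the core idea — correcting an EMG classifier's output with a vision-derived grasp-usage prior for the detected object — can be sketched as a simple probabilistic combination. The sketch below is a minimal illustration under the assumption that the fusion is an elementwise product of the two distributions followed by renormalization; the grasp-pattern names and probability values are hypothetical placeholders, not taken from the paper.

```python
import numpy as np

# Hypothetical set of six grasp patterns (the paper classifies six; these
# particular names are illustrative assumptions).
GRASP_PATTERNS = ["power", "precision", "lateral", "hook", "spherical", "tripod"]

def fuse(emg_probs: np.ndarray, vision_prior: np.ndarray) -> np.ndarray:
    """Combine EMG class probabilities with an object's grasp-usage prior.

    Assumed fusion rule: elementwise product, renormalized so the result
    is again a probability distribution over the six grasp patterns.
    """
    fused = emg_probs * vision_prior
    return fused / fused.sum()

# EMG classifier output (e.g. an ANN softmax) over the six grasp patterns.
emg = np.array([0.30, 0.25, 0.15, 0.10, 0.10, 0.10])
# Vision prior from object detection: how often each grasp is typically
# used for the detected object (illustrative values).
prior = np.array([0.05, 0.50, 0.20, 0.05, 0.10, 0.10])

fused = fuse(emg, prior)
best = GRASP_PATTERNS[int(np.argmax(fused))]
print(best)  # the vision prior overrides the EMG classifier's top guess
```

Here the EMG classifier alone would pick the first pattern, but the object's grasp-usage prior shifts the fused decision to the second — the error-correction effect the paper attributes to the vision branch.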