Imperative Methodology to Detect the Palm Gestures (American Sign Language) using YOLOv5 and MediaPipe

Gouri Anilkumar, M. S. Fouzia, G. S. Anisha
{"title":"使用Y010v5和MediaPipe检测手掌手势(美国手语)的命令式方法","authors":"Gouri Anilkumar, M. S. Fouzia, G. S. Anisha","doi":"10.1109/CONIT55038.2022.9847703","DOIUrl":null,"url":null,"abstract":"Humans place a high importance on the ability to interact. People with hearing or speaking difficulties had trouble expressing themselves. Despite the fact that sign language solved the problem, they were still unable to engage with the general populace., necessitating the development of sign language detectors. A variety of sign language detection algorithms are effectively open. This research investigates two well-known models for recognizing American Sign Language Gestures for alphabets: MediaPipe-LSTM and YOLO v5-PyTorch. They were given custom datasets., and the outcomes were inferred and compared to see how accurate and effective the models were.","PeriodicalId":270445,"journal":{"name":"2022 2nd International Conference on Intelligent Technologies (CONIT)","volume":"314 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Imperative Methodology to Detect the Palm Gestures (American Sign Language) using Y010v5 and MediaPipe\",\"authors\":\"Gouri Anilkumar, M. S. Fouzia, G. S. Anisha\",\"doi\":\"10.1109/CONIT55038.2022.9847703\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Humans place a high importance on the ability to interact. People with hearing or speaking difficulties had trouble expressing themselves. Despite the fact that sign language solved the problem, they were still unable to engage with the general populace., necessitating the development of sign language detectors. A variety of sign language detection algorithms are effectively open. This research investigates two well-known models for recognizing American Sign Language Gestures for alphabets: MediaPipe-LSTM and YOLO v5-PyTorch. They were given custom datasets., and the outcomes were inferred and compared to see how accurate and effective the models were.\",\"PeriodicalId\":270445,\"journal\":{\"name\":\"2022 2nd International Conference on Intelligent Technologies (CONIT)\",\"volume\":\"314 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 2nd International Conference on Intelligent Technologies (CONIT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CONIT55038.2022.9847703\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 2nd International Conference on Intelligent Technologies (CONIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CONIT55038.2022.9847703","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

Humans place a high value on the ability to interact. People with hearing or speech difficulties have trouble expressing themselves. Although sign language solves this problem for them, they are still often unable to engage with the general populace, which motivates the development of sign language detectors. A variety of sign language detection algorithms are openly available. This research investigates two well-known approaches for recognizing American Sign Language alphabet gestures: MediaPipe-LSTM and YOLOv5-PyTorch. Both were given custom datasets, and the outcomes were inferred and compared to see how accurate and effective the models were.
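The abstract describes the two pipelines only at a high level. For the first, a minimal sketch of the hand-landmark extraction step that a MediaPipe-LSTM pipeline typically relies on is given below, assuming the standard mp.solutions.hands API; the image file name and the downstream LSTM classifier are illustrative assumptions, not the authors' released code.

```python
# Sketch of the landmark-extraction stage of a MediaPipe-based pipeline.
# The sample image name and the idea of feeding the vector to an LSTM are
# assumptions for illustration, not the paper's actual implementation.
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

def extract_hand_landmarks(frame_bgr, hands):
    """Return a flat (63,) array of x, y, z for 21 hand landmarks, or zeros if no hand is found."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if result.multi_hand_landmarks:
        lm = result.multi_hand_landmarks[0].landmark
        return np.array([[p.x, p.y, p.z] for p in lm]).flatten()
    return np.zeros(21 * 3)

with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    frame = cv2.imread("asl_letter_A.jpg")           # hypothetical sample image
    features = extract_hand_landmarks(frame, hands)  # shape (63,), suitable as one LSTM time step
    print(features.shape)
```

A sequence of such per-frame vectors would then be stacked and passed to an LSTM classifier over the alphabet classes.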
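For the second pipeline, the sketch below shows one common way to run a custom-trained YOLOv5 model through PyTorch's torch.hub interface. The weights file "asl_yolov5.pt" and the test image are placeholders, not artifacts shipped with the paper.

```python
# Sketch of YOLOv5 inference via torch.hub with custom-trained weights.
# "asl_yolov5.pt" and the test image are hypothetical placeholders.
import torch

# Load custom weights through the ultralytics/yolov5 hub entry point.
model = torch.hub.load("ultralytics/yolov5", "custom", path="asl_yolov5.pt")

results = model("asl_letter_B.jpg")    # run detection on a single image
detections = results.pandas().xyxy[0]  # bounding boxes, confidences, class names
print(detections[["name", "confidence"]])
```

In this setup each ASL letter is treated as an object class, so the detector localizes the hand and predicts the letter in a single forward pass.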