Gesture-controlled omnidirectional autonomous vehicle: A web-based approach for gesture recognition

Impact Factor: 2.3 · Q2 (Computer Science, Theory & Methods)
Array · Publication date: 2025-05-19 · DOI: 10.1016/j.array.2025.100408
Huma Zia, Bara Fteiha, Maha Abdulnasser, Tafleh Saleh, Fatima Suliemn, Kawther Alagha, Jawad Yousaf, Mohammed Ghazal
{"title":"Gesture-controlled omnidirectional autonomous vehicle: A web-based approach for gesture recognition","authors":"Huma Zia,&nbsp;Bara Fteiha,&nbsp;Maha Abdulnasser,&nbsp;Tafleh Saleh,&nbsp;Fatima Suliemn,&nbsp;Kawther Alagha,&nbsp;Jawad Yousaf,&nbsp;Mohammed Ghazal","doi":"10.1016/j.array.2025.100408","DOIUrl":null,"url":null,"abstract":"<div><div>This paper presents a novel web-based hand/thumb gesture recognition model, validated through the implementation of a gesture-controlled omnidirectional autonomous vehicle. Utilizing a custom-trained YOLOv5s model, the system efficiently translates user gestures into precise control signals, facilitating real-time vehicle operation under five commands: forward, backward, left, right, and stop. Integration with Raspberry Pi hardware, including a camera and peripherals, enables rapid live video processing with a latency of 150–300 ms and stable frame rates of 12–18 FPS. The system demonstrates reliable performance with a classification accuracy of 94.2%, validated across multiple gesture classes through statistical analysis, including confusion matrices and ANOVA testing. A user-friendly web interface, built using TensorFlow.js, Node.js, and WebSocket, enhances usability by providing seamless live video streaming and real-time, device-agnostic control directly in the browser without requiring wearable sensors or external processing. The system’s key contributions include: (1) robust real-time hand gesture recognition using YOLOv5s; (2) seamless Raspberry Pi–Arduino integration; (3) a browser-based interface enabling accessible, scalable deployment; and (4) empirical validation across functional, environmental, and statistical performance metrics. This innovation marks a significant advancement in the practical application of hand gesture control within robotics. It offers a flexible and cost-effective alternative to sensor-based systems and serves as a foundation for future developments in autonomous vehicles, human-machine interaction, assistive technologies, automation, and AI-driven interfaces. By eliminating the existing systems’ need for wearable technology, specialized hardware, or complex setups, this work expands the potential for deploying intuitive, touch-free control systems across diverse real-world domains.</div></div>","PeriodicalId":8417,"journal":{"name":"Array","volume":"26 ","pages":"Article 100408"},"PeriodicalIF":2.3000,"publicationDate":"2025-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Array","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2590005625000359","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Cited by: 0

Abstract

This paper presents a novel web-based hand/thumb gesture recognition model, validated through the implementation of a gesture-controlled omnidirectional autonomous vehicle. Utilizing a custom-trained YOLOv5s model, the system efficiently translates user gestures into precise control signals, facilitating real-time vehicle operation under five commands: forward, backward, left, right, and stop. Integration with Raspberry Pi hardware, including a camera and peripherals, enables rapid live video processing with a latency of 150–300 ms and stable frame rates of 12–18 FPS. The system demonstrates reliable performance with a classification accuracy of 94.2%, validated across multiple gesture classes through statistical analysis, including confusion matrices and ANOVA testing. A user-friendly web interface, built using TensorFlow.js, Node.js, and WebSocket, enhances usability by providing seamless live video streaming and real-time, device-agnostic control directly in the browser without requiring wearable sensors or external processing. The system’s key contributions include: (1) robust real-time hand gesture recognition using YOLOv5s; (2) seamless Raspberry Pi–Arduino integration; (3) a browser-based interface enabling accessible, scalable deployment; and (4) empirical validation across functional, environmental, and statistical performance metrics. This innovation marks a significant advancement in the practical application of hand gesture control within robotics. It offers a flexible and cost-effective alternative to sensor-based systems and serves as a foundation for future developments in autonomous vehicles, human-machine interaction, assistive technologies, automation, and AI-driven interfaces. By eliminating the existing systems’ need for wearable technology, specialized hardware, or complex setups, this work expands the potential for deploying intuitive, touch-free control systems across diverse real-world domains.
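The abstract describes the recognition-to-actuation pipeline (custom YOLOv5s inference on live camera frames, mapping of the detected gesture to one of five drive commands, and a Raspberry Pi to Arduino hand-off) without code. Below is a minimal Python sketch of that path under stated assumptions: the weights file name (gestures.pt), the gesture class labels, the serial port, and the one-byte command codes are illustrative, not details taken from the paper.

    # Minimal sketch (not the authors' implementation): classify the hand gesture in
    # each camera frame with a custom-trained YOLOv5s model, map it to one of the five
    # drive commands, and forward the command to the Arduino over a serial link.
    # Assumed: weights file "gestures.pt", class names matching the five commands,
    # Arduino listening on /dev/ttyUSB0 for one-byte command codes.
    import cv2
    import serial
    import torch

    COMMANDS = {"forward": b"F", "backward": b"B", "left": b"L", "right": b"R", "stop": b"S"}

    model = torch.hub.load("ultralytics/yolov5", "custom", path="gestures.pt")
    model.conf = 0.5  # confidence threshold to suppress spurious detections

    arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
    cap = cv2.VideoCapture(0)  # Raspberry Pi camera exposed through V4L2

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        results = model(frame[:, :, ::-1])  # YOLOv5 expects RGB; OpenCV frames are BGR
        detections = results.pandas().xyxy[0]
        if len(detections):
            # issue the command associated with the highest-confidence gesture
            label = detections.sort_values("confidence", ascending=False).iloc[0]["name"]
            arduino.write(COMMANDS.get(label, b"S"))
        else:
            arduino.write(b"S")  # no gesture detected: stop as the safe default

In the system the paper describes, the recognized command and the live video are also relayed to the browser interface over WebSocket (Node.js, TensorFlow.js); that relay is omitted here for brevity.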
Source journal: Array (Computer Science, General Computer Science)
CiteScore: 4.40
Self-citation rate: 0.00%
Articles published: 93
Review time: 45 days