Former-HGR: Hand Gesture Recognition With Hybrid Feature-Aware Transformer

Authors: Monu Verma; Garvit Gopalani; Saiyam Bharara; Santosh Kumar Vipparthi; Subrahmanyam Murala; Mohamed Abdel-Mottaleb
Journal: IEEE Sensors Letters, vol. 9, no. 6, pp. 1-4, published 2025-03-15, doi: 10.1109/LSENS.2025.3566022
URL: https://ieeexplore.ieee.org/document/11005620/
Hand gesture recognition (HGR) systems, built on cameras and sensors, offer an intuitive method for human–machine interaction and have sparked interest across a wide range of applications. However, these systems face challenges from environmental factors such as variations in illumination, complex backgrounds, diverse hand shapes, and similarities between different gesture classes. Achieving accurate gesture recognition under such conditions remains a complex task, necessitating robust solutions that ensure reliable performance. This letter proposes a novel approach named Former-HGR, a hybrid feature-aware transformer for HGR. Unlike traditional transformer-based HGR systems that rely heavily on computationally intensive self-attention mechanisms, Former-HGR enhances global feature perception by applying self-attention across channels through the integration of multi-Dconv head transposed attention. In addition, Former-HGR improves feature extraction by incorporating multiscale features and effectively filters redundant information using a hybrid feature-aware network. Extensive experiments on three datasets (NUS-II, OUHANDS, and MUGD) demonstrate that Former-HGR outperforms recent benchmark HGR approaches, achieving accuracy improvements of up to 14% under person-independent validation schemes.
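The key efficiency idea in the abstract, applying self-attention across channels rather than spatial positions, can be sketched in a few lines. The snippet below is an illustrative NumPy approximation of channel-wise (transposed) attention, not the authors' implementation: a real multi-Dconv head transposed attention block would derive queries, keys, and values with depthwise convolutions and use multiple heads; the single-head form, the shared Q=K, and the `temperature` parameter here are simplifying assumptions.

```python
import numpy as np

def channel_attention(x, temperature=1.0):
    """Simplified channel-wise (transposed) self-attention sketch.

    Instead of the usual (HW x HW) spatial attention map, the attention
    matrix is (C x C): channels attend to channels, so the cost scales
    with the channel count rather than the image resolution.

    x: feature map of shape (C, H*W). For brevity Q = K and V = x.
    """
    # L2-normalize each channel so dot products act like cosine similarities
    q = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    k = q
    scores = (q @ k.T) * temperature              # (C, C) channel-to-channel scores
    scores = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn = scores / scores.sum(axis=1, keepdims=True)  # softmax over channels
    return attn @ x                               # reweighted channels, (C, H*W)

# Toy example: 4 channels over an 8x8 feature map flattened to 64 positions
feat = np.random.randn(4, 64)
out = channel_attention(feat)
print(out.shape)  # (4, 64)
```

Note the design trade-off this illustrates: the attention matrix stays a fixed (C, C) size regardless of input resolution, which is what makes the channel-wise formulation cheaper than standard spatial self-attention on high-resolution feature maps.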