IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 47, no. 8, pp. 6715-6730
Authors: Yitao Qiao; Wenxiong Kang; Dacan Luo; Junduan Huang
DOI: 10.1109/TPAMI.2025.3564514
Published: 2025-04-28
URL: https://ieeexplore.ieee.org/document/10978896/
Normalized-Full-Palmar-Hand: Toward More Accurate Hand-Based Multimodal Biometrics
Hand-based multimodal biometrics have attracted significant attention due to their high security and strong performance. However, existing methods fail to adequately decouple the various hand biometric traits, limiting the extraction of unique features. Moreover, effective feature extraction for multiple hand traits remains a challenge. To address these issues, we propose a novel method for the precise decoupling of hand multimodal features, called 'Normalized-Full-Palmar-Hand', and construct an authentication system based on this method. First, we propose HSANet, which accurately segments the various hand regions against diverse backgrounds using low-level details and high-level semantic information. Next, we establish two hand multimodal biometric databases with HSANet: SCUT Normalized-Full-Palmar-Hand Database Version 1 (SCUT_NFPH_v1) and Version 2 (SCUT_NFPH_v2). These databases include full hand images, semantic masks, and images of the various hand biometric traits obtained from the same individual at the same scale, totaling 157,500 images. Third, we propose the Full Palmar Hand Authentication Network framework (FPHandNet) to extract unique features from multiple hand biometric traits. Finally, extensive experiments conducted on the publicly available CASIA, IITD, and COEP databases, as well as our proposed databases, validate the effectiveness of our methods.
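The pipeline the abstract outlines — segment the hand semantically, decouple each trait region at a common scale, extract a feature per trait, then fuse them for matching — can be illustrated with a toy sketch. This is not the paper's HSANet/FPHandNet implementation; the trait labels, the histogram descriptor, and the cosine matcher below are all simplified placeholders standing in for the learned components:

```python
import numpy as np

# Hypothetical trait labels for a semantic mask; the paper's actual
# label scheme is not given in the abstract.
TRAITS = {"palm": 1, "fingers": 2}

def crop_trait(image, mask, label, out_size=16):
    """Crop one trait's bounding box and resample it to a common
    scale (nearest neighbour), so traits become comparable."""
    ys, xs = np.nonzero(mask == label)
    crop = image[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    ri = np.linspace(0, crop.shape[0] - 1, out_size).astype(int)
    ci = np.linspace(0, crop.shape[1] - 1, out_size).astype(int)
    return crop[np.ix_(ri, ci)]

def trait_feature(patch, bins=8):
    """Toy per-trait descriptor: an L2-normalised intensity histogram
    (a stand-in for a learned feature extractor)."""
    h, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    h = h.astype(float)
    return h / (np.linalg.norm(h) + 1e-12)

def hand_embedding(image, mask):
    """Decouple traits via the mask, then concatenate their features."""
    return np.concatenate([trait_feature(crop_trait(image, mask, lbl))
                           for lbl in TRAITS.values()])

def cosine(a, b):
    """Similarity score for authentication-style matching."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
img = rng.random((64, 64))                # grayscale hand image, values in [0, 1]
mask = np.zeros((64, 64), dtype=int)
mask[10:40, 10:40] = 1                    # "palm" region
mask[45:60, 5:60] = 2                     # "fingers" region
emb = hand_embedding(img, mask)
score = cosine(emb, emb)                  # self-match should score ~1.0
```

In a real system each trait's descriptor would come from a trained network branch and the score would be thresholded against an operating point; the sketch only shows how a shared semantic mask lets per-trait features be extracted independently and fused.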