SF2Net: Sequence Feature Fusion Network for Palmprint Verification

IF 8.0 | JCR Q1 (Computer Science, Theory & Methods) | CAS Region 1, Computer Science
Yunlong Liu; Lu Leng; Ziyuan Yang; Andrew Beng Jin Teoh; Bob Zhang
DOI: 10.1109/TIFS.2025.3611692
IEEE Transactions on Information Forensics and Security, vol. 20, pp. 9936-9949. Published 2025-09-18.
Full text: https://ieeexplore.ieee.org/document/11172301/
Citations: 0

Abstract

SF2Net: Sequence Feature Fusion Network for Palmprint Verification
Currently, global features are usually extracted directly from local patterns in palmprint verification. Moreover, sequence features for palmprint verification are used only as local features, so their properties are not fully exploited. To address this issue, this paper introduces the Sequence Feature Fusion Network (SF2Net) for palmprint verification. SF2Net proposes a new paradigm: using stable, spatially correlated sequence features as an intermediate bridge to generate robust global representations. Its core mechanism first extracts fine-grained local features, which a Sequence Feature Extractor (SFE) then converts into sequence features. Finally, the sequence features serve as a superior input for capturing high-quality global features. By fusing multi-order texture-based local features with globally extracted sequence features, SF2Net achieves superior discrimination. To ensure high accuracy even with limited training data, a hybrid loss function is proposed that integrates a cross-entropy loss and a triplet loss; the triplet loss effectively optimizes feature separation by explicitly considering negative samples. Extensive experiments on multiple publicly available palmprint datasets demonstrate that SF2Net achieves state-of-the-art (SOTA) performance. Remarkably, even with a small training-to-testing ratio (1:9), SF2Net achieves 100% accuracy, surpassing SOTA methods on several benchmark datasets. The code is released at https://github.com/20201422/SF2Net
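The hybrid loss described in the abstract (cross-entropy plus triplet loss) can be sketched in general form as follows. This is a minimal illustration of the standard technique, not the paper's exact formulation: the margin, the weighting factor `lam`, and the toy embeddings are assumptions for illustration only.

```python
import math

def cross_entropy(logits, label):
    """Softmax cross-entropy for a single sample (numerically stabilized)."""
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum - logits[label]

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=0.5):
    """Hinge loss pushing the negative farther from the anchor
    than the positive, by at least `margin`."""
    return max(0.0, euclidean(anchor, positive)
               - euclidean(anchor, negative) + margin)

def hybrid_loss(logits, label, anchor, positive, negative,
                lam=1.0, margin=0.5):
    """Cross-entropy plus a weighted triplet term."""
    return (cross_entropy(logits, label)
            + lam * triplet_loss(anchor, positive, negative, margin))

# Toy example: a well-separated triplet contributes zero triplet loss,
# so the hybrid loss reduces to the cross-entropy term alone.
anchor, pos, neg = [0.0, 0.0], [0.1, 0.0], [2.0, 2.0]
loss = hybrid_loss([2.0, -1.0], 0, anchor, pos, neg)
```

The triplet term matters precisely when a negative sample lies close to the anchor: the hinge stays positive until the negative is pushed `margin` farther away than the positive, which is what "explicitly considering negative samples" buys over cross-entropy alone.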
Source journal
IEEE Transactions on Information Forensics and Security (Engineering & Technology: Electrical & Electronic Engineering)
CiteScore: 14.40
Self-citation rate: 7.40%
Annual publications: 234
Review time: 6.5 months
Journal scope: The IEEE Transactions on Information Forensics and Security covers the sciences, technologies, and applications relating to information forensics, information security, biometrics, surveillance, and systems applications that incorporate these features.