IEEE Transactions on Biometrics, Behavior, and Identity Science: Latest Articles

IEEE Transactions on Biometrics, Behavior, and Identity Science Publication Information
Pub Date: 2025-06-26 · DOI: 10.1109/TBIOM.2025.3577282
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. C2–C2. Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11052648
Citations: 0
IEEE Transactions on Biometrics, Behavior, and Identity Science Information for Authors
Pub Date: 2025-06-26 · DOI: 10.1109/TBIOM.2025.3577281
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. C3–C3. Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11052639
Citations: 0
IEEE Transactions on Biometrics, Behavior, and Identity Science Publication Information
Pub Date: 2025-03-25 · DOI: 10.1109/TBIOM.2025.3548256
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 2, pp. C2–C2. Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10938747
Citations: 0
IEEE Transactions on Biometrics, Behavior, and Identity Science Information for Authors
Pub Date: 2025-03-25 · DOI: 10.1109/TBIOM.2025.3548257
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 2, pp. C3–C3. Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10938740
Citations: 0
Detecting Near-Duplicate Face Images
Pub Date: 2025-03-05 · DOI: 10.1109/TBIOM.2025.3548541
Sudipta Banerjee; Arun Ross
Abstract: Near-duplicate images are often generated when applying repeated photometric and geometric transformations that produce imperceptible variants of the original image. Consequently, a deluge of near-duplicates can be circulated online, posing copyright infringement concerns. The concerns are more severe when biometric data is altered through such nuanced transformations. In this work, we address the challenge of near-duplicate detection in face images by, firstly, identifying the original image from a set of near-duplicates and, secondly, deducing the relationship between the original image and the near-duplicates. We construct a tree-like structure, called an Image Phylogeny Tree (IPT), using a graph-theoretic approach to estimate the relationship, i.e., determine the sequence in which they have been generated. We further extend our method to create an ensemble of IPTs known as Image Phylogeny Forests (IPFs). We rigorously evaluate our method to demonstrate robustness across other modalities, unseen transformations by the latest generative models, and IPT configurations, thereby significantly advancing the state-of-the-art performance by ~42% on IPF reconstruction accuracy. Our code is publicly available at https://github.com/sudban3089/DetectingNear-Duplicates.
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. 498–511.
Citations: 0
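The abstract describes ordering near-duplicates with a graph-theoretic Image Phylogeny Tree. The sketch below is a hedged illustration of that general idea only, not the authors' algorithm: it scores every ordered pair of images with a hypothetical asymmetric dissimilarity (the residual after fitting a simple brightness/contrast transform from one image to the other) and extracts a minimum spanning arborescence with networkx, taking the root as the presumed original.

```python
# Illustrative IPT sketch under stated assumptions; not the paper's method.
import itertools
import numpy as np
import networkx as nx


def pair_cost(src: np.ndarray, dst: np.ndarray) -> float:
    """Asymmetric cost of explaining dst as a transformed copy of src.

    A least-squares brightness/contrast fit (dst ~= a*src + b) stands in for the
    richer photometric/geometric models a real system would use.
    """
    x, y = src.ravel().astype(float), dst.ravel().astype(float)
    a, b = np.polyfit(x, y, 1)                    # fit dst = a*src + b
    return float(np.mean((a * x + b - y) ** 2))   # residual energy


def build_ipt(images: list[np.ndarray]) -> nx.DiGraph:
    """Return a directed spanning tree whose edges point parent -> child."""
    g = nx.DiGraph()
    for i, j in itertools.permutations(range(len(images)), 2):
        g.add_edge(i, j, weight=pair_cost(images[i], images[j]))
    # Minimum spanning arborescence (Edmonds' algorithm) over the pairwise costs.
    return nx.minimum_spanning_arborescence(g)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    original = rng.random((64, 64))
    child = np.clip(0.9 * original + 0.05, 0, 1)      # photometric variant
    grandchild = np.clip(0.9 * child + 0.05, 0, 1)    # variant of the variant
    ipt = build_ipt([original, child, grandchild])
    root = next(n for n in ipt if ipt.in_degree(n) == 0)
    print("presumed original:", root, "edges:", list(ipt.edges))
```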
Fingerprint Spoof Generation Using Style Transfer
Pub Date: 2025-02-24 · DOI: 10.1109/TBIOM.2025.3545308
Abdarahmane Wone; Joël Di Manno; Christophe Charrier; Christophe Rosenberger
Abstract: Nowadays, biometrics is becoming more and more present in our everyday lives. It is used in ID documents, border controls, authentication, e-payment, and more. Therefore, ensuring the security of biometric systems has become a major concern. The certification process aims at qualifying the behavior of a biometric system and verifying its conformity to international specifications. It involves the evaluation of the system's performance and its robustness to attacks. Anti-spoofing tests require the creation of physical presentation attack instruments (PAIs), which are used to evaluate the robustness of biometric systems against spoofing through multiple attempts of testing on the device. In this article, we propose a new solution based on deep learning to generate synthetic fingerprint spoof images from a small dataset of real-life images acquired by a specific sensor. We artificially modify these images to simulate how they would appear if generated from known spoof materials usually involved in fingerprint spoofing tests. Experiments on LivDet datasets show, first, that synthetic fingerprint spoof images give performance similar to real-life ones from a matching point of view only and, second, that injection attacks succeed 50% of the time for most of the materials we tested.
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. 512–523.
Citations: 0
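The paper applies style transfer to make live fingerprint captures look as if they came from known spoof materials. The following is a hedged sketch of the generic Gram-matrix style-transfer technique (Gatys-style optimization with frozen VGG-16 features), not the authors' pipeline; layer choices, weights, and the omission of ImageNet normalization are illustrative assumptions.

```python
# Hedged sketch: Gram-matrix style transfer to impose a spoof-material texture
# on a live fingerprint image. Not the authors' architecture.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16


def gram(feat: torch.Tensor) -> torch.Tensor:
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)


def stylize(content: torch.Tensor, style: torch.Tensor,
            steps: int = 200, style_weight: float = 1e5) -> torch.Tensor:
    """content: live fingerprint, style: spoof-material reference; both (1, 3, H, W) in [0, 1]."""
    features = vgg16(weights="IMAGENET1K_V1").features.eval()
    for p in features.parameters():
        p.requires_grad_(False)
    keep = {3, 8, 15, 22}  # relu1_2, relu2_2, relu3_3, relu4_3

    def extract(x):
        feats = []
        for i, layer in enumerate(features):
            x = layer(x)
            if i in keep:
                feats.append(x)
        return feats

    content_feats = extract(content)
    style_grams = [gram(f) for f in extract(style)]
    out = content.clone().requires_grad_(True)
    opt = torch.optim.Adam([out], lr=0.01)
    for _ in range(steps):
        opt.zero_grad()
        feats = extract(out)
        content_loss = F.mse_loss(feats[-1], content_feats[-1])
        style_loss = sum(F.mse_loss(gram(f), g) for f, g in zip(feats, style_grams))
        (content_loss + style_weight * style_loss).backward()
        opt.step()
    return out.detach().clamp(0, 1)
```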
What’s Color Got to Do With It? Face Recognition in Grayscale
Pub Date: 2025-02-14 · DOI: 10.1109/TBIOM.2025.3542316
Aman Bhatta; Domingo Mery; Haiyu Wu; Joyce Annan; Michael C. King; Kevin W. Bowyer
Abstract: State-of-the-art deep CNN face matchers are typically created using extensive training sets of color face images. Our study reveals that such matchers attain virtually identical accuracy when trained on either grayscale or color versions of the training set, even when the evaluation is done using color test images. Furthermore, we demonstrate that shallower models, lacking the capacity to model complex representations, rely more heavily on low-level features such as those associated with color. As a result, they display diminished accuracy when trained with grayscale images. We then consider possible causes for deeper CNN face matchers “not seeing color”. Popular Web-scraped face datasets actually have 30 to 60% of their identities with one or more grayscale images. We analyze whether this grayscale element in the training set impacts the accuracy achieved, and conclude that it does not. We demonstrate that using only grayscale images for both training and testing achieves accuracy comparable to that achieved using only color images for deeper models. This holds true for both real and synthetic training datasets. HSV color space, which separates chroma and luma information, does not improve the network’s learning about color any more than the RGB color space. We then show that the skin region of an individual’s images in a Web-scraped training set exhibits significant variation in their mapping to color space. This suggests that color carries limited identity-specific information. We also show that when the first convolution layer is restricted to a single filter, models learn a grayscale conversion filter and pass a grayscale version of the input color image to the next layer. Finally, we demonstrate that leveraging the lower per-image storage for grayscale to increase the number of images in the training set can improve accuracy of face recognition.
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. 484–497.
Citations: 0
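The core experimental manipulation here is training a standard color (3-channel) face matcher on grayscale versions of the images. The sketch below shows one common way to do that with torchvision transforms; the dataset path, backbone, and loss are placeholders and not the paper's exact setup.

```python
# Minimal sketch: train a face matcher on grayscale versions of a color dataset.
# Paths, backbone, and loss are hypothetical stand-ins, not the paper's configuration.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

train_tf = transforms.Compose([
    transforms.Resize((112, 112)),
    transforms.Grayscale(num_output_channels=3),  # drop chroma, keep a 3-channel input shape
    transforms.ToTensor(),
])

# Hypothetical ImageFolder layout: one sub-directory per identity.
train_set = datasets.ImageFolder("faces/train", transform=train_tf)
loader = torch.utils.data.DataLoader(train_set, batch_size=128, shuffle=True)

backbone = models.resnet50(weights=None)                       # stand-in for a face CNN
backbone.fc = nn.Linear(backbone.fc.in_features, len(train_set.classes))
opt = torch.optim.SGD(backbone.parameters(), lr=0.1, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()                                # softmax proxy for margin-based losses

for images, labels in loader:      # one pass shown; real training runs many epochs
    opt.zero_grad()
    loss = loss_fn(backbone(images), labels)
    loss.backward()
    opt.step()
```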
WePerson: Generalizable Re-Identification From Synthetic Data With Single Query Adaptation
Pub Date: 2025-02-11 · DOI: 10.1109/TBIOM.2025.3540919
He Li; Mang Ye; Kehua Su; Bo Du
Abstract: Person re-identification (ReID) aims to retrieve a target person across non-overlapping cameras. Due to the uncontrollable environment and privacy concerns, the diversity and scale of real-world training data are usually limited, resulting in poor testing generalizability. To overcome these problems, we introduce a large-scale Weather Person dataset that generates synthetic images with different weather conditions, complex scenes, natural lighting changes, and various pedestrian accessories in a simulated camera network. The environment is fully controllable, supporting factor-by-factor analysis. To narrow the gap between synthetic data and real-world scenarios, this paper introduces a simple yet efficient domain generalization method via Single Query Adaptation (SQA), calibrating the statistics and transformation parameters in BatchNorm layers with only a single query image in the target domain. This significantly improves performance through a single adaptation epoch, greatly boosting the applicability of the ReID technique for intelligent surveillance systems. Abundant experimental results demonstrate that the WePerson dataset achieves superior performance under the direct transfer setting without any real-world data training. In addition, the proposed SQA method shows strong robustness in real-to-real and synthetic-to-real ReID, and under various corruption settings. Dataset and code are available at https://github.com/lihe404/WePerson.
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. 458–470.
Citations: 0
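SQA, as summarized above, calibrates BatchNorm statistics and transformation parameters from a single target-domain query. The sketch below illustrates only the general idea of test-time BatchNorm statistic re-estimation from one image in PyTorch; it does not reproduce the paper's adaptation of the BN transformation parameters, and the model and batch-repetition trick are assumptions.

```python
# Hedged sketch of test-time BatchNorm recalibration from a single query image;
# illustrates the general single-query-adaptation idea, not the exact SQA method.
import torch
import torch.nn as nn
from torchvision import models


def recalibrate_bn(model: nn.Module, query: torch.Tensor) -> nn.Module:
    """Re-estimate the running mean/var of every BatchNorm layer from one query image.

    query: a (1, 3, H, W) tensor from the target domain.
    """
    for m in model.modules():
        if isinstance(m, nn.modules.batchnorm._BatchNorm):
            m.reset_running_stats()   # forget source-domain statistics
            m.momentum = None         # cumulative average over subsequent forward passes
            m.train()                 # BN updates running stats only in train mode
    with torch.no_grad():
        # Repeat the single query so BN sees a batch; an assumption made for illustration.
        model(query.repeat(8, 1, 1, 1))
    return model.eval()


if __name__ == "__main__":
    reid_backbone = models.resnet50(weights=None)   # placeholder for a ReID backbone
    dummy_query = torch.rand(1, 3, 256, 128)        # typical person-crop resolution
    recalibrate_bn(reid_backbone, dummy_query)
```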
Synthetic Face Ageing: Evaluation, Analysis and Facilitation of Age-Robust Facial Recognition Algorithms
Pub Date: 2025-01-30 · DOI: 10.1109/TBIOM.2025.3536622
Wang Yao; Muhammad Ali Farooq; Joseph Lemley; Peter Corcoran
Abstract: Establishing the identity of an individual from their facial data is widely adopted across the consumer sector, driven by the use of facial authentication on handheld devices. This widespread use of facial authentication technology has raised other issues, in particular those of biases in the underlying algorithms. Initial studies focused on ethnic or gender biases, but another area is that of age-related biases. This research work focuses on the challenge of face recognition over decades-long time intervals and explores the feasibility of utilizing synthetic ageing data to improve the robustness of face recognition models in recognizing people across these longer time intervals. To achieve this, we first design a set of experiments to evaluate state-of-the-art synthetic ageing methods. In the next stage, we explore the effect of age intervals on a reference face recognition algorithm using both synthetic and real ageing data to perform rigorous validation. We then use these synthetic age data as an augmentation method to facilitate the age-invariant face recognition algorithm. Extensive experimental results demonstrate a notable improvement in the recognition rate of the model trained on synthetic ageing images, with an increase of 3.33% compared to the baseline model when tested on images with a 40-year age gap. Additionally, our models exhibit competitive performance when validated on benchmark cross-age datasets and general face recognition datasets. These findings underscore the potential of synthetic age data to enhance the performance of age-invariant face recognition systems.
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. 471–483. Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10858190
Citations: 0
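The abstract's key training idea is using synthetically aged images as an augmentation. Below is a hedged sketch of one way to wire such an augmentation into a PyTorch dataset; the directory layout, swap probability, and the assumption that aged variants were generated offline by a face-ageing model are all illustrative and not the paper's protocol.

```python
# Hedged sketch: mix pre-generated synthetically aged images into training.
# Directory layout, ageing model, and swap probability are assumptions.
import random
from pathlib import Path

from PIL import Image
from torch.utils.data import Dataset


class AgeAugmentedFaces(Dataset):
    """Each identity folder holds original images plus an 'aged/' sub-folder of
    synthetically aged variants produced offline by a face-ageing model."""

    def __init__(self, root: str, swap_prob: float = 0.5, transform=None):
        self.samples = []            # (original_path, identity_label)
        self.swap_prob = swap_prob
        self.transform = transform
        for label, person_dir in enumerate(sorted(Path(root).iterdir())):
            for img in person_dir.glob("*.jpg"):
                self.samples.append((img, label))

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        path, label = self.samples[idx]
        aged_dir = path.parent / "aged"
        aged_variants = list(aged_dir.glob(f"{path.stem}_*.jpg")) if aged_dir.exists() else []
        # With probability swap_prob, train on an aged variant of the same identity.
        if aged_variants and random.random() < self.swap_prob:
            path = random.choice(aged_variants)
        img = Image.open(path).convert("RGB")
        if self.transform is not None:
            img = self.transform(img)
        return img, label
```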
Joint Coarse to Fine-Grained Spatio-Temporal Modeling for Video Action Recognition
Pub Date: 2025-01-21 · DOI: 10.1109/TBIOM.2025.3532416
Chunlei Li; Can Cheng; Miao Yu; Zhoufeng Liu; Di Huang
Abstract: The action recognition task involves analyzing video content and temporal relationships between frames to identify actions. Crucial to this process are action representations that effectively capture varying temporal scales and spatial motion variations. To address these challenges, we propose the Joint Coarse to Fine-Grained Spatio-Temporal Modeling (JCFG-STM) approach, which is designed to capture robust spatio-temporal representations through three key components: the Temporal-enhanced Spatio-Temporal Perception (TSTP) module, the Positional-enhanced Spatio-Temporal Perception (PSTP) module, and the Fine-grained Spatio-Temporal Perception (FSTP) module. Specifically, TSTP is designed to fuse temporal information across both local and global spatial scales, while PSTP emphasizes the integration of spatial coordinate directions, both horizontal and vertical, with temporal dynamics. Meanwhile, FSTP focuses on combining spatial coordinate information with short-term temporal data by differentiating neighboring frames, enabling fine-grained spatio-temporal modeling. JCFG-STM effectively focuses on multi-granularity and complementary motion patterns associated with actions. Extensive experiments conducted on large-scale action recognition datasets, including Kinetics-400, Something-Something V2, Jester, and EgoGesture, demonstrate the effectiveness of our approach and its superiority over state-of-the-art methods.
IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 7, no. 3, pp. 444–457.
Citations: 0
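The FSTP module described above captures short-term motion by differentiating neighboring frames. The sketch below shows a generic frame-difference block in PyTorch in the same spirit; the layer sizes, padding of the last frame, and fusion convolution are assumptions, not the authors' JCFG-STM architecture.

```python
# Hedged sketch of a fine-grained temporal-difference block (frame differencing);
# illustrative only, not the authors' FSTP module.
import torch
import torch.nn as nn


class FrameDifferenceBlock(nn.Module):
    """Fuses per-frame appearance features with short-term motion cues obtained by
    subtracting neighboring frames' features."""

    def __init__(self, channels: int):
        super().__init__()
        self.motion_conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, T, C, H, W) clip features
        b, t, c, h, w = x.shape
        diff = x[:, 1:] - x[:, :-1]                    # neighboring-frame differences
        diff = torch.cat([diff, diff[:, -1:]], dim=1)  # pad to keep T frames
        motion = self.motion_conv(diff.reshape(b * t, c, h, w))
        fused = self.fuse(torch.cat([x.reshape(b * t, c, h, w), motion], dim=1))
        return fused.reshape(b, t, c, h, w)


if __name__ == "__main__":
    clip = torch.rand(2, 8, 64, 56, 56)    # 2 clips, 8 frames, 64 channels
    out = FrameDifferenceBlock(64)(clip)
    print(out.shape)                        # torch.Size([2, 8, 64, 56, 56])
```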