{"title":"Handwritten Signature Verification via Multimodal Consistency Learning","authors":"Zhaosen Shi;Fagen Li;Dong Hao;Qinshuo Sun","doi":"10.1109/TIFS.2025.3557674","DOIUrl":null,"url":null,"abstract":"Multimodal handwritten signatures usually involve offline images and online sequences. Since in real-world scenarios, different modalities of the same signature are generated simultaneously, most research hypothesizes that the different modalities are consistent. However, attacks launched on a partial modality (e.g., only tampering on the image modality) of signature data are commonly seen, and will cause the inter-modal inconsistency. In this paper, we propose and analyze the multimodal security and attack levels for handwritten signatures, and provide a multimodal consistency learning method to detect different levels of attacks of signatures. The modalities include not only traditional offline and online data, but also videos capturing hand movements. We collect a number of triple modal signatures to address the scarcity of public handwritten video datasets. Then, we extract hand joint sequences from videos and utilize them to analyze subtle multimodal consistency with the online modality. We provide extensive experiments for the consistency between online and offline signatures, as well as between online signatures and movement videos. The verification involves distance-based and classification-based fusion models, showing the most effective discriminative networks for attack detection and the superiority of consistency learning.","PeriodicalId":13492,"journal":{"name":"IEEE Transactions on Information Forensics and Security","volume":"20 ","pages":"3995-4007"},"PeriodicalIF":6.3000,"publicationDate":"2025-04-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Information Forensics and Security","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10950350/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0
Abstract
Multimodal handwritten signatures usually involve offline images and online sequences. Because the different modalities of the same signature are produced simultaneously in real-world scenarios, most research assumes that they are mutually consistent. However, attacks on a single modality of the signature data (e.g., tampering with only the image modality) are common and introduce inter-modal inconsistency. In this paper, we propose and analyze multimodal security and attack levels for handwritten signatures, and present a multimodal consistency learning method to detect attacks at different levels. The modalities include not only traditional offline and online data but also videos capturing hand movements. We collect a set of triple-modal signatures to address the scarcity of public handwritten-video datasets. We then extract hand-joint sequences from the videos and use them to analyze subtle multimodal consistency with the online modality. We report extensive experiments on the consistency between online and offline signatures, as well as between online signatures and movement videos. The verification employs distance-based and classification-based fusion models, identifying the most effective discriminative networks for attack detection and demonstrating the superiority of consistency learning.
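To make the consistency-verification idea concrete, the following is a minimal sketch, not the authors' implementation: it pairs two hypothetical sequence encoders (one for the online pen trajectory, one for a hand-joint sequence extracted from video) with both a distance-based score and a classification-based fusion head. The encoder architecture (bidirectional GRU), the channel counts (5 online channels; 21 joints × 3 coordinates), and the cosine-distance scoring are illustrative assumptions only.

```python
# Illustrative sketch of multimodal consistency scoring between an online
# signature sequence and a video-derived hand-joint sequence.
# All architecture and dimension choices below are hypothetical, not taken
# from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SequenceEncoder(nn.Module):
    """Embeds a variable-length sequence (batch, T, C) into a unit-norm vector."""

    def __init__(self, in_channels: int, embed_dim: int = 128):
        super().__init__()
        self.gru = nn.GRU(in_channels, embed_dim, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.gru(x)                      # (batch, T, 2 * embed_dim)
        pooled = out.mean(dim=1)                  # temporal average pooling
        return F.normalize(self.proj(pooled), dim=-1)


class ConsistencyModel(nn.Module):
    """Two modality-specific encoders plus two fusion strategies:
    a distance-based score and a classification-based (match/mismatch) head."""

    def __init__(self, online_channels: int = 5, joint_channels: int = 63,
                 embed_dim: int = 128):
        super().__init__()
        self.online_enc = SequenceEncoder(online_channels, embed_dim)
        self.joints_enc = SequenceEncoder(joint_channels, embed_dim)
        self.classifier = nn.Sequential(          # classification-based fusion
            nn.Linear(2 * embed_dim, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, online_seq: torch.Tensor, joint_seq: torch.Tensor):
        z_on = self.online_enc(online_seq)
        z_jt = self.joints_enc(joint_seq)
        distance = 1.0 - F.cosine_similarity(z_on, z_jt)   # distance-based score
        logits = self.classifier(torch.cat([z_on, z_jt], dim=-1))
        return distance, logits


if __name__ == "__main__":
    model = ConsistencyModel()
    online = torch.randn(4, 200, 5)    # e.g., x, y, pressure, azimuth, altitude (hypothetical)
    joints = torch.randn(4, 150, 63)   # e.g., 21 hand joints x 3 coordinates per frame
    dist, logits = model(online, joints)
    # A large distance, or a "mismatch" prediction from the classifier, would
    # flag a partial-modality attack (i.e., inconsistent modalities).
    print(dist.shape, logits.shape)
```

In such a setup, the distance-based branch would typically be trained with a contrastive or triplet objective on genuine (consistent) versus attacked (inconsistent) pairs, while the classification-based branch is trained with cross-entropy; the paper's experiments compare these two fusion styles.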
Journal Description:
The IEEE Transactions on Information Forensics and Security covers the sciences, technologies, and applications relating to information forensics, information security, biometrics, surveillance, and systems applications that incorporate these features.