IIN-FFD: Intra-Inter Network for Face Forgery Detection
Authors: Qihua Zhou, Zhili Zhou, Zhipeng Bao, Weina Niu, Yuling Liu
DOI: 10.26599/TST.2024.9010022
Journal: Tsinghua Science and Technology, vol. 29, no. 6, pp. 1839-1850, published 2024-06-20
IEEE Xplore: https://ieeexplore.ieee.org/document/10566006/
Citations: 0
Abstract
Since different kinds of face forgeries leave similar forgery traces in videos, learning the common features shared across different kinds of forged faces promises strong generalization ability for forgery detection. Therefore, to accurately detect known forgeries while maintaining high generalization ability for detecting unknown forgeries, we propose an intra-inter network (IIN) for face forgery detection (FFD) in videos with continual learning. The proposed IIN mainly consists of three modules, i.e., the intra-module, the inter-module, and the forged trace masking module (FTMM). Specifically, the intra-module is trained for each kind of face forgery by supervised learning to extract forgery-specific features, while the inter-module is trained by self-supervised learning to extract the common features. As a result, the common and specific features of the different forgeries are decoupled by the two feature learning modules, and the decoupled common features can then be utilized to achieve high generalization ability for FFD. Moreover, the FTMM is deployed for contrastive learning to further improve detection accuracy. Experimental results on the FaceForensics++ dataset demonstrate that the proposed IIN outperforms state-of-the-art methods in FFD. In addition, evaluation on the DFDC and Celeb-DF datasets demonstrates that the proposed IIN significantly improves generalization ability for FFD.
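To make the module decomposition concrete, below is a minimal, hypothetical PyTorch sketch of the intra/inter feature-decoupling idea described in the abstract. The backbone, head sizes, masking strategy, and loss weighting are assumptions chosen for illustration, not the authors' implementation; only the overall structure follows the abstract: a supervised intra head predicting the forgery type (forgery-specific features), an inter head producing common features used for the real/fake decision, and an FTMM-like masked view paired with the original frame for contrastive learning.

```python
# Hypothetical sketch only: module names, the toy CNN backbone, the random
# masking, and the loss weighting are assumptions, not the paper's actual design.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IntraInterSketch(nn.Module):
    """Shared backbone feeding two heads that decouple forgery-specific
    ("intra") and forgery-common ("inter") features."""

    def __init__(self, num_forgery_types: int = 4, feat_dim: int = 128):
        super().__init__()
        # Small CNN standing in for the (unspecified) video frame encoder.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Intra-module head: supervised prediction of the forgery type.
        self.intra_head = nn.Linear(64, num_forgery_types)
        # Inter-module head: common features used for the real/fake decision
        # and for the contrastive term.
        self.inter_head = nn.Linear(64, feat_dim)
        self.classifier = nn.Linear(feat_dim, 2)  # real vs. fake

    def forward(self, x):
        h = self.backbone(x)
        return self.intra_head(h), self.inter_head(h)


def contrastive_loss(z1, z2, temperature: float = 0.1):
    """NT-Xent-style loss between a frame and its masked view, loosely
    imitating the role the FTMM plays for contrastive learning."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature
    targets = torch.arange(z1.size(0))
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    model = IntraInterSketch()
    frames = torch.randn(8, 3, 64, 64)                 # batch of face crops
    masked = frames * (torch.rand_like(frames) > 0.3)  # crude stand-in for trace masking
    forgery_type = torch.randint(0, 4, (8,))
    is_fake = torch.randint(0, 2, (8,))

    intra_logits, inter_feat = model(frames)
    _, inter_feat_masked = model(masked)

    loss = (
        F.cross_entropy(intra_logits, forgery_type)               # supervised intra loss
        + F.cross_entropy(model.classifier(inter_feat), is_fake)  # real/fake loss on common features
        + contrastive_loss(inter_feat, inter_feat_masked)         # FTMM-like contrastive term
    )
    loss.backward()
    print(float(loss))
```

In this reading, generalization to unseen forgeries comes from the inter head: only its decoupled common features feed the real/fake classifier, while the intra head absorbs forgery-type-specific cues during training.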
About the Journal
Tsinghua Science and Technology (Tsinghua Sci Technol) started publication in 1996. It is an international academic journal sponsored by Tsinghua University and published bimonthly. The journal presents up-to-date scientific achievements in computer science, electronic engineering, and other IT fields. Contributions from all over the world are welcome.