An Approach of Fake Videos Detection Based on Haar Cascades and Convolutional Neural Network

Ameni Jellali, Ines Ben Fredj, K. Ouni
{"title":"An Approach of Fake Videos Detection Based on Haar Cascades and Convolutional Neural Network","authors":"Ameni Jellali, Ines Ben Fredj, K. Ouni","doi":"10.1109/IC_ASET58101.2023.10150604","DOIUrl":null,"url":null,"abstract":"Because deep fakes might skew our impression of the truth, we need to come up with a method for better spotting them. This paper proposes a new forensic technique to detect manipulated facial images from videos. It is based on CNNs architecture that can distinguish possible face manipulations in the “real-and-fake-face-detection” dataset offered by Kaggle. The results obtained highlight comparable performances with the state-of-the-art methods. It showed an accuracy of approximately 99 % for this binary classification into fake or real faces. Then to validate this model we added a human face detection technique using the Haar Cascade method to this model in order to detect the manipulated videos from Deep Fake Detection Challenge (DFDC) dataset and we achieve an accuracy of 91 correct predictions out of 100 total videos.","PeriodicalId":272261,"journal":{"name":"2023 IEEE International Conference on Advanced Systems and Emergent Technologies (IC_ASET)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Advanced Systems and Emergent Technologies (IC_ASET)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IC_ASET58101.2023.10150604","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Because deep fakes can distort our perception of the truth, better methods for detecting them are needed. This paper proposes a new forensic technique to detect manipulated facial images in videos. It is based on a convolutional neural network (CNN) architecture that distinguishes possible face manipulations in the "real-and-fake-face-detection" dataset offered by Kaggle. The results obtained are comparable to state-of-the-art methods, with an accuracy of approximately 99% for the binary classification of faces as fake or real. To validate the model, we then added a human face detection stage based on the Haar Cascade method and applied the pipeline to manipulated videos from the Deep Fake Detection Challenge (DFDC) dataset, achieving 91 correct predictions out of 100 videos (91% accuracy).
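The two-stage pipeline described in the abstract, Haar cascade face detection followed by CNN classification of the cropped face, can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the model file name `real_vs_fake_cnn.h5`, the 128x128 input size, the frame sampling step, and the score-averaging rule are all hypothetical; only OpenCV's pre-trained frontal-face Haar cascade and a Keras binary classifier trained on the Kaggle dataset are taken as given from the abstract.

```python
# Hypothetical sketch: Haar cascade face detection followed by CNN real/fake
# scoring, aggregated into a per-video label. Model path, input size, and the
# aggregation rule are assumptions, not taken from the paper.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# OpenCV ships the pre-trained frontal-face Haar cascade used here.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# Hypothetical binary classifier (sigmoid output: 1 = real, 0 = fake),
# e.g. trained on the Kaggle "real-and-fake-face-detection" dataset.
model = load_model("real_vs_fake_cnn.h5")
INPUT_SIZE = (128, 128)  # assumed input resolution

def classify_video(path, frame_step=30):
    """Return 'real' or 'fake' by averaging per-face CNN scores over sampled frames."""
    cap = cv2.VideoCapture(path)
    scores = []
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % frame_step == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            for (x, y, w, h) in faces:
                crop = cv2.resize(frame[y:y + h, x:x + w], INPUT_SIZE) / 255.0
                scores.append(float(model.predict(crop[np.newaxis], verbose=0)[0][0]))
        idx += 1
    cap.release()
    if not scores:
        return "no face detected"
    return "real" if np.mean(scores) >= 0.5 else "fake"

print(classify_video("sample_dfdc_clip.mp4"))
```

Averaging per-face scores across sampled frames is one simple way to turn frame-level predictions into a video-level decision; the abstract does not specify how the authors aggregated frame predictions for the DFDC evaluation.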