Real-Time Deepfake System for Live Streaming

Yifei Fan, Modan Xie, Peihan Wu, Gang Yang
{"title":"Real-Time Deepfake System for Live Streaming","authors":"Yifei Fan, Modan Xie, Peihan Wu, Gang Yang","doi":"10.1145/3512527.3531350","DOIUrl":null,"url":null,"abstract":"This paper proposes a real-time deepfake framework to assist users use deep forgery to conduct live streaming, further to protect privacy and increase interesting by selecting different reference faces to create a non-existent fake face. Nowadays, because of the demand for live broadcast functions such as selling goods, playing games, and auctions, the opportunities for anchor exposure are increasing, which leads live streamers pay more attention to their privacy protection. Meanwhile, the traditional technology of deepfake is more likely to infring on the portrait rights of others, so our framework supports users to select different face features for facial tampering to avoid infringement. In our framework, through feature extractor, heatmap transformer, heatmap regression and face blending, face reenactment could be confirmed effectively. Users can enrich the personal face feature database by uploading different photos, and then select the desired picture for tampering on this basis, and finally real-time tampering live broadcast is achieved. 
Moreover, our framework is a closed loop self-adaptation system as it allows users to update the database themselves to extend face feature data and improve conversion efficiency.","PeriodicalId":179895,"journal":{"name":"Proceedings of the 2022 International Conference on Multimedia Retrieval","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 International Conference on Multimedia Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3512527.3531350","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

This paper proposes a real-time deepfake framework that lets users apply deep forgery during live streaming, both to protect their privacy and to add interest, by selecting different reference faces to synthesize a non-existent fake face. With the growing demand for live-broadcast functions such as selling goods, playing games, and holding auctions, streamers face increasing exposure and therefore pay more attention to protecting their privacy. Meanwhile, traditional deepfake technology is prone to infringing on the portrait rights of others, so our framework lets users select different facial features for face tampering and thereby avoid infringement. In our framework, face reenactment is performed effectively through a feature extractor, heatmap transformation, heatmap regression, and face blending. Users can enrich a personal face-feature database by uploading different photos, select the desired picture for tampering on that basis, and finally achieve real-time tampered live broadcast. Moreover, our framework is a closed-loop self-adaptive system: it allows users to update the database themselves, extending the face-feature data and improving conversion efficiency.
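The abstract names a four-stage reenactment pipeline: feature extraction, heatmap transformation, heatmap regression, and face blending. The following is a minimal sketch of how such a pipeline could be wired together; all function names, array shapes, and blending weights are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the four-stage pipeline described in the abstract.
# Every function here is a simplified stand-in, not the paper's actual model.
import numpy as np

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Stand-in feature extractor: average channels on a coarse 8x8-strided grid."""
    h, w = frame.shape[:2]
    return frame[: h - h % 8 : 8, : w - w % 8 : 8].mean(axis=-1)

def transform_heatmap(src_feat: np.ndarray, ref_feat: np.ndarray) -> np.ndarray:
    """Combine source and reference features into a target heatmap."""
    return 0.5 * src_feat + 0.5 * ref_feat

def regress_landmarks(heatmap: np.ndarray, k: int = 5) -> np.ndarray:
    """Heatmap regression stand-in: take the k hottest cells as pseudo-landmarks."""
    flat = np.argsort(heatmap, axis=None)[-k:]
    return np.stack(np.unravel_index(flat, heatmap.shape), axis=1)

def blend_faces(frame: np.ndarray, ref_frame: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Face blending stand-in: alpha-blend the reference face onto the live frame."""
    return (alpha * ref_frame + (1 - alpha) * frame).astype(frame.dtype)

# One live frame and one reference face from the user's database, 64x64 RGB.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
ref = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)

feat = extract_features(frame)                     # stage 1: feature extractor
heat = transform_heatmap(feat, extract_features(ref))  # stage 2: heatmap transform
landmarks = regress_landmarks(heat)                # stage 3: heatmap regression
out = blend_faces(frame, ref)                      # stage 4: face blending
```

In a real-time system, this loop would run per video frame, with the reference features cached from the user's face-feature database so only stages 1 and 4 touch the live frame.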