Jiayi Xu, Xuan Tan, Yixuan Ju, Xiaoyang Mao, Shanqing Zhang
The Visual Computer, published 2024-06-18. DOI: 10.1007/s00371-024-03526-9
High similarity controllable face anonymization based on dynamic identity perception
In metaverse scenarios, with the growth of personalized social networks, interactive behaviors such as uploading and sharing personal and family photographs are becoming increasingly widespread. Consequently, the risk of being searched for, or of leaking personal financial information, increases. A possible solution is to use anonymized face images instead of real images in public settings. Most existing face anonymization methods replace a large portion of the face image to modify identity information. However, the resulting faces are often not similar enough to the originals to the naked eye. To maintain visual coherence as much as possible while evading face recognition systems, we propose to detect the part of the face that is most relevant to identity based on saliency analysis. Furthermore, we preserve identity-irrelevant face features by re-injecting them into the regenerated face. The proposed model consists of three stages. First, a dynamic identity perception network detects the identity-relevant facial region and generates a masked face with the identity removed. Second, a feature selection and preservation network extracts basic semantic attributes from the original face and multilevel identity-irrelevant features from the masked face, then fuses them into conditional feature vectors for face regeneration. Finally, a pre-trained StyleGAN2 generator produces a high-quality identity-obscured face image. Experimental results show that the proposed method obtains more realistic anonymized face images that retain most of the original facial attributes, while deceiving face recognition systems to protect privacy in modern digital economy and entertainment scenarios.
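The first two stages of the pipeline can be illustrated with a minimal sketch. This is not the authors' implementation: the saliency threshold, the use of global statistics as "semantic attributes", and the downsampled means standing in for multilevel features are all simplifying assumptions made purely to show the data flow (mask the identity-relevant region, then fuse identity-irrelevant features into a conditional vector).

```python
import numpy as np

def saliency_mask(face, saliency, thresh=0.6):
    # Stage 1 (sketch): remove the identity-relevant region, taken here
    # as pixels whose identity-saliency score exceeds a threshold.
    # The paper uses a learned dynamic identity perception network instead.
    mask = saliency > thresh
    masked = face.copy()
    masked[mask] = 0.0          # zero out identity-relevant pixels
    return masked, mask

def fuse_condition(face, masked, n_levels=3):
    # Stage 2 (sketch): fuse coarse "semantic attributes" of the original
    # face (here just global mean/std, a placeholder) with crude
    # "multilevel" identity-irrelevant features pooled from the masked
    # face at several scales, yielding one conditional feature vector.
    attrs = np.array([face.mean(), face.std()])
    levels = [masked[::2 ** k, ::2 ** k].mean() for k in range(n_levels)]
    return np.concatenate([attrs, levels])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    face = rng.random((8, 8))      # toy grayscale face
    saliency = rng.random((8, 8))  # toy identity-saliency map
    masked, mask = saliency_mask(face, saliency)
    cond = fuse_condition(face, masked)
    print(cond.shape)              # 2 attributes + 3 pooled levels
```

In the actual model, the conditional vector produced at this point would condition a pre-trained StyleGAN2 generator to regenerate the face (stage 3), which is beyond the scope of this toy sketch.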