AU Data Augmentation Method Based on Generative Adversarial Networks

Qingdan Huang, Liqiang Pei, Yong Wang, Lian Zeng
DOI: 10.1145/3421766.3421825
Published in: Proceedings of the 2nd International Conference on Artificial Intelligence and Advanced Manufacture, 2020-10-15
Citations: 1

Abstract

In facial action unit (AU) recognition, some AUs occur with low probability, producing severe sample imbalance that limits model recognition performance. The generative adversarial network (GAN) is an unsupervised learning method; compared with autoencoders and autoregressive models, it fits the data distribution more fully, trains more efficiently, and generates higher-quality samples. The original GAN trains by continuously optimizing a minimax objective between the generator and the discriminator; the conditional GAN (CGAN) adds conditional constraints to the model input, making the generated results controllable and helping to prevent collapse during training. GANs have been widely applied in research fields such as image processing, natural language processing (NLP), and real-time color correction of underwater images. This paper designs a model based on a conditional generative adversarial network to supplement minority samples of specific AUs and improve the sample distribution space of the action units.
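The two mechanisms the abstract names can be sketched concretely: the minimax objective V(D, G) that the original GAN optimizes, and CGAN-style conditioning, in which a label (here, an AU class) is concatenated to both the generator and discriminator inputs so generation is controllable. The sketch below is a minimal NumPy illustration under assumed toy dimensions and linear stand-in "networks" — it is not the paper's model, and the names (`NOISE_DIM`, `one_hot`, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
NOISE_DIM, LABEL_DIM, DATA_DIM = 8, 4, 16  # illustrative sizes, not from the paper

def one_hot(au_index, n=LABEL_DIM):
    """Encode a hypothetical AU class index as a one-hot condition vector."""
    v = np.zeros(n)
    v[au_index] = 1.0
    return v

# Toy linear maps standing in for the real generator/discriminator networks.
W_g = rng.normal(size=(NOISE_DIM + LABEL_DIM, DATA_DIM))
W_d = rng.normal(size=(DATA_DIM + LABEL_DIM,))

def generator(z, y):
    # CGAN conditioning: the condition y is part of the generator's input.
    return np.tanh(np.concatenate([z, y]) @ W_g)

def discriminator(x, y):
    # The discriminator also sees y, so "real vs. fake" is judged per class.
    logit = np.concatenate([x, y]) @ W_d
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid -> estimated P(real)

y = one_hot(2)                      # condition on one minority AU class
z = rng.normal(size=NOISE_DIM)      # noise input
x_real = rng.normal(size=DATA_DIM)  # placeholder for a real sample
x_fake = generator(z, y)

eps = 1e-12  # numerical guard for log
# Minimax value V(D, G) = E[log D(x|y)] + E[log(1 - D(G(z|y)|y))]:
# the discriminator ascends V while the generator descends the second term.
d_value = np.log(discriminator(x_real, y) + eps) + np.log(1.0 - discriminator(x_fake, y) + eps)
g_loss = np.log(1.0 - discriminator(x_fake, y) + eps)
print(d_value, g_loss)
```

In this framing, augmenting a rare AU amounts to fixing `y` to that AU's label and sampling fresh `z` vectors, so every generated sample belongs to the minority class by construction.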