Label noise learning with the combination of CausalNL and CGAN models
Authors: Zixing Gou, Yifan Sun, Zhebin Jin, Hanqiu Hu, Weiyi Xia
Journal: Applied and Computational Engineering, Volume 48(5)
Published: 2024-07-25 (Journal Article)
DOI: 10.54254/2755-2721/79/20241399
Citations: 0
Abstract
Since deep neural networks easily overfit label errors, which degrades the performance of deep learning algorithms, recent research has proposed many methods for this problem. A recent model, causalNL, uses a structural causal model for instance-dependent label-noise learning and obtains excellent experimental results. Its implementation is based on a VAE, which encodes the latent variables Y and Z from the observable variables X and the noisy labels, and in turn generates the transition matrix. However, it relies on some unreasonable assumptions. In this paper, we introduce CGAN into the causalNL model, which avoids fixing P(Y) and P(Z) to specific distributions: GANs can model data without assuming a particular distributional form. The resulting model, ICC (Introducing CGAN into causalNL), was validated on several authoritative datasets and compared to a variety of proven algorithms, including causalNL. ICC shows excellent training ability on most datasets; notably, it achieves consistently higher accuracy than causalNL on CIFAR-10.
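The abstract's central mechanism, a generator conditioned on the class label trained against a conditional discriminator, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's actual architecture: the toy one-dimensional data, the one-affine-map-per-class generator, the logistic discriminator, and the manual-gradient training loop are all simplifications chosen to keep the conditional-GAN objective visible.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Toy conditional data: P(X | Y=0) = N(-2, 0.5), P(X | Y=1) = N(+2, 0.5).
# Labels are drawn empirically, so no parametric P(Y) is ever specified.
def sample_real(n):
    y = rng.integers(0, 2, size=n)
    x = rng.normal(np.where(y == 0, -2.0, 2.0), 0.5)
    return x, y

# Conditional generator G(z, y): one affine map per class (toy stand-in
# for a neural generator conditioned on the label).
G = {"w": np.ones(2), "b": np.zeros(2)}

def generate(z, y):
    return G["w"][y] * z + G["b"][y]

# Conditional discriminator D(x, y): logistic model on (x, x*y, y, 1).
D = {"theta": np.zeros(4)}

def discriminate(x, y):
    feats = np.stack([x, x * y, y.astype(float), np.ones_like(x)], axis=1)
    return sigmoid(feats @ D["theta"]), feats

def train(steps=2000, n=128, lr=0.05):
    for _ in range(steps):
        xr, yr = sample_real(n)
        z = rng.normal(size=n)
        yf = rng.integers(0, 2, size=n)
        xf = generate(z, yf)
        # Discriminator ascent on E[log D(real)] + E[log(1 - D(fake))]:
        # grad of log sigmoid(f.theta) is (1-p)f; of log(1-sigmoid) is -p f.
        pr, fr = discriminate(xr, yr)
        pf, ff = discriminate(xf, yf)
        D["theta"] += lr * (fr.T @ (1 - pr) / n - ff.T @ pf / n)
        # Generator ascent on the non-saturating loss E[log D(fake)],
        # chained through the affine map: d(logit)/dx = theta0 + theta1*y.
        pf, ff = discriminate(xf, yf)
        dx = (1 - pf) * (D["theta"][0] + D["theta"][1] * yf)
        for c in (0, 1):
            m = yf == c
            if m.any():
                G["w"][c] += lr * np.mean(dx[m] * z[m])
                G["b"][c] += lr * np.mean(dx[m])
```

The point of the sketch is the interface, not the toy dynamics: the generator receives the label as an input and is only asked to fool a label-aware discriminator, so neither P(Y) nor P(Z) has to be written down as a fixed parametric prior the way a VAE's ELBO requires.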