Title: Label noise learning with the combination of CausalNL and CGAN models
Authors: Zixing Gou, Yifan Sun, Zhebin Jin, Hanqiu Hu, Weiyi Xia
DOI: 10.54254/2755-2721/79/20241399
Journal: Applied and Computational Engineering, vol. 48, no. 5
Published: 2024-07-25
Citations: 0
Abstract
Because deep neural networks easily overfit label errors, which degrades the performance of deep learning algorithms, recent research has produced many methods for this problem. A recent model, CausalNL, uses a structural causal model for instance-dependent label-noise learning and has obtained excellent experimental results. Its implementation is based on a VAE, which infers the latent variables Y and Z from the observed instance X and the noisy label, and in turn generates the noise transition matrix. However, it relies on some restrictive assumptions. In this paper, we introduce a CGAN into the CausalNL model, which avoids fixing P(Y) and P(Z) to specific distributions: a GAN can model data without assuming a particular distributional form. The resulting model, ICC (Introducing CGAN into CausalNL), was validated on several standard benchmark datasets and compared against a variety of established algorithms, including CausalNL. ICC shows excellent training ability on most datasets and, notably, achieves consistently higher accuracy than CausalNL on CIFAR-10.
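As background for the noise transition matrix the abstract refers to, here is a minimal sketch (not the paper's implementation) of how a class-conditional transition matrix T, with T[i, j] = P(noisy label = j | clean label = i), turns clean labels into noisy ones. The symmetric-noise construction and the 40% noise rate below are illustrative assumptions; CausalNL-style methods aim to recover such a matrix (or an instance-dependent T(x)) from noisy data alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_symmetric_T(num_classes: int, noise_rate: float) -> np.ndarray:
    """Symmetric noise: a label flips to each other class with equal probability."""
    off_diag = noise_rate / (num_classes - 1)
    T = np.full((num_classes, num_classes), off_diag)
    np.fill_diagonal(T, 1.0 - noise_rate)  # each row sums to 1
    return T

def corrupt(labels: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Sample a noisy label for each clean label by drawing from its row of T."""
    return np.array([rng.choice(len(T), p=T[y]) for y in labels])

# Hypothetical 10-class setting (e.g. CIFAR-10) with a 40% flip rate.
T = make_symmetric_T(num_classes=10, noise_rate=0.4)
clean = rng.integers(0, 10, size=5000)
noisy = corrupt(clean, T)
print("empirical flip rate:", np.mean(noisy != clean))
```

With enough samples, the empirical flip rate concentrates near the specified noise rate, which is the quantity label-noise learners must account for when training on the corrupted labels.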