{"title":"Subclass consistency regularization for learning with noisy labels based on contrastive learning","authors":"","doi":"10.1016/j.neucom.2024.128759","DOIUrl":null,"url":null,"abstract":"<div><div>A prominent effect of label noise on neural networks is the disruption of the consistency of predictions. While prior efforts primarily focused on predictions’ consistency at the individual instance level, they often fell short of fully harnessing the consistency across multiple instances. This paper introduces subclass consistency regularization (SCCR) to maximize the potential of this collective consistency of predictions. SCCR mitigates the impact of label noise on neural networks by imposing constraints on the consistency of predictions within each subclass. However, constructing high-quality subclasses poses a formidable challenge, which we formulate as a special clustering problem. To efficiently establish these subclasses, we incorporate a clustering-based contrastive learning framework. Additionally, we introduce the <span><math><mi>Q</mi></math></span>-enhancing algorithm to tailor the contrastive learning framework, ensuring alignment with subclass construction. We conducted comprehensive experiments using benchmark datasets and real datasets to evaluate the effectiveness of our proposed method under various scenarios with differing noise rates. The results unequivocally demonstrate the enhancement in classification accuracy, especially in challenging high-noise settings. Moreover, the refined contrastive learning framework significantly elevates the quality of subclasses even in the presence of noise. Furthermore, we delve into the compatibility of contrastive learning and learning with noisy labels, using the projection head as an illustrative example. This investigation sheds light on an aspect that has hitherto been overlooked in prior research efforts.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":null,"pages":null},"PeriodicalIF":5.5000,"publicationDate":"2024-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231224015303","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
A prominent effect of label noise on neural networks is the disruption of the consistency of predictions. While prior efforts primarily focused on the consistency of predictions at the individual instance level, they often fell short of fully harnessing the consistency across multiple instances. This paper introduces subclass consistency regularization (SCCR) to maximize the potential of this collective consistency of predictions. SCCR mitigates the impact of label noise on neural networks by imposing constraints on the consistency of predictions within each subclass. However, constructing high-quality subclasses poses a formidable challenge, which we formulate as a special clustering problem. To efficiently establish these subclasses, we incorporate a clustering-based contrastive learning framework. Additionally, we introduce the Q-enhancing algorithm to tailor the contrastive learning framework, ensuring alignment with subclass construction. We conducted comprehensive experiments on benchmark and real-world datasets to evaluate the effectiveness of the proposed method under various noise rates. The results demonstrate a clear improvement in classification accuracy, especially in challenging high-noise settings. Moreover, the refined contrastive learning framework significantly elevates the quality of subclasses even in the presence of noise. Furthermore, we investigate the compatibility of contrastive learning and learning with noisy labels, using the projection head as an illustrative example. This investigation sheds light on an aspect that has hitherto been overlooked in prior research.
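The abstract does not give SCCR's exact formulation, but the core idea (penalizing prediction inconsistency within each subclass) can be illustrated with a minimal PyTorch sketch. In the sketch below, the function name `subclass_consistency_loss`, the KL-divergence-to-subclass-mean penalty, and the `subclass_ids` input are all illustrative assumptions, not the authors' published loss; in the paper, subclass assignments would come from the clustering-based contrastive learning stage.

```python
import torch
import torch.nn.functional as F

def subclass_consistency_loss(logits: torch.Tensor,
                              subclass_ids: torch.Tensor) -> torch.Tensor:
    """Hypothetical subclass consistency regularizer (illustrative only).

    Penalizes the divergence of each instance's predictive distribution
    from the mean prediction of its subclass. The paper's actual SCCR
    term may differ.

    logits:       (N, C) raw network outputs for a mini-batch.
    subclass_ids: (N,)   subclass assignment per instance, e.g. produced
                         by a clustering-based contrastive learning stage.
    """
    probs = F.softmax(logits, dim=1)                      # (N, C)
    loss = logits.new_zeros(())
    sids = subclass_ids.unique()
    for sid in sids:
        mask = subclass_ids == sid
        if mask.sum() < 2:                                # consistency needs >= 2 members
            continue
        mean_p = probs[mask].mean(dim=0, keepdim=True)    # subclass-average prediction
        # KL(p_i || mean_p), averaged over the subclass members
        kl = (probs[mask] * (probs[mask].clamp_min(1e-8).log()
                             - mean_p.clamp_min(1e-8).log())).sum(dim=1)
        loss = loss + kl.mean()
    return loss / max(len(sids), 1)

# Hypothetical usage alongside a standard classification loss:
#   total = ce_loss + lambda_sccr * subclass_consistency_loss(logits, subclass_ids)
```

The design intuition matches the abstract: instances sharing a subclass should receive similar predictions, so pulling each prediction toward its subclass mean suppresses the prediction inconsistency that label noise induces.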
Journal introduction:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice, and applications are the essential topics covered.