{"title":"Sample selection for noisy partial label learning with interactive contrastive learning","authors":"Xiaotong Yu , Shiding Sun , Yingjie Tian","doi":"10.1016/j.patcog.2025.111681","DOIUrl":null,"url":null,"abstract":"<div><div>In the context of weakly supervised learning, partial label learning (PLL) addresses situations where each training instance is associated with a set of partial labels, with only one being accurate. However, in complex realworld tasks, the restrictive assumption may be invalid which means the ground-truth may be outside the candidate label set. In this work, we loose the constraints and address the noisy label problem for PLL. First, we introduce a selection strategy, which enables deep models to select clean samples via the loss values of flipped and original images. Besides, we progressively identify the true labels of the selected samples and ensemble two models to acquire the knowledge of unselected samples. To extract better feature representations, we introduce pseudo-labeled interactive contrastive learning to aggregate cross-network information of all samples. Experimental results verify that our approach surpasses baseline methods on noisy PLL task with different levels of label noise.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"166 ","pages":"Article 111681"},"PeriodicalIF":7.5000,"publicationDate":"2025-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320325003413","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
In the context of weakly supervised learning, partial label learning (PLL) addresses situations where each training instance is associated with a set of candidate labels, only one of which is correct. However, in complex real-world tasks, this restrictive assumption may not hold, meaning the ground-truth label may lie outside the candidate label set. In this work, we relax this constraint and address the noisy label problem for PLL. First, we introduce a selection strategy that enables deep models to select clean samples via the loss values of flipped and original images. In addition, we progressively identify the true labels of the selected samples and ensemble two models to acquire knowledge from the unselected samples. To extract better feature representations, we introduce pseudo-labeled interactive contrastive learning to aggregate cross-network information of all samples. Experimental results verify that our approach surpasses baseline methods on the noisy PLL task with different levels of label noise.
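To make the selection step concrete, below is a minimal PyTorch sketch of loss-based clean-sample selection, assuming a trained classifier `model`, a batch of `images`, a binary `candidate_mask` over the label space, and a keep ratio `rho`; the function name, the candidate-set surrogate loss, and the horizontal flip are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def select_clean_samples(model, images, candidate_mask, rho=0.5):
    """Hypothetical sketch: select low-loss samples as "clean".

    images:         (N, C, H, W) batch of training images
    candidate_mask: (N, K) binary mask, 1 for labels in the candidate set
    rho:            fraction of samples to keep as clean

    A sample's score combines the losses of the original image and its
    flipped view; samples with the smallest combined loss are kept.
    """
    model.eval()
    with torch.no_grad():
        logits_orig = model(images)
        logits_flip = model(torch.flip(images, dims=[3]))  # horizontal flip

        def candidate_loss(logits):
            # Negative log of the probability mass on the candidate labels,
            # a common surrogate loss in partial label learning (assumed here).
            probs = F.softmax(logits, dim=1)
            cand_prob = (probs * candidate_mask).sum(dim=1).clamp_min(1e-12)
            return -torch.log(cand_prob)

        loss = candidate_loss(logits_orig) + candidate_loss(logits_flip)

    # Keep the rho fraction of samples with the smallest combined loss.
    n_keep = max(1, int(rho * images.size(0)))
    clean_idx = torch.argsort(loss)[:n_keep]
    return clean_idx
```

In this sketch the selected indices would then feed the later stages described in the abstract (progressive true-label identification, two-model ensembling, and pseudo-labeled interactive contrastive learning), which are not shown here.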
Journal introduction:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.