Deep Intra-Class Similarity Measured Semi-Supervised Learning

Author: Yang Yang
DOI: 10.1109/ICSP54964.2022.9778705
Published in: 2022 7th International Conference on Intelligent Computing and Signal Processing (ICSP)
Publication date: 2022-04-15
Citations: 0
Abstract
Recently, how to handle datasets in which only a few samples are labeled has become a topic of intense academic interest. Semi-Supervised Learning (SSL) has shown great capacity and potential for this problem. However, existing methods tend to focus on the relationship between unlabeled and labeled samples, or on the information carried by individual unlabeled samples, while rarely exploring the hidden information shared among unlabeled data of the same category. To address this shortcoming, we use an intra-class similarity measure to exploit the information between unlabeled samples of the same class and, on this basis, introduce a new intra-class similarity loss term. In addition, to improve the accuracy of pseudo-labels in deep semi-supervised learning, we propose an adaptive expansion of the Label Propagation algorithm. The proposed method outperforms many state-of-the-art results on CIFAR-10, CIFAR-100, and Mini-ImageNet. The experimental results show that adding the intra-class similarity loss term and the adaptive-expansion improvement to a deep semi-supervised learning model effectively improves its performance.
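The paper does not spell out its exact loss formulation here, but a common way to realize an intra-class similarity term is to penalize low cosine similarity between feature embeddings that share the same (pseudo-)label. The sketch below is a minimal, hypothetical NumPy illustration of that idea, not the authors' actual loss; the function name and the choice of "mean pairwise (1 - cosine similarity)" are assumptions.

```python
import numpy as np

def intra_class_similarity_loss(features, labels):
    """Mean (1 - cosine similarity) over all pairs of samples that share
    the same (pseudo-)label. Lower values mean tighter, more coherent
    classes in feature space. A hypothetical sketch, not the paper's loss."""
    # L2-normalize each feature vector so dot products are cosine similarities
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    total, pairs = 0.0, 0
    for c in np.unique(labels):
        f = feats[labels == c]
        if len(f) < 2:
            continue  # a class with one sample contributes no pairs
        sim = f @ f.T                       # pairwise cosine similarities
        iu = np.triu_indices(len(f), k=1)   # upper triangle, excluding diagonal
        total += (1.0 - sim[iu]).sum()
        pairs += len(iu[0])
    return total / max(pairs, 1)
```

When same-class embeddings coincide the loss is 0, and it grows as same-class embeddings spread apart; in a deep SSL model, such a term would be added to the usual supervised and consistency losses with a weighting coefficient.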
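For context on the pseudo-labeling side, the classical Label Propagation scheme that the paper adaptively expands can be sketched as follows. This is the standard graph-based iteration (RBF affinities, symmetric normalization, F = αSF + (1-α)Y), shown only as background; the paper's adaptive expansion is not specified in this abstract, and all parameter names here are illustrative.

```python
import numpy as np

def label_propagation(X, y, n_classes, sigma=1.0, alpha=0.99, n_iter=100):
    """Baseline graph label propagation; y uses -1 for unlabeled samples.
    Illustrative only -- not the paper's adaptive expansion."""
    # RBF affinity matrix with zeroed diagonal
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalization S = D^{-1/2} W D^{-1/2}
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(W.sum(1), 1e-12))
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # One-hot label matrix; rows of unlabeled samples stay zero
    Y = np.zeros((len(X), n_classes))
    mask = y >= 0
    Y[mask, y[mask]] = 1.0
    # Iterate F = alpha * S @ F + (1 - alpha) * Y until (near) convergence
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1 - alpha) * Y
    return F.argmax(1)  # propagated pseudo-labels
```

The pseudo-labels produced this way are only as reliable as the affinity graph, which is presumably the motivation for adaptively expanding the propagation in the proposed method.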