Deep Intra-Class Similarity Measured Semi-Supervised Learning

Yang Yang
{"title":"Deep Intra-Class Similarity Measured Semi-Supervised Learning","authors":"Yang Yang","doi":"10.1109/ICSP54964.2022.9778705","DOIUrl":null,"url":null,"abstract":"Recently, how to handle the situation where only a few samples in the dataset are labeled has become a hot academic topic. Semi-Supervised Learning (SSL) has shown its great capacity and potential in this topic. However, existing methods tend to focus more on the relationship between unlabeled samples and labeled samples or focus on unlabeled samples' information while rarely exploring the hidden information between unlabeled data of the same category. To address this shortcoming, we use an intra-class similarity measure to exploit the information between unlabeled samples of the same class and, on this basis, introduce a new intra-class similarity loss term. In addition, to improve the accuracy of pseudo-labels in deep semi-supervised learning, we also propose an adaptive expansion of the Label Propagation algorithm. The proposed method outperforms many state-of-the-art results in CIFAR-10, CIFAR-100 and Mini-ImageNet. The experimental results show that adding the intra-class similarity loss term and the adaptive extension improvement to the deep semi-supervised learning model can effectively improve the model's performance.","PeriodicalId":363766,"journal":{"name":"2022 7th International Conference on Intelligent Computing and Signal Processing (ICSP)","volume":"27 5","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-04-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 7th International Conference on Intelligent Computing and Signal Processing (ICSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSP54964.2022.9778705","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

How to learn when only a small fraction of a dataset is labeled has recently become an active research topic, and Semi-Supervised Learning (SSL) has shown great capacity and potential for it. However, existing methods tend to focus on the relationship between unlabeled and labeled samples, or on the information in individual unlabeled samples, while rarely exploring the hidden information shared among unlabeled samples of the same category. To address this shortcoming, we use an intra-class similarity measure to exploit the information between unlabeled samples of the same class and, on this basis, introduce a new intra-class similarity loss term. In addition, to improve the accuracy of pseudo-labels in deep semi-supervised learning, we propose an adaptive expansion of the Label Propagation algorithm. The proposed method outperforms many state-of-the-art results on CIFAR-10, CIFAR-100, and Mini-ImageNet. The experimental results show that adding the intra-class similarity loss term and the adaptive-expansion improvement to a deep semi-supervised learning model effectively improves its performance.
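The abstract does not spell out the loss formulation, but the general idea of an intra-class similarity loss over unlabeled samples can be illustrated with a minimal PyTorch sketch. Everything below (the function name, cosine similarity as the measure, pseudo-labels as the class assignment) is an assumption for illustration, not the paper's exact method:

```python
# A minimal sketch (not the paper's exact formulation) of an intra-class
# similarity loss: for unlabeled samples that share the same pseudo-label,
# pull their embeddings together under a cosine-similarity measure.
import torch
import torch.nn.functional as F

def intra_class_similarity_loss(embeddings: torch.Tensor,
                                pseudo_labels: torch.Tensor) -> torch.Tensor:
    """embeddings: (N, D) features of unlabeled samples.
    pseudo_labels: (N,) predicted class indices for those samples."""
    z = F.normalize(embeddings, dim=1)                # unit-norm features
    sim = z @ z.t()                                   # (N, N) cosine similarities
    same = pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)
    same.fill_diagonal_(False)                        # ignore self-pairs
    if same.sum() == 0:                               # no intra-class pairs in batch
        return embeddings.new_zeros(())
    # Maximizing similarity between same-class pairs = minimizing (1 - sim).
    return (1.0 - sim[same]).mean()
```

In a full SSL objective this term would be weighted and added to the usual supervised cross-entropy and consistency losses; the weighting scheme is not given in the abstract.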
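Likewise, the "adaptive expansion of the Label Propagation algorithm" is only named in the abstract. Below is a minimal sketch of standard graph-based label propagation (in the style of Zhou et al.) combined with a hypothetical adaptive confidence rule for accepting pseudo-labels; the quantile-based threshold is an assumption for illustration, not the paper's mechanism:

```python
# A minimal sketch of graph-based label propagation plus a hypothetical
# adaptive acceptance rule for pseudo-labels.
import numpy as np

def propagate_and_select(W: np.ndarray, Y: np.ndarray,
                         alpha: float = 0.99, n_iter: int = 20,
                         quantile: float = 0.7):
    """W: (N, N) affinity matrix over labeled + unlabeled samples.
    Y: (N, C) one-hot rows for labeled samples, zero rows for unlabeled."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt                  # symmetrically normalized affinity
    F_mat = Y.copy()
    for _ in range(n_iter):                          # F <- alpha*S*F + (1-alpha)*Y
        F_mat = alpha * (S @ F_mat) + (1.0 - alpha) * Y
    probs = F_mat / np.maximum(F_mat.sum(axis=1, keepdims=True), 1e-12)
    conf = probs.max(axis=1)
    pseudo = probs.argmax(axis=1)
    # "Adaptive" part (assumption): keep only pseudo-labels whose confidence
    # exceeds a per-round quantile, so the accepted set expands as the
    # propagated scores become more confident over training.
    thresh = np.quantile(conf, quantile)
    keep = conf >= thresh
    return pseudo, keep
```

The accepted pseudo-labels would then feed back into training (and into the intra-class similarity loss above), with rejected samples left unlabeled for the next round.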