SFME: Score Fusion from Multiple Experts for Long-tailed Recognition

Lingyun Wang, Yin Liu, Yunshen Zhou
{"title":"SFME:多专家评分融合用于长尾识别","authors":"Lingyun Wang, Yin Liu, Yunshen Zhou","doi":"10.1109/ICNSC55942.2022.10004049","DOIUrl":null,"url":null,"abstract":"In real-world scenarios, datasets often perform a long-tailed distribution, making it difficult to train neural net-work models that achieve high accuracy across all classes. In this paper, we explore self-supervised learning for the purpose of learning generalized features and propose a score fusion module to integrate outputs from multiple expert models to obtain a unified prediction. Specifically, we take inspiration from the observation that networks trained on a less unbalanced subset of the distribution tend to produce better performance than networks trained on the entire dataset. However, subsets from tail classes are not adequately represented due to the limitation of data size, which means that their performance is actually unsatisfactory. Therefore, we employ self-supervised learning (SSL) on the whole dataset to obtain a more generalized and transferable feature representation, resulting in a sufficient improvement in subset performance. Unlike previous work that used knowledge distillation models to distill the models trained on a subset to get a unified student model, we propose a score fusion module that directly exploits and integrates the predictions of the subset models. We do extensive experiments on several long-tailed recognition benchmarks to demonstrate the effectiveness of our pronosed model.","PeriodicalId":230499,"journal":{"name":"2022 IEEE International Conference on Networking, Sensing and Control (ICNSC)","volume":"309 2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SFME: Score Fusion from Multiple Experts for Long-tailed Recognition\",\"authors\":\"Lingyun Wang, Yin Liu, Yunshen Zhou\",\"doi\":\"10.1109/ICNSC55942.2022.10004049\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In real-world scenarios, datasets often perform a long-tailed distribution, making it difficult to train neural net-work models that achieve high accuracy across all classes. In this paper, we explore self-supervised learning for the purpose of learning generalized features and propose a score fusion module to integrate outputs from multiple expert models to obtain a unified prediction. Specifically, we take inspiration from the observation that networks trained on a less unbalanced subset of the distribution tend to produce better performance than networks trained on the entire dataset. However, subsets from tail classes are not adequately represented due to the limitation of data size, which means that their performance is actually unsatisfactory. Therefore, we employ self-supervised learning (SSL) on the whole dataset to obtain a more generalized and transferable feature representation, resulting in a sufficient improvement in subset performance. Unlike previous work that used knowledge distillation models to distill the models trained on a subset to get a unified student model, we propose a score fusion module that directly exploits and integrates the predictions of the subset models. 
We do extensive experiments on several long-tailed recognition benchmarks to demonstrate the effectiveness of our pronosed model.\",\"PeriodicalId\":230499,\"journal\":{\"name\":\"2022 IEEE International Conference on Networking, Sensing and Control (ICNSC)\",\"volume\":\"309 2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE International Conference on Networking, Sensing and Control (ICNSC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICNSC55942.2022.10004049\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Networking, Sensing and Control (ICNSC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICNSC55942.2022.10004049","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In real-world scenarios, datasets often follow a long-tailed distribution, making it difficult to train neural network models that achieve high accuracy across all classes. In this paper, we explore self-supervised learning for the purpose of learning generalized features, and propose a score fusion module that integrates the outputs of multiple expert models into a unified prediction. Specifically, we take inspiration from the observation that networks trained on a less imbalanced subset of the distribution tend to perform better than networks trained on the entire dataset. However, subsets drawn from tail classes are not adequately represented due to their limited data size, so their performance is unsatisfactory. Therefore, we employ self-supervised learning (SSL) on the whole dataset to obtain a more generalized and transferable feature representation, which substantially improves subset performance. Unlike previous work that uses knowledge distillation to distill models trained on subsets into a unified student model, we propose a score fusion module that directly exploits and integrates the predictions of the subset models. We conduct extensive experiments on several long-tailed recognition benchmarks to demonstrate the effectiveness of our proposed model.
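The abstract leaves the fusion mechanism unspecified, so the following is only a minimal PyTorch sketch of the general idea: several experts, each trained on a less imbalanced subset of the classes, produce per-class scores that are combined into one prediction. All names here (`ScoreFusion`, `expert_logits`, `class_masks`) and the learnable per-expert weighting are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScoreFusion(nn.Module):
    """Fuse per-expert class scores into a single prediction.

    Each expert is assumed to have been trained on a subset of the
    classes; a boolean mask marks which classes it has actually seen.
    """

    def __init__(self, num_experts: int, num_classes: int):
        super().__init__()
        # Assumption: one learnable fusion weight per expert.
        self.weights = nn.Parameter(torch.ones(num_experts))
        self.num_classes = num_classes

    def forward(self, expert_logits, class_masks):
        # expert_logits: list of (B, num_classes) logit tensors, one per expert
        # class_masks:   list of (num_classes,) boolean tensors
        w = F.softmax(self.weights, dim=0)          # normalized expert weights
        fused = torch.zeros_like(expert_logits[0])  # (B, num_classes)
        coverage = torch.zeros(self.num_classes, device=fused.device)
        for k, (logits, mask) in enumerate(zip(expert_logits, class_masks)):
            scores = F.softmax(logits, dim=-1) * mask  # zero out unseen classes
            fused = fused + w[k] * scores
            coverage = coverage + w[k] * mask
        # Normalize by per-class coverage so classes seen by fewer experts
        # (typically tail classes) are not systematically down-weighted.
        return fused / coverage.clamp(min=1e-8)

# Hypothetical usage: three experts over 10 classes, trained on
# progressively smaller head-to-tail subsets.
B, C = 4, 10
expert_logits = [torch.randn(B, C) for _ in range(3)]
class_masks = [torch.arange(C) < n for n in (10, 6, 3)]
fusion = ScoreFusion(num_experts=3, num_classes=C)
scores = fusion(expert_logits, class_masks)
predictions = scores.argmax(dim=-1)  # unified prediction over all classes
```

The per-class normalization above is one way to keep an expert's silence on classes outside its subset from dragging those scores down; the paper may well use a different weighting or combine raw logits instead.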