Addressing Imbalance in Multi-Label Classification Using Weighted Cross Entropy Loss Function

M. Rezaei-Dastjerdehei, A. Mijani, E. Fatemizadeh
{"title":"利用加权交叉熵损失函数解决多标签分类中的不平衡问题","authors":"M. Rezaei-Dastjerdehei, A. Mijani, E. Fatemizadeh","doi":"10.1109/ICBME51989.2020.9319440","DOIUrl":null,"url":null,"abstract":"Training a model and network on an imbalanced dataset always has been a challenging problem in the machine learning field that has been discussed by researchers. In fact, available machine learning algorithms are designed moderately imbalanced datasets and mainly do not consider the dataset's imbalanced problem. In the machine learning algorithm, the imbalance problem appears when the number of one class samples are significantly minor than another class. In order to solve the imbalance problem of a dataset, multiple algorithms are proposed in the field of machine learning and especially in deep learning. In this study, we have benefited from weighted binary cross-entropy in the learning process as a loss function instead of ordinary cross-entropy (binary cross-entropy). This model allocates more penalty to minority class samples during the learning process, and it makes that minority class samples are detected more accurately. Finally, we could improve Recall with preserving Accuracy. In fact, results show that using weighted binary cross-entropy recall increases about 10%, and precision does not decrease more than 3% in comparison to binary cross-entropy.","PeriodicalId":120969,"journal":{"name":"2020 27th National and 5th International Iranian Conference on Biomedical Engineering (ICBME)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Addressing Imbalance in Multi-Label Classification Using Weighted Cross Entropy Loss Function\",\"authors\":\"M. Rezaei-Dastjerdehei, A. Mijani, E. Fatemizadeh\",\"doi\":\"10.1109/ICBME51989.2020.9319440\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Training a model and network on an imbalanced dataset always has been a challenging problem in the machine learning field that has been discussed by researchers. In fact, available machine learning algorithms are designed moderately imbalanced datasets and mainly do not consider the dataset's imbalanced problem. In the machine learning algorithm, the imbalance problem appears when the number of one class samples are significantly minor than another class. In order to solve the imbalance problem of a dataset, multiple algorithms are proposed in the field of machine learning and especially in deep learning. In this study, we have benefited from weighted binary cross-entropy in the learning process as a loss function instead of ordinary cross-entropy (binary cross-entropy). This model allocates more penalty to minority class samples during the learning process, and it makes that minority class samples are detected more accurately. Finally, we could improve Recall with preserving Accuracy. 
In fact, results show that using weighted binary cross-entropy recall increases about 10%, and precision does not decrease more than 3% in comparison to binary cross-entropy.\",\"PeriodicalId\":120969,\"journal\":{\"name\":\"2020 27th National and 5th International Iranian Conference on Biomedical Engineering (ICBME)\",\"volume\":\"34 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-11-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 27th National and 5th International Iranian Conference on Biomedical Engineering (ICBME)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICBME51989.2020.9319440\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 27th National and 5th International Iranian Conference on Biomedical Engineering (ICBME)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICBME51989.2020.9319440","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10

Abstract

Training a model on an imbalanced dataset has long been a challenging problem in machine learning and is widely discussed by researchers. Most available machine learning algorithms are designed for at most moderately imbalanced datasets and largely do not consider the imbalance problem, which arises when the number of samples in one class is significantly smaller than in another. To address dataset imbalance, multiple algorithms have been proposed in machine learning and especially in deep learning. In this study, we use weighted binary cross-entropy as the loss function during training instead of ordinary (binary) cross-entropy. This loss assigns a larger penalty to errors on minority-class samples during learning, so that minority-class samples are detected more accurately. As a result, recall improves while accuracy is preserved: compared with binary cross-entropy, weighted binary cross-entropy increases recall by about 10%, while precision decreases by no more than 3%.
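As a minimal sketch of the idea described in the abstract (not the authors' exact implementation), the example below applies a weighted binary cross-entropy to a multi-label batch using PyTorch's BCEWithLogitsLoss with its pos_weight argument. The batch values and the frequency-based weighting scheme are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical multi-label mini-batch: 4 samples, 3 labels each (illustrative values).
logits = torch.tensor([[ 2.0, -1.0,  0.5],
                       [-0.5,  3.0, -2.0],
                       [ 1.5, -0.5,  0.0],
                       [-1.0, -2.0,  1.0]])
targets = torch.tensor([[1., 0., 1.],
                        [0., 1., 0.],
                        [1., 0., 0.],
                        [0., 0., 1.]])

# One common weighting choice (an assumption, not necessarily the paper's):
# scale each label's positive term by (#negatives / #positives), so rare
# positive labels incur a larger penalty when they are missed.
pos_counts = targets.sum(dim=0)
neg_counts = targets.shape[0] - pos_counts
pos_weight = neg_counts / pos_counts.clamp(min=1.0)

weighted_bce = nn.BCEWithLogitsLoss(pos_weight=pos_weight)  # weighted binary cross-entropy
plain_bce = nn.BCEWithLogitsLoss()                          # ordinary binary cross-entropy

print("weighted BCE:", weighted_bce(logits, targets).item())
print("plain BCE:   ", plain_bce(logits, targets).item())
```

In practice the per-label weights would be computed once over the whole training set rather than per batch; the key effect is that false negatives on minority labels contribute more to the loss, which is what raises recall at a small cost in precision.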