Analysis of The Change of Softmax Value in The Training Process of Neural Network

Yuyang Chen
{"title":"Analysis of The Change of Softmax Value in The Training Process of Neural Network","authors":"Yuyang Chen","doi":"10.1145/3573428.3573763","DOIUrl":null,"url":null,"abstract":"Classification is an essential field in deep learning. Generally, the category corresponding to the maximum value of softmax is mainly used as the prediction result and the softmax value as the prediction probability. However, whether softmax can indeed serve as a prediction probability needs further confirmation. This paper first focuses on the classification of paintings through Convolutional Neural Network. To deal with the imbalanced dataset problem, only those with more than 200 paintings are selected. Besides, class weight is also taken into consideration. Next, data augmentation is applied to enlarge the dataset and add more relevant data. For the modeling and training part, transfer learning is employed to avoid training from scratch on a new dataset and reduce the cost of later training. Techniques such as ‘EarlyStopping’ and ‘ReduceLROnPlateau’ are also used to avoid overfitting. The final prediction accuracy can achieve 99 percent on the training and 87 percent on the validation sets. Furthermore, the paper studies the change of softmax distribution during the training process and the relationship between the average maximum value of softmax and the classification performance of classes. The experiments show that the maximum value of softmax will gradually shift to the corresponding correct label during the training process. Still, there is no correlation between the classification performance and the average maximum value of softmax. 
Therefore, softmax cannot be used as a probability value for classification.","PeriodicalId":314698,"journal":{"name":"Proceedings of the 2022 6th International Conference on Electronic Information Technology and Computer Engineering","volume":"68 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 6th International Conference on Electronic Information Technology and Computer Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3573428.3573763","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Classification is an essential task in deep learning. Conventionally, the category corresponding to the maximum softmax value is taken as the prediction result, and that softmax value is read as the prediction probability. Whether softmax can genuinely serve as a prediction probability, however, needs further confirmation. This paper first addresses the classification of paintings with a convolutional neural network. To deal with the imbalanced dataset, only classes with more than 200 paintings are kept, and class weights are also applied during training. Next, data augmentation is used to enlarge the dataset with additional relevant samples. For modeling and training, transfer learning avoids training from scratch on the new dataset and reduces the cost of later training; callbacks such as 'EarlyStopping' and 'ReduceLROnPlateau' are also used to avoid overfitting. The final prediction accuracy reaches 99% on the training set and 87% on the validation set. Furthermore, the paper studies how the softmax distribution changes during training and the relationship between the average maximum softmax value of a class and that class's classification performance. The experiments show that the maximum softmax value gradually shifts toward the correct label during training. Still, there is no correlation between classification performance and the average maximum softmax value. Therefore, the softmax value cannot be used as a probability for classification.
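The prediction rule described in the abstract, taking the argmax of the softmax output as the predicted label and the maximum value as a "probability", can be sketched in plain NumPy. The logits below are hypothetical illustration values, not figures from the paper:

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    shifted = logits - np.max(logits, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)

# Hypothetical logits for one painting over four candidate classes.
logits = np.array([2.0, 0.5, -1.0, 0.1])
probs = softmax(logits)

predicted_class = int(np.argmax(probs))  # category of the maximum softmax value
confidence = float(np.max(probs))        # the value commonly read as a "probability"

print(predicted_class, round(confidence, 3))
```

The paper's finding is precisely that `confidence`, although it lies in [0, 1] and sums to 1 across classes, does not track the model's actual per-class accuracy.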
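The abstract mentions class weights as one remedy for the imbalanced dataset. The paper does not give its weighting formula, so the sketch below assumes the common inverse-frequency ("balanced") scheme, in which each class weight is n_samples / (n_classes * count_per_class):

```python
import numpy as np

def balanced_class_weights(labels):
    """Inverse-frequency class weights: n_samples / (n_classes * count_per_class)."""
    classes, counts = np.unique(labels, return_counts=True)
    n_samples = labels.size
    weights = n_samples / (classes.size * counts)
    return dict(zip(classes.tolist(), weights.tolist()))

# Hypothetical labels: class 0 has 6 paintings, class 1 has only 2.
labels = np.array([0, 0, 0, 0, 0, 0, 1, 1])
weights = balanced_class_weights(labels)
print(weights)  # the rarer class receives the larger weight
```

A dictionary of this shape is what training APIs such as Keras's `class_weight` argument expect, so the rarer class contributes more to the loss.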