Improvement on Mechanics Attention Deep Learning model for Classification Ear-tag of Swine

Thinh Pham-Duc, M. Ullah, T. Le-Tien, M. Luong, F. A. Cheikh, Øyvind Nordbø
DOI: 10.1109/NICS56915.2022.10013403
Published in: 2022 9th NAFOSTED Conference on Information and Computer Science (NICS), 2022-10-31
Citations: 0

Abstract

Classification is a common task through which deep neural networks approximate human vision. In this paper, we investigate an enhanced attention mechanism to improve model accuracy and apply the focal loss to handle data imbalance in ear-tag classification. Briefly, combining spatial-channel attention with current state-of-the-art Convolutional Neural Networks (CNNs) such as ResNet, DenseNet, and EfficientNet improves model efficiency on the provided dataset. Moreover, data augmentations, namely rotation, shear, Gaussian noise, cropping, and a set of combined augmentations, are applied during the training phase, in which the focal loss is used instead of the traditional cross-entropy (CE) loss to mitigate data imbalance. The research data presented in this paper was collected at a Norwegian farm and manually annotated. An ablation study covering the augmentations, backbone models, and attention mechanism demonstrates the importance of each module in the classification. A detailed analysis of the models and their hyperparameters provides evidence of a significant improvement in performance.
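The abstract's substitution of focal loss for cross-entropy can be illustrated with the standard formulation FL(p_t) = -α_t (1 - p_t)^γ log(p_t), which down-weights well-classified examples so the loss concentrates on hard, minority-class samples. A minimal NumPy sketch follows; the default values γ = 2 and α = 0.25 are the common ones from the focal-loss literature, not necessarily the hyperparameters used in this paper, and the binary formulation shown here is only a simplified stand-in for the multi-class ear-tag setting.

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t).

    p: predicted probability of the positive class, shape (N,)
    y: ground-truth labels in {0, 1}, shape (N,)
    gamma: focusing parameter; gamma = 0 recovers (alpha-weighted) cross-entropy
    alpha: class-balancing weight for the positive class
    """
    p = np.asarray(p, dtype=float)
    y = np.asarray(y)
    # p_t is the probability assigned to the true class
    p_t = np.where(y == 1, p, 1.0 - p)
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)
    # The (1 - p_t)^gamma factor shrinks the loss on easy examples
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)
```

With γ = 0 and α = 1 the expression reduces to plain cross-entropy, which is a convenient sanity check; increasing γ shrinks the contribution of confident predictions, which is the intended remedy for imbalance described above.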