Eff-ReLU-Net: a deep learning framework for multiclass wound classification.

IF 3.2 | Q2 RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING | Medicine (CAS Tier 3)
Sifat Ullah, Ali Javed, Muteb Aljasem, Abdul Khader Jilani Saudagar
{"title":"Eff-ReLU-Net: a deep learning framework for multiclass wound classification.","authors":"Sifat Ullah, Ali Javed, Muteb Aljasem, Abdul Khader Jilani Saudagar","doi":"10.1186/s12880-025-01785-z","DOIUrl":null,"url":null,"abstract":"<p><p>Chronic wounds have emerged as a significant medical challenge due to their adverse effects, including infections leading to amputations. Over the past few years, the prevalence of chronic wounds has grown, thus posing significant health hazards. It is now becoming necessary to automate the wound assessment mechanism to limit the dependence of healthcare practitioners on manual methods. Therefore, a need exists for developing an effective wound classifier that enables practitioners to classify wounds quickly and reliably. This work proposed Eff-ReLU-Net, an improved EfficientNet-B0-based deep learning model for accurately identifying multiple categories of wounds. More precisely, we introduced the ReLU activation function over the Swish in our Eff-ReLU-Net because of its simplicity, reliability, and efficiency. Additionally, we introduced three fully connected dense layers at the end to reliably capture more distinct features, leading to improved multi-class wound classification. We also employed augmentation approaches such as fixed-angle rotations at 90°, 180°, and 270°, rotational invariance, random rotation, and translation to improve data diversity and samples for better model generalization and combating overfitting. The proposed model's effectiveness is assessed utilizing the publicly available AZH and Medetec wound datasets. We also conducted the cross-corpora evaluation to show the generalizability of our method. The proposed model achieved an accuracy, precision, recall, and F1-score of 92.33%, 97.66%, 95.33%, and 96.48% on Medetec, respectively. However, for the AZH dataset, the attained accuracy, precision, recall, and F1-score are 90%, 89.45%, 92,19%, and 90.84%, respectively. These results validate the effectiveness of our proposed Eff-ReLU-Net method for classifying chronic wounds.</p>","PeriodicalId":9020,"journal":{"name":"BMC Medical Imaging","volume":"25 1","pages":"257"},"PeriodicalIF":3.2000,"publicationDate":"2025-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12220098/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"BMC Medical Imaging","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1186/s12880-025-01785-z","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING","Score":null,"Total":0}
Citations: 0

Abstract

Chronic wounds have emerged as a significant medical challenge due to their adverse effects, including infections that can lead to amputation. Over the past few years, the prevalence of chronic wounds has grown, posing significant health hazards. It is becoming necessary to automate wound assessment to reduce healthcare practitioners' dependence on manual methods. Therefore, there is a need for an effective wound classifier that enables practitioners to classify wounds quickly and reliably. This work proposes Eff-ReLU-Net, an improved EfficientNet-B0-based deep learning model for accurately identifying multiple categories of wounds. More precisely, we adopted the ReLU activation function in place of Swish in Eff-ReLU-Net because of its simplicity, reliability, and efficiency. Additionally, we appended three fully connected dense layers at the end to reliably capture more distinctive features, leading to improved multi-class wound classification. We also employed augmentation approaches such as fixed-angle rotations at 90°, 180°, and 270° (for rotational invariance), random rotation, and translation to increase data diversity and sample count, improving model generalization and combating overfitting. The proposed model's effectiveness is assessed using the publicly available AZH and Medetec wound datasets. We also conducted a cross-corpus evaluation to show the generalizability of our method. The proposed model achieved an accuracy, precision, recall, and F1-score of 92.33%, 97.66%, 95.33%, and 96.48% on Medetec, respectively. For the AZH dataset, the attained accuracy, precision, recall, and F1-score are 90%, 89.45%, 92.19%, and 90.84%, respectively. These results validate the effectiveness of our proposed Eff-ReLU-Net method for classifying chronic wounds.
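The abstract describes the two architectural changes (Swish replaced by ReLU, three dense layers appended to EfficientNet-B0) and the rotation/translation augmentation only at a high level. Below is a minimal PyTorch sketch of that idea, not the authors' code: the layer widths, dropout rates, four-class output, and augmentation parameters are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of the Eff-ReLU-Net idea:
# an EfficientNet-B0 backbone whose Swish (SiLU) activations are swapped for ReLU,
# followed by three fully connected layers. Widths, dropout, and num_classes are
# illustrative assumptions.
import torch.nn as nn
from torchvision import models, transforms


def swap_silu_for_relu(module: nn.Module) -> None:
    """Recursively replace every SiLU (Swish) activation with ReLU."""
    for name, child in module.named_children():
        if isinstance(child, nn.SiLU):
            setattr(module, name, nn.ReLU(inplace=True))
        else:
            swap_silu_for_relu(child)


def build_eff_relu_net(num_classes: int = 4) -> nn.Module:
    backbone = models.efficientnet_b0(weights=models.EfficientNet_B0_Weights.DEFAULT)
    swap_silu_for_relu(backbone)                      # ReLU in place of Swish
    in_features = backbone.classifier[1].in_features  # 1280 for EfficientNet-B0
    backbone.classifier = nn.Sequential(              # three dense layers at the end
        nn.Linear(in_features, 512), nn.ReLU(inplace=True), nn.Dropout(0.3),
        nn.Linear(512, 256), nn.ReLU(inplace=True), nn.Dropout(0.3),
        nn.Linear(256, num_classes),
    )
    return backbone


# Augmentation loosely following the abstract: fixed-angle rotations at 90°, 180°,
# and 270°, plus a random rotation and translation; the exact ranges are assumptions.
train_transforms = transforms.Compose([
    transforms.RandomChoice([
        transforms.RandomRotation((0, 0)),
        transforms.RandomRotation((90, 90)),
        transforms.RandomRotation((180, 180)),
        transforms.RandomRotation((270, 270)),
    ]),
    transforms.RandomAffine(degrees=15, translate=(0.1, 0.1)),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
```

The ReLU swap is done by in-place module replacement, so the pretrained convolutional weights are kept and only the nonlinearity changes, which is consistent with the abstract's stated motivation of simplicity and efficiency.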

Source journal: BMC Medical Imaging (Radiology, Nuclear Medicine & Medical Imaging)
CiteScore: 4.60
Self-citation rate: 3.70%
Articles per year: 198
Review time: 27 weeks
Journal description: BMC Medical Imaging is an open access journal publishing original peer-reviewed research articles in the development, evaluation, and use of imaging techniques and image processing tools to diagnose and manage disease.