PolyNet: A self-attention based CNN model for classifying the colon polyp from colonoscopy image

Q1 Medicine
Khaled Eabne Delowar, Mohammed Borhan Uddin, Md Khaliluzzaman, Riadul Islam Rabbi, Md Jakir Hossen, M. Moazzam Hossen
{"title":"PolyNet: A self-attention based CNN model for classifying the colon polyp from colonoscopy image","authors":"Khaled Eabne Delowar ,&nbsp;Mohammed Borhan Uddin ,&nbsp;Md Khaliluzzaman ,&nbsp;Riadul Islam Rabbi ,&nbsp;Md Jakir Hossen ,&nbsp;M. Moazzam Hossen","doi":"10.1016/j.imu.2025.101654","DOIUrl":null,"url":null,"abstract":"<div><div>Colon polyps are small, precancerous growths in the colon that can indicate colorectal cancer (CRC), a disease that has a significant impact on public health. A colonoscopy is a medical procedure that helps detect colon polyps. However, the manual examination for identifying the type of polyps can be time-consuming, tedious, and prone to human error. Automatic classification of polyps through colonoscopy images can be more efficient. However, there are currently no specialized methods for the classification of polyps from colonoscopy; however, several state-of-the-art CNN models can classify polyps. We are introducing a new CNN-based model called PolyNet, a model that shows the best accuracy of the polyps classification from the multiple models and which also performs better than pre-trained models such as VGG16, ResNet50, DenseNetV3, MobileNetV3, and InceptionV3, as well as nine other customized CNN-based models for classification. This study provides a sensitivity analysis to demonstrate how slight modifications in the network's architecture can impact the balance between accuracy and performance. We examined different CNN architectures and developed a good convolutional neural network (CNN) model for correctly predicting colon polyps using the Kvasir dataset. The self-attention mechanism is incorporated in the best CNN model, i.e., PolypNet, to ensure better accuracy. To compare, DenseNetV3, MobileNet-V3, Inception-V3, VGG16, and ResNet50 get 73.87 %, 69.38 %, 61.12 %, 84.00 %, and 86.12 % of accuracy on the Kvasir dataset, while PolypNet with attention archives 86 % accuracy, 86 % precision, 85 % recall, and an 86 % F1-score.</div></div>","PeriodicalId":13953,"journal":{"name":"Informatics in Medicine Unlocked","volume":"56 ","pages":"Article 101654"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Informatics in Medicine Unlocked","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2352914825000425","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Medicine","Score":null,"Total":0}
Citations: 0

Abstract

Colon polyps are small, precancerous growths in the colon that can indicate colorectal cancer (CRC), a disease with a significant impact on public health. Colonoscopy is a medical procedure that helps detect colon polyps, but manual examination to identify the type of polyp can be time-consuming, tedious, and prone to human error. Automatic classification of polyps from colonoscopy images can be more efficient. Although no specialized methods currently exist for classifying polyps from colonoscopy images, several state-of-the-art CNN models can perform the task. We introduce a new CNN-based model, PolyNet, which achieves the highest polyp-classification accuracy among the models evaluated and outperforms pre-trained models such as VGG16, ResNet50, DenseNetV3, MobileNetV3, and InceptionV3, as well as nine other customized CNN-based classifiers. This study also provides a sensitivity analysis demonstrating how slight modifications to the network's architecture affect the balance between accuracy and performance. We examined different CNN architectures and developed an effective convolutional neural network (CNN) model for correctly predicting colon polyps on the Kvasir dataset. A self-attention mechanism is incorporated into the best CNN model, i.e., PolyNet, to further improve accuracy. For comparison, DenseNetV3, MobileNet-V3, Inception-V3, VGG16, and ResNet50 achieve 73.87 %, 69.38 %, 61.12 %, 84.00 %, and 86.12 % accuracy on the Kvasir dataset, while PolyNet with attention achieves 86 % accuracy, 86 % precision, 85 % recall, and an 86 % F1-score.
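The abstract describes incorporating a self-attention mechanism into a convolutional backbone but does not spell out the architecture. Below is a minimal PyTorch sketch of one common way to do this: a SAGAN-style self-attention block (1x1-convolution query/key/value projections with a learned residual weight) inserted between a small CNN feature extractor and the classification head. The layer widths, depth, class count, and the attention formulation itself are illustrative assumptions, not PolyNet's published design.

```python
# Minimal sketch: CNN backbone + self-attention + classifier head.
# All hyperparameters here are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttention2d(nn.Module):
    """SAGAN-style self-attention over the spatial positions of a CNN
    feature map, using 1x1 convolutions for query/key/value projections."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # (B, HW, C/8)
        k = self.key(x).flatten(2)                    # (B, C/8, HW)
        attn = F.softmax(q @ k, dim=-1)               # (B, HW, HW) attention map
        v = self.value(x).flatten(2)                  # (B, C, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                   # residual connection


class PolypClassifierSketch(nn.Module):
    """Toy convolutional backbone with an attention block and a linear head."""

    def __init__(self, num_classes: int = 8):  # 8 Kvasir classes (assumed)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.attention = SelfAttention2d(128)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, num_classes)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.attention(self.features(x)))


model = PolypClassifierSketch()
logits = model(torch.randn(1, 3, 224, 224))  # one RGB colonoscopy frame
print(logits.shape)  # torch.Size([1, 8])
```

In an actual experiment, a network of this shape would be trained with cross-entropy loss on labeled Kvasir frames and evaluated via accuracy, precision, recall, and F1-score on a held-out split, which are the metrics the authors report.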
Source journal

Informatics in Medicine Unlocked (Medicine-Health Informatics)

CiteScore: 9.50
Self-citation rate: 0.00%
Articles published: 282
Review time: 39 days

Aims and scope: Informatics in Medicine Unlocked (IMU) is an international gold open access journal covering a broad spectrum of topics within medical informatics, including (but not limited to) papers focusing on imaging, pathology, teledermatology, public health, ophthalmological, nursing and translational medicine informatics. The full papers that are published in the journal are accessible to all who visit the website.