Cervical Transformation Zone Segmentation and Classification based on Improved Inception-ResNet-V2 Using Colposcopy Images.

Impact Factor: 2.4 | Q2 | Mathematical & Computational Biology
Srikanta Dash, Prabira Kumar Sethy, Santi Kumari Behera
{"title":"Cervical Transformation Zone Segmentation and Classification based on Improved Inception-ResNet-V2 Using Colposcopy Images.","authors":"Srikanta Dash,&nbsp;Prabira Kumar Sethy,&nbsp;Santi Kumari Behera","doi":"10.1177/11769351231161477","DOIUrl":null,"url":null,"abstract":"<p><p>The second most frequent malignancy in women worldwide is cervical cancer. In the transformation(transitional) zone, which is a region of the cervix, columnar cells are continuously converting into squamous cells. The most typical location on the cervix for the development of aberrant cells is the transformation zone, a region of transforming cells. This article suggests a 2-phase method that includes segmenting and classifying the transformation zone to identify the type of cervical cancer. In the initial stage, the transformation zone is segmented from the colposcopy images. The segmented images are then subjected to the augmentation process and identified with the improved inception-resnet-v2. Here, multi-scale feature fusion framework that utilizes 3 × 3 convolution kernels from Reduction-A and Reduction-B of inception-resnet-v2 is introduced. The feature extracted from Reduction-A and Reduction -B is concatenated and fed to SVM for classification. This way, the model combines the benefits of residual networks and Inception convolution, increasing network width and resolving the deep network's training issue. The network can extract several scales of contextual information due to the multi-scale feature fusion, which increases accuracy. The experimental results reveal 81.24% accuracy, 81.24% sensitivity, 90.62% specificity, 87.52% precision, 9.38% FPR, and 81.68% F1 score, 75.27% MCC, and 57.79% Kappa coefficient.</p>","PeriodicalId":35418,"journal":{"name":"Cancer Informatics","volume":null,"pages":null},"PeriodicalIF":2.4000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/ad/c1/10.1177_11769351231161477.PMC10064461.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cancer Informatics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1177/11769351231161477","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICAL & COMPUTATIONAL BIOLOGY","Score":null,"Total":0}
引用次数: 0

Abstract

Cervical cancer is the second most frequent malignancy in women worldwide. In the transformation (transitional) zone, a region of the cervix, columnar cells are continuously converting into squamous cells. This transformation zone is the most typical site on the cervix for the development of aberrant cells. This article proposes a 2-phase method that segments and then classifies the transformation zone to identify the type of cervical cancer. In the first phase, the transformation zone is segmented from the colposcopy images. The segmented images are then augmented and classified with the improved Inception-ResNet-V2. Here, a multi-scale feature fusion framework that utilizes the 3 × 3 convolution kernels from Reduction-A and Reduction-B of Inception-ResNet-V2 is introduced. The features extracted from Reduction-A and Reduction-B are concatenated and fed to an SVM for classification. In this way, the model combines the benefits of residual networks and Inception convolutions, increasing network width and mitigating the training difficulties of deep networks. Thanks to the multi-scale feature fusion, the network can extract contextual information at several scales, which increases accuracy. The experimental results show 81.24% accuracy, 81.24% sensitivity, 90.62% specificity, 87.52% precision, 9.38% FPR, 81.68% F1 score, 75.27% MCC, and a 57.79% Kappa coefficient.
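To make the classification stage concrete, the sketch below approximates the pipeline described above: feature maps are taken from the Reduction-A and Reduction-B stages of a standard Inception-ResNet-V2, globally pooled, concatenated, and passed to an SVM. This is a minimal illustration under stated assumptions, not the authors' implementation; the Keras layer names ("mixed_6a" for Reduction-A, "mixed_7a" for Reduction-B), the global-average-pooling step, and the RBF kernel are assumptions not given in the abstract.

```python
# Hedged sketch of multi-scale feature fusion from Inception-ResNet-V2 + SVM.
# Layer names and pooling choices are assumptions, not taken from the paper.
import tensorflow as tf
from sklearn.svm import SVC


def build_feature_extractor(input_shape=(299, 299, 3)):
    """Model returning pooled activations from the Reduction-A and
    Reduction-B stages of Inception-ResNet-V2, concatenated."""
    base = tf.keras.applications.InceptionResNetV2(
        include_top=False, weights="imagenet", input_shape=input_shape)
    red_a = base.get_layer("mixed_6a").output   # assumed Reduction-A output
    red_b = base.get_layer("mixed_7a").output   # assumed Reduction-B output
    pooled_a = tf.keras.layers.GlobalAveragePooling2D()(red_a)
    pooled_b = tf.keras.layers.GlobalAveragePooling2D()(red_b)
    fused = tf.keras.layers.Concatenate()([pooled_a, pooled_b])  # feature fusion
    return tf.keras.Model(inputs=base.input, outputs=fused)


def classify_transformation_zones(train_imgs, train_labels, test_imgs):
    """Extract fused features and classify them with an SVM."""
    preprocess = tf.keras.applications.inception_resnet_v2.preprocess_input
    extractor = build_feature_extractor()
    x_train = extractor.predict(preprocess(train_imgs), verbose=0)
    x_test = extractor.predict(preprocess(test_imgs), verbose=0)
    svm = SVC(kernel="rbf")  # kernel choice is an assumption
    svm.fit(x_train, train_labels)
    return svm.predict(x_test)
```

In this setup, train_imgs and test_imgs would be arrays of 299 × 299 RGB crops of the segmented (and augmented) transformation zone, and train_labels the corresponding cervical-cancer class labels.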

Source journal: Cancer Informatics (Medicine - Oncology)
CiteScore: 3.00
Self-citation rate: 5.00%
Articles published: 30
Review time: 8 weeks
Journal description: The field of cancer research relies on advances in many other disciplines, including omics technology, mass spectrometry, radio imaging, computer science, and biostatistics. Cancer Informatics provides open access to peer-reviewed, high-quality manuscripts reporting bioinformatics analysis of molecular genetics and/or clinical data pertaining to cancer, emphasizing the use of machine learning, artificial intelligence, statistical algorithms, advanced imaging techniques, data visualization, and high-throughput technologies. As the leading journal dedicated exclusively to reporting the use of computational methods in cancer research and practice, Cancer Informatics leverages methodological improvements in systems biology, genomics, proteomics, metabolomics, and molecular biochemistry into the fields of cancer detection, treatment, classification, risk-prediction, prevention, outcome, and modeling.