Optimized soft-voting CNN ensemble using particle swarm optimization for endometrial cancer histopathology classification

Firas Ibrahim AlZobi, Khalid Mansour, Ahmad Nasayreh, Ghassan Samara, Neda’a Alsalman, Ayah Bashkami, Aseel Smerat, Khalid M.O. Nahar

Computer Methods and Programs in Biomedicine Update, Volume 8, Article 100217 (2025). DOI: 10.1016/j.cmpbup.2025.100217
https://www.sciencedirect.com/science/article/pii/S2666990025000424
Citations: 0
Abstract
The heterogeneity of endometrial cancer tissue presents a significant obstacle to accurate automated classification of histopathological images. Ensemble methods are a promising alternative to single Convolutional Neural Networks (CNNs); building on this, we introduce PSO-SV (Particle Swarm Optimization–Soft Voting), a novel framework that adaptively fuses the outputs of MobileNetV2, VGG19, DenseNet121, Swin Transformer, and Vision Transformer (ViT). Our key innovation is the use of Particle Swarm Optimization to dynamically determine the optimal contribution of each model in a soft-voting ensemble. We validated PSO-SV on two datasets: the first consists of 11,977 tiles from 95 whole-slide images (WSIs) obtained from The Cancer Genome Atlas Uterine Corpus Endometrial Carcinoma (TCGA-UCEC) project; the second consists of 3,302 images from 498 patients, categorized into four classes. The proposed framework achieved outstanding results, including 99.67% accuracy, a 99.67% F1-score, and an Area Under the Curve (AUC) of 99.9% on the first dataset, and 99% on all metrics for the second. It consistently outperformed each individual base model and a traditional hard-voting ensemble, highlighting its ability to synergistically combine complementary model strengths. The PSO-SV framework offers a powerful and clinically promising approach to robust endometrial cancer classification.
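The core mechanism described above, a soft-voting ensemble whose per-model weights are tuned by Particle Swarm Optimization, can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names (`soft_vote`, `pso_optimize_weights`), the inertia and acceleration coefficients, and the use of validation accuracy as the fitness function are all choices made here for clarity. The paper fuses the softmax outputs of five deep models; in this sketch the per-model class probabilities are simply NumPy arrays.

```python
import numpy as np

def soft_vote(probs, weights):
    """Weighted average of per-model class probabilities.
    probs: (n_models, n_samples, n_classes); weights: (n_models,)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the fused output is still a distribution
    return np.tensordot(w, probs, axes=1)  # -> (n_samples, n_classes)

def pso_optimize_weights(probs, labels, n_particles=20, n_iters=50, seed=0):
    """Toy PSO over ensemble weights, maximizing validation accuracy.
    Coefficients (inertia 0.7, cognitive/social 1.5) are illustrative defaults."""
    rng = np.random.default_rng(seed)
    n_models = probs.shape[0]

    def fitness(w):
        fused = soft_vote(probs, np.clip(w, 1e-6, None))
        return (fused.argmax(axis=1) == labels).mean()

    pos = rng.random((n_particles, n_models))   # candidate weight vectors
    vel = np.zeros_like(pos)
    pbest = pos.copy()                          # per-particle best positions
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()    # global best position
    gbest_fit = pbest_fit.max()
    inertia, c1, c2 = 0.7, 1.5, 1.5

    for _ in range(n_iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, 1.0)
        fits = np.array([fitness(p) for p in pos])
        improved = fits > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fits[improved]
        if fits.max() > gbest_fit:
            gbest, gbest_fit = pos[fits.argmax()].copy(), fits.max()

    w = np.clip(gbest, 1e-6, None)
    return w / w.sum(), gbest_fit
```

Normalizing the weights keeps the fused output a valid probability distribution, which is what distinguishes soft voting (averaging confidences) from the hard-voting baseline the paper compares against (majority vote over argmax labels).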