Next-generation AI for visually occult pancreatic cancer detection in a low-prevalence setting with longitudinal stability and multi-institutional generalisability.
Sovanlal Mukherjee, Ajith Antony, Nandakumar G Patnam, Kamaxi H Trivedi, Aashna Karbhari, Khurram Khaliq Bhinder, Armin Zarrintan, Joel G Fletcher, Mark Truty, Matthew P Johnson, Suresh T Chari, Ajit Harishkumar Goenka
{"title":"Next-generation AI for visually occult pancreatic cancer detection in a low-prevalence setting with longitudinal stability and multi-institutional generalisability.","authors":"Sovanlal Mukherjee,Ajith Antony,Nandakumar G Patnam,Kamaxi H Trivedi,Aashna Karbhari,Khurram Khaliq Bhinder,Armin Zarrintan,Joel G Fletcher,Mark Truty,Matthew P Johnson,Suresh T Chari,Ajit Harishkumar Goenka","doi":"10.1136/gutjnl-2025-337266","DOIUrl":null,"url":null,"abstract":"BACKGROUND\r\nFailure of conventional imaging to detect pancreatic ductal adenocarcinoma (PDA) at its visually occult pre-diagnostic stage is a primary barrier to improving its otherwise poor rate of survival.\r\n\r\nOBJECTIVE\r\nTo develop and validate the Radiomics-based Early Detection MODel (REDMOD), an AI framework to identify subvisual radiomic signatures of pre-diagnostic PDA on standard-of-care CT.\r\n\r\nDESIGNS\r\nREDMOD was trained on a multi-institutional cohort (n=969; 156 pre-diagnostic, 813 control) and tested on an independent set (n=493; 63 pre-diagnostic, 430 control), simulating a low prevalence (~1:6) early detection paradigm. The fully automated framework couples AI-driven segmentation with a heterogeneous ensemble architecture trained on a 40-feature radiomic signature derived from Synthetic Minority Over-sampling Technique (SMOTE)-balanced data. A tunable Youden Index-optimised classification threshold enables performance calibration without retraining. Validation included direct comparison with radiologists, longitudinal test-retest analysis and external specificity validation across two independent cohorts (n=539 and n=80).\r\n\r\nRESULTS\r\nOn an independent test set (n=493), REDMOD identified occult PDA (AUC 0.82; 73.0% sensitivity) at a median 475-day lead time. This represented nearly twofold higher sensitivity than radiologists (38.9%; p<0.001), which grew to nearly threefold (68.0% vs 23.0%) at >24 months lead time. 
REDMOD showed strong longitudinal stability (90-92% concordance) and generalisable specificity across multi-institutional (81.3%; n=539) and public (87.5%; n=80) datasets. Mechanistic analyses confirmed predictive power derived principally from multi-scale wavelet-filtered textural features (90% of selected signature), which outperformed unfiltered features (AUC 0.82 vs 0.74; p=0.007) in capturing subvisual architectural disruptions.\r\n\r\nCONCLUSIONS\r\nREDMOD is an automated, mechanistically grounded, longitudinally stable, externally validated AI that surpasses radiologists for PDA detection at its visually occult pre-diagnostic stage. These attributes position it for prospective validation in high-risk cohorts, a necessary step towards shifting the paradigm from late-stage symptomatic diagnosis to proactive pre-clinical interception.","PeriodicalId":12825,"journal":{"name":"Gut","volume":"21 1","pages":""},"PeriodicalIF":25.8000,"publicationDate":"2026-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Gut","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1136/gutjnl-2025-337266","RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GASTROENTEROLOGY & HEPATOLOGY","Score":null,"Total":0}
Citations: 0
Abstract
BACKGROUND
Failure of conventional imaging to detect pancreatic ductal adenocarcinoma (PDA) at its visually occult pre-diagnostic stage is a primary barrier to improving its otherwise poor survival rate.
OBJECTIVE
To develop and validate the Radiomics-based Early Detection MODel (REDMOD), an AI framework to identify subvisual radiomic signatures of pre-diagnostic PDA on standard-of-care CT.
DESIGN
REDMOD was trained on a multi-institutional cohort (n=969; 156 pre-diagnostic, 813 control) and tested on an independent set (n=493; 63 pre-diagnostic, 430 control), simulating a low-prevalence (~1:6) early-detection paradigm. The fully automated framework couples AI-driven segmentation with a heterogeneous ensemble architecture trained on a 40-feature radiomic signature derived from Synthetic Minority Over-sampling Technique (SMOTE)-balanced data. A tunable Youden Index-optimised classification threshold enables performance calibration without retraining. Validation included direct comparison with radiologists, longitudinal test-retest analysis and external specificity validation across two independent cohorts (n=539 and n=80).
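The tunable, Youden Index-optimised threshold described above can be illustrated with a minimal sketch (not the authors' code; the score distributions and variable names are hypothetical): scanning the ROC curve of classifier scores for the threshold that maximises Youden's J = sensitivity + specificity - 1, which can later be re-tuned on new scores without retraining the model.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical illustration: Youden Index-optimised threshold selection
# on synthetic classifier scores at the abstract's ~1:6 test prevalence.
rng = np.random.default_rng(0)
y_true = np.array([0] * 430 + [1] * 63)              # 430 controls, 63 cases
scores = np.concatenate([rng.normal(0.3, 0.15, 430),  # control scores (assumed)
                         rng.normal(0.6, 0.15, 63)])  # case scores (assumed)

fpr, tpr, thresholds = roc_curve(y_true, scores)
j = tpr - fpr                     # Youden's J statistic at each threshold
best = int(np.argmax(j))
threshold = thresholds[best]
print(f"Youden-optimal threshold: {threshold:.3f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```

Because the threshold is a post hoc operating point on frozen model scores, the same search can recalibrate the sensitivity/specificity trade-off for a new screening population.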
RESULTS
On an independent test set (n=493), REDMOD identified occult PDA (AUC 0.82; 73.0% sensitivity) at a median 475-day lead time. This represented nearly twofold higher sensitivity than radiologists (38.9%; p<0.001), which grew to nearly threefold (68.0% vs 23.0%) at >24 months lead time. REDMOD showed strong longitudinal stability (90-92% concordance) and generalisable specificity across multi-institutional (81.3%; n=539) and public (87.5%; n=80) datasets. Mechanistic analyses confirmed predictive power derived principally from multi-scale wavelet-filtered textural features (90% of selected signature), which outperformed unfiltered features (AUC 0.82 vs 0.74; p=0.007) in capturing subvisual architectural disruptions.
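The mechanistic finding above — that wavelet-filtered textural features dominate the signature — can be sketched with a toy example (not the published pipeline; region values and names are hypothetical). A single-level Haar decomposition, implemented directly in NumPy, shows how detail sub-bands respond to subtle texture that leaves mean intensity essentially unchanged:

```python
import numpy as np

# Hedged illustration: Haar wavelet detail sub-bands flag "subvisual"
# texture that first-order intensity statistics barely register.
rng = np.random.default_rng(1)
smooth = np.full((64, 64), 100.0)                 # uniform tissue (assumed)
textured = smooth + rng.normal(0, 5.0, (64, 64))  # subtle texture (assumed)

def haar_detail_energy(img):
    """Mean |coefficient| over the three Haar detail sub-bands (cH, cV, cD)."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    cH = (a + b - c - d) / 2.0   # horizontal detail
    cV = (a - b + c - d) / 2.0   # vertical detail
    cD = (a - b - c + d) / 2.0   # diagonal detail
    return float(np.mean([np.abs(s).mean() for s in (cH, cV, cD)]))

print(f"smooth region:   {haar_detail_energy(smooth):.3f}")    # exactly 0
print(f"textured region: {haar_detail_energy(textured):.3f}")  # clearly > 0
```

The two regions have nearly identical mean intensity, yet the detail-band energy separates them cleanly — a simplified analogue of how multi-scale wavelet filtering can expose architectural disruption invisible to the eye.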
CONCLUSIONS
REDMOD is an automated, mechanistically grounded, longitudinally stable, externally validated AI that surpasses radiologists for PDA detection at its visually occult pre-diagnostic stage. These attributes position it for prospective validation in high-risk cohorts, a necessary step towards shifting the paradigm from late-stage symptomatic diagnosis to proactive pre-clinical interception.
About the journal
Gut is a renowned international journal specializing in gastroenterology and hepatology, known for its high-quality clinical research covering the alimentary tract, liver, biliary tree, and pancreas. It offers authoritative and current coverage across all aspects of gastroenterology and hepatology, featuring articles on emerging disease mechanisms and innovative diagnostic and therapeutic approaches authored by leading experts.
As the flagship journal of BMJ's gastroenterology portfolio, Gut is accompanied by two companion journals: Frontline Gastroenterology, which focuses on education and practice-oriented papers, and BMJ Open Gastroenterology, which publishes open-access original research.