Deep learning with data transformation improves cancer risk prediction in oral precancerous conditions

IF 6.9 Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS
John Adeoye, Yuxiong Su
{"title":"基于数据转换的深度学习改善了口腔癌前病变的癌症风险预测","authors":"John Adeoye,&nbsp;Yuxiong Su","doi":"10.1016/j.imed.2024.11.003","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Oral cancer is the most common head and neck malignancy and may develop from oral leukoplakia (OL) and oral lichenoid disease (OLD). Machine learning classifiers using structured (tabular) data have been employed to predict malignant transformation in OL and OLD. However, current models require improved discrimination, and their frameworks may limit feature fusion and multimodal risk prediction. Therefore, this study investigates whether tabular-to-image data conversion and deep learning (DL) based on convolutional neural networks (CNNs) can improve malignant transformation prediction compared to traditional classifiers.</div></div><div><h3>Methods</h3><div>This study used retrospective data of 1,010 patients with OL and OLD treated at Queen Mary Hospital, Hong Kong, from January 2003 to December 2023, to construct artificial intelligence-based models for oral cancer risk stratification in OL/OLD. Twenty-five input features and information on oral cancer development in OL/OLD were retrieved from electronic health records. Tabular-to-2D image data transformation was achieved by creating a feature matrix from encoded labels of the input variables arranged according to their correlation coefficient. Then, 2D images were used to populate five pre-trained DL models (VGG16, VGG19, MobileNetV2, ResNet50, and EfficientNet-B0). Area under the receiver operating characteristic curve (AUC), Brier scores, and net benefit of the DL models were calculated and compared to five traditional classifiers based on structured data and the binary epithelial dysplasia grading system (current method).</div></div><div><h3>Results</h3><div>This study found that the DL models had better AUC values (0.893–0.955) and Brier scores (0.072–0.106) compared to the traditional classifiers (AUC: 0.887–0.941 and Brier score: 0.074–0.136) during validation. During internal testing, VGG16 and VGG19 had better AUC values and Brier scores than other CNNs (AUC: 0.998–1.00; Brier score: 0.036–0.044) and the best traditional classifier (random forest) (AUC: 0.906; Brier score: 0.153). Additionally, VGG16 and VGG19 models outperformed random forest in discrimination and calibration during external testing (AUC: 1.00 <em>vs</em>. 0.976; Brier score: 0.022–0.034 <em>vs</em>. 0.129). The best CNNs also had better discriminatory performance and calibration than binary dysplasia grading at internal and external testing. Overall, decision curve analysis showed that the optimal DL models with transformed data had a higher net benefit than random forest and binary dysplasia grading.</div></div><div><h3>Conclusion</h3><div>Tabular-to-2D image data transformation may improve the use of structured input features for developing optimal intelligent models for oral cancer risk prediction in OL and OLD using convolutional networks. 
This approach may have the potential to robustly handle structured data in multimodal DL frameworks for oncological outcome prediction.</div></div>","PeriodicalId":73400,"journal":{"name":"Intelligent medicine","volume":"5 2","pages":"Pages 141-150"},"PeriodicalIF":6.9000,"publicationDate":"2025-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep learning with data transformation improves cancer risk prediction in oral precancerous conditions\",\"authors\":\"John Adeoye,&nbsp;Yuxiong Su\",\"doi\":\"10.1016/j.imed.2024.11.003\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3>Background</h3><div>Oral cancer is the most common head and neck malignancy and may develop from oral leukoplakia (OL) and oral lichenoid disease (OLD). Machine learning classifiers using structured (tabular) data have been employed to predict malignant transformation in OL and OLD. However, current models require improved discrimination, and their frameworks may limit feature fusion and multimodal risk prediction. Therefore, this study investigates whether tabular-to-image data conversion and deep learning (DL) based on convolutional neural networks (CNNs) can improve malignant transformation prediction compared to traditional classifiers.</div></div><div><h3>Methods</h3><div>This study used retrospective data of 1,010 patients with OL and OLD treated at Queen Mary Hospital, Hong Kong, from January 2003 to December 2023, to construct artificial intelligence-based models for oral cancer risk stratification in OL/OLD. Twenty-five input features and information on oral cancer development in OL/OLD were retrieved from electronic health records. Tabular-to-2D image data transformation was achieved by creating a feature matrix from encoded labels of the input variables arranged according to their correlation coefficient. Then, 2D images were used to populate five pre-trained DL models (VGG16, VGG19, MobileNetV2, ResNet50, and EfficientNet-B0). Area under the receiver operating characteristic curve (AUC), Brier scores, and net benefit of the DL models were calculated and compared to five traditional classifiers based on structured data and the binary epithelial dysplasia grading system (current method).</div></div><div><h3>Results</h3><div>This study found that the DL models had better AUC values (0.893–0.955) and Brier scores (0.072–0.106) compared to the traditional classifiers (AUC: 0.887–0.941 and Brier score: 0.074–0.136) during validation. During internal testing, VGG16 and VGG19 had better AUC values and Brier scores than other CNNs (AUC: 0.998–1.00; Brier score: 0.036–0.044) and the best traditional classifier (random forest) (AUC: 0.906; Brier score: 0.153). Additionally, VGG16 and VGG19 models outperformed random forest in discrimination and calibration during external testing (AUC: 1.00 <em>vs</em>. 0.976; Brier score: 0.022–0.034 <em>vs</em>. 0.129). The best CNNs also had better discriminatory performance and calibration than binary dysplasia grading at internal and external testing. Overall, decision curve analysis showed that the optimal DL models with transformed data had a higher net benefit than random forest and binary dysplasia grading.</div></div><div><h3>Conclusion</h3><div>Tabular-to-2D image data transformation may improve the use of structured input features for developing optimal intelligent models for oral cancer risk prediction in OL and OLD using convolutional networks. 
This approach may have the potential to robustly handle structured data in multimodal DL frameworks for oncological outcome prediction.</div></div>\",\"PeriodicalId\":73400,\"journal\":{\"name\":\"Intelligent medicine\",\"volume\":\"5 2\",\"pages\":\"Pages 141-150\"},\"PeriodicalIF\":6.9000,\"publicationDate\":\"2025-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Intelligent medicine\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2667102625000300\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intelligent medicine","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2667102625000300","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract


Background

Oral cancer is the most common head and neck malignancy and may develop from oral leukoplakia (OL) and oral lichenoid disease (OLD). Machine learning classifiers using structured (tabular) data have been employed to predict malignant transformation in OL and OLD. However, current models require improved discrimination, and their frameworks may limit feature fusion and multimodal risk prediction. Therefore, this study investigates whether tabular-to-image data conversion and deep learning (DL) based on convolutional neural networks (CNNs) can improve malignant transformation prediction compared to traditional classifiers.

Methods

This study used retrospective data from 1,010 patients with OL and OLD treated at Queen Mary Hospital, Hong Kong, between January 2003 and December 2023 to construct artificial intelligence-based models for oral cancer risk stratification in OL/OLD. Twenty-five input features and information on oral cancer development in OL/OLD were retrieved from electronic health records. Tabular data were transformed into 2D images by creating a feature matrix from the encoded labels of the input variables, arranged according to their correlation coefficients. The 2D images were then used to fine-tune five pre-trained DL models (VGG16, VGG19, MobileNetV2, ResNet50, and EfficientNet-B0). Area under the receiver operating characteristic curve (AUC), Brier scores, and net benefit of the DL models were calculated and compared with those of five traditional classifiers trained on the structured data and with the binary epithelial dysplasia grading system (the current method).
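
As a rough illustration of the pipeline described above, the sketch below shows one way a tabular-to-2D-image transformation and transfer learning with a pre-trained CNN could be set up in Python with pandas, NumPy, and TensorFlow/Keras. The correlation-based column ordering, grid layout, image size, and classification head are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: tabular-to-2D-image conversion followed by transfer
# learning with a pre-trained CNN. Settings are illustrative assumptions.
import numpy as np
import pandas as pd
import tensorflow as tf
from tensorflow.keras.applications import VGG16

def tabular_to_images(df: pd.DataFrame, target: pd.Series, size: int = 224) -> np.ndarray:
    """Encode each row as a small 2D feature matrix and upscale it to an image.

    Columns are ordered by the absolute correlation of their (label-encoded)
    values with the outcome, so related features end up spatially adjacent.
    """
    encoded = df.apply(lambda col: pd.factorize(col)[0] if col.dtype == object else col)
    order = encoded.corrwith(target).abs().sort_values(ascending=False).index
    encoded = encoded[order]

    # Scale every feature to [0, 255] so it can be treated as a pixel intensity.
    mins, maxs = encoded.min(), encoded.max()
    scaled = (encoded - mins) / (maxs - mins).replace(0, 1) * 255.0

    n_feat = scaled.shape[1]
    side = int(np.ceil(np.sqrt(n_feat)))              # e.g. 25 features -> 5 x 5 grid
    images = []
    for _, row in scaled.iterrows():
        grid = np.zeros(side * side, dtype=np.float32)
        grid[:n_feat] = row.to_numpy(dtype=np.float32)
        grid = grid.reshape(side, side, 1)
        img = tf.image.resize(grid, (size, size))     # upscale to the CNN input size
        images.append(tf.image.grayscale_to_rgb(img).numpy())
    return np.stack(images)

def build_transfer_model(size: int = 224) -> tf.keras.Model:
    """Pre-trained VGG16 backbone with a small binary classification head."""
    base = VGG16(weights="imagenet", include_top=False, input_shape=(size, size, 3))
    base.trainable = False                            # freeze ImageNet features
    x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
    x = tf.keras.layers.Dense(64, activation="relu")(x)
    out = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # malignant-transformation risk
    model = tf.keras.Model(base.input, out)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC()])
    return model

In this study's setting, df would hold the 25 input features and target the malignant-transformation label; the resulting image array could then be passed to model.fit, with the same conversion applied to the validation and test partitions.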

Results

This study found that the DL models had better AUC values (0.893–0.955) and Brier scores (0.072–0.106) than the traditional classifiers (AUC: 0.887–0.941; Brier score: 0.074–0.136) during validation. During internal testing, VGG16 and VGG19 had better AUC values and Brier scores (AUC: 0.998–1.00; Brier score: 0.036–0.044) than the other CNNs and the best traditional classifier, random forest (AUC: 0.906; Brier score: 0.153). Additionally, the VGG16 and VGG19 models outperformed random forest in discrimination and calibration during external testing (AUC: 1.00 vs. 0.976; Brier score: 0.022–0.034 vs. 0.129). The best CNNs also showed better discrimination and calibration than binary dysplasia grading in both internal and external testing. Overall, decision curve analysis showed that the optimal DL models with transformed data had a higher net benefit than random forest and binary dysplasia grading.
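
For reference, a minimal sketch of the three evaluation measures reported here (AUC for discrimination, Brier score for calibration, and decision-curve net benefit) is given below, using scikit-learn and NumPy; the threshold grid and variable names are illustrative assumptions rather than the study's exact analysis code.

# Hypothetical sketch of the reported evaluation metrics: discrimination (AUC),
# calibration (Brier score), and decision-curve net benefit.
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss

def evaluate(y_true: np.ndarray, y_prob: np.ndarray) -> dict:
    return {
        "auc": roc_auc_score(y_true, y_prob),        # discrimination
        "brier": brier_score_loss(y_true, y_prob),   # calibration (lower is better)
    }

def net_benefit(y_true: np.ndarray, y_prob: np.ndarray,
                thresholds=np.linspace(0.01, 0.50, 50)) -> np.ndarray:
    """Net benefit at each risk threshold, as used in decision curve analysis."""
    n = len(y_true)
    benefits = []
    for t in thresholds:
        pred = y_prob >= t
        tp = np.sum(pred & (y_true == 1))
        fp = np.sum(pred & (y_true == 0))
        benefits.append(tp / n - fp / n * t / (1 - t))
    return np.array(benefits)

Plotting net_benefit for each model against the "treat all" and "treat none" strategies over the threshold grid reproduces the kind of decision curves used to compare the CNNs with random forest and binary dysplasia grading.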

Conclusion

Tabular-to-2D image data transformation may improve how structured input features are used to develop optimal intelligent models for oral cancer risk prediction in OL and OLD with convolutional networks. This approach may also offer a robust way to handle structured data in multimodal DL frameworks for oncological outcome prediction.
Source journal

Intelligent medicine
Subject areas: Surgery, Radiology and Imaging, Artificial Intelligence, Biomedical Engineering
CiteScore: 5.20
Self-citation rate: 0.00%
Articles published: 19