Fusion of shallow and deep features from 18F-FDG PET/CT for predicting EGFR-sensitizing mutations in non-small cell lung cancer.

IF 2.9 · Region 2 (Medicine) · Q2 Radiology, Nuclear Medicine & Medical Imaging
Quantitative Imaging in Medicine and Surgery · Pub Date: 2024-08-01 · Epub Date: 2024-01-19 · DOI: 10.21037/qims-23-1028
Xiaohui Yao, Yuan Zhu, Zhenxing Huang, Yue Wang, Shan Cong, Liwen Wan, Ruodai Wu, Long Chen, Zhanli Hu
{"title":"融合 18F-FDG PET/CT 的浅层和深层特征,预测非小细胞肺癌的表皮生长因子受体敏感突变。","authors":"Xiaohui Yao, Yuan Zhu, Zhenxing Huang, Yue Wang, Shan Cong, Liwen Wan, Ruodai Wu, Long Chen, Zhanli Hu","doi":"10.21037/qims-23-1028","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Non-small cell lung cancer (NSCLC) patients with epidermal growth factor receptor-sensitizing (EGFR-sensitizing) mutations exhibit a positive response to tyrosine kinase inhibitors (TKIs). Given the limitations of current clinical predictive methods, it is critical to explore radiomics-based approaches. In this study, we leveraged deep-learning technology with multimodal radiomics data to more accurately predict EGFR-sensitizing mutations.</p><p><strong>Methods: </strong>A total of 202 patients who underwent both flourine-18 fluorodeoxyglucose positron emission tomography/computed tomography (<sup>18</sup>F-FDG PET/CT) scans and EGFR sequencing prior to treatment were included in this study. Deep and shallow features were extracted by a residual neural network and the Python package PyRadiomics, respectively. We used least absolute shrinkage and selection operator (LASSO) regression to select predictive features and applied a support vector machine (SVM) to classify the EGFR-sensitive patients. Moreover, we compared predictive performance across different deep models and imaging modalities.</p><p><strong>Results: </strong>In the classification of EGFR-sensitive mutations, the areas under the curve (AUCs) of ResNet-based deep-shallow features and only shallow features from different multidata were as follows: RES_TRAD, PET/CT <i>vs</i>. CT-only <i>vs</i>. PET-only: 0.94 <i>vs</i>. 0.89 <i>vs</i>. 0.92; and ONLY_TRAD, PET/CT <i>vs</i>. CT-only <i>vs</i>. PET-only: 0.68 <i>vs</i>. 0.50 <i>vs</i>. 0.38. Additionally, the receiver operating characteristic (ROC) curves of the model using both deep and shallow features were significantly different from those of the model built using only shallow features (P<0.05).</p><p><strong>Conclusions: </strong>Our findings suggest that deep features significantly enhance the detection of EGFR-sensitizing mutations, especially those extracted with ResNet. Moreover, PET/CT images are more effective than CT-only and PET-only images in producing EGFR-sensitizing mutation-related signatures.</p>","PeriodicalId":54267,"journal":{"name":"Quantitative Imaging in Medicine and Surgery","volume":null,"pages":null},"PeriodicalIF":2.9000,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11320501/pdf/","citationCount":"0","resultStr":"{\"title\":\"Fusion of shallow and deep features from <sup>18</sup>F-FDG PET/CT for predicting EGFR-sensitizing mutations in non-small cell lung cancer.\",\"authors\":\"Xiaohui Yao, Yuan Zhu, Zhenxing Huang, Yue Wang, Shan Cong, Liwen Wan, Ruodai Wu, Long Chen, Zhanli Hu\",\"doi\":\"10.21037/qims-23-1028\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>Non-small cell lung cancer (NSCLC) patients with epidermal growth factor receptor-sensitizing (EGFR-sensitizing) mutations exhibit a positive response to tyrosine kinase inhibitors (TKIs). Given the limitations of current clinical predictive methods, it is critical to explore radiomics-based approaches. 
In this study, we leveraged deep-learning technology with multimodal radiomics data to more accurately predict EGFR-sensitizing mutations.</p><p><strong>Methods: </strong>A total of 202 patients who underwent both flourine-18 fluorodeoxyglucose positron emission tomography/computed tomography (<sup>18</sup>F-FDG PET/CT) scans and EGFR sequencing prior to treatment were included in this study. Deep and shallow features were extracted by a residual neural network and the Python package PyRadiomics, respectively. We used least absolute shrinkage and selection operator (LASSO) regression to select predictive features and applied a support vector machine (SVM) to classify the EGFR-sensitive patients. Moreover, we compared predictive performance across different deep models and imaging modalities.</p><p><strong>Results: </strong>In the classification of EGFR-sensitive mutations, the areas under the curve (AUCs) of ResNet-based deep-shallow features and only shallow features from different multidata were as follows: RES_TRAD, PET/CT <i>vs</i>. CT-only <i>vs</i>. PET-only: 0.94 <i>vs</i>. 0.89 <i>vs</i>. 0.92; and ONLY_TRAD, PET/CT <i>vs</i>. CT-only <i>vs</i>. PET-only: 0.68 <i>vs</i>. 0.50 <i>vs</i>. 0.38. Additionally, the receiver operating characteristic (ROC) curves of the model using both deep and shallow features were significantly different from those of the model built using only shallow features (P<0.05).</p><p><strong>Conclusions: </strong>Our findings suggest that deep features significantly enhance the detection of EGFR-sensitizing mutations, especially those extracted with ResNet. Moreover, PET/CT images are more effective than CT-only and PET-only images in producing EGFR-sensitizing mutation-related signatures.</p>\",\"PeriodicalId\":54267,\"journal\":{\"name\":\"Quantitative Imaging in Medicine and Surgery\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2024-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11320501/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Quantitative Imaging in Medicine and Surgery\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.21037/qims-23-1028\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/1/19 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q2\",\"JCRName\":\"RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Quantitative Imaging in Medicine and Surgery","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.21037/qims-23-1028","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/19 0:00:00","PubModel":"Epub","JCR":"Q2","JCRName":"RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING","Score":null,"Total":0}
Citations: 0

Abstract


Background: Non-small cell lung cancer (NSCLC) patients with epidermal growth factor receptor-sensitizing (EGFR-sensitizing) mutations exhibit a positive response to tyrosine kinase inhibitors (TKIs). Given the limitations of current clinical predictive methods, it is critical to explore radiomics-based approaches. In this study, we leveraged deep-learning technology with multimodal radiomics data to more accurately predict EGFR-sensitizing mutations.

Methods: A total of 202 patients who underwent both fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) scans and EGFR sequencing prior to treatment were included in this study. Deep and shallow features were extracted by a residual neural network and the Python package PyRadiomics, respectively. We used least absolute shrinkage and selection operator (LASSO) regression to select predictive features and applied a support vector machine (SVM) to classify patients by EGFR-sensitizing mutation status. Moreover, we compared predictive performance across different deep models and imaging modalities.
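As a rough illustration of the pipeline the Methods describe (shallow radiomics features, LASSO feature selection, then an SVM classifier), the sketch below uses scikit-learn on synthetic data. It is not the authors' code: the feature counts, kernel choice, thresholds, and data are all assumptions made only so the example runs end to end.

```python
# Minimal sketch (assumed, not the authors' implementation) of:
# fused deep+shallow features -> LASSO feature selection -> SVM classification.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.feature_selection import SelectFromModel
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Shallow (handcrafted) features would come from PyRadiomics, e.g.:
#   from radiomics import featureextractor
#   extractor = featureextractor.RadiomicsFeatureExtractor()
#   feats = extractor.execute("patient_image.nii.gz", "tumor_mask.nii.gz")  # paths are placeholders
# Deep features would be taken from an intermediate layer of a ResNet.
# Synthetic data stands in here so the sketch is self-contained.
rng = np.random.default_rng(0)
X = rng.normal(size=(202, 300))                                   # 202 patients x fused features
y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, 202) > 0).astype(int)  # 1 = EGFR-sensitizing mutation

# LASSO-based feature selection: keep features with non-zero coefficients.
selector = SelectFromModel(LassoCV(cv=5, random_state=0), threshold=1e-5)

# SVM on the selected features (RBF kernel is an assumption; the abstract does not specify one).
model = make_pipeline(StandardScaler(), selector, SVC(kernel="rbf"))
aucs = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", aucs.mean())
```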

Results: In the classification of EGFR-sensitizing mutations, the areas under the curve (AUCs) of the ResNet-based deep-plus-shallow features and of the shallow features alone, across the different imaging data, were as follows: RES_TRAD, PET/CT vs. CT-only vs. PET-only: 0.94 vs. 0.89 vs. 0.92; and ONLY_TRAD, PET/CT vs. CT-only vs. PET-only: 0.68 vs. 0.50 vs. 0.38. Additionally, the receiver operating characteristic (ROC) curves of the model using both deep and shallow features differed significantly from those of the model built using only shallow features (P<0.05).
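The Results compare the ROC performance of the deep-plus-shallow model against the shallow-only model at P<0.05, but the abstract does not name the statistical test. The sketch below shows one common, assumed approach, a paired bootstrap on the AUC difference, using synthetic scores rather than study data; the score distributions and bootstrap settings are illustrative only.

```python
# Assumed paired-bootstrap comparison of two models' AUCs (not the paper's stated method).
import numpy as np
from sklearn.metrics import roc_auc_score

def bootstrap_auc_diff(y_true, scores_a, scores_b, n_boot=2000, seed=0):
    """Two-sided bootstrap p-value for AUC(model A) - AUC(model B) on paired predictions."""
    rng = np.random.default_rng(seed)
    y_true, scores_a, scores_b = map(np.asarray, (y_true, scores_a, scores_b))
    n, diffs = len(y_true), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                 # resample patients with replacement
        if len(np.unique(y_true[idx])) < 2:         # need both classes for a valid AUC
            continue
        diffs.append(roc_auc_score(y_true[idx], scores_a[idx]) -
                     roc_auc_score(y_true[idx], scores_b[idx]))
    diffs = np.array(diffs)
    observed = roc_auc_score(y_true, scores_a) - roc_auc_score(y_true, scores_b)
    p = 2 * min((diffs <= 0).mean(), (diffs >= 0).mean())  # how often the difference crosses zero
    return observed, p

# Illustrative use with synthetic scores (not study data).
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 202)
scores_deep_shallow = y * 0.6 + rng.normal(0, 0.4, 202)   # stands in for the stronger model
scores_shallow_only = y * 0.2 + rng.normal(0, 0.4, 202)   # stands in for the weaker model
diff, p = bootstrap_auc_diff(y, scores_deep_shallow, scores_shallow_only)
print(f"AUC difference = {diff:.3f}, bootstrap p = {p:.3f}")
```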

Conclusions: Our findings suggest that deep features, especially those extracted with ResNet, significantly enhance the detection of EGFR-sensitizing mutations. Moreover, PET/CT images are more effective than CT-only and PET-only images in producing EGFR-sensitizing mutation-related signatures.

Source journal
Quantitative Imaging in Medicine and Surgery (Medicine: Radiology, Nuclear Medicine and Imaging)
CiteScore: 4.20
Self-citation rate: 17.90%
Articles published: 252