{"title":"利用深度学习对超声图像进行甲状腺结节风险分层","authors":"Yasaman Sharifi , Morteza Danay Ashgzari , Susan Shafiei , Seyed Rasoul Zakavi , Saeid Eslami","doi":"10.1016/j.wfumbo.2025.100082","DOIUrl":null,"url":null,"abstract":"<div><h3>Background</h3><div>Interpreting thyroid ultrasound images is a tedious task and is prone to interobserver variability. This study proposes a computer-aided diagnosis system (CAD) for thyroid nodule risk classification and management recommendations based on the American College of Radiology (ACR) Thyroid Imaging Reporting and Data System (TIRADS), which uses a deep learning framework to increase diagnostic accuracy and reliability.</div></div><div><h3>Materials and methods</h3><div>In this retrospective analysis, 2450 thyroid ultrasound images with 3250 nodules were acquired from 1037 patients from 2018 to 2020 at a single institution. Our proposed automated method has four main steps: preprocessing and image augmentation, nodule detection, nodule classification on the basis of ACR-TIRADS, and risk-level stratification and treatment management. We trained different state-of-the-art pretrained convolutional neural networks (CNNs) to choose the best architecture in the detection and classification stage. We compared the performance of our method with that of three experienced radiologists.</div></div><div><h3>Results</h3><div>The comparison results show that the Faster R-CNN ResNet-101 has better performance in the detection stage and that the fine-tuned Xception model achieves 0.98 % accuracy, 0.99 % AUC, 0.967 % precision, and 0.912 % recall when it is selected as the backbone of the classification stage. The results demonstrated that the performance of our algorithm was better than that of the three radiologists, with a mean kappa value of 0.85 % for the five ACR-TIRADS categories compared with the gold standard.</div></div><div><h3>Conclusions</h3><div>This study, in addition to generating a valuable database of thyroid US images, demonstrates that our method can effectively improve the performance of thyroid nodule assessment and can assist radiologists as an adjunctive clinical tool to improve efficiency, reliability, and diagnostic performance in clinical practice.</div></div>","PeriodicalId":101281,"journal":{"name":"WFUMB Ultrasound Open","volume":"3 1","pages":"Article 100082"},"PeriodicalIF":0.0000,"publicationDate":"2025-02-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Using deep learning for thyroid nodule risk stratification from ultrasound images\",\"authors\":\"Yasaman Sharifi , Morteza Danay Ashgzari , Susan Shafiei , Seyed Rasoul Zakavi , Saeid Eslami\",\"doi\":\"10.1016/j.wfumbo.2025.100082\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3>Background</h3><div>Interpreting thyroid ultrasound images is a tedious task and is prone to interobserver variability. This study proposes a computer-aided diagnosis system (CAD) for thyroid nodule risk classification and management recommendations based on the American College of Radiology (ACR) Thyroid Imaging Reporting and Data System (TIRADS), which uses a deep learning framework to increase diagnostic accuracy and reliability.</div></div><div><h3>Materials and methods</h3><div>In this retrospective analysis, 2450 thyroid ultrasound images with 3250 nodules were acquired from 1037 patients from 2018 to 2020 at a single institution. 
Our proposed automated method has four main steps: preprocessing and image augmentation, nodule detection, nodule classification on the basis of ACR-TIRADS, and risk-level stratification and treatment management. We trained different state-of-the-art pretrained convolutional neural networks (CNNs) to choose the best architecture in the detection and classification stage. We compared the performance of our method with that of three experienced radiologists.</div></div><div><h3>Results</h3><div>The comparison results show that the Faster R-CNN ResNet-101 has better performance in the detection stage and that the fine-tuned Xception model achieves 0.98 % accuracy, 0.99 % AUC, 0.967 % precision, and 0.912 % recall when it is selected as the backbone of the classification stage. The results demonstrated that the performance of our algorithm was better than that of the three radiologists, with a mean kappa value of 0.85 % for the five ACR-TIRADS categories compared with the gold standard.</div></div><div><h3>Conclusions</h3><div>This study, in addition to generating a valuable database of thyroid US images, demonstrates that our method can effectively improve the performance of thyroid nodule assessment and can assist radiologists as an adjunctive clinical tool to improve efficiency, reliability, and diagnostic performance in clinical practice.</div></div>\",\"PeriodicalId\":101281,\"journal\":{\"name\":\"WFUMB Ultrasound Open\",\"volume\":\"3 1\",\"pages\":\"Article 100082\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-02-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"WFUMB Ultrasound Open\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2949668325000047\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"WFUMB Ultrasound Open","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2949668325000047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Using deep learning for thyroid nodule risk stratification from ultrasound images
Background
Interpreting thyroid ultrasound images is a tedious task that is prone to interobserver variability. This study proposes a computer-aided diagnosis (CAD) system for thyroid nodule risk classification and management recommendations based on the American College of Radiology (ACR) Thyroid Imaging Reporting and Data System (TIRADS), using a deep learning framework to increase diagnostic accuracy and reliability.
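For readers unfamiliar with the ACR-TIRADS scheme, the sketch below illustrates how a nodule's point total and largest diameter map to a risk category and the standard ACR management recommendation. This is an illustrative example based on the published ACR-TIRADS guideline, not code from the study; the function name and the handling of 1-point totals are assumptions.

```python
# Illustrative sketch (not from the paper): map an ACR-TIRADS point total and
# the largest nodule diameter (cm) to a risk category and the usual ACR
# management recommendation. Thresholds follow the ACR-TIRADS guideline;
# grouping a 1-point total with TR2 is an assumption of this sketch.
def acr_tirads_management(points: int, diameter_cm: float) -> tuple[str, str]:
    """Return (TIRADS category, management recommendation)."""
    if points == 0:
        return "TR1 (benign)", "No FNA"
    if points <= 2:
        return "TR2 (not suspicious)", "No FNA"
    if points == 3:
        category, fna_cm, follow_cm = "TR3 (mildly suspicious)", 2.5, 1.5
    elif points <= 6:
        category, fna_cm, follow_cm = "TR4 (moderately suspicious)", 1.5, 1.0
    else:
        category, fna_cm, follow_cm = "TR5 (highly suspicious)", 1.0, 0.5
    if diameter_cm >= fna_cm:
        return category, "FNA recommended"
    if diameter_cm >= follow_cm:
        return category, "Follow-up ultrasound"
    return category, "No FNA or follow-up"

print(acr_tirads_management(5, 1.2))  # ('TR4 (moderately suspicious)', 'Follow-up ultrasound')
```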
Materials and methods
In this retrospective analysis, 2450 thyroid ultrasound images containing 3250 nodules were acquired from 1037 patients between 2018 and 2020 at a single institution. Our proposed automated method has four main steps: preprocessing and image augmentation, nodule detection, nodule classification based on ACR-TIRADS, and risk-level stratification and treatment management. We trained several state-of-the-art pretrained convolutional neural networks (CNNs) to select the best architecture for the detection and classification stages. We compared the performance of our method with that of three experienced radiologists.
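A minimal sketch of the classification stage is given below: a pretrained Xception backbone is fine-tuned to assign detected nodule crops to the five ACR-TIRADS categories. The directory layout, image size, and training hyperparameters are illustrative assumptions, not the authors' settings, and the detection stage (Faster R-CNN) is assumed to have produced the crops already.

```python
# Minimal sketch of the classification stage, assuming nodule crops have been
# extracted by the detector and sorted into one folder per ACR-TIRADS category
# (TR1..TR5). Paths and hyperparameters are assumptions for illustration.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (299, 299)  # Xception's native input resolution
train_ds = tf.keras.utils.image_dataset_from_directory(
    "nodule_crops/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "nodule_crops/val", image_size=IMG_SIZE, batch_size=32)

# Pretrained Xception backbone with the classifier head replaced.
base = tf.keras.applications.Xception(
    include_top=False, weights="imagenet",
    input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False  # train the new head first, then unfreeze to fine-tune

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.xception.preprocess_input(inputs)
x = base(x, training=False)
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(5, activation="softmax")(x)  # five ACR-TIRADS categories
model = models.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)

# Fine-tune: unfreeze the backbone and continue with a small learning rate.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)
```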
Results
The comparison results show that Faster R-CNN with a ResNet-101 backbone performs best in the detection stage and that the fine-tuned Xception model, when selected as the backbone of the classification stage, achieves an accuracy of 0.98, an AUC of 0.99, a precision of 0.967, and a recall of 0.912. The results demonstrated that our algorithm outperformed the three radiologists, with a mean kappa value of 0.85 across the five ACR-TIRADS categories compared with the gold standard.
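The sketch below shows how these evaluation metrics (accuracy, AUC, precision, recall) and the kappa agreement with the gold standard can be computed with scikit-learn. The random arrays stand in for the gold-standard labels and the classifier's outputs; they are placeholders, not the study's data.

```python
# Illustrative sketch of the reported evaluation metrics using scikit-learn.
# y_true, y_score, and y_pred are placeholders for the gold-standard labels,
# the classifier's softmax outputs, and the predicted ACR-TIRADS categories.
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             roc_auc_score, cohen_kappa_score)

rng = np.random.default_rng(0)
y_true = rng.integers(0, 5, size=200)          # gold-standard labels (TR1..TR5 -> 0..4)
y_score = rng.dirichlet(np.ones(5), size=200)  # per-class probabilities
y_pred = y_score.argmax(axis=1)                # predicted category

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred, average="macro", zero_division=0))
print("recall   :", recall_score(y_true, y_pred, average="macro"))
print("AUC (ovr):", roc_auc_score(y_true, y_score, multi_class="ovr"))
# Agreement with the gold standard, as used for the reader comparison.
print("kappa    :", cohen_kappa_score(y_true, y_pred))
```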
Conclusions
In addition to generating a valuable database of thyroid US images, this study demonstrates that our method can effectively improve thyroid nodule assessment and can serve as an adjunctive clinical tool to help radiologists improve efficiency, reliability, and diagnostic performance in clinical practice.