Laith R Sultan, Susan M Schultz, Theodore W Cary, Chandra M Sehgal
{"title":"机器学习提高乳腺癌多模态超声诊断。","authors":"Laith R Sultan, Susan M Schultz, Theodore W Cary, Chandra M Sehgal","doi":"10.1109/ultsym.2018.8579953","DOIUrl":null,"url":null,"abstract":"<p><p>Despite major advances in breast cancer imaging there is compelling need to reduce unnecessary biopsies by improving characterization of breast lesions. This study demonstrates the use of machine learning to enhance breast cancer diagnosis with multimodal ultrasound. Surgically proven solid breast lesions were studied using quantitative features extracted from grayscale and Doppler ultrasound images. Statistically different features from the logistic regression classifier were used train and test lesion differentiation by leave-one-out cross-validation. The area under the ROC curve (AUC) of the grayscale morphologic features was 0.85 (sensitivity = 87, specificity = 69). The diagnostic performance improved (AUC = 0.89, sensitivity = 79, specificity = 89) when Doppler features were added to the analysis. Reliability of the individual training cycles of leave-one-out cross-validation was tested by measuring dispersion from the mean model. Significant dispersion from the mean, representing weak learning, was observed in 11.3% of cases. Pruning the high-dispersion cases improved the diagnostic performance markedly (AUC 0.96, sensitivity = 92, specificity = 95). These results demonstrate the effectiveness of dispersion to identify weakly learned cases. In conclusion, machine learning with multimodal ultrasound including grayscale and Doppler can achieve high performance for breast cancer diagnosis, comparable to that of human observers. Identifying weakly learned cases can markedly enhance diagnosis.</p>","PeriodicalId":73288,"journal":{"name":"IEEE International Ultrasonics Symposium : [proceedings]. IEEE International Ultrasonics Symposium","volume":"2018 ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/ultsym.2018.8579953","citationCount":"19","resultStr":"{\"title\":\"Machine learning to improve breast cancer diagnosis by multimodal ultrasound.\",\"authors\":\"Laith R Sultan, Susan M Schultz, Theodore W Cary, Chandra M Sehgal\",\"doi\":\"10.1109/ultsym.2018.8579953\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Despite major advances in breast cancer imaging there is compelling need to reduce unnecessary biopsies by improving characterization of breast lesions. This study demonstrates the use of machine learning to enhance breast cancer diagnosis with multimodal ultrasound. Surgically proven solid breast lesions were studied using quantitative features extracted from grayscale and Doppler ultrasound images. Statistically different features from the logistic regression classifier were used train and test lesion differentiation by leave-one-out cross-validation. The area under the ROC curve (AUC) of the grayscale morphologic features was 0.85 (sensitivity = 87, specificity = 69). The diagnostic performance improved (AUC = 0.89, sensitivity = 79, specificity = 89) when Doppler features were added to the analysis. Reliability of the individual training cycles of leave-one-out cross-validation was tested by measuring dispersion from the mean model. Significant dispersion from the mean, representing weak learning, was observed in 11.3% of cases. 
Pruning the high-dispersion cases improved the diagnostic performance markedly (AUC 0.96, sensitivity = 92, specificity = 95). These results demonstrate the effectiveness of dispersion to identify weakly learned cases. In conclusion, machine learning with multimodal ultrasound including grayscale and Doppler can achieve high performance for breast cancer diagnosis, comparable to that of human observers. Identifying weakly learned cases can markedly enhance diagnosis.</p>\",\"PeriodicalId\":73288,\"journal\":{\"name\":\"IEEE International Ultrasonics Symposium : [proceedings]. IEEE International Ultrasonics Symposium\",\"volume\":\"2018 \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1109/ultsym.2018.8579953\",\"citationCount\":\"19\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE International Ultrasonics Symposium : [proceedings]. IEEE International Ultrasonics Symposium\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ultsym.2018.8579953\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2018/12/20 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE International Ultrasonics Symposium : [proceedings]. IEEE International Ultrasonics Symposium","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ultsym.2018.8579953","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2018/12/20 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Machine learning to improve breast cancer diagnosis by multimodal ultrasound.
Despite major advances in breast cancer imaging, there is a compelling need to reduce unnecessary biopsies by improving the characterization of breast lesions. This study demonstrates the use of machine learning to enhance breast cancer diagnosis with multimodal ultrasound. Surgically proven solid breast lesions were studied using quantitative features extracted from grayscale and Doppler ultrasound images. Statistically significant features were used with a logistic regression classifier to train and test lesion differentiation by leave-one-out cross-validation. The area under the ROC curve (AUC) for the grayscale morphologic features was 0.85 (sensitivity = 87%, specificity = 69%). The diagnostic performance improved (AUC = 0.89, sensitivity = 79%, specificity = 89%) when Doppler features were added to the analysis. The reliability of the individual training cycles of leave-one-out cross-validation was tested by measuring each cycle's dispersion from the mean model. Significant dispersion from the mean, representing weak learning, was observed in 11.3% of cases. Pruning the high-dispersion cases improved the diagnostic performance markedly (AUC = 0.96, sensitivity = 92%, specificity = 95%). These results demonstrate the effectiveness of dispersion as a measure for identifying weakly learned cases. In conclusion, machine learning with multimodal ultrasound, including grayscale and Doppler imaging, can achieve high performance for breast cancer diagnosis, comparable to that of human observers. Identifying weakly learned cases can markedly enhance diagnosis.
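To make the described workflow concrete, the following is a minimal sketch, not the authors' implementation: logistic regression evaluated by leave-one-out cross-validation, followed by pruning of cases whose per-fold model disperses strongly from the mean model. The synthetic placeholder data, the Euclidean coefficient-distance measure of dispersion, and the pruning threshold are all illustrative assumptions chosen to mirror the abstract.

```python
# Minimal sketch (assumptions noted above), not the authors' code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_cases, n_features = 100, 6             # placeholder: quantitative grayscale/Doppler features
X = rng.normal(size=(n_cases, n_features))
y = rng.integers(0, 2, size=n_cases)      # placeholder labels: 0 = benign, 1 = malignant

loo = LeaveOneOut()
scores = np.zeros(n_cases)                # per-case predicted probability of malignancy
coefs = np.zeros((n_cases, n_features))   # coefficients of each fold's model

for fold, (train_idx, test_idx) in enumerate(loo.split(X)):
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X[train_idx], y[train_idx])
    scores[test_idx] = clf.predict_proba(X[test_idx])[:, 1]
    coefs[fold] = clf.coef_[0]

print("LOOCV AUC (all cases):", roc_auc_score(y, scores))

# Dispersion of each fold's model from the mean model; folds that deviate
# strongly are treated as weakly learned and their held-out cases are pruned.
mean_coef = coefs.mean(axis=0)
dispersion = np.linalg.norm(coefs - mean_coef, axis=1)
keep = dispersion <= np.percentile(dispersion, 88.7)  # prune ~11.3%, mirroring the abstract

print("LOOCV AUC (pruned):", roc_auc_score(y[keep], scores[keep]))
```

Under these assumptions, dispersion is the Euclidean distance between a fold's coefficient vector and the mean coefficient vector across all folds; because each leave-one-out fold holds out exactly one case, a high-dispersion fold flags its held-out case as weakly learned. The paper itself may define dispersion and the pruning criterion differently.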