Yiheng Ju, Longbo Zheng, Peng Zhao, Fangjie Xin, Fengjiao Wang, Yuan Gao, Xianxiang Zhang, Dongsheng Wang, Yun Lu
{"title":"基于大全景病理切片的人工智能识别直肠癌病理T期及肿瘤侵袭","authors":"Yiheng Ju , Longbo Zheng , Peng Zhao , Fangjie Xin , Fengjiao Wang , Yuan Gao , Xianxiang Zhang , Dongsheng Wang , Yun Lu","doi":"10.1016/j.imed.2022.03.004","DOIUrl":null,"url":null,"abstract":"<div><h3><strong><em>Background</em></strong></h3><p>The incidence of colorectal cancer is increasing worldwide, and it currently ranks third among all cancers. Moreover, pathological diagnosis is becoming increasingly arduous. Artificial intelligence has demonstrated the ability to fully excavate image features and assist doctors in making decisions. Large panoramic pathological sections contain considerable amounts of pathological information. In this study, we used large panoramic pathological sections to establish a deep learning model to assist pathologists in identifying cancerous areas on whole-slide images of rectal cancer, as well as for T staging and prognostic analysis.</p></div><div><h3><em><strong>Methods</strong></em></h3><p>We collected 126 cases of primary rectal cancer from the Affiliated Hospital of Qingdao University West Coast Hospital District (internal dataset) and 42 cases from Shinan and Laoshan Hospital District (external dataset) that had tissue surgically removed from January to September 2019. After sectioning, staining, and scanning, a total of 2350 hematoxylin-eosin-stained whole-slide images were obtained. The patients in the internal dataset were randomly divided into a training cohort (<em>n =</em>88 ) and a test cohort (<em>n</em> =38 ) at a ratio of 7:3. We chose DeepLabV3+ and ResNet50 as target models for our experiment. We used the Dice similarity coefficient, accuracy, sensitivity, specificity, receiver operating characteristic (ROC) curve, and area under the curve (AUC) to evaluate the performance of the artificial intelligence platform in the test set and validation set. Finally, we followed up patients and examined their prognosis and short-term survival to corroborate the value of T-staging investigations.</p></div><div><h3><em><strong>Results</strong></em></h3><p>In the test set, the accuracy of image segmentation was 95.8%, the Dice coefficient was 0.92, the accuracy of automatic T-staging recognition was 86%, and the ROC AUC value was 0.93. In the validation set, the accuracy of image segmentation was 95.3%, the Dice coefficient was 0.90, the accuracy of automatic classification was 85%, the ROC AUC value was 0.92, and the image analysis time was 0.2 s. There was a difference in survival in patients with local recurrence or distant metastasis as the outcome at follow-up. Univariate analysis showed that T stage, N stage, preoperative carcinoembryonic antigen (CEA) level, and tumor location were risk factors for postoperative recurrence or metastasis in patients with rectal cancer. When these factors were included in a multivariate analysis, only preoperative CEA level and N stage showed significant differences.</p></div><div><h3><em><strong>Conclusion</strong></em></h3><p>The deep convolutional neural networks we have establish can assist clinicians in making decisions of T-stage judgment and improve diagnostic efficiency. 
Using large panoramic pathological sections enables better judgment of the condition of tumors and accurate pathological diagnoses, which has certain clinical application value.</p></div>","PeriodicalId":73400,"journal":{"name":"Intelligent medicine","volume":"2 3","pages":"Pages 141-151"},"PeriodicalIF":4.4000,"publicationDate":"2022-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2667102622000109/pdfft?md5=86b599ef05e61c30eaa22298216678a0&pid=1-s2.0-S2667102622000109-main.pdf","citationCount":"1","resultStr":"{\"title\":\"Artificial intelligence recognition of pathological T stage and tumor invasion in rectal cancer based on large panoramic pathological sections\",\"authors\":\"Yiheng Ju , Longbo Zheng , Peng Zhao , Fangjie Xin , Fengjiao Wang , Yuan Gao , Xianxiang Zhang , Dongsheng Wang , Yun Lu\",\"doi\":\"10.1016/j.imed.2022.03.004\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3><strong><em>Background</em></strong></h3><p>The incidence of colorectal cancer is increasing worldwide, and it currently ranks third among all cancers. Moreover, pathological diagnosis is becoming increasingly arduous. Artificial intelligence has demonstrated the ability to fully excavate image features and assist doctors in making decisions. Large panoramic pathological sections contain considerable amounts of pathological information. In this study, we used large panoramic pathological sections to establish a deep learning model to assist pathologists in identifying cancerous areas on whole-slide images of rectal cancer, as well as for T staging and prognostic analysis.</p></div><div><h3><em><strong>Methods</strong></em></h3><p>We collected 126 cases of primary rectal cancer from the Affiliated Hospital of Qingdao University West Coast Hospital District (internal dataset) and 42 cases from Shinan and Laoshan Hospital District (external dataset) that had tissue surgically removed from January to September 2019. After sectioning, staining, and scanning, a total of 2350 hematoxylin-eosin-stained whole-slide images were obtained. The patients in the internal dataset were randomly divided into a training cohort (<em>n =</em>88 ) and a test cohort (<em>n</em> =38 ) at a ratio of 7:3. We chose DeepLabV3+ and ResNet50 as target models for our experiment. We used the Dice similarity coefficient, accuracy, sensitivity, specificity, receiver operating characteristic (ROC) curve, and area under the curve (AUC) to evaluate the performance of the artificial intelligence platform in the test set and validation set. Finally, we followed up patients and examined their prognosis and short-term survival to corroborate the value of T-staging investigations.</p></div><div><h3><em><strong>Results</strong></em></h3><p>In the test set, the accuracy of image segmentation was 95.8%, the Dice coefficient was 0.92, the accuracy of automatic T-staging recognition was 86%, and the ROC AUC value was 0.93. In the validation set, the accuracy of image segmentation was 95.3%, the Dice coefficient was 0.90, the accuracy of automatic classification was 85%, the ROC AUC value was 0.92, and the image analysis time was 0.2 s. There was a difference in survival in patients with local recurrence or distant metastasis as the outcome at follow-up. 
Univariate analysis showed that T stage, N stage, preoperative carcinoembryonic antigen (CEA) level, and tumor location were risk factors for postoperative recurrence or metastasis in patients with rectal cancer. When these factors were included in a multivariate analysis, only preoperative CEA level and N stage showed significant differences.</p></div><div><h3><em><strong>Conclusion</strong></em></h3><p>The deep convolutional neural networks we have establish can assist clinicians in making decisions of T-stage judgment and improve diagnostic efficiency. Using large panoramic pathological sections enables better judgment of the condition of tumors and accurate pathological diagnoses, which has certain clinical application value.</p></div>\",\"PeriodicalId\":73400,\"journal\":{\"name\":\"Intelligent medicine\",\"volume\":\"2 3\",\"pages\":\"Pages 141-151\"},\"PeriodicalIF\":4.4000,\"publicationDate\":\"2022-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S2667102622000109/pdfft?md5=86b599ef05e61c30eaa22298216678a0&pid=1-s2.0-S2667102622000109-main.pdf\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Intelligent medicine\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2667102622000109\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intelligent medicine","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2667102622000109","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Artificial intelligence recognition of pathological T stage and tumor invasion in rectal cancer based on large panoramic pathological sections
Background
The incidence of colorectal cancer is increasing worldwide, and it currently ranks third in incidence among all cancers; the pathological diagnostic workload is therefore becoming increasingly demanding. Artificial intelligence has demonstrated the ability to extract image features comprehensively and assist doctors in making decisions. Large panoramic pathological sections contain considerable amounts of pathological information. In this study, we used large panoramic pathological sections to establish a deep learning model to assist pathologists in identifying cancerous areas on whole-slide images of rectal cancer, as well as for T staging and prognostic analysis.
Methods
We collected 126 cases of primary rectal cancer surgically resected between January and September 2019 at the Affiliated Hospital of Qingdao University West Coast Hospital District (internal dataset) and 42 cases from the Shinan and Laoshan Hospital Districts (external dataset). After sectioning, staining, and scanning, a total of 2350 hematoxylin-eosin-stained whole-slide images were obtained. The patients in the internal dataset were randomly divided into a training cohort (n = 88) and a test cohort (n = 38) at a ratio of 7:3. We chose DeepLabV3+ and ResNet50 as the target models for our experiments. We used the Dice similarity coefficient, accuracy, sensitivity, specificity, the receiver operating characteristic (ROC) curve, and the area under the curve (AUC) to evaluate the performance of the artificial intelligence platform on the test set and validation set. Finally, we followed up the patients and examined their prognosis and short-term survival to corroborate the value of T-staging investigations.
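To make the segmentation metrics concrete, the following is a minimal sketch of how the Dice similarity coefficient and pixel accuracy could be computed for a binary tumor mask; the array names, mask shapes, and the 0.5 threshold are illustrative assumptions and do not represent the authors' actual pipeline.

```python
# Illustrative sketch only: Dice coefficient and pixel accuracy for a binary
# tumor-segmentation mask. The arrays and threshold below are hypothetical
# stand-ins, not the study's data or code.
import numpy as np

def dice_coefficient(pred_mask: np.ndarray, gt_mask: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2 * |P ∩ G| / (|P| + |G|) for binary masks."""
    pred, gt = pred_mask.astype(bool), gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + gt.sum() + eps))

def pixel_accuracy(pred_mask: np.ndarray, gt_mask: np.ndarray) -> float:
    """Fraction of pixels on which prediction and ground truth agree."""
    return float((pred_mask.astype(bool) == gt_mask.astype(bool)).mean())

# A segmentation network such as DeepLabV3+ outputs per-pixel probabilities;
# thresholding at 0.5 yields the binary mask evaluated by these metrics.
probs = np.random.rand(512, 512)              # stand-in for model output
pred_mask = probs > 0.5
gt_mask = np.zeros((512, 512), dtype=bool)    # stand-in for the annotated mask
gt_mask[100:300, 150:350] = True

print(f"Dice: {dice_coefficient(pred_mask, gt_mask):.3f}, "
      f"pixel accuracy: {pixel_accuracy(pred_mask, gt_mask):.3f}")
```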
Results
In the test set, the accuracy of image segmentation was 95.8%, the Dice coefficient was 0.92, the accuracy of automatic T-staging recognition was 86%, and the ROC AUC was 0.93. In the validation set, the accuracy of image segmentation was 95.3%, the Dice coefficient was 0.90, the accuracy of automatic classification was 85%, the ROC AUC was 0.92, and the image analysis time was 0.2 s. Survival differed among patients when local recurrence or distant metastasis was taken as the outcome at follow-up. Univariate analysis showed that T stage, N stage, preoperative carcinoembryonic antigen (CEA) level, and tumor location were risk factors for postoperative recurrence or metastasis in patients with rectal cancer. When these factors were included in a multivariate analysis, only preoperative CEA level and N stage remained statistically significant.
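As an illustration of the type of univariate and multivariate survival analysis referred to above, the sketch below fits a Cox proportional-hazards model with the lifelines library on synthetic data; the covariate names, simulated values, and follow-up times are assumptions for demonstration only and are not drawn from the study cohort.

```python
# Hypothetical sketch: multivariable Cox regression for recurrence/metastasis-
# free survival. All data below are simulated; they do not represent the
# patients described in this study.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "t_stage":   rng.integers(1, 5, n),                    # pathological T stage (1-4)
    "n_stage":   rng.integers(0, 3, n),                    # pathological N stage (0-2)
    "cea":       rng.gamma(shape=2.0, scale=3.0, size=n),  # preoperative CEA (ng/mL)
    "low_tumor": rng.integers(0, 2, n),                    # 1 = low rectal tumor
})
# Simulated follow-up time: higher N stage and CEA shorten recurrence-free time.
risk = 0.4 * df["n_stage"] + 0.08 * df["cea"]
df["time_months"] = rng.exponential(scale=36.0 / np.exp(risk))
df["event"] = rng.integers(0, 2, n)                        # 1 = recurrence or metastasis

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="event")
cph.print_summary()  # hazard ratio, 95% CI, and p-value for each covariate
```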
Conclusion
The deep convolutional neural networks we established can assist clinicians in T-stage assessment and improve diagnostic efficiency. Using large panoramic pathological sections enables better evaluation of tumor status and more accurate pathological diagnosis, which has certain clinical application value.