Artificial intelligence in traumatology.
Rosmarie Breu, Carolina Avelar, Zsolt Bertalan, Johannes Grillari, Heinz Redl, Richard Ljuhar, Stefan Quadlbauer, Thomas Hausner
Bone & Joint Research, 13(10): 588-595. Published 17 October 2024. DOI: 10.1302/2046-3758.1310.BJR-2023-0275.R3. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11484119/pdf/
Aims: The aim of this study was to create artificial intelligence (AI) software that provides physicians with a second opinion to support distal radius fracture (DRF) detection, and to compare the fracture detection accuracy of physicians with and without software support.
Methods: The dataset consisted of 26,121 anonymized anterior-posterior (AP) and lateral standard view radiographs of the wrist, with and without DRF. The convolutional neural network (CNN) model was trained to detect the presence of a DRF by comparing the radiographs containing a fracture with the unremarkable, fracture-free ones. A total of 11 physicians (six surgeons in training and five hand surgeons) assessed 200 pairs of randomly selected digital radiographs of the wrist (AP and lateral) for the presence of a DRF. The same images were first evaluated without, and then with, the support of the CNN model, and the diagnostic accuracy of the two methods was compared.
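The abstract does not specify the CNN architecture or training setup. As a purely illustrative sketch of the kind of binary fracture classifier described here, the following assumes a pretrained ResNet-18 backbone with a single-logit head and a binary cross-entropy objective; all names and the dummy batch are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a binary DRF classifier (not the study's actual model).
import torch
import torch.nn as nn
from torchvision import models

class FractureDetector(nn.Module):
    def __init__(self):
        super().__init__()
        # Pretrained ImageNet backbone, re-headed for one output logit
        # (probability of "fracture present" after a sigmoid).
        self.backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):
        return self.backbone(x)  # raw logit; apply sigmoid for a probability

model = FractureDetector()
criterion = nn.BCEWithLogitsLoss()  # binary label: 1 = DRF, 0 = no fracture
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a placeholder batch of wrist radiographs.
images = torch.randn(8, 3, 224, 224)          # dummy AP/lateral image tensors
labels = torch.randint(0, 2, (8, 1)).float()  # dummy fracture labels
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```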
Results: At the time of the study, the CNN model showed an area under the receiver operating characteristic (ROC) curve of 0.97. AI assistance improved the physicians' sensitivity (correct fracture detection) from 80% to 87%, and the specificity (correct fracture exclusion) from 91% to 95%. The overall error rate (combined false positive and false negative) was reduced from 14% without AI to 9% with AI.
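For readers unfamiliar with these metrics, the sketch below shows how AUROC, sensitivity, specificity, and the combined error rate are typically computed from case-level labels and scores; the arrays and the 0.5 decision threshold are placeholders, not the study data.

```python
# Illustrative metric computation with placeholder values (not the study data).
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

y_true = np.array([1, 1, 0, 0, 1, 0, 1, 0])                    # 1 = DRF present
y_score = np.array([0.9, 0.8, 0.2, 0.4, 0.7, 0.1, 0.3, 0.05])  # model probability
y_pred = (y_score >= 0.5).astype(int)                           # assumed threshold

auroc = roc_auc_score(y_true, y_score)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)          # correct fracture detection
specificity = tn / (tn + fp)          # correct fracture exclusion
error_rate = (fp + fn) / len(y_true)  # combined false positives and negatives
print(auroc, sensitivity, specificity, error_rate)
```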
Conclusion: The use of a CNN model as a second opinion can improve the diagnostic accuracy of DRF detection in the study setting.