{"title":"基于人工智能的一期鼻窦提升中移植物材料自动分割的回顾性研究。","authors":"Yue Xi, Xiaoxia Li, Zhikang Wang, Chuanji Shi, Xiaoru Qin, Qifeng Jiang, Guoli Yang","doi":"10.1111/cid.13426","DOIUrl":null,"url":null,"abstract":"<p><strong>Objectives: </strong>Accurate assessment of postoperative bone graft material changes after the 1-stage sinus lift is crucial for evaluating long-term implant survival. However, traditional manual labeling and segmentation of cone-beam computed tomography (CBCT) images are often inaccurate and inefficient. This study aims to utilize artificial intelligence for automated segmentation of graft material in 1-stage sinus lift procedures to enhance accuracy and efficiency.</p><p><strong>Materials and methods: </strong>Swin-UPerNet along with mainstream medical segmentation models, such as FCN, U-Net, DeepLabV3, SegFormer, and UPerNet, were trained using a dataset of 120 CBCT scans. The models were tested on 30 CBCT scans to evaluate model performance based on metrics including the 95% Hausdorff distance, Intersection over Union (IoU), and Dice similarity coefficient. Additionally, processing times were also compared between automated segmentation and manual methods.</p><p><strong>Results: </strong>Swin-UPerNet outperformed other models in accuracy, achieving an accuracy rate of 0.84 and mean precision and IoU values of 0.8574 and 0.7373, respectively (p < 0.05). The time required for uploading and visualizing segmentation results with Swin-UPerNet significantly decreased to 19.28 s from the average manual segmentation times of 1390 s (p < 0.001).</p><p><strong>Conclusions: </strong>Swin-UPerNet exhibited high accuracy and efficiency in identifying and segmenting the three-dimensional volume of bone graft material, indicating significant potential for evaluating the stability of bone graft material.</p>","PeriodicalId":93944,"journal":{"name":"Clinical implant dentistry and related research","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Automated Segmentation of Graft Material in 1-Stage Sinus Lift Based on Artificial Intelligence: A Retrospective Study.\",\"authors\":\"Yue Xi, Xiaoxia Li, Zhikang Wang, Chuanji Shi, Xiaoru Qin, Qifeng Jiang, Guoli Yang\",\"doi\":\"10.1111/cid.13426\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Objectives: </strong>Accurate assessment of postoperative bone graft material changes after the 1-stage sinus lift is crucial for evaluating long-term implant survival. However, traditional manual labeling and segmentation of cone-beam computed tomography (CBCT) images are often inaccurate and inefficient. This study aims to utilize artificial intelligence for automated segmentation of graft material in 1-stage sinus lift procedures to enhance accuracy and efficiency.</p><p><strong>Materials and methods: </strong>Swin-UPerNet along with mainstream medical segmentation models, such as FCN, U-Net, DeepLabV3, SegFormer, and UPerNet, were trained using a dataset of 120 CBCT scans. The models were tested on 30 CBCT scans to evaluate model performance based on metrics including the 95% Hausdorff distance, Intersection over Union (IoU), and Dice similarity coefficient. 
Additionally, processing times were also compared between automated segmentation and manual methods.</p><p><strong>Results: </strong>Swin-UPerNet outperformed other models in accuracy, achieving an accuracy rate of 0.84 and mean precision and IoU values of 0.8574 and 0.7373, respectively (p < 0.05). The time required for uploading and visualizing segmentation results with Swin-UPerNet significantly decreased to 19.28 s from the average manual segmentation times of 1390 s (p < 0.001).</p><p><strong>Conclusions: </strong>Swin-UPerNet exhibited high accuracy and efficiency in identifying and segmenting the three-dimensional volume of bone graft material, indicating significant potential for evaluating the stability of bone graft material.</p>\",\"PeriodicalId\":93944,\"journal\":{\"name\":\"Clinical implant dentistry and related research\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-12-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Clinical implant dentistry and related research\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1111/cid.13426\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Clinical implant dentistry and related research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1111/cid.13426","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Automated Segmentation of Graft Material in 1-Stage Sinus Lift Based on Artificial Intelligence: A Retrospective Study.
Objectives: Accurate assessment of bone graft material changes after a 1-stage sinus lift is crucial for evaluating long-term implant survival. However, traditional manual labeling and segmentation of cone-beam computed tomography (CBCT) images are often inaccurate and inefficient. This study aims to use artificial intelligence to automatically segment graft material in 1-stage sinus lift procedures and thereby improve accuracy and efficiency.
Materials and methods: Swin-UPerNet, along with mainstream medical image segmentation models such as FCN, U-Net, DeepLabV3, SegFormer, and UPerNet, was trained on a dataset of 120 CBCT scans. The models were tested on 30 CBCT scans, and performance was evaluated using the 95% Hausdorff distance, Intersection over Union (IoU), and Dice similarity coefficient. Processing times of the automated segmentation and the manual method were also compared.
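For readers unfamiliar with these evaluation metrics, the sketch below illustrates how the Dice similarity coefficient, IoU, and 95% Hausdorff distance are commonly computed for binary 3D segmentation masks. This is a generic, hypothetical illustration (function names, voxel-spacing parameter, and library choices are assumptions), not the authors' evaluation code.

```python
# Minimal sketch of the reported metrics for boolean segmentation masks.
# Assumed helpers and parameter names are illustrative, not from the study.
import numpy as np
from scipy.ndimage import binary_erosion
from scipy.spatial import cKDTree


def dice_and_iou(pred: np.ndarray, gt: np.ndarray) -> tuple[float, float]:
    """Dice similarity coefficient and IoU for two masks of the same shape."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum())
    iou = inter / np.logical_or(pred, gt).sum()
    return float(dice), float(iou)


def hausdorff_95(pred: np.ndarray, gt: np.ndarray,
                 spacing=(1.0, 1.0, 1.0)) -> float:
    """95th-percentile symmetric surface distance (HD95) in physical units."""
    def surface_points(mask: np.ndarray) -> np.ndarray:
        mask = mask.astype(bool)
        border = mask & ~binary_erosion(mask)      # boundary voxels only
        return np.argwhere(border) * np.asarray(spacing)

    sp, sg = surface_points(pred), surface_points(gt)
    d_pred_to_gt = cKDTree(sg).query(sp)[0]        # each pred-surface voxel -> nearest gt surface
    d_gt_to_pred = cKDTree(sp).query(sg)[0]        # each gt-surface voxel -> nearest pred surface
    return float(np.percentile(np.concatenate([d_pred_to_gt, d_gt_to_pred]), 95))
```

Taking the 95th percentile rather than the maximum surface distance makes the metric less sensitive to isolated outlier voxels, which is why HD95 is preferred over the plain Hausdorff distance in most medical segmentation benchmarks.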
Results: Swin-UPerNet outperformed the other models in accuracy, achieving an accuracy of 0.84 and mean precision and IoU values of 0.8574 and 0.7373, respectively (p < 0.05). The time required to upload and visualize segmentation results with Swin-UPerNet decreased significantly to 19.28 s, compared with an average manual segmentation time of 1390 s (p < 0.001).
Conclusions: Swin-UPerNet identified and segmented the three-dimensional volume of bone graft material with high accuracy and efficiency, indicating significant potential for evaluating graft material stability.