Classification in Virtual Temporal Bone Surgical Education: A First Step Towards Automated Virtual Education With Use of Machine Learning
Arjun Maini, Justyn Pisa, Mina Davari, Bert Unger, Jordan Hochman
Laryngoscope Investigative Otolaryngology, 10(4), 2025. DOI: 10.1002/lio2.70188
https://onlinelibrary.wiley.com/doi/10.1002/lio2.70188
Abstract
Objective
Simulation-based surgical training is now standard in residency education, aided by tools such as 3D-printed models and virtual and augmented reality environments. Autonomous education using machine learning is an emerging necessity owing to resident work-hour limitations and limited staff availability. An essential first step toward providing automated feedback during simulated surgery is the development of a tool that classifies surgical technique. Distinctive hand motion and drilling patterns can be used to assess trainee proficiency during complex temporal bone surgery (TBS).
This article describes the development of a software classifier model for automated assessment of surgical performance, based on drill trajectory and hand motion tracking recorded during simulated TBS on 3D-printed models.
Methods
This was a Research Ethics Board (REB)-approved prospective experimental study in which a classifier was developed to provide automatic assessment of surgical performance based on drill trajectory and hand motion tracking. Four expert participants (two otologic surgeons and two PGY5 surgery residents) and four novice participants (PGY1-3 surgery residents) dissected 3D-printed temporal bone models. Hand and drill motion data were collected from each participant and analyzed for similarities and variations between participants, and a supervised classification model was developed to predict the level of expertise (expert or novice).
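As a rough illustration of the supervised classification step, the sketch below trains a classifier on per-stroke motion features and evaluates it with leave-one-participant-out cross-validation. The feature set, file names, choice of a random-forest model, and evaluation scheme are assumptions for illustration only; the abstract does not specify them.

```python
# Illustrative sketch only: feature names, model choice, and file names are
# assumptions, not the study's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Hypothetical per-stroke feature matrix: one row per detected drill stroke.
# Columns might include stroke duration, path length, mean drill velocity,
# and variance of drill-tip acceleration.
X = np.load("stroke_features.npy")        # shape: (n_strokes, n_features)
y = np.load("labels.npy")                 # 1 = expert, 0 = novice
groups = np.load("participant_ids.npy")   # participant id per stroke (8 subjects)

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Leave-one-participant-out cross-validation keeps each subject's strokes
# out of the folds used to evaluate them, avoiding identity leakage.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Mean accuracy across held-out participants: {scores.mean():.3f}")
```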
Results
The automated stroke detection algorithm achieved precision of 80.2%, 82.7%, and 84.8% for stroke detection and classification during cortical mastoidectomy (CM), thinning procedures (TP), and facial recess exposure (FRE), respectively. The classifier predicted the level of expertise with an accuracy of 92.8% and a sensitivity of 87.5%.
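For readers unfamiliar with these metrics, the snippet below shows how accuracy and sensitivity follow from a binary confusion matrix. The prediction counts are hypothetical and chosen only to illustrate the formulas; they are not the study's data.

```python
# Hypothetical labels chosen only to illustrate the accuracy and sensitivity
# formulas; not taken from the study.
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # 1 = expert, 0 = novice
y_pred = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)   # fraction of all correct predictions
sensitivity = tp / (tp + fn)                 # true-positive rate (expert recall)
print(f"accuracy={accuracy:.3f}, sensitivity={sensitivity:.3f}")
```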
Conclusion
A temporal bone classifier can be developed with a high degree of accuracy as an initial stage towards an autonomous training paradigm.
Level of Evidence
IV.