Nikolay Tonchev, Giulia Renieri, Klaus-Peter Stein, Belal Neyazi, Max Willgerodt, Hagen Thieme, I Erol Sandalcioglu, Karl Hartmann
{"title":"增强现实和光学导航辅助眼眶手术:一种新的集成工作流程。","authors":"Nikolay Tonchev, Giulia Renieri, Klaus-Peter Stein, Belal Neyazi, Max Willgerodt, Hagen Thieme, I Erol Sandalcioglu, Karl Hartmann","doi":"10.1515/iss-2023-0064","DOIUrl":null,"url":null,"abstract":"<p><strong>Objectives: </strong>Due to the close topographical relationship of functional relevant anatomic structures, limited space and cosmetic aspects orbital surgery will remain a challenging discipline. Therefore, novel technical capabilities are necessary for further surgical progress. We here tested the integration of augmented reality and optical navigation in one workflow for interdisciplinary decision-making, feasibility and intraoperative guidance.</p><p><strong>Methods: </strong>High-resolution contrast-enhanced MRI and CT scans were automated and manual-assisted segmented to achieve a detailed three-dimensional (3D) model of the individual patho-anatomical relationships. Augmented reality was used for interdisciplinary preoperative planning and intraoperative intuitive navigation. Mayfield clamp head holder in combination with optical surface matching registration assured navigation assisted microsurgery.</p><p><strong>Results: </strong>Combinations of different MRI-sequences and CT-scans were necessary for detailed 3D-modeling. Modeling was time consuming and only viable in the hands of medical, surgical and anatomical trained staff. Augmented reality assured a quick, intuitive interdisciplinary orientation. Intraoperative surface matching registration enabled precise navigation in the orbital space.</p><p><strong>Conclusions: </strong>Optical Navigation and microscope integration achieved a straightforward microsurgical workflow and should be implemented regularly. Augmented reality represented a useful tool for preoperative interdisciplinary planning and intraoperative intuitive orientation. 
It further stated an excellent educational tool.</p>","PeriodicalId":44186,"journal":{"name":"Innovative Surgical Sciences","volume":"10 2","pages":"91-98"},"PeriodicalIF":1.2000,"publicationDate":"2024-07-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12327852/pdf/","citationCount":"0","resultStr":"{\"title\":\"Augmented reality and optical navigation assisted orbital surgery: a novel integrated workflow.\",\"authors\":\"Nikolay Tonchev, Giulia Renieri, Klaus-Peter Stein, Belal Neyazi, Max Willgerodt, Hagen Thieme, I Erol Sandalcioglu, Karl Hartmann\",\"doi\":\"10.1515/iss-2023-0064\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Objectives: </strong>Due to the close topographical relationship of functional relevant anatomic structures, limited space and cosmetic aspects orbital surgery will remain a challenging discipline. Therefore, novel technical capabilities are necessary for further surgical progress. We here tested the integration of augmented reality and optical navigation in one workflow for interdisciplinary decision-making, feasibility and intraoperative guidance.</p><p><strong>Methods: </strong>High-resolution contrast-enhanced MRI and CT scans were automated and manual-assisted segmented to achieve a detailed three-dimensional (3D) model of the individual patho-anatomical relationships. Augmented reality was used for interdisciplinary preoperative planning and intraoperative intuitive navigation. Mayfield clamp head holder in combination with optical surface matching registration assured navigation assisted microsurgery.</p><p><strong>Results: </strong>Combinations of different MRI-sequences and CT-scans were necessary for detailed 3D-modeling. Modeling was time consuming and only viable in the hands of medical, surgical and anatomical trained staff. Augmented reality assured a quick, intuitive interdisciplinary orientation. 
Intraoperative surface matching registration enabled precise navigation in the orbital space.</p><p><strong>Conclusions: </strong>Optical Navigation and microscope integration achieved a straightforward microsurgical workflow and should be implemented regularly. Augmented reality represented a useful tool for preoperative interdisciplinary planning and intraoperative intuitive orientation. It further stated an excellent educational tool.</p>\",\"PeriodicalId\":44186,\"journal\":{\"name\":\"Innovative Surgical Sciences\",\"volume\":\"10 2\",\"pages\":\"91-98\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2024-07-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12327852/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Innovative Surgical Sciences\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1515/iss-2023-0064\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/6/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q2\",\"JCRName\":\"SURGERY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Innovative Surgical Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1515/iss-2023-0064","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/6/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"SURGERY","Score":null,"Total":0}
Augmented reality and optical navigation assisted orbital surgery: a novel integrated workflow.
Objectives: Due to the close topographical relationship of functionally relevant anatomic structures, the limited space, and cosmetic considerations, orbital surgery remains a challenging discipline. Novel technical capabilities are therefore necessary for further surgical progress. Here, we tested the integration of augmented reality and optical navigation into a single workflow for interdisciplinary decision-making, feasibility assessment, and intraoperative guidance.
Methods: High-resolution contrast-enhanced MRI and CT scans were segmented automatically with manual assistance to generate a detailed three-dimensional (3D) model of the individual patho-anatomical relationships. Augmented reality was used for interdisciplinary preoperative planning and intuitive intraoperative navigation. A Mayfield head clamp in combination with optical surface-matching registration enabled navigation-assisted microsurgery.
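Surface-matching registration, as mentioned above, ultimately reduces to estimating the rigid transform (rotation plus translation) that maps points sampled from the patient's surface onto the preoperative 3D model. The paper does not describe its algorithm; the sketch below is purely illustrative and shows the Kabsch alignment step for already-matched point pairs, using NumPy:

```python
import numpy as np

def kabsch_align(source, target):
    """Estimate the rigid transform (R, t) that best maps each row of
    `source` onto the corresponding row of `target` (least squares)."""
    src_c = source.mean(axis=0)           # centroids
    tgt_c = target.mean(axis=0)
    A = source - src_c                    # centered point clouds
    B = target - tgt_c
    H = A.T @ B                           # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])            # guard against reflections
    R = Vt.T @ D @ U.T                    # optimal rotation
    t = tgt_c - R @ src_c                 # optimal translation
    return R, t

# Toy check: recover a known rotation and translation.
rng = np.random.default_rng(0)
pts = rng.random((50, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.3])
moved = pts @ R_true.T + t_true
R, t = kabsch_align(pts, moved)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

In practice, clinical surface-matching systems must first establish correspondences between unordered surface samples (e.g., via iterative closest point), with the Kabsch step repeated inside each iteration; the function names here are hypothetical.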
Results: Combinations of different MRI sequences and CT scans were necessary for detailed 3D modeling. Modeling was time-consuming and only viable in the hands of medically, surgically, and anatomically trained staff. Augmented reality provided a quick, intuitive interdisciplinary orientation. Intraoperative surface-matching registration enabled precise navigation in the orbital space.
Conclusions: Optical navigation with microscope integration achieved a straightforward microsurgical workflow and should be implemented routinely. Augmented reality proved a useful tool for preoperative interdisciplinary planning and intuitive intraoperative orientation. It also served as an excellent educational tool.