Augmented reality surgical navigation: Clinical applications, key technologies, and future directions

Yuanyuan WANG, Dawei LU, Jingfan FAN, Deqiang XIAO, Danni AI, Tianyu FU, Yucong LIN, Long SHAO, Tao CHEN, Hong SONG, Yongtian WANG, Jian YANG

Virtual Reality Intelligent Hardware, Volume 8, Issue 1, Pages 1-27
DOI: 10.1016/j.vrih.2025.12.002
Publication date: 2026-02-01 (Epub: 2026-03-14)
Citation count: 0
Abstract
Surgical navigation has evolved significantly through advances in augmented reality, virtual reality, and mixed reality, improving precision and safety across many clinical applications, including neurosurgery, maxillofacial, spinal, and arthroplasty procedures. By integrating preoperative imaging with real-time intraoperative data, these systems provide dynamic guidance, reduce radiation exposure, and minimize tissue damage. Key challenges persist, including intraoperative registration accuracy, flexible tissue deformation, respiratory compensation, and real-time imaging quality. Emerging solutions include artificial intelligence-driven segmentation, deformation-field modeling, and hybrid registration techniques. Future developments will include lightweight, portable systems, improved non-rigid registration algorithms, and greater clinical adoption. Despite advances in rigid-tissue applications, soft-tissue navigation requires additional innovation to address motion variability and registration reliability, ultimately advancing minimally invasive surgery and precision medicine.
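The registration accuracy the abstract highlights is commonly quantified, in the rigid-tissue case, by aligning paired fiducial points between preoperative image space and intraoperative patient space and reporting the residual fiducial registration error (FRE). As an illustrative sketch only (not the method of the surveyed systems), the following computes the least-squares rigid transform between two paired 3-D point sets via the Kabsch/Umeyama SVD approach, with `rigid_register` and `fre` being hypothetical helper names:

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid alignment of paired 3-D fiducials (Kabsch/Umeyama).

    Returns rotation R and translation t such that source @ R.T + t ≈ target.
    """
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the optimal orthogonal matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def fre(source, target, R, t):
    """Root-mean-square fiducial registration error after alignment."""
    aligned = source @ R.T + t
    return float(np.sqrt(((aligned - target) ** 2).sum(axis=1).mean()))
```

In a clinical workflow the fiducials would come from skin markers or anatomical landmarks; an FRE well below the surgical tolerance (often cited in the 1-2 mm range for rigid anatomy) is the usual acceptance check before overlaying the AR guidance. Non-rigid and respiratory-compensated registration, noted in the abstract as open challenges, require deformation-field models beyond this rigid formulation.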