2D/3D registration based on biplanar X-ray and CT images for surgical navigation

IF 4.9 · Tier 2 (Medicine) · Q1 COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS
Demin Yang, Haochen Shi, Bolun Zeng, Xiaojun Chen
{"title":"基于双平面 X 光和 CT 图像的 2D/3D 配准,用于手术导航","authors":"Demin Yang,&nbsp;Haochen Shi,&nbsp;Bolun Zeng,&nbsp;Xiaojun Chen","doi":"10.1016/j.cmpb.2024.108444","DOIUrl":null,"url":null,"abstract":"<div><h3>Background and Objectives:</h3><div>Image-based 2D/3D registration is a crucial technology for fluoroscopy-guided surgical interventions. However, traditional registration methods relying on a single X-ray image into surgical navigation systems. This study proposes a novel 2D/3D registration approach utilizing biplanar X-ray images combined with computed tomography (CT) to significantly reduce registration and navigation errors. The method is successfully implemented in a surgical navigation system, enhancing its precision and reliability.</div></div><div><h3>Methods:</h3><div>First, we simultaneously register the frontal and lateral X-ray images with the CT image, enabling mutual complementation and more precise localization. Additionally, we introduce a novel similarity measure for image comparison, providing a more robust cost function for the optimization algorithm. Furthermore, a multi-resolution strategy is employed to enhance registration efficiency. Lastly, we propose a more accurate coordinate transformation method, based on projection and 3D reconstruction, to improve the precision of surgical navigation systems.<em>Results:</em> We conducted registration and navigation experiments using pelvic, spinal, and femur phantoms. The navigation results demonstrated that the feature registration errors (FREs) in the three experiments were 0.505±0.063 mm, 0.515±0.055 mm, and 0.577±0.056 mm, respectively. Compared to the point-to-point (PTP) registration method based on anatomical landmarks, our method reduced registration errors by 31.3%, 23.9%, and 26.3%, respectively.</div></div><div><h3>Conclusion:</h3><div>The results demonstrate that our method significantly reduces registration and navigation errors, highlighting its potential for application across various anatomical sites. Our code is available at: <span><span>https://github.com/SJTUdemon/2D-3D-Registration</span><svg><path></path></svg></span></div></div>","PeriodicalId":10624,"journal":{"name":"Computer methods and programs in biomedicine","volume":"257 ","pages":"Article 108444"},"PeriodicalIF":4.9000,"publicationDate":"2024-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"2D/3D registration based on biplanar X-ray and CT images for surgical navigation\",\"authors\":\"Demin Yang,&nbsp;Haochen Shi,&nbsp;Bolun Zeng,&nbsp;Xiaojun Chen\",\"doi\":\"10.1016/j.cmpb.2024.108444\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3>Background and Objectives:</h3><div>Image-based 2D/3D registration is a crucial technology for fluoroscopy-guided surgical interventions. However, traditional registration methods relying on a single X-ray image into surgical navigation systems. This study proposes a novel 2D/3D registration approach utilizing biplanar X-ray images combined with computed tomography (CT) to significantly reduce registration and navigation errors. The method is successfully implemented in a surgical navigation system, enhancing its precision and reliability.</div></div><div><h3>Methods:</h3><div>First, we simultaneously register the frontal and lateral X-ray images with the CT image, enabling mutual complementation and more precise localization. 
Additionally, we introduce a novel similarity measure for image comparison, providing a more robust cost function for the optimization algorithm. Furthermore, a multi-resolution strategy is employed to enhance registration efficiency. Lastly, we propose a more accurate coordinate transformation method, based on projection and 3D reconstruction, to improve the precision of surgical navigation systems.<em>Results:</em> We conducted registration and navigation experiments using pelvic, spinal, and femur phantoms. The navigation results demonstrated that the feature registration errors (FREs) in the three experiments were 0.505±0.063 mm, 0.515±0.055 mm, and 0.577±0.056 mm, respectively. Compared to the point-to-point (PTP) registration method based on anatomical landmarks, our method reduced registration errors by 31.3%, 23.9%, and 26.3%, respectively.</div></div><div><h3>Conclusion:</h3><div>The results demonstrate that our method significantly reduces registration and navigation errors, highlighting its potential for application across various anatomical sites. Our code is available at: <span><span>https://github.com/SJTUdemon/2D-3D-Registration</span><svg><path></path></svg></span></div></div>\",\"PeriodicalId\":10624,\"journal\":{\"name\":\"Computer methods and programs in biomedicine\",\"volume\":\"257 \",\"pages\":\"Article 108444\"},\"PeriodicalIF\":4.9000,\"publicationDate\":\"2024-10-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer methods and programs in biomedicine\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0169260724004371\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer methods and programs in biomedicine","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0169260724004371","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

2D/3D registration based on biplanar X-ray and CT images for surgical navigation

Background and Objectives:

Image-based 2D/3D registration is a crucial technology for fluoroscopy-guided surgical interventions. However, traditional registration methods integrated into surgical navigation systems rely on a single X-ray image. This study proposes a novel 2D/3D registration approach utilizing biplanar X-ray images combined with computed tomography (CT) to significantly reduce registration and navigation errors. The method is successfully implemented in a surgical navigation system, enhancing its precision and reliability.

Methods:

First, we simultaneously register the frontal and lateral X-ray images with the CT image, enabling mutual complementation and more precise localization. Additionally, we introduce a novel similarity measure for image comparison, providing a more robust cost function for the optimization algorithm. Furthermore, a multi-resolution strategy is employed to enhance registration efficiency. Lastly, we propose a more accurate coordinate transformation method, based on projection and 3D reconstruction, to improve the precision of surgical navigation systems.

Results:

We conducted registration and navigation experiments using pelvic, spinal, and femur phantoms. The navigation results demonstrated that the feature registration errors (FREs) in the three experiments were 0.505±0.063 mm, 0.515±0.055 mm, and 0.577±0.056 mm, respectively. Compared to the point-to-point (PTP) registration method based on anatomical landmarks, our method reduced registration errors by 31.3%, 23.9%, and 26.3%, respectively.
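The abstract does not detail the registration algorithm itself, but the pipeline it describes (frontal and lateral views compared against projections of the CT under a similarity measure, optimized with a multi-resolution strategy) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: parallel-projection DRRs stand in for true cone-beam projection, normalized cross-correlation stands in for the paper's novel similarity measure, and `drr`, `ncc`, and `register_biplanar` are hypothetical names, not functions from the authors' repository.

```python
import numpy as np
from scipy.ndimage import affine_transform, zoom
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation


def drr(volume, pose, axis):
    """Very simplified DRR: resample the CT under a rigid transform
    (rotation about the volume centre plus a translation), then integrate
    along one axis. Real systems model cone-beam projection instead."""
    rx, ry, rz, tx, ty, tz = pose
    R = Rotation.from_euler("xyz", [rx, ry, rz], degrees=True).as_matrix()
    center = (np.asarray(volume.shape) - 1) / 2.0
    # scipy's affine_transform maps output coordinates to input coordinates.
    offset = center - R @ center + np.array([tx, ty, tz])
    moved = affine_transform(volume, R, offset=offset, order=1)
    return moved.sum(axis=axis)


def ncc(a, b):
    """Normalized cross-correlation, a stand-in for the paper's similarity measure."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())


def register_biplanar(ct, xray_frontal, xray_lateral, pose0, levels=(4, 2, 1)):
    """Coarse-to-fine rigid registration against both views at once.
    Assumes each X-ray image shares the grid of the corresponding DRR."""
    pose = np.asarray(pose0, dtype=float)
    for s in levels:
        vol = zoom(ct, 1.0 / s, order=1)
        xf = zoom(xray_frontal, 1.0 / s, order=1)
        xl = zoom(xray_lateral, 1.0 / s, order=1)
        p = pose.copy()
        p[3:] /= s  # translations are in voxels, so rescale to this level

        def cost(q):
            # Sum the dissimilarity over the two orthogonal views so that
            # frontal and lateral information constrain the pose jointly.
            return -(ncc(drr(vol, q, axis=1), xf) + ncc(drr(vol, q, axis=0), xl))

        p = minimize(cost, p, method="Powell").x
        pose = p.copy()
        pose[3:] *= s  # back to full-resolution units for the next level
    return pose
```

A call such as `register_biplanar(ct_volume, frontal_img, lateral_img, pose0=np.zeros(6))` would return a rigid pose (three rotations in degrees, three translations in voxels); a real implementation would additionally model the C-arm geometry and intensity calibration.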

Conclusion:

The results demonstrate that our method significantly reduces registration and navigation errors, highlighting its potential for application across various anatomical sites. Our code is available at: https://github.com/SJTUdemon/2D-3D-Registration
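The abstract also mentions a coordinate transformation method based on projection and 3D reconstruction for the navigation step. Its exact formulation is not given here, but two-view reconstruction of this kind conventionally builds on linear (DLT) triangulation; the sketch below shows that standard step, not the authors' code, and the projection matrices `P_frontal` / `P_lateral` are assumed to come from a calibrated imaging geometry.

```python
import numpy as np


def triangulate(P_frontal, P_lateral, uv_frontal, uv_lateral):
    """Reconstruct one 3D point from its pixel coordinates in the frontal
    and lateral views, given a 3x4 projection matrix for each view."""
    rows = []
    for P, (u, v) in ((P_frontal, uv_frontal), (P_lateral, uv_lateral)):
        # Each view contributes two linear constraints on the homogeneous point.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # Homogeneous least-squares solution: right singular vector associated
    # with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

Reconstructing landmarks this way and then mapping them into the CT frame with the registered pose is one conventional route to the kind of coordinate transformation the abstract describes.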
Source journal
Computer methods and programs in biomedicine (Engineering Technology – Biomedical Engineering)
CiteScore: 12.30
Self-citation rate: 6.60%
Articles published: 601
Review time: 135 days
About the journal: To encourage the development of formal computing methods, and their application in biomedical research and medical practice, by illustration of fundamental principles in biomedical informatics research; to stimulate basic research into application software design; to report the state of research of biomedical information processing projects; to report new computer methodologies applied in biomedical areas; the eventual distribution of demonstrable software to avoid duplication of effort; to provide a forum for discussion and improvement of existing software; to optimize contact between national organizations and regional user groups by promoting an international exchange of information on formal methods, standards and software in biomedicine. Computer Methods and Programs in Biomedicine covers computing methodology and software systems derived from computing science for implementation in all aspects of biomedical research and medical practice. It is designed to serve: biochemists; biologists; geneticists; immunologists; neuroscientists; pharmacologists; toxicologists; clinicians; epidemiologists; psychiatrists; psychologists; cardiologists; chemists; (radio)physicists; computer scientists; programmers and systems analysts; biomedical, clinical, electrical and other engineers; teachers of medical informatics and users of educational software.