2-D/3-D Medical Image Registration Based on Feature-Point Matching

IF 5.9 | CAS Region 2 (Engineering & Technology) | JCR Q1 (Engineering, Electrical & Electronic)
Shengyuan Si; Zheng Li; Ze Lin; Xian Xu; Yudong Zhang; Shipeng Xie

Journal: IEEE Transactions on Instrumentation and Measurement, vol. 73, pp. 1-9
DOI: 10.1109/TIM.2024.3481556
Published: 2024-10-16
URL: https://ieeexplore.ieee.org/document/10720192/
Citations: 0

Abstract

Two-dimensional/3-D medical image registration has a wide range of applications in intraoperative image-guided navigation: it not only assists surgeons in accurately locating lesions but also serves as a key link for surgical robots to locate the surgical site. Current optimization-based methods for 2-D/3-D spine image registration are prone to becoming trapped in local optima, struggle to extract gradient information from noisy real data, and run slowly. Deep learning methods, for their part, suffer from insufficient training data, poor generalization, and a tendency to produce incorrect solutions. We propose an optimization-based model that significantly improves the speed and accuracy of 2-D/3-D registration by tightly integrating a feature-point matching network. The network is highly robust to high-noise imagery and well suited to coarse registration: it supplies the initial solution for the optimization model, shortening the time required for coarse registration. It also drives updates of the parameter-location modules within the optimization model, reducing the overall computational demand. Additionally, by combining grayscale and spinal feature information, we formulate an objective function enriched with a feature-point similarity metric to guide the descent trajectory, yielding higher precision and faster convergence. Our experiments show that the method achieves a mean accuracy of 0.2550 mm on real data, substantiating the efficacy of our approach.
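The paper's exact formulation is not reproduced on this page, but the abstract's central idea (a coarse pose obtained from matched feature points, then refined by minimizing a composite objective that mixes a grayscale similarity term with a feature-point similarity term) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: normalized cross-correlation stands in for the grayscale term, the mean reprojection error of matched 3-D/2-D points stands in for the feature-point similarity metric, and the names `render_drr`, `refine_pose`, the toy projection geometry, and the weighting factor are hypothetical placeholders.

```python
# Hedged sketch of an intensity + feature-point objective for 2-D/3-D registration.
# All names and numeric choices here are illustrative assumptions, not the paper's code.

import numpy as np
from scipy.optimize import minimize


def normalized_cross_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Grayscale similarity between two images of identical shape."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float(np.mean(a * b))


def project_points(points_3d: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """Toy perspective projection of 3-D feature points under a 6-DoF pose
    (3 rotations in radians, 3 translations in mm). Stands in for the real
    C-arm projection geometry; points are assumed to lie in front of the detector."""
    rx, ry, rz, tx, ty, tz = pose
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    R = (np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
         @ np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
         @ np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]]))
    cam = points_3d @ R.T + np.array([tx, ty, tz])
    focal = 1000.0  # assumed source-to-detector distance, mm
    return focal * cam[:, :2] / cam[:, 2:3]


def composite_objective(pose, xray, render_drr, pts_3d, pts_2d, weight=0.1):
    """Negative NCC between the X-ray and the DRR rendered at `pose`,
    plus a weighted mean reprojection error of matched feature points."""
    drr = render_drr(pose)                      # placeholder DRR renderer, supplied by the caller
    grayscale_term = -normalized_cross_correlation(xray, drr)
    reproj = project_points(pts_3d, pose)
    feature_term = float(np.mean(np.linalg.norm(reproj - pts_2d, axis=1)))
    return grayscale_term + weight * feature_term


def refine_pose(initial_pose, xray, render_drr, pts_3d, pts_2d):
    """Local refinement starting from the coarse pose supplied by the
    feature-point matching network (here simply an input vector)."""
    result = minimize(
        composite_objective,
        x0=np.asarray(initial_pose, dtype=float),
        args=(xray, render_drr, pts_3d, pts_2d),
        method="Powell",  # derivative-free, tolerant of noisy objectives
    )
    return result.x
```

In this sketch the feature-point term both anchors the search at the network's initial pose and discourages the derivative-free optimizer from drifting into the local optima the abstract describes; the relative weight between the two terms is an assumption that would need tuning against real data.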
Source Journal

IEEE Transactions on Instrumentation and Measurement (Engineering, Electrical & Electronic)
CiteScore: 9.00
Self-citation rate: 23.20%
Articles per year: 1294
Review time: 3.9 months

Journal description: Papers are sought that address innovative solutions to the development and use of electrical and electronic instruments and equipment to measure, monitor and/or record physical phenomena for the purpose of advancing measurement science, methods, functionality and applications. The scope of these papers may encompass: (1) theory, methodology, and practice of measurement; (2) design, development and evaluation of instrumentation and measurement systems and components used in generating, acquiring, conditioning and processing signals; (3) analysis, representation, display, and preservation of the information obtained from a set of measurements; and (4) scientific and technical support to establishment and maintenance of technical standards in the field of Instrumentation and Measurement.