Learning to Match 2D Keypoints Across Preoperative MR and Intraoperative Ultrasound.

Hassan Rasheed, Reuben Dorent, Maximilian Fehrentz, Daniil Morozov, Tina Kapur, William M Wells, Alexandra Golby, Sarah Frisken, Julia A Schnabel, Nazim Haouchine
{"title":"Learning to Match 2D Keypoints Across Preoperative MR and Intraoperative Ultrasound.","authors":"Hassan Rasheed, Reuben Dorent, Maximilian Fehrentz, Daniil Morozov, Tina Kapur, William M Wells, Alexandra Golby, Sarah Frisken, Julia A Schnabel, Nazim Haouchine","doi":"10.1007/978-3-031-73647-6_8","DOIUrl":null,"url":null,"abstract":"<p><p>We propose in this paper a texture-invariant 2D keypoints descriptor specifically designed for matching preoperative Magnetic Resonance (MR) images with intraoperative Ultrasound (US) images. We introduce a <i>matching-by-synthesis</i> strategy, where intraoperative US images are synthesized from MR images accounting for multiple MR modalities and intraoperative US variability. We build our training set by enforcing keypoints localization over all images then train a patient-specific descriptor network that learns texture-invariant discriminant features in a supervised contrastive manner, leading to robust keypoints descriptors. Our experiments on real cases with ground truth show the effectiveness of the proposed approach, outperforming the state-of-the-art methods and achieving 80.35% matching precision on average.</p>","PeriodicalId":520353,"journal":{"name":"Simplifying medical ultrasound : 5th international workshop, ASMUS 2024, held in conjunction with MICCAI 2024, Marrakesh, Morocco, October 6, 2024, proceedings. ASMUS (Workshop) (5th : 2024 : Marrakech, Morocco)","volume":"15186 ","pages":"78-87"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11682695/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Simplifying medical ultrasound : 5th international workshop, ASMUS 2024, held in conjunction with MICCAI 2024, Marrakesh, Morocco, October 6, 2024, proceedings. ASMUS (Workshop) (5th : 2024 : Marrakech, Morocco)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/978-3-031-73647-6_8","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/10/5 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We propose a texture-invariant 2D keypoint descriptor specifically designed for matching preoperative Magnetic Resonance (MR) images with intraoperative Ultrasound (US) images. We introduce a matching-by-synthesis strategy, in which intraoperative US images are synthesized from MR images while accounting for multiple MR modalities and intraoperative US variability. We build our training set by enforcing keypoint localization across all images, then train a patient-specific descriptor network that learns texture-invariant discriminative features in a supervised contrastive manner, yielding robust keypoint descriptors. Our experiments on real cases with ground truth demonstrate the effectiveness of the proposed approach, which outperforms state-of-the-art methods and achieves 80.35% matching precision on average.
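The abstract describes the method only at a high level. As an illustration of the supervised contrastive descriptor learning it mentions, the following is a minimal PyTorch sketch that trains a small patch encoder on corresponding MR / synthesized-US keypoint patches with an InfoNCE-style loss. The encoder architecture, patch size, descriptor dimension, and all names (PatchEncoder, info_nce) are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): supervised contrastive training of a
# patch descriptor network on corresponding MR / synthesized-US keypoint patches.
# Architecture, patch size, and descriptor dimension are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PatchEncoder(nn.Module):
    """Small CNN mapping a 32x32 grayscale patch to an L2-normalized descriptor."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)


def info_nce(desc_mr, desc_us, temperature: float = 0.07):
    """Contrastive loss: patch i in MR should match patch i in synthesized US;
    all other patches in the batch act as in-batch negatives."""
    logits = desc_mr @ desc_us.t() / temperature  # (B, B) cosine similarities
    targets = torch.arange(desc_mr.size(0), device=logits.device)
    # Symmetric form: MR -> US and US -> MR retrieval.
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    encoder = PatchEncoder()
    optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)
    # Dummy batch of 64 corresponding keypoint patches; in practice these would be
    # patches cropped around the same keypoint in an MR slice and in the US image
    # synthesized from that MR slice.
    mr_patches = torch.randn(64, 1, 32, 32)
    us_patches = torch.randn(64, 1, 32, 32)
    optimizer.zero_grad()
    loss = info_nce(encoder(mr_patches), encoder(us_patches))
    loss.backward()
    optimizer.step()
    print(f"contrastive loss: {loss.item():.4f}")
```

Using the other keypoints of the same MR/US pair as negatives pushes the descriptors to encode keypoint-specific geometry rather than modality-specific texture, which is the intuition behind the texture invariance claimed in the abstract.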
