Robot-assisted ultrasound probe calibration for image-guided interventions.

IF 2.3 · Region 3 (Medicine) · Q3 ENGINEERING, BIOMEDICAL
Atharva Paralikar, Pavan Mantripragada, Trong Nguyen, Youness Arjoune, Raj Shekhar, Reza Monfaredi
DOI: 10.1007/s11548-025-03347-8
Journal: International Journal of Computer Assisted Radiology and Surgery, pp. 859-868
Published: 2025-05-01 (Epub 2025-04-04)
Citations: 0

Abstract


Background: Trackable ultrasound probes facilitate ultrasound-guided procedures by allowing real-time fusion of augmented ultrasound images and live video streams. This integration helps surgeons accurately locate lesions within organs, which can be achieved only through precise registration between the ultrasound probe and the ultrasound image. Currently, calibration and registration are often manual, labor-intensive, time-consuming, and suboptimal: a technologist manually moves a stylus through various poses within the ultrasound probe's imaging plane so that its tip can be detected in the ultrasound image. This paper addresses this challenge by proposing a novel automated calibration approach for trackable ultrasound probes.

Methods: We used a robotic manipulator (KUKA LBR iiwa 7) to execute the stylus movements, eliminating the cumbersome manual positioning of the probe. We incorporated a 6-degree-of-freedom electromagnetic tracker into the ultrasound probe to enable real-time pose and orientation tracking. We also developed a feature detection algorithm that identifies in-plane stylus tip coordinates in the recorded ultrasound feed, enabling automatic selection of calibration correspondences.
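The abstract does not spell out the feature detection algorithm. As a rough illustration of the idea only, a minimal bright-spot detector over a B-mode frame might look like the following sketch; the relative-intensity thresholding scheme and the `detect_tip` helper are assumptions, not the authors' method:

```python
import numpy as np

def detect_tip(frame, threshold=0.8):
    """Naive bright-spot detector: the stylus tip shows up as a bright
    reflection in the B-mode image. Return the centroid (x, y) of pixels
    above a relative intensity threshold, or None if nothing crosses it."""
    f = np.asarray(frame, dtype=float)
    if f.max() <= 0:                      # empty frame: nothing to detect
        return None
    mask = f >= threshold * f.max()       # keep only near-maximal pixels
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())
```

A real pipeline would add speckle filtering and temporal consistency checks, but the centroid-of-bright-pixels step conveys how a point landmark becomes an image coordinate for calibration.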

Results: The proposed system performed comparably to manual ultrasound feature segmentation, yielding a mean re-projection error of 0.38 mm versus 0.34 mm for manual landmark selection. We also achieved an image-plane reconstruction error of 0.80 deg with manual segmentation and 0.20 deg with automatic segmentation.
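For context on the reported re-projection error: probe calibration of this kind is commonly posed as a rigid point-set registration between stylus-tip coordinates in the image plane (pixels scaled to mm, z = 0) and the same points in the tracker frame, solvable in closed form by an SVD-based least-squares fit; the mean re-projection error is then the average residual distance. The sketch below uses that standard formulation, not the authors' exact one:

```python
import numpy as np

def fit_rigid_transform(image_pts, tracker_pts):
    """Closed-form least-squares rigid transform (R, t) mapping image-plane
    points (N x 3, z = 0) onto tracker-frame points, via the SVD of the
    cross-covariance of the centered point sets."""
    A = np.asarray(image_pts, dtype=float)
    B = np.asarray(tracker_pts, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)             # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def mean_reprojection_error(image_pts, tracker_pts, R, t):
    """Mean Euclidean residual after applying the fitted transform."""
    pred = (R @ np.asarray(image_pts, dtype=float).T).T + t
    return float(np.linalg.norm(pred - tracker_pts, axis=1).mean())
```

With noiseless correspondences the residual is numerically zero; the 0.34-0.38 mm figures above reflect real-world tracking and segmentation noise.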

Conclusion: The proposed system enables fully automated calibration while maintaining the same level of accuracy as state-of-the-art methods. It streamlines the use of a trackable US probe in two settings: recalibration after sterilization, when an externally attached electromagnetic tracker must be disassembled for cleaning and sterilization, and out-of-factory calibration of mass-produced US probes with embedded trackers.

Source journal: International Journal of Computer Assisted Radiology and Surgery (ENGINEERING, BIOMEDICAL; RADIOLOGY, NUCLEAR MEDICINE & MEDICAL IMAGING)
CiteScore: 5.90
Self-citation rate: 6.70%
Articles per year: 243
Review time: 6-12 weeks
Aims and scope: The International Journal for Computer Assisted Radiology and Surgery (IJCARS) is a peer-reviewed journal that provides a platform for closing the gap between medical and technical disciplines, and encourages interdisciplinary research and development activities in an international environment.