Breathing-Compensated Neural Networks for Real Time C-Arm Pose Estimation in Lung CT-Fluoroscopy Registration

Brian C. Lee, Ayushi Sinha, N. Varble, W. Pritchard, J. Karanian, B. Wood, T. Bydlon
{"title":"Breathing-Compensated Neural Networks for Real Time C-Arm Pose Estimation in Lung CT-Fluoroscopy Registration","authors":"Brian C. Lee, Ayushi Sinha, N. Varble, W. Pritchard, J. Karanian, B. Wood, T. Bydlon","doi":"10.1109/ISBI52829.2022.9761705","DOIUrl":null,"url":null,"abstract":"Augmentation of interventional c-arm fluoroscopy using information extracted from pre-operative imaging has the potential to reduce procedure times and improve patient outcomes in minimally invasive peripheral lung procedures, where breathing motion, small airways, and anatomical variation create a challenging environment for planned pathway navigation. Extraction of the rigid c-arm pose relative to preoperative images is a crucial prerequisite; however, accurate 2D-3D fluoroscopy-CT soft tissue registration in the presence of natural deformable patient motion remains challenging. We propose to train a patient-specific neural network on synthetic fluoroscopy derived from the patient’s pre-operative CT, augmented by a generalized breathing motion model, to predict c-arm pose. Our model includes an image supervision path that infers the x-ray projection geometry, providing training stability across patients. We train our model on synthetic fluoroscopy generated from preclinical swine CT and we evaluate on synthetic and real fluoroscopy.","PeriodicalId":6827,"journal":{"name":"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)","volume":"25 1","pages":"1-5"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISBI52829.2022.9761705","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Augmentation of interventional C-arm fluoroscopy with information extracted from pre-operative imaging has the potential to reduce procedure times and improve patient outcomes in minimally invasive peripheral lung procedures, where breathing motion, small airways, and anatomical variation create a challenging environment for planned pathway navigation. Extraction of the rigid C-arm pose relative to pre-operative images is a crucial prerequisite; however, accurate 2D-3D fluoroscopy-CT soft tissue registration in the presence of natural deformable patient motion remains challenging. We propose to train a patient-specific neural network on synthetic fluoroscopy derived from the patient's pre-operative CT, augmented by a generalized breathing motion model, to predict C-arm pose. Our model includes an image supervision path that infers the X-ray projection geometry, providing training stability across patients. We train our model on synthetic fluoroscopy generated from preclinical swine CT and evaluate it on both synthetic and real fluoroscopy.
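
As a rough illustration of the training setup described above, the sketch below shows a minimal patient-specific pose-regression loop on synthetic fluoroscopy in PyTorch. This is not the authors' implementation: the renderer render_synthetic_fluoro and its breathing-phase argument are hypothetical placeholders standing in for projection of the pre-operative CT under a breathing motion model, and the auxiliary image supervision path that infers the projection geometry is omitted; only direct 6-DoF pose supervision is shown.

```python
# Minimal sketch of patient-specific C-arm pose regression on synthetic
# fluoroscopy. NOT the paper's implementation: the DRR renderer and the
# breathing-motion augmentation are hypothetical stand-ins, and the
# auxiliary image supervision path from the paper is omitted.
import torch
import torch.nn as nn


class PoseRegressor(nn.Module):
    """CNN mapping a single fluoroscopy frame to a 6-DoF C-arm pose
    (3 rotation parameters + 3 translation parameters)."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 6)  # [rx, ry, rz, tx, ty, tz]

    def forward(self, x):
        return self.head(self.features(x).flatten(1))


def render_synthetic_fluoro(pose, breathing_phase):
    """Hypothetical placeholder for a DRR renderer that projects the
    patient's pre-operative CT at the given C-arm pose after warping the
    volume with a breathing motion model at `breathing_phase`. Here it
    returns random images so the sketch runs end to end."""
    return torch.rand(pose.shape[0], 1, 128, 128)


model = PoseRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()

for step in range(3):  # a handful of steps for illustration only
    # Sample random poses and breathing phases; the paper augments training
    # data with a generalized breathing motion model, here the phase is a
    # scalar in [0, 1].
    gt_pose = torch.rand(8, 6)
    phase = torch.rand(8)
    images = render_synthetic_fluoro(gt_pose, phase)

    pred_pose = model(images)
    loss = criterion(pred_pose, gt_pose)  # pose supervision only in this sketch

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: pose loss = {loss.item():.4f}")
```

In the actual method, an additional image supervision path that infers the X-ray projection geometry is reported to stabilize training across patients; that branch is intentionally left out of this sketch.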