Improved U-Net for Guidewire Tip Segmentation in X-ray Fluoroscopy Images

Shuai Guo, Songyuan Tang, Jianjun Zhu, Jingfan Fan, Danni Ai, Hong Song, P. Liang, Jian Yang
DOI: 10.1145/3373419.3373449
Published in: Proceedings of the 2019 3rd International Conference on Advances in Image Processing
Publication date: 2019-11-08
Citations: 9

Abstract

In percutaneous coronary intervention (PCI), physicians use a guidewire tip to implant stents in vessels with stenosis. Given the small scale and low signal-to-noise ratio of guidewire tips in X-ray fluoroscopy images, physicians have difficulty recognizing and locating the tip. Automatic segmentation of the guidewire tip can ease navigation when physicians implant stents during PCI. In this paper, we propose an end-to-end convolutional neural network-based method for guidewire tip segmentation. The network framework is derived from U-Net, and two specific designs, a reduced dense block and connectivity supervision, are embedded in the framework to improve the accuracy and robustness of guidewire tip segmentation. Experiments are performed on clinical data. The proposed method achieves a mean sensitivity, F1-score, Jaccard index, and Hausdorff distance of 92.95%, 91.35%, 84.14%, and 0.531 mm on the testing data, respectively. In addition, the segmentation time is 0.02 s/frame, which satisfies the requirements of intraoperative clinical use.
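The paper does not include evaluation code, but the four reported metrics are standard for binary segmentation and can be sketched from their definitions. Below is a minimal, hypothetical illustration in numpy: the pixel-overlap metrics (sensitivity, F1-score, Jaccard index) come from the true-positive/false-positive/false-negative counts, and the Hausdorff distance is computed brute-force between the foreground coordinates of the two masks, assuming an isotropic pixel spacing supplied by the caller (the paper's 0.531 mm figure would depend on the fluoroscopy detector's actual spacing).

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Pixel-overlap metrics for a binary segmentation mask.

    Returns (sensitivity, f1, jaccard) computed from TP/FP/FN counts.
    """
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    sensitivity = tp / (tp + fn)          # recall: fraction of GT pixels found
    f1 = 2 * tp / (2 * tp + fp + fn)      # Dice / F1-score
    jaccard = tp / (tp + fp + fn)         # intersection over union
    return sensitivity, f1, jaccard

def hausdorff_mm(pred, gt, spacing_mm=1.0):
    """Symmetric Hausdorff distance between two binary masks, in mm.

    Brute-force over all foreground pixel pairs; fine for small tips,
    too slow for large masks (scipy's directed_hausdorff scales better).
    """
    p = np.argwhere(pred.astype(bool))
    g = np.argwhere(gt.astype(bool))
    # pairwise Euclidean distances between foreground pixel coordinates
    d = np.linalg.norm(p[:, None, :] - g[None, :, :], axis=-1)
    return spacing_mm * max(d.min(axis=1).max(), d.min(axis=0).max())

if __name__ == "__main__":
    pred = np.array([[1, 1, 0], [0, 0, 0]])
    gt = np.array([[1, 0, 0], [0, 0, 0]])
    print(segmentation_metrics(pred, gt))  # one extra predicted pixel
    print(hausdorff_mm(pred, gt))
```

Note that sensitivity alone rewards over-segmentation; reporting it together with F1 and Jaccard, as the paper does, penalizes false positives as well.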