IG-Net: An Instrument-guided real-time semantic segmentation framework for prostate dissection during surgery for low rectal cancer

IF 4.9 · CAS Tier 2, Medicine · JCR Q1, Computer Science, Interdisciplinary Applications
Bo Sun, Zhen Sun, Kexuan Li, Xuehao Wang, Guotao Wang, Wenfeng Song, Shuai Li, Aimin Hao, Yi Xiao
Citations: 0

Abstract


Background and Objective:

Accurate prostate dissection is crucial in transanal surgery for patients with low rectal cancer. Improper dissection can lead to adverse events such as urethral injury, severely affecting the patient’s postoperative recovery. However, unclear boundaries, irregular shape of the prostate, and obstructive factors such as smoke present significant challenges for surgeons.

Methods:

Our innovative contribution lies in the introduction of a novel video semantic segmentation framework, IG-Net, which incorporates prior surgical instrument features for real-time and precise prostate segmentation. Specifically, we designed an instrument-guided module that calculates the surgeon’s region of attention based on instrument features, performs local segmentation, and integrates it with global segmentation to enhance performance. Additionally, we proposed a keyframe selection module that calculates the temporal correlations between consecutive frames based on instrument features. This module adaptively selects non-keyframes for feature-fusion segmentation, reducing noise and optimizing speed.
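The instrument-guided idea described above can be sketched as follows. This is a minimal illustration only: the helper names (`instrument_roi`, `fuse`), the padded-bounding-box notion of the surgeon's region of attention, and the simple averaging of local and global predictions are assumptions for illustration, not the paper's actual design.

```python
import numpy as np

def instrument_roi(inst_mask: np.ndarray, pad: int = 2) -> tuple[slice, slice]:
    """Padded bounding box around the detected instrument pixels,
    used here as a stand-in for the surgeon's region of attention."""
    ys, xs = np.nonzero(inst_mask)
    h, w = inst_mask.shape
    y0, y1 = max(ys.min() - pad, 0), min(ys.max() + pad + 1, h)
    x0, x1 = max(xs.min() - pad, 0), min(xs.max() + pad + 1, w)
    return slice(y0, y1), slice(x0, x1)

def fuse(global_prob: np.ndarray, local_prob: np.ndarray,
         roi: tuple[slice, slice]) -> np.ndarray:
    """Blend the local (ROI) prediction with the global one inside
    the ROI; keep the global prediction everywhere else."""
    out = global_prob.copy()
    out[roi] = 0.5 * (out[roi] + local_prob)
    return out

# Toy example: one instrument pixel at (2, 3) in a 6x6 frame.
mask = np.zeros((6, 6), dtype=bool)
mask[2, 3] = True
roi = instrument_roi(mask, pad=1)          # a 3x3 window around (2, 3)
fused = fuse(np.zeros((6, 6)), np.ones((3, 3)), roi)
```

Segmenting only inside an instrument-derived window and merging back into the full-frame result is what lets a local pass sharpen the region that matters without paying full-resolution cost everywhere.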

Results:

To evaluate the performance of IG-Net, we constructed the most extensive dataset known to date, comprising 106 video clips and 6153 images. The experimental results reveal that this method achieves favorable performance, with 72.70% IoU, 82.02% Dice, and 35 FPS.
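The IoU and Dice figures above are the standard region-overlap metrics for binary segmentation masks. A minimal NumPy sketch (toy masks, not the paper's evaluation code):

```python
import numpy as np

def iou(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection over Union: |A ∩ B| / |A ∪ B|."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union if union else 1.0

def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|)."""
    inter = np.logical_and(pred, gt).sum()
    total = pred.sum() + gt.sum()
    return 2 * inter / total if total else 1.0

# Toy 2x2 masks: one overlapping pixel, one pred-only pixel.
pred = np.array([[1, 1], [0, 0]], dtype=bool)
gt   = np.array([[1, 0], [0, 0]], dtype=bool)
print(iou(pred, gt))   # 0.5
print(dice(pred, gt))  # 0.666...
```

Dice is always at least as large as IoU for the same masks (Dice = 2·IoU / (1 + IoU)), which is why the reported 82.02% Dice sits above the 72.70% IoU.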

Conclusions:

For the task of prostate segmentation based on surgical videos, our proposed IG-Net surpasses all previous methods across multiple metrics. IG-Net balances segmentation accuracy and speed, demonstrating strong robustness against adverse factors.
Journal
Computer methods and programs in biomedicine (Engineering Technology: Biomedical Engineering)
CiteScore: 12.30
Self-citation rate: 6.60%
Articles per year: 601
Review time: 135 days
Journal description: To encourage the development of formal computing methods, and their application in biomedical research and medical practice, by illustration of fundamental principles in biomedical informatics research; to stimulate basic research into application software design; to report the state of research of biomedical information processing projects; to report new computer methodologies applied in biomedical areas; the eventual distribution of demonstrable software to avoid duplication of effort; to provide a forum for discussion and improvement of existing software; to optimize contact between national organizations and regional user groups by promoting an international exchange of information on formal methods, standards and software in biomedicine.
Computer Methods and Programs in Biomedicine covers computing methodology and software systems derived from computing science for implementation in all aspects of biomedical research and medical practice. It is designed to serve: biochemists; biologists; geneticists; immunologists; neuroscientists; pharmacologists; toxicologists; clinicians; epidemiologists; psychiatrists; psychologists; cardiologists; chemists; (radio)physicists; computer scientists; programmers and systems analysts; biomedical, clinical, electrical and other engineers; teachers of medical informatics and users of educational software.