Auto Contouring of OAR in Pelvic CT Images Using an Encoder-Decoder Based Deep Residual Network

Seenia Francis, Harsh Bagaria, J. B, Pournami P N, Niyas Puzhakkal
Published in: 2022 IEEE International Conference on Distributed Computing and Electrical Circuits and Electronics (ICDCECE)
Publication date: 2022-04-23
DOI: 10.1109/icdcece53908.2022.9792926
Citations: 0

Abstract

Automatic contouring of organs at risk (OARs) is vital in radiotherapy treatment planning for cancer. Calculating an accurate distribution of the radiation dose delivered to cancer cells requires identifying nearby organs so they can be safeguarded from irradiation. This is particularly challenging in the pelvic region due to unclear organ boundaries, bowel gas, and inter-patient variation in organ size. Deep learning-based automatic contouring can mitigate the difficulties of manual contouring. In this paper, a modified U-Net with a particular residual network (ResNeXt), which adds a horizontal dimension (cardinality) to depth and width, is used to contour three organs, namely the bladder and the left and right femoral heads, on pelvic Computed Tomography (CT) images. The experiments yield a Dice coefficient above 0.88 for all three organs. The model outperforms the typical U-Net and is comparable to other state-of-the-art models in this area. The proposed model can be used for automatic contouring of pelvic organs and can reduce treatment planning time significantly.
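The Dice coefficient reported above measures the overlap between a predicted contour mask and the ground-truth mask. A minimal NumPy sketch of the standard definition (not the paper's evaluation code; the `eps` smoothing term is a common convention assumed here) looks like this:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary masks.

    Dice = 2 * |pred AND target| / (|pred| + |target|),
    with a small eps to avoid division by zero on empty masks.
    """
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
```

A value of 1.0 means perfect overlap and 0.0 means no overlap, so the reported scores above 0.88 indicate close agreement with the manual contours.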
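The "horizontal dimension" mentioned in the abstract refers to ResNeXt's cardinality: the input is split into several parallel groups, each transformed independently, and the aggregated result is added to a residual shortcut. The following toy NumPy sketch (a hypothetical illustration of the aggregated-transform idea, not the paper's convolutional implementation) shows the pattern on flat feature vectors:

```python
import numpy as np

def resnext_style_block(x, group_weights):
    """Aggregated residual transform over `len(group_weights)` parallel groups.

    x             : array of shape (batch, channels)
    group_weights : list of (channels/cardinality, channels/cardinality) matrices,
                    one linear transform per group (stand-ins for the grouped
                    convolutions used in a real ResNeXt block)
    """
    cardinality = len(group_weights)
    # split the channel axis into `cardinality` independent groups
    groups = np.split(x, cardinality, axis=-1)
    # transform each group separately, then aggregate by concatenation
    transformed = [g @ w for g, w in zip(groups, group_weights)]
    # residual shortcut: add the input back to the aggregated transform
    return x + np.concatenate(transformed, axis=-1)
```

Increasing the cardinality adds capacity "horizontally" without making the network deeper or its layers wider, which is the design choice the modified U-Net in this paper borrows.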