Tree-loss function for training neural networks on weakly-labelled datasets

S. Demyanov, R. Chakravorty, ZongYuan Ge, SeyedBehzad Bozorgtabar, M. Pablo, Adrian Bowling, R. Garnavi
DOI: 10.1109/ISBI.2017.7950521
Published in: 2017 IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017), 2017-04-18
Citations: 5

Abstract

Neural networks are powerful tools for medical image classification and segmentation. However, existing network structures and training procedures assume that the output classes are mutually exclusive and equally important. Many medical image datasets do not satisfy these conditions. For example, some skin disease datasets contain images labelled with a coarse-grained class (such as Benign) in addition to images with fine-grained labels (such as a Benign subclass called Blue Nevus), and conventional neural networks cannot leverage such additional data for training. Also, in clinical decision making, some classes (such as skin cancer or Melanoma) often carry more importance than other lesion types. We propose a novel Tree-Loss function for training and fine-tuning a neural network classifier using all available labelled images. The key step is the definition of the class taxonomy tree, which is used to describe the relations between labels. The tree can also be adjusted to reflect the desired importance of each class. These steps can be performed by a domain expert without detailed knowledge of machine learning techniques. The experiments demonstrate improved performance compared with the conventional approach, even without using additional data.
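The abstract does not give the loss formula, but a common way to let a classifier learn from coarse-grained labels via a taxonomy tree is to credit the total probability mass of all fine-grained leaves under the labelled node. The sketch below illustrates that idea only; the `TAXONOMY` mapping, class names, and `tree_loss` function are hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical taxonomy: each label (coarse or fine) maps to the set of
# fine-grained leaf-class indices beneath it in the tree.
# Leaf order assumed here: [BlueNevus, OtherBenign, Melanoma]
TAXONOMY = {
    "BlueNevus":   {0},
    "OtherBenign": {1},
    "Melanoma":    {2},
    "Benign":      {0, 1},  # coarse label covers both benign leaves
}

def softmax(logits):
    z = logits - logits.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def tree_loss(logits, label):
    """Negative log of the total probability mass inside the labelled subtree.

    For a fine-grained (leaf) label this reduces to ordinary cross-entropy;
    for a coarse label it rewards any prediction within the subtree.
    """
    probs = softmax(logits)
    subtree = sorted(TAXONOMY[label])
    return float(-np.log(probs[subtree].sum()))

logits = np.array([2.0, 1.0, 0.1])
print(tree_loss(logits, "Benign"))     # coarse label: small loss, mass is in subtree
print(tree_loss(logits, "BlueNevus"))  # fine label: standard cross-entropy term
```

Under this sketch, an image labelled only "Benign" still contributes a useful gradient: the network is pushed to place probability anywhere inside the benign subtree, which is how coarse-grained images become usable for training.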