Integrating Prior Knowledge into Deep Learning

Michelangelo Diligenti, Soumali Roychowdhury, M. Gori
{"title":"Integrating Prior Knowledge into Deep Learning","authors":"Michelangelo Diligenti, Soumali Roychowdhury, M. Gori","doi":"10.1109/ICMLA.2017.00-37","DOIUrl":null,"url":null,"abstract":"Deep learning allows to develop feature representations and train classification models in a fully integrated way. However, learning deep networks is quite hard and it improves over shallow architectures only if a large number of training data is available. Injecting prior knowledge into the learner is a principled way to reduce the amount of required training data, as the learner does not need to induce the knowledge from the data itself. In this paper we propose a general and principled way to integrate prior knowledge when training deep networks. Semantic Based Regularization (SBR) is used as underlying framework to represent the prior knowledge, expressed as a collection of first-order logic clauses (FOL), and where each task to be learned corresponds to a predicate in the knowledge base. The knowledge base correlates the tasks to be learned and it is translated into a set of constraints which are integrated into the learning process via backpropagation. The experimental results show how the integration of the prior knowledge boosts the accuracy of a state-of-the-art deep network on an image classification task.","PeriodicalId":6636,"journal":{"name":"2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA)","volume":"44 1","pages":"920-923"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"68","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLA.2017.00-37","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 68

Abstract

Deep learning makes it possible to develop feature representations and train classification models in a fully integrated way. However, learning deep networks is quite hard, and they improve over shallow architectures only when a large amount of training data is available. Injecting prior knowledge into the learner is a principled way to reduce the amount of required training data, as the learner does not need to induce the knowledge from the data itself. In this paper we propose a general and principled way to integrate prior knowledge when training deep networks. Semantic Based Regularization (SBR) is used as the underlying framework to represent the prior knowledge, expressed as a collection of first-order logic (FOL) clauses, where each task to be learned corresponds to a predicate in the knowledge base. The knowledge base correlates the tasks to be learned and is translated into a set of constraints, which are integrated into the learning process via backpropagation. The experimental results show how the integration of the prior knowledge boosts the accuracy of a state-of-the-art deep network on an image classification task.
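The core mechanism described in the abstract, translating FOL clauses into real-valued constraints whose violation is minimized through backpropagation alongside the supervised loss, can be illustrated compactly. Below is a minimal PyTorch sketch, not the authors' implementation: the rule forall x: cat(x) -> animal(x), the toy network, the Lukasiewicz relaxation of the implication, and the constraint weight lam are all assumptions made for this example.

```python
# Illustrative sketch (not the paper's code) of Semantic Based Regularization:
# the FOL rule  forall x: cat(x) -> animal(x)  is relaxed with the Lukasiewicz
# implication, whose degree of violation is max(0, p_cat - p_animal); that
# violation is added to the supervised loss and minimized by backpropagation.

import torch
import torch.nn as nn

# Hypothetical two-output network: p(cat|x) and p(animal|x).
net = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2), nn.Sigmoid())

def rule_loss(p_cat: torch.Tensor, p_animal: torch.Tensor) -> torch.Tensor:
    """Penalty for violating cat(x) -> animal(x).

    Under the Lukasiewicz t-norm the implication a -> b holds to degree
    min(1, 1 - a + b), so its violation is max(0, a - b): zero whenever
    the rule is satisfied, positive when cat(x) outscores animal(x).
    """
    return torch.relu(p_cat - p_animal).mean()

x = torch.randn(128, 64)                    # toy inputs
y = torch.randint(0, 2, (128, 2)).float()   # toy labels for both predicates

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
bce = nn.BCELoss()
lam = 0.1  # assumed weight of the logic constraint

for step in range(100):
    out = net(x)
    p_cat, p_animal = out[:, 0], out[:, 1]
    loss = bce(out, y) + lam * rule_loss(p_cat, p_animal)
    opt.zero_grad()
    loss.backward()  # gradients of the logic constraint flow through backprop too
    opt.step()
```

Because the relaxed constraint is differentiable, the knowledge base also shapes predictions on unlabeled inputs: the rule penalty can be evaluated on any batch, labeled or not, which is how injecting prior knowledge reduces the amount of supervision needed.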