Logic of Differentiable Logics: Towards a Uniform Semantics of DL

Natalia Ślusarz, Ekaterina Komendantskaya, Matthew Daggitt, Robert Stewart, Kathrin Stark
{"title":"Logic of Differentiable Logics: Towards a Uniform Semantics of DL","authors":"Natalia Ślusarz, Ekaterina Komendantskaya, Matthew Daggitt, Robert Stewart, Kathrin Stark","doi":"10.29007/c1nt","DOIUrl":null,"url":null,"abstract":"Differentiable logics (DL) have recently been proposed as a method of training neural networks to satisfy logical specifications. A DL consists of a syntax in which specifications are stated and an interpretation function that translates expressions in the syntax into loss functions. These loss functions can then be used during training with standard gradient descent algorithms. The variety of existing DLs and the differing levels of formality with which they are treated makes a systematic comparative study of their properties and implementations difficult. This paper remedies this problem by suggesting a meta-language for defining DLs that we call the Logic of Differentiable Logics, or LDL. Syntactically, it generalises the syntax of existing DLs to FOL, and for the first time introduces the formalism for reasoning about vectors and learners. Semantically, it introduces a general interpretation function that can be instantiated to define loss functions arising from different existing DLs. We use LDL to establish several theoretical properties of existing DLs and to conduct their empirical study in neural network verification.","PeriodicalId":93549,"journal":{"name":"EPiC series in computing","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"EPiC series in computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.29007/c1nt","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Differentiable logics (DL) have recently been proposed as a method of training neural networks to satisfy logical specifications. A DL consists of a syntax in which specifications are stated and an interpretation function that translates expressions in the syntax into loss functions. These loss functions can then be used during training with standard gradient descent algorithms. The variety of existing DLs and the differing levels of formality with which they are treated make a systematic comparative study of their properties and implementations difficult. This paper remedies this problem by suggesting a meta-language for defining DLs that we call the Logic of Differentiable Logics, or LDL. Syntactically, it generalises the syntax of existing DLs to FOL, and for the first time introduces a formalism for reasoning about vectors and learners. Semantically, it introduces a general interpretation function that can be instantiated to define loss functions arising from different existing DLs. We use LDL to establish several theoretical properties of existing DLs and to conduct their empirical study in neural network verification.
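
To make the core idea concrete, below is a minimal sketch (not the paper's LDL, and not any specific published DL) of how an interpretation function can translate a tiny specification syntax into a non-negative loss that vanishes exactly when the specification is satisfied, in the style of DL2-like translations. All class and function names here are illustrative assumptions.

```python
# A minimal, hypothetical sketch of the differentiable-logic idea:
# a small specification syntax is interpreted as a real-valued loss
# that is zero when the specification holds and positive otherwise.
from dataclasses import dataclass
from typing import Union

Expr = Union["LessEq", "And", "Or"]

@dataclass
class LessEq:          # atomic constraint  a <= b
    a: float
    b: float

@dataclass
class And:             # conjunction of two sub-specifications
    left: Expr
    right: Expr

@dataclass
class Or:              # disjunction of two sub-specifications
    left: Expr
    right: Expr

def loss(e: Expr) -> float:
    """Interpretation function: maps a specification to a non-negative loss."""
    if isinstance(e, LessEq):
        return max(e.a - e.b, 0.0)               # penalise violation of a <= b
    if isinstance(e, And):
        return loss(e.left) + loss(e.right)      # both conjuncts must hold
    if isinstance(e, Or):
        return min(loss(e.left), loss(e.right))  # either disjunct suffices
    raise TypeError(f"unknown expression: {e!r}")

if __name__ == "__main__":
    # Specification "the network output y lies in [0, 1]" for y = 1.3:
    y = 1.3
    spec = And(LessEq(0.0, y), LessEq(y, 1.0))
    print(loss(spec))   # 0.3 -- positive, so gradient descent has a signal to reduce
```

In an actual training setup the numeric leaves would be differentiable tensor expressions (network outputs), so the resulting loss can be minimised alongside the usual training objective; different DLs correspond to different choices of the translation for the connectives.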