Automatic Differentiation in Prolog

IF 1.4 · Region 2 (Mathematics) · Q3 COMPUTER SCIENCE, SOFTWARE ENGINEERING
TOM SCHRIJVERS, BIRTHE VAN DEN BERG, FABRIZIO RIGUZZI
{"title":"Prolog中的自动区分","authors":"TOM SCHRIJVERS, BIRTHE VAN DEN BERG, FABRIZIO RIGUZZI","doi":"10.1017/s1471068423000145","DOIUrl":null,"url":null,"abstract":"Abstract Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function’s (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.","PeriodicalId":49436,"journal":{"name":"Theory and Practice of Logic Programming","volume":null,"pages":null},"PeriodicalIF":1.4000,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Automatic Differentiation in Prolog\",\"authors\":\"TOM SCHRIJVERS, BIRTHE VAN DEN BERG, FABRIZIO RIGUZZI\",\"doi\":\"10.1017/s1471068423000145\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function’s (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. 
The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.\",\"PeriodicalId\":49436,\"journal\":{\"name\":\"Theory and Practice of Logic Programming\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2023-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Theory and Practice of Logic Programming\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1017/s1471068423000145\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Theory and Practice of Logic Programming","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1017/s1471068423000145","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 0

Abstract

Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function’s (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.
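
To make the starting point concrete, the following is a minimal, hypothetical sketch (not the authors' code) of forward-mode AD in Prolog, obtained by fusing the symbolic differentiation rules with an evaluator so that an expression and its partial derivative are computed in a single pass. The predicate name eval_grad/5 and the expression constructors num/1, var/1, plus/2, times/2 and sin/1 are illustrative assumptions.

:- use_module(library(lists)).                    % member/2

% eval_grad(+Expr, +Env, +X, -Val, -Der)
% Env is a list of Name-Value pairs; X names the variable we differentiate against.
% Illustrative sketch only; not taken from the paper.
eval_grad(num(N), _Env, _X, N, 0).
eval_grad(var(V), Env, X, Val, Der) :-
    member(V-Val, Env),
    ( V == X -> Der = 1 ; Der = 0 ).
eval_grad(plus(A, B), Env, X, Val, Der) :-        % sum rule
    eval_grad(A, Env, X, VA, DA),
    eval_grad(B, Env, X, VB, DB),
    Val is VA + VB,
    Der is DA + DB.
eval_grad(times(A, B), Env, X, Val, Der) :-       % product rule
    eval_grad(A, Env, X, VA, DA),
    eval_grad(B, Env, X, VB, DB),
    Val is VA * VB,
    Der is DA * VB + VA * DB.
eval_grad(sin(A), Env, X, Val, Der) :-            % chain rule
    eval_grad(A, Env, X, VA, DA),
    Val is sin(VA),
    Der is cos(VA) * DA.

For example, the query ?- eval_grad(times(var(x), plus(var(x), num(3))), [x-2.0], x, V, D). succeeds with V = 10.0 and D = 7.0, i.e. f(x) = x(x+3) and f'(x) = 2x+3 evaluated at x = 2. The paper's reverse-mode variant and its use of DCGs and co-routines go beyond this sketch.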
Source journal
Theory and Practice of Logic Programming
Category: Engineering & Technology - Computer Science: Theory & Methods
CiteScore: 4.50
Self-citation rate: 21.40%
Articles published: 40
Review time: >12 weeks
Journal description: Theory and Practice of Logic Programming emphasises both the theory and practice of logic programming. Logic programming applies to all areas of artificial intelligence and computer science and is fundamental to them. Among the topics covered are AI applications that use logic programming, logic programming methodologies, specification, analysis and verification of systems, inductive logic programming, multi-relational data mining, natural language processing, knowledge representation, non-monotonic reasoning, semantic web reasoning, databases, implementations and architectures, and constraint logic programming.