Neural Nets via Forward State Transformation and Backward Loss Transformation

JCR: Q3 (Computer Science)
Bart Jacobs, David Sprunger
{"title":"基于前向状态变换和后向损失变换的神经网络","authors":"Bart Jacobs ,&nbsp;David Sprunger","doi":"10.1016/j.entcs.2019.09.009","DOIUrl":null,"url":null,"abstract":"<div><p>This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved — both forward and backward — in order to develop a semantic/logical perspective that is in line with standard program semantics. The common two-pass neural network training algorithms make this viewpoint particularly fitting. In the forward direction, neural networks act as state transformers, using Kleisli composition for the multiset monad — for the linear parts of network layers. In the reverse direction, however, neural networks change losses of outputs to losses of inputs, thereby acting like a (real-valued) predicate transformer. In this way, backpropagation is functorial by construction, as shown in other works recently. We illustrate this perspective by training a simple instance of a neural network.</p></div>","PeriodicalId":38770,"journal":{"name":"Electronic Notes in Theoretical Computer Science","volume":"347 ","pages":"Pages 161-177"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.entcs.2019.09.009","citationCount":"4","resultStr":"{\"title\":\"Neural Nets via Forward State Transformation and Backward Loss Transformation\",\"authors\":\"Bart Jacobs ,&nbsp;David Sprunger\",\"doi\":\"10.1016/j.entcs.2019.09.009\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved — both forward and backward — in order to develop a semantic/logical perspective that is in line with standard program semantics. The common two-pass neural network training algorithms make this viewpoint particularly fitting. In the forward direction, neural networks act as state transformers, using Kleisli composition for the multiset monad — for the linear parts of network layers. In the reverse direction, however, neural networks change losses of outputs to losses of inputs, thereby acting like a (real-valued) predicate transformer. In this way, backpropagation is functorial by construction, as shown in other works recently. 
We illustrate this perspective by training a simple instance of a neural network.</p></div>\",\"PeriodicalId\":38770,\"journal\":{\"name\":\"Electronic Notes in Theoretical Computer Science\",\"volume\":\"347 \",\"pages\":\"Pages 161-177\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-11-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1016/j.entcs.2019.09.009\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Electronic Notes in Theoretical Computer Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1571066119301252\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"Computer Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronic Notes in Theoretical Computer Science","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1571066119301252","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Computer Science","Score":null,"Total":0}
Citations: 4

Abstract


This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved, both forward and backward, in order to develop a semantic/logical perspective in line with standard program semantics. The common two-pass neural network training algorithms make this viewpoint particularly fitting. In the forward direction, neural networks act as state transformers, with the linear parts of network layers composed via Kleisli composition for the multiset monad. In the reverse direction, neural networks transform losses on outputs into losses on inputs, thereby acting like a (real-valued) predicate transformer. In this way, backpropagation is functorial by construction, as has been shown in other recent work. We illustrate this perspective by training a simple instance of a neural network.
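To make the two directions concrete, here is a minimal sketch. For finite sets, a Kleisli map f : X → M(Y) for the real-valued multiset monad M assigns a coefficient f(x)(y) ∈ ℝ to each pair, and Kleisli composition with g : Y → M(Z) is (g ∘ f)(x)(z) = Σ_y g(y)(z) · f(x)(y), which is exactly matrix multiplication; this is why the linear parts of layers compose like matrices in the forward direction. The Python below illustrates the resulting two-pass structure: each layer transforms states in the forward direction and transforms a loss gradient on its output into one on its input in the backward direction. This is an illustration only, not the paper's categorical construction; the class name Dense, the sigmoid activation, the squared loss, and the learning rate are all assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

class Dense:
    """One perceptron layer: a linear (matrix) part followed by a sigmoid."""
    def __init__(self, n_in, n_out):
        self.W = rng.normal(scale=0.5, size=(n_out, n_in))
        self.b = np.zeros(n_out)

    def forward(self, x):
        # Forward state transformation: input state x becomes output state y.
        self.x = x
        self.y = 1.0 / (1.0 + np.exp(-(self.W @ x + self.b)))
        return self.y

    def backward(self, dL_dy, lr=0.5):
        # Backward loss transformation: a loss gradient on the output is
        # turned into a loss gradient on the input via the chain rule,
        # updating the layer's parameters along the way.
        dL_dz = dL_dy * self.y * (1.0 - self.y)   # sigmoid derivative
        dL_dx = self.W.T @ dL_dz                  # loss on the input
        self.W -= lr * np.outer(dL_dz, self.x)
        self.b -= lr * dL_dz
        return dL_dx

# Forward composes layers left-to-right; backward composes the induced
# loss transformers in the opposite (contravariant) order.
layers = [Dense(2, 3), Dense(3, 1)]
x, target = np.array([0.5, -0.3]), np.array([1.0])
for _ in range(500):
    y = x
    for layer in layers:
        y = layer.forward(y)
    grad = y - target            # gradient of the squared loss 1/2*|y - t|^2
    for layer in reversed(layers):
        grad = layer.backward(grad)
print(y)                         # close to the target after training
```

The structural point matches the abstract: the forward pass is a covariant state transformation, while the backward pass sends a loss on the final output back through the network, layer by layer, in the manner of a predicate transformer.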

Source journal
Electronic Notes in Theoretical Computer Science
Journal introduction: ENTCS is a venue for the rapid electronic publication of the proceedings of conferences, of lecture notes, monographs, and other similar material for which quick publication and availability on electronic media is appropriate. Organizers of conferences whose proceedings appear in ENTCS, and authors of other material appearing as a volume in the series, are allowed to make hard copies of the relevant volume for limited distribution. For example, conference proceedings may be distributed to participants at the meeting, and lecture notes can be distributed to those taking a course based on the material in the volume.