Connections between physics, mathematics, and deep learning.

Q2 · Physics and Astronomy
Jean Thierry-Mieg
{"title":"物理、数学和深度学习之间的联系。","authors":"Jean Thierry-Mieg","doi":"10.31526/lhep.3.2019.110","DOIUrl":null,"url":null,"abstract":"<p><p>Starting from Fermat's principle of least action, which governs classical and quantum mechanics and from the theory of exterior differential forms, which governs the geometry of curved manifolds, we show how to derive the equations governing neural networks in an intrinsic, coordinate-invariant way, where the loss function plays the role of the Hamiltonian. To be covariant, these equations imply a layer metric which is instrumental in pretraining and explains the role of conjugation when using complex numbers. The differential formalism clarifies the relation of the gradient descent optimizer with Aristotelian and Newtonian mechanics. The Bayesian paradigm is then analyzed as a renormalizable theory yielding a new derivation of the Bayesian information criterion. We hope that this formal presentation of the differential geometry of neural networks will encourage some physicists to dive into deep learning and, reciprocally, that the specialists of deep learning will better appreciate the close interconnection of their subject with the foundations of classical and quantum field theory.</p>","PeriodicalId":36085,"journal":{"name":"Letters in High Energy Physics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2019-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8462849/pdf/nihms-1732645.pdf","citationCount":"0","resultStr":"{\"title\":\"Connections between physics, mathematics, and deep learning.\",\"authors\":\"Jean Thierry-Mieg\",\"doi\":\"10.31526/lhep.3.2019.110\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Starting from Fermat's principle of least action, which governs classical and quantum mechanics and from the theory of exterior differential forms, which governs the geometry of curved manifolds, we show how to derive the equations governing neural networks in an intrinsic, coordinate-invariant way, where the loss function plays the role of the Hamiltonian. To be covariant, these equations imply a layer metric which is instrumental in pretraining and explains the role of conjugation when using complex numbers. The differential formalism clarifies the relation of the gradient descent optimizer with Aristotelian and Newtonian mechanics. The Bayesian paradigm is then analyzed as a renormalizable theory yielding a new derivation of the Bayesian information criterion. 
We hope that this formal presentation of the differential geometry of neural networks will encourage some physicists to dive into deep learning and, reciprocally, that the specialists of deep learning will better appreciate the close interconnection of their subject with the foundations of classical and quantum field theory.</p>\",\"PeriodicalId\":36085,\"journal\":{\"name\":\"Letters in High Energy Physics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-08-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8462849/pdf/nihms-1732645.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Letters in High Energy Physics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.31526/lhep.3.2019.110\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"Physics and Astronomy\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Letters in High Energy Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.31526/lhep.3.2019.110","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Physics and Astronomy","Score":null,"Total":0}
Citations: 0

Abstract

Starting from Fermat's principle of least action, which governs classical and quantum mechanics, and from the theory of exterior differential forms, which governs the geometry of curved manifolds, we show how to derive the equations governing neural networks in an intrinsic, coordinate-invariant way, where the loss function plays the role of the Hamiltonian. To be covariant, these equations imply a layer metric which is instrumental in pretraining and explains the role of conjugation when using complex numbers. The differential formalism clarifies the relation of the gradient descent optimizer to Aristotelian and Newtonian mechanics. The Bayesian paradigm is then analyzed as a renormalizable theory yielding a new derivation of the Bayesian information criterion. We hope that this formal presentation of the differential geometry of neural networks will encourage some physicists to dive into deep learning and, reciprocally, that the specialists of deep learning will better appreciate the close interconnection of their subject with the foundations of classical and quantum field theory.
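To make the mechanical analogy concrete, here is a minimal sketch (illustrative only, not the paper's formalism or code; all names and parameter values are assumptions of the sketch) contrasting the two dynamics on a toy quadratic loss: plain gradient descent, where velocity is proportional to force as in Aristotelian mechanics, and momentum (heavy-ball) descent, where force changes momentum as in Newtonian mechanics.

```python
import numpy as np

# Toy quadratic loss L(w) = 0.5 * w^T A w, whose gradient is A @ w.
A = np.diag([1.0, 10.0])            # positive-definite "Hessian"
grad = lambda w: A @ w

def aristotelian_descent(w, lr=0.05, steps=200):
    """Plain gradient descent: velocity proportional to force, no inertia."""
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

def newtonian_descent(w, lr=0.05, friction=0.1, steps=200):
    """Momentum (heavy-ball) descent: force changes momentum, momentum moves w."""
    v = np.zeros_like(w)
    for _ in range(steps):
        v = (1.0 - friction) * v - lr * grad(w)   # damped Newtonian update
        w = w + v
    return w

w0 = np.array([1.0, 1.0])
print(aristotelian_descent(w0))     # both approach the minimum at the origin
print(newtonian_descent(w0))
```

On the conjugation point: for a real-valued loss \(L\) of complex parameters \(z\), the standard Wirtinger-calculus steepest-descent update is \(z \mapsto z - \eta\, \partial L/\partial \bar{z}\), so the conjugate derivative supplies the descent direction. That is one well-known place conjugation enters; the paper derives its role instead from the layer metric.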

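For orientation, the Bayesian information criterion that the paper rederives (there via a renormalization-style argument) has the standard form, stated here as a known result rather than reproduced from the paper's derivation:

```latex
\mathrm{BIC} = k \ln n - 2 \ln \hat{L}
```

where \(k\) is the number of free parameters, \(n\) the number of data points, and \(\hat{L}\) the maximized likelihood; among candidate models, the one minimizing BIC is preferred.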
Source journal
Letters in High Energy Physics (Physics and Astronomy – Nuclear and High Energy Physics)
CiteScore: 1.20
Self-citation rate: 0.00%
Articles per year: 4
Review time: 12 weeks