Formalizing Piecewise Affine Activation Functions of Neural Networks in Coq

A. Aleksandrov, Kim Völlinger
{"title":"Formalizing Piecewise Affine Activation Functions of Neural Networks in Coq","authors":"A. Aleksandrov, Kim Völlinger","doi":"10.48550/arXiv.2301.12893","DOIUrl":null,"url":null,"abstract":"Verification of neural networks relies on activation functions being piecewise affine (pwa) -- enabling an encoding of the verification problem for theorem provers. In this paper, we present the first formalization of pwa activation functions for an interactive theorem prover tailored to verifying neural networks within Coq using the library Coquelicot for real analysis. As a proof-of-concept, we construct the popular pwa activation function ReLU. We integrate our formalization into a Coq model of neural networks, and devise a verified transformation from a neural network N to a pwa function representing N by composing pwa functions that we construct for each layer. This representation enables encodings for proof automation, e.g. Coq's tactic lra -- a decision procedure for linear real arithmetic. Further, our formalization paves the way for integrating Coq in frameworks of neural network verification as a fallback prover when automated proving fails.","PeriodicalId":436677,"journal":{"name":"NASA Formal Methods","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"NASA Formal Methods","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2301.12893","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Verification of neural networks relies on activation functions being piecewise affine (pwa) -- enabling an encoding of the verification problem for theorem provers. In this paper, we present the first formalization of pwa activation functions for an interactive theorem prover tailored to verifying neural networks within Coq, using the library Coquelicot for real analysis. As a proof of concept, we construct the popular pwa activation function ReLU. We integrate our formalization into a Coq model of neural networks and devise a verified transformation from a neural network N to a pwa function representing N by composing pwa functions that we construct for each layer. This representation enables encodings for proof automation, e.g. Coq's tactic lra -- a decision procedure for linear real arithmetic. Further, our formalization paves the way for integrating Coq into neural network verification frameworks as a fallback prover when automated proving fails.
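
To make the idea concrete, here is a minimal, illustrative Coq sketch -- not the paper's Coquelicot-based development. It defines ReLU over Coq's reals as the maximum of its two affine pieces and shows that, once a case split selects a piece, the remaining goals are linear real arithmetic and Coq's lra tactic discharges them. The names relu, relu_nonneg_piece, relu_nonpos_piece and relu_bound are chosen here for illustration and are not taken from the paper.

    (* Illustrative sketch only: ReLU as the pointwise maximum of the two
       affine pieces x |-> 0 and x |-> x, using Rmax from the standard
       Reals library rather than the paper's pwa construction. *)
    Require Import Reals Lra.
    Open Scope R_scope.

    Definition relu (x : R) : R := Rmax 0 x.

    (* On the non-negative piece, relu coincides with the affine map x |-> x. *)
    Lemma relu_nonneg_piece : forall x, 0 <= x -> relu x = x.
    Proof. intros x Hx; unfold relu, Rmax; destruct (Rle_dec 0 x); lra. Qed.

    (* On the non-positive piece, relu coincides with the constant 0. *)
    Lemma relu_nonpos_piece : forall x, x <= 0 -> relu x = 0.
    Proof. intros x Hx; unfold relu, Rmax; destruct (Rle_dec 0 x); lra. Qed.

    (* The kind of goal such an encoding enables: after rewriting with the
       piece lemma, the remaining obligations are linear and lra closes them. *)
    Lemma relu_bound : forall x, 0 <= x <= 1 -> relu x <= 1.
    Proof. intros x [H0 H1]; rewrite relu_nonneg_piece; lra. Qed.

The paper's formalization goes further: it builds a general pwa function type and composes one pwa function per layer to represent the whole network; the sketch above only mirrors the single-activation case.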