{"title":"神经网络的紧致性","authors":"K. Miyajima, Hiroshi Yamazaki","doi":"10.2478/forma-2022-0002","DOIUrl":null,"url":null,"abstract":"Summary In this article, Feed-forward Neural Network is formalized in the Mizar system [1], [2]. First, the multilayer perceptron [6], [7], [8] is formalized using functional sequences. Next, we show that a set of functions generated by these neural networks satisfies equicontinuousness and equiboundedness property [10], [5]. At last, we formalized the compactness of the function set of these neural networks by using the Ascoli-Arzela’s theorem according to [4] and [3].","PeriodicalId":42667,"journal":{"name":"Formalized Mathematics","volume":"63 1","pages":"13 - 21"},"PeriodicalIF":1.0000,"publicationDate":"2022-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Compactness of Neural Networks\",\"authors\":\"K. Miyajima, Hiroshi Yamazaki\",\"doi\":\"10.2478/forma-2022-0002\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Summary In this article, Feed-forward Neural Network is formalized in the Mizar system [1], [2]. First, the multilayer perceptron [6], [7], [8] is formalized using functional sequences. Next, we show that a set of functions generated by these neural networks satisfies equicontinuousness and equiboundedness property [10], [5]. At last, we formalized the compactness of the function set of these neural networks by using the Ascoli-Arzela’s theorem according to [4] and [3].\",\"PeriodicalId\":42667,\"journal\":{\"name\":\"Formalized Mathematics\",\"volume\":\"63 1\",\"pages\":\"13 - 21\"},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2022-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Formalized Mathematics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2478/forma-2022-0002\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Formalized Mathematics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2478/forma-2022-0002","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Summary. In this article, feed-forward neural networks are formalized in the Mizar system [1], [2]. First, the multilayer perceptron [6], [7], [8] is formalized using functional sequences. Next, we show that the set of functions generated by these neural networks is equicontinuous and equibounded [10], [5]. Finally, we formalize the compactness of this function set using the Ascoli-Arzelà theorem, following [4] and [3].
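For orientation, the following is a minimal LaTeX sketch of the mathematics behind the formalization, stated in standard textbook form rather than in the exact Mizar notation of the article; the symbols (layer maps f_i, weight matrices W_i, activation sigma, family F, bounds M and delta) are illustrative assumptions, not the paper's own identifiers.

% A multilayer perceptron as a composition of layer maps (illustrative notation):
\[
  N(x) = f_L \circ f_{L-1} \circ \cdots \circ f_1(x),
  \qquad f_i(x) = \sigma(W_i x + b_i).
\]
% Equiboundedness of a family \mathcal{F} of such networks on a compact set K:
\[
  \exists M > 0 \;\; \forall N \in \mathcal{F} \;\; \forall x \in K :\; \|N(x)\| \le M.
\]
% Equicontinuity of \mathcal{F} on K:
\[
  \forall \varepsilon > 0 \;\; \exists \delta > 0 \;\; \forall N \in \mathcal{F} \;\;
  \forall x, y \in K :\; \|x - y\| < \delta \;\Rightarrow\; \|N(x) - N(y)\| < \varepsilon.
\]
% Ascoli-Arzelà: a family of continuous functions on a compact metric space that is
% equibounded and equicontinuous is relatively compact in the uniform norm.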
About the journal:
Formalized Mathematics is issued quarterly and publishes papers that are abstracts of Mizar articles contributed to the Mizar Mathematical Library (MML), the basis of a knowledge management system for mathematics.