On the Existence of Minimizers in Shallow Residual ReLU Neural Network Optimization Landscapes
Steffen Dereich, Arnulf Jentzen, Sebastian Kassing
DOI: 10.1137/23m1556241
SIAM Journal on Numerical Analysis, Volume 62, Issue 6, pp. 2640-2666, December 2024.
Abstract. In this article, we show the existence of minimizers in the loss landscape for residual artificial neural networks (ANNs) with a multidimensional input layer and one hidden layer with ReLU activation. Our work contrasts with earlier results in [D. Gallon, A. Jentzen, and F. Lindner, preprint, arXiv:2211.15641, 2022] and [P. Petersen, M. Raslan, and F. Voigtlaender, Found. Comput. Math., 21 (2021), pp. 375-444], which showed that in many situations minimizers do not exist for common smooth activation functions, even in the case where the target functions are polynomials. The proof of the existence property makes use of a closure of the search space containing all functions generated by ANNs and additional discontinuous generalized responses. As we will show, the additional generalized responses in this larger space are suboptimal, so that the minimum is attained in the original function class.
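To fix ideas, the following is a brief LaTeX sketch of the kind of objects the abstract refers to: a shallow residual ReLU response, the associated loss, and a discontinuous generalized response arising in the closure of the search space. The concrete parametrization (an affine skip term, the domain [0,1]^d, and an L^2 loss against a target f) is an illustrative assumption here and not necessarily the paper's exact setting.

% Response of a width-n shallow ReLU network on input x in R^d; the affine
% skip term lambda^T x is an assumed form of the residual connection.
\[
  \mathcal{R}_\theta(x)
  \;=\; \lambda^{\top} x \;+\; \sum_{i=1}^{n} c_i \max\{\langle w_i, x \rangle + b_i,\, 0\},
  \qquad \theta = \bigl(\lambda, (w_i, b_i, c_i)_{i=1}^{n}\bigr).
\]
% Existence of a minimizer means the infimum of the loss against a target f
% (here an assumed L^2 loss under a measure mu on [0,1]^d) is attained:
\[
  \mathcal{L}(\theta)
  \;=\; \int_{[0,1]^d} \bigl( \mathcal{R}_\theta(x) - f(x) \bigr)^2 \, \mu(\mathrm{d}x),
  \qquad
  \mathcal{L}(\theta^{*}) \;=\; \inf_{\theta} \mathcal{L}(\theta).
\]
% Why a closure is needed: along diverging parameter sequences, ReLU
% responses can converge pointwise to discontinuous limits; e.g., a pair of
% neurons with output weights m and -m produces an indicator function:
\[
  m \Bigl( \max\bigl\{\langle w, x \rangle + b + \tfrac{1}{m},\, 0\bigr\}
         - \max\bigl\{\langle w, x \rangle + b,\, 0\bigr\} \Bigr)
  \;\xrightarrow[m \to \infty]{}\; \mathbf{1}_{\{\langle w, x \rangle + b \,\ge\, 0\}}.
\]

The closure argument described in the abstract then shows that such discontinuous generalized responses are suboptimal in the enlarged space, so the infimum is attained by a genuine ANN response in the original function class.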
About the journal:
SIAM Journal on Numerical Analysis (SINUM) contains research articles on the development and analysis of numerical methods. Topics include the rigorous study of convergence of algorithms, their accuracy, their stability, and their computational complexity. Also included are results in mathematical analysis that contribute to algorithm analysis, and computational results that demonstrate algorithm behavior and applicability.