Analysis of structured deep kernel networks
Tizian Wenzel, Gabriele Santin, Bernard Haasdonk
Journal of Computational and Applied Mathematics, Volume 476, Article 116975 (2025). DOI: 10.1016/j.cam.2025.116975
Abstract
In this paper, we leverage a recent deep kernel representer theorem to connect kernel-based learning and (deep) neural networks in order to understand their interplay. In particular, we show that the use of special types of kernels yields models reminiscent of neural networks that are founded on the same theoretical framework as classical kernel methods, while benefiting from the computational advantages of deep neural networks. Notably, the introduced Structured Deep Kernel Networks (SDKNs) can be viewed as neural networks (NNs) with optimizable activation functions obeying a representer theorem. This link also allows us to analyze NNs within the framework of kernel networks. We prove analytic properties of SDKNs that establish their universal approximation capabilities in three different asymptotic regimes: an unbounded number of centers, unbounded width, and unbounded depth. In the case of unbounded depth in particular, more accurate constructions can be achieved with fewer layers than corresponding constructions for ReLU neural networks. This is made possible by leveraging properties of kernel approximation.
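To make the idea of "neural networks with optimizable activation functions obeying a representer theorem" concrete, the following is a minimal PyTorch sketch, not the authors' reference implementation: each activation is modeled as a per-channel kernel expansion f_j(t) = sum_k a_{jk} k(t, c_k) with trainable coefficients, alternated with linear layers. The Gaussian kernel, the fixed centers, the layer widths, and all names (KernelActivation, SDKNSketch) are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: per-channel kernel-expansion activations alternated
# with linear layers. Kernel choice, centers, and dimensions are assumptions.
import torch
import torch.nn as nn


class KernelActivation(nn.Module):
    """Channel-wise optimizable activation given by a 1D Gaussian kernel expansion."""

    def __init__(self, channels: int, num_centers: int = 10):
        super().__init__()
        # Fixed 1D centers shared across channels (an assumption; they could also be trained).
        self.register_buffer("centers", torch.linspace(-3.0, 3.0, num_centers))
        # Trainable expansion coefficients: one set per channel (the representer-theorem part).
        self.coeffs = nn.Parameter(0.1 * torch.randn(channels, num_centers))
        self.gamma = 1.0  # kernel width parameter

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels); evaluate k(x_j, c_k) = exp(-gamma * (x_j - c_k)^2)
        diff = x.unsqueeze(-1) - self.centers          # (batch, channels, num_centers)
        k = torch.exp(-self.gamma * diff ** 2)
        return (k * self.coeffs).sum(dim=-1)           # (batch, channels)


class SDKNSketch(nn.Module):
    """Alternates linear maps with kernel activations, mimicking the structured layout."""

    def __init__(self, dims=(2, 16, 16, 1)):
        super().__init__()
        layers = []
        for d_in, d_out in zip(dims[:-1], dims[1:]):
            layers.append(nn.Linear(d_in, d_out))
            layers.append(KernelActivation(d_out))
        # Drop the activation after the final linear map.
        self.net = nn.Sequential(*layers[:-1])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


if __name__ == "__main__":
    model = SDKNSketch()
    y = model(torch.randn(8, 2))
    print(y.shape)  # torch.Size([8, 1])
```

In such a sketch, both the linear weights and the expansion coefficients would be trained end to end, so the "activation functions" themselves are optimized rather than fixed as in ReLU networks.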
About the journal:
The Journal of Computational and Applied Mathematics publishes original papers of high scientific value in all areas of computational and applied mathematics. The main interest of the Journal is in papers that describe and analyze new computational techniques for solving scientific or engineering problems. Improved analysis of existing methods and algorithms, including their effectiveness and applicability, is also of interest. Computational efficiency (e.g., convergence, stability, accuracy) should be proved and illustrated by nontrivial numerical examples. Papers describing only variants of existing methods, without adding significant new computational properties, are not of interest.
The audience consists of applied mathematicians, numerical analysts, computational scientists, and engineers.