Approximation of functionals on Korobov spaces with Fourier Functional Networks
Peilin Liu, Yuqing Liu, Xiang Zhou, Ding-Xuan Zhou
Neural Networks, Volume 182, Article 106922 (published 2024-11-20). DOI: 10.1016/j.neunet.2024.106922
URL: https://www.sciencedirect.com/science/article/pii/S0893608024008517
Citations: 0
Abstract
Learning from functional data with deep neural networks has become increasingly useful, and numerous neural network architectures have been developed to tackle high-dimensional problems arising in practical domains. Despite the impressive practical achievements, the theoretical foundations underpinning the ability of neural networks to learn from functional data remain largely unexplored. In this paper, we investigate the approximation capacity of a functional neural network, called the Fourier Functional Network, consisting of Fourier neural operators and deep convolutional neural networks with a great reduction in parameters. We establish rates of approximation by Fourier Functional Networks for nonlinear continuous functionals defined on Korobov spaces of periodic functions. Finally, our results demonstrate dimension-independent convergence rates, overcoming the curse of dimensionality.
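The paper itself gives no code; as a rough illustration of the kind of spectral layer a Fourier neural operator is built from (the component the abstract names), here is a minimal NumPy sketch. The function name `fourier_layer` and all parameters are hypothetical, and a real architecture would stack such layers with learned weights, pointwise linear terms, and nonlinearities.

```python
import numpy as np

def fourier_layer(v, weights, n_modes):
    """One spectral-convolution layer in the style of a Fourier neural operator.

    v:       (n,) real samples of a periodic function on a uniform grid
    weights: (n_modes,) complex multipliers for the retained low frequencies
    n_modes: number of low-frequency Fourier modes kept (truncation)
    """
    v_hat = np.fft.rfft(v)                         # forward FFT of the input function
    out_hat = np.zeros_like(v_hat)
    out_hat[:n_modes] = weights * v_hat[:n_modes]  # act only on the low modes
    return np.fft.irfft(out_hat, n=len(v))         # back to physical space

# Toy usage: apply a random spectral multiplier to sin(2*pi*x) on 64 grid points
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 64, endpoint=False)
v = np.sin(2 * np.pi * x)
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)
u = fourier_layer(v, w, n_modes=8)
print(u.shape)  # (64,)
```

Truncating to a fixed number of Fourier modes is what keeps the parameter count independent of the grid resolution, which is in the spirit of the parameter reduction the abstract mentions.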
Journal Introduction
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.