{"title":"Robust deep learning from weakly dependent data","authors":"William Kengne , Modou Wade","doi":"10.1016/j.neunet.2025.107227","DOIUrl":null,"url":null,"abstract":"<div><div>Recent developments in deep learning have established some theoretical properties of deep neural network estimators. However, most existing works on this topic are restricted to bounded loss functions or to (sub)-Gaussian or bounded variables. This paper considers robust deep learning from weakly dependent observations, with an unbounded loss function and unbounded output. It is only assumed that the output variable has a finite moment of order <span><math><mi>r</mi></math></span>, with <span><math><mrow><mi>r</mi><mo>></mo><mn>1</mn></mrow></math></span>. Non-asymptotic bounds for the expected excess risk of the deep neural network estimator are established under strong mixing and <span><math><mi>ψ</mi></math></span>-weak dependence assumptions on the observations. We derive a relationship between these bounds and <span><math><mi>r</mi></math></span>; when the data have moments of any order, the convergence rate is close to some well-known results. When the target predictor belongs to the class of Hölder smooth functions with a sufficiently large smoothness index, the rate of the expected excess risk for exponentially strongly mixing data is close to that obtained with i.i.d. samples. Applications to robust nonparametric regression and robust nonparametric autoregression are considered. The simulation study for models with heavy-tailed errors shows that robust estimators with the absolute loss and the Huber loss outperform the least squares method.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"185 ","pages":"Article 107227"},"PeriodicalIF":6.0000,"publicationDate":"2025-02-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025001066","RegionNum":1,"RegionCategory":"Computer Science","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Recent developments in deep learning have established some theoretical properties of deep neural network estimators. However, most existing works on this topic are restricted to bounded loss functions or to (sub)-Gaussian or bounded variables. This paper considers robust deep learning from weakly dependent observations, with an unbounded loss function and unbounded output. It is only assumed that the output variable has a finite moment of order r, with r > 1. Non-asymptotic bounds for the expected excess risk of the deep neural network estimator are established under strong mixing and ψ-weak dependence assumptions on the observations. We derive a relationship between these bounds and r; when the data have moments of any order, the convergence rate is close to some well-known results. When the target predictor belongs to the class of Hölder smooth functions with a sufficiently large smoothness index, the rate of the expected excess risk for exponentially strongly mixing data is close to that obtained with i.i.d. samples. Applications to robust nonparametric regression and robust nonparametric autoregression are considered. The simulation study for models with heavy-tailed errors shows that robust estimators with the absolute loss and the Huber loss outperform the least squares method.
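The robustness effect the abstract reports can be illustrated on a toy problem. The sketch below is not the paper's code: it estimates a single location parameter from data containing a gross outlier (a stand-in for heavy-tailed errors), comparing the least squares estimate (the sample mean) with a Huber-loss estimate fitted by gradient descent. The data values, threshold `delta`, and step size are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): location estimation under
# a heavy-tailed/outlier-contaminated sample, Huber loss vs. least squares.

def huber_grad(r, delta=1.0):
    # Derivative of the Huber loss at residual r: quadratic region gives
    # gradient r, tails give a bounded gradient +/-delta, so a single
    # outlier has bounded influence on the estimate.
    return r if abs(r) <= delta else delta * (1.0 if r > 0 else -1.0)

def huber_location(data, delta=1.0, lr=0.1, steps=2000):
    # Gradient descent on the mean Huber loss (1/n) * sum_i rho(theta - x_i).
    theta = 0.0
    n = len(data)
    for _ in range(steps):
        g = sum(huber_grad(theta - x, delta) for x in data) / n
        theta -= lr * g
    return theta

# Five clean observations around 1.0 plus one gross outlier.
data = [0.9, 1.0, 1.1, 1.05, 0.95, 100.0]
ls_estimate = sum(data) / len(data)   # least squares estimate = sample mean
hub_estimate = huber_location(data)

print(ls_estimate)   # dragged far from 1.0 by the outlier (17.5 here)
print(hub_estimate)  # stays close to 1.0
```

The bounded tail gradient is what makes the Huber (and similarly the absolute) loss robust: the outlier contributes at most `delta` to the gradient, whereas under the squared loss its contribution grows linearly with its magnitude, which matches the simulation finding that absolute and Huber losses outperform least squares under heavy-tailed errors.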
About the journal:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.