{"title":"Learning theory of minimum error entropy under weak moment conditions","authors":"Shouyou Huang, Yunlong Feng, Qiang Wu","doi":"10.1142/S0219530521500044","DOIUrl":null,"url":null,"abstract":"Minimum error entropy (MEE) is an information theoretic learning approach that minimizes the information contained in the prediction error, which is measured by entropy. It has been successfully used in various machine learning tasks for its robustness to heavy-tailed distributions and outliers. In this paper, we consider its use in nonparametric regression and analyze its generalization performance from a learning theory perspective by imposing a [Formula: see text]th order moment condition on the noise variable. To this end, we establish a comparison theorem to characterize the relation between the excess generalization error and the prediction error. A relaxed Bernstein condition and concentration inequalities are used to derive error bounds and learning rates. Note that the [Formula: see text]th moment condition is rather weak particularly when [Formula: see text] because the noise variable does not even admit a finite variance in this case. Therefore, our analysis explains the robustness of MEE in the presence of heavy-tailed distributions.","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":null,"pages":null},"PeriodicalIF":4.6000,"publicationDate":"2021-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1142/S0219530521500044","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}
Citations: 9
Abstract
Minimum error entropy (MEE) is an information-theoretic learning approach that minimizes the information contained in the prediction error, as measured by its entropy. It has been successfully used in various machine learning tasks because of its robustness to heavy-tailed distributions and outliers. In this paper, we consider its use in nonparametric regression and analyze its generalization performance from a learning theory perspective by imposing a $(1+\epsilon)$th-order moment condition on the noise variable. To this end, we establish a comparison theorem that characterizes the relation between the excess generalization error and the prediction error. A relaxed Bernstein condition and concentration inequalities are used to derive error bounds and learning rates. Note that the $(1+\epsilon)$th moment condition is rather weak, particularly when $\epsilon < 1$, because in this case the noise variable need not even admit a finite variance. Therefore, our analysis explains the robustness of MEE in the presence of heavy-tailed distributions.
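To make the MEE criterion concrete, the sketch below fits a linear model by maximizing the Parzen-window "information potential" of the prediction errors, which is equivalent to minimizing an empirical estimate of Renyi's quadratic entropy of those errors. This is a minimal illustration of the general MEE idea under assumed choices (Gaussian kernel, bandwidth `h`, plain gradient ascent, and Student-t noise with df = 1.5, whose $(1+\epsilon)$th moments are finite only for $\epsilon < 0.5$, so its variance is infinite, matching the heavy-tailed regime discussed in the paper); it is not the paper's estimator or analysis, and all function names and hyperparameters are illustrative.

```python
import numpy as np

def mee_linear_fit(X, y, h=1.0, lr=1.0, steps=2000):
    """Fit y ~ X @ w by empirical MEE: maximize the information potential
    V(w) = (1/n^2) * sum_{i,j} exp(-(e_i - e_j)^2 / (2 h^2)),
    where e_i = y_i - x_i @ w. Maximizing V minimizes the Parzen-window
    estimate of Renyi's quadratic entropy of the errors."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(steps):
        e = y - X @ w
        d = e[:, None] - e[None, :]            # pairwise error differences d_ij
        K = np.exp(-d**2 / (2.0 * h**2))       # Gaussian kernel matrix
        # Gradient of V(w); using the antisymmetry of K*d in (i, j):
        # dV/dw = (2 / (n^2 h^2)) * sum_i (sum_j K_ij d_ij) x_i
        grad = 2.0 * ((K * d).sum(axis=1) @ X) / (n**2 * h**2)
        w += lr * grad                          # ascend the information potential
    # Entropy is shift-invariant, so MEE determines w only up to an
    # intercept; recenter the predictions by the mean residual.
    b = np.mean(y - X @ w)
    return w, b

# Toy usage: heavy-tailed Student-t noise (df = 1.5, infinite variance),
# the setting where MEE is expected to be more robust than least squares.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.standard_t(df=1.5, size=200)
w_hat, b_hat = mee_linear_fit(X, y)
print("estimated weights:", w_hat, "intercept:", b_hat)
```

Because the pairwise kernel average downweights error pairs that are far apart, a few extreme residuals contribute little to the objective, which is one intuitive way to see the robustness to outliers that the paper's analysis makes rigorous under the weak moment condition.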