{"title":"Robust selection and estimation for sparse multivariate functional nonparametric additive models via regularized Huber regression","authors":"Yilun Wang , Yuan Xue , Yujie Li , Gaorong Li","doi":"10.1016/j.cam.2025.117104","DOIUrl":null,"url":null,"abstract":"<div><div>In this paper, we investigate sparse functional additive models with multivariate functional predictors and a scalar response variable. This model adopts a nonparametric additive framework to flexibly incorporate multivariate functional principal component analysis (FPCA) scores, effectively capturing complex nonlinear relationships while mitigating the curse of dimensionality. To enhance robustness against outliers and heavy-tailed errors, we propose a regularized Huber regression method incorporating the component selection and smoothing operator (COSSO) penalty. The proposed approach is formulated within a reproducing kernel Hilbert space (RKHS) framework, enabling simultaneous component selection and estimation in a robust manner. Furthermore, we extend the locally adaptive majorize-minimization (LAMM) principle to develop a general iterative optimization algorithm applicable to any loss function with continuous gradients. Under mild assumptions on the error distribution (without requiring sub-Gaussian tails) and standard regularity conditions, we establish theoretical guarantees for the proposed estimator. Extensive simulation studies and a real data application to fluorescence spectroscopy demonstrate the superior performance of our method compared to existing alternatives.</div></div>","PeriodicalId":50226,"journal":{"name":"Journal of Computational and Applied Mathematics","volume":"476 ","pages":"Article 117104"},"PeriodicalIF":2.6000,"publicationDate":"2025-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational and Applied Mathematics","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0377042725006181","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
In this paper, we investigate sparse functional additive models with multivariate functional predictors and a scalar response variable. This model adopts a nonparametric additive framework to flexibly incorporate multivariate functional principal component analysis (FPCA) scores, effectively capturing complex nonlinear relationships while mitigating the curse of dimensionality. To enhance robustness against outliers and heavy-tailed errors, we propose a regularized Huber regression method incorporating the component selection and smoothing operator (COSSO) penalty. The proposed approach is formulated within a reproducing kernel Hilbert space (RKHS) framework, enabling simultaneous component selection and estimation in a robust manner. Furthermore, we extend the locally adaptive majorize-minimization (LAMM) principle to develop a general iterative optimization algorithm applicable to any loss function with continuous gradients. Under mild assumptions on the error distribution (without requiring sub-Gaussian tails) and standard regularity conditions, we establish theoretical guarantees for the proposed estimator. Extensive simulation studies and a real data application to fluorescence spectroscopy demonstrate the superior performance of our method compared to existing alternatives.
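The abstract highlights two computational ingredients: a Huber loss for robustness to outliers and heavy tails, and a locally adaptive majorize-minimization (LAMM) iteration combined with a sparsity-inducing component penalty. The sketch below illustrates one LAMM update for a Huber loss with a group-soft-thresholding proximal step, assuming each additive component is represented by a finite basis expansion; the function names (`huber_loss`, `lamm_step`, etc.) and the Euclidean group penalty standing in for the COSSO norm are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def huber_loss(r, tau):
    """Huber loss applied elementwise to residuals r with robustness parameter tau."""
    abs_r = np.abs(r)
    return np.where(abs_r <= tau, 0.5 * r**2, tau * abs_r - 0.5 * tau**2)

def huber_grad(r, tau):
    """Derivative of the Huber loss with respect to the residual."""
    return np.clip(r, -tau, tau)

def lamm_step(beta, phi, X_blocks, y, tau, lam, phi_factor=2.0, max_inner=50):
    """One locally adaptive majorize-minimization (LAMM) update (illustrative sketch).

    beta     : list of coefficient vectors, one per additive component
    X_blocks : list of design matrices (basis evaluations), one per component
    The isotropic quadratic parameter phi is inflated until the surrogate
    majorizes the Huber loss at the candidate point; the penalized surrogate
    is minimized by a group soft-thresholding (proximal) step.
    """
    n = len(y)
    fitted = sum(Xj @ bj for Xj, bj in zip(X_blocks, beta))
    r = y - fitted
    loss0 = huber_loss(r, tau).mean()
    grads = [-Xj.T @ huber_grad(r, tau) / n for Xj in X_blocks]

    for _ in range(max_inner):
        # Proximal (group soft-thresholding) step on each component
        new_beta = []
        for bj, gj in zip(beta, grads):
            zj = bj - gj / phi
            norm_zj = np.linalg.norm(zj)
            shrink = max(0.0, 1.0 - lam / (phi * norm_zj)) if norm_zj > 0 else 0.0
            new_beta.append(shrink * zj)

        # Check the local majorization condition; inflate phi if it fails
        new_fitted = sum(Xj @ bj for Xj, bj in zip(X_blocks, new_beta))
        new_loss = huber_loss(y - new_fitted, tau).mean()
        diff = [nb - b for nb, b in zip(new_beta, beta)]
        surrogate = loss0 + sum(g @ d for g, d in zip(grads, diff)) \
                    + 0.5 * phi * sum(d @ d for d in diff)
        if new_loss <= surrogate:
            return new_beta, phi
        phi *= phi_factor
    return new_beta, phi
```

In the paper's RKHS formulation, the proximal step would shrink component norms induced by the reproducing kernels rather than plain Euclidean norms of basis coefficients; the sketch only conveys the overall loss-majorization-plus-thresholding structure.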
About the journal:
The Journal of Computational and Applied Mathematics publishes original papers of high scientific value in all areas of computational and applied mathematics. The main interest of the journal is in papers that describe and analyze new computational techniques for solving scientific or engineering problems. Improved analysis of existing methods and algorithms, including their effectiveness and applicability, is also of interest. The computational efficiency (e.g. the convergence, stability, accuracy, ...) should be proved and illustrated by nontrivial numerical examples. Papers describing only variants of existing methods, without adding significant new computational properties, are not of interest.
The audience consists of applied mathematicians, numerical analysts, computational scientists, and engineers.