{"title":"结构化输出预测的正则化降阶回归","authors":"Heng Chen , Di-Rong Chen , Kun Cheng , Yang Zhou","doi":"10.1016/j.jco.2025.101977","DOIUrl":null,"url":null,"abstract":"<div><div>Reduced-rank regression (RRR) has been widely used to strength the dependency among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple outputs with structures. The estimator of vector-valued RRR is obtained by minimizing the empirically squared reproducing kernel Hilbert space (RKHS) distances between output feature kernel and all <em>r</em> dimensional subspaces in vector-valued RKHS. The algorithm is implemented easily with kernel tricks. We establish the learning rate of vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of output kernel regression function, the estimator converges to the output regression function in probability when the rank <em>r</em> tends to infinity appropriately. It implies the consistency of structured predictor in general settings, especially in a misspecified case where the true regression function is not contained in the hypothesis space. Numerical experiments are provided to illustrate the efficiency of our method.</div></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"92 ","pages":"Article 101977"},"PeriodicalIF":1.8000,"publicationDate":"2025-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Regularized reduced-rank regression for structured output prediction\",\"authors\":\"Heng Chen , Di-Rong Chen , Kun Cheng , Yang Zhou\",\"doi\":\"10.1016/j.jco.2025.101977\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Reduced-rank regression (RRR) has been widely used to strength the dependency among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple outputs with structures. The estimator of vector-valued RRR is obtained by minimizing the empirically squared reproducing kernel Hilbert space (RKHS) distances between output feature kernel and all <em>r</em> dimensional subspaces in vector-valued RKHS. The algorithm is implemented easily with kernel tricks. We establish the learning rate of vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of output kernel regression function, the estimator converges to the output regression function in probability when the rank <em>r</em> tends to infinity appropriately. It implies the consistency of structured predictor in general settings, especially in a misspecified case where the true regression function is not contained in the hypothesis space. 
Numerical experiments are provided to illustrate the efficiency of our method.</div></div>\",\"PeriodicalId\":50227,\"journal\":{\"name\":\"Journal of Complexity\",\"volume\":\"92 \",\"pages\":\"Article 101977\"},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2025-07-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Complexity\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0885064X2500055X\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Complexity","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0885064X2500055X","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Regularized reduced-rank regression for structured output prediction
Reduced-rank regression (RRR) has been widely used to exploit the dependence among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple outputs with structure. The vector-valued RRR estimator is obtained by minimizing the empirical squared reproducing kernel Hilbert space (RKHS) distance between the output feature map and r-dimensional subspaces of a vector-valued RKHS. The algorithm is easily implemented via the kernel trick. We establish the learning rate of the vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of the output kernel regression function, the estimator converges in probability to the output regression function as the rank r tends to infinity at an appropriate rate. This implies the consistency of the structured predictor in general settings, in particular in the misspecified case where the true regression function is not contained in the hypothesis space. Numerical experiments illustrate the efficiency of the method.
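To make the construction concrete, the estimator described in the abstract can plausibly be read as the following objective (a formalization inferred from the abstract, not the paper's exact statement): with an output feature map ψ into an output RKHS H_Y and a vector-valued RKHS H_K of hypotheses,

\min_{h \in \mathcal{H}_K,\ \operatorname{rank}(h) \le r} \ \frac{1}{n}\sum_{i=1}^{n} \big\| \psi(y_i) - h(x_i) \big\|_{\mathcal{H}_Y}^2 + \lambda \, \|h\|_{\mathcal{H}_K}^2 .

Below is a minimal illustrative sketch of a reduced-rank output-kernel regression of this flavor, in the spirit of kernel dependency estimation: the training outputs are projected onto the top-r eigenvectors of the output kernel matrix, the projected scores are fitted by kernel ridge regression, and prediction decodes against a finite candidate set. The Gaussian kernels, the ridge regularizer, and the candidate-set decoder are assumptions chosen for illustration; this is not the authors' exact estimator.

import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel between the rows of A and the rows of B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def fit_reduced_rank(X, Y, r=2, lam=1e-2, gamma=1.0):
    # Rank-r subspace of the output feature space: top-r eigenvectors of Ky.
    n = X.shape[0]
    Kx = gaussian_kernel(X, X, gamma)
    Ky = gaussian_kernel(Y, Y, gamma)
    _, V = np.linalg.eigh(Ky)            # eigenvalues in ascending order
    V_r = V[:, -r:]                      # n x r scores of the training outputs
    # Kernel ridge regression from inputs to the r-dimensional scores.
    alpha = np.linalg.solve(Kx + n * lam * np.eye(n), V_r)   # n x r dual coefficients
    return alpha, V_r

def predict(X_tr, Y_tr, Y_cand, X_new, alpha, V_r, gamma=1.0):
    # Map new inputs to subspace scores, then decode by nearest candidate
    # in score space (a simple pre-image heuristic over a candidate set).
    z = gaussian_kernel(X_new, X_tr, gamma) @ alpha          # m x r predicted scores
    z_cand = gaussian_kernel(Y_cand, Y_tr, gamma) @ V_r      # c x r candidate scores
    d2 = (np.sum(z**2, axis=1)[:, None]
          - 2.0 * z @ z_cand.T
          + np.sum(z_cand**2, axis=1)[None, :])
    return Y_cand[np.argmin(d2, axis=1)]

# Toy usage: vector-valued outputs, candidates taken from the training set.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = np.tanh(X @ rng.normal(size=(3, 4))) + 0.05 * rng.normal(size=(200, 4))
alpha, V_r = fit_reduced_rank(X, Y, r=3, lam=1e-3)
Y_hat = predict(X, Y, Y, X[:5], alpha, V_r)

In this sketch the rank r trades dimension reduction against approximation quality, consistent with the abstract's claim that the estimator approaches the output kernel regression function as r grows appropriately.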
Journal Introduction:
The multidisciplinary Journal of Complexity publishes original research papers that contain substantial mathematical results on complexity as broadly conceived. Outstanding review papers will also be published. In the area of computational complexity, the focus is on complexity over the reals, with the emphasis on lower bounds and optimal algorithms. The Journal of Complexity also publishes articles that provide major new algorithms or make important progress on upper bounds. Other models of computation, such as the Turing machine model, are also of interest. Computational complexity results in a wide variety of areas are solicited.
Areas Include:
• Approximation theory
• Biomedical computing
• Compressed computing and sensing
• Computational finance
• Computational number theory
• Computational stochastics
• Control theory
• Cryptography
• Design of experiments
• Differential equations
• Discrete problems
• Distributed and parallel computation
• High and infinite-dimensional problems
• Information-based complexity
• Inverse and ill-posed problems
• Machine learning
• Markov chain Monte Carlo
• Monte Carlo and quasi-Monte Carlo
• Multivariate integration and approximation
• Noisy data
• Nonlinear and algebraic equations
• Numerical analysis
• Operator equations
• Optimization
• Quantum computing
• Scientific computation
• Tractability of multivariate problems
• Vision and image understanding.