Regularized reduced-rank regression for structured output prediction

IF 1.8 | CAS Zone 2 (Mathematics) | JCR Q1 (MATHEMATICS)
Heng Chen, Di-Rong Chen, Kun Cheng, Yang Zhou
{"title":"结构化输出预测的正则化降阶回归","authors":"Heng Chen ,&nbsp;Di-Rong Chen ,&nbsp;Kun Cheng ,&nbsp;Yang Zhou","doi":"10.1016/j.jco.2025.101977","DOIUrl":null,"url":null,"abstract":"<div><div>Reduced-rank regression (RRR) has been widely used to strength the dependency among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple outputs with structures. The estimator of vector-valued RRR is obtained by minimizing the empirically squared reproducing kernel Hilbert space (RKHS) distances between output feature kernel and all <em>r</em> dimensional subspaces in vector-valued RKHS. The algorithm is implemented easily with kernel tricks. We establish the learning rate of vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of output kernel regression function, the estimator converges to the output regression function in probability when the rank <em>r</em> tends to infinity appropriately. It implies the consistency of structured predictor in general settings, especially in a misspecified case where the true regression function is not contained in the hypothesis space. Numerical experiments are provided to illustrate the efficiency of our method.</div></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":"92 ","pages":"Article 101977"},"PeriodicalIF":1.8000,"publicationDate":"2025-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Regularized reduced-rank regression for structured output prediction\",\"authors\":\"Heng Chen ,&nbsp;Di-Rong Chen ,&nbsp;Kun Cheng ,&nbsp;Yang Zhou\",\"doi\":\"10.1016/j.jco.2025.101977\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Reduced-rank regression (RRR) has been widely used to strength the dependency among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple outputs with structures. The estimator of vector-valued RRR is obtained by minimizing the empirically squared reproducing kernel Hilbert space (RKHS) distances between output feature kernel and all <em>r</em> dimensional subspaces in vector-valued RKHS. The algorithm is implemented easily with kernel tricks. We establish the learning rate of vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of output kernel regression function, the estimator converges to the output regression function in probability when the rank <em>r</em> tends to infinity appropriately. It implies the consistency of structured predictor in general settings, especially in a misspecified case where the true regression function is not contained in the hypothesis space. 
Numerical experiments are provided to illustrate the efficiency of our method.</div></div>\",\"PeriodicalId\":50227,\"journal\":{\"name\":\"Journal of Complexity\",\"volume\":\"92 \",\"pages\":\"Article 101977\"},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2025-07-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Complexity\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0885064X2500055X\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Complexity","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0885064X2500055X","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0

Abstract

Reduced-rank regression (RRR) has been widely used to strengthen the dependency among multiple outputs. This paper develops a regularized vector-valued RRR approach, which plays an important role in predicting multiple structured outputs. The vector-valued RRR estimator is obtained by minimizing the empirical squared reproducing kernel Hilbert space (RKHS) distance between the output feature kernel and all r-dimensional subspaces of a vector-valued RKHS. The algorithm is easily implemented with kernel tricks. We establish the learning rate of the vector-valued RRR estimator under mild assumptions. Moreover, as a reduced-dimensional approximation of the output kernel regression function, the estimator converges in probability to the output regression function when the rank r tends to infinity at an appropriate rate. This implies the consistency of the structured predictor in general settings, in particular in the misspecified case where the true regression function is not contained in the hypothesis space. Numerical experiments illustrate the efficiency of our method.
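For orientation, the finite-dimensional problem that classical RRR solves is standard textbook background (this display is not the paper's exact vector-valued objective):

\[
\widehat{C}_r \in \operatorname*{arg\,min}_{C \in \mathbb{R}^{q \times p},\ \operatorname{rank}(C) \le r} \ \frac{1}{n} \sum_{i=1}^{n} \bigl\| y_i - C x_i \bigr\|_2^2 .
\]

Its solution is the least-squares fit projected onto its top r singular directions. According to the abstract, the vector-valued version replaces the Euclidean norm with an RKHS norm, measuring the distance between the output feature kernel and r-dimensional subspaces of a vector-valued RKHS, combined with regularization.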
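Below is a minimal runnable sketch of the reduced-rank idea with kernels, under simplifying assumptions of our own: finite-dimensional outputs stand in for the paper's output feature kernel, and the rank constraint is imposed by projecting a kernel ridge fit onto its top-r principal output directions. It illustrates the generic technique only, not the authors' estimator; all names (rbf_kernel, fit_reduced_rank_krr, and so on) are hypothetical.

# Minimal sketch of reduced-rank kernel ridge regression for multi-output
# prediction. NOT the paper's algorithm: finite-dimensional outputs replace
# the output feature kernel, and the rank constraint comes from projecting
# the ridge fit onto its top-r principal output directions.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_reduced_rank_krr(X, Y, r, lam=1e-2, gamma=1.0):
    """Kernel ridge regression with a rank-r constraint on fitted outputs."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Ridge coefficients: alpha solves (K + n*lam*I) alpha = Y.
    alpha = np.linalg.solve(K + n * lam * np.eye(n), Y)
    # Fitted training outputs and their top-r output directions.
    Y_fit = K @ alpha
    _, _, Vt = np.linalg.svd(Y_fit, full_matrices=False)
    V_r = Vt[:r].T                        # (q, r) orthonormal output basis
    # Rank-r coefficients: predictions are confined to span(V_r).
    alpha_r = alpha @ (V_r @ V_r.T)
    return alpha_r, X, gamma

def predict(model, X_new):
    alpha_r, X_train, gamma = model
    return rbf_kernel(X_new, X_train, gamma) @ alpha_r

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    B = rng.normal(size=(3, 2))                    # low-rank latent structure
    Y = np.tanh(X @ B) @ rng.normal(size=(2, 6))   # 6 correlated outputs
    model = fit_reduced_rank_krr(X, Y, r=2)
    print(predict(model, X[:5]).shape)             # (5, 6)

The rank-2 projection exploits the fact that the six outputs share a two-dimensional latent structure, which is the dependency-strengthening effect the abstract attributes to RRR.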
Source journal: Journal of Complexity (Engineering & Technology - Computer Science: Theory & Methods)
CiteScore: 3.10
Self-citation rate: 17.60%
Articles per year: 57
Review time: >12 weeks
Journal description: The multidisciplinary Journal of Complexity publishes original research papers that contain substantial mathematical results on complexity as broadly conceived. Outstanding review papers will also be published. In the area of computational complexity, the focus is on complexity over the reals, with the emphasis on lower bounds and optimal algorithms. The Journal of Complexity also publishes articles that provide major new algorithms or make important progress on upper bounds. Other models of computation, such as the Turing machine model, are also of interest. Computational complexity results in a wide variety of areas are solicited.

Areas include:
• Approximation theory
• Biomedical computing
• Compressed computing and sensing
• Computational finance
• Computational number theory
• Computational stochastics
• Control theory
• Cryptography
• Design of experiments
• Differential equations
• Discrete problems
• Distributed and parallel computation
• High and infinite-dimensional problems
• Information-based complexity
• Inverse and ill-posed problems
• Machine learning
• Markov chain Monte Carlo
• Monte Carlo and quasi-Monte Carlo
• Multivariate integration and approximation
• Noisy data
• Nonlinear and algebraic equations
• Numerical analysis
• Operator equations
• Optimization
• Quantum computing
• Scientific computation
• Tractability of multivariate problems
• Vision and image understanding