Deterministic Gradient-Descent Learning of Linear Regressions: Adaptive Algorithms, Convergence Analysis and Noise Compensation

Kang-Zhi Liu, Chao Gan
{"title":"线性回归的确定性梯度-后裔学习:自适应算法、收敛分析和噪声补偿。","authors":"Kang-Zhi Liu, Chao Gan","doi":"10.1109/TPAMI.2024.3399312","DOIUrl":null,"url":null,"abstract":"<p><p>Weight learning forms a basis for the machine learning and numerous algorithms have been adopted up to date. Most of the algorithms were either developed in the stochastic framework or aimed at minimization of loss or regret functions. Asymptotic convergence of weight learning, vital for good output prediction, was seldom guaranteed for online applications. Since linear regression is the most fundamental component in machine learning, we focus on this model in this paper. Aiming at online applications, a deterministic analysis method is developed based on LaSalle's invariance principle. Convergence conditions are derived for both the first-order and the second-order learning algorithms, without resorting to any stochastic argument. Moreover, the deterministic approach makes it easy to analyze the noise influence. Specifically, adaptive hyperparameters are derived in this framework and their tuning rules disclosed for the compensation of measurement noise. Comparison with four most popular algorithms validates that this approach has a higher learning capability and is quite promising in enhancing the weight learning performance.</p>","PeriodicalId":94034,"journal":{"name":"IEEE transactions on pattern analysis and machine intelligence","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deterministic Gradient-Descent Learning of Linear Regressions: Adaptive Algorithms, Convergence Analysis and Noise Compensation.\",\"authors\":\"Kang-Zhi Liu, Chao Gan\",\"doi\":\"10.1109/TPAMI.2024.3399312\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Weight learning forms a basis for the machine learning and numerous algorithms have been adopted up to date. Most of the algorithms were either developed in the stochastic framework or aimed at minimization of loss or regret functions. Asymptotic convergence of weight learning, vital for good output prediction, was seldom guaranteed for online applications. Since linear regression is the most fundamental component in machine learning, we focus on this model in this paper. Aiming at online applications, a deterministic analysis method is developed based on LaSalle's invariance principle. Convergence conditions are derived for both the first-order and the second-order learning algorithms, without resorting to any stochastic argument. Moreover, the deterministic approach makes it easy to analyze the noise influence. Specifically, adaptive hyperparameters are derived in this framework and their tuning rules disclosed for the compensation of measurement noise. 
Comparison with four most popular algorithms validates that this approach has a higher learning capability and is quite promising in enhancing the weight learning performance.</p>\",\"PeriodicalId\":94034,\"journal\":{\"name\":\"IEEE transactions on pattern analysis and machine intelligence\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on pattern analysis and machine intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TPAMI.2024.3399312\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/11/6 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on pattern analysis and machine intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TPAMI.2024.3399312","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/11/6 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract


Weight learning forms a basis of machine learning, and numerous algorithms have been developed to date. Most of these algorithms were either developed in a stochastic framework or aimed at minimizing loss or regret functions. Asymptotic convergence of weight learning, which is vital for good output prediction, is seldom guaranteed in online applications. Since linear regression is the most fundamental model in machine learning, this paper focuses on that model. Aiming at online applications, a deterministic analysis method is developed based on LaSalle's invariance principle. Convergence conditions are derived for both first-order and second-order learning algorithms, without resorting to any stochastic argument. Moreover, the deterministic approach makes it easy to analyze the influence of noise. Specifically, adaptive hyperparameters are derived in this framework, and their tuning rules are disclosed for the compensation of measurement noise. Comparison with the four most popular algorithms validates that this approach has a higher learning capability and is quite promising for enhancing weight learning performance.
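To make the setting concrete, the following is a minimal sketch of online first-order gradient-descent weight learning for a linear regression model, run on synthetic data. It is a generic least-mean-squares-style illustration, not the paper's algorithm: the fixed step size eta, the noise level, and the data model are assumptions made purely for the example.

```python
# Minimal sketch of online first-order gradient-descent weight learning for a
# linear regression y = w^T x + noise. Generic illustration only; the fixed
# step size `eta` and the synthetic data are assumptions for this example,
# not the adaptive algorithm proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0, 0.5])        # unknown weights to be learned
w = np.zeros(3)                            # current weight estimate
eta = 0.05                                 # fixed learning rate (assumed)

for t in range(2000):                      # online stream of samples
    x = rng.normal(size=3)                 # regressor vector
    y = w_true @ x + 0.01 * rng.normal()   # noisy scalar measurement
    e = y - w @ x                          # prediction error
    w += eta * e * x                       # gradient step on (1/2) * e**2

print("estimate:", np.round(w, 3))         # should approach w_true
```

The paper's contribution, by contrast, is to replace the fixed step size with adaptive hyperparameters tuned to compensate measurement noise, and to establish convergence of such recursions deterministically via LaSalle's invariance principle rather than by stochastic arguments.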
