Relaxation Quadratic Approximation Greedy Pursuit Method Based on Sparse Learning

IF 1.0 · CAS Zone 4 (Mathematics) · JCR Q3 (Mathematics, Applied)
Shihai Li, Changfeng Ma
{"title":"Relaxation Quadratic Approximation Greedy Pursuit Method Based on Sparse Learning","authors":"Shihai Li, Changfeng Ma","doi":"10.1515/cmam-2023-0050","DOIUrl":null,"url":null,"abstract":"Abstract A high-performance sparse model is very important for processing high-dimensional data. Therefore, based on the quadratic approximate greed pursuit (QAGP) method, we can make full use of the information of the quadratic lower bound of its approximate function to get the relaxation quadratic approximate greed pursuit (RQAGP) method. The calculation process of the RQAGP method is to construct two inexact quadratic approximation functions by using the m -strongly convex and L -smooth characteristics of the objective function and then solve the approximation function iteratively by using the Iterative Hard Thresholding (IHT) method to get the solution of the problem. The convergence analysis is given, and the performance of the method in the sparse logistic regression model is verified on synthetic data and real data sets. The results show that the RQAGP method is effective.","PeriodicalId":48751,"journal":{"name":"Computational Methods in Applied Mathematics","volume":"56 1","pages":"0"},"PeriodicalIF":1.0000,"publicationDate":"2023-10-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Methods in Applied Mathematics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1515/cmam-2023-0050","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
引用次数: 0

Abstract

A high-performance sparse model is very important for processing high-dimensional data. Building on the quadratic approximation greedy pursuit (QAGP) method, we make full use of the information in the quadratic lower bound of its approximation function to obtain the relaxation quadratic approximation greedy pursuit (RQAGP) method. The RQAGP method constructs two inexact quadratic approximation functions by exploiting the m-strong convexity and L-smoothness of the objective function, and then solves these approximation functions iteratively with the iterative hard thresholding (IHT) method to obtain a solution of the problem. A convergence analysis is given, and the performance of the method on the sparse logistic regression model is verified on synthetic and real data sets. The results show that the RQAGP method is effective.
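The construction described above can be made concrete with the standard two-sided quadratic bounds implied by m-strong convexity and L-smoothness; the paper's relaxed surrogates are presumably built from these well-known inequalities, although the exact relaxation is not spelled out in the abstract:

\[
f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{m}{2}\,\| y - x \|_2^2
\;\le\; f(y) \;\le\;
f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{L}{2}\,\| y - x \|_2^2 .
\]

To give a sense of how an IHT step acts on such a quadratic surrogate, the sketch below applies plain IHT to sparse logistic regression: minimizing the L-smooth upper bound at the current iterate reduces to a gradient step of size 1/L followed by hard thresholding onto k-sparse vectors. This is a minimal illustration under those assumptions, not the RQAGP algorithm itself; the function names and the fixed iteration count are hypothetical.

import numpy as np

def hard_threshold(w, k):
    # Keep the k largest-magnitude entries of w and zero out the rest.
    out = w.copy()
    out[np.argsort(np.abs(w))[:-k]] = 0.0
    return out

def logistic_grad(w, X, y):
    # Gradient of the average logistic loss with labels y in {0, 1}.
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / X.shape[0]

def iht_sparse_logistic(X, y, k, L, n_iter=200):
    # Plain IHT: a gradient step of size 1/L (the minimizer of the L-smooth
    # quadratic upper bound at the current iterate), then projection onto
    # the set of k-sparse vectors via hard thresholding.
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w = hard_threshold(w - logistic_grad(w, X, y) / L, k)
    return w

For the logistic loss, L can be taken as the largest eigenvalue of X.T @ X / (4 * n), which makes the step size 1/L a valid majorization constant for this sketch.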
Source Journal

CiteScore: 2.40 · Self-citation rate: 7.70% · Annual publications: 54
Journal introduction: The highly selective international mathematical journal Computational Methods in Applied Mathematics (CMAM) considers original mathematical contributions to computational methods and numerical analysis, with applications mainly related to PDEs. CMAM seeks to be interdisciplinary while retaining the common thread of numerical analysis; it is intended to be readily readable and is meant for a wide circle of researchers in applied mathematics. The journal is published by De Gruyter on behalf of the Institute of Mathematics of the National Academy of Sciences of Belarus.