{"title":"用于随机优化的带自适应步长的随机方差降低梯度法","authors":"Jing Li, Dan Xue, Lei Liu, Rulei Qi","doi":"10.1002/oca.3109","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a stochastic variance reduction gradient method with adaptive step size, referred to as the SVRG-New BB method, to solve the convex stochastic optimization problem. The method could be roughly viewed as a hybrid of the SVRG algorithm and a new BB step mechanism. Under the condition that the objective function is strongly convex, we provide the linear convergence proof of this algorithm. Numerical experiment results show that the performance of the SVRG-New BB algorithm can surpass other existing algorithms if parameters in the algorithm are properly chosen.","PeriodicalId":501055,"journal":{"name":"Optimal Control Applications and Methods","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A stochastic variance reduced gradient method with adaptive step for stochastic optimization\",\"authors\":\"Jing Li, Dan Xue, Lei Liu, Rulei Qi\",\"doi\":\"10.1002/oca.3109\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we propose a stochastic variance reduction gradient method with adaptive step size, referred to as the SVRG-New BB method, to solve the convex stochastic optimization problem. The method could be roughly viewed as a hybrid of the SVRG algorithm and a new BB step mechanism. Under the condition that the objective function is strongly convex, we provide the linear convergence proof of this algorithm. Numerical experiment results show that the performance of the SVRG-New BB algorithm can surpass other existing algorithms if parameters in the algorithm are properly chosen.\",\"PeriodicalId\":501055,\"journal\":{\"name\":\"Optimal Control Applications and Methods\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-02-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Optimal Control Applications and Methods\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1002/oca.3109\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimal Control Applications and Methods","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/oca.3109","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A stochastic variance reduced gradient method with adaptive step for stochastic optimization
In this paper, we propose a stochastic variance reduced gradient method with an adaptive step size, referred to as the SVRG-New BB method, for solving convex stochastic optimization problems. The method can be viewed as a hybrid of the SVRG algorithm and a new Barzilai-Borwein (BB) step-size mechanism. Under the assumption that the objective function is strongly convex, we prove linear convergence of the algorithm. Numerical experiments show that, with properly chosen parameters, the SVRG-New BB algorithm can outperform existing algorithms.
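To illustrate the kind of method the abstract describes, the following is a minimal sketch of SVRG combined with a BB step size recomputed at each outer epoch. The paper's exact "New BB" step rule is not given in the abstract, so this sketch falls back on the classical SVRG-BB step of Tan et al. (2016) as a stand-in; the test problem (ridge-regularized least squares), the data, and all parameter values are illustrative assumptions rather than the authors' setup.

```python
# Sketch of SVRG with a Barzilai-Borwein (BB) step size (assumed variant, not the
# paper's exact "New BB" rule): the full gradient is recomputed at each snapshot,
# and the step size is updated from two consecutive snapshots.
import numpy as np

def svrg_bb(grad_i, full_grad, w0, n, m=None, eta0=0.01, epochs=20, seed=None):
    """SVRG outer loop with a BB step size recomputed every epoch.

    grad_i(w, i)  -- gradient of the i-th component function at w
    full_grad(w)  -- gradient of the full objective at w
    w0            -- initial point
    n             -- number of component functions
    m             -- inner-loop length (defaults to 2n)
    eta0          -- step size for the first epoch, before BB information exists
    """
    rng = np.random.default_rng(seed)
    m = 2 * n if m is None else m
    w_tilde = w0.copy()
    w_prev, g_prev, eta = None, None, eta0
    for _ in range(epochs):
        g = full_grad(w_tilde)                     # full-gradient snapshot
        if g_prev is not None:                     # BB step from consecutive snapshots
            s, y = w_tilde - w_prev, g - g_prev
            eta = (s @ s) / (m * abs(s @ y) + 1e-12)
        w_prev, g_prev = w_tilde.copy(), g.copy()
        w = w_tilde.copy()
        for _ in range(m):                         # SVRG inner loop
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_tilde, i) + g   # variance-reduced gradient
            w -= eta * v
        w_tilde = w                                # use last inner iterate as next snapshot
    return w_tilde

if __name__ == "__main__":
    # Illustrative strongly convex problem: ridge-regularized least squares.
    rng = np.random.default_rng(0)
    n, d, lam = 200, 10, 0.1
    A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
    grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i] + lam * w
    full_grad = lambda w: A.T @ (A @ w - b) / n + lam * w
    w_star = np.linalg.solve(A.T @ A / n + lam * np.eye(d), A.T @ b / n)
    w_hat = svrg_bb(grad_i, full_grad, np.zeros(d), n, seed=1)
    print("distance to optimum:", np.linalg.norm(w_hat - w_star))
```

The design point this sketch highlights is the one the abstract emphasizes: the step size is not hand-tuned per problem but adapted from curvature information gathered at the snapshot points, while the inner SVRG updates keep the gradient-estimate variance low enough for a fixed step within each epoch.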