A novel stepsize for gradient descent method
Pham Thi Hoai, Nguyen The Vinh, Nguyen Phung Hai Chung
Operations Research Letters (2024). DOI: 10.1016/j.orl.2024.107072
We propose a novel adaptive stepsize for the gradient descent scheme to solve unconstrained nonlinear optimization problems. For a convex, smooth objective whose gradient is locally Lipschitz, we obtain a complexity bound of O(1/k) on f(x^k) − f_*. Using the idea of the new stepsize, we propose another algorithm, based on the projected gradient, for solving a class of nonconvex optimization problems over a closed convex set. Computational experiments demonstrate the efficiency of the new method.
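The abstract does not state the new stepsize rule itself, so the sketch below is illustrative only: it shows the general scheme the paper addresses, namely gradient descent with a stepsize adapted from local gradient information (here a Barzilai-Borwein-type estimate of the inverse local Lipschitz constant, which is an assumption standing in for the authors' rule, not the rule itself), together with a projected-gradient variant for problems over a closed convex set. All function names and parameters are hypothetical.

```python
import numpy as np

def adaptive_gradient_descent(grad, x0, iters=500, tol=1e-8, t0=1e-3):
    """Gradient descent with a locally adaptive stepsize (illustrative).

    The stepsize is a Barzilai-Borwein-type estimate of the inverse of a
    local Lipschitz constant of the gradient; it is NOT the stepsize
    proposed in the paper, which is defined only in the full text.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    t = t0
    for _ in range(iters):
        x_new = x - t * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        if sy > tol:
            t = float(s @ s) / sy   # ||s||^2 / <s, y>: local inverse-Lipschitz estimate
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

def projected_gradient(grad, project, x0, iters=500, t=5e-3, tol=1e-8):
    """Projected gradient for minimizing f over a closed convex set C.

    `project` maps a point to its Euclidean projection onto C
    (e.g. np.clip for a box). A fixed stepsize is used here purely for
    illustration; the paper builds its method on its new adaptive rule.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_new = project(x - t * grad(x))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: minimize ||Ax - b||^2, unconstrained and over the box [0, 1]^n.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad = lambda x: 2 * A.T @ (A @ x - b)
x_unc = adaptive_gradient_descent(grad, np.zeros(5))
x_box = projected_gradient(grad, lambda z: np.clip(z, 0.0, 1.0), np.zeros(5))
```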
About the journal:
Operations Research Letters is committed to the rapid review and fast publication of short articles on all aspects of operations research and analytics. Apart from a limitation to eight journal pages, quality, originality, relevance and clarity are the only criteria for selecting the papers to be published. ORL covers the broad field of optimization, stochastic models and game theory. Specific areas of interest include networks, routing, location, queueing, scheduling, inventory, reliability, and financial engineering. We wish to explore interfaces with other fields such as life sciences and health care, artificial intelligence and machine learning, energy distribution, and computational social sciences and humanities. Our traditional strength is in methodology, including theory, modelling, algorithms and computational studies. We also welcome novel applications and concise literature reviews.