Zhi-Jun Wang, Hong Li, Zhou-Xiang Xu, Shuai-Ye Zhao, Peng-Jun Wang, He-Bei Gao
{"title":"基于改进的 Barzilai-Borwein 方法的自适应学习率算法","authors":"Zhi-Jun Wang , Hong Li , Zhou-Xiang Xu , Shuai-Ye Zhao , Peng-Jun Wang , He-Bei Gao","doi":"10.1016/j.patcog.2024.111179","DOIUrl":null,"url":null,"abstract":"<div><h3>Objective:</h3><div>The Barzilai–Borwein(BB) method is essential in solving unconstrained optimization problems. The momentum method accelerates optimization algorithms with exponentially weighted moving average. In order to design reliable deep learning optimization algorithms, this paper proposes applying the BB method in four variants to the optimization algorithm of deep learning.</div></div><div><h3>Findings:</h3><div>The momentum method generates the BB step size under different step range limits. We also apply the momentum method and its variants to the stochastic gradient descent with the BB step size.</div></div><div><h3>Novelty:</h3><div>The algorithm’s robustness has been demonstrated through experiments on the initial learning rate and random seeds. The algorithm’s sensitivity is tested by choosing different momentum factors until a suitable momentum factor is found. Moreover, we compare our algorithms with popular algorithms in various neural networks. The results show that the new algorithms improve the efficiency of the BB step size in deep learning and provide a variety of optimization algorithm choices.</div></div>","PeriodicalId":49713,"journal":{"name":"Pattern Recognition","volume":"160 ","pages":"Article 111179"},"PeriodicalIF":7.5000,"publicationDate":"2024-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Adaptive learning rate algorithms based on the improved Barzilai–Borwein method\",\"authors\":\"Zhi-Jun Wang , Hong Li , Zhou-Xiang Xu , Shuai-Ye Zhao , Peng-Jun Wang , He-Bei Gao\",\"doi\":\"10.1016/j.patcog.2024.111179\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3>Objective:</h3><div>The Barzilai–Borwein(BB) method is essential in solving unconstrained optimization problems. The momentum method accelerates optimization algorithms with exponentially weighted moving average. In order to design reliable deep learning optimization algorithms, this paper proposes applying the BB method in four variants to the optimization algorithm of deep learning.</div></div><div><h3>Findings:</h3><div>The momentum method generates the BB step size under different step range limits. We also apply the momentum method and its variants to the stochastic gradient descent with the BB step size.</div></div><div><h3>Novelty:</h3><div>The algorithm’s robustness has been demonstrated through experiments on the initial learning rate and random seeds. The algorithm’s sensitivity is tested by choosing different momentum factors until a suitable momentum factor is found. Moreover, we compare our algorithms with popular algorithms in various neural networks. 
The results show that the new algorithms improve the efficiency of the BB step size in deep learning and provide a variety of optimization algorithm choices.</div></div>\",\"PeriodicalId\":49713,\"journal\":{\"name\":\"Pattern Recognition\",\"volume\":\"160 \",\"pages\":\"Article 111179\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2024-11-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Pattern Recognition\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0031320324009300\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0031320324009300","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Adaptive learning rate algorithms based on the improved Barzilai–Borwein method
Objective:
The Barzilai–Borwein (BB) method is essential for solving unconstrained optimization problems. The momentum method accelerates optimization algorithms using an exponentially weighted moving average. To design reliable deep learning optimization algorithms, this paper proposes four variants that apply the BB method to deep learning optimizers.
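For orientation, here is a minimal sketch of the two classical BB step sizes that the abstract builds on. The paper's four variants and their exact formulas are not given here, so the function name and the epsilon safeguard below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def bb_step_sizes(x_prev, x_curr, g_prev, g_curr, eps=1e-10):
    """Classical Barzilai-Borwein step sizes (illustrative sketch).

    With s = x_k - x_{k-1} and y = g_k - g_{k-1}:
      BB1 = (s^T s) / (s^T y)
      BB2 = (s^T y) / (y^T y)
    """
    s = x_curr - x_prev
    y = g_curr - g_prev
    sy = float(s @ y)
    bb1 = float(s @ s) / (sy + eps)          # long BB step
    bb2 = sy / (float(y @ y) + eps)          # short BB step
    return bb1, bb2
```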
Findings:
The momentum method generates the BB step size under different step-size range limits. We also apply the momentum method and its variants to stochastic gradient descent with the BB step size.
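A minimal sketch of how such a combination could look is given below: a BB-derived learning rate clipped to a range limit, driving an SGD update whose momentum buffer is an exponentially weighted moving average. The momentum factor, clipping bounds, and combination rule are assumptions for illustration, not the paper's specific algorithm.

```python
import numpy as np

def sgd_bb_momentum_step(param, grad, velocity, bb_lr,
                         beta=0.9, lr_min=1e-4, lr_max=1.0):
    """One hypothetical SGD update with a clipped BB step size and
    an exponentially weighted moving-average momentum buffer."""
    lr = float(np.clip(bb_lr, lr_min, lr_max))        # enforce the step-size range limit
    velocity = beta * velocity + (1.0 - beta) * grad  # EWMA of gradients
    param = param - lr * velocity                     # parameter update
    return param, velocity
```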
Novelty:
The algorithms' robustness is demonstrated through experiments on the initial learning rate and random seeds. Their sensitivity is tested by choosing different momentum factors until a suitable one is found. Moreover, we compare our algorithms with popular optimizers on various neural networks. The results show that the new algorithms improve the efficiency of the BB step size in deep learning and provide a variety of optimization algorithm choices.
Journal introduction:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.