Layer-wise based Adabelief Optimization Algorithm for Deep Learning

Zhiyong Qiu, Zhenhua Guo, Li Wang, Yaqian Zhao, Rengang Li

Proceedings of the 4th International Conference on Advanced Information Science and System, 2022-11-25. DOI: 10.1145/3573834.3574539
Abstract
For deep learning optimization problems, it is important to formulate an optimization method that improves the convergence rate without sacrificing generalization ability. This paper proposes a layer-wise AdaBelief optimization algorithm to solve deep learning optimization problems more efficiently. In the proposed algorithm, each layer of the deep neural network is assigned an appropriately different learning rate in order to achieve a faster convergence rate. We also give theorems that guarantee the convergence of the layer-wise AdaBelief method. Finally, we evaluate the effectiveness and efficiency of the proposed algorithm on experimental examples. Experimental results show that the layer-wise AdaBelief algorithm converges faster than the mainstream algorithms, while maintaining excellent convergence results in all numerical examples.
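The abstract states the core idea (per-layer learning rates on top of AdaBelief) but not the full update rule. Below is a minimal sketch of one plausible realization: standard AdaBelief moment estimates combined with a LAMB-style per-layer trust ratio. The function name `layerwise_adabelief_step`, the trust-ratio rule, and the clipping constant are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def layerwise_adabelief_step(params, grads, state, lr=1e-3,
                             beta1=0.9, beta2=0.999, eps=1e-8):
    # One optimizer step. AdaBelief tracks the mean m of the gradient and
    # the "belief" s, the variance of the gradient around m; the step
    # direction is m_hat / sqrt(s_hat). The per-layer trust ratio
    # ||w|| / ||step|| below is an assumed LAMB-style stand-in for the
    # paper's layer-wise learning-rate rule.
    state["t"] += 1
    t = state["t"]
    for name, p in params.items():
        g = grads[name]
        m, s = state["m"][name], state["s"][name]
        m[:] = beta1 * m + (1 - beta1) * g                   # first moment
        s[:] = beta2 * s + (1 - beta2) * (g - m) ** 2 + eps  # belief term
        m_hat = m / (1 - beta1 ** t)                         # bias correction
        s_hat = s / (1 - beta2 ** t)
        step = m_hat / (np.sqrt(s_hat) + eps)
        ratio = np.linalg.norm(p) / (np.linalg.norm(step) + eps)
        p -= lr * min(ratio, 10.0) * step                    # clipped layer-wise scale

# Usage on two toy "layers":
params = {"fc1.w": np.random.randn(4, 4), "fc2.w": np.random.randn(4)}
state = {"t": 0,
         "m": {k: np.zeros_like(v) for k, v in params.items()},
         "s": {k: np.zeros_like(v) for k, v in params.items()}}
grads = {k: np.random.randn(*v.shape) for k, v in params.items()}
layerwise_adabelief_step(params, grads, state)
```

Because the trust ratio is computed from whole-layer norms, each layer's effective learning rate adapts independently, which is the behavior the abstract attributes to the proposed method.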