Lin Wu, Tuo Shi, Yongbin Wang, Huiming Chen, W. Lam
{"title":"鲁棒随机拟牛顿方法及其在机器学习中的应用","authors":"Lin Wu, Tuo Shi, Yongbin Wang, Huiming Chen, W. Lam","doi":"10.1109/ICCST53801.2021.00041","DOIUrl":null,"url":null,"abstract":"We study a novel stochastic version of damped and regularized limited memory Broyden-Fletcher-Goldfarb-Shanno (Sd-REG-LBFGS) method for nonconvex optimization problems in this paper. Different from BFGS updating scheme for the strongly convex problems, it is a challenge to preserve the product of the correction pairs positive at each iteration for BFGS in nonconvex case. While utilizing regularization scheme to ensure the robustness of the second-order methods. The proposed method is able to keep the product of correction pairs being away from zero above a specified degree. To make the proposed method robust and computationally efficient, we propose to update curvature information at a spaced interval, in which the average of iterate points is utilized. The numerical experiments have shown that our proposed algorithm has better performance than SdLBFGS or they have almost the same performance. Especially in the problems which utilizes few samples, the proposed method has avoided the ill-conditioned problem and exhibited superior performance.","PeriodicalId":222463,"journal":{"name":"2021 International Conference on Culture-oriented Science & Technology (ICCST)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2021-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Robust Stochastic Quasi-Newton Method with the Application in Machine Learning\",\"authors\":\"Lin Wu, Tuo Shi, Yongbin Wang, Huiming Chen, W. Lam\",\"doi\":\"10.1109/ICCST53801.2021.00041\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We study a novel stochastic version of damped and regularized limited memory Broyden-Fletcher-Goldfarb-Shanno (Sd-REG-LBFGS) method for nonconvex optimization problems in this paper. 
Different from BFGS updating scheme for the strongly convex problems, it is a challenge to preserve the product of the correction pairs positive at each iteration for BFGS in nonconvex case. While utilizing regularization scheme to ensure the robustness of the second-order methods. The proposed method is able to keep the product of correction pairs being away from zero above a specified degree. To make the proposed method robust and computationally efficient, we propose to update curvature information at a spaced interval, in which the average of iterate points is utilized. The numerical experiments have shown that our proposed algorithm has better performance than SdLBFGS or they have almost the same performance. Especially in the problems which utilizes few samples, the proposed method has avoided the ill-conditioned problem and exhibited superior performance.\",\"PeriodicalId\":222463,\"journal\":{\"name\":\"2021 International Conference on Culture-oriented Science & Technology (ICCST)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 International Conference on Culture-oriented Science & Technology (ICCST)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCST53801.2021.00041\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Conference on Culture-oriented Science & Technology 
(ICCST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCST53801.2021.00041","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Robust Stochastic Quasi-Newton Method with the Application in Machine Learning
In this paper we study a novel stochastic version of the damped and regularized limited-memory Broyden–Fletcher–Goldfarb–Shanno (Sd-REG-LBFGS) method for nonconvex optimization problems. Unlike the BFGS updating scheme for strongly convex problems, in the nonconvex case it is a challenge to keep the inner product of the correction pairs positive at every iteration; we therefore employ a regularization scheme to ensure the robustness of the second-order method. The proposed method keeps the product of the correction pairs bounded away from zero by a specified margin. To make the method both robust and computationally efficient, we update the curvature information only at spaced intervals, using the average of the iterate points. Numerical experiments show that the proposed algorithm performs better than, or on par with, SdLBFGS. In particular, on problems that use few samples, the proposed method avoids ill-conditioning and exhibits superior performance.
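The ideas in the abstract — damping the correction pairs so their inner product stays bounded away from zero, and forming curvature pairs only at spaced intervals from averaged iterates — can be illustrated with a generic stochastic L-BFGS sketch. This is a minimal illustration of the general technique family (Powell-style damping plus SdLBFGS-style spaced updates), not the paper's exact Sd-REG-LBFGS rule; all function names and the hyperparameter choices (`theta`, `update_every`, `mem`) are assumptions for the sketch.

```python
import numpy as np

def damped_correction(s, y, theta=0.2):
    """Powell-style damping (simplified, with B ~ I): blend y toward s
    so the curvature product s @ y_bar >= theta * (s @ s) > 0.
    This is a generic damping rule, not the paper's exact scheme."""
    sy, ss = s @ y, s @ s
    if sy < theta * ss:                      # curvature too small or negative
        tau = (1 - theta) * ss / (ss - sy)   # damping coefficient in (0, 1]
        y = tau * y + (1 - tau) * s          # now s @ y == theta * (s @ s)
    return y

def two_loop(grad, S, Y):
    """Standard L-BFGS two-loop recursion: returns H @ grad, where H is
    the inverse-Hessian approximation implied by the pairs (S, Y)."""
    q = grad.copy()
    rhos = [1.0 / (s @ y) for s, y in zip(S, Y)]
    alphas = []
    for s, y, rho in zip(reversed(S), reversed(Y), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if S:  # initial scaling gamma = s^T y / y^T y
        q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        q += (a - rho * (y @ q)) * s
    return q

def sd_lbfgs_sketch(grad_fn, x0, lr=0.1, mem=5, update_every=10, iters=100):
    """Stochastic L-BFGS sketch: curvature pairs are built only every
    `update_every` steps, from *averaged* iterates, as in the SdLBFGS
    family of methods; details differ from the paper."""
    x = x0.copy()
    S, Y = [], []
    x_avg_prev, x_sum = x0.copy(), np.zeros_like(x0)
    for t in range(1, iters + 1):
        g = grad_fn(x)
        d = two_loop(g, S, Y) if S else g    # quasi-Newton direction
        x = x - lr * d
        x_sum += x
        if t % update_every == 0:            # spaced curvature update
            x_avg = x_sum / update_every
            s = x_avg - x_avg_prev
            y = damped_correction(s, grad_fn(x_avg) - grad_fn(x_avg_prev))
            if len(S) == mem:                # limited memory: drop oldest
                S.pop(0); Y.pop(0)
            S.append(s); Y.append(y)
            x_avg_prev, x_sum = x_avg, np.zeros_like(x0)
    return x
```

Averaging the iterates before differencing reduces the noise in the stochastic correction pairs, which is what makes the spaced update both cheaper and more stable than differencing consecutive noisy iterates.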