ADVANCED ALGORITHM OF EVOLUTION STRATEGIES OF COVARIATION MATRIX ADAPTATION
Yu. A. Litvinchuk, I. Malyk
Bukovinian Mathematical Journal. DOI: 10.31861/bmj2022.02.09
The paper considers an extension of the CMA-ES algorithm that uses mixtures of distributions to find optimal hyperparameters of neural networks. Hyperparameter optimization, formulated as the optimization of a black-box objective function, is a necessary condition for the automation and high performance of machine learning approaches. CMA-ES is an efficient derivative-free optimization algorithm and one of the leading candidates among hyperparameter optimization methods. The developed algorithm is based on the assumption that the parameters of complex systems follow a multimodal (multi-peak) density distribution. Compared to other optimization methods, CMA-ES is computationally inexpensive and supports parallel computation. The results show that CMA-ES can be competitive, especially in the concurrent evaluation mode. However, a much broader and more detailed comparison is still needed, covering more test tasks and various modifications, such as the addition of constraints. Using the Monte Carlo method, it is shown that the new algorithm improves the search for optimal hyperparameters by an average of 12%.
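The core loop that the abstract describes — sampling candidates from a parameterized search distribution, selecting the best, and adapting the mean and covariance — can be illustrated with a minimal sketch. This is not the authors' algorithm: it is a heavily simplified (mu/lambda)-ES with a rank-mu-style covariance update on a toy sphere objective, omitting the evolution paths and step-size adaptation of full CMA-ES; all names and constants are illustrative. The paper's extension replaces the single Gaussian search distribution with a mixture of distributions to handle multimodal landscapes.

```python
import numpy as np

def toy_cma_es(f, m, sigma=0.5, lam=20, iters=80, seed=0):
    """Very simplified (mu/lambda) ES with a rank-mu-style covariance update.

    Illustrative only: real CMA-ES also maintains evolution paths and
    adapts the step size sigma; those mechanisms are omitted here.
    """
    rng = np.random.default_rng(seed)
    n = len(m)
    mu = lam // 2                      # number of selected ("parent") samples
    C = np.eye(n)                      # covariance of the search distribution
    for _ in range(iters):
        # Sample lam candidates from N(m, sigma^2 * C)
        A = np.linalg.cholesky(C)
        x = m + sigma * rng.standard_normal((lam, n)) @ A.T
        order = np.argsort([f(xi) for xi in x])
        best = x[order[:mu]]           # the mu best candidates
        y = (best - m) / sigma         # selected steps, in units of sigma
        m = best.mean(axis=0)          # move the mean toward good points
        # Rank-mu-style update: pull C toward the covariance of selected steps
        C = 0.7 * C + 0.3 * (y.T @ y) / mu
    return m

# Toy black-box objective standing in for a hyperparameter loss surface
sphere = lambda x: float(np.sum(x ** 2))
x_opt = toy_cma_es(sphere, m=np.array([3.0, -2.0]))
```

Because each generation's `lam` objective evaluations are independent, they can be dispatched in parallel — the property the abstract highlights as the "concurrent evaluation mode".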