The Whale Optimization Algorithm for Hyperparameter Optimization in Network-Wide Traffic Speed Prediction
Zhang-Han Zhuang, Ming-Chao Chiang
Proceedings of the 2020 ACM International Conference on Intelligent Computing and its Emerging Applications
Publication date: 2020-12-12
DOI: 10.1145/3440943.3444729 (https://doi.org/10.1145/3440943.3444729)
Citations: 0
Abstract
Because there are far too many possible combinations of hyperparameters for training a deep neural network (DNN), finding a suitable set of values for them is typically difficult for researchers who use DNNs to solve forecasting problems. Beyond manual tuning and trial and error, how to determine hyperparameter values automatically has become a critical problem in recent years. In this study, we present a metaheuristic algorithm based on the whale optimization algorithm (WOA) to select suitable hyperparameters for a DNN, because WOA demonstrates fast convergence on many optimization problems and its local optima avoidance mechanism is designed to keep the search from easily becoming trapped in suboptimal solutions. To validate the feasibility of the proposed algorithm, we compared it with several state-of-the-art hyperparameter selection algorithms for DNNs on the network-wide traffic speed prediction problem. The experimental results show that WOA not only behaves much more stably but also outperforms all the other hyperparameter selection algorithms compared in this study in terms of mean square error, mean absolute error, and mean absolute percentage error.
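To make the search mechanism concrete, the following is a minimal sketch of the canonical WOA loop described in the abstract, applied to a toy two-variable objective standing in for a DNN's validation error. This is not the authors' implementation: the population size, iteration budget, bounds, and objective function are all placeholder assumptions, and a real hyperparameter search would replace the objective with a train-and-evaluate run of the DNN.

```python
import math
import random

def woa_minimize(objective, bounds, n_whales=20, n_iters=100, seed=0):
    """Canonical Whale Optimization Algorithm sketch (assumed settings)."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize whale positions uniformly within the search bounds.
    whales = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_whales)]
    best = min(whales, key=objective)[:]
    best_val = objective(best)

    for t in range(n_iters):
        a = 2.0 * (1 - t / n_iters)  # control parameter decreasing from 2 to 0
        for i, x in enumerate(whales):
            A = 2 * a * rng.random() - a
            C = 2 * rng.random()
            new = x[:]
            if rng.random() < 0.5:
                if abs(A) < 1:
                    # Exploitation: encircle the current best solution.
                    for d in range(dim):
                        D = abs(C * best[d] - x[d])
                        new[d] = best[d] - A * D
                else:
                    # Exploration: move relative to a randomly chosen whale --
                    # the local optima avoidance mechanism.
                    rand = whales[rng.randrange(n_whales)]
                    for d in range(dim):
                        D = abs(C * rand[d] - x[d])
                        new[d] = rand[d] - A * D
            else:
                # Bubble-net spiral update around the best solution.
                l = rng.uniform(-1, 1)
                for d in range(dim):
                    D = abs(best[d] - x[d])
                    new[d] = D * math.exp(l) * math.cos(2 * math.pi * l) + best[d]
            # Clamp the new position back into the search bounds.
            new = [min(max(v, lo), hi) for v, (lo, hi) in zip(new, bounds)]
            whales[i] = new
            val = objective(new)
            if val < best_val:
                best, best_val = new[:], val
    return best, best_val

# Toy stand-in for "validation error as a function of two hyperparameters":
# a shifted sphere with its minimum at (3, -2).
obj = lambda x: (x[0] - 3) ** 2 + (x[1] + 2) ** 2
best, val = woa_minimize(obj, bounds=[(-10, 10), (-10, 10)])
print(best, val)
```

In a hyperparameter setting, each whale position would encode a candidate configuration (e.g., learning rate, layer width), and the shrinking parameter `a` shifts the population from global exploration toward exploitation of the best configuration found so far.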