M. Z. Awang Pon, Krishna Prakash K K
{"title":"Keras中深度学习模型的超参数调优","authors":"M. Z. Awang Pon, Krishna Prakash K K","doi":"10.55011/staiqc.2021.1104","DOIUrl":null,"url":null,"abstract":"Hyperparameter tuning or optimization is one of the fundamental way to improve the performance of the machine learning models. Hyper parameter is a parameter passed during the learning process of the model to make corrections or adjustments to the learning process. To generalise diverse data patterns, the same machine learning model may require different constraints, weights, or learning rates. Hyperparameters are the term for these kind of measurements. These parameters have been trial-anderror tested to ensure that the model can solve the machine learning task optimally. This paper focus on the science of hyperparameter tuning using some tools with experimental values and results of each experiments. We have also documented 4 metrics to analyze the hyperparameter tuning results and benchmark the outcome. The experimental results of two tools used commonly for deep learning models namely Keras tuner and AiSara tuner are captured in the article. All relevant experimental code is also available for readers in authors github repository. The metrics used to benchmark the results are accuracy, search time, cost and complexity and expalinability. The results indicate the overall performance of AiSara tuner in search time, cost and complexity and expalinability matrices are superior to keras tuners. © 2021 STAIQC. All rights reserved.","PeriodicalId":231409,"journal":{"name":"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing","volume":"86 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Hyperparameter Tuning of Deep learning Models in Keras\",\"authors\":\"M. Z. 
Awang Pon, Krishna Prakash K K\",\"doi\":\"10.55011/staiqc.2021.1104\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Hyperparameter tuning or optimization is one of the fundamental way to improve the performance of the machine learning models. Hyper parameter is a parameter passed during the learning process of the model to make corrections or adjustments to the learning process. To generalise diverse data patterns, the same machine learning model may require different constraints, weights, or learning rates. Hyperparameters are the term for these kind of measurements. These parameters have been trial-anderror tested to ensure that the model can solve the machine learning task optimally. This paper focus on the science of hyperparameter tuning using some tools with experimental values and results of each experiments. We have also documented 4 metrics to analyze the hyperparameter tuning results and benchmark the outcome. The experimental results of two tools used commonly for deep learning models namely Keras tuner and AiSara tuner are captured in the article. All relevant experimental code is also available for readers in authors github repository. The metrics used to benchmark the results are accuracy, search time, cost and complexity and expalinability. The results indicate the overall performance of AiSara tuner in search time, cost and complexity and expalinability matrices are superior to keras tuners. © 2021 STAIQC. 
All rights reserved.\",\"PeriodicalId\":231409,\"journal\":{\"name\":\"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing\",\"volume\":\"86 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.55011/staiqc.2021.1104\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.55011/staiqc.2021.1104","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Hyperparameter Tuning of Deep Learning Models in Keras
Hyperparameter tuning, or optimization, is one of the fundamental ways to improve the performance of machine learning models. A hyperparameter is a parameter set for the model's learning process to correct or adjust how that process unfolds. To generalize across diverse data patterns, the same machine learning model may require different constraints, weights, or learning rates; such settings are what the term hyperparameter refers to. These parameters are tested by trial and error to ensure that the model solves the machine learning task optimally. This paper focuses on the science of hyperparameter tuning using selected tools, reporting the experimental values and results of each experiment. We also document four metrics used to analyze the hyperparameter tuning results and benchmark the outcome: accuracy, search time, cost and complexity, and explainability. The article captures the experimental results of two tools commonly used for deep learning models, namely Keras Tuner and AiSara Tuner. All relevant experimental code is available to readers in the authors' GitHub repository. The results indicate that the overall performance of AiSara Tuner on the search-time, cost-and-complexity, and explainability metrics is superior to that of Keras Tuner. © 2021 STAIQC. All rights reserved.
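The abstract describes tuning hyperparameters such as learning rates by trial and error. As a minimal illustration of that idea only (this is not the Keras Tuner or AiSara Tuner API, and `score_model` is a hypothetical stand-in for actually training and validating a model), the sketch below runs a random search over a small hyperparameter space and keeps the best-scoring configuration:

```python
import random

# Hypothetical search space: each key is a hyperparameter,
# each value is the list of candidate settings to sample from.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "units": [32, 64, 128],
    "dropout": [0.0, 0.2, 0.5],
}

def score_model(config):
    # Stand-in for training a model and returning validation accuracy.
    # Deterministically prefers units=64 and learning_rate=1e-3,
    # so the search has a well-defined optimum to find.
    return 1.0 / (1.0 + abs(config["units"] - 64)
                      + abs(config["learning_rate"] - 1e-3))

def random_search(space, n_trials, seed=0):
    """Sample n_trials random configurations and return the best one."""
    rng = random.Random(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = {name: rng.choice(choices) for name, choices in space.items()}
        s = score_model(config)
        if s > best_score:
            best_config, best_score = config, s
    return best_config, best_score

best, score = random_search(SEARCH_SPACE, n_trials=20)
```

Tools like Keras Tuner automate exactly this loop (plus smarter strategies such as Hyperband and Bayesian optimization), with the scoring step replaced by real model training on a validation set.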