Hyperparameter Tuning of Deep Learning Models in Keras

M. Z. Awang Pon, Krishna Prakash K K
{"title":"Keras中深度学习模型的超参数调优","authors":"M. Z. Awang Pon, Krishna Prakash K K","doi":"10.55011/staiqc.2021.1104","DOIUrl":null,"url":null,"abstract":"Hyperparameter tuning or optimization is one of the fundamental way to improve the performance of the machine learning models. Hyper parameter is a parameter passed during the learning process of the model to make corrections or adjustments to the learning process. To generalise diverse data patterns, the same machine learning model may require different constraints, weights, or learning rates. Hyperparameters are the term for these kind of measurements. These parameters have been trial-anderror tested to ensure that the model can solve the machine learning task optimally. This paper focus on the science of hyperparameter tuning using some tools with experimental values and results of each experiments. We have also documented 4 metrics to analyze the hyperparameter tuning results and benchmark the outcome. The experimental results of two tools used commonly for deep learning models namely Keras tuner and AiSara tuner are captured in the article. All relevant experimental code is also available for readers in authors github repository. The metrics used to benchmark the results are accuracy, search time, cost and complexity and expalinability. The results indicate the overall performance of AiSara tuner in search time, cost and complexity and expalinability matrices are superior to keras tuners. © 2021 STAIQC. All rights reserved.","PeriodicalId":231409,"journal":{"name":"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing","volume":"86 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Hyperparameter Tuning of Deep learning Models in Keras\",\"authors\":\"M. Z. 
Awang Pon, Krishna Prakash K K\",\"doi\":\"10.55011/staiqc.2021.1104\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Hyperparameter tuning or optimization is one of the fundamental way to improve the performance of the machine learning models. Hyper parameter is a parameter passed during the learning process of the model to make corrections or adjustments to the learning process. To generalise diverse data patterns, the same machine learning model may require different constraints, weights, or learning rates. Hyperparameters are the term for these kind of measurements. These parameters have been trial-anderror tested to ensure that the model can solve the machine learning task optimally. This paper focus on the science of hyperparameter tuning using some tools with experimental values and results of each experiments. We have also documented 4 metrics to analyze the hyperparameter tuning results and benchmark the outcome. The experimental results of two tools used commonly for deep learning models namely Keras tuner and AiSara tuner are captured in the article. All relevant experimental code is also available for readers in authors github repository. The metrics used to benchmark the results are accuracy, search time, cost and complexity and expalinability. The results indicate the overall performance of AiSara tuner in search time, cost and complexity and expalinability matrices are superior to keras tuners. © 2021 STAIQC. 
All rights reserved.\",\"PeriodicalId\":231409,\"journal\":{\"name\":\"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing\",\"volume\":\"86 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.55011/staiqc.2021.1104\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sparklinglight Transactions on Artificial Intelligence and Quantum Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.55011/staiqc.2021.1104","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5

Abstract

Hyperparameter tuning, or optimization, is one of the fundamental ways to improve the performance of machine learning models. A hyperparameter is a parameter set for the model's learning process to correct or adjust how that learning proceeds. To generalise to diverse data patterns, the same machine learning model may require different constraints, weights, or learning rates; such settings are what the term "hyperparameters" refers to. These parameters are tested by trial and error to ensure that the model can solve the machine learning task optimally. This paper focuses on the science of hyperparameter tuning using several tools, with experimental values and results for each experiment. We also document four metrics to analyze the hyperparameter tuning results and benchmark the outcome. The article captures the experimental results of two tools commonly used for deep learning models, namely Keras Tuner and the AiSara tuner. All relevant experimental code is also available to readers in the authors' GitHub repository. The metrics used to benchmark the results are accuracy, search time, cost and complexity, and explainability. The results indicate that the overall performance of the AiSara tuner in the search time, cost and complexity, and explainability metrics is superior to Keras Tuner. © 2021 STAIQC. All rights reserved.
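The trial-and-error search described in the abstract can be sketched as a simple exhaustive grid search over a hyperparameter space. This is a minimal, self-contained illustration of the idea, not the paper's actual code: the objective function here is a toy stand-in whose optimum is known, whereas in practice a tool such as Keras Tuner would train and validate a real model at each trial.

```python
import itertools

def train_and_score(learning_rate, num_units):
    # Toy stand-in for model training: a known objective whose
    # optimum is learning_rate=0.01, num_units=64. In real use this
    # would fit a Keras model and return its validation accuracy.
    return 1.0 - abs(learning_rate - 0.01) * 10 - abs(num_units - 64) / 256

def grid_search(space):
    """Trial-and-error search: evaluate every hyperparameter combination
    and keep the best-scoring one."""
    best_score, best_params = float("-inf"), None
    keys = list(space)
    for values in itertools.product(*(space[k] for k in keys)):
        params = dict(zip(keys, values))
        score = train_and_score(**params)
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

space = {
    "learning_rate": [0.1, 0.01, 0.001],
    "num_units": [32, 64, 128, 256],
}
best_params, best_score = grid_search(space)
```

Grid search is exhaustive and transparent but scales poorly with the number of hyperparameters, which is why tuners such as those benchmarked in the paper use smarter search strategies to cut down search time and cost.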