Benchmarking of hyperparameter optimization techniques for machine learning applications in production
Maximilian Motz, Jonathan Krauß, Robert Heinrich Schmitt
Advances in Industrial and Manufacturing Engineering, November 2022. DOI: 10.1016/j.aime.2022.100099
Citations: 1
Abstract
Machine learning (ML) has become a key technology for leveraging the potential of the large amounts of data generated in digitized and connected production processes. In projects that develop ML solutions for production applications, the selection of hyperparameter optimization (HPO) techniques is a key task that significantly impacts the performance of the resulting ML solution. However, selecting the most suitable HPO technique for an ML use case is challenging: HPO techniques have individual strengths and weaknesses, and ML use cases in production vary widely in their application areas, objectives, and resources. This makes the selection of HPO techniques in production a complex task that requires decision support. We therefore present a structured approach for benchmarking HPO techniques and for integrating the empirical data generated in benchmarking experiments into decision support systems. Based on the data generated in a large-scale benchmarking study, the validation results show that using benchmarking data improves decision-making in HPO technique selection and thus helps to exploit the full potential of ML solutions in production applications.
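To make the idea of benchmarking HPO techniques concrete, the sketch below compares two common techniques (exhaustive grid search and random search) on a toy classification task. This is a minimal illustration under stated assumptions, not the paper's benchmark design: the scikit-learn estimators, the synthetic dataset, and the search space are all placeholders chosen for the example.

```python
# Minimal sketch of an HPO benchmarking harness (assumption: scikit-learn;
# the dataset, model, and search space are illustrative, not the paper's setup).
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

# Toy stand-in for a production dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

model = RandomForestClassifier(random_state=0)
search_space = {"n_estimators": [50, 100, 200], "max_depth": [4, 8, None]}

# Two HPO techniques applied to the same use case and search space.
candidates = [
    ("grid search", GridSearchCV(model, search_space, cv=3)),
    ("random search", RandomizedSearchCV(model, search_space, n_iter=5,
                                         cv=3, random_state=0)),
]

results = {}
for name, search in candidates:
    start = time.time()
    search.fit(X, y)  # run the HPO technique
    results[name] = {
        "best_cv_score": search.best_score_,  # solution quality
        "wall_time_s": time.time() - start,   # resource consumption
    }

# A benchmark records both quality and cost, since production use cases
# weigh these objectives differently.
for name, metrics in results.items():
    print(name, metrics)
```

In the spirit of the paper, running such experiments across many use cases would yield the empirical data that a decision support system could then draw on when recommending an HPO technique for a new production ML project.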