{"title":"Learning networks hyper-parameter using multi-objective optimization of statistical performance metrics","authors":"G. Torres, C. Sánchez, D. Gil","doi":"10.1109/SYNASC57785.2022.00044","DOIUrl":null,"url":null,"abstract":"Deep Learning has enabled remarkable progress over the last years on a several objectives, such as image and speech recognition, and machine translation. Deep neural architectures are a main contribution for this progress. Current architectures have mostly been developed manually by engineers, which is a time-consuming and error-prone process. Because of this, there is growing interest in automated neural architecture search methods. In this paper we present a strategy for the optimization of network hyper-parameters using a multi-objective Non-dominated Sorting Genetic Algorithm combined with a nested cross-validation to optimize statistical metrics of the performance of networks. In order to illustrate the proposed hyper-parameter optimization, we have applied it to a use case that uses transformers to map abstract radiomic features to specific radiological annotations. Results obtained with the LUNA16 public data base show generalization power of the proposed optimization strategy for hyper-parameter setting.","PeriodicalId":446065,"journal":{"name":"2022 24th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 24th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SYNASC57785.2022.00044","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Deep Learning has enabled remarkable progress in recent years on several objectives, such as image and speech recognition and machine translation. Deep neural architectures are a main contributor to this progress. Current architectures have mostly been developed manually by engineers, which is a time-consuming and error-prone process. Because of this, there is growing interest in automated neural architecture search methods. In this paper we present a strategy for optimizing network hyper-parameters using a multi-objective Non-dominated Sorting Genetic Algorithm combined with nested cross-validation, so that statistical metrics of network performance are optimized jointly. To illustrate the proposed hyper-parameter optimization, we apply it to a use case that uses transformers to map abstract radiomic features to specific radiological annotations. Results obtained with the LUNA16 public database show the generalization power of the proposed optimization strategy for hyper-parameter setting.
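To make the search strategy concrete, below is a minimal sketch of multi-objective hyper-parameter optimization in the spirit of the abstract: a Non-dominated Sorting Genetic Algorithm (NSGA-II, via the pymoo library) evaluates each candidate hyper-parameter setting with cross-validation and treats statistical summaries of the score (mean and standard deviation) as two objectives. This is not the authors' implementation: the transformer model, the LUNA16 data, and the outer loop of nested cross-validation are replaced here by a scikit-learn MLPClassifier, a synthetic dataset, and a single inner cross-validation, and the class name HyperParamProblem and the chosen search ranges are illustrative assumptions.

```python
# Sketch only: NSGA-II over two hyper-parameters, with cross-validated
# mean accuracy and accuracy spread as the two objectives to minimize.
# Stand-ins for the paper's setup: MLPClassifier instead of a transformer,
# a synthetic dataset instead of LUNA16, no outer CV loop.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

class HyperParamProblem(ElementwiseProblem):
    """Decision variables: log10(learning rate) and hidden-layer width.
    Objectives (both minimized): 1 - mean CV accuracy, and std of CV accuracy."""
    def __init__(self):
        super().__init__(n_var=2, n_obj=2,
                         xl=np.array([-4.0, 8.0]),     # lower bounds: log10(lr), width
                         xu=np.array([-1.0, 128.0]))   # upper bounds

    def _evaluate(self, x, out, *args, **kwargs):
        lr, width = 10.0 ** x[0], int(round(x[1]))
        model = MLPClassifier(hidden_layer_sizes=(width,),
                              learning_rate_init=lr,
                              max_iter=200, random_state=0)
        cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
        scores = cross_val_score(model, X, y, cv=cv)
        # Statistical performance metrics as objectives.
        out["F"] = [1.0 - scores.mean(), scores.std()]

res = minimize(HyperParamProblem(),
               NSGA2(pop_size=12),
               ("n_gen", 5),
               seed=1, verbose=False)
print("Pareto-front objectives (1 - mean accuracy, std of accuracy):")
print(res.F)
```

The result is a Pareto front of hyper-parameter settings trading off average accuracy against its variability across folds; in a nested cross-validation setting, this whole search would sit inside each outer fold and the selected configuration would then be scored on that fold's held-out data.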