{"title":"组成彻底优化的回归神经网络集合的进化方法","authors":"Lazar Krstic, Milos Ivanovic, Visnja Simic, Boban Stojanovic","doi":"10.1016/j.eij.2024.100581","DOIUrl":null,"url":null,"abstract":"<div><div>The paper presents the GeNNsem (<strong>Ge</strong>netic algorithm A<strong>NN</strong>s en<strong>sem</strong>ble) software framework for the simultaneous optimization of individual neural networks and building their optimal ensemble. The proposed framework employs a genetic algorithm to search for suitable architectures and hyperparameters of the individual neural networks to maximize the weighted sum of accuracy and diversity in their predictions. The optimal ensemble consists of networks with low errors but diverse predictions, resulting in a more generalized model. The scalability of the proposed framework is ensured by utilizing micro-services and Kubernetes batching orchestration. GeNNsem has been evaluated on two regression benchmark problems and compared with related machine learning techniques. The proposed approach exhibited supremacy over other ensemble approaches and individual neural networks in all common regression modeling metrics. Real-world use-case experiments in the domain of hydro-informatics have further demonstrated the main advantages of GeNNsem: requires the least training sessions for individual models when optimizing an ensemble; networks in an ensemble are generally simple due to the regularization provided by a trivial initial population and custom genetic operators; execution times are reduced by two orders of magnitude as a result of parallelization.</div></div>","PeriodicalId":56010,"journal":{"name":"Egyptian Informatics Journal","volume":"28 ","pages":"Article 100581"},"PeriodicalIF":5.0000,"publicationDate":"2024-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Evolutionary approach for composing a thoroughly optimized ensemble of regression neural networks\",\"authors\":\"Lazar Krstic, Milos Ivanovic, Visnja Simic, Boban Stojanovic\",\"doi\":\"10.1016/j.eij.2024.100581\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>The paper presents the GeNNsem (<strong>Ge</strong>netic algorithm A<strong>NN</strong>s en<strong>sem</strong>ble) software framework for the simultaneous optimization of individual neural networks and building their optimal ensemble. The proposed framework employs a genetic algorithm to search for suitable architectures and hyperparameters of the individual neural networks to maximize the weighted sum of accuracy and diversity in their predictions. The optimal ensemble consists of networks with low errors but diverse predictions, resulting in a more generalized model. The scalability of the proposed framework is ensured by utilizing micro-services and Kubernetes batching orchestration. GeNNsem has been evaluated on two regression benchmark problems and compared with related machine learning techniques. The proposed approach exhibited supremacy over other ensemble approaches and individual neural networks in all common regression modeling metrics. 
Real-world use-case experiments in the domain of hydro-informatics have further demonstrated the main advantages of GeNNsem: requires the least training sessions for individual models when optimizing an ensemble; networks in an ensemble are generally simple due to the regularization provided by a trivial initial population and custom genetic operators; execution times are reduced by two orders of magnitude as a result of parallelization.</div></div>\",\"PeriodicalId\":56010,\"journal\":{\"name\":\"Egyptian Informatics Journal\",\"volume\":\"28 \",\"pages\":\"Article 100581\"},\"PeriodicalIF\":5.0000,\"publicationDate\":\"2024-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Egyptian Informatics Journal\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1110866524001440\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Egyptian Informatics Journal","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1110866524001440","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Evolutionary approach for composing a thoroughly optimized ensemble of regression neural networks
This paper presents GeNNsem (Genetic algorithm ANNs ensemble), a software framework that simultaneously optimizes individual neural networks and builds an optimal ensemble from them. The framework employs a genetic algorithm to search for suitable architectures and hyperparameters of the individual networks, maximizing a weighted sum of the accuracy and diversity of their predictions. The optimal ensemble consists of networks with low errors but diverse predictions, which yields a more generalizable model. Scalability is ensured by micro-services and Kubernetes batch orchestration. GeNNsem has been evaluated on two regression benchmark problems and compared with related machine learning techniques; it outperformed other ensemble approaches and individual neural networks on all common regression modeling metrics. Real-world use-case experiments in the domain of hydro-informatics further demonstrated the main advantages of GeNNsem: it requires the fewest training sessions for individual models when optimizing an ensemble; the networks in an ensemble are generally simple, owing to the regularization provided by a trivial initial population and custom genetic operators; and parallelization reduces execution times by two orders of magnitude.
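The central idea in the abstract is a genetic-algorithm fitness that rewards candidate networks for being both accurate and different from their peers. The following is a minimal, hypothetical sketch of such a weighted fitness; the RMSE-based accuracy score, the deviation-from-ensemble-mean diversity score, the min-max rescaling, and the weights w_acc/w_div are illustrative assumptions, not GeNNsem's actual implementation.

```python
# Minimal, hypothetical sketch of an accuracy-plus-diversity fitness for a population
# of candidate regression networks, in the spirit of the abstract. All details
# (RMSE for accuracy, deviation from the ensemble mean for diversity, min-max
# rescaling, default weights) are assumptions for illustration, not GeNNsem's code.
import numpy as np


def fitness(member_preds: np.ndarray, y_true: np.ndarray,
            w_acc: float = 0.7, w_div: float = 0.3) -> np.ndarray:
    """Return one fitness value per candidate network (higher is better).

    member_preds: shape (n_members, n_samples), predictions of every candidate.
    y_true:       shape (n_samples,), ground-truth targets.
    """
    # Accuracy term: RMSE of each member, min-max rescaled so the best member gets 1.
    rmse = np.sqrt(np.mean((member_preds - y_true) ** 2, axis=1))
    acc = 1.0 - (rmse - rmse.min()) / (rmse.max() - rmse.min() + 1e-12)

    # Diversity term: mean squared deviation of each member from the ensemble mean,
    # rescaled to [0, 1]; members that disagree with the consensus score higher.
    dev = np.mean((member_preds - member_preds.mean(axis=0)) ** 2, axis=1)
    div = (dev - dev.min()) / (dev.max() - dev.min() + 1e-12)

    # Weighted sum of the two objectives, as described in the abstract.
    return w_acc * acc + w_div * div


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.normal(size=200)
    # Five hypothetical candidates: noisy copies of the target with varying noise levels.
    preds = np.stack([y + rng.normal(scale=s, size=200) for s in (0.1, 0.2, 0.3, 0.4, 0.5)])
    print(fitness(preds, y))
```

In a scheme like this, the balance between the two terms matters: weighting diversity too heavily can admit weak networks into the ensemble, while ignoring it collapses the population toward near-identical models.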
About the journal:
The Egyptian Informatics Journal is published by the Faculty of Computers and Artificial Intelligence, Cairo University. The Journal provides a forum for state-of-the-art research and development in the fields of computing, including computer science, information technologies, information systems, operations research, and decision support. Submissions of innovative, previously unpublished work in subjects covered by the Journal are encouraged, whether from academic, research, or commercial sources.