M.A. Ganaie, Yogesh Kumar, Anshika Bhatia, Chavda Jayrajsinh
{"title":"Ensemble deep generalized eigen-value random vector functional link network for classification problems","authors":"M.A. Ganaie, Yogesh Kumar, Anshika Bhatia, Chavda Jayrajsinh","doi":"10.1016/j.compeleceng.2024.110040","DOIUrl":null,"url":null,"abstract":"<div><div>Random vector functional link neural networks have been widely used across applications due to their universal approximation property. The standard random vector functional link neural network consists of a single hidden layer network, and hence, the generalization suffers due to poor representation of features. In this work, we propose ensemble deep generalized eigen value proximal random vector functional link (edGERVFL) network for classification problems. The proposed edGERVFL improves the architecture twofold: generating a better feature representation via deep framework, followed by the ensembling of the base learners, composed of multilayer architecture, to improve the generalization performance of the model. Unlike standard RVFL-based models, the weights are optimized by solving the generalized eigenvalue problem. To showcase the performance of the proposed edGERVFL model, experiments are conducted on diverse tabular UCI binary class datasets. The experimental findings, coupled with the statistical analysis, indicate that the edGERVFL model outperforms the provided baseline models.</div></div>","PeriodicalId":50630,"journal":{"name":"Computers & Electrical Engineering","volume":"123 ","pages":"Article 110040"},"PeriodicalIF":4.0000,"publicationDate":"2025-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers & Electrical Engineering","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0045790624009650","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
引用次数: 0
Abstract
Random vector functional link (RVFL) neural networks have been widely used across applications due to their universal approximation property. The standard RVFL network consists of a single hidden layer, and hence its generalization suffers from poor feature representation. In this work, we propose an ensemble deep generalized eigenvalue proximal random vector functional link (edGERVFL) network for classification problems. The proposed edGERVFL improves the architecture in two ways: it generates a better feature representation via a deep framework, and then ensembles the base learners, each composed of a multilayer architecture, to improve the generalization performance of the model. Unlike standard RVFL-based models, the weights are optimized by solving a generalized eigenvalue problem. To showcase the performance of the proposed edGERVFL model, experiments are conducted on diverse tabular UCI binary-class datasets. The experimental findings, coupled with statistical analysis, indicate that the edGERVFL model outperforms the baseline models.
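For readers unfamiliar with the building blocks, the sketch below illustrates the two ingredients the abstract names: a fixed random RVFL-style feature expansion with a direct link, and output weights obtained by solving a generalized eigenvalue problem (in the GEPSVM spirit). It is a minimal single-learner illustration with made-up data and hyperparameters (the regularization `delta`, 64 hidden units), not the authors' edGERVFL implementation, which additionally stacks layers and ensembles several base learners.

```python
# Hypothetical sketch: RVFL-style random features + generalized-eigenvalue weights.
# Assumptions (not from the paper): toy data, tanh activation, GEPSVM-style proximal
# hyperplanes, Tikhonov regularization delta = 1e-3.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# Toy binary-class data: 200 samples, 10 features, labels in {+1, -1}.
X = rng.normal(size=(200, 10))
y = np.where(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200) > 0, 1, -1)

# RVFL-style hidden layer: random input weights and biases, left untrained.
W = rng.normal(size=(10, 64))
b = rng.normal(size=64)
H = np.tanh(X @ W + b)

# Direct link: concatenate original features with hidden features, plus a bias column.
Z = np.hstack([X, H, np.ones((len(X), 1))])

# Generalized-eigenvalue weight estimation (GEPSVM-style): the positive-class
# hyperplane minimizes ||Z_pos w||^2 / ||Z_neg w||^2, i.e. it is the eigenvector
# with the smallest eigenvalue of the regularized symmetric pencil (G, L).
Zp, Zn = Z[y > 0], Z[y < 0]
delta = 1e-3                      # keeps both matrices positive definite
I = np.eye(Z.shape[1])
G = Zp.T @ Zp + delta * I
L = Zn.T @ Zn + delta * I

w_pos = eigh(G, L)[1][:, 0]       # hyperplane closest to the positive class
w_neg = eigh(L, G)[1][:, 0]       # hyperplane closest to the negative class
w_pos /= np.linalg.norm(w_pos)
w_neg /= np.linalg.norm(w_neg)

# Assign each sample to the class whose proximal hyperplane it lies nearer to.
pred = np.where(np.abs(Z @ w_pos) <= np.abs(Z @ w_neg), 1, -1)
print("toy training accuracy:", np.mean(pred == y))
```

In a deep variant, the hidden features of one layer would feed the random expansion of the next, and an ensemble would average or vote over the predictions of several such base learners.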
Journal Description:
The impact of computers has nowhere been more revolutionary than in electrical engineering. The design, analysis, and operation of electrical and electronic systems are now dominated by computers, a transformation that has been motivated by the natural ease of interface between computers and electrical systems, and the promise of spectacular improvements in speed and efficiency.
Published since 1973, Computers & Electrical Engineering provides rapid publication of topical research into the integration of computer technology and computational techniques with electrical and electronic systems. The journal publishes papers featuring novel implementations of computers and computational techniques in areas like signal and image processing, high-performance computing, parallel processing, and communications. Special attention will be paid to papers describing innovative architectures, algorithms, and software tools.