Modified Gram-Schmidt Algorithm for Extreme Learning Machine

Jianchuan Yin, Fang Dong, Nini Wang
{"title":"Modified Gram-Schmidt Algorithm for Extreme Learning Machine","authors":"Jianchuan Yin, Fang Dong, Nini Wang","doi":"10.1109/ISCID.2009.275","DOIUrl":null,"url":null,"abstract":"Extreme learning machine (ELM) has shown to be extremely fast with better generalization performance. The basic idea of ELM algorithm is to randomly choose the parameters of hidden nodes and then use simple generalized inverse operation to solve for the output weights of the network. Such a procedure faces two problems. First, ELM tends to require more random hidden nodes than conventional tuning-based algorithms. Second, subjectivity is involved in choosing appropriate number of random hidden nodes. In this paper, we propose an enhanced-ELM(en-ELM) algorithm by applying the modified Gram-Schmidt (MGS) method to select hidden nodes in random hidden nodes pool. Furthermore, enhanced-ELM uses the Akaike's final prediction error (FPE) criterion to automatically determine the number of random hidden nodes. In comparison with conventional ELM learning method on several commonly used regressor benchmark problems, enhanced-ELM algorithm can achieve compact network with much faster response and satisfactory accuracy.","PeriodicalId":294370,"journal":{"name":"International Symposium on Computational Intelligence and Design","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Symposium on Computational Intelligence and Design","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISCID.2009.275","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

Extreme learning machine (ELM) has been shown to be extremely fast while offering good generalization performance. The basic idea of the ELM algorithm is to choose the hidden-node parameters randomly and then solve for the network's output weights with a simple generalized-inverse operation. This procedure faces two problems. First, ELM tends to require more random hidden nodes than conventional tuning-based algorithms. Second, choosing an appropriate number of random hidden nodes involves subjectivity. In this paper, we propose an enhanced ELM (en-ELM) algorithm that applies the modified Gram-Schmidt (MGS) method to select hidden nodes from a pool of random hidden nodes. Furthermore, en-ELM uses Akaike's final prediction error (FPE) criterion to determine the number of random hidden nodes automatically. Compared with the conventional ELM learning method on several commonly used regression benchmark problems, the en-ELM algorithm achieves a compact network with much faster response and satisfactory accuracy.
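The baseline procedure the abstract describes, randomly chosen hidden-node parameters followed by a generalized-inverse solve for the output weights, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the tanh activation and uniform [-1, 1] initialization are assumptions.

```python
import numpy as np

def elm_train(X, y, n_hidden, rng=None):
    """Basic ELM: random hidden layer, pseudoinverse output weights (sketch)."""
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    # Hidden-node parameters are drawn at random and never tuned.
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)            # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y      # output weights via generalized inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because only the linear output weights are solved for, training reduces to a single least-squares problem, which is the source of ELM's speed.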
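The en-ELM selection step, greedily picking nodes from a random pool while orthogonalizing candidates with modified Gram-Schmidt and stopping when Akaike's FPE stops improving, might look like the sketch below. The scoring rule (squared correlation of each deflated candidate with the current residual) and the exact FPE form are assumptions; the paper's precise formulation is not reproduced here.

```python
import numpy as np

def fpe(residual_sse, n_samples, n_params):
    """Akaike's final prediction error criterion (one common form, assumed)."""
    mse = residual_sse / n_samples
    return mse * (n_samples + n_params) / (n_samples - n_params)

def select_nodes_mgs(H_pool, y, max_nodes=None):
    """Forward-select hidden nodes from a random pool (sketch).

    At each step, candidates are deflated against the already-selected
    (orthonormalized) node outputs via modified Gram-Schmidt; the candidate
    best aligned with the residual is added; selection stops when FPE rises.
    """
    N, M = H_pool.shape
    max_nodes = max_nodes or M
    selected = []
    residual = y.astype(float).copy()
    cand = H_pool.astype(float).copy()   # candidates, MGS-deflated in place
    best_fpe = np.inf
    for k in range(1, max_nodes + 1):
        # Score remaining candidates by squared correlation with the residual.
        norms = np.linalg.norm(cand, axis=0)
        norms[norms < 1e-12] = np.inf    # skip (near-)dependent candidates
        scores = (cand.T @ residual) ** 2 / norms ** 2
        scores[selected] = -np.inf
        j = int(np.argmax(scores))
        q = cand[:, j] / np.linalg.norm(cand[:, j])
        # Modified Gram-Schmidt: deflate all candidates and the residual by q.
        cand -= np.outer(q, q @ cand)
        residual = residual - q * (q @ residual)
        selected.append(j)
        current = fpe(residual @ residual, N, k)
        if current >= best_fpe:
            selected.pop()               # last node did not help; undo and stop
            break
        best_fpe = current
    return selected
```

Greedy selection over an orthogonalized pool means each node's error reduction is judged independently of those already chosen, which is what lets the stopping criterion prune the pool to a compact network.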