Extreme learning machine for regression based on condition number and variance decomposition ratio

Meiyi Li, Weibiao Cai, Qingshuai Sun
{"title":"基于条件数和方差分解比的极值学习机回归","authors":"Meiyi Li, Weibiao Cai, Qingshuai Sun","doi":"10.1145/3208788.3208794","DOIUrl":null,"url":null,"abstract":"The extreme learning machine (ELM) is a novel single hidden layer feedforward neural network. Compared with traditional neural network algorithm, ELM has the advantages of fast learning speed and good generalization performance. However, there are still some shortages that restrict the further development of ELM, such as the perturbation and multicollinearity in the linear model. To the adverse effects caused by the perturbation and the multicollinearity, this paper proposes ELM based on condition number and variance decomposition ratio (CVELM) for regression, which separates the interference terms in the model by condition number and variance decomposition ratio, and then manipulate the interference items with weighted. Finally, the output layer weight is calculated by the least square method. The proposed algorithm can not only get good stability of the algorithm, but also reduce the impact on the non-interference items when dealing with the interference terms. The regression experiments on several datasets show that the proposed method owns a good generalization performance and stability.","PeriodicalId":211585,"journal":{"name":"Proceedings of 2018 International Conference on Mathematics and Artificial Intelligence","volume":"118 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Extreme learning machine for regression based on condition number and variance decomposition ratio\",\"authors\":\"Meiyi Li, Weibiao Cai, Qingshuai Sun\",\"doi\":\"10.1145/3208788.3208794\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The extreme learning machine (ELM) is a novel single hidden layer feedforward neural network. Compared with traditional neural network algorithm, ELM has the advantages of fast learning speed and good generalization performance. However, there are still some shortages that restrict the further development of ELM, such as the perturbation and multicollinearity in the linear model. To the adverse effects caused by the perturbation and the multicollinearity, this paper proposes ELM based on condition number and variance decomposition ratio (CVELM) for regression, which separates the interference terms in the model by condition number and variance decomposition ratio, and then manipulate the interference items with weighted. Finally, the output layer weight is calculated by the least square method. The proposed algorithm can not only get good stability of the algorithm, but also reduce the impact on the non-interference items when dealing with the interference terms. 
The regression experiments on several datasets show that the proposed method owns a good generalization performance and stability.\",\"PeriodicalId\":211585,\"journal\":{\"name\":\"Proceedings of 2018 International Conference on Mathematics and Artificial Intelligence\",\"volume\":\"118 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-04-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 2018 International Conference on Mathematics and Artificial Intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3208788.3208794\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 2018 International Conference on Mathematics and Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3208788.3208794","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

The extreme learning machine (ELM) is a novel single-hidden-layer feedforward neural network. Compared with traditional neural network algorithms, ELM offers fast learning speed and good generalization performance. However, some shortcomings still restrict its further development, such as perturbation and multicollinearity in the linear model. To address the adverse effects of perturbation and multicollinearity, this paper proposes an ELM for regression based on condition number and variance decomposition ratio (CVELM). The method separates the interference terms in the model using the condition number and the variance decomposition ratio, applies weights to those interference terms, and finally computes the output-layer weights by the least-squares method. The proposed algorithm not only achieves good stability but also reduces the impact on the non-interference terms when handling the interference terms. Regression experiments on several datasets show that the proposed method has good generalization performance and stability.
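As a rough illustration of the diagnostics named in the abstract, the sketch below trains a standard ELM and flags collinear hidden nodes using condition indices and variance-decomposition proportions computed from the SVD of the hidden-layer output matrix H. The abstract does not give the authors' weighting scheme, so the selective penalty on flagged nodes, together with the function names, thresholds, and parameters, is an illustrative assumption rather than the CVELM formulation itself.

```python
# Minimal sketch, assuming a standard ELM pipeline plus Belsley-style
# collinearity diagnostics; the selective penalty is NOT the paper's method.
import numpy as np

def elm_cvd_regression(X, T, n_hidden=50, cond_index_thresh=30.0,
                       vdp_thresh=0.5, penalty=1e-2, seed=None):
    """Train a single-hidden-layer ELM, flag collinear ("interference")
    hidden nodes via condition indices and variance-decomposition
    proportions, and solve for output weights by least squares."""
    rng = np.random.default_rng(seed)
    N, d = X.shape

    # Randomly assigned input weights and biases (standard ELM step).
    W = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                    # hidden-layer output matrix

    # Collinearity diagnostics on H via its singular value decomposition.
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    cond_indices = s.max() / s                # condition index per component
    # Variance-decomposition proportions: phi[j, k] = V[k, j]^2 / s[j]^2,
    # normalized over components j for each hidden node k.
    phi = (Vt ** 2) / (s[:, None] ** 2)
    vdp = phi / phi.sum(axis=0, keepdims=True)

    # A hidden node counts as an interference term if it loads heavily
    # (VDP > vdp_thresh) on a component with a large condition index.
    weak = cond_indices > cond_index_thresh
    interference = (vdp[weak] > vdp_thresh).any(axis=0)

    # Least-squares solution with an extra penalty only on interference
    # nodes (an assumed stand-in for the paper's weighting of those terms).
    D = np.diag(np.where(interference, penalty, 0.0))
    beta = np.linalg.lstsq(H.T @ H + D, H.T @ T, rcond=None)[0]
    return W, b, beta, interference

def elm_predict(X, W, b, beta):
    """Predict with a trained ELM: recompute H and apply output weights."""
    return np.tanh(X @ W + b) @ beta
```

Setting penalty=0 recovers a plain least-squares ELM, which makes it easy to compare the unweighted solution against the variant that penalizes the flagged interference nodes.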