Orthogonal stochastic configuration networks with adaptive construction parameter for data analytics

Wei Dai, Chuanfeng Ning, Shiyu Pei, Song Zhu, Xuesong Wang
{"title":"Orthogonal stochastic configuration networks with adaptive construction parameter for data analytics","authors":"Wei Dai, Chuanfeng Ning, Shiyu Pei, Song Zhu, Xuesong Wang","doi":"10.1007/s44244-023-00004-4","DOIUrl":null,"url":null,"abstract":"Abstract As a randomized learner model, SCNs are remarkable that the random weights and biases are assigned employing a supervisory mechanism to ensure universal approximation and fast learning. However, the randomness makes SCNs more likely to generate approximate linear correlative nodes that are redundant and low quality, thereby resulting in non-compact network structure. In light of a fundamental principle in machine learning, that is, a model with fewer parameters holds improved generalization. This paper proposes orthogonal SCN, termed OSCN, to filtrate out the low-quality hidden nodes for network structure reduction by incorporating Gram–Schmidt orthogonalization technology. The universal approximation property of OSCN and an adaptive setting for the key construction parameters have been presented in details. In addition, an incremental updating scheme is developed to dynamically determine the output weights, contributing to improved computational efficiency. 
Finally, experimental results on two numerical examples and several real-world regression and classification datasets substantiate the effectiveness and feasibility of the proposed approach.","PeriodicalId":474480,"journal":{"name":"Industrial Artificial Intelligence","volume":"354 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Industrial Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s44244-023-00004-4","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

As a randomized learner model, stochastic configuration networks (SCNs) are remarkable in that their random weights and biases are assigned under a supervisory mechanism that guarantees universal approximation and fast learning. However, this randomness makes SCNs prone to generating approximately linearly correlated hidden nodes, which are redundant and of low quality and therefore lead to a non-compact network structure. Guided by a fundamental principle in machine learning, namely that a model with fewer parameters tends to generalize better, this paper proposes an orthogonal SCN, termed OSCN, which filters out low-quality hidden nodes to reduce the network structure by incorporating Gram–Schmidt orthogonalization. The universal approximation property of OSCN and an adaptive setting for the key construction parameters are presented in detail. In addition, an incremental updating scheme is developed to determine the output weights dynamically, improving computational efficiency. Finally, experimental results on two numerical examples and several real-world regression and classification datasets substantiate the effectiveness and feasibility of the proposed approach.
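The Gram–Schmidt filtering idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function name `accept_node` and the tolerance `tol` are assumptions introduced here. A candidate hidden node's output vector is orthogonalized against the outputs of already-accepted nodes; if the remaining orthogonal component is negligible, the candidate is nearly linearly dependent on existing nodes (redundant, low quality) and is rejected.

```python
import numpy as np

def accept_node(H, h_new, tol=1e-3):
    """Gram-Schmidt redundancy check for a candidate hidden node.

    H     : (n_samples, n_nodes) outputs of already-accepted hidden nodes
    h_new : (n_samples,) output vector of the candidate node
    tol   : illustrative threshold on the relative orthogonal residual

    Returns (accepted, residual), where residual is the component of
    h_new orthogonal to the span of the columns of H.
    """
    r = np.asarray(h_new, dtype=float).copy()
    for j in range(H.shape[1]):
        q = H[:, j]
        # subtract the projection of r onto the j-th accepted node output
        r -= (q @ r) / (q @ q) * q
    # reject candidates that are almost linearly dependent on existing nodes
    accepted = np.linalg.norm(r) / np.linalg.norm(h_new) > tol
    return accepted, r
```

Keeping only candidates with a sizable orthogonal residual is what yields a more compact network: each retained node contributes a direction in function space that the existing nodes cannot already represent.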