Random separation learning for neural network ensembles

Yong Liu
{"title":"Random separation learning for neural network ensembles","authors":"Yong Liu","doi":"10.1109/CISP-BMEI.2017.8302328","DOIUrl":null,"url":null,"abstract":"In order to prevent the individual neural networks from becoming similar in the long learning period of negative correlation learning for designing neural network ensembles, two approaches were adopted in this paper. The first approach is to replace large neural networks with small neural networks in neural network ensembles. Samll neural networks would be more practical in the real applications when the capability is limited. The second approach is to introduce random separation learning in negative correlation learning for each small neural network. The idea of random separation learning is to let each individual neural network learn differently on the randomly separated subsets of the given training samples. It has been found that the small neural networks could easily become weak and different each other by negative correlation learning with random separation learning. After applying large number of small neural networks for neural network ensembles, two combination methods were used to generate the output of the neural network ensembles while their performance had been compared.","PeriodicalId":6474,"journal":{"name":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","volume":"108 1","pages":"1-4"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISP-BMEI.2017.8302328","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

To prevent the individual neural networks from becoming similar over the long learning period of negative correlation learning for designing neural network ensembles, two approaches were adopted in this paper. The first approach is to replace large neural networks with small neural networks in the ensembles; small neural networks are more practical in real applications where computational capacity is limited. The second approach is to introduce random separation learning into negative correlation learning for each small neural network. The idea of random separation learning is to let each individual neural network learn differently on randomly separated subsets of the given training samples. It was found that the small neural networks could easily become weak and different from each other through negative correlation learning with random separation learning. After applying a large number of small neural networks to the ensembles, two combination methods were used to generate the output of the neural network ensembles, and their performance was compared.
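The abstract gives no implementation details, but the combination of negative correlation learning (NCL) and random separation is straightforward to sketch. The following is a minimal illustration, not the paper's actual method: it assumes Liu and Yao's standard NCL error for network i, E_i = (1/2)(F_i - y)^2 - lam*(F_i - F_bar)^2 with F_bar the ensemble mean output; it interprets "randomly separated subsets" as a fresh random partition of the training set among the networks at every epoch; and it uses simple averaging and majority voting as the two combination methods. All function names and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in, n_hidden):
    """One 'small' network: a single tanh hidden layer with a linear output."""
    return {
        "W1": rng.normal(0.0, 0.5, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0.0, 0.5, (n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(net, X):
    H = np.tanh(X @ net["W1"] + net["b1"])
    return H, (H @ net["W2"] + net["b2"]).ravel()

def ncl_epoch(nets, X, y, lam=0.5, lr=0.05):
    """One NCL epoch with random separation (assumed: re-partition each epoch)."""
    M = len(nets)
    # Random separation: split a fresh permutation of the data into M disjoint subsets.
    parts = np.array_split(rng.permutation(len(X)), M)
    for net, idx in zip(nets, parts):
        Xs, ys = X[idx], y[idx]
        # Ensemble mean output F_bar on this network's own subset.
        F_bar = np.mean([forward(n, Xs)[1] for n in nets], axis=0)
        H, Fi = forward(net, Xs)
        # Standard NCL error signal: dE_i/dF_i = (F_i - y) - lam * (F_i - F_bar).
        delta = (Fi - ys) - lam * (Fi - F_bar)
        # Plain backprop through the small network.
        gW2 = H.T @ delta[:, None] / len(idx)
        gb2 = np.array([delta.mean()])
        dH = np.outer(delta, net["W2"].ravel()) * (1.0 - H**2)
        gW1 = Xs.T @ dH / len(idx)
        gb1 = dH.mean(axis=0)
        net["W1"] -= lr * gW1; net["b1"] -= lr * gb1
        net["W2"] -= lr * gW2; net["b2"] -= lr * gb2

def combine_mean(nets, X):
    """Combination method 1 (assumed): simple averaging of raw outputs."""
    return np.mean([forward(n, X)[1] for n in nets], axis=0)

def combine_vote(nets, X):
    """Combination method 2 (assumed): majority voting on thresholded outputs."""
    votes = np.stack([forward(n, X)[1] > 0.5 for n in nets])
    return votes.mean(axis=0) > 0.5

# Toy usage: an ensemble of 20 small networks with 3 hidden nodes each.
X = rng.normal(size=(400, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
nets = [init_net(5, 3) for _ in range(20)]
for _ in range(200):
    ncl_epoch(nets, X, y)
print("vote accuracy:", (combine_vote(nets, X) == y).mean())
print("mean-output accuracy:", ((combine_mean(nets, X) > 0.5) == y).mean())
```

Under this reading, each small network sees only a 1/M fraction of the data per epoch, while the NCL penalty -lam*(F_i - F_bar)^2 pushes its output away from the ensemble mean; together these are what keep the individual networks weak and mutually different, as the abstract describes.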