Deep Random Vector Functional Link Network for handwritten character recognition

H. Cecotti
{"title":"Deep Random Vector Functional Link Network for handwritten character recognition","authors":"H. Cecotti","doi":"10.1109/IJCNN.2016.7727666","DOIUrl":null,"url":null,"abstract":"The field of artificial neural networks has a long history of several decades, where the theoretical contributions have progressed with advances in terms of power and memory in present day computers. Some old methods are now rebranded or represented, taking advantage of the power of present day computers. More particularly, we consider the current trend of Random Vector Functional Link Networks, which suggests that the architecture of a system and the learning algorithm should be properly decoupled. In this paper, we evaluate the performance of multi-layers Random Vector Functional Link Network (RVFL)/ extreme machine learning (EML) on four databases of handwritten characters. Particularly, we evaluate the impact of the architecture (number of neurons per hidden layer), and the robustness of the distribution of the results across different runs. By combining the classifier outputs from different runs, we show that such a maximum combination rule provides an accuracy of 95.97% for Arabic digits, 98.03% for Bangla, 98.64% for Devnagari, and 96.30% for Oriya digits. The results confirm that increasing the size of the hidden layers has a significant impact on the accuracy, and allows to reach state-of-the-art performance; however the performance reaches a plateau after a certain size of the hidden layers.","PeriodicalId":109405,"journal":{"name":"2016 International Joint Conference on Neural Networks (IJCNN)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2016.7727666","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 16

Abstract

The field of artificial neural networks has a history spanning several decades, during which theoretical contributions have progressed alongside advances in the computational power and memory of present-day computers. Some older methods are now being rebranded or revisited to take advantage of this power. In particular, we consider the current trend of Random Vector Functional Link (RVFL) networks, which suggests that the architecture of a system and the learning algorithm should be properly decoupled. In this paper, we evaluate the performance of multi-layer RVFL networks / extreme learning machines (ELM) on four databases of handwritten characters. In particular, we evaluate the impact of the architecture (the number of neurons per hidden layer) and the robustness of the distribution of the results across different runs. By combining the classifier outputs from different runs with a maximum combination rule, we obtain an accuracy of 95.97% for Arabic digits, 98.03% for Bangla, 98.64% for Devnagari, and 96.30% for Oriya digits. The results confirm that increasing the size of the hidden layers has a significant impact on accuracy and makes it possible to reach state-of-the-art performance; however, performance plateaus once the hidden layers exceed a certain size.
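
The abstract does not give implementation details, but the core idea of an RVFL/ELM layer and of combining runs with a maximum rule can be sketched in a few lines. The following is a minimal single-hidden-layer sketch under assumed choices (uniform random weights in [-1, 1], a tanh activation, ridge-regularised least squares for the output weights); the helper names train_rvfl, predict_rvfl, and max_rule_ensemble are hypothetical and not from the paper.

import numpy as np

def train_rvfl(X, Y, n_hidden=1000, reg=1e-3, seed=0):
    # X: (n_samples, n_features); Y: (n_samples, n_classes) one-hot targets.
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input-to-hidden weights, never trained
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                                   # random non-linear features
    D = np.hstack([H, X])                                    # direct input-output links (the "functional link")
    # Only the output weights are learned, in closed form (ridge regression);
    # this is the decoupling of architecture and learning mentioned above.
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def predict_rvfl(X, model):
    W, b, beta = model
    return np.hstack([np.tanh(X @ W + b), X]) @ beta         # per-class scores

def max_rule_ensemble(score_list):
    # Element-wise maximum of the score matrices from several independent runs,
    # then predict the class with the highest combined score per sample.
    return np.argmax(np.maximum.reduce(score_list), axis=1)

The paper's deep variant stacks several such randomly weighted hidden layers, whereas this sketch shows only a single layer together with the run-combination step: each run is trained with a different random seed, predict_rvfl is applied to the test set, and the resulting score matrices are merged with max_rule_ensemble.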