Residual Recurrent Neural Network with Sparse Training for Offline Arabic Handwriting Recognition

Ruijie Yan, Liangrui Peng, GuangXiang Bin, Shengjin Wang, Yao Cheng
2017 14th IAPR International Conference on Document Analysis and Recognition (ICDAR), November 2017
DOI: 10.1109/ICDAR.2017.171
Citations: 9

Abstract

Deep Recurrent Neural Networks (RNN) suffer from overfitting due to redundancy in their network structures. We propose a novel temporal and spatial residual learning method for RNN, followed by sparse training via weight pruning to gain sparsity in the network parameters. For a Long Short-Term Memory (LSTM) network, we explore combination schemes and parameter settings for temporal and spatial residual learning with sparse training. Experiments are carried out on the IFN/ENIT database. For the character error rate on test set e when training on sets a, b, c, and d, the previously reported best result is 13.42%; the proposed configuration of temporal residual learning followed by sparse training achieves a new state-of-the-art result of 12.06%.
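The sparse-training step described above relies on weight pruning. A minimal sketch of generic magnitude-based pruning is shown below; the paper's exact pruning schedule, per-layer thresholds, and retraining procedure are not specified in the abstract, so the `magnitude_prune` helper and the 50% sparsity level here are illustrative assumptions:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of a weight matrix.

    Returns the pruned matrix and a boolean mask of retained weights.
    `sparsity` is the fraction of entries to remove (0.0 to 1.0).
    """
    k = int(weights.size * sparsity)  # number of entries to zero out
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    flat = np.abs(weights).ravel()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))           # stand-in for an LSTM weight matrix
W_sparse, mask = magnitude_prune(W, 0.5)
print(f"fraction of weights kept: {mask.mean():.2f}")
```

In a sparse-training loop, the mask would typically be reapplied after each gradient update so that pruned weights stay at zero while the remaining weights continue to train.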