GMU: A Novel RNN Neuron and Its Application to Handwriting Recognition

Li Sun, Tonghua Su, Shengjie Zhou, Lijun Yu
DOI: 10.1109/ICDAR.2017.176
Published in: 2017 14th IAPR International Conference on Document Analysis and Recognition (ICDAR), November 2017
Citations: 6

Abstract

Recurrent neural networks (RNNs) have been widely used in many sequence labeling fields. Decades of research show that the artificial neuron, as the basic building block, plays a great role in their success. Different RNN neurons have been proposed, such as the long short-term memory (LSTM) and the gated recurrent unit (GRU), and are used in most applications, including character recognition, to encode long-term contextual dependencies. Inspired by both LSTM and GRU, a new structure named the gated memory unit (GMU) is presented that carries forward their merits. GMU preserves the constant error carousel (CEC), which promotes a smooth information flow, and borrows both the cell structure of LSTM and the interpolation gates of GRU. The proposed neuron is evaluated on both online English handwriting recognition and online Chinese handwriting recognition tasks in terms of parameter volume, convergence, and accuracy. The results show that GMU is a promising choice for handwriting recognition tasks.
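To make the abstract's description concrete, the sketch below shows one way a GMU-style cell could combine an LSTM-like internal cell state (the constant error carousel, updated additively) with a GRU-like interpolation gate that blends the previous cell state and a candidate. The paper's exact equations are not given here, so all gate names, weight shapes, and update rules are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GMUCell:
    """Illustrative sketch of a gated-memory-unit-style recurrent cell.

    Assumed structure (not the paper's exact equations): an internal cell
    state c acts as the constant error carousel; a GRU-style interpolation
    gate z blends the old cell state with a candidate; an LSTM-style
    output gate o controls what the cell exposes as the hidden state h.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        k = input_size + hidden_size
        # One weight matrix and bias per gate / candidate.
        self.Wz = rng.normal(0.0, 0.1, (hidden_size, k)); self.bz = np.zeros(hidden_size)
        self.Wo = rng.normal(0.0, 0.1, (hidden_size, k)); self.bo = np.zeros(hidden_size)
        self.Wc = rng.normal(0.0, 0.1, (hidden_size, k)); self.bc = np.zeros(hidden_size)
        self.hidden_size = hidden_size

    def step(self, x, h_prev, c_prev):
        xh = np.concatenate([x, h_prev])
        z = sigmoid(self.Wz @ xh + self.bz)        # interpolation gate (GRU-style)
        o = sigmoid(self.Wo @ xh + self.bo)        # output gate (LSTM-style)
        c_tilde = np.tanh(self.Wc @ xh + self.bc)  # candidate cell content
        c = z * c_prev + (1.0 - z) * c_tilde       # CEC: gated, additive update
        h = o * np.tanh(c)                         # exposed hidden state
        return h, c
```

Under this assumed layout the cell carries three weight matrices, like a GRU, rather than an LSTM's four, which is consistent with the abstract's interest in parameter volume; the actual count in the paper may differ.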