Applying BinaryWeights in Neural Networks for Sequential Text Recognition

Zihe Wang, Chun Yang, Xu-Cheng Yin
Published in: 2017 4th IAPR Asian Conference on Pattern Recognition (ACPR), November 2017. DOI: 10.1109/ACPR.2017.118
Citations: 0

Abstract

With the development of deep learning, researchers have achieved breakthroughs in many classical problems. Unfortunately, this progress places heavy demands on hardware, especially GPUs, and causes large energy consumption. Therefore, how to implement these neural networks with lower hardware requirements is attracting more and more attention. In this paper, we contribute two pieces of work. First, we build a deep learning framework that supports training and prediction with binarized neural networks. This framework includes binarized layers, e.g. Convolution (Conv) and LSTM layers, and is based on our analysis of how to implement a binarized layer and train BinaryWeights with it. Second, we construct a network from the binarized layers implemented in our framework to achieve good performance on sequential text recognition. We also modify the network architecture so that the experimental results show little loss in accuracy.
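The abstract does not spell out how the binarized layers are trained. A common scheme for this (BinaryConnect-style, not necessarily the authors' exact method) keeps latent full-precision weights, binarizes them to ±1 for the forward pass, and applies gradients straight through to the latent weights. The names `binarize`, `forward`, and `sgd_step` below are illustrative, not an API from the paper:

```python
import numpy as np

def binarize(w):
    # Deterministic sign binarization: every weight becomes +1 or -1.
    return np.where(w >= 0, 1.0, -1.0)

rng = np.random.default_rng(0)
w_real = rng.normal(scale=0.1, size=(4, 3))  # latent full-precision weights

def forward(x):
    # Inference uses only the binarized copy of the weights.
    return x @ binarize(w_real)

def sgd_step(x, grad_out, lr=0.01):
    # Straight-through estimator: the gradient computed against the
    # binary weights is applied directly to the latent real weights,
    # which are clipped to [-1, 1] as in BinaryConnect.
    global w_real
    grad_w = x.T @ grad_out
    w_real = np.clip(w_real - lr * grad_w, -1.0, 1.0)

x = rng.normal(size=(2, 4))
out = forward(x)
sgd_step(x, np.ones_like(out))
```

At deployment only the ±1 weights are needed, which is what makes such layers cheap on low-power hardware.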