Analysis of high order recurrent neural networks for analog decoding

M. Mostafa, W. Teich, J. Lindner
DOI: 10.1109/ISTC.2012.6325210
Published in: 2012 7th International Symposium on Turbo Codes and Iterative Information Processing (ISTC)
Publication date: 2012-10-08
Citations: 5

Abstract

Forward error correction (FEC) coding is a classical and well-known technique for improving the efficiency of digital transmission. Despite intensive research in this field, the Shannon limit remained unachievable for a long time; today, however, iterative techniques can approach it. Iterative decoding is nevertheless computationally very demanding, especially for real-time applications and/or high data rates. This encouraged researchers to look for alternatives, which led to the new field of analog decoding, i.e., an implementation with analog circuits. The performance gain of such analog decoders over a digital implementation is believed to be at least a factor of 100 in terms of speed or power consumption. In this paper we focus on iterative threshold decoding. We show that this method can be viewed as a dynamical system, which can be described by high-order recurrent neural networks. Using this representation, we give a qualitative description of the long-term behavior of such a dynamical system. Continuous-time high-order recurrent neural networks can be understood as the basis for an analog implementation of iterative threshold decoding.
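The long-term behavior mentioned in the abstract can be illustrated by simulating a continuous-time high-order (here, second-order) recurrent neural network with forward-Euler integration. The weight tensors `W1` and `W2`, the external input `u`, and the `tanh` activation below are illustrative assumptions for a small random network, not the paper's actual decoder dynamics; the sketch only demonstrates the qualitative property of interest, namely the state relaxing toward a stable equilibrium.

```python
import numpy as np

def hornn_step(x, W1, W2, u, dt=0.01):
    """One forward-Euler step of a second-order continuous-time RNN:
    dx/dt = -x + tanh(W1 x + sum_{j,k} W2[:,j,k] x_j x_k + u).
    The second-order term couples products of state pairs, which is
    what makes the network 'high order'."""
    second_order = np.einsum('ijk,j,k->i', W2, x, x)
    dx = -x + np.tanh(W1 @ x + second_order + u)
    return x + dt * dx

rng = np.random.default_rng(0)
n = 4
# Small random weights keep the dynamics contractive, so a stable
# fixed point exists (chosen for illustration only).
W1 = 0.3 * rng.standard_normal((n, n))
W2 = 0.1 * rng.standard_normal((n, n, n))
u = 0.5 * rng.standard_normal(n)

x = rng.standard_normal(n)
for _ in range(5000):  # integrate up to t = 50
    x = hornn_step(x, W1, W2, u)

# At a fixed point the right-hand side of the ODE vanishes:
residual = np.linalg.norm(
    -x + np.tanh(W1 @ x + np.einsum('ijk,j,k->i', W2, x, x) + u)
)
print(residual)
```

In an analog implementation, this relaxation to an equilibrium would be performed by the circuit's own continuous-time dynamics rather than by discrete Euler steps; the simulation only serves to visualize that qualitative behavior.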