Machine Learning-Aided Efficient Decoding of Reed–Muller Subcodes

Mohammad Vahid Jamali;Xiyang Liu;Ashok Vardhan Makkuva;Hessam Mahdavifar;Sewoong Oh;Pramod Viswanath
{"title":"Reed-Muller子码的机器学习辅助高效解码","authors":"Mohammad Vahid Jamali;Xiyang Liu;Ashok Vardhan Makkuva;Hessam Mahdavifar;Sewoong Oh;Pramod Viswanath","doi":"10.1109/JSAIT.2023.3298362","DOIUrl":null,"url":null,"abstract":"Reed-Muller (RM) codes achieve the capacity of general binary-input memoryless symmetric channels and are conjectured to have a comparable performance to that of random codes in terms of scaling laws. However, such results are established assuming maximum-likelihood decoders for general code parameters. Also, RM codes only admit limited sets of rates. Efficient decoders such as successive cancellation list (SCL) decoder and recently-introduced recursive projection-aggregation (RPA) decoders are available for RM codes at finite lengths. In this paper, we focus on subcodes of RM codes with flexible rates. We first extend the RPA decoding algorithm to RM subcodes. To lower the complexity of our decoding algorithm, referred to as subRPA, we investigate different approaches to prune the projections. Next, we derive the soft-decision based version of our algorithm, called soft-subRPA, that not only improves upon the performance of subRPA but also enables a differentiable decoding algorithm. Building upon the soft-subRPA algorithm, we then provide a framework for training a machine learning (ML) model to search for \n<italic>good</i>\n sets of projections that minimize the decoding error rate. Training our ML model enables achieving very close to the performance of full-projection decoding with a significantly smaller number of projections. We also show that the choice of the projections in decoding RM subcodes matters significantly, and our ML-aided projection pruning scheme is able to find a \n<italic>good</i>\n selection, i.e., with negligible performance degradation compared to the full-projection case, given a reasonable number of projections.","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"4 ","pages":"260-275"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Machine Learning-Aided Efficient Decoding of Reed–Muller Subcodes\",\"authors\":\"Mohammad Vahid Jamali;Xiyang Liu;Ashok Vardhan Makkuva;Hessam Mahdavifar;Sewoong Oh;Pramod Viswanath\",\"doi\":\"10.1109/JSAIT.2023.3298362\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Reed-Muller (RM) codes achieve the capacity of general binary-input memoryless symmetric channels and are conjectured to have a comparable performance to that of random codes in terms of scaling laws. However, such results are established assuming maximum-likelihood decoders for general code parameters. Also, RM codes only admit limited sets of rates. Efficient decoders such as successive cancellation list (SCL) decoder and recently-introduced recursive projection-aggregation (RPA) decoders are available for RM codes at finite lengths. In this paper, we focus on subcodes of RM codes with flexible rates. We first extend the RPA decoding algorithm to RM subcodes. To lower the complexity of our decoding algorithm, referred to as subRPA, we investigate different approaches to prune the projections. Next, we derive the soft-decision based version of our algorithm, called soft-subRPA, that not only improves upon the performance of subRPA but also enables a differentiable decoding algorithm. 
Building upon the soft-subRPA algorithm, we then provide a framework for training a machine learning (ML) model to search for \\n<italic>good</i>\\n sets of projections that minimize the decoding error rate. Training our ML model enables achieving very close to the performance of full-projection decoding with a significantly smaller number of projections. We also show that the choice of the projections in decoding RM subcodes matters significantly, and our ML-aided projection pruning scheme is able to find a \\n<italic>good</i>\\n selection, i.e., with negligible performance degradation compared to the full-projection case, given a reasonable number of projections.\",\"PeriodicalId\":73295,\"journal\":{\"name\":\"IEEE journal on selected areas in information theory\",\"volume\":\"4 \",\"pages\":\"260-275\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-07-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE journal on selected areas in information theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10193768/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE journal on selected areas in information theory","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10193768/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Reed-Muller (RM) codes achieve the capacity of general binary-input memoryless symmetric channels and are conjectured to have a comparable performance to that of random codes in terms of scaling laws. However, such results are established assuming maximum-likelihood decoders for general code parameters. Also, RM codes only admit limited sets of rates. Efficient decoders such as successive cancellation list (SCL) decoder and recently-introduced recursive projection-aggregation (RPA) decoders are available for RM codes at finite lengths. In this paper, we focus on subcodes of RM codes with flexible rates. We first extend the RPA decoding algorithm to RM subcodes. To lower the complexity of our decoding algorithm, referred to as subRPA, we investigate different approaches to prune the projections. Next, we derive the soft-decision based version of our algorithm, called soft-subRPA, that not only improves upon the performance of subRPA but also enables a differentiable decoding algorithm. Building upon the soft-subRPA algorithm, we then provide a framework for training a machine learning (ML) model to search for good sets of projections that minimize the decoding error rate. Training our ML model enables achieving very close to the performance of full-projection decoding with a significantly smaller number of projections. We also show that the choice of the projections in decoding RM subcodes matters significantly, and our ML-aided projection pruning scheme is able to find a good selection, i.e., with negligible performance degradation compared to the full-projection case, given a reasonable number of projections.
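The abstract itself gives no algorithmic details, so the following is only a rough illustration of the projection-aggregation idea that subRPA builds on: a minimal hard-decision sketch for a full second-order RM(m, 2) code (not an RM subcode), where each projection onto a one-dimensional subspace yields a noisy first-order RM(m-1, 1) word that is decoded by a fast Hadamard transform, and the per-position estimates are combined by majority voting. This is not the authors' subRPA implementation; all names here (hadamard_transform, decode_rm1, project, rpa_rm2) and the iteration/voting details are hypothetical. The `projections` argument marks where a pruned or ML-selected subset of projection directions, as studied in the paper, would plug in.

```python
# Minimal, illustrative sketch of projection-aggregation decoding for RM(m, 2).
# Hard-decision version for clarity; not the paper's subRPA/soft-subRPA code.
import numpy as np

def hadamard_transform(x):
    """Fast Walsh-Hadamard transform of a length-2^k vector."""
    x = x.astype(float)
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x

def decode_rm1(y):
    """ML decoding of a first-order RM(k, 1) word from hard decisions y
    (length 2^k) via the fast Hadamard transform; returns the codeword."""
    corr = hadamard_transform(1.0 - 2.0 * y)   # correlations with all affine forms
    j = int(np.argmax(np.abs(corr)))
    u0 = int(corr[j] < 0)                      # constant term of the best affine form
    v = np.arange(len(y))
    lin = np.array([bin(j & vi).count("1") & 1 for vi in v])
    return (lin ^ u0).astype(np.uint8)

def project(y, z):
    """Project y onto the cosets of the subspace {0, z}: one XOR per coset."""
    n = len(y)
    reps = [v for v in range(n) if v < (v ^ z)]   # one representative per coset
    return np.array([y[v] ^ y[v ^ z] for v in reps], dtype=np.uint8), reps

def rpa_rm2(y, m, projections, n_iter=3):
    """Simplified projection-aggregation loop for RM(m, 2) using only the
    given (possibly pruned) list of projection directions z."""
    y = np.array(y, dtype=np.uint8)
    n = 1 << m
    for _ in range(n_iter):
        votes = np.zeros(n)
        for z in projections:
            y_proj, reps = project(y, z)
            c_hat = decode_rm1(y_proj)            # projected code is RM(m-1, 1)
            coset_bit = {}
            for bit, v in zip(c_hat, reps):
                coset_bit[v] = bit
                coset_bit[v ^ z] = bit
            for v in range(n):
                est = y[v ^ z] ^ coset_bit[v]     # estimate of bit v from this projection
                votes[v] += 1 if est else -1
        y_new = (votes > 0).astype(np.uint8)
        if np.array_equal(y_new, y):
            break
        y = y_new
    return y

# Example (hypothetical parameters): decode a noisy RM(5, 2) word with all 31
# projection directions, or with a pruned subset of 8 directions.
# y_hat_full   = rpa_rm2(y, m=5, projections=list(range(1, 32)))
# y_hat_pruned = rpa_rm2(y, m=5, projections=list(range(1, 9)))
```

In a soft-decision setting such as the one soft-subRPA targets, the hard XOR in `project` would typically be replaced by an LLR-domain soft-XOR, e.g., 2*atanh(tanh(L1/2)*tanh(L2/2)), and the majority vote by a soft aggregation; something along these lines is presumably what makes the decoder differentiable and therefore usable inside the ML training framework that, per the abstract, searches for a small set of projections with negligible loss relative to full-projection decoding.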