{"title":"In-Memory Bit Error Rate Estimation Using Syndromes of LDPC Codes","authors":"Yotam Gershon;Yuval Cassuto","doi":"10.1109/JSAIT.2026.3686020","DOIUrl":null,"url":null,"abstract":"Modern AI systems entail steep energy costs due to massive-scale computations and data transfers; offloading parts of the computations to be performed in-memory holds great potential for reducing both. This paper studies a new architecture proposed for reliable in-memory computations. Its main component is a coding scheme that is designed for both in-memory error-rate estimation/detection and outside-of-memory error correction. Estimation and/or detection are used to decide when the error rate exceeds the tolerance of the computation, at which point error correction is invoked. The coding scheme is based on a nested bilayer LDPC construction, where in particular, the first layer comprises degree-1 variable nodes guaranteeing accurate bit-error rate (BER) estimation and detection. Towards that, we derive a closed-form maximum-likelihood BER estimator for irregular codes, and a gapped hypothesis testing framework for deciding when to decode given some prescribed error-rate tolerance. The performance analysis of the derived estimator includes a closed-form mean-squared-error expression with explicit dependence on the check-degree distribution. For the hypothesis testing the analysis shows the dependence of detection performance on the same degree distribution. 
Both results reveal an advantage of check-regular codes that minimize dominant error terms among codes with a given average check degree.","PeriodicalId":73295,"journal":{"name":"IEEE journal on selected areas in information theory","volume":"7 ","pages":"161-174"},"PeriodicalIF":2.2000,"publicationDate":"2026-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE journal on selected areas in information theory","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11489022/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2026/4/21 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Modern AI systems entail steep energy costs due to massive-scale computations and data transfers; offloading parts of the computations to be performed in-memory holds great potential for reducing both. This paper studies a new architecture proposed for reliable in-memory computations. Its main component is a coding scheme designed for both in-memory error-rate estimation/detection and outside-of-memory error correction. Estimation and/or detection are used to decide when the error rate exceeds the tolerance of the computation, at which point error correction is invoked. The coding scheme is based on a nested bilayer LDPC construction in which, in particular, the first layer comprises degree-1 variable nodes that guarantee accurate bit-error-rate (BER) estimation and detection. Toward that end, we derive a closed-form maximum-likelihood BER estimator for irregular codes, and a gapped hypothesis-testing framework for deciding when to decode given a prescribed error-rate tolerance. The performance analysis of the derived estimator includes a closed-form mean-squared-error expression with explicit dependence on the check-degree distribution. For the hypothesis testing, the analysis shows the dependence of detection performance on the same degree distribution. Both results reveal an advantage of check-regular codes, which minimize the dominant error terms among codes with a given average check degree.
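To make the syndrome-based estimation idea concrete, the check-regular special case admits a simple closed form: if every parity check involves d bits and errors are i.i.d. BSC(p), each syndrome bit equals 1 with probability (1 - (1 - 2p)^d)/2, so inverting the empirical syndrome rate yields a plug-in ML estimate of p. The sketch below illustrates only this textbook check-regular case, not the paper's irregular-code estimator or its gapped test; the random parity-check construction and the `TOLERANCE` threshold are our own toy assumptions.

```python
import numpy as np

def estimate_ber(syndrome_weight: int, num_checks: int, check_degree: int) -> float:
    """BER estimate from syndrome weight for a check-regular code (sketch).

    Inverts s = (1 - (1 - 2p)^d) / 2, where s is the fraction of unsatisfied
    checks and d is the (uniform) check degree. Valid for p in [0, 0.5].
    """
    s = min(syndrome_weight / num_checks, 0.5)  # clamp: estimator saturates at p = 0.5
    return 0.5 * (1.0 - (1.0 - 2.0 * s) ** (1.0 / check_degree))

# Toy demonstration: random d-regular checks over a BSC error pattern.
rng = np.random.default_rng(0)
n, m, d = 2000, 1000, 6          # code length, number of checks, check degree (assumed)
H = np.zeros((m, n), dtype=np.uint8)
for i in range(m):               # place exactly d ones in each check row
    H[i, rng.choice(n, size=d, replace=False)] = 1

p_true = 0.02
e = (rng.random(n) < p_true).astype(np.uint8)   # i.i.d. BSC(p) error pattern
syndrome = (H @ e) % 2                          # unsatisfied-check indicators
p_hat = estimate_ber(int(syndrome.sum()), m, d)

TOLERANCE = 0.01                 # hypothetical error-rate tolerance of the computation
decode_needed = p_hat > TOLERANCE
```

Note the core trade-off the abstract's MSE analysis formalizes: higher check degrees d make the syndrome rate saturate toward 1/2 faster, flattening the map from p to s and amplifying estimation noise, which is one intuition for why check-regular codes (no unnecessarily high-degree checks at a given average degree) help.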