Structure of Optimal Quantizer for Binary-Input Continuous-Output Channels with Output Constraints

Thuan Nguyen, Thinh Nguyen
Published in: 2020 IEEE International Symposium on Information Theory (ISIT), June 2020
DOI: 10.1109/ISIT44484.2020.9174174
Citations: 6

Abstract

In this paper, we consider a channel whose input is a binary random source $X \in \{x_1, x_2\}$ with probability mass function (pmf) $p_X = [p_{x_1}, p_{x_2}]$, and whose output is a continuous random variable $Y \in \mathbb{R}$ resulting from continuous noise, characterized by the channel conditional densities $p_{y|x_1} = \phi_1(y)$ and $p_{y|x_2} = \phi_2(y)$. A quantizer $Q$ is used to map $Y$ back to a discrete set $Z \in \{z_1, z_2, \ldots, z_N\}$. To retain as much information about $X$ as possible, an optimal $Q$ is one that maximizes $I(X;Z)$. Our goal, however, is not only to recover $X$ but also to ensure that $p_Z = [p_{z_1}, p_{z_2}, \ldots, p_{z_N}]$ satisfies a certain constraint. In particular, we are interested in designing a quantizer that maximizes $\beta I(X;Z) - C(p_Z)$, where $\beta$ is a trade-off parameter and $C(p_Z)$ is an arbitrary cost function of $p_Z$. Letting the posterior probability be $p_{x_1 \mid y} = r_y = \frac{p_{x_1}\phi_1(y)}{p_{x_1}\phi_1(y) + p_{x_2}\phi_2(y)}$, our result shows that the optimal quantizer partitions $r_y$ into convex cells. In other words, the optimal quantizer has the form $Q^{\ast}(r_y) = z_i$ if $a_{i-1}^{\ast} \leq r_y < a_i^{\ast}$, for some optimal thresholds $a_0^{\ast} = 0 < a_1^{\ast} < a_2^{\ast} < \cdots < a_{N-1}^{\ast} < a_N^{\ast} = 1$. Based on this optimal structure, we describe fast algorithms for determining the optimal quantizers.
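To make the threshold structure concrete, the sketch below numerically searches for the single optimal threshold $a^{\ast}$ of a 2-level ($N = 2$) quantizer on the posterior $r_y$. It is an illustration only, not the paper's fast algorithms: the Gaussian densities (a BPSK-over-AWGN channel), the quadratic output-cost $C(p_Z) = \sum_i (p_{z_i} - 1/2)^2$, and the brute-force grid search are all assumed choices for the example.

```python
import numpy as np

def gauss(y, mu, sigma):
    """Gaussian density, used as the assumed channel conditionals phi_1, phi_2."""
    return np.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Illustrative binary source and channel: equiprobable BPSK inputs over AWGN
p_x = np.array([0.5, 0.5])
y = np.linspace(-8, 8, 4001)
dy = y[1] - y[0]
phi = np.stack([gauss(y, -1.0, 1.0), gauss(y, +1.0, 1.0)])  # phi_1(y), phi_2(y)

p_y = p_x @ phi                 # marginal density of Y on the grid
r_y = p_x[0] * phi[0] / p_y     # posterior r_y = P(x_1 | y)

def objective(a, beta=1.0):
    """beta * I(X;Z) - C(p_Z) for the 2-level quantizer with threshold a on r_y."""
    cells = [r_y < a, r_y >= a]  # two convex cells in the posterior r_y
    # Joint pmf p(x, z) by numerical integration over each cell
    p_xz = np.array([[np.sum(p_x[i] * phi[i][c]) * dy for c in cells]
                     for i in range(2)])
    p_z = p_xz.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_xz * np.log2(p_xz / (p_x[:, None] * p_z[None, :]))
    I_xz = np.nansum(terms)                 # mutual information I(X;Z), in bits
    C = np.sum((p_z - 0.5) ** 2)            # assumed cost: deviation from uniform p_Z
    return beta * I_xz - C

# Brute-force search for the optimal threshold a* in (0, 1)
grid = np.linspace(0.01, 0.99, 99)
a_star = grid[np.argmax([objective(a) for a in grid])]
```

By the symmetry of this assumed channel and cost, the search lands near $a^{\ast} = 0.5$; for $N > 2$ the same idea extends to searching over $N - 1$ ordered thresholds, which is exactly the structure the paper's result reduces the design problem to.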