{"title":"An information theoretic proof of the Chernoff-Hoeffding inequality","authors":"Olivier Rioul , Patrick Solé","doi":"10.1016/j.ipl.2025.106582","DOIUrl":null,"url":null,"abstract":"<div><div>The Chernoff bound is a well-known upper bound on the tail of binomial distributions of parameter 1/2 involving the binary entropy function. Hoeffding's inequality (or the Chernoff-Hoeffding inequality) is a generalization for binomial distributions of parameter <span><math><mn>1</mn><mo>−</mo><mn>1</mn><mo>/</mo><mi>q</mi></math></span>, involving the <em>q</em>-ary entropy function (with <span><math><mi>q</mi><mo>≥</mo><mn>2</mn></math></span>), which can be written in terms of the Kullback-Leibler divergence and is related to the bound in Fano's inequality. We give an information theoretic proof of that bound, and sketch some applications to channel and source coding. We also derive a refined bound which is always sharper.</div></div>","PeriodicalId":56290,"journal":{"name":"Information Processing Letters","volume":"190 ","pages":"Article 106582"},"PeriodicalIF":0.7000,"publicationDate":"2025-05-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Processing Letters","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0020019025000262","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
The Chernoff bound is a well-known upper bound on the tail of binomial distributions of parameter 1/2 involving the binary entropy function. Hoeffding's inequality (or the Chernoff-Hoeffding inequality) is a generalization for binomial distributions of parameter 1 − 1/q, involving the q-ary entropy function (with q ≥ 2), which can be written in terms of the Kullback-Leibler divergence and is related to the bound in Fano's inequality. We give an information theoretic proof of that bound, and sketch some applications to channel and source coding. We also derive a refined bound which is always sharper.
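As a concrete illustration (a minimal numerical sketch, not taken from the paper), the bound described above can be checked directly: for X ~ Binomial(n, p) with p = 1 − 1/q and k/n ≤ p, the Chernoff-Hoeffding bound reads P(X ≤ k) ≤ exp(−n·D(k/n ‖ p)), where D(a ‖ p) denotes the Kullback-Leibler divergence between Bernoulli(a) and Bernoulli(p). Since D(λ ‖ 1 − 1/q) = (1 − H_q(λ))·ln q, where H_q is the q-ary entropy function, the same bound can equivalently be written as q^(−n(1 − H_q(k/n))). The Python sketch below verifies both forms; the parameter choices q = 4, n = 100, k = 60 are arbitrary and for illustration only.

import math

def kl(a, p):
    # Kullback-Leibler divergence D(a || p) between Bernoulli(a)
    # and Bernoulli(p), in nats (assumes 0 < a < 1 and 0 < p < 1).
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def lower_tail(n, k, p):
    # Exact P(X <= k) for X ~ Binomial(n, p).
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k + 1))

def q_ary_entropy(x, q):
    # H_q(x) = x log_q(q-1) - x log_q(x) - (1-x) log_q(1-x).
    return (x * math.log(q - 1) - x * math.log(x)
            - (1 - x) * math.log(1 - x)) / math.log(q)

q = 4
p = 1 - 1 / q        # binomial parameter 1 - 1/q = 0.75
n, k = 100, 60       # k/n = 0.60 <= p, so the lower-tail bound applies

tail = lower_tail(n, k, p)
bound_kl = math.exp(-n * kl(k / n, p))                     # exp(-n D(k/n || p))
bound_entropy = q ** (-n * (1 - q_ary_entropy(k / n, q)))  # same value, entropy form

assert tail <= bound_kl
assert abs(bound_kl - bound_entropy) <= 1e-9 * bound_kl
print(f"exact tail     = {tail:.4e}")
print(f"Chernoff bound = {bound_kl:.4e}")

For these parameters the Chernoff-Hoeffding bound overshoots the exact tail by roughly an order of magnitude, which is the kind of slack a refined bound can reduce.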
Journal Introduction:
Information Processing Letters invites submission of original research articles that focus on fundamental aspects of information processing and computing. This naturally includes work in the broadly understood field of theoretical computer science, although papers in all areas of scientific inquiry will be given consideration, provided that they describe research contributions credibly motivated by applications to computing and involve rigorous methodology. High-quality experimental papers that address topics of sufficiently broad interest may also be considered.
Since its inception in 1971, Information Processing Letters has served as a forum for timely dissemination of short, concise and focused research contributions. Continuing with this tradition, and to expedite the reviewing process, manuscripts are generally limited in length to nine pages when they appear in print.