{"title":"Approximate discrete entropy monotonicity for log-concave sums","authors":"Lampros Gavalakis","doi":"10.1017/s0963548323000408","DOIUrl":null,"url":null,"abstract":"Abstract It is proven that a conjecture of Tao (2010) holds true for log-concave random variables on the integers: For every $n \\geq 1$ , if $X_1,\\ldots,X_n$ are i.i.d. integer-valued, log-concave random variables, then \\begin{equation*} H(X_1+\\cdots +X_{n+1}) \\geq H(X_1+\\cdots +X_{n}) + \\frac {1}{2}\\log {\\Bigl (\\frac {n+1}{n}\\Bigr )} - o(1) \\end{equation*} as $H(X_1) \\to \\infty$ , where $H(X_1)$ denotes the (discrete) Shannon entropy. The problem is reduced to the continuous setting by showing that if $U_1,\\ldots,U_n$ are independent continuous uniforms on $(0,1)$ , then \\begin{equation*} h(X_1+\\cdots +X_n + U_1+\\cdots +U_n) = H(X_1+\\cdots +X_n) + o(1), \\end{equation*} as $H(X_1) \\to \\infty$ , where $h$ stands for the differential entropy. Explicit bounds for the $o(1)$ -terms are provided.","PeriodicalId":10513,"journal":{"name":"Combinatorics, Probability & Computing","volume":null,"pages":null},"PeriodicalIF":0.9000,"publicationDate":"2023-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Combinatorics, Probability & Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1017/s0963548323000408","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0
Abstract
It is proven that a conjecture of Tao (2010) holds true for log-concave random variables on the integers: For every $n \geq 1$, if $X_1,\ldots,X_n$ are i.i.d. integer-valued, log-concave random variables, then \begin{equation*} H(X_1+\cdots+X_{n+1}) \geq H(X_1+\cdots+X_{n}) + \frac{1}{2}\log\Bigl(\frac{n+1}{n}\Bigr) - o(1) \end{equation*} as $H(X_1) \to \infty$, where $H(X_1)$ denotes the (discrete) Shannon entropy. The problem is reduced to the continuous setting by showing that if $U_1,\ldots,U_n$ are independent continuous uniforms on $(0,1)$, then \begin{equation*} h(X_1+\cdots+X_n + U_1+\cdots+U_n) = H(X_1+\cdots+X_n) + o(1), \end{equation*} as $H(X_1) \to \infty$, where $h$ stands for the differential entropy. Explicit bounds for the $o(1)$-terms are provided.
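The following is a minimal numerical sketch, not taken from the paper, illustrating the inequality in the abstract for one concrete log-concave family on the integers: i.i.d. geometric random variables. The parameter value, truncation length, and use of natural logarithms (entropies in nats) are illustrative assumptions; the sketch simply compares the entropy increment $H(S_{n+1}) - H(S_n)$ with the term $\frac{1}{2}\log\frac{n+1}{n}$ for sums $S_n = X_1+\cdots+X_n$ computed by repeated convolution of the pmf.

```python
# Numerical sketch (illustrative, not the paper's method): check the entropy
# monotonicity inequality for sums of i.i.d. geometric random variables,
# which are log-concave on the nonnegative integers.
import numpy as np

def geometric_pmf(p, support=5000):
    """Pmf of a geometric distribution on {0, 1, 2, ...}, truncated to `support` atoms."""
    k = np.arange(support)
    pmf = p * (1.0 - p) ** k
    return pmf / pmf.sum()  # renormalise after truncation

def shannon_entropy(pmf):
    """Discrete Shannon entropy in nats, ignoring zero-probability atoms."""
    pmf = pmf[pmf > 0]
    return -np.sum(pmf * np.log(pmf))

def entropy_of_sum(pmf, n):
    """Entropy of the sum of n i.i.d. copies, via repeated convolution of the pmf."""
    conv = pmf.copy()
    for _ in range(n - 1):
        conv = np.convolve(conv, pmf)
    return shannon_entropy(conv)

# Small p makes H(X_1) large, which is the asymptotic regime of the theorem.
p = 0.01
pmf = geometric_pmf(p)
print(f"H(X_1) = {shannon_entropy(pmf):.4f} nats")
for n in range(1, 6):
    gap = entropy_of_sum(pmf, n + 1) - entropy_of_sum(pmf, n)
    bound = 0.5 * np.log((n + 1) / n)
    print(f"n={n}: H(S_{n+1}) - H(S_n) = {gap:.4f}, (1/2)log((n+1)/n) = {bound:.4f}")
```

In this regime the printed increments should sit at or above the $\frac{1}{2}\log\frac{n+1}{n}$ values, consistent with the $-o(1)$ correction vanishing as $H(X_1)$ grows.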
Journal description:
Published bimonthly, Combinatorics, Probability & Computing is devoted to the three areas of combinatorics, probability theory and theoretical computer science. Topics covered include classical and algebraic graph theory, extremal set theory, matroid theory, probabilistic methods and random combinatorial structures; combinatorial probability and limit theorems for random combinatorial structures; the theory of algorithms (including complexity theory), randomised algorithms, probabilistic analysis of algorithms, computational learning theory and optimisation.