Jean-Rene Chazottes, Sandro Gallo, Daniel Takahashi
Journal: Annals of Applied Probability (Q2, Statistics & Probability; Impact Factor 1.4)
Published: 2023-10-01 (Journal Article)
DOI: 10.1214/22-aap1893 (https://doi.org/10.1214/22-aap1893)
Gaussian concentration bounds for stochastic chains of unbounded memory
Stochastic chains of unbounded memory (SCUMs) are generalizations of Markov chains, also known in the literature as “chains with complete connections” or “g-measures”. We obtain Gaussian concentration bounds (GCB) in this large class of models, for general alphabets, under two different conditions on the kernel: (1) when the sum of its oscillations is less than one, or (2) when the sum of its variations is finite, that is, belongs to ℓ1(N). We also obtain explicit constants as functions of the parameters of the model. Our conditions are sharp in the sense that we exhibit examples of SCUMs that do not have GCB and for which the sum of oscillations is 1+ϵ, or the variation belongs to ℓ1+ϵ(N), for any ϵ>0. These examples are based on the existence of phase transitions. We illustrate our results with four applications. First, we derive a Dvoretzky–Kiefer–Wolfowitz-type inequality, which gives uniform control of the fluctuations of the empirical measure. Second, in the finite-alphabet case, we obtain an upper bound on the d̄-distance between two stationary SCUMs and, as a by-product, new explicit bounds on the speed of Markovian approximation in d̄. Third, we derive new bounds on the fluctuations of the “plug-in” estimator for entropy. Fourth, we obtain new rates of convergence for the maximum likelihood estimator of conditional probabilities.
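To make the third application concrete, here is a minimal sketch of the finite-alphabet “plug-in” entropy estimator the abstract refers to: estimate the entropy rate as the difference of empirical k-block and (k−1)-block entropies. This is a standard construction, not code from the paper; the function name and interface are illustrative assumptions.

```python
from collections import Counter
import math

def plug_in_entropy_rate(seq, k):
    """Plug-in estimate of the entropy rate (in bits) of a stationary
    process from a single sample path `seq`, using the conditional-entropy
    form H_hat(k) - H_hat(k-1), where H_hat(m) is the Shannon entropy of
    the empirical distribution of m-blocks.  Illustrative sketch only."""
    def block_entropy(m):
        # Empirical frequencies of all overlapping m-blocks in the sample.
        counts = Counter(tuple(seq[i:i + m]) for i in range(len(seq) - m + 1))
        n = sum(counts.values())
        return -sum((c / n) * math.log2(c / n) for c in counts.values())
    return block_entropy(k) - block_entropy(k - 1)
```

For a long i.i.d. fair-coin sequence the estimate approaches 1 bit, while for the deterministic alternating sequence 0101… it is close to 0; the paper's contribution is to bound the fluctuations of such estimators around their limit under the oscillation/variation conditions on the kernel.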
Journal description:
The Annals of Applied Probability aims to publish research of the highest quality reflecting the varied facets of contemporary Applied Probability. Primary emphasis is placed on importance and originality.