{"title":"Improvements to Sanov and PAC Sublevel-set Bounds for Discrete Random Variables","authors":"M. A. Tope, Joel M. Morris","doi":"10.1109/CISS50987.2021.9400225","DOIUrl":null,"url":null,"abstract":"We derive an improvement for probably approximately correct (PAC) sublevel-set bounds for the multinomial distributed discrete random variables. Previous bounds (including Sanov's Theorem) show that the Kullback Leibler (KL) divergence between the empirical probability mass function (pmf) and the true PMF converges with rate O(log(N)/N), where $N$ is the number of independent and identically distributed (i.i.d.) samples used to compute the empirical pmf. We interpret the KL divergence as bounding the probability that a multinomial distributed random variable (RV) deviates into a halfspace and construct improved uniform PAC sublevel-set bounds that converge with rates $O$(log (log (N)) / N). These results bound the worst case performance for a number of machine learning algorithms. Finally, the ‘halfspace bound’ methodology suggests further improvements are possible for non-uniform bounds. In this paper, we derive an improvement (on the convergence rate) for various Probably Approximately Correct (PAC) bounds (including Sanov's Theorem) for multinomially distributed discrete random variables.","PeriodicalId":228112,"journal":{"name":"2021 55th Annual Conference on Information Sciences and Systems (CISS)","volume":"107 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 55th Annual Conference on Information Sciences and Systems (CISS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISS50987.2021.9400225","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
We derive an improvement for probably approximately correct (PAC) sublevel-set bounds for multinomially distributed discrete random variables. Previous bounds (including Sanov's Theorem) show that the Kullback-Leibler (KL) divergence between the empirical probability mass function (pmf) and the true pmf converges with rate $O(\log(N)/N)$, where $N$ is the number of independent and identically distributed (i.i.d.) samples used to compute the empirical pmf. We interpret the KL divergence as bounding the probability that a multinomially distributed random variable (RV) deviates into a halfspace, and we construct improved uniform PAC sublevel-set bounds that converge with rate $O(\log(\log(N))/N)$. These results bound the worst-case performance of a number of machine learning algorithms. Finally, the 'halfspace bound' methodology suggests that further improvements are possible for non-uniform bounds. In this paper, we derive an improvement (on the convergence rate) for various probably approximately correct (PAC) bounds (including Sanov's Theorem) for multinomially distributed discrete random variables.
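For context, here is a minimal sketch of the standard method-of-types argument behind the $O(\log(N)/N)$ rate cited above; this is background material rather than the paper's improved halfspace-based bound, and the symbols $k$, $p$, $\hat{p}_N$, and $\delta$ are notation assumed here for illustration.

\[
P\big(D(\hat{p}_N \,\|\, p) \ge \epsilon\big) \;\le\; (N+1)^{k}\, e^{-N\epsilon},
\qquad
\epsilon(N,\delta) \;=\; \frac{k \log(N+1) + \log(1/\delta)}{N} \;=\; O\!\left(\frac{\log N}{N}\right),
\]

where $k$ is the alphabet size, $p$ is the true pmf, and $\hat{p}_N$ is the empirical pmf computed from $N$ i.i.d. samples. Setting the right-hand side of the first inequality equal to a confidence level $\delta$ and solving for $\epsilon$ gives the classical PAC sublevel-set radius $\epsilon(N,\delta)$: with probability at least $1-\delta$, the empirical pmf lies in the KL sublevel set $\{q : D(q\,\|\,p) < \epsilon(N,\delta)\}$, which is the $O(\log(N)/N)$ convergence rate the paper improves upon.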