{"title":"光滑分布的KNN信息估计量分析","authors":"Puning Zhao, L. Lai","doi":"10.1109/ALLERTON.2018.8635874","DOIUrl":null,"url":null,"abstract":"KSG mutual information estimator, which is based on the distances of each sample to its k-th nearest neighbor, is widely used to estimate mutual information between two continuous random variables. Existing work has analyzed the convergence speed of this estimator for random variables with bounded support. In practice, however, KSG estimator also performs well for a much broader class of distributions, including not only those with bounded support but also those with long tail distributions. In this paper, we analyze the convergence speed of the error of KSG estimator for smooth distributions, whose support can be both bounded and unbounded. As KSG mutual information estimator can be viewed as an adaptive combination of KL entropy estimators, in our analysis, we also provide convergence analysis of KL entropy estimator for a broad class of distributions.","PeriodicalId":299280,"journal":{"name":"2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Analysis of KNN Information Estimators for Smooth Distributions\",\"authors\":\"Puning Zhao, L. Lai\",\"doi\":\"10.1109/ALLERTON.2018.8635874\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"KSG mutual information estimator, which is based on the distances of each sample to its k-th nearest neighbor, is widely used to estimate mutual information between two continuous random variables. Existing work has analyzed the convergence speed of this estimator for random variables with bounded support. 
In practice, however, KSG estimator also performs well for a much broader class of distributions, including not only those with bounded support but also those with long tail distributions. In this paper, we analyze the convergence speed of the error of KSG estimator for smooth distributions, whose support can be both bounded and unbounded. As KSG mutual information estimator can be viewed as an adaptive combination of KL entropy estimators, in our analysis, we also provide convergence analysis of KL entropy estimator for a broad class of distributions.\",\"PeriodicalId\":299280,\"journal\":{\"name\":\"2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)\",\"volume\":\"36 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ALLERTON.2018.8635874\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ALLERTON.2018.8635874","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Analysis of KNN Information Estimators for Smooth Distributions
The KSG mutual information estimator, which is based on the distance from each sample to its k-th nearest neighbor, is widely used to estimate the mutual information between two continuous random variables. Existing work has analyzed the convergence rate of this estimator for random variables with bounded support. In practice, however, the KSG estimator also performs well for a much broader class of distributions, including not only those with bounded support but also heavy-tailed ones. In this paper, we analyze the convergence rate of the error of the KSG estimator for smooth distributions, whose support can be either bounded or unbounded. Since the KSG mutual information estimator can be viewed as an adaptive combination of Kozachenko–Leonenko (KL) entropy estimators, our analysis also provides a convergence analysis of the KL entropy estimator for a broad class of distributions.
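For readers unfamiliar with the estimator under analysis, the following is a minimal sketch of the standard KSG construction (Kraskov et al.'s algorithm 1, not the authors' analysis): for each sample, take the max-norm distance to its k-th nearest neighbor in the joint space, count the marginal neighbors strictly within that distance, and combine the counts through digamma terms. Function and variable names here are illustrative, not from the paper.

```python
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def ksg_mi(x, y, k=3):
    """KSG mutual information estimate (nats) between samples x and y.

    I_hat = psi(k) + psi(n) - mean_i[ psi(nx_i + 1) + psi(ny_i + 1) ],
    where nx_i, ny_i count marginal neighbors strictly inside the
    max-norm distance eps_i to the k-th nearest neighbor in joint space.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])

    # Distance to the k-th nearest neighbor in the joint space (max-norm);
    # k + 1 because the query point itself is returned at distance 0.
    dist, _ = cKDTree(xy).query(xy, k=k + 1, p=np.inf)
    eps = dist[:, k]

    # Count points strictly within eps in each marginal space
    # (shrink the radius slightly to enforce the strict inequality,
    # then subtract 1 to exclude the point itself).
    nx = cKDTree(x).query_ball_point(
        x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(
        y, eps - 1e-12, p=np.inf, return_length=True) - 1

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

Because the radius eps_i adapts to the local sample density, the same formula applies whether the support is bounded or not, which is why the paper can study both regimes with one estimator.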