Analysis of KNN Information Estimators for Smooth Distributions

Puning Zhao, L. Lai
DOI: 10.1109/ALLERTON.2018.8635874 (https://doi.org/10.1109/ALLERTON.2018.8635874)
Venue: 2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
Published: 2018-10-01
Citations: 10

Abstract

The KSG mutual information estimator, which is based on the distance from each sample to its k-th nearest neighbor, is widely used to estimate mutual information between two continuous random variables. Existing work has analyzed the convergence rate of this estimator for random variables with bounded support. In practice, however, the KSG estimator also performs well for a much broader class of distributions, including not only those with bounded support but also those with long tails. In this paper, we analyze the convergence rate of the error of the KSG estimator for smooth distributions, whose support may be either bounded or unbounded. Since the KSG mutual information estimator can be viewed as an adaptive combination of KL (Kozachenko–Leonenko) entropy estimators, our analysis also provides a convergence analysis of the KL entropy estimator for a broad class of distributions.
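As a concrete illustration of the estimators the abstract refers to (not the authors' code), below is a minimal brute-force sketch of the KSG mutual information estimator (algorithm 1 of Kraskov, Stögbauer, and Grassberger, 2004) together with the Kozachenko–Leonenko entropy estimator it builds on. Function names and the O(N²) pairwise-distance computation are illustrative choices; practical implementations use k-d trees.

```python
import numpy as np
from scipy.special import digamma

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko entropy estimate (Chebyshev norm):
    H = psi(N) - psi(k) + d*log(2) + (d/N) * sum(log eps_i),
    where eps_i is the distance from sample i to its k-th nearest neighbor."""
    x = x.reshape(len(x), -1)
    n, d = x.shape
    dist = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    np.fill_diagonal(dist, np.inf)                 # exclude each point from its own neighbors
    eps = np.sort(dist, axis=1)[:, k - 1]          # k-th nearest-neighbor distance
    return digamma(n) - digamma(k) + d * np.log(2) + d * np.mean(np.log(eps))

def ksg_mi(x, y, k=3):
    """KSG estimate of I(X;Y): find the k-th neighbor distance eps_i in the
    joint space (max-norm), count marginal neighbors strictly within eps_i,
    then combine: psi(k) + psi(N) - <psi(nx+1) + psi(ny+1)>."""
    x = x.reshape(len(x), -1)
    y = y.reshape(len(y), -1)
    n = len(x)
    dx = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    dy = np.max(np.abs(y[:, None, :] - y[None, :, :]), axis=-1)
    dz = np.maximum(dx, dy)                        # joint-space Chebyshev distance
    np.fill_diagonal(dz, np.inf)
    eps = np.sort(dz, axis=1)[:, k - 1]
    nx = np.sum(dx < eps[:, None], axis=1) - 1     # subtract 1 for the point itself
    ny = np.sum(dy < eps[:, None], axis=1) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

The "adaptive combination" noted in the abstract is visible here: the joint-space radius eps_i adapts the effective bandwidth of the marginal entropy terms point by point, which is what lets the estimator cope with unbounded supports and long tails.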