Nonparametric Kullback-Leibler Divergence Estimation Using M-Spacing

Linyun He, Eunhye Song
{"title":"Nonparametric Kullback-Liebler Divergence Estimation Using M-Spacing","authors":"Linyun He, Eunhye Song","doi":"10.1109/WSC52266.2021.9715376","DOIUrl":null,"url":null,"abstract":"Entropy of a random variable with unknown distribution function can be estimated nonparametrically by spacing methods when independent and identically distributed (i.i.d.) observations of the random variable are available. We extend the classical entropy estimator based on sample spacing to define an m-spacing estimator for the Kullback-Liebler (KL) divergence between two i.i.d. observations with unknown distribution functions, which can be applied to measure discrepancy between real-world system output and simulation output as well as between two simulators' outputs. We show that the proposed estimator converges almost surely to the true KL divergence as the numbers of outputs collected from both systems increase under mild conditions and discuss the required choices for $m$ and the simulation output sample size as functions of the real-world sample size. Additionally, we show Central Limit Theorems for the proposed estimator with appropriate scaling.","PeriodicalId":369368,"journal":{"name":"2021 Winter Simulation Conference (WSC)","volume":"87 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 Winter Simulation Conference (WSC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WSC52266.2021.9715376","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Entropy of a random variable with an unknown distribution function can be estimated nonparametrically by spacing methods when independent and identically distributed (i.i.d.) observations of the random variable are available. We extend the classical entropy estimator based on sample spacings to define an m-spacing estimator for the Kullback-Leibler (KL) divergence between two unknown distribution functions, each observed only through i.i.d. samples. This estimator can be applied to measure the discrepancy between real-world system output and simulation output, as well as between the outputs of two simulators. We show that, under mild conditions, the proposed estimator converges almost surely to the true KL divergence as the numbers of outputs collected from both systems increase, and we discuss the required choices of $m$ and the simulation output sample size as functions of the real-world sample size. Additionally, we establish central limit theorems for the proposed estimator with appropriate scaling.
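
For readers unfamiliar with spacing methods, the "classical entropy estimator based on sample spacings" mentioned above is usually attributed to Vasicek (1976): it replaces the unknown density in the entropy integral with a local estimate built from the gap $X_{(i+m)} - X_{(i-m)}$ between order statistics. The sketch below illustrates that idea in Python and shows one plug-in way to combine an entropy term with a spacing-based cross-entropy term to estimate a KL divergence from two samples. The function names, boundary conventions, and the specific plug-in construction are illustrative assumptions for exposition only; they are not the estimator or the asymptotic analysis developed in the paper.

```python
import numpy as np

def m_spacing_entropy(x, m):
    """Vasicek-style m-spacing estimate of differential entropy from a
    1-D i.i.d. sample x.  Boundary order statistics are clamped to the
    sample minimum/maximum, as in Vasicek (1976)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    idx = np.arange(n)
    lo = x[np.clip(idx - m, 0, n - 1)]      # X_(i-m), clamped at X_(1)
    hi = x[np.clip(idx + m, 0, n - 1)]      # X_(i+m), clamped at X_(n)
    spacing = np.maximum(hi - lo, 1e-12)    # guard against ties
    return float(np.mean(np.log(n / (2.0 * m) * spacing)))

def m_spacing_kl(x, y, m):
    """Plug-in sketch of a spacing-based estimate of D(P || Q) from
    samples x ~ P and y ~ Q.  The cross-entropy term approximates
    log q(x_i) by the 2m-spacing of the y-sample around x_i; the
    entropy term reuses m_spacing_entropy.  Illustrative only -- not
    the estimator analyzed in the paper."""
    x = np.asarray(x, dtype=float)
    y = np.sort(np.asarray(y, dtype=float))
    n_y = len(y)
    # Rank of each x_i within the sorted y-sample (empirical CDF of Q).
    r = np.searchsorted(y, x)
    lo = y[np.clip(r - m, 0, n_y - 1)]
    hi = y[np.clip(r + m, 0, n_y - 1)]
    spacing = np.maximum(hi - lo, 1e-12)
    # Mean of log q_hat(x_i), with q_hat(x_i) ~ 2m / (n_y * spacing).
    cross = np.mean(np.log(2.0 * m / (n_y * spacing)))
    # D(P || Q) = -H(P) - E_P[log q(X)].
    return float(-m_spacing_entropy(x, m) - cross)

# Example: KL(N(0,1) || N(0.5,1)) = 0.5**2 / 2 = 0.125.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)
y = rng.normal(0.5, 1.0, 5000)
print(m_spacing_kl(x, y, m=20))   # should be roughly 0.125
```

The choice of m trades bias against variance: larger m smooths the spacing-based density estimate but blurs local detail, which is why the abstract discusses how m and the simulation sample size must grow with the real-world sample size.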