Bayesian Knowledge Base Distance-Based Tuning

Chase Yakaboski, E. Santos
DOI: 10.1109/WI.2018.0-106
Published in: 2018 IEEE/WIC/ACM International Conference on Web Intelligence (WI)
Publication date: 2018-12-01
Citation count: 3

Abstract

In order to rigorously characterize the difference or similarity between two probabilistic knowledge bases, a distance/divergence metric must be established. This becomes increasingly important when conducting parameter learning or tuning of a knowledge base. When tuning a knowledge base, it is essential to characterize the global change in probabilistic belief to ensure that the underlying probability distribution of the knowledge base is not drastically compromised. In this paper, we develop an entropy-based distance measure for Bayesian Knowledge Bases (BKBs), derived from the Chan-Darwiche distance measure used in a variety of probabilistic belief network analyses. Through this distance measure it is possible to calculate the theoretical minimum distance required to correct a BKB when the system's answer contradicts that of an expert. Having a theoretical lower bound on distance allows for a quick check of the integrity of either the BKB or the expert's judgement, since a high minimum distance suggests that the necessary tuning is not consistent with the rest of the knowledge base. Further, this distance measure can be used to prove that in some cases the standard single causal rule set BKB tuning procedure (discussed in our prior work) in fact minimizes the global BKB distance. The final portion of this paper presents practical examples that analyze the utility of the BKB distance measure, along with evidence supporting the proposition that the Causal Rule Set tuning procedure minimizes the distance between the original and tuned knowledge bases in many cases.
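The BKB-specific measure is developed in the paper itself; as background, the Chan-Darwiche distance it builds on can be sketched for two explicit discrete distributions over the same set of worlds. This is a minimal illustrative sketch (the function name and toy distributions are ours, not from the paper): D(Pr, Pr') = ln max_w Pr'(w)/Pr(w) − ln min_w Pr'(w)/Pr(w), with the convention 0/0 = 1, and an infinite distance when the two distributions disagree on which worlds have zero probability.

```python
import math

def chan_darwiche_distance(p, q):
    """Chan-Darwiche distance between two distributions over the same worlds.

    D(p, q) = ln max_w q(w)/p(w) - ln min_w q(w)/p(w),
    using the convention 0/0 = 1; infinite when the supports differ.
    """
    ratios = []
    for pw, qw in zip(p, q):
        if pw == 0 and qw == 0:
            ratios.append(1.0)      # convention: 0/0 = 1
        elif pw == 0:
            return math.inf         # q puts mass where p puts none
        else:
            ratios.append(qw / pw)
    if 0.0 in ratios:               # p puts mass where q puts none
        return math.inf
    return math.log(max(ratios)) - math.log(min(ratios))

# Identical distributions are at distance 0.
p = [0.5, 0.3, 0.2]
print(chan_darwiche_distance(p, p))   # 0.0

# A small perturbation: ratios are 1.2, 0.833..., 0.75,
# so D = ln(1.2) - ln(0.75) = ln(1.6) ≈ 0.4700.
q = [0.6, 0.25, 0.15]
print(chan_darwiche_distance(p, q))
```

In the tuning setting described above, a distance of this kind would be evaluated between the distribution induced by the original knowledge base and the one induced by the tuned knowledge base; a large minimum achievable value signals that the requested correction conflicts with the rest of the knowledge base.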