Information-theoretic and Set-theoretic Similarity

L. Cazzanti, M. Gupta
{"title":"Information-theoretic and Set-theoretic Similarity","authors":"L. Cazzanti, M. Gupta","doi":"10.1109/ISIT.2006.261752","DOIUrl":null,"url":null,"abstract":"We introduce a definition of similarity based on Tversky's set-theoretic linear contrast model and on information-theoretic principles. The similarity measures the residual entropy with respect to a random object. This residual entropy similarity strongly captures context, which we conjecture is important for similarity-based statistical learning. Properties of the similarity definition are established and examples illustrate its characteristics. We show that a previously-defined information-theoretic similarity is also set-theoretic, and compare it to the residual entropy similarity. The similarity between random objects is also treated","PeriodicalId":115298,"journal":{"name":"2006 IEEE International Symposium on Information Theory","volume":"31 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-07-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"25","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 IEEE International Symposium on Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT.2006.261752","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 25

Abstract

We introduce a definition of similarity based on Tversky's set-theoretic linear contrast model and on information-theoretic principles. The similarity measures the residual entropy with respect to a random object. This residual entropy similarity strongly captures context, which we conjecture is important for similarity-based statistical learning. Properties of the similarity definition are established and examples illustrate its characteristics. We show that a previously-defined information-theoretic similarity is also set-theoretic, and compare it to the residual entropy similarity. The similarity between random objects is also treated.
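The abstract builds on Tversky's linear contrast model, in which the similarity of two objects is a weighted difference between the salience of their common features and that of their distinctive features: s(a, b) = θ f(A ∩ B) − α f(A − B) − β f(B − A). The sketch below only illustrates that classic contrast model with a simple feature-count measure f; the paper's residual-entropy similarity replaces f with an entropy-based set function whose exact form is not given in the abstract, so it is not reproduced here. The function name and weights are illustrative, not from the paper.

```python
from typing import Hashable, Set


def tversky_contrast(A: Set[Hashable], B: Set[Hashable],
                     theta: float = 1.0,
                     alpha: float = 0.5,
                     beta: float = 0.5) -> float:
    """Tversky's linear contrast model with a feature-count measure f.

    s(a, b) = theta * f(A & B) - alpha * f(A - B) - beta * f(B - A),
    where A and B are the feature sets of the two objects and f here is
    simply the number of features. (The paper instead uses an
    information-theoretic, entropy-based measure in place of f.)
    """
    common = len(A & B)   # features shared by both objects
    a_only = len(A - B)   # features distinctive to the first object
    b_only = len(B - A)   # features distinctive to the second object
    return theta * common - alpha * a_only - beta * b_only


# Example: two objects described by categorical features.
x = {"red", "round", "small"}
y = {"red", "round", "large"}
print(tversky_contrast(x, y))  # 1.0*2 - 0.5*1 - 0.5*1 = 1.0
```

With α = β = 0 the model counts only shared features; increasing α and β penalizes distinctive features more heavily, which is how the contrast model lets context shift what counts as similar.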