Entropy Regularization for Topic Modelling

Nagesh Bhattu Sristy, D. Somayajulu
DOI: 10.1145/2662117.2662130
Published in: I-CARE 2014, 2014-10-09
Citations: 0

Abstract

Supervised latent Dirichlet-based topic models are variants of latent Dirichlet topic models with the additional capability to discriminate between samples. Light-supervision strategies have recently been adopted to express rich domain knowledge in the form of constraints. The Posterior Regularization framework learns models from this weaker form of supervision by expressing a set of constraints over the family of posteriors. Modelling arbitrary problem-specific dependencies is a non-trivial task, and it increases the complexity of an already difficult inference problem in the context of latent Dirichlet-based topic models. In the current work we propose a posterior regularization method for topic models that captures a wide variety of auxiliary supervision. This approach simplifies the computational challenges posed by the additional compound terms. We demonstrate the use of this framework to improve the utility of topic models in the presence of entropy constraints, and we experiment with real-world datasets to test the above techniques.
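To make the idea of an entropy constraint on the posterior family concrete, the sketch below shows one simple way such a penalty can act on a document-topic posterior: a temperature-style reweighting that lowers the entropy of the distribution while keeping it normalized. This is a minimal illustration of the general mechanism, not the authors' algorithm; the function names (`entropy`, `entropy_regularized_posterior`) and the temperature parameter `beta` are assumptions introduced here for exposition.

```python
import numpy as np

def entropy(q):
    """Shannon entropy of a discrete distribution q."""
    q = np.asarray(q, dtype=float)
    return float(-np.sum(q * np.log(q + 1e-12)))

def entropy_regularized_posterior(p, beta):
    """Reweight a document-topic posterior p by raising it to the
    power beta and renormalizing. For beta > 1 this sharpens the
    distribution (lower entropy), mimicking the effect of an
    entropy penalty on the posterior; beta < 1 flattens it."""
    p = np.asarray(p, dtype=float)
    q = p ** beta
    return q / q.sum()

# Toy document-topic posterior over 4 topics.
p = np.array([0.4, 0.3, 0.2, 0.1])
q = entropy_regularized_posterior(p, beta=2.0)
```

With `beta=2.0` the reweighted posterior `q` concentrates more mass on the dominant topic, so `entropy(q) < entropy(p)` while `q` still sums to one; a posterior-regularization scheme would choose the strength of such a penalty by optimizing a constrained objective rather than fixing it by hand.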