Comparison of Neural Network Models for LDA Inferring

Sarunyoo Srivichitranond, R. Saga
DOI: 10.1109/ICBIR54589.2022.9786386
Published 2022-05-19 in the 2022 7th International Conference on Business and Industrial Research (ICBIR).
Citations: 0

Abstract

One of the most reliable methods for finding the topics of a document is Latent Dirichlet Allocation (LDA), a generative statistical model, but as the amount of data grows, LDA inference can become time consuming. This problem can be addressed by training a neural network to learn from LDA, yielding a model with faster processing time. This study examines how accurately different neural network models can learn from LDA. The models compared in this work are a dense neural network (DNN), a recurrent neural network (RNN), long short-term memory (LSTM), a gated recurrent unit (GRU), bidirectional LSTM (BiLSTM), and bidirectional GRU (BiGRU). The experiments show that BiGRU and RNN are good alternatives to the DNN for learning from LDA: RNN achieves the best test accuracy on 15 topics at 0.8833, compared with Dense 3 at 0.8807, Dense 2 at 0.8798, and BiGRU at 0.8767, while BiGRU achieves the best test accuracy on 20 topics at 0.8727, compared with Dense 2 at 0.8704, RNN at 0.8664, and Dense 3 at 0.8642. With more than 35 topics, Dense 2 outperforms the other techniques, including Dense 3.
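The core idea the abstract describes — using slow-but-reliable LDA as a teacher and training a fast neural network to reproduce its topic assignments — can be sketched as follows. This is a minimal illustration, not the authors' pipeline: it uses scikit-learn's `MLPClassifier` as a stand-in for the paper's DNN/RNN variants, a toy corpus, and 3 topics rather than the 15–35+ topics studied in the paper.

```python
# Hedged sketch of LDA-as-teacher, neural-network-as-student.
# Assumptions (not from the paper): scikit-learn models, toy corpus, 3 topics.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.neural_network import MLPClassifier

docs = [
    "stock market trading shares profit",
    "market shares investors trading stock",
    "neural network training deep model",
    "deep learning model neural network",
    "football match goal team players",
    "team players match football season",
] * 20  # repeat to give both models enough samples

X = CountVectorizer().fit_transform(docs)

# 1) Teacher: LDA infers a topic distribution per document (the slow step).
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topic = lda.fit_transform(X)
teacher_labels = doc_topic.argmax(axis=1)  # dominant topic per document

# 2) Student: a small dense network trained on LDA's topic labels.
student = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
student.fit(X, teacher_labels)

# At inference time, one cheap forward pass replaces LDA's iterative inference.
agreement = (student.predict(X) == teacher_labels).mean()
print(f"student/LDA agreement: {agreement:.2f}")
```

The paper's comparison then amounts to swapping the student architecture (DNN, RNN, LSTM, GRU, BiLSTM, BiGRU) and measuring test accuracy against the LDA-assigned topics as the topic count varies.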