Markov Recurrent Neural Network Language Model

Jen-Tzung Chien, Che-Yu Kuo
DOI: 10.1109/ASRU46091.2019.9003850
Published in: 2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU), December 2019
Citations: 7

Abstract

Recurrent neural networks (RNNs) have achieved great success in language modeling, where temporal information based on a deterministic state is continuously extracted and evolved through time. Such a simple deterministic transition function, using input-to-hidden and hidden-to-hidden weights, is usually insufficient to reflect the diversity and variation of the latent variable structure behind heterogeneous natural language. This paper presents a new stochastic Markov RNN (MRNN) to strengthen the learning capability of the language model, where the trajectory of word sequences is driven by a neural Markov process with Markov state transitions based on a K-state long short-term memory model. A latent state machine is constructed to characterize the complicated semantics in structured lexical patterns. Gumbel-softmax is introduced to implement the stochastic backpropagation algorithm with discrete states. Parallel computation for rapid realization of the MRNN is presented, and a variational Bayesian learning procedure is implemented. Experiments demonstrate the merits of the stochastic and diverse representation of the MRNN language model, where the overhead in parameters and computation is limited.
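The Gumbel-softmax trick mentioned in the abstract makes sampling over K discrete latent states differentiable, which is what allows backpropagation through the stochastic state transitions. The following is a minimal NumPy sketch of that sampling step only, not the authors' implementation; the choice of K=4 states and the temperature value are illustrative assumptions:

```python
import numpy as np

def gumbel_softmax(logits, temperature=1.0, rng=None):
    """Draw a differentiable, approximately one-hot sample over K discrete states.

    At low temperature the output concentrates near a one-hot vector,
    approximating a categorical draw while remaining differentiable in logits.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    u = rng.uniform(1e-10, 1.0, size=logits.shape)
    gumbel = -np.log(-np.log(u))
    # Perturb logits with Gumbel noise, then apply a tempered softmax
    y = (logits + gumbel) / temperature
    y = y - y.max()  # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

# Example: transition logits over K=4 hypothetical latent Markov states
logits = np.array([2.0, 0.5, -1.0, 0.1])
sample = gumbel_softmax(logits, temperature=0.5, rng=np.random.default_rng(0))
```

Each call yields a relaxed state-selection vector that sums to one; in the paper's setting such a vector would weight the K LSTM transition branches during training, with the temperature annealed toward a hard categorical choice.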