Energy-Based Models with Applications to Speech and Language Processing

Zhijian Ou
{"title":"Energy-Based Models with Applications to Speech and Language Processing","authors":"Zhijian Ou","doi":"10.1561/2000000117","DOIUrl":null,"url":null,"abstract":"Energy-Based Models (EBMs) are an important class of probabilistic models, also known as random fields and undirected graphical models. EBMs are un-normalized and thus radically different from other popular self-normalized probabilistic models such as hidden Markov models (HMMs), autoregressive models, generative adversarial nets (GANs) and variational auto-encoders (VAEs). Over the past years, EBMs have attracted increasing interest not only from the core machine learning community, but also from application domains such as speech, vision, natural language processing (NLP) and so on, due to significant theoretical and algorithmic progress. The sequential nature of speech and language also presents special challenges and needs a different treatment from processing fix-dimensional data (e.g., images). Therefore, the purpose of this monograph is to present a systematic introduction to energy-based models, including both algorithmic progress and applications in speech and language processing. First, the basics of EBMs are introduced, including classic models, recent models parameterized by neural networks, sampling methods, and various learning methods from the classic learning algorithms to the most advanced ones. Then, the application of EBMs in three different scenarios is presented, i.e., for modeling marginal, conditional and joint distributions, respectively. 1) EBMs for sequential data with applications in language modeling, where the main focus is on the marginal distribution of a sequence itself; 2) EBMs for modeling conditional distributions of target sequences given observation sequences, with applications in speech recognition, sequence labeling and text generation; 3) EBMs for modeling joint distributions of both sequences of observations and targets, and their applications in semi-supervised learning and calibrated natural language understanding.","PeriodicalId":517538,"journal":{"name":"Foundations and Trends® in Signal Processing","volume":"18 5","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations and Trends® in Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1561/2000000117","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Energy-Based Models (EBMs) are an important class of probabilistic models, also known as random fields and undirected graphical models. EBMs are un-normalized and thus radically different from other popular self-normalized probabilistic models such as hidden Markov models (HMMs), autoregressive models, generative adversarial nets (GANs), and variational auto-encoders (VAEs). Over the past years, EBMs have attracted increasing interest not only from the core machine learning community but also from application domains such as speech, vision, and natural language processing (NLP), due to significant theoretical and algorithmic progress. The sequential nature of speech and language also presents special challenges and requires a treatment different from that of fixed-dimensional data (e.g., images). The purpose of this monograph is therefore to present a systematic introduction to energy-based models, covering both algorithmic progress and applications in speech and language processing. First, the basics of EBMs are introduced, including classic models, recent models parameterized by neural networks, sampling methods, and various learning methods, from classic learning algorithms to the most advanced ones. Then, the application of EBMs in three different scenarios is presented, i.e., for modeling marginal, conditional, and joint distributions, respectively. 1) EBMs for sequential data, with applications in language modeling, where the main focus is on the marginal distribution of a sequence itself; 2) EBMs for modeling conditional distributions of target sequences given observation sequences, with applications in speech recognition, sequence labeling, and text generation; 3) EBMs for modeling joint distributions of both observation sequences and target sequences, with applications in semi-supervised learning and calibrated natural language understanding.
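To make the un-normalized/self-normalized distinction concrete, the following is a minimal sketch of the standard EBM form and of the three modeling scenarios listed above, written in conventional notation rather than the monograph's own (the symbols E_\theta for the energy function and Z(\theta) for the normalizing constant are assumptions here):

% An EBM defines a density only up to a normalizing constant:
\[
  p_\theta(x) \;=\; \frac{\exp\!\big(-E_\theta(x)\big)}{Z(\theta)},
  \qquad
  Z(\theta) \;=\; \int \exp\!\big(-E_\theta(x)\big)\,dx ,
\]
% where Z(\theta) is generally intractable. By contrast, a self-normalized
% model such as an autoregressive model factorizes as
% \prod_t p_\theta(x_t \mid x_{<t}) and sums to one by construction.
% The three scenarios surveyed in the monograph correspond to applying
% this form to different distributions over an observation sequence x
% and a target sequence y:
\[
  \text{marginal: } p_\theta(x), \qquad
  \text{conditional: } p_\theta(y \mid x), \qquad
  \text{joint: } p_\theta(x, y).
\]

The intractability of Z(\theta) is what motivates the sampling methods and the learning algorithms beyond plain maximum likelihood that the monograph surveys.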