On the representation of sparse stochastic matrices with state embedding

Impact Factor 3.9 · CAS Tier 3 (Computer Science) · JCR Q2, Computer Science, Artificial Intelligence
Jugurta Montalvão, Gabriel Bastos, Rodrigo Sousa, Ataíde Gualberto
{"title":"On the representation of sparse stochastic matrices with state embedding","authors":"Jugurta Montalvão,&nbsp;Gabriel Bastos,&nbsp;Rodrigo Sousa,&nbsp;Ataíde Gualberto","doi":"10.1016/j.patrec.2025.04.011","DOIUrl":null,"url":null,"abstract":"<div><div>Embeddings are adjusted to allow points representing states and observations in Markov models, where conditional probabilities are approximately encoded as the exponential of (negative) distances, jointly scaled by a density factor. It is shown that the goodness of this approximation can be managed, mainly if the embedding dimension is chosen in function of entropies associated to the corresponding Markov model. Therefore, for sparse (low entropy) models, their representation as state embeddings can save memory and allow fully geometric versions of probabilistic algorithms, as the Viterbi, taken as an example in this work. Besides, evidences are also gathered in favor of potentially useful properties that emerge from the geometric representation of Markov models, such as analogies, superstates (aggregation) and semantic fields.</div></div>","PeriodicalId":54638,"journal":{"name":"Pattern Recognition Letters","volume":"193 ","pages":"Pages 71-78"},"PeriodicalIF":3.9000,"publicationDate":"2025-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Pattern Recognition Letters","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167865525001448","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Embeddings are adjusted so that points represent states and observations in Markov models, with conditional probabilities approximately encoded as the exponential of (negative) distances, jointly scaled by a density factor. It is shown that the quality of this approximation can be controlled, mainly by choosing the embedding dimension as a function of the entropies associated with the corresponding Markov model. Therefore, for sparse (low-entropy) models, their representation as state embeddings can save memory and allows fully geometric versions of probabilistic algorithms, such as the Viterbi algorithm, taken as an example in this work. In addition, evidence is also gathered in favor of potentially useful properties that emerge from the geometric representation of Markov models, such as analogies, superstates (aggregation) and semantic fields.
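As an illustration of the geometric reading described in the abstract, the sketch below assumes state and observation embeddings have already been fitted so that log p(j|i) ≈ log(rho) − d(x_i, x_j), and similarly for emissions; under that encoding the Viterbi recursion reduces to accumulating and minimizing distances. This is a minimal sketch under those assumptions, not the authors' implementation, and all names, shapes, and values are illustrative.

```python
import numpy as np

def geometric_viterbi(state_emb, obs_emb, observations, start_costs):
    """Viterbi where accumulated embedding distances play the role of
    negative log-probabilities (the constant log-density terms drop out
    of the argmax, so they are omitted)."""
    n_states = state_emb.shape[0]

    # State-to-state distances stand in for -log p(j | i), up to the
    # constant log(rho) scaling mentioned in the abstract.
    trans_cost = np.linalg.norm(
        state_emb[:, None, :] - state_emb[None, :, :], axis=-1)

    T = len(observations)
    cost = np.full((T, n_states), np.inf)
    back = np.zeros((T, n_states), dtype=int)

    # Observation-to-state distances stand in for -log p(o | j).
    emit0 = np.linalg.norm(obs_emb[observations[0]] - state_emb, axis=-1)
    cost[0] = start_costs + emit0

    for t in range(1, T):
        emit = np.linalg.norm(obs_emb[observations[t]] - state_emb, axis=-1)
        total = cost[t - 1][:, None] + trans_cost + emit[None, :]
        back[t] = np.argmin(total, axis=0)
        cost[t] = np.min(total, axis=0)

    # Backtrack the minimum-distance (i.e. maximum-probability) path.
    path = [int(np.argmin(cost[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy usage with random embeddings (purely illustrative).
rng = np.random.default_rng(0)
states = rng.normal(size=(4, 2))    # 4 states embedded in R^2
symbols = rng.normal(size=(3, 2))   # 3 observation symbols in the same space
print(geometric_viterbi(states, symbols, [0, 2, 1, 1], np.zeros(4)))
```

The design point is that only the embedded coordinates (and the scalar density factor) need to be stored: for sparse, low-entropy matrices a low embedding dimension suffices, which is where the memory saving claimed in the abstract comes from.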
Source journal: Pattern Recognition Letters (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 12.40
Self-citation rate: 5.90%
Articles per year: 287
Review time: 9.1 months
Journal description: Pattern Recognition Letters aims at rapid publication of concise articles of a broad interest in pattern recognition. Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.