Probabilistic inference in the era of tensor networks and differential programming
Martin Roa-Villescas, Xuanzhao Gao, Sander Stuijk, Henk Corporaal, Jin-Guo Liu
Physical Review Research (published 2024-09-06)
DOI: 10.1103/physrevresearch.6.033261
Citation count: 0
Abstract
Probabilistic inference is a fundamental task in modern machine learning. Recent advances in tensor network (TN) contraction algorithms have enabled the development of better exact inference methods. However, many common inference tasks in probabilistic graphical models (PGMs) still lack corresponding TN-based adaptations. In this paper, we advance the connection between PGMs and TNs by formulating and implementing tensor-based solutions for the following inference tasks: (A) computing the partition function, (B) computing the marginal probability of sets of variables in the model, (C) determining the most likely assignment to a set of variables, (D) the same as (C) but after having marginalized a different set of variables, and (E) generating samples from a learned probability distribution using a generalized method. Our study is motivated by recent technical advances in the fields of quantum circuit simulation, quantum many-body physics, and statistical physics. Through an experimental evaluation, we demonstrate that integrating these quantum technologies with a series of algorithms introduced in this study significantly improves the efficiency of existing methods for solving probabilistic inference tasks.
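To make the PGM-to-TN correspondence behind tasks (A) and (B) concrete, here is a minimal illustrative sketch (not the paper's implementation, which is not reproduced here): each factor of a small chain-shaped model becomes a tensor, each variable a shared index, and the partition function and a marginal fall out of a single contraction. The variable names and factor values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A chain PGM over three binary variables a - b - c with two pairwise
# factors. The unnormalized distribution is phi_ab(a, b) * phi_bc(b, c).
phi_ab = rng.random((2, 2))  # factor over (a, b)
phi_bc = rng.random((2, 2))  # factor over (b, c)

# Task (A): the partition function Z = sum_{a,b,c} phi_ab * phi_bc is a
# tensor-network contraction with all indices summed out.
Z = np.einsum("ab,bc->", phi_ab, phi_bc)

# Task (B): the marginal of b is the same contraction with index b left
# open, normalized by Z.
p_b = np.einsum("ab,bc->b", phi_ab, phi_bc) / Z

# Brute-force check against explicit enumeration of all assignments.
Z_brute = sum(phi_ab[a, b] * phi_bc[b, c]
              for a in range(2) for b in range(2) for c in range(2))
assert np.isclose(Z, Z_brute)
assert np.isclose(p_b.sum(), 1.0)
```

On larger models the savings come from the contraction order: a good order keeps intermediate tensors small, which is exactly what modern TN contraction-order optimizers provide.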