Theoretical guarantees in KL for Diffusion Flow Matching

Marta Gentiloni Silveri, Giovanni Conforti, Alain Durmus

arXiv:2409.08311 [stat.ML], published 2024-09-12.
Abstract
Flow Matching (FM), also referred to as stochastic interpolants or rectified flows, stands out as a class of generative models that aims to bridge the target distribution $\nu^\star$ with an auxiliary distribution $\mu$ in finite time, leveraging a fixed coupling $\pi$ and a bridge which can be either deterministic or stochastic. These two ingredients define a path measure, which can then be approximated by learning the drift of its Markovian projection. The main contribution of this paper is to provide relatively mild assumptions on $\nu^\star$, $\mu$ and $\pi$ under which non-asymptotic guarantees hold for Diffusion Flow Matching (DFM) models that use as bridge the conditional distribution associated with Brownian motion. More precisely, we establish bounds on the Kullback-Leibler divergence between the target distribution and the one generated by such DFM models, under moment conditions on the scores of $\nu^\star$, $\mu$ and $\pi$, and a standard $L^2$-drift-approximation error assumption.
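To make the setup concrete, the following is a minimal NumPy sketch of the DFM ingredients the abstract names: sampling from the Brownian-bridge interpolant between a draw $(X_0, X_1) \sim \pi$ and computing the conditional drift toward the endpoint, which the learned drift $v_\theta$ would regress against in $L^2$. This is an illustrative assumption-laden sketch, not the paper's algorithm: it assumes an independent coupling $\pi = \mu \otimes \nu^\star$, unit diffusion coefficient, and toy Gaussian choices of $\mu$ and $\nu^\star$; all function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_bridge_sample(x0, x1, t, rng):
    """Sample X_t from the Brownian bridge pinned at x0 (t=0) and x1 (t=1).

    The bridge is the conditional law of Brownian motion given its endpoints:
    mean interpolates linearly, variance is t(1 - t) per coordinate.
    """
    mean = (1.0 - t) * x0 + t * x1
    std = np.sqrt(t * (1.0 - t))
    return mean + std * rng.standard_normal(x0.shape)

def conditional_drift(xt, x1, t):
    """Conditional drift of the bridge at (xt, t), pulling toward endpoint x1."""
    return (x1 - xt) / (1.0 - t)

# Toy data with an independent coupling pi = mu (x) nu_star (an assumption):
n = 256
x0 = rng.standard_normal((n, 2))           # auxiliary mu = N(0, I)
x1 = rng.standard_normal((n, 2)) + 3.0     # target nu_star = N(3, I)
t = rng.uniform(0.05, 0.95, size=(n, 1))   # stay away from the t=1 singularity

xt = brownian_bridge_sample(x0, x1, t, rng)
target = conditional_drift(xt, x1, t)

# A learned drift v_theta(xt, t) would minimize the L^2 regression loss
# E || v_theta(X_t, t) - (X_1 - X_t) / (1 - t) ||^2; here we evaluate it
# for the trivial zero model just to show the objective's shape.
loss = np.mean(np.sum((0.0 - target) ** 2, axis=1))
```

The $L^2$ loss above is exactly the drift-approximation error whose control, together with moment conditions on the scores, drives the KL bound stated in the abstract.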