Efficient and Unbiased Sampling of Boltzmann Distributions via Consistency Models
Fengzhe Zhang, Jiajun He, Laurence I. Midgley, Javier Antorán, José Miguel Hernández-Lobato
arXiv:2409.07323 (arXiv - STAT - Machine Learning), published 2024-09-11
Citations: 0
Abstract
Diffusion models have shown promising potential for advancing Boltzmann Generators. However, two critical challenges persist: (1) inherent errors in samples due to model imperfections, and (2) the requirement of hundreds of network function evaluations (NFEs) to achieve high-quality samples. While existing solutions like importance sampling and distillation address these issues separately, they are often incompatible, as most distillation models lack the density information needed for importance sampling. This paper introduces a novel sampling method that effectively combines Consistency Models (CMs) with importance sampling. We evaluate our approach on both synthetic energy functions and equivariant n-body particle systems. Our method produces unbiased samples using only 6-25 NFEs while achieving an Effective Sample Size (ESS) comparable to Denoising Diffusion Probabilistic Models (DDPMs) that require approximately 100 NFEs.
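The importance-sampling step the abstract relies on can be illustrated in isolation. The sketch below uses a plain Gaussian as a stand-in proposal (the paper instead draws samples from a trained consistency model, which is not reproduced here); the target, energy function, and all variable names are illustrative assumptions, not the paper's setup. Given proposal samples with tractable log-density, self-normalized importance weights against the unnormalized Boltzmann density yield unbiased (asymptotically, for the self-normalized estimator) expectations, and the ESS quoted in the abstract is the standard weight-based diagnostic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: unnormalized Boltzmann density exp(-E(x)) for a toy 1-D double well.
def energy(x):
    return (x**2 - 1.0)**2

# Proposal q(x): stand-in for a learned sampler (e.g. a consistency model
# in the paper's setting). A broad Gaussian with an exact log-density.
mu, sigma = 0.0, 1.5
x = rng.normal(mu, sigma, size=10_000)
log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Self-normalized importance weights: w_i proportional to exp(-E(x_i)) / q(x_i).
log_w = -energy(x) - log_q
log_w -= log_w.max()          # subtract max for numerical stability
w = np.exp(log_w)
w /= w.sum()

# Reweighted estimate of E[x^2] under the Boltzmann target, plus the
# effective sample size (ESS) of the weighted ensemble: 1 / sum(w_i^2).
est = float(np.sum(w * x**2))
ess = float(1.0 / np.sum(w**2))
print(f"E[x^2] estimate: {est:.3f}, ESS: {ess:.0f} of {x.size}")
```

The ESS here plays the same diagnostic role as in the paper's experiments: it quantifies how much the proposal's mismatch with the target degrades the weighted sample, independently of how few NFEs the proposal needed.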