Entropy-informed weighting channel normalizing flow for deep generative models

Wei Chen, Shian Du, Shigui Li, Delu Zeng, John Paisley

Pattern Recognition, Volume 172, Article 112442 (2025). DOI: 10.1016/j.patcog.2025.112442
Abstract
Normalizing Flows (NFs) are widely used in deep generative models for their exact likelihood estimation and efficient sampling. However, they require substantial memory because the latent space matches the input dimension. Multi-scale architectures address this by progressively reducing the latent dimension while preserving reversibility. Existing multi-scale architectures use a simple, static channel-wise splitting, which limits expressiveness. To improve this, we introduce a regularized, feature-dependent Shuffle operation and integrate it into the vanilla multi-scale architecture. This operation adaptively generates channel-wise weights and shuffles latent variables before splitting them. We observe that this operation guides the variables to evolve in the direction of increasing entropy, hence we refer to NFs with the Shuffle operation as Entropy-Informed Weighting Channel Normalizing Flow (EIW-Flow). Extensive experiments on CIFAR-10, CelebA, ImageNet, and LSUN demonstrate that EIW-Flow achieves state-of-the-art density estimation and competitive sample quality for deep generative modeling, with minimal computational overhead.
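To make the idea concrete, the following is a minimal PyTorch sketch of what a feature-dependent channel shuffle before a multi-scale split might look like. It is an illustrative assumption, not the paper's implementation: the module name FeatureDependentShuffle, the pooled-statistics weight network, and the argsort-based reordering are all invented here for clarity.

```python
import torch
import torch.nn as nn


class FeatureDependentShuffle(nn.Module):
    """Illustrative sketch (assumed design, not the paper's EIW-Flow code):
    adaptively weight channels, reorder them by weight, then apply the
    standard multi-scale split."""

    def __init__(self, channels: int):
        super().__init__()
        # Small network mapping pooled channel statistics to per-channel
        # weights (architecture is an assumption for this sketch).
        self.weight_net = nn.Sequential(
            nn.Linear(channels, channels),
            nn.Softmax(dim=-1),
        )

    def forward(self, z: torch.Tensor):
        # z: (batch, channels, height, width) latent variables.
        pooled = z.mean(dim=(2, 3))            # (B, C) channel statistics
        weights = self.weight_net(pooled)      # (B, C) adaptive weights
        # Reorder channels by weight so the subsequent split depends on
        # the features rather than on a fixed, static ordering.
        order = torch.argsort(weights, dim=1, descending=True)
        idx = order[:, :, None, None].expand(-1, -1, z.size(2), z.size(3))
        z_shuffled = torch.gather(z, dim=1, index=idx)
        # Vanilla multi-scale split: half the channels are factored out,
        # the other half continue through the flow.
        z_keep, z_out = z_shuffled.chunk(2, dim=1)
        # `order` must be kept to invert the permutation at sampling time;
        # a pure permutation contributes zero to the log-determinant.
        return z_keep, z_out, order
```

A real flow layer would also have to account for invertibility and the log-determinant contribution of the weighting itself; this sketch only shows the shuffle-then-split structure the abstract describes.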
Journal Introduction:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.