Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


PaddingFlow: Improving Normalizing Flows with Padding-Dimensional Noise

Qinglong Meng, Chongkun Xia, Xueqian Wang

2024-03-13 · Density Estimation
Paper · PDF · Code (official)

Abstract

Normalizing flows are a generative modeling approach with efficient sampling. However, flow-based models suffer from two issues: 1) if the target distribution lies on a manifold, the mismatch between the dimension of the latent target distribution and that of the data distribution can cause flow-based models to perform poorly; 2) discrete data can make flow-based models collapse into a degenerate mixture of point masses. To sidestep these two issues, we propose PaddingFlow, a novel dequantization method that improves normalizing flows with padding-dimensional noise. Implementing PaddingFlow requires only modifying the dimension of the normalizing flow, so our method is easy to implement and computationally cheap. Moreover, the noise is added only to the padding dimensions, which means PaddingFlow can dequantize without changing the data distribution. Existing dequantization methods, by contrast, must change the data distribution, which can degrade performance. We validate our method on the main benchmarks of unconditional density estimation, including five tabular datasets and four image datasets for Variational Autoencoder (VAE) models, as well as on Inverse Kinematics (IK) experiments, which are conditional density estimation tasks. The results show that PaddingFlow performs better in all experiments in this paper, indicating that PaddingFlow is widely applicable across tasks. The code is available at: https://github.com/AdamQLMeng/PaddingFlow.
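A minimal sketch of the padding-dimensional dequantization idea described in the abstract, assuming Gaussian padding noise; the function names, the noise scale, and the choice of two padding dimensions are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def pad_with_noise(x, pad_dims=2, noise_std=0.1, rng=None):
    """Append pad_dims coordinates of Gaussian noise to each sample.

    The original coordinates of x are left untouched, so the data
    distribution over the first x.shape[1] dimensions is unchanged;
    only the padded coordinates carry the dequantization noise.
    (noise_std=0.1 is an arbitrary illustrative value.)
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = noise_std * rng.standard_normal((x.shape[0], pad_dims))
    return np.concatenate([x, noise], axis=1)

def unpad(x_padded, data_dims):
    """Drop the padding dimensions, e.g. after sampling from the flow."""
    return x_padded[:, :data_dims]

# Toy usage: 2-D data padded to 4-D before being fed to a flow.
x = np.zeros((5, 2))
x_padded = pad_with_noise(x, pad_dims=2)
assert x_padded.shape == (5, 4)
assert np.allclose(unpad(x_padded, 2), x)  # original coordinates intact
```

The flow itself would then simply be built with `data_dims + pad_dims` input dimensions, which is the only architectural change the abstract says the method requires.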

Results

Task               | Dataset        | Metric  | Value  | Model
Density Estimation | UCI GAS        | CD      | 0.89   | PaddingFlow
Density Estimation | UCI GAS        | EMD     | 0.131  | PaddingFlow
Density Estimation | UCI GAS        | MMD-CD  | 0.39   | PaddingFlow
Density Estimation | UCI GAS        | MMD-EMD | 0.121  | PaddingFlow
Density Estimation | BSDS300        | CD      | 0.495  | PaddingFlow
Density Estimation | BSDS300        | EMD     | 0.0248 | PaddingFlow
Density Estimation | BSDS300        | MMD-CD  | 0.48   | PaddingFlow
Density Estimation | BSDS300        | MMD-EMD | 0.0212 | PaddingFlow
Density Estimation | Caltech-101    | MMD-L2  | 17.9   | PaddingFlow
Density Estimation | UCI HEPMASS    | CD      | 13.8   | PaddingFlow
Density Estimation | UCI HEPMASS    | EMD     | 0.161  | PaddingFlow
Density Estimation | UCI HEPMASS    | MMD-CD  | 13.7   | PaddingFlow
Density Estimation | UCI HEPMASS    | MMD-EMD | 0.153  | PaddingFlow
Density Estimation | UCI MINIBOONE  | CD      | 24.5   | PaddingFlow
Density Estimation | UCI MINIBOONE  | EMD     | 0.268  | PaddingFlow
Density Estimation | UCI MINIBOONE  | MMD-CD  | 24     | PaddingFlow
Density Estimation | UCI MINIBOONE  | MMD-EMD | 0.255  | PaddingFlow
Density Estimation | MNIST          | MMD-L2  | 11     | PaddingFlow
Density Estimation | Freyfaces      | MMD-L2  | 0.621  | PaddingFlow
Density Estimation | OMNIGLOT       | MMD-L2  | 20.3   | PaddingFlow
Density Estimation | UCI POWER      | CD      | 0.142  | PaddingFlow
Density Estimation | UCI POWER      | EMD     | 0.105  | PaddingFlow
Density Estimation | UCI POWER      | MMD-CD  | 0.135  | PaddingFlow
Density Estimation | UCI POWER      | MMD-EMD | 0.098  | PaddingFlow

Related Papers

Missing value imputation with adversarial random forests -- MissARF (2025-07-21)
3C-FBI: A Combinatorial method using Convolutions for Circle Fitting in Blurry Images (2025-07-15)
Rethinking Discrete Tokens: Treating Them as Conditions for Continuous Autoregressive Image Synthesis (2025-07-02)
Binned semiparametric Bayesian networks (2025-06-27)
Distilling Normalizing Flows (2025-06-26)
EBC-ZIP: Improving Blockwise Crowd Counting with Zero-Inflated Poisson Regression (2025-06-24)
SENIOR: Efficient Query Selection and Preference-Guided Exploration in Preference-based Reinforcement Learning (2025-06-17)
Scaling-Up the Pretraining of the Earth Observation Foundation Model PhilEO to the MajorTOM Dataset (2025-06-17)