Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Marginalizable Density Models

Dar Gilboa, Ari Pakman, Thibault Vatter

2021-06-08 · Imputation · Density Estimation

Paper · PDF · Code (official)

Abstract

Probability density models based on deep networks have achieved remarkable success in modeling complex high-dimensional datasets. However, unlike kernel density estimators, modern neural models do not yield marginals or conditionals in closed form, as these quantities require the evaluation of seldom-tractable integrals. In this work, we present the Marginalizable Density Model Approximator (MDMA), a novel deep network architecture which provides closed-form expressions for the probabilities, marginals, and conditionals of any subset of the variables. The MDMA learns deep scalar representations for each individual variable and combines them via learned hierarchical tensor decompositions into a tractable yet expressive CDF, from which marginal and conditional densities are easily obtained. We illustrate the advantage of exact marginalizability in several tasks that are out of reach of previous deep-network-based density estimation models, such as estimating mutual information between arbitrary subsets of variables, inferring causality by testing for conditional independence, and inference with missing data without the need for data imputation, outperforming state-of-the-art models on these tasks. The model also allows for parallelized sampling with only a logarithmic dependence of the time complexity on the number of variables.
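The key property described above can be illustrated with a toy model. The sketch below is not the authors' implementation; it assumes a simple 2-D density whose CDF is a mixture of products of univariate Gaussian CDFs, F(x, y) = Σₖ wₖ Fₖ(x) Gₖ(y), which is a minimal instance of the "tensor decomposition over per-variable CDFs" structure the abstract describes. Because every factor is itself a CDF, marginalizing a variable amounts to evaluating its factor at +∞ (which equals 1), so the marginal is available in closed form with no integration:

```python
# Toy sketch (hypothetical parameters, not the MDMA code): a 2-D CDF built as
# a mixture of products of univariate Gaussian CDFs,
#   F(x, y) = sum_k w_k * F_k(x) * G_k(y).
# Marginalizing y just sets each G_k(+inf) = 1, giving a closed-form marginal.
import math

def ncdf(z, mu):
    # Univariate Gaussian CDF with unit variance, via the error function.
    return 0.5 * (1.0 + math.erf((z - mu) / math.sqrt(2.0)))

w = [0.5, 0.3, 0.2]        # mixture weights (sum to 1)
mu_x = [-1.0, 0.0, 2.0]    # per-component locations for x
mu_y = [0.5, -0.5, 1.0]    # per-component locations for y

def joint_cdf(x, y):
    # F(x, y) = sum_k w_k F_k(x) G_k(y)
    return sum(wk * ncdf(x, mx) * ncdf(y, my)
               for wk, mx, my in zip(w, mu_x, mu_y))

def marginal_cdf_x(x):
    # Substitute y -> +inf: every G_k(+inf) = 1, so no integral is needed.
    return sum(wk * ncdf(x, mx) for wk, mx in zip(w, mu_x))

# The closed-form marginal agrees with the joint CDF as y -> infinity.
assert abs(joint_cdf(0.0, 1e6) - marginal_cdf_x(0.0)) < 1e-12
```

The same substitution works for any subset of variables, which is what makes quantities like conditionals and mutual information between arbitrary subsets cheap in this model family; the MDMA replaces the fixed Gaussian factors here with learned deep scalar CDFs and a learned hierarchical tensor decomposition.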

Results

Task                Dataset    Metric          Value  Model
Density Estimation  UCI POWER  Log-likelihood  1.78   MDMA

Related Papers

- Missing value imputation with adversarial random forests -- MissARF (2025-07-21)
- MoTM: Towards a Foundation Model for Time Series Imputation based on Continuous Modeling (2025-07-17)
- 3C-FBI: A Combinatorial method using Convolutions for Circle Fitting in Blurry Images (2025-07-15)
- Rethinking Discrete Tokens: Treating Them as Conditions for Continuous Autoregressive Image Synthesis (2025-07-02)
- Binned semiparametric Bayesian networks (2025-06-27)
- BMFM-DNA: A SNP-aware DNA foundation model to capture variant effects (2025-06-26)
- Distilling Normalizing Flows (2025-06-26)
- Leveraging AI Graders for Missing Score Imputation to Achieve Accurate Ability Estimation in Constructed-Response Tests (2025-06-25)