Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Residual Normal Distribution

General · Introduced 2020 · 5 papers
Source Paper

Description

Residual Normal Distributions are used to stabilize the optimization of VAEs, preventing it from entering an unstable region. Instability can arise when sharp gradients occur because the encoder and decoder produce distributions that are far from each other. The residual distribution parameterizes the approximate posterior $q\left(\mathbf{z} \mid \mathbf{x}\right)$ relative to the prior $p\left(\mathbf{z}\right)$. Let $p\left(z^{i}_{l} \mid \mathbf{z}_{<l}\right) := N\left(\mu_{i}\left(\mathbf{z}_{<l}\right), \sigma_{i}\left(\mathbf{z}_{<l}\right)\right)$ be the Normal distribution for the $i$-th variable of $\mathbf{z}_{l}$ in the prior. Define $q\left(z^{i}_{l} \mid \mathbf{z}_{<l}, \mathbf{x}\right) := N\left(\mu_{i}\left(\mathbf{z}_{<l}\right) + \Delta\mu_{i}\left(\mathbf{z}_{<l}, \mathbf{x}\right), \sigma_{i}\left(\mathbf{z}_{<l}\right) \cdot \Delta\sigma_{i}\left(\mathbf{z}_{<l}, \mathbf{x}\right)\right)$, where $\Delta\mu_{i}\left(\mathbf{z}_{<l}, \mathbf{x}\right)$ and $\Delta\sigma_{i}\left(\mathbf{z}_{<l}, \mathbf{x}\right)$ are the relative location and scale of the approximate posterior with respect to the prior. With this parameterization, when the prior moves, the approximate posterior moves accordingly, even if the residual parameters $\Delta\mu$ and $\Delta\sigma$ are unchanged.
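A convenient consequence of this parameterization is that the KL term between posterior and prior depends on the prior only through its scale $\sigma$, so the objective stays well-conditioned as the prior shifts during training. The following is a minimal univariate sketch (function names are illustrative, not from any library); the residual KL expression can be verified against the standard closed-form KL between two Gaussians:

```python
import math

def residual_posterior(mu, sigma, dmu, dsigma):
    """Posterior params: q = N(mu + dmu, sigma * dsigma),
    expressed relative to the prior p = N(mu, sigma)."""
    return mu + dmu, sigma * dsigma

def kl_residual(sigma, dmu, dsigma):
    """KL(q || p) in the residual parameterization.
    Depends on the prior only through sigma, not mu."""
    return 0.5 * (dmu ** 2 / sigma ** 2
                  + dsigma ** 2
                  - math.log(dsigma ** 2)
                  - 1.0)

def kl_gaussians(mu_q, sig_q, mu_p, sig_p):
    """Standard closed-form KL(N(mu_q, sig_q) || N(mu_p, sig_p))."""
    return (math.log(sig_p / sig_q)
            + (sig_q ** 2 + (mu_q - mu_p) ** 2) / (2.0 * sig_p ** 2)
            - 0.5)

# The two expressions agree for any prior mean: substituting
# mu_q = mu + dmu and sig_q = sigma * dsigma into kl_gaussians
# cancels mu, leaving only sigma, dmu, dsigma.
mu_q, sig_q = residual_posterior(mu=1.0, sigma=2.0, dmu=0.5, dsigma=1.3)
assert abs(kl_residual(2.0, 0.5, 1.3)
           - kl_gaussians(mu_q, sig_q, 1.0, 2.0)) < 1e-9
```

Note that the KL vanishes exactly when `dmu == 0` and `dsigma == 1`, i.e. when the posterior coincides with the prior, which matches the intuition that the encoder only needs to learn a correction on top of the prior.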

Papers Using This Method

A Variational AutoEncoder for Transformers with Nonparametric Variational Information Bottleneck (2022-07-27)
Alleviating Adversarial Attacks on Variational Autoencoders with MCMC (2022-03-18)
Polarity Sampling: Quality and Diversity Control of Pre-Trained Generative Networks via Singular Values (2022-03-03)
NVAE-GAN Based Approach for Unsupervised Time Series Anomaly Detection (2021-01-08)
NVAE: A Deep Hierarchical Variational Autoencoder (2020-07-08)