Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Metropolis Hastings

General · Introduced 2000 · 10 papers

Description

Metropolis-Hastings is a Markov Chain Monte Carlo (MCMC) algorithm for approximate inference. It allows sampling from a probability distribution where direct sampling is difficult, usually because of an intractable normalizing integral.

M-H uses a proposal distribution $q(\theta' \mid \theta)$ to draw a candidate parameter value $\theta'$. To decide whether $\theta'$ is accepted or rejected, we then calculate the ratio:

$$\frac{p\left(\theta' \mid D\right)}{p\left(\theta \mid D\right)}$$

We then draw a random number $r \in [0, 1]$ uniformly and accept $\theta'$ if $r$ is below the ratio; otherwise we reject it and keep the current value. If we accept, we set $\theta_{i+1} = \theta'$ and repeat. (For an asymmetric proposal, the ratio is additionally multiplied by $q(\theta \mid \theta') / q(\theta' \mid \theta)$; with a symmetric proposal these terms cancel, which is the case shown above.)

By the end we have a set of $\theta$ samples that approximate the posterior, from which we can estimate quantities such as expectations and uncertainty bounds. In practice, we typically tune the proposal to reach an acceptable acceptance rate, and discard a warm-up (burn-in) period to reduce bias from the initialization values.
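The accept/reject loop above can be sketched as a short random-walk Metropolis sampler. This is a minimal illustration, not a reference implementation: the Gaussian step size, target density, and sample counts below are assumptions chosen for the example, and the symmetric proposal means the $q$ terms cancel from the acceptance ratio.

```python
import math
import random


def metropolis_hastings(log_p, theta0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler for an unnormalized log-density log_p.

    The Gaussian proposal is symmetric, so the acceptance ratio
    reduces to p(theta') / p(theta), computed here in log space.
    """
    rng = random.Random(seed)
    theta = theta0
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)      # draw theta' from q(theta' | theta)
        log_ratio = log_p(proposal) - log_p(theta)   # log of p(theta' | D) / p(theta | D)
        # Accept with probability min(1, ratio): draw r in [0, 1) and compare.
        if rng.random() < math.exp(min(0.0, log_ratio)):
            theta = proposal                         # accept: move to theta'
        samples.append(theta)                        # reject: keep current theta
    return samples


# Example: sample from a standard normal via its unnormalized log-density.
samples = metropolis_hastings(lambda t: -0.5 * t * t, theta0=5.0, n_samples=20000)
burned = samples[5000:]  # discard warm-up to reduce bias from theta0
mean = sum(burned) / len(burned)
var = sum((x - mean) ** 2 for x in burned) / len(burned)
```

After burn-in, the sample mean and variance should be close to the target's 0 and 1, up to Monte Carlo error.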

(Image credit: Samuel Hudec)

Papers Using This Method

- AdvNF: Reducing Mode Collapse in Conditional Normalising Flows using Adversarial Learning (2024-01-29)
- Binary classification based Monte Carlo simulation (2023-07-29)
- Data Subsampling for Bayesian Neural Networks (2022-10-17)
- A Two-step Metropolis Hastings Method for Bayesian Empirical Likelihood Computation with Application to Bayesian Model Selection (2022-09-02)
- Mix and Match: Learning-free Controllable Text Generation using Energy Language Models (2021-11-16)
- Subsampling Generative Adversarial Networks: Density Ratio Estimation in Feature Space with Softplus Loss (2019-09-24)
- Hawkes Processes with Stochastic Excitations (2016-09-22)
- C3: Lightweight Incrementalized MCMC for Probabilistic Programs using Continuations and Callsite Caching (2015-09-07)
- Neural Adaptive Sequential Monte Carlo (2015-06-10)
- GPS-ABC: Gaussian Process Surrogate Approximate Bayesian Computation (2014-01-13)