Branching Stein Variational Gradient Descent for sampling multimodal distributions
Isaias Banales, Arturo Jaramillo, Heli Ricalde Guerrero
2025-06-16 · Variational Inference
Abstract
We propose a novel particle-based variational inference method designed for multimodal distributions. Our approach, Branched Stein Variational Gradient Descent (BSVGD), extends the classical Stein Variational Gradient Descent (SVGD) algorithm with a random branching mechanism that encourages exploration of the state space. We present a theoretical guarantee of convergence in distribution, together with numerical experiments validating the suitability of the algorithm. BSVGD and SVGD are compared in terms of the Wasserstein distance between samples and the corresponding computational times.
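The abstract combines the standard SVGD particle update with a random branching step. The sketch below illustrates this idea on a toy bimodal target (an equal-weight Gaussian mixture); the target, step sizes, and in particular the branching rule (particles occasionally spawn jittered copies, then the population is subsampled back to its original size) are illustrative assumptions, not the paper's exact mechanism.

```python
import numpy as np


def grad_log_p(x):
    # Score of an equal-weight 1D mixture N(-2, 1) + N(2, 1)
    # (a stand-in multimodal target, not taken from the paper).
    p1 = np.exp(-0.5 * (x + 2.0) ** 2)
    p2 = np.exp(-0.5 * (x - 2.0) ** 2)
    return (-(x + 2.0) * p1 - (x - 2.0) * p2) / (p1 + p2)


def svgd_step(x, eps=0.1, h=1.0):
    # Classical SVGD update with an RBF kernel:
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
    #                          + grad_{x_j} k(x_j, x_i) ]
    n = x.shape[0]
    diff = x[:, None] - x[None, :]                # x_i - x_j, shape (n, n)
    K = np.exp(-diff ** 2 / (2.0 * h ** 2))
    phi = (K @ grad_log_p(x) + (diff * K).sum(axis=1) / h ** 2) / n
    return x + eps * phi


def branch(x, rng, p=0.2, sigma=1.5):
    # Hypothetical branching step: with probability p each particle spawns
    # a jittered child, then the pool is subsampled back to size n.  This
    # is only a plausible illustration of a "random branching mechanism".
    n = x.shape[0]
    mask = rng.random(n) < p
    children = x[mask] + sigma * rng.standard_normal(mask.sum())
    pool = np.concatenate([x, children])
    return rng.choice(pool, size=n, replace=False)


rng = np.random.default_rng(0)
x = rng.standard_normal(50)                       # particles start near 0
for t in range(200):
    x = svgd_step(x)
    if t % 25 == 0:                               # occasional branching
        x = branch(x, rng)
```

Without the branching step, plain SVGD started near one mode can leave the other mode underrepresented; the random duplication-and-jitter gives particles a chance to jump between basins, which is the exploratory behavior the abstract attributes to BSVGD.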