# Comparison: VI vs. MCMC

| Feature | Variational Inference (VI) | Markov Chain Monte Carlo (MCMC) |
|---|---|---|
| Goal | Find best approximation in a tractable family | Generate exact samples from true posterior (asymptotically) |
| Accuracy | Biased by the choice of approximating family; typically underestimates posterior variance | Asymptotically exact (converges to the true posterior given enough samples) |
| Speed | Fast; scales to large datasets | Slow; often impractical for big data |
| Computation | Optimization: gradient ascent on the ELBO (often with stochastic mini-batch gradients) | Sampling: simulate a Markov chain whose stationary distribution is the posterior |
| Parallelization | Easily parallelized (e.g., mini-batch gradients, multiple restarts) | Within a chain, sampling is sequential; parallelism is mostly limited to running independent chains |
| Tuning | Choose variational family $\mathcal{Q}$ | Choose proposal distribution, step size, etc. |
| Uncertainty quantification | Often overconfident: minimizing $\mathrm{KL}(q \,\|\, p)$ is mode-seeking | More reliable posterior coverage (given convergence) |
| Use cases | Real-time inference, VAEs, large-scale Bayesian models | Small-data settings, diagnostics, gold-standard inference |
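
To make the "mode-seeking vs. asymptotically exact" contrast concrete, here is a minimal sketch on a toy 1-D bimodal target: reverse-KL VI with a single Gaussian $q$ (reparameterized Monte Carlo ELBO gradients) versus random-walk Metropolis MCMC. The target, step sizes, and iteration counts are illustrative assumptions, not taken from the table; NumPy only.

```python
# Minimal sketch: approximate the same toy 1-D bimodal posterior with
# (a) reverse-KL VI using a single Gaussian q, and (b) random-walk Metropolis.
# NumPy only; all names, step sizes, and iteration counts are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    """Unnormalized log posterior: equal mixture of N(-2, 0.5^2) and N(2, 0.5^2)."""
    return np.logaddexp(-0.5 * ((theta + 2.0) / 0.5) ** 2,
                        -0.5 * ((theta - 2.0) / 0.5) ** 2)

# --- (a) VI: fit q(theta) = N(mu, sigma^2) by maximizing a Monte Carlo ELBO ---
mu, log_sigma = 0.5, 0.0          # start slightly off-center so q settles on a mode
lr, n_mc = 0.01, 64
for _ in range(2000):
    eps = rng.standard_normal(n_mc)
    sigma = np.exp(log_sigma)
    theta = mu + sigma * eps      # reparameterization trick
    # d/dtheta log_post via central finite differences (keeps the sketch short)
    h = 1e-4
    g = (log_post(theta + h) - log_post(theta - h)) / (2.0 * h)
    grad_mu = g.mean()                                 # d ELBO / d mu
    grad_log_sigma = (g * eps * sigma).mean() + 1.0    # +1 from the entropy of q
    mu += lr * grad_mu
    log_sigma += lr * grad_log_sigma

# --- (b) MCMC: random-walk Metropolis targeting the same posterior ---
samples, cur = [], 0.0
for _ in range(50_000):
    prop = cur + 2.0 * rng.standard_normal()           # symmetric Gaussian proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(cur):
        cur = prop                                     # accept; otherwise keep cur
    samples.append(cur)
samples = np.asarray(samples[10_000:])                 # discard burn-in

print(f"VI:   mean={mu:.2f}  sd={np.exp(log_sigma):.2f}")
print(f"MCMC: mean={samples.mean():.2f}  sd={samples.std():.2f}")
```

On this target, the single-Gaussian $q$ typically collapses onto one mode (standard deviation near 0.5), which is the underestimated uncertainty noted in the table, while the Metropolis samples recover both modes and the wider spread, provided the chain actually mixes between them (slow mixing across well-separated modes is itself a known MCMC caveat).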