Research Repository

Accelerating Convergence of Replica Exchange Stochastic Gradient MCMC via Variance Reduction (2021)
Presentation / Conference
Deng, W., Feng, Q., Karagiannis, G., Lin, G., & Liang, F. (2021, December). Accelerating Convergence of Replica Exchange Stochastic Gradient MCMC via Variance Reduction. Paper presented at International Conference on Learning Representations (ICLR'21), Virtual Event

Replica exchange stochastic gradient Langevin dynamics (reSGLD) has shown promise in accelerating the convergence in non-convex learning; however, an excessively large correction for avoiding biases from noisy energy estimators has limited the potent...
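
For context, the mechanism the abstract alludes to can be illustrated with a minimal sketch (not the authors' implementation): two SGLD chains run at different temperatures and occasionally attempt to swap states, with the swap probability discounted by a correction term that grows with the variance of the noisy minibatch energy estimator. Variance reduction shrinks that correction and so restores more frequent swaps. The function names, the toy double-well energy, and the fixed value of sigma2 below are illustrative assumptions; the exact correction used in the paper follows its own analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgld_step(theta, grad_u_hat, tau, lr):
    """One SGLD update: stochastic-gradient step plus temperature-scaled Gaussian noise."""
    noise = np.sqrt(2.0 * lr * tau) * rng.standard_normal(theta.shape)
    return theta - lr * grad_u_hat(theta) + noise

def swap_probability(u_hat_low, u_hat_high, tau_low, tau_high, sigma2):
    """Swap acceptance between the low- and high-temperature replicas.

    The term `dtau * sigma2` plays the role of the bias correction for using
    noisy energy estimates: a large estimator variance inflates the correction
    and suppresses swaps, which is the limitation variance reduction targets.
    """
    dtau = 1.0 / tau_low - 1.0 / tau_high
    log_s = dtau * (u_hat_low - u_hat_high - dtau * sigma2)
    return float(np.exp(min(log_s, 0.0)))  # cap at 1 by clamping the exponent

# Hypothetical toy target: a 1-D double-well energy with Gaussian "minibatch" noise.
def u_hat(theta, noise_std=0.5):
    return float((theta[0] ** 2 - 1.0) ** 2 + noise_std * rng.standard_normal())

def grad_u_hat(theta, noise_std=0.5):
    return 4.0 * theta * (theta ** 2 - 1.0) + noise_std * rng.standard_normal(theta.shape)

theta_low, theta_high = np.array([1.0]), np.array([-1.0])
tau_low, tau_high = 0.1, 1.0      # temperatures of the two replicas
lr, sigma2 = 1e-3, 0.25           # step size and assumed energy-estimator variance

for _ in range(1000):
    theta_low = sgld_step(theta_low, grad_u_hat, tau_low, lr)
    theta_high = sgld_step(theta_high, grad_u_hat, tau_high, lr)
    p = swap_probability(u_hat(theta_low), u_hat(theta_high), tau_low, tau_high, sigma2)
    if rng.random() < p:          # exchange the replicas' states
        theta_low, theta_high = theta_high, theta_low
```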