TY - JOUR
T1 - Stochastic Gradient Hamiltonian Monte Carlo for non-convex learning
AU - Chau, Huy N.
AU - Rásonyi, Miklós
N1 - Publisher Copyright:
© 2022 The Authors
PY - 2022/7
Y1 - 2022/7
N2 - Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) is a momentum version of stochastic gradient descent with properly injected Gaussian noise, used to find a global minimum. In this paper, a non-asymptotic convergence analysis of SGHMC is given in the context of non-convex optimization, where subsampling techniques are used over an i.i.d. dataset for gradient updates. In contrast to Raginsky et al. (2017) and Gao et al. (2021), our results are sharper in terms of step size and variance, and are independent of the number of iterations.
AB - Stochastic Gradient Hamiltonian Monte Carlo (SGHMC) is a momentum version of stochastic gradient descent with properly injected Gaussian noise, used to find a global minimum. In this paper, a non-asymptotic convergence analysis of SGHMC is given in the context of non-convex optimization, where subsampling techniques are used over an i.i.d. dataset for gradient updates. In contrast to Raginsky et al. (2017) and Gao et al. (2021), our results are sharper in terms of step size and variance, and are independent of the number of iterations.
UR - http://www.scopus.com/inward/record.url?scp=85129467539&partnerID=8YFLogxK
U2 - 10.1016/j.spa.2022.04.001
DO - 10.1016/j.spa.2022.04.001
M3 - Article
AN - SCOPUS:85129467539
SN - 0304-4149
VL - 149
SP - 341
EP - 368
JO - Stochastic Processes and their Applications
JF - Stochastic Processes and their Applications
ER -