Abstract

We extend the Langevin Monte Carlo (LMC) algorithm to compactly supported measures via a projection step, akin to projected stochastic gradient descent (SGD). We show that (projected) LMC allows one to sample in polynomial time from a log-concave distribution with smooth potential. This gives a new Markov chain for sampling from a log-concave distribution. Our main result shows in particular that when the target distribution is uniform, LMC mixes in $\widetilde{O}(n^7)$ steps (where $n$ is the dimension). We also provide preliminary experimental evidence that LMC performs at least as well as hit-and-run, for which a better mixing time of $\widetilde{O}(n^4)$ was proved by Lovász and Vempala.
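
As a rough illustration of the projected iteration described above, here is a minimal Python sketch of one projected LMC chain. The choice of the Euclidean ball as the supporting convex body, the fixed step size, and all parameter names are illustrative assumptions, not the paper's exact setup or constants.

```python
import numpy as np


def project_to_ball(x, radius=1.0):
    """Euclidean projection onto a ball of the given radius.

    The ball is chosen only because its projection has a closed form;
    the paper considers general compact (convex) supports.
    """
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)


def projected_lmc(grad_potential, project, x0, step_size, n_steps, rng=None):
    """Run one projected Langevin Monte Carlo chain (illustrative sketch).

    Iterates x_{k+1} = P_K(x_k - eta * grad f(x_k) + sqrt(2 * eta) * xi_k),
    where P_K projects onto the support, eta is the step size, and xi_k is
    standard Gaussian noise. The fixed step size is an assumption made for
    simplicity.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = project(x - step_size * grad_potential(x)
                    + np.sqrt(2.0 * step_size) * noise)
    return x


if __name__ == "__main__":
    # Approximately uniform sampling on the unit ball: the potential is
    # constant on the support, so its gradient is zero.
    n = 10
    sample = projected_lmc(
        grad_potential=lambda x: np.zeros_like(x),
        project=project_to_ball,
        x0=np.zeros(n),
        step_size=1e-3,
        n_steps=5000,
    )
    print(sample)
```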
