Probabilistic programming allows a user to specify a Bayesian model in code and perform inference on that model in the presence of observed data. Markov chain Monte Carlo (MCMC) is a flexible family of methods for sampling from the posterior distribution of such models, and Hamiltonian Monte Carlo (HMC) is a particularly efficient MCMC algorithm, which allows it to be applied to larger and more complex models. HMC was first described 30 years ago, began to be applied in statistics about 20 years ago, started appearing in textbooks 10 years ago, and became readily available in software libraries only in the last few years. This talk will begin with an introduction to MCMC algorithms, then move on to the theory and practice of Hamiltonian Monte Carlo. Examples will be given using the Python probabilistic programming library PyMC3.
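As a preview of the kind of algorithm the talk covers, the following is a minimal sketch of a Hamiltonian Monte Carlo update in plain NumPy, targeting a one-dimensional standard normal distribution. This is an illustrative toy, not the talk's material or PyMC3's implementation; the function name and tuning parameters (`step_size`, `n_leapfrog`) are arbitrary choices for this example.

```python
import numpy as np

def hmc_sample(n_samples, step_size=0.2, n_leapfrog=20, seed=0):
    """Draw samples from a standard normal using Hamiltonian Monte Carlo."""
    rng = np.random.default_rng(seed)
    log_p = lambda q: -0.5 * q**2      # log density of N(0, 1), up to a constant
    grad_log_p = lambda q: -q          # its gradient
    q = 0.0
    samples = []
    for _ in range(n_samples):
        p = rng.normal()               # resample auxiliary momentum
        q_new, p_new = q, p
        # Leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * grad_log_p(q_new)
        for _ in range(n_leapfrog - 1):
            q_new += step_size * p_new
            p_new += step_size * grad_log_p(q_new)
        q_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_p(q_new)
        # Metropolis accept/reject based on the change in total energy
        current = log_p(q) - 0.5 * p**2
        proposed = log_p(q_new) - 0.5 * p_new**2
        if np.log(rng.uniform()) < proposed - current:
            q = q_new
        samples.append(q)
    return np.array(samples)

draws = hmc_sample(2000)
print(draws.mean(), draws.std())   # both should be close to 0 and 1
```

The momentum resampling and leapfrog steps let the sampler take long, informed moves through the posterior, which is why HMC tends to mix far better than random-walk MCMC on complex models; libraries like PyMC3 automate the gradients and tuning that this sketch hard-codes.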