This is a test library providing reference implementations of MCMC algorithms and ideas. Much of it is based on Michael Betancourt’s wonderful A Conceptual Introduction to Hamiltonian Monte Carlo. The highlight of the library right now is the ~15-line Hamiltonian Monte Carlo implementation (which relies on an 8-line integrator). It is commented and documented, with the aim of being instructive to read.
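For a feel of the algorithm without opening the source, here is a minimal, self-contained sketch of the same ideas: a leapfrog integrator plus a Metropolis-corrected HMC step. This is an illustration written for this page, not the code in the library itself; the function names and the standard-Gaussian example at the end are assumptions made for the sketch.

```python
import numpy as np


def leapfrog(q, p, dVdq, path_len, step_size):
    """Leapfrog integrator for Hamiltonian dynamics.

    q, p are the current position and momentum; dVdq is the gradient
    of the potential energy (the negative log probability).
    """
    q, p = np.copy(q), np.copy(p)
    p -= step_size * dVdq(q) / 2                  # half step in momentum
    for _ in range(int(path_len / step_size) - 1):
        q += step_size * p                        # full step in position
        p -= step_size * dVdq(q)                  # full step in momentum
    q += step_size * p                            # final position step
    p -= step_size * dVdq(q) / 2                  # final half momentum step
    return q, -p                                  # flip momentum for reversibility


def hamiltonian_monte_carlo(n_samples, neg_log_prob, dVdq, initial_q,
                            path_len=1.0, step_size=0.1):
    """Draw samples with HMC, using an identity-covariance Gaussian momentum."""
    samples = [np.array(initial_q, dtype=float)]
    for _ in range(n_samples):
        q0 = samples[-1]
        p0 = np.random.standard_normal(q0.shape)  # resample momentum
        q_new, p_new = leapfrog(q0, p0, dVdq, path_len, step_size)
        # Metropolis correction on the total energy (the Hamiltonian)
        start_h = neg_log_prob(q0) + 0.5 * p0 @ p0
        new_h = neg_log_prob(q_new) + 0.5 * p_new @ p_new
        if np.log(np.random.rand()) < start_h - new_h:
            samples.append(q_new)
        else:
            samples.append(np.copy(q0))
    return np.array(samples[1:])


# Example: draw 1,000 samples from a standard 2D Gaussian
neg_log_prob = lambda q: 0.5 * q @ q   # potential energy V(q)
dVdq = lambda q: q                     # its gradient
draws = hamiltonian_monte_carlo(1000, neg_log_prob, dVdq, np.zeros(2))
```

The momentum flip at the end of the integrator keeps the proposal reversible, which is what makes the simple Metropolis acceptance step valid.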
I am a contributor to PyMC3, a “Python package for Bayesian statistical modeling and Probabilistic Machine Learning which focuses on advanced Markov chain Monte Carlo and variational fitting algorithms.”
Using hundreds of thousands of historical cross country running results to make predictions about future meets. The page is updated more than once a week during the season.
For the second year in a row, I took part in Kaggle’s contest to predict March Madness winners. The code for the actual model is not very expository (get in touch if you are interested), but I also built a friendlier page, at the link, where you can query the predictions interactively.
An essay on building linear regression models, adapted from notes for a talk I gave at Rice University in September 2014. It contains lots of pictures, lots of interactivity, and a modest amount of math. As a bonus, everything is typeset with KaTeX.
A demonstration of linear regression, overfitting, normalization, and regularization. It lets you choose data from a distribution and interactively fit polynomials to it using least squares, ridge, or Lasso regression.
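If you would rather poke at the same ideas in code than in the browser, the sketch below reproduces the experiment offline. It is an illustration for this page, not the demo’s source: it samples noisy data from an assumed sine curve, fits a high-degree polynomial with least squares, ridge, and Lasso, and compares the learned coefficient sizes, which is where overfitting and the effect of regularization show up. The distribution, degree, and penalty strengths are arbitrary choices.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression, Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Draw noisy data from an assumed "true" curve
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, size=30))
y = np.sin(np.pi * x) + rng.normal(scale=0.2, size=x.shape)
X = x.reshape(-1, 1)

degree = 9  # high enough for plain least squares to overfit 30 points
models = {
    "least squares": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "lasso": Lasso(alpha=0.01, max_iter=100_000),
}

for name, model in models.items():
    # Polynomial features, standardized so the penalty treats each term comparably
    fit = make_pipeline(
        PolynomialFeatures(degree, include_bias=False), StandardScaler(), model
    ).fit(X, y)
    coefs = fit.named_steps[type(model).__name__.lower()].coef_
    print(f"{name:>13}: largest |coefficient| = {np.abs(coefs).max():.2f}")
```

Least squares typically blows up the coefficients to chase the noise, ridge shrinks them all toward zero, and Lasso zeros many of them out entirely, which is the behavior the interactive demo lets you see by eye.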