Vinayak Rao

Path and parameter inference for Markov jump processes

A variety of phenomena are best described by dynamical models that operate on a discrete state-space in continuous time. The most common example is the Markov jump process (MJP), whose applications range from systems biology and genetics to computer networks and human-computer interaction. Posterior computations typically involve approximations like time discretization and can be computationally intensive. In the first half of this talk, I will describe some previous work (joint with Yee Whye Teh) on a class of Markov chain Monte Carlo methods that allow efficient computation while still being exact. The approach builds on the idea of 'uniformization': an auxiliary variable Gibbs sampler alternately resamples a random discretization of time given the state-trajectory of the system, and then samples a new trajectory given this discretization. In the second half, I will talk about more recent work (joint with Boqian Zhang) that extends these ideas to allow efficient Bayesian inference over the process parameters. Here we reformulate the earlier idea to develop an efficient Metropolis-Hastings algorithm that more naturally exploits the symmetry of the problem. In our experiments, we show improved performance over the current state of the art.
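To make the uniformization idea concrete, here is a minimal sketch (not the speaker's implementation) of the forward direction: simulating an MJP path by subordinating a discrete-time Markov chain to a homogeneous Poisson process of candidate event times, with self-transitions acting as thinned events. The function name, the choice of dominating rate, and the interfaces are illustrative assumptions.

```python
import numpy as np

def simulate_mjp_uniformization(Q, pi0, T, rng=None):
    """Simulate a Markov jump process on [0, T] via uniformization.

    Q   : (S, S) rate matrix (off-diagonals >= 0, rows sum to zero)
    pi0 : length-S initial state distribution
    T   : time horizon
    Returns (times, states): jump times (starting at 0) and the states entered.
    """
    rng = np.random.default_rng() if rng is None else rng
    S = Q.shape[0]
    # Dominating rate: any omega >= max exit rate works; 1.1x is a common choice.
    omega = 1.1 * np.max(-np.diag(Q))
    # Transition matrix of the subordinated discrete-time chain.
    B = np.eye(S) + Q / omega

    # Candidate event times: a rate-omega homogeneous Poisson process on [0, T].
    n_events = rng.poisson(omega * T)
    candidates = np.sort(rng.uniform(0.0, T, size=n_events))

    state = rng.choice(S, p=pi0)
    times, states = [0.0], [state]
    for t in candidates:
        new_state = rng.choice(S, p=B[state])
        if new_state != state:  # self-loops are thinned (virtual) events
            times.append(t)
            states.append(new_state)
            state = new_state
    return np.array(times), np.array(states)
```

The Gibbs sampler described in the talk runs this construction in reverse: given a trajectory, the thinned candidate times are resampled from a Poisson process with state-dependent rate, and given the resulting random grid, a new trajectory is drawn by standard forward-filtering backward-sampling on the discrete-time chain.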