Optimization Monte Carlo: Efficient and Embarrassingly Parallel Likelihood-Free Inference

Open Access
Authors
Publication date 2015
Host editors
  • C. Cortes
  • N.D. Lawrence
  • D.D. Lee
  • M. Sugiyama
  • R. Garnett
Book title 29th Annual Conference on Neural Information Processing Systems 2015
Book subtitle Montreal, Canada, 7-12 December 2015
ISBN
  • 9781510825024
Series Advances in Neural Information Processing Systems
Event Neural Information Processing Systems (NIPS2015)
Volume | Issue number 3
Pages (from-to) 2080-2088
Publisher Red Hook, NY: Curran Associates
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
We describe an embarrassingly parallel, anytime Monte Carlo method for likelihood-free models. The algorithm starts from the view that the stochasticity of the pseudo-samples generated by the simulator can be controlled externally by a vector of random numbers u, such that the outcome, given u, is deterministic. For each instantiation of u we run an optimization procedure to minimize the distance between the summary statistics of the simulator and those of the data. After reweighting these samples using the prior and the Jacobian (which accounts for the change of volume in transforming from the space of summary statistics to the space of parameters), we show that this weighted ensemble represents a Monte Carlo estimate of the posterior distribution. The procedure can be run in an embarrassingly parallel fashion (each node handling one sample) and anytime (by allocating resources to the worst-performing sample). The procedure is validated on six experiments.
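The abstract's recipe — fix the simulator's random numbers u, optimize the parameter to match the observed summary statistic, then weight each solution by prior over |Jacobian| — can be sketched on a toy one-dimensional model. Everything below is a hypothetical illustration, not the paper's code: the simulator exp(theta) + noise, the Gaussian prior, and the bisection optimizer are all assumptions chosen so the Jacobian is nontrivial and the optimization is exact.

```python
import math
import random

def simulator(theta, u):
    """Deterministic toy simulator: given theta and externally supplied
    random numbers u (standard normals), return a summary statistic.
    This nonlinear form is an assumption made for illustration."""
    return math.exp(theta) + 0.5 * sum(u) / len(u)

def log_prior(theta):
    # Hypothetical Gaussian prior N(0, 2^2) on theta.
    return -0.5 * (theta / 2.0) ** 2

def solve_theta(s_obs, u, lo=-10.0, hi=10.0, iters=60):
    """Minimize |simulator(theta, u) - s_obs| over theta by bisection;
    the toy simulator is monotone in theta, so the minimizer is the root."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if simulator(mid, u) < s_obs:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def omc_posterior_mean(s_obs, n_particles=200, n_noise=20, seed=0):
    """One independent (u, optimize, weight) computation per particle;
    each iteration could run on its own node (embarrassingly parallel)."""
    rng = random.Random(seed)
    thetas, weights = [], []
    for _ in range(n_particles):
        u = [rng.gauss(0.0, 1.0) for _ in range(n_noise)]
        theta = solve_theta(s_obs, u)
        jac = math.exp(theta)  # d simulator / d theta for this toy model
        w = math.exp(log_prior(theta)) / abs(jac)  # prior / |Jacobian|
        thetas.append(theta)
        weights.append(w)
    total = sum(weights)
    return sum(w * t for w, t in zip(weights, thetas)) / total

# Observed summary generated at a "true" theta* = 1.0 with a fixed u.
obs_rng = random.Random(42)
u_obs = [obs_rng.gauss(0.0, 1.0) for _ in range(20)]
s_obs = simulator(1.0, u_obs)
print(omc_posterior_mean(s_obs))
```

Because this toy simulator is invertible in theta, the optimization recovers each particle's parameter exactly, and the weighted ensemble concentrates near the generating value; in the paper's general setting the optimizer is iterative and the Jacobian is of the full summary-statistics map.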
Document type Conference contribution
Language English
Published at http://papers.nips.cc/paper/5881-optimization-monte-carlo-efficient-and-embarrassingly-parallel-likelihood-free-inference