- Optimization Monte Carlo: Efficient and Embarrassingly Parallel Likelihood-Free Inference
- Advances in Neural Information Processing Systems
- Faculty of Science (FNWI)
- Informatics Institute (IVI)
We describe an embarrassingly parallel, anytime Monte Carlo method for likelihood-free models. The algorithm starts with the view that the stochasticity of the pseudo-samples generated by the simulator can be controlled externally by a vector of random numbers u, in such a way that the outcome, knowing u, is deterministic. For each instantiation of u we run an optimization procedure to minimize the distance between summary statistics of the simulator and the data. After reweighting these samples using the prior and the Jacobian (accounting for the change of volume in transforming from the space of summary statistics to the space of parameters) we show that this weighted ensemble represents a Monte Carlo estimate of the posterior distribution. The procedure can be run in an embarrassingly parallel fashion (each node handling one sample) and anytime (by allocating resources to the worst-performing sample). The procedure is validated on six experiments.
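The loop described in the abstract (fix u, optimize the parameter to match the data, then weight by prior over Jacobian) can be sketched on a toy problem. Everything below is an illustrative assumption, not code from the paper: the simulator `simulate`, the Gaussian prior, and the observed value `y_obs` are all hypothetical choices made so the sketch is self-contained and the Jacobian is trivially 1.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical toy simulator: with the noise u drawn in advance,
# the output is a deterministic function of (theta, u).
def simulate(theta, u):
    return theta + u  # the output itself serves as the summary statistic

y_obs = 1.5  # assumed observed summary statistic

def prior(theta):
    # Standard normal prior density (an assumption for this sketch).
    return np.exp(-0.5 * theta**2) / np.sqrt(2.0 * np.pi)

samples, weights = [], []
for _ in range(400):
    u = rng.normal()  # externally controlled randomness
    # Optimize theta so the now-deterministic simulator matches the data.
    res = minimize(lambda th: (simulate(th[0], u) - y_obs) ** 2, x0=[0.0])
    theta_star = res.x[0]
    # Jacobian of the summary statistic w.r.t. theta is 1 here,
    # so the weight reduces to the prior density at the optimum.
    jac = 1.0
    samples.append(theta_star)
    weights.append(prior(theta_star) / abs(jac))

weights = np.array(weights) / np.sum(weights)
posterior_mean = float(np.dot(weights, samples))
```

Because each iteration of the loop depends only on its own u, the iterations can be distributed across nodes with no communication, which is the "embarrassingly parallel" property claimed in the abstract.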
- Proceedings title: 29th Annual Conference on Neural Information Processing Systems 2015: Montreal, Canada, 7-12 December 2015.
- Volume: 3
- Publisher: Curran Associates
- Place of publication: Red Hook, NY
- Editors: C. Cortes, N.D. Lawrence, D.D. Lee, M. Sugiyama, R. Garnett