Wasserstein Variational Inference

Open Access
Authors
  • L. Ambrogioni
  • U. Güçlü
  • Y. Güçlütürk
  • M. Hinne
  • M. van Gerven
  • E. Maris
Publication date 2019
Host editors
  • S. Bengio
  • H. Wallach
  • H. Larochelle
  • K. Grauman
  • N. Cesa-Bianchi
  • R. Garnett
Book title 32nd Conference on Neural Information Processing Systems 2018
Book subtitle Montreal, Canada, 3-8 December 2018
ISBN
  • 9781510884472
Series Advances in Neural Information Processing Systems
Event Advances in Neural Information Processing Systems 2018
Volume | Issue number 4
Pages (from-to) 2473-2482
Publisher La Jolla, CA: Neural Information Processing Systems Foundation
Organisations
  • Faculty of Social and Behavioural Sciences (FMG) - Psychology Research Institute (PsyRes)
Abstract
This paper introduces Wasserstein variational inference, a new form of approximate Bayesian inference based on optimal transport theory. Wasserstein variational inference uses a new family of divergences that includes both f-divergences and the Wasserstein distance as special cases. The gradients of the Wasserstein variational loss are obtained by backpropagating through the Sinkhorn iterations. This technique results in a very stable likelihood-free training method that can be used with implicit distributions and probabilistic programs. Using the Wasserstein variational inference framework, we introduce several new forms of autoencoders and test their robustness and performance against existing variational autoencoding techniques.
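The abstract's central computational idea — obtaining gradients by backpropagating through the Sinkhorn iterations — can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: it is a plain NumPy forward pass of the standard Sinkhorn fixed-point updates for entropic-regularized optimal transport, with an illustrative toy cost matrix; the function name and parameters are chosen for this example. Because every operation is smooth, an autodiff framework could differentiate a loss through these iterations, which is the mechanism the abstract refers to.

```python
import numpy as np

def sinkhorn_plan(a, b, C, eps=0.1, n_iters=200):
    """Entropic-regularized optimal transport via Sinkhorn iterations.

    a, b : source and target probability vectors
    C    : pairwise cost matrix
    eps  : entropic regularization strength (illustrative default)

    Every update below is a smooth function of its inputs, so in an
    autodiff framework the gradient of a downstream loss can be
    backpropagated through the loop; here we only run the forward pass.
    """
    K = np.exp(-C / eps)          # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        v = b / (K.T @ u)         # rescale to match column marginals
        u = a / (K @ v)           # rescale to match row marginals
    # Entropic-regularized transport plan.
    return u[:, None] * K * v[None, :]

# Toy example: two uniform distributions on a 1-D grid.
x = np.linspace(0.0, 1.0, 5)
a = np.full(5, 0.2)
b = np.full(5, 0.2)
C = (x[:, None] - x[None, :]) ** 2   # squared-distance cost

P = sinkhorn_plan(a, b, C)
cost = float(np.sum(P * C))          # approximate transport cost
```

Since the two toy marginals are identical, the resulting transport cost is close to zero (not exactly zero, because the entropic term spreads a little mass off the diagonal).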
Document type Conference contribution
Language English
Published at https://papers.nips.cc/paper/7514-wasserstein-variational-inference
Other links http://www.proceedings.com/48413.html
Downloads
7514-wasserstein-variational-inference (Accepted author manuscript)