Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables

Open Access
Publication date 2020
Journal Proceedings of Machine Learning Research
Event The 37th International Conference on Machine Learning (ICML 2020)
Volume 119
Pages 10018-10028
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Neural processes (NPs) constitute a family of variational approximate models for stochastic processes with promising properties in computational efficiency and uncertainty quantification. These processes use neural networks with latent variable inputs to induce a predictive distribution. However, the expressiveness of vanilla NPs is limited because they use only a global latent variable, while target-specific local variation can be crucial. To address this challenge, we investigate NPs systematically and present a new NP variant that we call the Doubly Stochastic Variational Neural Process (DSVNP). This model combines a global latent variable with local latent variables for prediction. We evaluate the model in several experiments, and our results demonstrate competitive prediction performance in multi-output regression and uncertainty estimation in classification.
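The combination described in the abstract can be illustrated with a toy sketch: a context set is encoded and pooled into a global latent variable, a second latent variable is sampled per target input, and both are fed to a decoder. This is a minimal illustration of the "doubly stochastic" idea only, not the paper's architecture; all layer sizes, the random stand-in weights, and the function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(d_in, d_out):
    # Random weight/bias pair standing in for trained parameters (hypothetical).
    return rng.normal(scale=0.3, size=(d_in, d_out)), np.zeros(d_out)

# Hypothetical dimensions, chosen for illustration only.
d_x, d_y, d_h, d_z = 1, 1, 8, 4

W_enc, b_enc = dense(d_x + d_y, d_h)       # encodes each (x, y) context pair
W_glob, b_glob = dense(d_h, 2 * d_z)       # parameters of the global latent
W_loc, b_loc = dense(d_x + d_h, 2 * d_z)   # parameters of each local latent
W_dec, b_dec = dense(d_x + 2 * d_z, d_y)   # decoder to the prediction

def sample_gaussian(params):
    # Split [mu, log_sigma] and draw one reparameterised sample.
    mu, log_sigma = np.split(params, 2, axis=-1)
    return mu + np.exp(log_sigma) * rng.standard_normal(mu.shape)

def dsvnp_predict(x_ctx, y_ctx, x_tgt):
    # 1. Encode context pairs and mean-pool into a set representation.
    r = np.tanh(np.concatenate([x_ctx, y_ctx], axis=-1) @ W_enc + b_enc)
    r_bar = r.mean(axis=0)
    # 2. First stochastic layer: a global latent shared by all targets.
    z_global = sample_gaussian(r_bar @ W_glob + b_glob)
    # 3. Second stochastic layer: one local latent per target input.
    h = np.concatenate([x_tgt, np.tile(r_bar, (len(x_tgt), 1))], axis=-1)
    z_local = sample_gaussian(np.tanh(h @ W_loc + b_loc))
    # 4. Decode the target input together with both latents.
    z_g = np.tile(z_global, (len(x_tgt), 1))
    return np.concatenate([x_tgt, z_g, z_local], axis=-1) @ W_dec + b_dec

x_ctx = rng.uniform(-2, 2, size=(10, d_x))
y_ctx = np.sin(x_ctx)
x_tgt = np.linspace(-2, 2, 5).reshape(-1, d_x)
y_pred = dsvnp_predict(x_ctx, y_ctx, x_tgt)
print(y_pred.shape)  # (5, 1)
```

Repeating the two sampling steps yields different predictions per draw, which is what gives NP-style models their predictive uncertainty.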
Document type Article
Note International Conference on Machine Learning, 13-18 July 2020, Virtual. - With supplementary file.
Language English
Published at http://proceedings.mlr.press/v119/wang20s.html (preprint: https://arxiv.org/abs/2008.09469)