The Deep Weight Prior

Open Access
Authors
  • A. Atanov
  • A. Ashukha
  • K. Struminsky
  • D. Vetrov
Publication date 21-03-2019
Book title ICLR 2019
Book subtitle International Conference on Learning Representations : New Orleans, Louisiana, United States, May 6-May 9, 2019
Event 7th International Conference on Learning Representations
Number of pages 17
Publisher OpenReview
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Bayesian inference is known to provide a general framework for incorporating prior knowledge or specific properties into machine learning models via a carefully chosen prior distribution. In this work, we propose a new type of prior distribution for convolutional neural networks, the deep weight prior, which, in contrast to previously published techniques, favors empirically estimated structure of convolutional filters, e.g., spatial correlations of weights. We define the deep weight prior as an implicit distribution and propose a method for variational inference with this type of implicit prior. In experiments, we show that deep weight priors can improve the performance of Bayesian neural networks on several problems when training data are limited. We also found that initializing the weights of conventional networks with samples from a deep weight prior leads to faster training.
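The initialization idea mentioned at the end of the abstract can be illustrated with a minimal sketch. Here a tiny hand-made generator stands in for the paper's learned filter prior (in the actual method, an implicit distribution given by a decoder trained on kernels of networks pre-trained on source tasks); the decoder weights, latent dimension, and kernel size below are all illustrative assumptions, not the paper's architecture.

```python
import numpy as np

# Hypothetical toy "decoder" standing in for the learned filter prior:
# it maps a latent code z to a 3x3 convolutional kernel. In the paper,
# such a generator is trained on filters of pre-trained networks.
rng = np.random.default_rng(0)
W_dec = rng.normal(scale=0.1, size=(4, 9))  # latent dim 4 -> 3*3 kernel

def sample_kernel():
    z = rng.standard_normal(4)               # z ~ N(0, I)
    return np.tanh(z @ W_dec).reshape(3, 3)  # one sample from the implicit prior

# Initialize a conv layer's weight tensor (out_ch=8, in_ch=3) with prior samples
# instead of a generic random init.
weights = np.stack([[sample_kernel() for _ in range(3)] for _ in range(8)])
print(weights.shape)  # (8, 3, 3, 3)
```

Each kernel is produced by pushing Gaussian noise through the generator, so the resulting filters carry whatever spatial structure the generator encodes, which is the property the deep weight prior exploits.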
Document type Conference contribution
Note Poster presentations.
Language English
Published at
  • https://arxiv.org/abs/1810.06943
  • https://openreview.net/forum?id=ByGuynAct7
Other links https://openreview.net/group?id=ICLR.cc/2019/Conference
Downloads
the_deep_weight_prior (Final published version)