Predictive Complexity Priors
| Authors | |
|---|---|
| Publication date | 2021 |
| Journal | Proceedings of Machine Learning Research |
| Event | 2021 International Conference on Artificial Intelligence and Statistics |
| Volume | 130 |
| Issue number | |
| Pages (from-to) | 694-702 |
| Organisations | |
| Abstract | Specifying a Bayesian prior is notoriously difficult for complex models such as neural networks. Reasoning about parameters is made challenging by the high-dimensionality and over-parameterization of the space. Priors that seem benign and uninformative can have unintuitive and detrimental effects on a model's predictions. For this reason, we propose predictive complexity priors: a functional prior that is defined by comparing the model's predictions to those of a reference model. Although originally defined on the model outputs, we transfer the prior to the model parameters via a change of variables. The traditional Bayesian workflow can then proceed as usual. We apply our predictive complexity prior to high-dimensional regression, reasoning over neural network depth, and sharing of statistical strength for few-shot learning. |
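The abstract's core mechanism can be illustrated in miniature: place a prior on a scalar "complexity" measuring how far a model's predictions stray from a reference model's, then push that prior back onto the parameters via a change of variables. The sketch below is an illustrative assumption on my part, not the paper's actual construction: a one-parameter linear model, a zero reference function, mean-squared predictive difference as the complexity measure, and an exponential prior on it.

```python
import numpy as np

# Hypothetical sketch of a predictive-complexity-style prior.
# The model, reference, complexity measure, and exponential prior
# are all illustrative choices, not the construction from the paper.

rate = 1.0                          # rate of the exponential prior on complexity
xs = np.linspace(-1.0, 1.0, 50)     # inputs on which predictions are compared

def model(theta, x):
    return theta * x                # toy model f_theta(x) = theta * x

def reference(x):
    return np.zeros_like(x)         # reference model g(x) = 0

def complexity(theta):
    # Divergence between model and reference predictions:
    # here, the mean squared difference over the inputs.
    return np.mean((model(theta, xs) - reference(xs)) ** 2)

def prior_on_complexity(c):
    return rate * np.exp(-rate * c)  # Exponential(rate) density on c

def prior_on_theta(theta):
    # Change of variables: p(theta) = p_c(c(theta)) * |dc/dtheta|.
    # Here c(theta) = a * theta^2 with a = mean(xs^2), which is
    # monotone on theta >= 0, so the transform is one-to-one there.
    a = np.mean(xs ** 2)
    jacobian = abs(2.0 * a * theta)
    return prior_on_complexity(complexity(theta)) * jacobian

# Sanity check: the induced density on theta >= 0 integrates to ~1.
grid = np.linspace(0.0, 20.0, 20001)
dens = np.array([prior_on_theta(t) for t in grid])
mass = np.sum(0.5 * (dens[:-1] + dens[1:])) * (grid[1] - grid[0])
print(round(mass, 3))  # close to 1.0
```

For a neural network the complexity map is many-to-one and high-dimensional, so the change of variables is far less direct than in this scalar case; the sketch only shows the direction of the transfer, from a prior on predictive behaviour to a prior on parameters.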
| Document type | Article |
| Note | International Conference on Artificial Intelligence and Statistics, 13-15 April 2021, Virtual. - With supplementary file. |
| Language | English |
| Published at | http://proceedings.mlr.press/v130/nalisnick21a.html |
| Downloads | nalisnick21a (Final published version) |
| Supplementary materials | |