How to overcome the Jeffreys-Lindleys Paradox for invariant Bayesian Inference in Regression Models

Authors
Publication date 2001
Series Tinbergen Institute Discussion Paper, TI 2001-073/4
Number of pages 25
Publisher Amsterdam / Rotterdam: Tinbergen Institute
Organisations
  • Faculty of Economics and Business (FEB) - Amsterdam School of Economics Research Institute (ASE-RI)
Abstract
We obtain invariant expressions for prior probabilities and priors on the parameters of nested regression models that are induced by a prior on the parameters of an encompassing linear regression model. The invariance is with respect to specifications that satisfy a necessary set of assumptions. Invariant expressions for posterior probabilities and posteriors are induced in an identical way by the respective posterior. These posterior probabilities imply a posterior odds ratio that is robust to the Jeffreys-Lindley paradox. This results because the prior odds ratio obtained from the induced prior probabilities corrects the Bayes factor for the plausibility of the competing models reflected in the prior. We illustrate the analysis, where we focus on the construction of specifications that satisfy the set of assumptions, with examples of linear restrictions, i.e. a linear regression model, and non-linear restrictions, i.e. a cointegration and ARMA(1,1) model, on the parameters of an encompassing linear regression model.
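The paradox the abstract refers to can be seen in a minimal numerical sketch (this is an illustration of the Jeffreys-Lindley effect itself, not the paper's invariant construction): a point null H0: mu = 0 is tested against H1: mu ~ N(0, tau^2), with the sample mean observed exactly at the 5% significance boundary. All variable names and values below are illustrative assumptions.

```python
import math

def normal_pdf(x, sd):
    """Density of N(0, sd^2) evaluated at x."""
    return math.exp(-0.5 * (x / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

se = 1.0          # standard error of the sample mean (assumed known)
xbar = 1.96 * se  # data "just significant" at the 5% level

def bayes_factor_01(tau):
    # Marginal likelihood of xbar under H0 (mu fixed at 0) divided by
    # that under H1, where xbar ~ N(0, tau^2 + se^2) after mu is
    # integrated out against its N(0, tau^2) prior.
    return normal_pdf(xbar, se) / normal_pdf(xbar, math.sqrt(tau ** 2 + se ** 2))

for tau in (1.0, 10.0, 100.0):
    print(f"tau = {tau:6.1f}  BF01 = {bayes_factor_01(tau):8.3f}")
```

As the prior on mu under H1 becomes more diffuse (tau grows), BF01 increases without bound, so the same "significant" data end up favouring the point null; the induced prior odds ratio described in the abstract is designed to offset exactly this kind of distortion of the Bayes factor.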
Document type Working paper
Language English
Published at http://papers.tinbergen.nl/01073.pdf