- How to overcome the Jeffreys-Lindleys Paradox for invariant Bayesian Inference in Regression Models
- Number of pages
- Published: Amsterdam / Rotterdam: Tinbergen Institute
- Series: Tinbergen Institute Discussion Paper, TI 2001-073/4
- Document type: Working paper
- Faculty of Economics and Business (FEB)
- Amsterdam School of Economics Research Institute (ASE-RI)
We obtain invariant expressions for prior probabilities and priors on the parameters of nested regression models that are induced by a prior on the parameters of an encompassing linear regression model. The invariance holds with respect to specifications that satisfy a necessary set of assumptions. Invariant expressions for posterior probabilities and posteriors are induced in an identical way by the respective posterior. These posterior probabilities imply a posterior odds ratio that is robust to the Jeffreys-Lindley paradox, because the prior odds ratio obtained from the induced prior probabilities corrects the Bayes factor for the plausibility of the competing models reflected in the prior. We illustrate the analysis, focusing on the construction of specifications that satisfy the set of assumptions, with examples of linear restrictions, i.e. a linear regression model, and non-linear restrictions, i.e. a cointegration and an ARMA(1,1) model, on the parameters of an encompassing linear regression model.
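The Jeffreys-Lindley paradox referred to in the abstract can be sketched in a textbook setting (this is an illustrative example of the paradox itself, not the paper's invariant construction): testing a point null H0: theta = 0 against H1 with a diffuse N(0, tau^2) prior on theta, the Bayes factor in favour of H0 grows without bound as tau increases, even for data that a frequentist test would call significant. The numerical values below are hypothetical.

```python
# Illustrative sketch of the Jeffreys-Lindley paradox (not the paper's method).
# Model: sample mean xbar ~ N(theta, sigma^2 / n).
# H0: theta = 0;  H1: theta ~ N(0, tau^2), so xbar ~ N(0, tau^2 + sigma^2 / n).
import math

def normal_pdf(x, var):
    """Density of N(0, var) evaluated at x."""
    return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def bayes_factor_01(xbar, sigma, n, tau):
    """BF01 = p(xbar | H0) / p(xbar | H1): marginal likelihood ratio."""
    s2 = sigma ** 2 / n  # sampling variance of the mean
    return normal_pdf(xbar, s2) / normal_pdf(xbar, s2 + tau ** 2)

# Hypothetical data: a sample mean 2.5 standard errors from zero, which a
# frequentist z-test rejects at conventional levels.  Yet the Bayes factor
# swings toward H0 as the prior under H1 becomes more diffuse (larger tau).
xbar, sigma, n = 0.25, 1.0, 100
for tau in (0.1, 1.0, 10.0, 100.0):
    print(f"tau = {tau:6.1f}   BF01 = {bayes_factor_01(xbar, sigma, n, tau):.3f}")
```

As tau grows, BF01 increases roughly in proportion to tau, so an arbitrarily vague prior under the alternative makes the null look arbitrarily well supported. The abstract's point is that an induced prior odds ratio can correct the posterior odds for exactly this prior-dependence of the Bayes factor.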