- Invariant Bayesian Inference in Regression Models that is robust against the Jeffreys-Lindley's paradox
- Amsterdam: Department of Quantitative Economics
- UvA Econometrics Discussion Paper
- Document type: Working paper
- Faculty of Economics and Business (FEB)
- Amsterdam School of Economics Research Institute (ASE-RI)
We obtain the prior and posterior probability of a nested regression model as the Hausdorff integral of the prior and posterior on the parameters of an encompassing linear regression model over a lower-dimensional set that represents the nested model. The invariant expression of the Hausdorff integral results from a uniformly converging limit sequence of encompassing full-dimensional sets; the uniform convergence avoids the Borel-Kolmogorov paradox. Basing the priors and prior probabilities of nested regression models on the prior on the parameters of an encompassing linear regression model reduces the discrepancies between classical and Bayesian inference, such as the Jeffreys-Lindley paradox. Depending on the parameter of interest in the encompassing linear regression model, the posterior odds ratio is either fully robust to the Jeffreys-Lindley paradox or allows only for a non-informative prior on the parameters of the encompassing model. We illustrate the analysis with examples of linear restrictions, i.e. a linear regression model, and non-linear restrictions, i.e. a cointegration and an autoregressive moving average model, on the parameters of an encompassing linear regression model.
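As a minimal numerical sketch of the Jeffreys-Lindley paradox the abstract refers to (not taken from the paper itself, and using the textbook normal-mean setting rather than the paper's nested-regression setup): test H0: mu = 0 against H1: mu ~ N(0, tau^2) from a sample mean xbar with known variance sigma^2/n. Holding the classical z-statistic fixed at 1.96 (a p-value of about 0.05, so classical inference keeps rejecting H0), the Bayes factor in favour of H0 grows without bound as n increases. All parameter values below are illustrative assumptions.

```python
import math

def bayes_factor_null(n, z=1.96, sigma=1.0, tau=1.0):
    """Bayes factor B01 for H0: mu = 0 vs H1: mu ~ N(0, tau^2),
    given a sample mean whose z-statistic is held fixed at z.

    Under H0, xbar ~ N(0, sigma^2/n); under H1 (marginally),
    xbar ~ N(0, tau^2 + sigma^2/n). B01 is the ratio of these
    two densities evaluated at the observed xbar.
    """
    xbar = z * sigma / math.sqrt(n)   # observation with fixed z-score
    v0 = sigma**2 / n                 # variance of xbar under H0
    v1 = tau**2 + sigma**2 / n        # marginal variance of xbar under H1

    def normal_pdf(x, var):
        return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

    return normal_pdf(xbar, v0) / normal_pdf(xbar, v1)

# Same p-value (~0.05) at every n, yet B01 increasingly favours H0:
for n in (10, 1_000, 100_000):
    print(n, bayes_factor_null(n))
```

Because B01 grows roughly like sqrt(n) here, a fixed-variance prior under H1 makes the posterior odds favour the null arbitrarily strongly at a sample size where classical inference still rejects it, which is the discrepancy the paper's encompassing-prior construction is designed to reduce.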