Invariant Bayesian Inference in Regression Models that is robust against the Jeffreys-Lindley's paradox
| Authors | |
|---|---|
| Publication date | 2003 |
| Series | UvA Econometrics Discussion Paper, 2002/22 |
| Number of pages | 29 |
| Publisher | Amsterdam: Department of Quantitative Economics |
| Organisations | |
| Abstract | We obtain the prior and posterior probability of a nested regression model as the Hausdorff-integral of the prior and posterior on the parameters of an encompassing linear regression model over a lower-dimensional set that represents the nested model. The invariant expression of the Hausdorff-integral results from a uniformly converging limit sequence of encompassing full-dimensional sets. The uniform convergence avoids the Borel-Kolmogorov paradox. Basing priors and prior probabilities of nested regression models on the prior on the parameters of an encompassing linear regression model reduces the discrepancies between classical and Bayesian inference, such as the Jeffreys-Lindley paradox. Depending on the parameter of interest in the encompassing linear regression model, the posterior odds ratio is either fully robust to the Jeffreys-Lindley paradox or only allows for a non-informative prior on the parameters of the encompassing model. We illustrate the analysis with examples of linear restrictions, i.e. a linear regression model, and non-linear restrictions, i.e. a cointegration and an autoregressive moving average model, on the parameters of an encompassing linear regression model. |
| Document type | Working paper |
| Language | English |
| Published at | http://www1.feb.uva.nl/pp/bin/468fulltext.pdf |
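For context on the paradox named in the abstract, a standard textbook illustration (not taken from the paper itself) is the normal-mean testing problem: with sample mean \(\bar{x}\mid\theta \sim N(\theta,\sigma^{2}/n)\), test \(H_0:\theta=0\) against \(H_1:\theta\neq 0\) with a \(N(0,\tau^{2})\) prior on \(\theta\) under \(H_1\). A sketch of the resulting Bayes factor:

```latex
% Standard Jeffreys-Lindley illustration (assumed setup, not from the paper):
% xbar | theta ~ N(theta, sigma^2/n); H0: theta = 0; under H1, theta ~ N(0, tau^2).
% The marginal of xbar is N(0, sigma^2/n) under H0 and N(0, tau^2 + sigma^2/n)
% under H1, so the Bayes factor in favour of H0 is
\[
B_{01}
  = \frac{p(\bar{x}\mid H_0)}{p(\bar{x}\mid H_1)}
  = \sqrt{\frac{\tau^{2}+\sigma^{2}/n}{\sigma^{2}/n}}\,
    \exp\!\left(-\frac{\bar{x}^{2}}{2}
    \left[\frac{n}{\sigma^{2}}-\frac{1}{\tau^{2}+\sigma^{2}/n}\right]\right).
\]
% As the prior becomes non-informative (tau^2 -> infinity) with the data held
% fixed, the square-root factor diverges while the exponential stays bounded,
% so B_01 -> infinity: the posterior odds favour H0 no matter how large the
% classical t-statistic is. This is the discrepancy between classical and
% Bayesian inference that the paper's construction, which derives the nested
% model's prior from the prior on the encompassing model, is designed to avoid.
```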
