Abstract
Saddlepoint conditions on a predictor are introduced and developed to reconfirm the need to assume a prior distribution when constructing a useful inferential procedure. One such condition implies that the predictor induced from the maximum likelihood estimator is the worst under a given loss, while the predictor induced from a suitable posterior mean is the best. This result points to the promising role of Bayesian criteria, such as the deviance information criterion (DIC). As an implication, we critique the conventional empirical Bayes method because it assumes a prior distribution only partially.
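The following is an illustrative sketch, not taken from the paper: it contrasts a plug-in predictor built from the maximum likelihood estimate with a Bayesian predictive density built from the posterior mean, under Kullback-Leibler loss, in a toy normal model. The model (normal with known variance), the conjugate normal prior, the loss, and all parameter values are assumptions chosen purely for illustration.

```python
# Hedged illustration (assumed toy setup, not the paper's construction):
# average KL risk of the plug-in (MLE) predictor vs. a Bayesian predictive
# density in a normal model with known variance and a conjugate normal prior.
import numpy as np

rng = np.random.default_rng(0)
sigma2, tau2, n, reps = 1.0, 4.0, 5, 20000  # sampling variance, prior variance, sample size, replications

def kl_normal(mu1, var1, mu2, var2):
    """Closed-form KL divergence KL(N(mu1, var1) || N(mu2, var2))."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

risk_plugin, risk_bayes = 0.0, 0.0
for _ in range(reps):
    theta = rng.normal(0.0, np.sqrt(tau2))          # "true" mean drawn from the prior
    x = rng.normal(theta, np.sqrt(sigma2), size=n)  # observed sample
    mle = x.mean()                                  # maximum likelihood estimate
    post_var = 1.0 / (n / sigma2 + 1.0 / tau2)      # normal-normal posterior variance
    post_mean = post_var * (x.sum() / sigma2)       # posterior mean (prior mean 0)
    # Plug-in predictor: N(mle, sigma2); Bayesian predictive: N(post_mean, sigma2 + post_var)
    risk_plugin += kl_normal(theta, sigma2, mle, sigma2)
    risk_bayes += kl_normal(theta, sigma2, post_mean, sigma2 + post_var)

print("average KL risk, plug-in (MLE) predictor:", risk_plugin / reps)
print("average KL risk, Bayesian predictive    :", risk_bayes / reps)
```

In this assumed setup, the Bayesian predictive typically attains the lower average risk, which is in the spirit of the abstract's comparison between the MLE-induced predictor and the posterior-mean-induced predictor.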
Original language | English |
---|---|
Pages (from-to) | 1990-2000 |
Number of pages | 11 |
Journal | Journal of Statistical Planning and Inference |
Volume | 141 |
Issue number | 5 |
DOIs | |
Publication status | Published - May 2011 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Statistics, Probability and Uncertainty
- Applied Mathematics