Abstract
July 2, 10:50–11:40
F. Komaki
- Information Geometry of Statistical Prediction
Bayesian predictive distributions are investigated from the viewpoint of information geometry. The Kullback-Leibler divergence from the true distribution to a predictive distribution is adopted as the loss function.
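In the standard setting for this problem (the notation below is not fixed in the abstract itself and is supplied for illustration): data x are observed from a model p(·|θ), and a predictive distribution q(·|x) for a future observation y is evaluated by the Kullback-Leibler loss, with the Bayesian predictive distribution obtained by averaging the model over the posterior.

```latex
% KL loss from the true distribution to a predictive distribution q(.|x):
D\bigl(p(\cdot \mid \theta),\, q(\cdot \mid x)\bigr)
  = \int p(y \mid \theta)\,
    \log \frac{p(y \mid \theta)}{q(y \mid x)} \, \mathrm{d}y .

% Bayesian predictive distribution based on a prior \pi(\theta):
p_\pi(y \mid x)
  = \int p(y \mid \theta)\, \pi(\theta \mid x)\, \mathrm{d}\theta,
\qquad
\pi(\theta \mid x) \propto p(x \mid \theta)\, \pi(\theta) .
```

A prior π is said to dominate another prior π′ when the risk (the expectation of the loss above over x) of p_π is no larger than that of p_{π′} for every θ, with strict inequality for some θ.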
We show that there are many examples where the Bayesian predictive distribution based on the Jeffreys prior is dominated by Bayesian predictive distributions based on other priors.
It is shown that the Bayesian predictive distribution based on the right invariant measure is the best invariant predictive distribution when the model has a group structure. Furthermore, we show that there exist shrinkage predictive distributions asymptotically dominating Bayesian predictive distributions based on the Jeffreys prior or other vague priors if the model manifold satisfies certain differential-geometric conditions.
We present several examples where shrinkage predictive distributions exactly dominate Bayesian predictive distributions based on vague priors.