Model choice in generalised linear models: a Bayesian approach via Kullback-Leibler projections

Abstract
We propose a general Bayesian method for comparing models. The approach is based on the Kullback-Leibler distance between two families of models, one nested within the other. For each parameter value of the full model, we compute the projection of the model onto the restricted parameter space and the corresponding minimum distance. From the posterior distribution of the minimum distance, we can judge whether or not a more parsimonious model is appropriate. We show how the projection method can be implemented for generalised linear model selection, and we propose some Markov chain Monte Carlo algorithms for its practical implementation in less tractable cases. We illustrate the method with examples.
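As a minimal illustration of the idea described above, the sketch below (not the authors' implementation) projects a full logistic regression model onto a submodel that drops one covariate: for each posterior draw of the full-model parameter, it minimises the Kullback-Leibler distance between the fitted Bernoulli distributions over the submodel's parameters, yielding a posterior sample of the minimum distance. The design matrix, the stand-in "posterior draws", and the helper names (`kl_bernoulli`, `projection_distance`) are all hypothetical; in practice the draws would come from MCMC output for the full model.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical design: intercept plus two covariates for the full logistic model;
# the restricted model drops the second covariate.
n = 50
X_full = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
X_sub = X_full[:, :2]

def kl_bernoulli(p, q):
    """KL distance between Bernoulli(p_i) and Bernoulli(q_i), summed over design points."""
    return np.sum(p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q)))

def projection_distance(theta_full):
    """Minimum KL distance from the full model at theta_full to the submodel."""
    p_full = 1.0 / (1.0 + np.exp(-X_full @ theta_full))

    def objective(beta):
        q = 1.0 / (1.0 + np.exp(-X_sub @ beta))
        return kl_bernoulli(p_full, q)

    # The projection is the submodel parameter minimising the KL distance.
    res = minimize(objective, x0=np.zeros(X_sub.shape[1]), method="BFGS")
    return res.fun

# Stand-in posterior draws of the full-model parameter (in practice, MCMC output).
draws = rng.normal(loc=[0.5, 1.0, 0.1], scale=0.05, size=(200, 3))
dists = np.array([projection_distance(t) for t in draws])

# Posterior summary of the minimum distance: values concentrated near zero
# support the more parsimonious model.
print(np.quantile(dists, [0.05, 0.5, 0.95]))
```

The posterior distribution of `dists` plays the role of the minimum-distance posterior in the abstract: if it concentrates near zero, the restricted model is judged adequate.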