Abstract
This article extends the basic Bayesian methods based on data priors to regression modelling, including hierarchical (multilevel) models. These methods provide an alternative to the parsimony-oriented approach of frequentist regression analysis. In particular, they replace arbitrary variable-selection criteria by prior distributions, and by doing so facilitate realistic use of imprecise but important prior information. They also allow Bayesian analyses to be conducted using standard regression packages; one need only be able to add variables and records to the data set. The methods thus facilitate the use of Bayesian solutions to problems of sparse data, multiple comparisons, subgroup analyses and study bias. Because these solutions have a frequentist interpretation as ‘shrinkage’ (penalized) estimators, the methods can also be viewed as a means of implementing shrinkage approaches to multiparameter problems.
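The following is a minimal sketch, not the article's own implementation, of the data-prior idea for the simplest case of ordinary linear regression with known error variance: an independent normal prior N(m_j, v_j) on each coefficient can be imposed by appending one pseudo-record per coefficient to the data set and fitting a standard weighted least-squares regression, which reproduces the posterior mean (equivalently, the shrinkage estimator). All variable names and the simulated data are illustrative assumptions, not taken from the article.

```python
# Sketch: normal priors on regression coefficients implemented as "prior data"
# records appended to the data set, then fitted with ordinary weighted least
# squares.  Assumes a linear model with known error variance sigma2; the
# article also covers more general regressions, which are not shown here.
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y = X @ beta_true + noise (purely illustrative)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([0.5, -1.0, 2.0])
sigma2 = 1.0
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Prior: beta_j ~ N(prior_mean_j, prior_var_j), independent across j
prior_mean = np.zeros(p)
prior_var = np.full(p, 0.5)

# Prior data records: one pseudo-record per coefficient, with outcome equal to
# the prior mean, covariate row equal to a unit vector, and weight sigma2 / v_j
X_aug = np.vstack([X, np.eye(p)])
y_aug = np.concatenate([y, prior_mean])
w = np.concatenate([np.ones(n), sigma2 / prior_var])

# Standard weighted least squares on the augmented data set
W = np.diag(w)
beta_aug = np.linalg.solve(X_aug.T @ W @ X_aug, X_aug.T @ W @ y_aug)

# Direct conjugate-normal posterior mean, for comparison:
# (X'X/sigma2 + V^-1)^-1 (X'y/sigma2 + V^-1 m)
V_inv = np.diag(1.0 / prior_var)
beta_post = np.linalg.solve(X.T @ X / sigma2 + V_inv,
                            X.T @ y / sigma2 + V_inv @ prior_mean)

print(np.allclose(beta_aug, beta_post))  # True: the two estimates coincide
```

The final check illustrates the equivalence claimed in the abstract: the "add records to the data set" analysis and the explicit Bayesian (shrinkage) estimator give the same coefficient estimates, so any standard regression package that accepts weights can produce the posterior mode/mean without special Bayesian software.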