Abstract
Maximum likelihood (ML) inference for the class of homogeneous linear predictor (HLP) models for contingency tables is described. HLP models constrain the expected table counts m through L(m) = Xβ, where the link L is allowed to be a many-to-one, nonlinear function. Generalized log-linear, association trend, marginal cumulative probit, and conditional marginal homogeneity models are given as specific examples. ML fit results, which include point estimates, goodness-of-fit statistics, and asymptotically based approximate distributions, are described and compared for equivalent HLP models. The results are valid for a wide variety of sampling plans, including combinations of product multinomial and Poisson sampling. An important practical implication of this article is that ML fitting and the accompanying theory are straightforward to implement for HLP models and provide an attractive alternative to weighted least squares estimation.
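As a hedged illustration (the matrices C and A and the 2×2 example below are standard notation for generalized log-linear models, not taken from this abstract), one familiar instance of the HLP form L(m) = Xβ uses a many-to-one, nonlinear link built from a log and two fixed matrices:

```latex
% Illustrative sketch: the generalized log-linear model as an HLP instance.
% The matrices A, C and the 2x2 example are assumptions for exposition.
\[
  L(m) \;=\; C \log(A m) \;=\; X\beta ,
\]
% where A and C are fixed, known matrices. For a 2x2 table with expected
% counts m = (m_{11}, m_{12}, m_{21}, m_{22})^\top, the marginal homogeneity
% constraint m_{1+} = m_{+1} corresponds to C \log(A m) = 0 with
\[
  A = \begin{pmatrix} 1 & 1 & 0 & 0 \\ 1 & 0 & 1 & 0 \end{pmatrix},
  \qquad
  C = \begin{pmatrix} 1 & -1 \end{pmatrix},
\]
% since then C \log(A m) = \log m_{1+} - \log m_{+1}, which vanishes
% exactly when the first-row and first-column margins agree.
```

Here L is many-to-one (tables with equal margins but different cell counts map to the same value) and nonlinear in m, matching the abstract's description of the link.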