Mucosal IgA and URTI in American College Football Players: A Year-Long Longitudinal Study

Abstract
The purpose of this study was: (a) to evaluate secretory immunoglobulin A (s-IgA) over a 12-month time period in college football players, and (b) to assess which of the commonly used standard methods of reporting s-IgA, either alone or in combination, serves as the best predictor of incidence of upper respiratory tract infection (URTI). One hundred college-aged males (75 varsity college football athletes, 25 nonfootball controls) were studied at eight points over a 12-month period. Resting mucosal IgA, protein and osmolality levels were determined from saliva using established procedures. In addition, incidence of URTI over the 12-month study duration was calculated from completed standard research logs. Repeated-measures ANOVA were conducted on the dependent variables and eight separate stepwise multiple regression analyses were conducted to predict the dependent variable “number of colds” by the independent variables, s-IgA, saliva flow rate, secretion rate of s-IgA, protein, s-IgA:protein, osmolality and s-IgA:osmolality at each data collection point. There was a significant main effect for group, time, and the group × time interaction for s-IgA, the secretion rate of s-IgA, and the number of colds. In the regression model, the only variable that made a significant contribution to the variance at all time points was the secretion rate of s-IgA. These findings suggest that a season of training in American football results in a significant decrease in both s-IgA and the secretion rate of s-IgA as well as an increase in the incidence of URTI. Among the various methods commonly employed to express s- IgA levels, the secretion rate of s-IgA may be the most useful clinical biomarker to predict the incidence of URTI.