Four-Parameter Guessing Model and Related Item Response Models
Open Access
- 17 November 2022
- research article
- Published by MDPI AG in Mathematical and Computational Applications
- Vol. 27 (6), 95
- https://doi.org/10.3390/mca27060095
Abstract
Guessing effects frequently occur in testing data in educational and psychological applications. Different item response models have been proposed to handle guessing effects in dichotomous test items. However, it has been pointed out in the literature that the often-employed three-parameter logistic model imposes implausible assumptions regarding the guessing process. The four-parameter guessing model has been proposed as an alternative that circumvents these conceptual issues. In this article, the four-parameter guessing model is compared with alternative item response models for handling guessing effects through a simulation study and an empirical example. It turns out that model selection for item response models should be based on the AIC rather than the BIC. However, the RMSD item fit statistic, used with typical cutoff values, was found to be ineffective in detecting misspecified item response models. Furthermore, sufficiently large sample sizes are required for precise item parameter estimation. Moreover, it is argued that statistical model fit should not be the sole criterion of model choice. The item response model used in operational practice should be valid with respect to the meaning of the ability variable and the underlying model assumptions. In this sense, the four-parameter guessing model could be the model of choice in educational large-scale assessment studies.
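The abstract contrasts guessing-aware item response models and AIC/BIC-based model selection. As a minimal sketch, the snippet below implements the generic four-parameter logistic (4PL) item response function (lower asymptote for guessing, upper asymptote for slipping) together with the standard AIC and BIC formulas; the exact parameterization of the four-parameter guessing model studied in the article differs, and the parameter names here are illustrative, not taken from the paper.

```python
import math

def irt_4pl(theta, a, b, c, d):
    """Generic four-parameter logistic item response function.

    theta: ability; a: discrimination; b: difficulty;
    c: lower asymptote (guessing); d: upper asymptote (slipping).
    With d = 1 this reduces to the 3PL model, and with
    c = 0, d = 1 to the 2PL model.
    """
    return c + (d - c) / (1.0 + math.exp(-a * (theta - b)))

def aic(log_lik, n_params):
    """Akaike information criterion: -2 log L + 2 k."""
    return -2.0 * log_lik + 2.0 * n_params

def bic(log_lik, n_params, n_obs):
    """Bayesian information criterion: -2 log L + k log n."""
    return -2.0 * log_lik + n_params * math.log(n_obs)
```

At theta = b the 4PL curve passes through the midpoint between the two asymptotes, (c + d) / 2, which is a quick sanity check for any implementation. Because the BIC penalty grows with log n, it favors sparser models than the AIC in large samples, which is where the two criteria can disagree about guessing parameters.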