Using DDF in a Post Hoc Analysis to Understand Sources of DIF
- 30 April 2009
- journal article
- research article
- Published by Taylor & Francis Ltd in Educational Assessment
- Vol. 14 (2), 103-118
- https://doi.org/10.1080/10627190903035229
Abstract
The purpose of this article is to describe and demonstrate a three-step process of using differential distractor functioning (DDF) in a post hoc analysis to understand sources of differential item functioning (DIF) in multiple-choice testing. The process is demonstrated on two multiple-choice tests that used complex alternatives (e.g., “No Mistakes”) as distractors. Comparisons were made between different gender and race groups. DIF analyses were conducted using the Simultaneous Item Bias Test (SIBTEST), whereas DDF analyses were conducted using log-linear model fitting and odds ratios. Five items made it through all three steps and were identified as those with DIF results related to DDF. Implications of the results, as well as suggestions for future research, are discussed.
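The abstract mentions odds ratios as one of the DDF tools. As a minimal sketch of the general idea (not the authors' actual procedure, whose details are in the article itself), a DDF odds ratio for one distractor can be computed from a 2×2 table of group membership by distractor choice among examinees who answered the item incorrectly; the counts and function name below are illustrative assumptions.

```python
import math

def ddf_odds_ratio(ref_chose, ref_other, foc_chose, foc_other):
    """Odds ratio for selecting a given distractor, reference vs. focal group.

    Inputs are counts among incorrect responders:
      ref_chose / ref_other: reference-group examinees who did / did not
                             select the distractor of interest
      foc_chose / foc_other: same counts for the focal group
    A 0.5 continuity correction is added to every cell (a common convention
    when cells may be zero), so the estimate is always finite.
    """
    a, b = ref_chose + 0.5, ref_other + 0.5
    c, d = foc_chose + 0.5, foc_other + 0.5
    return (a * d) / (b * c)

# Hypothetical counts: of the incorrect responders, 40 of 100 reference-group
# and 20 of 100 focal-group examinees picked the "No Mistakes" alternative.
or_hat = ddf_odds_ratio(40, 60, 20, 80)
log_or = math.log(or_hat)  # log odds ratio; 0 indicates no DDF for this distractor
```

An odds ratio far from 1 (equivalently, a log odds ratio far from 0) for a particular distractor flags it as functioning differently across groups, which is the kind of evidence the three-step process uses to explain a DIF result.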