A meta-analysis of test format effects on reading and listening test performance: Focus on multiple-choice and open-ended formats
- 1 April 2009
- journal article
- review article
- Published by SAGE Publications in Language Testing
- Vol. 26 (2), 219-244
- https://doi.org/10.1177/0265532208101006
Abstract
A meta-analysis was conducted on the effects of multiple-choice and open-ended formats on L1 reading, L2 reading, and L2 listening test performance. Fifty-six data sources located in an extensive search of the literature were the basis for the estimates of the mean effect sizes of test format effects. The results using the mixed effects model of meta-analysis indicate that multiple-choice formats are easier than open-ended formats in L1 reading and L2 listening, with the degree of format effect ranging from small to large in L1 reading and medium to large in L2 listening. Overall, format effects in L2 reading are not found, although multiple-choice formats are found to be easier than open-ended formats when any one of the following four conditions is met: the studies involve between-subjects designs, random assignment, stem-equivalent items, or learners with a high L2 proficiency level. Format effects favoring multiple-choice formats across the three domains are consistently observed when studies employ between-subjects designs, random assignment, or stem-equivalent items.