Assessing Complex Problem Solving Performances

Abstract
Computer-based simulations can provide a more nuanced understanding of what students know and can do than traditional testing methods. These extended, integrated tasks, however, introduce particular problems, including an overwhelming volume of data, multidimensionality, and local dependence. In this paper, we describe an approach to understanding the data from complex performances based on Evidence-Centred Design (ECD), a methodology for devising assessments and for using the evidence observed in complex student performances to make inferences about proficiency. As an illustration, we use the National Assessment of Educational Progress (NAEP) Problem Solving in Technology-Rich Environments Study, which is being conducted to exemplify how non-traditional skills might be assessed in a sample-based national survey. The paper focuses on the inferential uses of Evidence-Centred Design, especially how features are extracted from student performance, how these extractions are evaluated, and how the evaluations are accumulated to make evaluative judgements.