Book contents
- Frontmatter
- Contents
- List of Contributors
- PART I THE BASIS OF COGNITIVE DIAGNOSTIC ASSESSMENT
- PART II PRINCIPLES OF TEST DESIGN AND ANALYSIS
- PART III PSYCHOMETRIC PROCEDURES AND APPLICATIONS
- 8 Cognitive Foundations of Structured Item Response Models
- 9 Using the Attribute Hierarchy Method to Make Diagnostic Inferences About Examinees' Cognitive Skills
- 10 The Fusion Model Skills Diagnosis System
- 11 Using Information from Multiple-Choice Distractors to Enhance Cognitive-Diagnostic Score Reporting
- 12 Directions for Future Research in Cognitive Diagnostic Assessment
- Author Index
- Subject Index
8 - Cognitive Foundations of Structured Item Response Models
Published online by Cambridge University Press: 23 November 2009
Summary
A construct-centered approach [to assessment design] would begin by asking what complex of knowledge, skills, or other attributes should be assessed, presumably because they are tied to explicit or implicit objectives of instruction or are otherwise valued by society. Next, what behaviours or performances should reveal those constructs, and what tasks or situations should elicit those behaviours?
Messick (1994, p. 16)

INTRODUCTION
The quotation from Messick (1994) that opens this chapter eloquently lays out the essential narrative of educational assessment. Specifically, it captures how the design of tasks should be guided by the structure of the argument about learner competencies that one seeks to develop and eventually support with data from behaviors that activate these competencies. Moreover, the quote alludes to the fact that reasoning about learner competencies is a complex communicative act, which, like all communicative acts, requires the thoughtful integration of distinct pieces of information to construct a coherent, concise, and rhetorically effective argument.
Reasoning about learner competencies in educational assessment is both probabilistic and evidence based. It is probabilistic because it is concerned with developing an argument about one or more unobservable latent characteristics of learners. It is evidence based because the development of the argument must rely on data derived from aspects of examinees' observable behavior. However, such data provide only indirect information about latent characteristics, so the process of reasoning about learner competencies requires an agreed-upon framework specifying which data patterns constitute relevant evidence, and in what manner (Schum, 1994).
- Type: Chapter
- Information: Cognitive Diagnostic Assessment for Education: Theory and Applications, pp. 205-241. Publisher: Cambridge University Press. Print publication year: 2007