Complement or Contamination: A Study of the Validity of Multiple-Choice Items when Assessing Reasoning Skills in Physics
Frontiers in Education — Publisher: Frontiers Media S.A.
Keywords: argumentation skills | assessment | multiple-choice items | national testing | socio-scientific issues
The purpose of this study is to investigate the validity of using multiple-choice (MC) items as a complement to constructed-response (CR) items when making decisions about student performance on reasoning tasks. CR items from a national test in physics were reformulated into MC items, and students’ reasoning skills were analyzed in two substudies. In the first study, 12 students answered the MC items and were asked to explain their answers orally. In the second study, 102 students from five randomly chosen schools answered the same items. Their answers were scored, and the frequency of correct answers was calculated for each item. The scores were then compared to a sample of student performance on the original CR items from the national test. Findings suggest that results from MC items might be misleading when making decisions about student performance on reasoning tasks, since students draw on skills other than those the items are intended to assess. Results from MC items may also contribute to an overestimation of students’ knowledge in science.