Measuring student learning with item response theory
Young-Jin Lee, David J. Palazzo, Rasil Warnakulasooriya, and David E. Pritchard
We investigate short-term learning from hints and feedback in a Web-based physics tutoring system. Both the skill of students and the difficulty and discrimination of items were determined by applying item response theory (IRT) to the first answers of students working on for-credit homework items in an introductory Newtonian physics course. We show that, after tutoring, a shifted logistic item response function with lower discrimination fits the students’ second responses to an item previously answered incorrectly. Student skill decreased by 1.0 standard deviation when students used no tutoring between their (incorrect) first and second attempts, which we attribute to “item-wrong bias.” On average, using hints or feedback increased students’ skill by 0.8 standard deviation. A skill increase of 1.9 standard deviation was observed when hints were requested after viewing, but prior to attempting to answer, a particular item. The skill changes measured in this way will enable the use of IRT to assess students based on their second attempt in a tutoring environment.
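The abstract refers to a logistic item response function with difficulty and discrimination parameters. As a minimal sketch, assuming the standard two-parameter logistic (2PL) form and using illustrative parameter values that are not taken from the paper, the model can be written as:

```python
import numpy as np

def irf_2pl(theta, a, b):
    """Two-parameter logistic item response function:
    probability of a correct response for student skill theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Illustrative values only (not fitted values from the paper):
theta = np.linspace(-3, 3, 7)            # student skill in standard-deviation units
p_first = irf_2pl(theta, a=1.0, b=0.0)   # curve fitted to first attempts
# The abstract describes second attempts (after tutoring) being fit by a
# shifted curve with lower discrimination, e.g. smaller a and shifted b:
p_second = irf_2pl(theta, a=0.6, b=-0.8)
print(np.round(p_first, 2))
print(np.round(p_second, 2))
```

Here the shift in `b` and the reduction in `a` are hypothetical numbers chosen only to illustrate what a "shifted logistic item response function with lower discrimination" looks like.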
©2008 The American Physical Society
URL: http://link.aps.org/abstract/PRSTPER/v4/e010102
DOI: 10.1103/PhysRevSTPER.4.010102
2008-02-11
Lee Palazzo Warnakulasooriya Pritchard - Phys Rev 2008
Tags: IRT, Lee, methodology, Palazzo, Physical Review, Pritchard, Warnakulasooriya