Hyeon-Ah (Annie) Kang, University of Texas at Austin

Detecting item parameter drift online using response and response times

Spotlight Speaker

When tests are administered continuously or at frequent time intervals, some items may become known to prospective examinees or may undergo changes in their statistical properties. The purpose of this study is to present a sequential monitoring procedure that regularly checks the quality of items across the span of time the items are in operation. The procedure is based on a sequential generalized likelihood ratio test, which evaluates the likelihood of the currently estimated item parameters against the likelihood of the pre-calibrated item parameter values. The test is designed to integrate information from the response and response time data, and to detect a change-point as soon as an item exhibits parameter drift within the hierarchical framework (van der Linden, 2007). To estimate the item parameters, we perform continuous online calibration based on moving samples. The suggested procedure provides a powerful automated tool for maintaining the quality of an item pool by conducting a series of hypothesis tests on the individual items under a parametric model that capitalizes on two sources of information. The effectiveness of the proposed method is evaluated through extensive simulation studies and an application to a large-scale high-stakes computerized adaptive test. All evaluations are made in comparison with an existing statistical quality control procedure (e.g., Veerkamp & Glas, 2000).
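As a rough illustration of the general idea (not the speaker's implementation), the sketch below applies a windowed generalized likelihood ratio test to simulated responses only, ignoring response times. It assumes a Rasch model, known examinee abilities, a crude grid-search MLE, and an illustrative flagging threshold; all names and values are hypothetical:

```python
import numpy as np

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def loglik(resp, theta, b):
    """Bernoulli log-likelihood of a response vector at difficulty b."""
    p = rasch_p(theta, b)
    return np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))

def glr_statistic(resp, theta, b0):
    """2 * [max_b loglik(b) - loglik(b0)]: likelihood of the currently
    estimated difficulty (grid-search MLE) vs. the pre-calibrated b0."""
    grid = np.linspace(-3.0, 3.0, 121)
    ll1 = max(loglik(resp, theta, b) for b in grid)
    return 2.0 * (ll1 - loglik(resp, theta, b0))

def monitor(resp, theta, b0, window=100, threshold=10.0):
    """Slide a moving sample over the response stream and flag the first
    window whose GLR statistic exceeds the (illustrative) threshold."""
    for end in range(window, len(resp) + 1):
        s = glr_statistic(resp[end - window:end], theta[end - window:end], b0)
        if s > threshold:
            return end
    return None

# Simulate drift: difficulty shifts from 0.0 to 1.0 at response 300.
rng = np.random.default_rng(42)
n, change = 600, 300
theta = rng.normal(size=n)
b_true = np.where(np.arange(n) < change, 0.0, 1.0)
resp = (rng.random(n) < rasch_p(theta, b_true)).astype(int)

detected = monitor(resp, theta, b0=0.0)
print(detected)  # index at which drift is first flagged (shortly after 300)
```

The full procedure in the talk additionally models response times within the hierarchical framework and recalibrates parameters online; this sketch only shows the change-point logic on the response channel.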

About the Speaker

Hyeon-Ah (Annie) Kang

Kang is an assistant professor in the Quantitative Methods program in the Department of Educational Psychology. She also serves as an associate director of the Center for Applied Psychometric Research at the College of Education. Her research is centered on theoretical and applied statistics in educational and psychological measurement.
