Date & Time: Wednesday, July 21 at 12:00 PM (Noon) EDT
Building on the joint modeling of responses and response times in item response theory (e.g., van der Linden, 2007; van der Linden, Klein Entink, & Fox, 2010), joint models of responses and response times have been proposed for cognitive diagnosis (Minchen & de la Torre, 2016; Zhan, Jiao, & Liao, 2017). Answer-change data can likewise be informative for diagnosing test-takers' strengths and weaknesses, and a joint model of responses, response times, and answer-change behaviors for cognitive diagnosis has been proposed (Jiao, Ding, & Yin, 2020), improving the estimation accuracy of person-related parameters. However, each additional data source adds model parameters, increasing the computational burden of joint modeling. This study explores machine learning algorithms, both supervised and unsupervised, for analyzing responses, response times, and answer changes for cognitive diagnosis. The results will be compared with those from the joint modeling approach to the three data types. Both a simulation study and a real data analysis will be conducted to investigate cognitive classification decisions under different study conditions.
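As one illustration of the unsupervised route the abstract mentions, the sketch below clusters simulated examinee-level features derived from the three data sources (proportion correct, mean log response time, and answer-change count) into two latent groups with a plain k-means routine. All feature choices, distributions, and parameter values here are hypothetical for illustration only; they are not taken from the study being presented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical examinee features, one row per test-taker:
# [proportion correct, mean log response time, number of answer changes].
n = 200
masters = np.column_stack([
    rng.normal(0.85, 0.05, n // 2),   # high accuracy
    rng.normal(3.5, 0.3, n // 2),     # faster responding (log-seconds)
    rng.poisson(1, n // 2),           # few answer changes
])
nonmasters = np.column_stack([
    rng.normal(0.45, 0.08, n // 2),   # lower accuracy
    rng.normal(4.2, 0.3, n // 2),     # slower responding
    rng.poisson(4, n // 2),           # more answer changes
])
X = np.vstack([masters, nonmasters])

# Standardize features so no single data source dominates the distance metric.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

def kmeans(Z, k=2, iters=50, seed=1):
    """Plain k-means: assign points to the nearest centroid, then recompute."""
    rng = np.random.default_rng(seed)
    centroids = Z[rng.choice(len(Z), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(Z[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array([Z[labels == j].mean(axis=0) for j in range(k)])
    return labels

labels = kmeans(Z)  # cluster label (0 or 1) per examinee
```

With well-separated groups like these, the recovered clusters align closely with the simulated mastery status; in practice the clusters would be interpreted against the Q-matrix or validated against a fitted joint model, which is the comparison the study proposes.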
About the Speaker
Dr. Hong Jiao is a Professor of Measurement, Statistics, and Evaluation at the University of Maryland (UMD), College Park, and Director of the Maryland Assessment Research Center (MARC). She received her doctoral degree from Florida State University. Prior to joining the faculty at UMD, she worked as a psychometrician at Harcourt Assessment for over four years on different state assessment programs.

She has published journal articles and book chapters and presented on a variety of topics, including integrating product and process data for cognitive diagnosis and cheating detection, multilevel IRT modeling, modeling complex local item dependence in innovative assessment, mixture item response theory modeling, computerized adaptive testing for classification, and psychometrics in large-scale assessment. Her work on dual local item dependence won her the Bradley Hanson Award for Contributions to Educational Measurement from NCME.

She currently serves as Associate Editor for Large-scale Assessments in Education and was one of the editors for the special topic on process data for educational and psychological measurement in the journal Frontiers in Psychology. She also serves on several editorial boards, including Educational Measurement: Issues and Practice; Measurement: Interdisciplinary Research and Perspectives; and the Springer book series Methodology of Educational Measurement and Assessment.

As Director of MARC, she works with the team to provide psychometric research and services for Maryland State assessment needs. She has served as a committee member and chair on several NCME, AERA, IMPS, and IOMW committees, and serves on Technical Advisory Committees for several testing programs.