Multiple-choice (MC) items are widely used in educational assessment because of their ease of administration and scoring. To optimize the diagnostic value of MC data, recent research involving cognitive diagnosis models (CDMs) has focused on harnessing the diagnostic information that can be found in distractors. One such CDM is the MC deterministic input, noisy "and" gate (MC-DINA) model, in which distractors are coded to probe how examinees who lack some of the required attributes respond. Although promising, the MC-DINA model has some important limitations: aside from not accommodating misconceptions, it strictly assumes that the knowledge states represented by the distractors are a subset of those of the correct response. To address these limitations, the extended MC-DINA (eMC-DINA) model, which allows for a more flexible coding of the distractors and can simultaneously accommodate skills and misconceptions, has been proposed. Previous studies have shown that the eMC-DINA model yields higher correct classification rates than the MC-DINA model. However, to date, neither the eMC-DINA model nor the MC-DINA model has been applied to real data. This study illustrates the application of the two models using a proportional reasoning (PR) test in which six skills and three misconceptions are represented in the MC options. The data were collected from over 1,400 secondary students in Hong Kong. Results from fitting the eMC-DINA and MC-DINA models to the PR data will be presented, and discrepancies in model-data fit and attribute classifications will be highlighted. The implications of this study for test design and item writing will be discussed.
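For readers new to the DINA family, the following sketch of the standard DINA item response function may help fix ideas; the notation here is illustrative background, not material from the talk itself. Under DINA, examinee $i$'s probability of answering item $j$ correctly depends only on whether the examinee possesses all attributes the item requires:

```latex
% Ideal-response indicator: 1 iff examinee i has every attribute
% required by item j (q_{jk} = 1 marks a required attribute)
\eta_{ij} \;=\; \prod_{k=1}^{K} \alpha_{ik}^{\,q_{jk}}

% Item response function, with slip (s_j) and guessing (g_j) parameters
P\bigl(X_{ij} = 1 \mid \boldsymbol{\alpha}_i\bigr)
  \;=\; (1 - s_j)^{\eta_{ij}}\, g_j^{\,1 - \eta_{ij}}
```

The MC-DINA model extends this idea from a correct/incorrect score to the full set of response options: each coded option is associated with its own attribute requirement, and the probability of selecting each option depends on which option's requirements the examinee's attribute profile satisfies. The subset restriction noted above means each distractor's required attributes must be contained in those of the key, which is precisely the constraint the eMC-DINA model relaxes.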
About the speaker
Jimmy de la Torre is Head and Professor of the Human Communication, Development, and Information Sciences Academic Unit of the Faculty of Education at The University of Hong Kong. He is also currently a Chair Professor at the National Taichung University of Education in Taiwan, and an Honorary Professor at the Universidad Autonoma de Madrid in Spain. His primary research interests are in the field of educational and psychological testing and measurement, with a particular emphasis on item response theory, cognitive diagnosis modeling, and the use of assessment to inform instruction and learning. In 2009, he was named by the White House as one of the recipients of the Presidential Early Career Awards for Scientists and Engineers. He also received the Jason Millman Promising Measurement Scholar Award in 2009 and the Bradley Hanson Award for Contributions to Educational Measurement in 2017 from the National Council on Measurement in Education. He was Editor-in-Chief of the Journal of Educational Measurement, and is currently Associate Editor of Applied Psychological Measurement and Frontiers in Education. He is also on the editorial boards of Applied Measurement in Education, Educational Measurement: Issues and Practice, International Journal of Testing, and Measurement: Interdisciplinary Research and Perspectives. For close to two decades now, Jimmy has conducted over 40 training and professional development workshops on cognitive diagnosis modeling in more than a dozen countries on four continents. He is currently the Chair of the American Educational Research Association Cognition and Assessment Special Interest Group, and a member of the Psychometric Society Board of Trustees.