Presenters: Benjamin Deonovic (Corteva), Timo Bechger (Metior Consulting), Gunter Maris (Metior Consulting)
The Boltzmann machine remains an attractive generative model for supervised and unsupervised learning of complex multivariate distributions of binary random variables. Often the observed variables are augmented with unobserved binary variables, and a Boltzmann machine is assumed for both together; popular examples are restricted Boltzmann machines, latent tree models, and deep Boltzmann machines. In this webinar we explore three topics related to the use of Boltzmann machines in psychometrics: how to exploit the representation of a Boltzmann machine as a special instance of Multidimensional Item Response Theory models (Maris & Bechger, 2021); how this representation sheds light on the implications of recent work on the identifiability of the four-parameter normal ogive (4PNO) model through the approximation of a latent tree model (Maris, Bechger, & Deonovic, 2020); and finally an application of latent tree models to a large-scale educational assessment aimed at providing personalized diagnostic feedback (Deonovic, Bechger, & Maris, 2020).
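As a reminder of the model the webinar builds on, the sketch below computes the exact distribution of a Boltzmann machine over binary variables by brute-force enumeration. This is the standard textbook definition (energy E(x) = -½ xᵀWx - bᵀx with symmetric, zero-diagonal W), not code from any of the cited papers; the weights and biases are arbitrary illustrative values, and enumeration is only feasible for a handful of variables.

```python
import itertools
import numpy as np

def boltzmann_probs(W, b):
    """Exact probabilities of a Boltzmann machine over n binary variables.

    W : symmetric (n, n) weight matrix with zero diagonal
    b : (n,) bias vector
    Returns the 2**n states and their normalized probabilities,
    p(x) proportional to exp(0.5 * x'Wx + b'x).
    """
    n = len(b)
    states = np.array(list(itertools.product([0, 1], repeat=n)))
    # Negative energy for every state: 0.5 * x'Wx + b'x
    neg_energy = 0.5 * np.einsum("si,ij,sj->s", states, W, states) + states @ b
    unnorm = np.exp(neg_energy)
    return states, unnorm / unnorm.sum()

# Hypothetical 3-variable example with arbitrary parameters
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
W = (A + A.T) / 2          # symmetrize
np.fill_diagonal(W, 0.0)   # no self-interactions
b = rng.normal(size=3)
states, p = boltzmann_probs(W, b)
```

With hidden variables added, the observed-variable distribution is the marginal of such a joint over the augmented state vector, which is the construction the restricted Boltzmann machine, latent tree, and deep Boltzmann machine examples share.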
Registration is free, but required to attend. Register in advance for this webinar here.
After registering, you will receive a confirmation email containing information about joining the webinar.
Deonovic, B., Bechger, T., & Maris, G. (2020, September 28). The Ising on the Tree: The Master Model for Learning, Assessment, and Navigation. https://doi.org/10.31234/osf.io/t65wa (preprint)
Kern, J. L., & Culpepper, S. A. (2020). A restricted four-parameter IRT model: The dyad four-parameter normal ogive (dyad-4PNO) model. Psychometrika, 1–25.
Maris, G., & Bechger, T. (2021, May 13). Boltzmann Machines as Multidimensional Item Response Theory Models. https://doi.org/10.31234/osf.io/zjh83 (preprint)
Maris, G., Bechger, T., & Deonovic, B. (2020, December 9). Why does the Dyad-4PNO model of Kern and Culpepper (2020) fit real data? https://doi.org/10.31234/osf.io/razq9 (preprint)