Call for Papers: Psychometrika Special Issue

Data Intensive Methods in Psychometrics


We are excited to announce a call for papers for an upcoming special issue of the Theory and Methods section of Psychometrika, dedicated to “Data Intensive Methods in Psychometrics.”

Psychometrika is now a fully open access journal, and open access publication is available to all authors regardless of funding situation. Further details about these options can be found on the journal's website at https://www.cambridge.org/core/journals/psychometrika.

Psychometrics is a critical discipline for a broad range of fields. Psychometric methods play integral roles across education, biomedicine, and business, as well as throughout a variety of academic disciplines. Traditional methodological development, however, has tended to focus on relatively specific datasets. This is a problem: relying on limited data makes it difficult to develop generalizable knowledge about a method's utility and, in the worst case, may lead to a distorted view of the method's impact (this may in fact be a relatively common problem, given that methodological developments are often inspired by particular use cases). We argue that the field may benefit from methodological research that takes advantage of larger quantities of data and, given that more datasets are being made publicly available, a call to action on this front is timely. Moreover, data-intensive work can offer a critical signal to methodologists about which methodological innovations are actually needed (as compared to methods that may be of conceptual interest but do not result in substantive differences relative to simpler alternatives). This special issue will showcase papers that advance methodological research with the support of a large volume of datasets. Articles for the special issue need to meet the criteria for the Theory and Methods section of the journal. Novel methods presented in data-intensive contexts are welcome, as is research that examines the functioning of conventional methodologies in order to generate critical insights about their performance in the context of large volumes of data.

Scope and Topics of Interest

This special issue emphasizes work with an array of datasets rather than simply larger amounts of data. Some single datasets contain extremely large numbers of responses; we are instead interested in methodological development that uses many datasets rather than a very large volume of data from a single dataset. This emphasis on a range of empirical datasets is a key point of distinction from other recent calls in this journal for papers focusing on high-dimensional or intensive longitudinal data. Some methodologies may excel in the analysis of larger datasets, but we are interested in the degree to which methodological research can be improved when methods are tested across a broad variety of datasets. We believe that testing methods using a wide array of comparable empirical data can push our field further toward generating generalizable knowledge. Presenting such a broad range of empirical evidence was previously challenging in psychometrics, but resources offering access to harmonized collections of data are now being developed, which is why we think this is an opportune moment to highlight such research. We draw attention to some resources that may be of interest:

  • The Item Response Warehouse: Item response data from an array of different measures.
  • Attentional Control Data Collection: Data from attention control tasks.
  • openESM: Data from experience sampling studies.
  • PrefLib: A library of preference data.
  • Wordbank: A database of children’s vocabulary development.

Other repositories that contain item response data (e.g., the Open Psychometrics project, PISA, ENEM) or other data of interest to the Psychometrika readership (e.g., eye-tracking, fMRI, etc.) may also be useful to interested researchers (we are also trying to keep a list of such resources updated online: https://itemresponsewarehouse.org/othersites.html).

To make the aims of the special issue concrete, we offer two exemplars of the kind of data-intensive work we envision:

  • Statistical evidence in psychological networks: uses data from nearly 300 networks in 126 papers to show that the evidence regarding individual edges in psychological networks is often quite weak, which has implications for how results from such studies should be interpreted. The paper also argues that Bayesian analyses can help elucidate the degree to which data support including or excluding specific edges, which may help applied researchers make more careful inferences about psychological networks. Refining our understanding of how the statistical methods used to analyze psychological networks actually function, and of their limitations, is of critical importance.
  • The InterModel Vigorish as a Lens for Understanding (and Quantifying) the Value of Item Response Models for Dichotomously Coded Items: considers the predictive differences between common item response models for dichotomous items in 89 datasets using a novel metric for studying prediction quality. Results suggest that the 3PL provides only very limited improvements in prediction relative to the 2PL. Exploratory multidimensional models generated improved predictions in some cases but did not generally yield predictions that were notably better than those from unidimensional approaches.

These examples demonstrate that data-intensive methodological work can help us make critical observations about the functioning of psychometric methods and can offer improved testing grounds for novel techniques. In our view, such observations constitute important methodological work that should be of interest to Psychometrika, as they offer foundational information about how methods actually perform when used in empirical settings.

The above examples are meant to inspire novel ideas about future work. Alongside work directly with item responses, we encourage methodological papers that use expansive collections of response time data, intensive longitudinal data, rating data, or other forms of data analyzed using psychometric methods. The focus of the special issue is data-intensive work that incorporates broad streams of empirical evidence into methodological development; this may include meta-analytic or multi-modal analyses that draw from several data sources, so long as they advance novel insights based on data-intensive work. We encourage interested parties with any questions regarding the special issue to contact the guest editors listed below.

Guest Editors

Benjamin W. Domingue
Associate Professor
Stanford Graduate School of Education
https://profiles.stanford.edu/benjamin-domingue
bdomingue@stanford.edu

Kylie Gorney
Assistant Professor
Michigan State University College of Education
https://education.msu.edu/people/g/gorney-kylie
kgorney@msu.edu

Jonas Haslbeck
Postdoctoral Researcher
Psychological Methods group at the University of Amsterdam
https://jonashaslbeck.com/about/
jonashaslbeck@protonmail.com

Klint Kanopka
Assistant Professor
NYU Steinhardt School of Culture, Education, and Human Development
https://steinhardt.nyu.edu/people/klint-kanopka

Leonie V.D.E. Vogelsmeier
Assistant Professor
Tilburg School of Social and Behavioral Sciences
https://research.tilburguniversity.edu/en/persons/leonie-vogelsmeier/
L.V.D.E.Vogelsmeier@tilburguniversity.edu

Submission Guidelines

Interested authors should submit a short proposal to the guest editors of the special issue by March 15, 2026 (see link below); the deadline for full manuscripts will be March 1, 2027. The proposal should include a brief summary (max. 500 words) of the intended project and an explanation of how the project fits the special issue. The guest editors will decide on the fit of the proposed project based on the proposal and may also offer suggestions to ensure a good fit with the special issue. Alongside the conventional criteria for publication in Psychometrika, the editorial team will also consider the scope of the data proposed for inclusion in the full paper when the proposal is submitted. All manuscripts submitted to the special issue will go through the regular peer-review process. Junior scholars are especially encouraged to submit their projects.

  • The proposal (a description of the proposed project in <500 words) can be submitted here.

    • Short proposals are due by March 15, 2026. We aim to supply feedback by April 15, 2026.
    • If you are interested in submitting a proposal after the due date, please contact the guest editors directly.
  • Paper submissions must conform to the Psychometrika format guidelines available here.
  • Manuscripts must be submitted via the journal's submission system at https://mc.manuscriptcentral.com/psychometrika, and authors should select the relevant special issue during the submission process.
  • Submissions must represent original material that has neither been submitted to, nor published in, any other journal.

 
