Psychometrika Call for Papers
We are excited to announce a call for papers for an upcoming special issue of the Theory and Methods section of Psychometrika, dedicated to “Variable Selection for Complex Psychometric Data.”
Psychometrika is now a fully open access journal, and open access publication is available to all authors regardless of funding situation. Further details can be found on the journal website at https://www.cambridge.org/core/journals/psychometrika.
In the social and behavioral sciences, identifying the predictors/features that most strongly influence behavioral or performance outcomes is one of the central objectives of empirical research. Achieving this goal requires statistical and computational tools capable of disentangling meaningful signals from noise, especially when the data are structured, hierarchical, or high-dimensional. This special issue seeks to bring together theoretical, methodological, and empirical contributions that advance the field of variable selection in complex psychometric data structures. We encourage contributions that integrate insights from statistics, machine learning, and psychometrics, with an emphasis on methods that enhance construct interpretation, model parsimony, and generalizability.
Scope and Topics of Interest
Recent advances in psychological and educational measurement have produced increasingly complex data structures that challenge traditional statistical frameworks. Examples include multilevel data (e.g., students nested within classrooms and schools), intensive longitudinal data (e.g., daily diary or ecological momentary assessment data capturing intraindividual variability), network or relational data (e.g., peer or social interaction networks within classrooms), multimodal data (e.g., combining response times, eye-tracking, or physiological signals with behavioral responses), and high-dimensional data (e.g., large sets of items, text features from large language models, or neuroimaging predictors relative to sample size). These data configurations demand new approaches to variable selection, where the goal extends beyond prediction to encompass causal inference, interpretability, and construct validity within psychometric models.
Traditional variable selection techniques—such as stepwise selection, information criteria, or regularization (e.g., LASSO)—are often insufficient when the data exhibit hierarchical dependencies, temporal autocorrelation, or complex covariance structures typical in psychometric research. Ongoing methodological developments in machine learning and in regularized and Bayesian modeling frameworks provide new opportunities to handle complex data structures while improving predictive performance and supporting interpretable inference.
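As a concrete illustration of the cross-validation issue raised above, consider the following minimal sketch (our own example, not a method prescribed by this call; the simulation design, variable names, and use of scikit-learn are assumptions chosen only for illustration). It contrasts naive and group-aware cross-validation when tuning a LASSO on clustered data:

```python
# Minimal sketch (assumed simulated multilevel data; scikit-learn LASSO).
# With clustered data, naive K-fold CV lets observations from the same
# cluster fall into both training and validation folds, which can bias the
# tuning of the penalty and, in turn, which variables are selected.
# Group-aware CV keeps whole clusters together within each fold.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import KFold, GroupKFold

rng = np.random.default_rng(1)
n_clusters, per_cluster, p = 30, 10, 50
groups = np.repeat(np.arange(n_clusters), per_cluster)

# Simulate multilevel data: cluster-level random intercepts induce
# within-cluster dependence; only the first 5 of 50 predictors matter.
X = rng.normal(size=(n_clusters * per_cluster, p))
beta = np.zeros(p)
beta[:5] = 1.0
u = rng.normal(scale=2.0, size=n_clusters)[groups]  # random intercepts
y = X @ beta + u + rng.normal(size=len(groups))

naive = LassoCV(cv=KFold(5, shuffle=True, random_state=0)).fit(X, y)
grouped = LassoCV(cv=list(GroupKFold(5).split(X, y, groups))).fit(X, y)

print("naive CV  : penalty =", round(naive.alpha_, 3),
      "| variables selected =", int(np.sum(naive.coef_ != 0)))
print("grouped CV: penalty =", round(grouped.alpha_, 3),
      "| variables selected =", int(np.sum(grouped.coef_ != 0)))
```

Analogous issues arise under temporal autocorrelation, where blocked or forward-chaining splits play the role of the grouped folds here.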
Submissions may include, but are not limited to:
- Theoretical developments in variable selection for multilevel or cross-classified latent variable models, including IRT and SEM frameworks.
- Regularization and shrinkage methods for longitudinal or dynamic factor models.
- Model-based and algorithmic approaches integrating machine learning (e.g., random forests, boosting, neural networks) with classical psychometric models for variable/feature selection.
- Bayesian variable selection under complex dependency structures (e.g., hierarchical priors, spike-and-slab models, global–local shrinkage priors).
- Group-aware or structure-preserving selection methods, such as hierarchical LASSO, fused LASSO, or group-sparse regularization in multilevel models.
- Variable importance and interpretability measures for mixed-effects and latent variable models.
- Methods to address tensions between variable selection and causal inference.
- Cross-validation strategies for dependent and structured data.
- Novel variable/feature selection methodologies aimed at strengthening construct validity and predictive accuracy in large-scale assessments.
We discourage submissions that solely involve applications of existing models or methods unless these applications yield meaningful new insights or are necessary to illustrate or compare methodological approaches. The emphasis of this call is on submissions that make significant methodological contributions rather than primarily substantive ones, while recognizing that such papers may include simulation studies or empirical examples to demonstrate their practical value and relevance to psychological research.
We encourage submissions to include reproducible, well-documented computer code and data. When privacy restrictions preclude sharing the original data, authors may provide a simulated dataset that permits running the code. While static supplementary files are acceptable, we strongly encourage depositing example code and data on the Open Science Framework (OSF) or GitHub and citing these resources in the manuscript. Unlike static supplements, OSF and GitHub support versioned, post-publication updates with documented changes, facilitating corrections necessitated by software updates or other factors affecting results.
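For instance, when the original responses cannot be shared, a seeded simulation script of the following kind (a hypothetical sketch; the Rasch-type generating model and file name are ours, chosen only for illustration) can produce a stand-in dataset that lets reviewers and readers execute the accompanying code end to end:

```python
# Hypothetical sketch of a shareable stand-in dataset: simulate item
# responses with a structure mirroring the restricted data, fix the seed,
# and ship the script and CSV alongside the analysis code (e.g., in an
# OSF or GitHub repository).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2026)  # fixed seed so the file is reproducible
n_persons, n_items = 200, 20

# Example generating model: binary responses from a simple Rasch-type model
# with standard-normal person abilities and item difficulties.
theta = rng.normal(size=n_persons)   # person abilities
b = rng.normal(size=n_items)         # item difficulties
prob = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
responses = rng.binomial(1, prob)

df = pd.DataFrame(responses, columns=[f"item{j + 1}" for j in range(n_items)])
df.insert(0, "person_id", np.arange(1, n_persons + 1))
df.to_csv("simulated_responses.csv", index=False)
```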
Guest Editors
Sun-Joo Cho, Ph.D.
Professor
Quantitative Methods program, Department of Psychology and Human Development
Vanderbilt University
Edgar Merkle, Ph.D.
Professor
Quantitative Psychology program, Department of Psychological Sciences
University of Missouri
Submission Guidelines/Instructions
Authors who wish to participate are required to submit a concise proposal (less than 1,000 words) by January 15, 2026. The proposal should summarize the methodological contribution, the target data structures (e.g., multilevel, longitudinal, network), the planned evaluation (simulation design and empirical illustration), any anticipated software artifacts (package/code), and the corresponding author’s contact information. Proposals should be submitted via https://forms.gle/jw3EX5wMkzxC78Nk9. The guest editors will screen proposals for topical fit, novelty, methodological depth, and feasibility; decisions and invitations, including formatting guidance and expectations for reproducibility, will be issued approximately two weeks after the deadline. Invited authors will then submit full manuscripts by September 15, 2026, prepared in accordance with Psychometrika’s Author Guidelines and subject to the journal’s standard peer review.
Important Dates:
- Proposal submission deadline: January 15, 2026
- Notification of proposal decision: February 5, 2026
- Full paper submission deadline: September 15, 2026