Factor Retention in Exploratory Multidimensional Item Response Theory

Changsheng Chen et al.

Educational and Psychological Measurement · 2025 · https://doi.org/10.1177/00131644241306680 · Article
ABDC rating: A
Weight: 0.44

Abstract

Multidimensional Item Response Theory (MIRT) is routinely applied in developing educational and psychological assessment tools, for instance, for exploring multidimensional structures of items using exploratory MIRT. A critical decision in exploratory MIRT analyses is the number of factors to retain. Unfortunately, the comparative properties of statistical methods and innovative Machine Learning (ML) methods for factor retention in exploratory MIRT analyses are still not clear. This study aims to fill this gap by comparing a selection of statistical and ML methods, including the Kaiser Criterion (KC), Empirical Kaiser Criterion (EKC), Parallel Analysis (PA), scree plot (OC and AF), Very Simple Structure (VSS; C1 and C2), Minimum Average Partial (MAP), Exploratory Graph Analysis (EGA), Random Forest (RF), Histogram-based Gradient-Boosted Decision Trees (HistGBDT), eXtreme Gradient Boosting (XGBoost), and Artificial Neural Networks (ANN). The comparison was performed using 720,000 dichotomous response data sets simulated under MIRT models, covering various between-item and within-item structures and reflecting characteristics of large-scale assessments. The results show that MAP, RF, HistGBDT, XGBoost, and ANN substantially outperform the other methods, with HistGBDT generally performing best. Furthermore, including the statistical methods' results as training features improves the ML methods' performance. The methods' correct-factoring proportions decrease as missingness increases or sample size decreases. KC, PA, EKC, and scree plot (OC) tend to over-factor, while EGA, scree plot (AF), and VSS (C1) tend to under-factor. We recommend that practitioners use both MAP and HistGBDT to determine the number of factors when applying exploratory MIRT.
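Of the statistical methods compared, Parallel Analysis (PA) is simple to sketch: retain as many factors as there are observed correlation-matrix eigenvalues exceeding those of random data of the same shape. A minimal illustration in Python — not the authors' implementation; the function name, the 95th-percentile threshold, and the default simulation count are assumptions:

```python
import numpy as np

def parallel_analysis(data, n_sims=100, quantile=0.95, seed=0):
    """Horn's Parallel Analysis (sketch): count observed eigenvalues of the
    correlation matrix that exceed the chosen quantile of eigenvalues
    obtained from random normal data with the same n and p."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    # eigvalsh returns ascending eigenvalues; reverse to descending
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sim_eig = np.empty((n_sims, p))
    for i in range(n_sims):
        random_data = rng.standard_normal((n, p))
        sim_eig[i] = np.linalg.eigvalsh(np.corrcoef(random_data, rowvar=False))[::-1]
    threshold = np.quantile(sim_eig, quantile, axis=0)
    return int(np.sum(obs_eig > threshold))
```

For example, data simulated with two independent factors loading on disjoint item sets will typically yield `parallel_analysis(data) == 2`. Note that the paper applies PA (and the other methods) to dichotomous responses, where the choice of correlation coefficient matters; this sketch uses plain Pearson correlations.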

3 citations


Cite this paper

https://doi.org/10.1177/00131644241306680

Or copy a formatted citation

@article{chen2025factor,
  title   = {{Factor Retention in Exploratory Multidimensional Item Response Theory}},
  author  = {Chen, Changsheng and others},
  journal = {Educational and Psychological Measurement},
  year    = {2025},
  doi     = {10.1177/00131644241306680},
}

Paste directly into BibTeX, Zotero, or your reference manager.



Evidence weight: 0.44

Balanced mode · F 0.40 / M 0.15 / V 0.05 / R 0.40

F · citation impact    0.32 × 0.40 = 0.13
M · momentum           0.57 × 0.15 = 0.09
V · venue signal       0.50 × 0.05 = 0.03
R · text relevance †   0.50 × 0.40 = 0.20

† Text relevance is estimated at 0.50 on the detail page — for your query’s actual relevance score, open this paper from a search result.
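The headline weight is the weighted sum of the four component scores. Note that the displayed per-component contributions are rounded (0.13 + 0.09 + 0.03 + 0.20 = 0.45), while the unrounded sum is 0.4385 ≈ 0.44. A minimal recomputation in Python (the labels are descriptive only, not an actual API):

```python
# "Balanced mode" evidence weight: sum of (component score × mode weight).
components = {                      # (score, Balanced-mode weight)
    "F citation impact": (0.32, 0.40),
    "M momentum":        (0.57, 0.15),
    "V venue signal":    (0.50, 0.05),
    "R text relevance":  (0.50, 0.40),
}
weight = sum(score * w for score, w in components.values())
print(round(weight, 2))  # 0.44
```

Rounding only the final sum (rather than each contribution) is what makes the headline 0.44 rather than 0.45.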