Doctoral Thesis

Computational Psychometrics for Item-based Computerized Adaptive Learning

Chen, Yi

With advances in computer technology and expanded access to educational data, psychometrics faces new opportunities and challenges for enhancing pattern discovery and decision-making in testing and learning. In this dissertation, I introduce three computational psychometrics studies that solve technical problems in item-based computerized adaptive learning (CAL) systems related to dynamic measurement, diagnosis, and recommendation, all grounded in Bayesian item response theory (IRT).

In the first study, I introduce a new knowledge tracing (KT) model, dynamic IRT (DIRT), which iteratively updates the posterior distribution of latent ability via moment-matching approximation and captures the uncertainty of ability change during the learning process. For dynamic measurement, DIRT offers advantages in interpretability, flexibility, computational cost, and ease of implementation. In the second study, a new measurement model, multilevel and multidimensional item response theory with Q matrix (MMIRT-Q), is proposed to provide fine-grained diagnostic feedback, and sequential Monte Carlo (SMC) is introduced for online estimation of latent abilities.
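The moment-matching idea behind this kind of sequential update can be sketched in a few lines: keep a Gaussian belief over ability, multiply it by the response likelihood of a Rasch (1PL) item, and match the first two moments of the resulting posterior. This is an illustrative sketch only, not the DIRT model itself; the grid-based normalization, the Rasch likelihood, and the function name `update_ability` are my assumptions for demonstration.

```python
import numpy as np

def update_ability(mu, sigma2, b, correct, grid=np.linspace(-6, 6, 241)):
    """One moment-matching update of a Gaussian ability belief N(mu, sigma2)
    after a response to a Rasch item with difficulty b (illustrative sketch,
    not the dissertation's DIRT model)."""
    prior = np.exp(-(grid - mu) ** 2 / (2 * sigma2))   # unnormalized Gaussian prior
    p = 1.0 / (1.0 + np.exp(-(grid - b)))              # Rasch success probability
    like = p if correct else 1.0 - p                   # response likelihood
    post = prior * like
    w = post / post.sum()                              # normalize on the grid
    new_mu = (grid * w).sum()                          # posterior mean
    new_sigma2 = ((grid - new_mu) ** 2 * w).sum()      # posterior variance
    return new_mu, new_sigma2                          # Gaussian moment match

# Sequential updates over a short (difficulty, correct) response stream
mu, s2 = 0.0, 1.0
for b, y in [(0.0, 1), (0.5, 1), (1.0, 0)]:
    mu, s2 = update_ability(mu, s2, b, y)
```

Because each update returns a Gaussian summary, the next item's update can reuse it directly, which is what makes the posterior tractable to maintain online as responses stream in.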

In the third study, I propose the maximum expected ratio of posterior variance reduction (MERPV) criterion for testing purposes and the maximum expected improvement in posterior mean (MEIPM) criterion for learning purposes under the unified framework of IRT. With these computational psychometrics solutions, we can improve students' learning and testing experience through accurate psychometric measurement, timely diagnostic feedback, and efficient item selection.
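The general shape of a variance-reduction item selection rule can be sketched as follows: for each candidate item, average the posterior variance over the two possible responses (weighted by their predictive probabilities) and pick the item whose expected fractional reduction in variance is largest. This is a generic illustration of the idea, under a Rasch likelihood and a grid-based posterior; the exact MERPV criterion in the dissertation may differ, and the function names here are assumptions.

```python
import numpy as np

def posterior(grid, prior, b, correct):
    """Grid posterior after a Rasch response to an item with difficulty b."""
    p = 1.0 / (1.0 + np.exp(-(grid - b)))
    post = prior * (p if correct else 1.0 - p)
    return post / post.sum()

def variance(grid, w):
    """Variance of a distribution given as weights w on the grid."""
    m = (grid * w).sum()
    return ((grid - m) ** 2 * w).sum()

def select_item(grid, prior, difficulties):
    """Pick the item with the largest expected fractional reduction in
    posterior variance (illustrative variance-based criterion)."""
    v0 = variance(grid, prior)
    best, best_ratio = None, -np.inf
    for j, b in enumerate(difficulties):
        p1 = (prior * (1.0 / (1.0 + np.exp(-(grid - b))))).sum()  # predictive P(correct)
        ev = (p1 * variance(grid, posterior(grid, prior, b, True))
              + (1.0 - p1) * variance(grid, posterior(grid, prior, b, False)))
        ratio = (v0 - ev) / v0                                    # expected reduction ratio
        if ratio > best_ratio:
            best, best_ratio = j, ratio
    return best

# Usage: with a standard-normal prior, the item matched to the current
# ability estimate (difficulty 0) is the most informative choice.
grid = np.linspace(-6, 6, 241)
prior = np.exp(-grid ** 2 / 2)
prior /= prior.sum()
best = select_item(grid, prior, [-2.0, 0.0, 2.0])
```

A learning-oriented rule such as MEIPM would replace the variance objective with the expected gain in the posterior mean; the surrounding machinery (predictive probabilities, outcome-weighted expectation) stays the same.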



More About This Work

Academic Units
Measurement and Evaluation
Thesis Advisors
Lee, Young-Sun
Ph.D., Columbia University
Published Here
January 25, 2023