
Scoring Model Predictions using Cross-Validation

Smith, Anna L.; Zheng, Tian; Gelman, Andrew

We formalize a framework for quantitatively assessing agreement between two datasets that are assumed to come from two distinct data generating mechanisms (DGMs). We propose a methodology for prediction scoring that measures the distance between two unobserved DGMs along the dimension of a particular model. The cross-validated scores can be used to evaluate preregistered hypotheses and to perform model validation for complex statistical models. Using human behavior data from the Next Generation Social Science (NGS2) program, we demonstrate that prediction scores serve as model assessment tools and can reveal insights from data collected across different populations and settings. Our proposed cross-validated prediction scores quantify true differences between data generating mechanisms, allow for the validation and assessment of complex models, and serve as valuable tools for reproducible research.
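
One way to operationalize this idea is to compare a model's within-dataset cross-validated score against its score when fit on one dataset and evaluated on the other. The sketch below is a minimal illustration under assumed conventions (a scikit-learn logistic model, a log-loss scoring rule, synthetic toy data, and hypothetical function names such as within_cv_score and cross_score); it is not the authors' implementation.

```python
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import KFold


def within_cv_score(X, y, model, n_splits=5, seed=0):
    """Baseline: K-fold cross-validated log score within one dataset."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    scores = []
    for train_idx, test_idx in kf.split(X):
        fitted = clone(model).fit(X[train_idx], y[train_idx])
        probs = fitted.predict_proba(X[test_idx])
        scores.append(log_loss(y[test_idx], probs, labels=[0, 1]))
    return float(np.mean(scores))


def cross_score(X_src, y_src, X_tgt, y_tgt, model):
    """Fit the model on the source dataset, score predictions on the target."""
    fitted = clone(model).fit(X_src, y_src)
    return float(log_loss(y_tgt, fitted.predict_proba(X_tgt), labels=[0, 1]))


# Toy data from two slightly different generating mechanisms (illustrative only).
rng = np.random.default_rng(0)
X_a = rng.normal(size=(200, 3))
y_a = ((X_a @ np.array([1.0, -0.5, 0.2]) + rng.normal(size=200)) > 0).astype(int)
X_b = rng.normal(size=(200, 3))
y_b = ((X_b @ np.array([0.8, -0.6, 0.4]) + rng.normal(size=200)) > 0).astype(int)

model = LogisticRegression()
# A larger gap suggests greater disagreement between the two DGMs
# along the dimension captured by this particular model.
gap = cross_score(X_a, y_a, X_b, y_b, model) - within_cv_score(X_b, y_b, model)
print(f"prediction-score gap: {gap:.3f}")
```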

More About This Work

Academic Units: Statistics
Published Here: November 8, 2018

Keywords: complex models; cross-validation; model assessment; preregistration; reproducibility.