Academic Commons

Software and Data (Information)

Boosting High Dimensional Predictive Regressions with Time Varying Parameters

Ng, Serena

High dimensional predictive regressions are useful in a wide range of applications. However, the theory is mainly developed assuming that the model is stationary with time invariant parameters. This is at odds with the prevalent evidence for parameter instability in economic time series, but theories for parameter instability are mainly developed for models with a small number of covariates. In this paper, we present two $L_2$ boosting algorithms for estimating high dimensional models in which the coefficients are modeled as functions evolving smoothly over time and the predictors are locally stationary. The first method uses componentwise local constant estimators as the base learner, while the second relies on componentwise local linear estimators. We establish consistency of both methods, and address the practical issues of choosing the bandwidth for the base learners and the number of boosting iterations. In an extensive application to macroeconomic forecasting with many potential predictors, we find that the benefits of modeling time variation are substantial and increase with the forecast horizon. Furthermore, the timing of the benefits suggests that the Great Moderation is associated with substantial instability in the conditional mean of various economic series.
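To illustrate the first algorithm described above, here is a minimal sketch of componentwise $L_2$ boosting with a local constant base learner. It is not the authors' replication code: the Gaussian kernel in rescaled time, the shrinkage parameter `nu`, the bandwidth `h`, and all function names are illustrative assumptions. At each iteration, every predictor is fit to the current residuals by a kernel-weighted (in time) least squares regression, the best-fitting predictor is selected, and a shrunken version of its time-varying coefficient path is added to the fit.

```python
import numpy as np

def local_constant_fit(x, r, t_grid, h):
    """Fit residual r on a single predictor x with kernel weights in
    rescaled time, yielding a time-varying coefficient b_t at each t.
    (Gaussian kernel is an illustrative choice, not the paper's.)"""
    T = len(r)
    b = np.empty(T)
    for t in range(T):
        w = np.exp(-0.5 * ((t_grid - t_grid[t]) / h) ** 2)
        sxx = np.sum(w * x * x)
        b[t] = np.sum(w * x * r) / sxx if sxx > 0 else 0.0
    return b

def boost_tvp(X, y, h=0.1, n_iter=50, nu=0.1):
    """Componentwise L2 boosting with local constant base learners.
    Coefficients are allowed to evolve smoothly in rescaled time t/T."""
    T, p = X.shape
    t_grid = np.arange(T) / T
    fit = np.zeros(T)
    B = np.zeros((T, p))              # accumulated coefficient paths
    for _ in range(n_iter):
        r = y - fit                   # current residuals
        best_j, best_sse, best_b = None, np.inf, None
        for j in range(p):            # componentwise search
            b = local_constant_fit(X[:, j], r, t_grid, h)
            sse = np.sum((r - b * X[:, j]) ** 2)
            if sse < best_sse:
                best_j, best_sse, best_b = j, sse, b
        B[:, best_j] += nu * best_b   # shrunken update of the winner
        fit += nu * best_b * X[:, best_j]
    return B, fit
```

The local linear variant would replace `local_constant_fit` with a base learner that also fits a slope in time at each point; the boosting loop itself is unchanged.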

This zip file contains replication instructions along with the following four folders:
1) Final Data: Contains all the processed data. We also formed our response variables for each of the horizons and saved them as separate CSV files. Data source:
2) Create Response: Contains the code and the files needed to create our response variables for each of our horizons. The results from this code are contained in the Final Data folder.
3) Simulations: Contains the code for the simulations section.
4) MacroForecasting Code: Contains most of the code.

Files

More About This Work

Academic Units
Economics
Published Here
July 19, 2021