Off-the-shelf forecasting routines

The OOS package workflow begins with the user generating out-of-sample forecasts. To this end, OOS handles both univariate and multivariate models, and comes ready off-the-shelf to estimate and forecast with several popular models via the forecast_univariate, forecast_multivariate, and forecast_combine functions.

Method names recognized by forecast_univariate, forecast_multivariate, and forecast_combine are shown in brackets.

Default univariate techniques
1. Random Walk [naive, rwf]
2. ARIMA [auto.arima, Arima]
3. ETS [ets]
4. Spline [spline]
5. Theta Method [theta]
6. TBATS [tbats]
7. STL [stl]
8. AR Perceptron [nnetar]

Notes: All univariate forecasting routines may be accessed via the OOS function forecast_univariate. Currently, all default univariate forecasting techniques are implemented via the forecast package.
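
To make the interface concrete, a hypothetical call is sketched below. The argument names used here (Data, forecast.method, horizon, rolling.window) are illustrative assumptions about the forecast_univariate interface, not a verified signature; consult the package documentation for the exact arguments.

```r
# load the OOS package (assumed installed from CRAN)
library(OOS)

# example data: a data.frame with a date column and the series to forecast
Data <- data.frame(
  date = seq.Date(from = as.Date("2000-01-01"), by = "month", length.out = 120),
  y    = rnorm(120)
)

# hypothetical call: generate one-step-ahead out-of-sample forecasts
# with auto.arima, re-estimated over a rolling 60-period window
forecasts.uni <- forecast_univariate(
  Data            = Data,
  forecast.method = "auto.arima",
  horizon         = 1,
  rolling.window  = 60
)
```

The method string ("auto.arima" here) is taken from the bracketed names in the list above; any of the other recognized names (e.g. "ets", "tbats") should slot in the same way.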

Default multivariate techniques
1. Vector Autoregression [var]
2. Linear Regression [ols]
3. LASSO Regression [lasso]
4. Ridge Regression [ridge]
5. Elastic Net [elastic]
6. Principal Component Regression [pcr]
7. Partial Least Squares Regression [pls]
8. Random Forest [RF]
9. Tree-Based Gradient Boosting Machine [GBM]
10. Averaged Single Layered Neural Network [NN]

Notes: All multivariate forecasting routines may be accessed via the OOS function forecast_multivariate. VAR estimation is implemented via the vars package. The LASSO, Ridge, and Elastic Net penalized regressions are all implemented via the glmnet package and accessed through the caret interface. The PCR and PLS dimension-reduction regressions, Random Forest, Tree-Based Gradient Boosting Machine, and averaged Single Layered Neural Network all use their standard R implementations and their standard default cross-validation settings for training, unless otherwise specified by the user through caret-style commands.
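
As with the univariate case, a hypothetical multivariate call is sketched below. The argument names (Data, target, forecast.method, horizon, rolling.window) are illustrative assumptions about the forecast_multivariate interface rather than a verified signature.

```r
library(OOS)

# example data: a target series y plus two predictors, with a date column
Data <- data.frame(
  date = seq.Date(from = as.Date("2000-01-01"), by = "month", length.out = 120),
  y    = rnorm(120),
  x1   = rnorm(120),
  x2   = rnorm(120)
)

# hypothetical call: one-step-ahead LASSO forecasts of y,
# estimated via glmnet/caret over a rolling 60-period window
forecasts.ml <- forecast_multivariate(
  Data            = Data,
  target          = "y",
  forecast.method = "lasso",
  horizon         = 1,
  rolling.window  = 60
)
```

Here "lasso" is one of the bracketed method names from the list above; swapping in "var", "ridge", "RF", and so on should follow the same pattern.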

Default forecast combination techniques
1. Mean [uniform]
2. Median [median]
3. Trimmed (Winsorized) Mean [trimmed.mean]
4. N-best [n.best]
5. Linear Regression (OLS) [ols]
6. LASSO Regression [lasso]
7. Ridge Regression [ridge]
8. Elastic Net [elastic]
9. peLASSO [peLasso]
10. Principal Component Regression [pcr]
11. Partial Least Squares Regression [pls]
12. Random Forest [RF]
13. Tree-Based Gradient Boosting Machine [GBM]
14. Averaged Single Layered Neural Network [NN]

Notes: All forecast combination routines may be accessed via the OOS function forecast_combine. The LASSO, Ridge, and Elastic Net penalized regressions are all implemented via the glmnet package and accessed through the caret interface. The N-best and partial egalitarian LASSO (peLASSO) methods were introduced by Diebold and Shin (2019). The Random Forest, Tree-Based Gradient Boosting Machine, and averaged Single Layered Neural Network all use their standard R implementations and their standard default cross-validation settings for training, unless otherwise specified by the user through caret-style commands.
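
A hypothetical combination step is sketched below. It assumes the output of the earlier forecasting functions is a data.frame with a date column and one column of forecasts per model; that shape, and the argument names shown (data, method), are assumptions about the forecast_combine interface, not a verified signature.

```r
library(OOS)

# stand-in for model forecasts produced earlier in the workflow:
# one column per candidate model, indexed by date
candidate.forecasts <- data.frame(
  date       = seq.Date(from = as.Date("2010-01-01"), by = "month", length.out = 24),
  auto.arima = rnorm(24),
  ets        = rnorm(24),
  lasso      = rnorm(24)
)

# hypothetical call: combine the candidate forecasts by their median
combinations <- forecast_combine(
  data   = candidate.forecasts,
  method = "median"
)
```

Any of the bracketed combination names above (e.g. "uniform", "n.best", "peLasso") should be usable in place of "median", though some (such as the regression-based combiners) will also require realized values of the target for training.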

User-defined forecasting routines

While one may rely on the default models and training routines, a practitioner may also implement custom forecasting models and edit the default training routines through a series of user-facing control panels (list objects).

User-facing control panels
1. forecast_univariate.control_panel
2. forecast_multivariate.ml.control_panel
3. forecast_multivariate.var.control_panel
4. forecast_combinations.control_panel
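
Since the control panels are ordinary R list objects, editing a default should amount to standard list assignment, as in the sketch below. The specific element names accessed here are illustrative assumptions, not verified entries; inspecting the panel with str() is the reliable way to discover its actual structure.

```r
library(OOS)

# inspect the univariate control panel to see its actual elements
str(forecast_univariate.control_panel)

# hypothetical edit: copy the panel, override one default setting,
# and use the edited copy in subsequent forecasting calls
# (the element name "arguments" is an assumption for illustration)
my.panel <- forecast_univariate.control_panel
my.panel$arguments$auto.arima <- list(seasonal = FALSE)
```

The same pattern should apply to the multivariate and combination control panels listed above.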