This article addresses well-established return forecasting challenges via frameworks that focus on the sign of the change in asset index excess returns, using a family of GARCH models. It revisits the literature's original testing ground, the S&P 500 index, to study the predictive power of information demand, proxied by Google's Search Vector Index (SVI), and finds evidence suggesting that an efficient trading strategy can be constructed from the results. The article is aimed at academics from undergraduate level up; all mathematical notation is therefore explained so that readers can follow regardless of their expertise on the subject.
Breakdown
In this article, I primarily study the frameworks outlined in CPV to investigate the explanatory and forecasting powers of information demand on the SPX.
Financial applications of my models are studied from the perspective of a single profit-maximising investor, in a manner akin to CPV (Kandel and Stambaugh (1996)). In the interest of time, I only investigate the scenario where the investing agent's risk aversion implies taking a risk as soon as its expected reward is positive. Later we will look into a framework to optimise this risk-aversion level to allow for a truly profit-maximising strategy.
- Part 1: Data Retrieval and Analysis: In this part, we start the R code, construct some rudimentary functions and collect data from Refinitiv, explaining their use as we go. We also go through the concepts of risk-free rates and excess returns, realised variance and realised volatility, and Google’s Search Vector Index, before lightly analysing the retrieved data.
- Part 2: Econometric Modelling of Non-Linear Variances: In this section, there is no coding. Instead we go – step by step and with worked derivations – through ARMA models, their parameter estimation process (via log-likelihood), the concepts of stationarity, Taylor expansions, the chain rule of calculus, first order conditions, and GARCH models (including the GARCH, GJRGARCH and EGARCH models). Finally, we outline the way in which we construct error values from which to evaluate the performance of our models.
- Part 3: Coding Non-Linear Variance Econometric Models: In this section we encode the mathematical models and concepts of Part 2. We build our code, explaining it step by step, using in-series (computationally slow) and in-parallel (computationally fast) methods. Finally, we present the Diebold-Mariano test results for each pair of model performances, with and without SVIs.
- Part 4: Sign Forecast Frameworks: Here we outline the Christoffersen and Diebold model to estimate the probability of a positive return, then compare the predictive performance of each model via rudimentary methods, Brier Scores and Diebold-Mariano statistics.
- Part 5: Financial significance: Finally, we put ourselves in the shoes of a profit-maximising investor who reinvests – on a daily basis – in the risk-free asset or the S&P 500, choosing the latter if the model suggests that its excess return is more likely than not to be positive. We graph the cumulative returns of such an investor for each model-following strategy and discuss the results with the help of Sharpe ratios before concluding.
Development Tools & Resources
Refinitiv's DataStream Web Services for R (DatastreamDSWS2R): Access to DataStream data. A DataStream or Refinitiv Workspace identification (ID) will be needed to run the code below.
Part 3: Coding Non-Linear Variance Econometric Models
Part 3 Contents
- Creating functions for our GARCH models
- Standard Deviation Forecast Diebold-Mariano tests
'roll' will be the number of out-of-sample predictions.
roll = length(index(df)) - 252
roll
3540
GARCH_model_spec is defined as an R function to streamline the use of rugarch's ugarchspec, and GARCH_model_forecast does the same for one-step-ahead ugarchforecast calls:
GARCH_model_spec = function(mod, exreg)
{ugarchspec(variance.model = list(model = mod, garchOrder = c(1, 1),
external.regressors = exreg),
mean.model = list(armaOrder = c(1, 0), include.mean = TRUE,
external.regressors = exreg),
distribution.model = "norm")}
GARCH_model_forecast = function(mod)
{ugarchforecast(mod, n.ahead = 1, n.roll = 0, data = NULL, out.sample = 0)}
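Together these two helpers pin down the whole specification: an AR(1) conditional mean with an intercept (in Part 2's notation, $R_t = c + \phi_1 R_{t-1} + \epsilon_t$, with the external regressor entering both the mean and the variance equations linearly whenever exreg is supplied), a (1,1)-order variance equation of the chosen type, and a one-step-ahead forecast. A quick, purely illustrative sanity check on the first year of data (assuming df and the rugarch library are loaded as in Part 1):
test_spec = GARCH_model_spec(mod = "sGARCH", exreg = NULL)
test_fit = ugarchfit(data = as.matrix(df[, c("SPX_R")])[1:252], spec = test_spec)
test_fc = GARCH_model_forecast(mod = test_fit)
test_fc@forecast[["seriesFor"]] # one-step-ahead conditional mean forecast
test_fc@forecast[["sigmaFor"]]  # one-step-ahead conditional standard deviation forecast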
Now we can define the R function GARCHprocess_ser to carry out our method ('ser' for 'series', as opposed to 'par' for 'parallel'). This function works in series: it performs every step one after the other, with nothing running in parallel. Within it we:
- 1st: Define our training_sample_size R variable. It is the number of observations in our training sample, since we want to split our data into in-sample and out-of-sample sets. The training sample is used to compute the first set of estimates of our model. In the case of the GARCH model, we use the first 252 observations (the 253rd is the first to be forecast; there are approximately 252 trading days per year) to estimate $\Theta = [c, \phi_1, \psi_q, \sigma^2]$ as $\hat{\Theta} = [\hat{c}, \hat{\phi}_1, \hat{\psi}_q, \hat{\sigma}^2]$
length(df[, c("SPX_R")]) - roll
252
- 2nd: To that end, we create the R variables in_sample_GARCH and in_sample_GARCHfit, which hold the specification and fit of the in-training-sample model
- 3rd: We define the R variables GARCH_mu_hat and GARCH_sigma_hat as matrices to be filled by the following for loop, in which the training sample grows by one observation at every step.
- 4th: Then we can populate our GARCH_mu_hat and GARCH_sigma_hat matrices and compute our desired R variables: RV_no_GARCH_sigma_hat_na_dated, GARCHforecasterror, a stats_table with statistics of our variables, and the Root Mean Squared Error (RMSE) of our results.
GARCHprocess_ser = function(regressand = df[, c("SPX_R")],
roll = roll,
exreg = NULL,
mod = "sGARCH",
RV = df[, c("SPX_RV")],
model_name)
{
training_sample_size = length(regressand) - roll
in_sample_GARCH = GARCH_model_spec(mod = mod, exreg = exreg)
in_sample_GARCHfit = ugarchfit(data = list(regressand)[[1]][1:training_sample_size],
spec = in_sample_GARCH)
GARCH_mu_hat = zoo(matrix(, nrow = roll, ncol = 1),
as.Date(index(regressand[(training_sample_size+1):(training_sample_size+roll)])))
colnames(GARCH_mu_hat) = c('GARCH_mu_hat')
GARCH_sigma_hat = zoo(matrix(, nrow=roll, ncol=1),
as.Date(index(regressand[(training_sample_size+1):(training_sample_size+roll)])))
colnames(GARCH_sigma_hat) = c('GARCH_sigma_hat')
for (i in c(1:roll))
{GARCH_spec = GARCH_model_spec(mod = mod, exreg = exreg)
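# withTimeout() (from the R.utils package) caps each re-estimation at 5 seconds so
# that one badly-behaved optimisation cannot stall the whole loop; any failure is
# caught by try() and the corresponding forecast is simply left as NA.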
try(
withTimeout({(
GARCHfit = ugarchfit(data = as.matrix(regressand)[1:((training_sample_size-1)+i)],
spec = GARCH_spec)
)}, timeout = 5),
silent = TRUE)
try((GARCHfitforecast = GARCH_model_forecast(mod = GARCHfit)), silent = TRUE
# suppressWarnings(expr)
)
try((GARCH_mu_hat[i] = GARCHfitforecast@forecast[["seriesFor"]]), silent = TRUE)
try((GARCH_sigma_hat[i] = GARCHfitforecast@forecast[["sigmaFor"]]), silent = TRUE)
rm(GARCH_spec)
rm(GARCHfit)
rm(GARCHfitforecast)
}
# fitted(GARCHfitforecast)
RV_no_GARCH_sigma_hat_na_dated = as.matrix(
zoo(RV) - (na.omit(GARCH_sigma_hat*0)))
# The ' - (na.omit(GARCH_sigma_hat*0)) ' above is just a neat trick that returns strictly the RV (Realised Volatility) values corresponding to the dates for which GARCH_sigma_hat values are not NA, without any new functions or packages.
# Forecast Error, Mean and S.D.:
GARCHforecasterror = (na.omit(GARCH_sigma_hat) - RV_no_GARCH_sigma_hat_na_dated)
colnames(GARCHforecasterror) = c("GARCHforecasterror")
# plot(zoo(GARCHforecasterror, as.Date(index(GARCHforecasterror))),
# type='l', ylab='GARCHforecasterror', xlab='Date')
stats_table = Statistics_Table(paste(model_name, " forecast error", sep = ""),
na.omit(GARCHforecasterror))
# RMSE of the sigma (standard deviations) of the forecast:
print(paste("RMSE of the sigma (standard deviations) of the forecast from the model ",
model_name, sep = ""))
print(rmse(RV_no_GARCH_sigma_hat_na_dated, (na.omit(GARCH_sigma_hat)))) # Note that this is the same as: sqrt(mean((RV_no_GARCH_sigma_hat_na_dated - na.omit(GARCH_sigma_hat))^2))
return(list(in_sample_GARCH,
in_sample_GARCHfit,
GARCH_mu_hat,
GARCH_sigma_hat,
RV_no_GARCH_sigma_hat_na_dated,
GARCHforecasterror,
stats_table,
rmse(RV_no_GARCH_sigma_hat_na_dated, (na.omit(GARCH_sigma_hat)))))
}
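The date-alignment trick used inside this function is worth a quick illustration. Here is a toy example (made-up numbers) of what subtracting na.omit(x*0) achieves:
library(zoo)
rv = zoo(c(1, 2, 3, 4, 5), as.Date("2020-01-01") + 0:4)            # toy realised volatility series
sigma = zoo(c(NA, 0.2, NA, 0.4, 0.5), as.Date("2020-01-01") + 0:4) # toy forecasts with gaps
# Multiplying by zero keeps the NAs and the dates; na.omit() then drops the NA dates;
# zoo's subtraction merges on the remaining dates, so only the RV values with a
# matching non-NA forecast survive:
rv - na.omit(sigma * 0) # 2020-01-02: 2, 2020-01-04: 4, 2020-01-05: 5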
This time we will create a parallel counterpart of GARCHprocess_ser and call it GARCHprocess_par ('par' for 'parallel'). To do so, the GARCHprocess_out_of_sample_par R function is needed: to parallelise our method we use the R function clusterApply from the parallel library, which requires each task to be fully self-contained before it can be dispatched to a CPU (Central Processing Unit) core. We therefore repeat steps such as importing the rugarch library and defining GARCH_model_spec and GARCH_model_forecast inside it:
GARCHprocess_out_of_sample_par = function(StartEnd,
mod,
exreg,
roll,
regressand = df[, c("SPX_R")]){
library(rugarch)
tryCatch({
training_sample_size = length(regressand) - roll
GARCH_model_spec = function(mod, exreg)
{ugarchspec(variance.model = list(model = mod, garchOrder = c(1, 1),
external.regressors = exreg),
mean.model = list(armaOrder = c(1, 0), include.mean = TRUE,
external.regressors = exreg),
distribution.model = "norm")}
GARCH_model_forecast = function(mod)
{ugarchforecast(mod, n.ahead = 1, n.roll = 0, data = NULL, out.sample = 0)}
GARCH_spec = GARCH_model_spec(mod = mod, exreg = exreg)
GARCHfit = ugarchfit(data = as.matrix(regressand)[1:((training_sample_size-1)+StartEnd)], spec = GARCH_spec)
GARCHfitforecast = GARCH_model_forecast(mod = GARCHfit)
return(list(GARCHfitforecast@forecast[["seriesFor"]], GARCHfitforecast@forecast[["sigmaFor"]]))
},
error = function(e){return(list(NA,NA))})
}
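Before wiring this worker into a cluster, we can sanity-check it on its own. The call below (illustrative only, assuming df and roll exist in the current session) should return a two-element list holding the one-step-ahead conditional mean and standard deviation forecasts for the first out-of-sample date, or list(NA, NA) if the fit fails; clusterApply will later make one such call for every value of StartEnd in 1:roll, each in its own R session on a separate core:
GARCHprocess_out_of_sample_par(StartEnd = 1, mod = "sGARCH", exreg = NULL, roll = roll)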
Incorporating a Parallel Computing Framework:
GARCHprocess_par = function(regressand = df[, c("SPX_R")],
roll = roll,
exreg = NULL,
mod = "sGARCH",
no_of_clusters = detectCores(logical = FALSE),
RV = df[, c("SPX_RV")],
model_name = "Econometric Method")
{
training_sample_size = length(regressand) - roll
in_sample_GARCH = GARCH_model_spec(mod = mod, exreg = exreg)
in_sample_GARCHfit = ugarchfit(data = list(regressand)[[1]][1:training_sample_size],
spec = in_sample_GARCH)
GARCH_mu_hat = zoo(matrix(, nrow = roll, ncol = 1),
as.Date(index(regressand[(training_sample_size+1):(training_sample_size+roll)])))
colnames(GARCH_mu_hat) = c('GARCH_mu_hat')
GARCH_sigma_hat = zoo(matrix(, nrow = roll, ncol=1),
as.Date(index(regressand[(training_sample_size+1):(training_sample_size+roll)])))
colnames(GARCH_sigma_hat) = c('GARCH_sigma_hat')
# Parallel part:
cl = makeCluster(no_of_clusters)
mu_and_sigma_out_of_sample = clusterApply(cl,
x = 1:roll, # StartEnd
fun = GARCHprocess_out_of_sample_par,
roll = roll,
mod = mod,
exreg = exreg,
regressand = df[, c("SPX_R")])
stopCluster(cl)
# Back to series processing:
for (i in c(1:roll))
{try({GARCH_mu_hat[i] = mu_and_sigma_out_of_sample[[i]][[1]]}, silent = TRUE)
try({GARCH_sigma_hat[i] = mu_and_sigma_out_of_sample[[i]][[2]]}, silent = TRUE)}
# The ' - (na.omit(GARCH_sigma_hat*0)) ' below is just a trick that returns strictly
# the RV (Realised Volatility) values corresponding to the dates for which
# GARCH_sigma_hat values are not NA, without any new functions or packages.
RV_no_GARCH_sigma_hat_na_dated = as.matrix(zoo(RV) - (na.omit(GARCH_sigma_hat*0)))
# Forecast Error, Mean and S.D.:
GARCHforecasterror = (na.omit(GARCH_sigma_hat) - RV_no_GARCH_sigma_hat_na_dated)
colnames(GARCHforecasterror) = c("GARCHforecasterror")
# plot(zoo(GARCHforecasterror, as.Date(index(GARCHforecasterror))),
# type='l', ylab='GARCHforecasterror', xlab='Date')
stats_table = Statistics_Table(paste(model_name, " forecast error", sep = ""),
na.omit(GARCHforecasterror))
# RMSE of the sigma (standard deviations) of the forecast:
print(paste("RMSE of the sigma (standard deviations) of the forecast from the model ",
model_name, sep = ""))
print(rmse(RV_no_GARCH_sigma_hat_na_dated, (na.omit(GARCH_sigma_hat)))) # Note that this is the same as: sqrt(mean((RV_no_GARCH_sigma_hat_na_dated - na.omit(GARCH_sigma_hat))^2))
return(list(in_sample_GARCH,
in_sample_GARCHfit,
GARCH_mu_hat,
GARCH_sigma_hat,
RV_no_GARCH_sigma_hat_na_dated,
GARCHforecasterror,
stats_table,
rmse(RV_no_GARCH_sigma_hat_na_dated, (na.omit(GARCH_sigma_hat)))))
}
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.02.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.02.RData")
Now we can apply our newly defined R functions and graph each non-linear-variance model's forecast errors against $\mathbf{SPX\_RV}$. The graphs are only there to give us visual confirmation that the R functions worked properly:
Remember that we outlined our GARCH models as follows:
GARCH(1,1) accounts for SFFR (iv) and (v) in addition to (iii) and non-normal unconditional distributions in levels, which works in parallel with SFFR (i). To do so, an MA component is added to the ARCH(1):
$$\begin{equation} \sigma_{t|t-1}^2 = c + \alpha \epsilon_{t-1}^2 + \beta \sigma_{t-1}^2 \end{equation}$$where elements are as aforementioned, $\beta \in \mathbb{R}$ is positive and in this specific case: $(\alpha + \beta) < 1$ for stationarity. I implement the SVI factor with positive $\delta \in \mathbb{R}$ - enabling us to study its effect on $R_t$'s variance - via:
$$\begin{equation} \sigma_{t|t-1}^2 = c + \alpha \epsilon_{t-1}^2 + \beta \sigma_{t-1}^2 + \delta \Delta SVI_{t-1} \end{equation}$$Whilst GARCH models are a natural fit in this study by the virtues mentioned, “in general, models that incorporate volatility asymmetry such as EGARCH and GJR-GARCH perform better” (Poon and Granger (2003), p.507). Indeed, GARCH models cannot account for SFFR (v) of leverage effects, whereas GJRGARCH and EGARCH models may; CPV therefore considered them too - as do I. The thicker bottom-left tail of the EPDFs suggests such leverage effects.
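As a brief aside on the stationarity condition quoted above: taking unconditional expectations of the GARCH(1,1) variance equation and writing $\sigma^2 = E[\sigma_{t|t-1}^2] = E[\epsilon_{t-1}^2]$ gives
$$\sigma^2 = c + (\alpha + \beta)\,\sigma^2 \quad\Longrightarrow\quad \sigma^2 = \frac{c}{1 - \alpha - \beta},$$
which is a positive, finite unconditional variance only when $(\alpha + \beta) < 1$ (with $c > 0$).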
GARCH11process_series = GARCHprocess_ser(regressand = df[, c("SPX_R")],
roll = roll, mod = "sGARCH",
exreg = NULL,
model_name = "GARCH11_ser",
RV = df[, c("SPX_RV")])
in_sample_GARCH11_ser = GARCH11process_series[[1]]
in_sample_GARCH11fit_ser = GARCH11process_series[[2]]
GARCH11_mu_hat_ser = GARCH11process_series[[3]]
GARCH11_sigma_hat_ser = GARCH11process_series[[4]]
RV_no_GARCH11_sigma_hat_na_dated_ser = GARCH11process_series[[5]]
GARCH11forecasterror_ser = GARCH11process_series[[6]]
GARCH11_stats_table_ser = GARCH11process_series[[7]]
GARCH11_rmse_ser = GARCH11process_series[[8]]
plot(zoo(GARCH11forecasterror_ser, as.Date(index(GARCH11forecasterror_ser))),
type='l', ylab='GARCH11forecasterror_ser', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.03.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.03.RData")
GARCH11process_parallel = GARCHprocess_par(regressand = df[, c("SPX_R")],
roll = roll, exreg = NULL, mod = "sGARCH",
no_of_clusters = detectCores(logical = FALSE),
RV = df[, c("SPX_RV")], model_name = "GARCH11_par")
[1] "RMSE of the sigma (standard deviations) of the forecast from the model GARCH11_par"
[1] 0.004278812
in_sample_GARCH11_par = GARCH11process_parallel[[1]]
in_sample_GARCH11fit_par = GARCH11process_parallel[[2]]
GARCH11_mu_hat_par = GARCH11process_parallel[[3]]
GARCH11_sigma_hat_par = GARCH11process_parallel[[4]]
RV_no_GARCH11_sigma_hat_na_dated_par = GARCH11process_parallel[[5]]
GARCH11forecasterror_par = GARCH11process_parallel[[6]]
GARCH11_stats_table_par = GARCH11process_parallel[[7]]
GARCH11_rmse_par = GARCH11process_parallel[[8]]
plot(zoo(GARCH11forecasterror_par, as.Date(index(GARCH11forecasterror_par))),
type='l', ylab='GARCH11forecasterror_par', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.04.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.04.RData")
GARCH11SVIprocess_series = GARCHprocess_ser(regressand = df[, c("SPX_R")],
roll = roll, mod = "sGARCH",
exreg = matrix(df[, c("SPX_dSVI")]),
model_name = "GARCH11SVI_ser",
RV = df[, c("SPX_RV")])
in_sample_GARCH11SVI_ser = GARCH11SVIprocess_series[[1]]
in_sample_GARCH11SVIfit_ser = GARCH11SVIprocess_series[[2]]
GARCH11SVI_mu_hat_ser = GARCH11SVIprocess_series[[3]]
GARCH11SVI_sigma_hat_ser = GARCH11SVIprocess_series[[4]]
RV_no_GARCH11SVI_sigma_hat_na_dated_ser = GARCH11SVIprocess_series[[5]]
GARCH11SVIforecasterror_ser = GARCH11SVIprocess_series[[6]]
GARCH11SVI_stats_table_ser = GARCH11SVIprocess_series[[7]]
GARCH11SVI_rmse_ser = GARCH11SVIprocess_series[[8]]
plot(zoo(GARCH11SVIforecasterror_ser, as.Date(index(GARCH11SVIforecasterror_ser))),
type='l', ylab='GARCH11SVIforecasterror_ser', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.05.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.05.RData")
GARCH11SVIprocess_parallel = GARCHprocess_par(regressand = df[, c("SPX_R")],
roll = roll,
mod = "sGARCH",
exreg = matrix(df[, c("SPX_dSVI")]),
no_of_clusters = detectCores(logical = FALSE),
model_name = "GARCH11SVI_par",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model GARCH11SVI_par"
[1] 0.004298738
in_sample_GARCH11SVI_par = GARCH11SVIprocess_parallel[[1]]
in_sample_GARCH11SVIfit_par = GARCH11SVIprocess_parallel[[2]]
GARCH11SVI_mu_hat_par = GARCH11SVIprocess_parallel[[3]]
GARCH11SVI_sigma_hat_par = GARCH11SVIprocess_parallel[[4]]
RV_no_GARCH11SVI_sigma_hat_na_dated_par = GARCH11SVIprocess_parallel[[5]]
GARCH11SVIforecasterror_par = GARCH11SVIprocess_parallel[[6]]
GARCH11SVI_stats_table_par = GARCH11SVIprocess_parallel[[7]]
GARCH11SVI_rmse_par = GARCH11SVIprocess_parallel[[8]]
plot(zoo(GARCH11SVIforecasterror_par, as.Date(index(GARCH11SVIforecasterror_par))),
type='l', ylab='GARCH11SVIforecasterror_par', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.06.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.06.RData")
GJRGARCH models account for asymmetry (SFFR (i)) via the last summand in
$$\begin{equation} \sigma_{t|t-1}^2 = c + \alpha \epsilon_{t-1}^2 + \beta \sigma_{t-1}^2 + \gamma I(\epsilon_{t-1}<0) \epsilon_{t-1}^2 \end{equation}$$
and second to last in $$\begin{equation} \sigma_{t|t-1}^2 = c + \alpha \epsilon_{t-1}^2 + \beta \sigma_{t-1}^2 + \gamma I(\epsilon_{t-1}<0) \epsilon_{t-1}^2 + \delta \Delta SVI_{t-1} \end{equation}$$ where a leverage case would result in $\gamma > 0$, and $I(\cdot)$ is the indicator function returning $1$ if its bracketed statement is true, $0$ otherwise - in this case:
$$\begin{equation} I(\epsilon_{t-1}<0) = \begin{cases} 1 & \text{if } \epsilon_{t-1}<0 \\ 0 & \text{otherwise} \end{cases} . \end{equation}$$
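In other words, the indicator term lets negative shocks carry extra weight: the contribution of a shock $\epsilon_{t-1}$ to next period's variance is
$$\alpha\, \epsilon_{t-1}^2 \;\;\text{if } \epsilon_{t-1} \ge 0, \qquad (\alpha + \gamma)\, \epsilon_{t-1}^2 \;\;\text{if } \epsilon_{t-1} < 0,$$
so a positive $\gamma$ means bad news raises the conditional variance by more than good news of the same magnitude.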
GJRGARCH11process_series = GARCHprocess_ser(regressand = df[, c("SPX_R")],
roll = roll, mod = "sGARCH",
exreg = NULL,
model_name = "GJRGARCH11_ser",
RV = df[, c("SPX_RV")])
in_sample_GJRGARCH11_ser = GJRGARCH11process_series[[1]]
in_sample_GJRGARCH11fit_ser = GJRGARCH11process_series[[2]]
GJRGARCH11_mu_hat_ser = GJRGARCH11process_series[[3]]
GJRGARCH11_sigma_hat_ser = GJRGARCH11process_series[[4]]
RV_no_GJRGARCH11_sigma_hat_na_dated_ser = GJRGARCH11process_series[[5]]
GJRGARCH11forecasterror_ser = GJRGARCH11process_series[[6]]
GJRGARCH11_stats_table_ser = GJRGARCH11process_series[[7]]
GJRGARCH11_rmse_ser = GJRGARCH11process_series[[8]]
plot(zoo(GJRGARCH11forecasterror_ser, as.Date(index(GJRGARCH11forecasterror_ser))),
type='l', ylab='GJRGARCH11forecasterror_ser', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.07.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.07.RData")
# Parallel:
GJRGARCH11process_parallel = GARCHprocess_par(regressand = df[, c("SPX_R")],
roll = roll,
mod = "gjrGARCH",
exreg = NULL,
no_of_clusters = detectCores(logical = FALSE),
model_name = "GJRGARCH11_par",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model GJRGARCH11_par"
[1] 0.003897316
in_sample_GJRGARCH11_par = GJRGARCH11process_parallel[[1]]
in_sample_GJRGARCH11fit_par = GJRGARCH11process_parallel[[2]]
GJRGARCH11_mu_hat_par = GJRGARCH11process_parallel[[3]]
GJRGARCH11_sigma_hat_par = GJRGARCH11process_parallel[[4]]
RV_no_GJRGARCH11_sigma_hat_na_dated_par = GJRGARCH11process_parallel[[5]]
GJRGARCH11forecasterror_par = GJRGARCH11process_parallel[[6]]
GJRGARCH11_stats_table_par = GJRGARCH11process_parallel[[7]]
GJRGARCH11_rmse_par = GJRGARCH11process_parallel[[8]]
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.08.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.08.RData")
GJRGARCH11SVIprocess_series = GARCHprocess_ser(regressand = df[, c("SPX_R")],
roll = roll, mod = "sGARCH",
exreg = matrix(df[, c("SPX_dSVI")]),
model_name = "GJRGARCH11SVI_ser",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model GJRGARCH11SVI_ser"
[1] 0.004297734
in_sample_GJRGARCH11SVI_ser = GJRGARCH11SVIprocess_series[[1]]
in_sample_GJRGARCH11SVIfit_ser = GJRGARCH11SVIprocess_series[[2]]
GJRGARCH11SVI_mu_hat_ser = GJRGARCH11SVIprocess_series[[3]]
GJRGARCH11SVI_sigma_hat_ser = GJRGARCH11SVIprocess_series[[4]]
RV_no_GJRGARCH11SVI_sigma_hat_na_dated_ser = GJRGARCH11SVIprocess_series[[5]]
GJRGARCH11SVIforecasterror_ser = GJRGARCH11SVIprocess_series[[6]]
GJRGARCH11SVI_stats_table_ser = GJRGARCH11SVIprocess_series[[7]]
GJRGARCH11SVI_rmse_ser = GJRGARCH11SVIprocess_series[[8]]
plot(zoo(GJRGARCH11SVIforecasterror_ser, as.Date(index(GJRGARCH11SVIforecasterror_ser))),
type='l', ylab='GJRGARCH11SVIforecasterror_ser', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.09.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.09.RData")
GJRGARCH11SVIprocess_parallel = GARCHprocess_par(regressand = df[, c("SPX_R")],
roll = roll,
mod = "gjrGARCH",
exreg = matrix(df[, c("SPX_dSVI")]),
no_of_clusters = detectCores(logical = FALSE),
model_name = "GJRGARCH11SVI_par",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model GJRGARCH11SVI_par"
[1] 0.003910605
in_sample_GJRGARCH11SVI_par = GJRGARCH11SVIprocess_parallel[[1]]
in_sample_GJRGARCH11SVIfit_par = GJRGARCH11SVIprocess_parallel[[2]]
GJRGARCH11SVI_mu_hat_par = GJRGARCH11SVIprocess_parallel[[3]]
GJRGARCH11SVI_sigma_hat_par = GJRGARCH11SVIprocess_parallel[[4]]
RV_no_GJRGARCH11SVI_sigma_hat_na_dated_par = GJRGARCH11SVIprocess_parallel[[5]]
GJRGARCH11SVIforecasterror_par = GJRGARCH11SVIprocess_parallel[[6]]
GJRGARCH11SVI_stats_table_par = GJRGARCH11SVIprocess_parallel[[7]]
GJRGARCH11SVI_rmse_par = GJRGARCH11SVIprocess_parallel[[8]]
plot(zoo(GJRGARCH11SVIforecasterror_par, as.Date(index(GJRGARCH11SVIforecasterror_par))),
type='l', ylab='GJRGARCH11SVIforecasterror_par', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.10.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.10.RData")
The use of the logarithm function ensures the non-negativity of the conditional variance in EGARCH models
$$\begin{equation} ln(\sigma_{t|t-1}^2) = c + \alpha \Bigg[ \frac{|\epsilon_{t-1}|}{\sigma_{t-1}} - \sqrt{\frac{2}{\pi}} \Bigg] + \beta ln(\sigma_{t-1}^2) + \gamma \frac{\epsilon_{t-1}}{\sigma_{t-1}} \end{equation}$$
and
$$\begin{equation} ln(\sigma_{t|t-1}^2) = c + \alpha \Bigg[ \frac{|\epsilon_{t-1}|}{\sigma_{t-1}} - \sqrt{\frac{2}{\pi}} \Bigg] + \beta ln(\sigma_{t-1}^2) + \gamma \frac{\epsilon_{t-1}}{\sigma_{t-1}} + \delta \Delta SVI_{t-1} \end{equation}$$
where ln($\cdot$) and |$\cdot$| denote the natural logarithm (base $e$) and absolute value functions respectively, and where $c, \alpha, \beta, \gamma \in \mathbb{R}$ are not sign-constrained (there is, for instance, no requirement that $c + \alpha + \beta + \gamma > 0$). If $Var(R_t|\Omega_{t-1})$ and $R_t$ react negatively to each other, $\gamma$ would be negative, and vice versa.
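The $\sqrt{2/\pi}$ term centres the absolute standardised shock: when $\epsilon_{t-1}/\sigma_{t-1}$ is standard normal, $E\big[|\epsilon_{t-1}|/\sigma_{t-1}\big] = \sqrt{2/\pi}$, so the bracketed term has mean zero. A quick simulation check (illustrative only):
set.seed(42)
z = rnorm(1e6) # standard normal draws standing in for standardised shocks
mean(abs(z))   # close to 0.7979
sqrt(2 / pi)   # 0.7978846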
EGARCH11process_series = GARCHprocess_ser(regressand = df[, c("SPX_R")],
roll = roll, mod = "sGARCH",
exreg = NULL,
model_name = "EGARCH11_ser",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model EGARCH11_ser"
[1] 0.004278411
in_sample_EGARCH11_ser = EGARCH11process_series[[1]]
in_sample_EGARCH11fit_ser = EGARCH11process_series[[2]]
EGARCH11_mu_hat_ser = EGARCH11process_series[[3]]
EGARCH11_sigma_hat_ser = EGARCH11process_series[[4]]
RV_no_EGARCH11_sigma_hat_na_dated_ser = EGARCH11process_series[[5]]
EGARCH11forecasterror_ser = EGARCH11process_series[[6]]
EGARCH11_stats_table_ser = EGARCH11process_series[[7]]
EGARCH11_rmse_ser = EGARCH11process_series[[8]]
plot(zoo(EGARCH11forecasterror_ser, as.Date(index(EGARCH11forecasterror_ser))),
type='l', ylab='EGARCH11forecasterror_ser', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.11.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.11.RData")
EGARCH11process_parallel = GARCHprocess_par(regressand = df[, c("SPX_R")],
roll = roll,
mod = "eGARCH",
exreg = NULL,
no_of_clusters = detectCores(logical = FALSE),
model_name = "EGARCH11_par",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model EGARCH11_par"
[1] 0.003657765
in_sample_EGARCH11_par = EGARCH11process_parallel[[1]]
in_sample_EGARCH11fit_par = EGARCH11process_parallel[[2]]
EGARCH11_mu_hat_par = EGARCH11process_parallel[[3]]
EGARCH11_sigma_hat_par = EGARCH11process_parallel[[4]]
RV_no_EGARCH11_sigma_hat_na_dated_par = EGARCH11process_parallel[[5]]
EGARCH11forecasterror_par = EGARCH11process_parallel[[6]]
EGARCH11_stats_table_par = EGARCH11process_parallel[[7]]
EGARCH11_rmse_par = EGARCH11process_parallel[[8]]
plot(zoo(EGARCH11forecasterror_par, as.Date(index(EGARCH11forecasterror_par))),
type='l', ylab='EGARCH11forecasterror_par', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.12.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.12.RData")
EGARCH11SVIprocess_series = GARCHprocess_ser(regressand = df[, c("SPX_R")],
roll = roll, mod = "sGARCH",
exreg = matrix(df[, c("SPX_dSVI")]),
model_name = "EGARCH11_ser",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model EGARCH11_ser"
[1] 0.004298382
in_sample_EGARCH11SVI_ser = EGARCH11SVIprocess_series[[1]]
in_sample_EGARCH11SVIfit_ser = EGARCH11SVIprocess_series[[2]]
EGARCH11SVI_mu_hat_ser = EGARCH11SVIprocess_series[[3]]
EGARCH11SVI_sigma_hat_ser = EGARCH11SVIprocess_series[[4]]
RV_no_EGARCH11SVI_sigma_hat_na_dated_ser = EGARCH11SVIprocess_series[[5]]
EGARCH11SVIforecasterror_ser = EGARCH11SVIprocess_series[[6]]
EGARCH11SVI_stats_table_ser = EGARCH11SVIprocess_series[[7]]
EGARCH11SVI_rmse_ser = EGARCH11SVIprocess_series[[8]]
plot(zoo(EGARCH11SVIforecasterror_ser, as.Date(index(EGARCH11SVIforecasterror_ser))),
type='l', ylab='EGARCH11SVIforecasterror_ser', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.13.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.13.RData")
EGARCH11SVIprocess_parallel = GARCHprocess_par(regressand = df[, c("SPX_R")],
roll = roll,
mod = "eGARCH",
exreg = matrix(df[, c("SPX_dSVI")]),
no_of_clusters = detectCores(logical = FALSE),
model_name = "EGARCH11SVI_par",
RV = df[, c("SPX_RV")])
[1] "RMSE of the sigma (standard deviations) of the forecast from the model EGARCH11SVI_par"
[1] 0.007463993
in_sample_EGARCH11SVI_par = EGARCH11SVIprocess_parallel[[1]]
in_sample_EGARCH11SVIfit_par = EGARCH11SVIprocess_parallel[[2]]
EGARCH11SVI_mu_hat_par = EGARCH11SVIprocess_parallel[[3]]
EGARCH11SVI_sigma_hat_par = EGARCH11SVIprocess_parallel[[4]]
RV_no_EGARCH11SVI_sigma_hat_na_dated_par = EGARCH11SVIprocess_parallel[[5]]
EGARCH11SVIforecasterror_par = EGARCH11SVIprocess_parallel[[6]]
EGARCH11SVI_stats_table_par = EGARCH11SVIprocess_parallel[[7]]
EGARCH11SVI_rmse_par = EGARCH11SVIprocess_parallel[[8]]
plot(zoo(EGARCH11SVIforecasterror_par, as.Date(index(EGARCH11SVIforecasterror_par))),
type='l', ylab='EGARCH11SVIforecasterror_par', xlab='Date')
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.00.14.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.00.14.RData")
The dm.test function from the forecast library implements the modified test proposed by Harvey, Leybourne and Newbold (1997). The null hypothesis is that the two methods have the same forecast accuracy. For alternative=less, the alternative hypothesis is that method 2 is less accurate than method 1. For alternative=greater, the alternative hypothesis is that method 2 is more accurate than method 1. For alternative=two.sided, the alternative hypothesis is that method 1 and method 2 have different levels of accuracy.
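To make the direction of the test concrete, here is a minimal sketch of the raw Diebold-Mariano statistic for a one-step horizon and squared-error loss (ignoring the Harvey, Leybourne and Newbold small-sample correction that dm.test applies); dm_stat_sketch is a hypothetical helper, not part of the pipeline, and the *0 subtractions in the calls below are the same date-alignment trick used earlier:
dm_stat_sketch = function(e1, e2) {
  d = as.numeric(e1)^2 - as.numeric(e2)^2 # loss differential: method 1 minus method 2
  mean(d) / sqrt(var(d) / length(d))      # negative values favour method 1
}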
print("Diebold and Mariano test GARCH11 with and without SVI")
GARCH11forecasterror_par_dm = GARCH11forecasterror_par - (GARCH11SVIforecasterror_par*0)
GARCH11SVIforecasterror_par_dm = GARCH11SVIforecasterror_par - (GARCH11forecasterror_par*0)
dm.test(matrix(GARCH11forecasterror_par_dm), matrix(GARCH11SVIforecasterror_par_dm), alternative = "less")
print("Diebold and Mariano test GJRGARCH11 with and without SVI")
GJRGARCH11forecasterror_par_dm = GJRGARCH11forecasterror_par - (GJRGARCH11SVIforecasterror_par*0)
GJRGARCH11SVIforecasterror_par_dm = GJRGARCH11SVIforecasterror_par - (GJRGARCH11forecasterror_par*0)
dm.test(matrix(GJRGARCH11forecasterror_par_dm), matrix(GJRGARCH11SVIforecasterror_par_dm), alternative = "less")
print("Diebold and Mariano test EGARCH11 with and without SVI")
EGARCH11forecasterror_par_dm = EGARCH11forecasterror_par - (EGARCH11SVIforecasterror_par*0)
EGARCH11SVIforecasterror_par_dm = EGARCH11SVIforecasterror_par - (EGARCH11forecasterror_par*0)
dm.test(matrix(EGARCH11forecasterror_par_dm), matrix(EGARCH11SVIforecasterror_par_dm), alternative = "less")
[1] "Diebold and Mariano test GARCH11 with and without SVI"
Diebold-Mariano Test
data: matrix(GARCH11forecasterror_par_dm)matrix(GARCH11SVIforecasterror_par_dm)
DM = -1.9405, Forecast horizon = 1, Loss function power = 2, p-value = 0.0262
alternative hypothesis: less
[1] "Diebold and Mariano test GJRGARCH11 with and without SVI"
Diebold-Mariano Test
data: matrix(GJRGARCH11forecasterror_par_dm)matrix(GJRGARCH11SVIforecasterror_par_dm)
DM = -1.1588, Forecast horizon = 1, Loss function power = 2, p-value = 0.1233
alternative hypothesis: less
[1] "Diebold and Mariano test EGARCH11 with and without SVI"
Diebold-Mariano Test
data: matrix(EGARCH11forecasterror_par_dm)matrix(EGARCH11SVIforecasterror_par_dm)
DM = -1.5212, Forecast horizon = 1, Loss function power = 2, p-value = 0.06415
alternative hypothesis: less
Here, the alternative hypothesis is that method 2 (SVI-augmented) is less accurate than method 1 (SVI-free). All three DM statistics are negative, which points towards the SVI-free models producing the smaller volatility forecast errors, in line with the RMSEs reported above. However, only the GARCH and EGARCH comparisons are statistically significant (at the 5% and 10% significance levels respectively); the GJR-GARCH comparison is not. All in all, more is needed to evaluate the usefulness of our method.
Saving our data up to this point to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p4.01.RData")
# To restore your workspace, type this:
# load("IDaSRP_work_space_p4.01.RData")