Breakdown
In this article, I primarily study the frameworks outlined in CPV to investigate the explanatory and forecasting powers of information demand on the SPX.
Financial applications of my models are studied from the perspective of a single profit maximising investor in a manner akin to CPV (Kandel and Stambaugh (1996)). In the interest of time, I only investigate the scenario where the investing agent's risk aversion implies taking on risk as soon as its expected reward is positive. Later we will look into a framework to optimise this risk aversion level to allow for a truly profit maximising strategy.
- Part 1: Data Retrieval and Analysis: In this part, we start the R code, construct some rudimentary functions and collect data from Refinitiv, all while explaining their use. We also go through the concepts of risk-free rates and excess returns, realised variance and realised volatility, and Google’s Search Vector Index, before lightly analysing the retrieved data.
- Part 2: Econometric Modelling of Non-Linear Variances: In this section, there is no coding. Instead we go – step by step and including worked-out derivations – through ARMA models, their parameter estimation process (via log-likelihood), the concepts of stationarity, Taylor expansions, the chain rule, first order conditions, and GARCH models (including the GARCH, GJRGARCH and EGARCH models). Finally, we outline the way in which we construct error values from which to evaluate the performance of our models.
- Part 3: Coding Non-Linear Variance Econometric Models: In this section we encode the mathematical models and concepts of Part 2. We build our code, explaining it step by step, using in-series (computationally slow) and in-parallel (computationally fast) methods. Finally we present the Diebold-Mariano test results for each pair of model performances, with and without using SVIs.
- Part 4: Sign Forecast Frameworks: Here we outline the Christoffersen and Diebold model to estimate the probability of a positive return, and we compare the predictive performance of each model via rudimentary methods, Brier Scores and Diebold-Mariano statistics.
- Part 5: Financial significance: Finally, we put ourselves in the shoes of a profit maximising investor who reinvests – on a daily basis – in the risk-free asset or the S&P 500 if the model suggests that its excess return will be positive with more probability than not. We graph the cumulative returns of such an investor for each model-following strategy and discuss the results with the help of Sharpe-Ratios before concluding.
Development Tools & Resources
Refinitiv's DataStream Web Services for R (DatastreamDSWS2R): Access to DataStream data. A DataStream or Refinitiv Workspace IDentification (ID) will be needed to run the code below.
Part 5: Financial significance
Part 5 Contents
- Financial Significance
- Getting to the Coding
- Granger and Pesaran (2000)'s framework using the Buy and Hold (Cumulative and Recursive) Model
- Granger and Pesaran (2000)'s framework using the GARCH11 model
- Granger and Pesaran (2000)'s framework using the GARCH11-SVI model
- Granger and Pesaran (2000)'s framework using the GJRGARCH11 model
- Granger and Pesaran (2000)'s framework using the GJRGARCH11-SVI model
- Granger and Pesaran (2000)'s framework using the EGARCH11 model
- Granger and Pesaran (2000)'s framework using the EGARCH11-SVI model
- Granger and Pesaran (2000)'s framework 2d Graphs put together
- Trading Strategy Results
- Conclusion
- Appendix
This section investigates the financial implications of our work via the lens of a single profit maximising agent deciding on whether to invest in the risk-free rate of the index host's country (TRUS1MT in this instance) or in the index itself.
Now we can implement Granger and Pesaran (2000)'s framework. It allows us to put ourselves in the shoes of an investor who chooses between the risk-free asset and the risky one on a daily basis, i.e.:
$$ \begin{equation} R_{SPX, j, t} = \omega_{_{SPX, j, t-1}} R_{SPX, m, t} + (1 - \omega_{_{SPX, j, t-1}}) {r_f}_{_{SPX, t}} \end{equation} $$where the strategy is denoted via $j = ( C\&D_M , Naive) $ for model $M$ ranging from GARCH to EGARCH-SVI and $C\&D_M$ stands for the Christoffersen and Diebold model applied with model $M$, and where
$$ \begin{equation} \omega_{_{j, t}} = I(\widehat{\pi}_{j, t+1} > \psi) = \begin{Bmatrix} 1 & \text{if } \widehat{\pi}_{j, t+1} > \psi \\ 0 & \text{otherwise} \end{Bmatrix} \end{equation} $$for a probability threshold $\psi \in \mathbb{R}$ such that $ 0 \leq \psi \leq 1 $, and where
$$ \begin{array}{ll} R_{SPX, m, t} &= \frac{P_{SPX, t} - P_{SPX, t - 1}}{P_{SPX, t - 1}} \\ &= R_{SPX, t} + {r_f}_{_{SPX, t}} \text{ .} \end{array} $$An investor's return following from this strategy is therefore cumulative and can be defined as
$$ \begin{equation} \mathcal{R}_{SPX, j, t} = \prod_{i=1}^{t} (1 + R_{SPX, j, i}) \text{ .} \end{equation} $$Similarly, the Buy-and-Hold strategy used here may be considered a misnomer, as it is defined as the return from recursively reinvesting previous-day returns in the index every day, starting with one unit of currency (U.S. Dollars for the SPX) for comparability, i.e.: $$\prod_{i=1}^{t} (1 + R_{SPX, B\&H, i}) \text{ .}$$
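As a quick numerical illustration of this compounding (with purely hypothetical daily returns), R's cumprod computes the wealth path of one unit of currency directly:
# hypothetical daily simple returns, for illustration only
R_daily = c(0.001, -0.002, 0.0005, 0.003)
# wealth path from reinvesting one unit of currency every day: prod(1 + R) up to each date
cumprod(1 + R_daily)
# 1.0010000 0.9989980 0.9994975 1.0024960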
We define $ y_{d,t} = I(R_{SPX, t} > 0) = \begin{Bmatrix} 1 & \text{if } R_{SPX, t} > 0 \\ 0 & \text{otherwise} \end{Bmatrix} $ as the indicator of the realised direction of the return on the S&P 500 index. This is to reflect (but it is not identical to) CPV's notation. From there we construct $ \mathbf{y\_d} = $ $\left[ \begin{matrix} y_{d,1} \\ y_{d,2} \\ \vdots\\ y_{d,T_{y\_d}} \end{matrix} \right] $ where $t \in \mathbb{Z}$ and $ 1 \le t \le T_{y\_d}$ as follows:
# realised direction of SPX excess returns: 1 if positive, 0 otherwise
y_d = ifelse(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))>0,1,0)
Set the probability threshold at which to invest in the index in our 2d graphs, i.e.: set $\psi = 0.5$. This implies that the agent invests in the index if the forecast probability of a positive excess return exceeds that of a negative one.
p = 0.5
Following from C\&D, Christoffersen, Diebold, Mariano, Tay & Tse (2006)'s working paper outlined how the conditional probability of a positive return in the next time period is $$\begin{equation} \mathbb{P}(R_{t+1} > 0 | \Omega_{t}) = 1 - \mathbb{P} \bigg( \frac{R_{t+1} - \mu}{\sigma_{t+1|t}} \leq \frac{-\mu}{\sigma_{t+1|t}} \bigg) = 1 - F\bigg( \frac{-\mu}{\sigma_{t+1|t}} \bigg) \end{equation}$$
where $\mathbb{P}$ denotes probability, where there is no conditional mean predictability in returns, i.e.: $\mu_{t+1|t} = \mu$, where $R_{t+1}\sim D(\mu,\sigma_{t+1|t})$ for a generic distribution $D$ dependent only on $\mu$ and $\sigma$ (such as the Normal distribution), and where $F$ is the distribution function of the “standardized” return $\frac{R_{t+1} - \mu}{\sigma_{t+1|t}}$. Then $R_{t+1}$'s sign is predictable even if its conditional mean isn't, provided that $\mu \neq 0$. If $F$ is asymmetric - as per SFFR(i) - then sign forecasts remain possible even if $\mu = 0$.
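To make this concrete, here is a minimal hedged sketch in R, assuming (purely for illustration) that $D$ is the Normal distribution and picking hypothetical values for $\mu$ and $\sigma_{t+1|t}$:
# P(R_{t+1} > 0 | Omega_t) = 1 - F(-mu / sigma_{t+1|t}), assuming D = Normal;
# mu and sigma are hypothetical values, not estimates from this article
mu = 0.0003   # hypothetical constant mean of daily returns
sigma = 0.01  # hypothetical one-step-ahead volatility forecast
1 - pnorm(-mu / sigma)  # ~ 0.512: a positive mu tilts the sign odds above one half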
$R_{t+1}$ sign forecasts are formulated as $$\begin{equation} \label{eq:PiCandD} \hat{\pi}_{C\&D_M, t+1} = \hat{\mathbb{P}}_{C\&D_M}(R_{t+1|t}>0) = 1 - \frac{1}{t} \sum_{k=1}^{t} I \bigg( \frac{R_{k} - {\hat{\mu}_{k|k-1}}}{\hat{\sigma}_{k|k-1}} \leq \frac{-\hat{\mu}_{t+1|t}}{\hat{\sigma}_{t+1|t}} \bigg) \end{equation}$$
where $M$ is the model used to compute recursive one-step-ahead out-of-sample forecasts $\hat{\mu}$ and $\hat{\sigma}$, ranging from GARCH to EGARCH-SVI (note that the models used to estimate $\hat{\mu}_{t+1|t}$ and $\hat{\sigma}_{t+1|t}$ are not specified on the right-hand side of this equation to keep the notation uncluttered; this holds throughout the article), and where $k \in \mathbb{Z}$.
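A minimal sketch of this empirical-distribution evaluation follows; the vectors R_hist, mu_hist and sigma_hist (past returns and their recursive one-step-ahead forecasts) and the scalars mu_next and sigma_next (the latest forecasts) are hypothetical placeholders, not objects defined elsewhere in this article:
# hat{pi}_{C&D, t+1} = 1 - (1/t) * sum_k I(z_k <= -mu_next / sigma_next),
# where z_k are the standardised in-sample returns; all names are hypothetical
CandD_sign_forecast = function(R_hist, mu_hist, sigma_hist, mu_next, sigma_next) {
  z = (R_hist - mu_hist) / sigma_hist       # standardised returns
  1 - mean(z <= (-mu_next / sigma_next))    # empirical 1 - F(-mu/sigma)
}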
Remember our Naïve Framework. A Naïve model of sign change is used as a benchmark to construct comparisons. It is formulated as $$ \hat{\pi}_{Naive, t+1} = \hat{\mathbb{P}}_{Naive}(R_{t+1|t}>0) = \frac{1}{t} \sum_{k=1}^{t} I (R_k > 0) \text{ .}$$
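For illustration, the Naive probability at each date is simply the expanding proportion of past positive returns; a one-line sketch, assuming a hypothetical numeric vector R_hist of past returns:
# hat{pi}_{Naive, t+1} = (1/t) * sum_{k=1}^{t} I(R_k > 0);
# R_hist is a hypothetical numeric vector of past returns
pi_naive = cumsum(R_hist > 0) / seq_along(R_hist)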
Now we can construct our modelled $ \widehat{y_{d,Naive,t}} = I(\hat{\pi}_{Naive, t} > \psi) = \begin{Bmatrix} 1 & \text{if } \hat{\pi}_{Naive, t} > \psi \\ 0 & \text{otherwise} \end{Bmatrix} $. From there we construct $ \mathbf{y\_hat\_d\_Naive} = $ $\left[ \begin{matrix} \widehat{y_{d,Naive,1}} \\ \widehat{y_{d,Naive,2}} \\ \vdots\\ \widehat{y_{d,Naive,T_{y\_hat\_d\_Naive}}} \end{matrix} \right] $ where $t \in \mathbb{Z}$ and $ 1 \le t \le T_{y\_hat\_d\_Naive}$ as follows:
# corresponding directional forecast and realised direction
y_hat_d_Naive = ifelse(Naive_zoo > p, 1, 0)
# y_d_Naive = y_d - (Naive_zoo*0)
From there we can compute our $\omega_{Naive}$ such that $ \begin{equation} \omega_{_{Naive, t}} = I(\widehat{\pi}_{Naive, t+1} > \psi) = \begin{Bmatrix} 1 & \text{if } \widehat{\pi}_{Naive, t+1} > \psi \\ 0 & \text{otherwise} \end{Bmatrix} \end{equation} $ . Recall that thus far we let the threshold probability above which we buy a share of the S&P 500 be $0.5$, i.e.: $\psi = 0.5$; we named its R variable p for convenience when it comes to coding. Now we let $ \mathbf{omega\_Naive} = $ $\left[ \begin{matrix} \omega_{_{Naive, 1}} \\ \omega_{_{Naive, 2}} \\ \vdots\\ \omega_{_{Naive, T_{omega\_Naive}}} \end{matrix} \right] $ where $t \in \mathbb{Z}$ and $ 1 \le t \le T_{omega\_Naive}$ as follows:
# let the portfolio weight attributed to the stock market index be
omega_Naive = ifelse( (lead(as.matrix(y_hat_d_Naive),1)) == 1 , 1, 0)
Remember that $ R_{SPX, j, t} = \omega_{_{SPX, j, t-1}} R_{SPX, m, t} + (1 - \omega_{_{SPX, j, t-1}}) {r_f}_{_{SPX, t}} $ where the strategy is denoted via $j = ( C\&D_M , Naive) $ for model $M$ ranging from GARCH to EGARCH-SVI; thus we may now define
$$ \begin{array}{ll} \mathbf{R\_Active\_Naive\_p} &= \left[ \begin{matrix} R_{SPX, Naive, 1} \\ R_{SPX, Naive, 2} \\ \vdots\\ R_{SPX, Naive, T_{R\_Active\_Naive\_p}} \end{matrix} \right] \\ &= \left[ \begin{matrix} \omega_{_{SPX, Naive, 0}} (R_{SPX, 1} + {r_f}_{_{SPX, 1}}) + (1 - \omega_{_{SPX, Naive, 0}}) {r_f}_{_{SPX, 1}} \\ \omega_{_{SPX, Naive, 1}} (R_{SPX, 2} + {r_f}_{_{SPX, 2}}) + (1 - \omega_{_{SPX, Naive, 1}}) {r_f}_{_{SPX, 2}} \\ \vdots\\ \omega_{_{SPX, Naive, T_{R\_Active\_Naive\_p}-1}} (R_{SPX, T_{R\_Active\_Naive\_p}} + {r_f}_{_{SPX, T_{R\_Active\_Naive\_p}}}) + (1 - \omega_{_{SPX, Naive, T_{R\_Active\_Naive\_p}-1}}) {r_f}_{_{SPX, T_{R\_Active\_Naive\_p}}} \end{matrix} \right] \end{array}$$where $t \in \mathbb{Z}$ and $ 1 \le t \le T_{R\_Active\_Naive\_p}$ as follows:
R_Active_Naive_p = ((lag(omega_Naive,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_Naive*0)) +
(1-lag(omega_Naive,1)) * (US_1MO_r_f_zoo-(y_hat_d_Naive*0))))[2:length(y_hat_d_Naive)]
Such matrices are constructed for all $M$. To avoid clutter, the logic is not spelled out in words for each model, but the code remains visible for study.
$$ \\ $$Thus $ \mathcal{R}_{SPX, Naive, t} = \prod_{i=1}^{t} (1 + R_{SPX, Naive, i}) $ and we may define
$$ \begin{array}{ll} \mathbf{R\_Active\_Naive\_p\_cumulated} &= \left[ \begin{matrix} \mathcal{R}_{SPX, Naive, 1} \\ \mathcal{R}_{SPX, Naive, 2} \\ \vdots\\ \mathcal{R}_{SPX, Naive, T_{R\_Active\_Naive\_p\_cumulated}} \end{matrix} \right] \\ &= \left[ \begin{matrix} (1 + R_{SPX, Naive, 1}) \\ (1 + R_{SPX, Naive, 1}) (1 + R_{SPX, Naive, 2}) \\ \prod_{i=1}^{3} (1 + R_{SPX, Naive, i}) \\ \vdots\\ \prod_{i=1}^{T_{R\_Active\_Naive\_p\_cumulated}} (1 + R_{SPX, Naive, i}) \end{matrix} \right] \end{array}$$where $t \in \mathbb{Z}$ and $ 1 \le t \le T_{R\_Active\_Naive\_p\_cumulated}$ as follows:
R_Active_Naive_p_cumulated = matrix(,nrow=length(R_Active_Naive_p))
for (i in c(1:length(R_Active_Naive_p)))
{R_Active_Naive_p_cumulated[i] = prod((1 + R_Active_Naive_p)[1:i])}
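As an aside, this loop (which re-multiplies the whole prefix at every step) should be equivalent to R's vectorised cumulative product; a minimal sketch under that assumption:
# vectorised alternative to the loop above (same values, single pass)
R_Active_Naive_p_cumulated_alt = cumprod(1 + zoo::coredata(R_Active_Naive_p))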
# R_cumulated_Naive = matrix(,nrow = length(R_Active_Naive_p))
# for (i in c(1:length(R_Active_Naive_p)))
# {R_cumulated_Naive[i] = prod((1 + (zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) +
# (US_1MO_r_f_zoo - (R_Active_Naive_p*0))))[1:i])}
As aforementioned, $ \mathcal{R}_{SPX, B\&H, t} = \prod_{i=1}^{t} (1 + R_{SPX, B\&H, i})$, thus we let
$$ \begin{array}{ll} \mathbf{R\_cumulated} &= \left[ \begin{matrix} \mathcal{R}_{SPX, B\&H, 1} \\ \mathcal{R}_{SPX, B\&H, 2} \\ \vdots\\ \mathcal{R}_{SPX, B\&H, T_{R\_cumulated}} \end{matrix} \right] \\ &= \left[ \begin{matrix} (1 + R_{SPX, B\&H, 1}) \\ (1 + R_{SPX, B\&H, 1}) (1 + R_{SPX, B\&H, 2}) \\ \prod_{i=1}^{3} (1 + R_{SPX, B\&H, i}) \\ \vdots\\ \prod_{i=1}^{T_{R\_cumulated}} (1 + R_{SPX, B\&H, i}) \end{matrix} \right] \\ \end{array}$$
R_cumulated = matrix(,nrow=length(R_Active_Naive_p))
for (i in c(1:length(R_Active_Naive_p)))
{R_cumulated[i] = prod((1 + (zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))
+ (US_1MO_r_f_zoo - (R_Active_Naive_p*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.16.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.16.RData")
# corresponding directional forecast and realised direction
y_hat_d_GARCH11_ser = ifelse(GARCH11_CandD_zoo_no_nas_ser>p,1,0)
y_d_GARCH11_ser = y_d - (GARCH11_CandD_zoo_no_nas_ser*0) # This y_d only has values on dates corresponding to GARCH11's.
Let the portfolio weight attributed to the stock market index be as follows (i.e.: if $\hat{y}_{d, GARCH11, t+1} = 1$, then $\omega = 1$, $0$ otherwise):
omega_GARCH11_ser = ifelse( (lead(as.matrix(y_hat_d_GARCH11_ser),1)) == 1, 1, 0)
R_Active_GARCH11_p_ser = ((lag(omega_GARCH11_ser,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GARCH11_ser*0))) +
((1-lag(omega_GARCH11_ser,1)) * (US_1MO_r_f_zoo - (y_hat_d_GARCH11_ser*0))))[2:length(y_hat_d_GARCH11_ser)]
R_Active_GARCH11_p_cumulated_ser = matrix(,nrow=length(R_Active_GARCH11_p_ser))
for (i in c(1:length(R_Active_GARCH11_p_ser)))
{R_Active_GARCH11_p_cumulated_ser[i] = prod((1+R_Active_GARCH11_p_ser)[1:i])}
R_cumulated_GARCH11_ser = matrix(,nrow=length(R_Active_GARCH11_p_ser))
for (i in c(1:length(R_Active_GARCH11_p_ser)))
{R_cumulated_GARCH11_ser[i] = prod((1 + (zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + (US_1MO_r_f_zoo - (R_Active_GARCH11_p_ser*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.17.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.18.RData")
y_hat_d_GARCH11_par = ifelse(GARCH11_CandD_zoo_no_nas_par>p,1,0)
y_d_GARCH11_par = y_d - (GARCH11_CandD_zoo_no_nas_par*0) # This y_d only has values on dates corresponding to GARCH11's.
Let the portfolio weight attributed to the stock market index be as follows (i.e.: if $\widehat{y_{d, GARCH11, t+1}} = 1$, then $\omega = 1$, $0$ otherwise):
omega_GARCH11_par = ifelse( (lead(as.matrix(y_hat_d_GARCH11_par),1)) == 1, 1, 0)
R_Active_GARCH11_p_par = ((lag(omega_GARCH11_par,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GARCH11_par*0))) +
((1-lag(omega_GARCH11_par,1)) * (US_1MO_r_f_zoo-(y_hat_d_GARCH11_par*0))))[2:length(y_hat_d_GARCH11_par)]
R_Active_GARCH11_p_cumulated_par = matrix(,nrow=length(R_Active_GARCH11_p_par))
for (i in c(1:length(R_Active_GARCH11_p_par)))
{R_Active_GARCH11_p_cumulated_par[i] = prod((1+R_Active_GARCH11_p_par)[1:i])}
R_cumulated_GARCH11_par = matrix(,nrow=length(R_Active_GARCH11_p_par))
for (i in c(1:length(R_Active_GARCH11_p_par)))
{R_cumulated_GARCH11_par[i] = prod((1 + (zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + (US_1MO_r_f_zoo - (R_Active_GARCH11_p_par*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.18.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.18.RData")
# corresponding directional forecast and realised direction
y_hat_d_GARCH11SVI_ser = ifelse(GARCH11SVI_CandD_zoo_no_nas_ser>p, 1, 0)
y_d_GARCH11SVI_ser = y_d - (GARCH11SVI_CandD_zoo_no_nas_ser * 0) # This y_d only has values on dates corresponding to GARCH11SVI's.
# let the portfolio weight attributed to the stock market index be
omega_GARCH11SVI_ser = ifelse( (lead(as.matrix(y_hat_d_GARCH11SVI_ser),1))==1 , 1, 0)
R_Active_GARCH11SVI_p_ser = ((lag(omega_GARCH11SVI_ser,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GARCH11SVI_ser*0))) +
((1-lag(omega_GARCH11SVI_ser,1)) * (US_1MO_r_f_zoo - (y_hat_d_GARCH11SVI_ser*0))))[2:length(y_hat_d_GARCH11SVI_ser)]
R_Active_GARCH11SVI_p_cumulated_ser = matrix(,nrow=length(R_Active_GARCH11SVI_p_ser))
for (i in c(1:length(R_Active_GARCH11SVI_p_ser)))
{R_Active_GARCH11SVI_p_cumulated_ser[i] = prod((1+R_Active_GARCH11SVI_p_ser)[1:i])}
R_cumulated_GARCH11SVI_ser = matrix(,nrow=length(R_Active_GARCH11SVI_p_ser))
for (i in c(1:length(R_Active_GARCH11SVI_p_ser)))
{R_cumulated_GARCH11SVI_ser[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + (US_1MO_r_f_zoo - (R_Active_GARCH11SVI_p_ser*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.19.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.19.RData")
# corresponding directional forecast and realised direction
y_hat_d_GARCH11SVI_par = ifelse(GARCH11SVI_CandD_zoo_no_nas_par>p, 1, 0)
y_d_GARCH11SVI_par = y_d - (GARCH11SVI_CandD_zoo_no_nas_par * 0) # This y_d only has values on dates corresponding to GARCH11SVI's.
# let the portfolio weight attributed to the stock market index be
omega_GARCH11SVI_par = ifelse( (lead(as.matrix(y_hat_d_GARCH11SVI_par),1))==1 , 1, 0)
R_Active_GARCH11SVI_p_par = ((lag(omega_GARCH11SVI_par,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GARCH11SVI_par*0))) +
((1-lag(omega_GARCH11SVI_par,1)) * (US_1MO_r_f_zoo - (y_hat_d_GARCH11SVI_par*0))))[2:length(y_hat_d_GARCH11SVI_par)]
R_Active_GARCH11SVI_p_cumulated_par = matrix(,nrow=length(R_Active_GARCH11SVI_p_par))
for (i in c(1:length(R_Active_GARCH11SVI_p_par)))
{R_Active_GARCH11SVI_p_cumulated_par[i] = prod((1+R_Active_GARCH11SVI_p_par)[1:i])}
R_cumulated_GARCH11SVI_par = matrix(,nrow=length(R_Active_GARCH11SVI_p_par))
for (i in c(1:length(R_Active_GARCH11SVI_p_par)))
{R_cumulated_GARCH11SVI_par[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + (US_1MO_r_f_zoo - (R_Active_GARCH11SVI_p_par*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.20.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.20.RData")
# corresponding directional forecast and realised direction
y_hat_d_GJRGARCH11_ser = ifelse(GJRGARCH11_CandD_zoo_no_nas_ser>p,1,0)
y_d_GJRGARCH11_ser = y_d - (GJRGARCH11_CandD_zoo_no_nas_ser*0) # This y_d only has values on dates corresponding to GJRGARCH11's.
# let the portfolio weight attributed to the stock market index be
omega_GJRGARCH11_ser = ifelse( (lead(as.matrix(y_hat_d_GJRGARCH11_ser),1))==1 , 1, 0)
R_Active_GJRGARCH11_p_ser = ((lag(omega_GJRGARCH11_ser,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GJRGARCH11_ser*0))) +
((1-lag(omega_GJRGARCH11_ser,1)) * (US_1MO_r_f_zoo-(y_hat_d_GJRGARCH11_ser*0))))[2:length(y_hat_d_GJRGARCH11_ser)]
R_Active_GJRGARCH11_p_cumulated_ser = matrix(,nrow=length(R_Active_GJRGARCH11_p_ser))
for (i in c(1:length(R_Active_GJRGARCH11_p_ser)))
{R_Active_GJRGARCH11_p_cumulated_ser[i] = prod((1+R_Active_GJRGARCH11_p_ser)[1:i])}
R_cumulated_GJRGARCH11_ser = matrix(,nrow=length(R_Active_GJRGARCH11_p_ser))
for (i in c(1:length(R_Active_GJRGARCH11_p_ser)))
{R_cumulated_GJRGARCH11_ser[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))+(US_1MO_r_f_zoo - (R_Active_GJRGARCH11_p_ser*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.21.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.21.RData")
# corresponding directional forecast and realised direction
y_hat_d_GJRGARCH11_par = ifelse(GJRGARCH11_CandD_zoo_no_nas_par>p,1,0)
y_d_GJRGARCH11_par = y_d - (GJRGARCH11_CandD_zoo_no_nas_par*0) # This y_d only has values on dates corresponding to GJRGARCH11's.
# let the portfolio weight attributed to the stock market index be
omega_GJRGARCH11_par = ifelse( (lead(as.matrix(y_hat_d_GJRGARCH11_par),1))==1 , 1, 0)
R_Active_GJRGARCH11_p_par = ((lag(omega_GJRGARCH11_par,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GJRGARCH11_par*0))) +
((1-lag(omega_GJRGARCH11_par,1)) * (US_1MO_r_f_zoo-(y_hat_d_GJRGARCH11_par*0))))[2:length(y_hat_d_GJRGARCH11_par)]
R_Active_GJRGARCH11_p_cumulated_par = matrix(,nrow=length(R_Active_GJRGARCH11_p_par))
for (i in c(1:length(R_Active_GJRGARCH11_p_par)))
{R_Active_GJRGARCH11_p_cumulated_par[i] = prod((1+R_Active_GJRGARCH11_p_par)[1:i])}
R_cumulated_GJRGARCH11_par = matrix(,nrow=length(R_Active_GJRGARCH11_p_par))
for (i in c(1:length(R_Active_GJRGARCH11_p_par)))
{R_cumulated_GJRGARCH11_par[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))+(US_1MO_r_f_zoo - (R_Active_GJRGARCH11_p_par*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.22.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.22.RData")
# corresponding directional forecast and realised direction
y_hat_d_GJRGARCH11SVI_ser = ifelse(GJRGARCH11SVI_CandD_zoo_no_nas_ser>p,1,0)
y_d_GJRGARCH11SVI_ser = y_d - (GJRGARCH11SVI_CandD_zoo_no_nas_ser*0) # This y_d only has values on dates corresponding to GJRGARCH11SVI's.
# let the portfolio weight attributed to the stock market index be
omega_GJRGARCH11SVI_ser = ifelse( (lead(as.matrix(y_hat_d_GJRGARCH11SVI_ser),1))==1 , 1, 0)
R_Active_GJRGARCH11SVI_p_ser = ((lag(omega_GJRGARCH11SVI_ser,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GJRGARCH11SVI_ser*0))) +
((1-lag(omega_GJRGARCH11SVI_ser,1)) * (US_1MO_r_f_zoo-(y_hat_d_GJRGARCH11SVI_ser*0))))[2:length(y_hat_d_GJRGARCH11SVI_ser)]
R_Active_GJRGARCH11SVI_p_cumulated_ser = matrix(,nrow=length(R_Active_GJRGARCH11SVI_p_ser))
for (i in c(1:length(R_Active_GJRGARCH11SVI_p_ser)))
{R_Active_GJRGARCH11SVI_p_cumulated_ser[i] = prod((1+R_Active_GJRGARCH11SVI_p_ser)[1:i])}
R_cumulated_GJRGARCH11SVI_ser = matrix(,nrow=length(R_Active_GJRGARCH11SVI_p_ser))
for (i in c(1:length(R_Active_GJRGARCH11SVI_p_ser)))
{R_cumulated_GJRGARCH11SVI_ser[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))+(US_1MO_r_f_zoo - (R_Active_GJRGARCH11SVI_p_ser*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.23.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.23.RData")
# corresponding directional forecast and realised direction
y_hat_d_GJRGARCH11SVI_par = ifelse(GJRGARCH11SVI_CandD_zoo_no_nas_par>p,1,0)
y_d_GJRGARCH11SVI_par = y_d - (GJRGARCH11SVI_CandD_zoo_no_nas_par*0) # This y_d only has values on dates corresponding to GJRGARCH11SVI's.
# let the portfolio weight attributed to the stock market index be
omega_GJRGARCH11SVI_par = ifelse( (lead(as.matrix(y_hat_d_GJRGARCH11SVI_par),1))==1 , 1, 0)
R_Active_GJRGARCH11SVI_p_par = ((lag(omega_GJRGARCH11SVI_par,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_GJRGARCH11SVI_par*0))) +
((1-lag(omega_GJRGARCH11SVI_par,1)) * (US_1MO_r_f_zoo-(y_hat_d_GJRGARCH11SVI_par*0))))[2:length(y_hat_d_GJRGARCH11SVI_par)]
R_Active_GJRGARCH11SVI_p_cumulated_par = matrix(,nrow=length(R_Active_GJRGARCH11SVI_p_par))
for (i in c(1:length(R_Active_GJRGARCH11SVI_p_par)))
{R_Active_GJRGARCH11SVI_p_cumulated_par[i] = prod((1+R_Active_GJRGARCH11SVI_p_par)[1:i])}
R_cumulated_GJRGARCH11SVI_par = matrix(,nrow=length(R_Active_GJRGARCH11SVI_p_par))
for (i in c(1:length(R_Active_GJRGARCH11SVI_p_par)))
{R_cumulated_GJRGARCH11SVI_par[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))+(US_1MO_r_f_zoo - (R_Active_GJRGARCH11SVI_p_par*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.24.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.24.RData")
# corresponding directional forecast and realised direction
y_hat_d_EGARCH11_ser = ifelse(EGARCH11_CandD_zoo_no_nas_ser>p,1,0)
y_d_EGARCH11_ser = y_d - (EGARCH11_CandD_zoo_no_nas_ser*0) # This y_d only has values on dates corresponding to EGARCH11's.
# let the portfolio weight attributed to the stock market index be
omega_EGARCH11_ser = ifelse( (lead(as.matrix(y_hat_d_EGARCH11_ser),1))==1 , 1, 0)
R_Active_EGARCH11_p_ser = ((lag(omega_EGARCH11_ser,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_EGARCH11_ser*0))) +
((1-lag(omega_EGARCH11_ser,1)) * (US_1MO_r_f_zoo-(y_hat_d_EGARCH11_ser*0))))[2:length(y_hat_d_EGARCH11_ser)]
R_Active_EGARCH11_p_cumulated_ser = matrix(,nrow=length(R_Active_EGARCH11_p_ser))
for (i in c(1:length(R_Active_EGARCH11_p_ser)))
{R_Active_EGARCH11_p_cumulated_ser[i] = prod((1+R_Active_EGARCH11_p_ser)[1:i])}
R_cumulated_EGARCH11_ser = matrix(,nrow=length(R_Active_EGARCH11_p_ser))
for (i in c(1:length(R_Active_EGARCH11_p_ser)))
{R_cumulated_EGARCH11_ser[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))+(US_1MO_r_f_zoo - (R_Active_EGARCH11_p_ser*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.25.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.25.RData")
# corresponding directional forecast and realised direction
y_hat_d_EGARCH11_par = ifelse(EGARCH11_CandD_zoo_no_nas_par>p,1,0)
y_d_EGARCH11_par = y_d - (EGARCH11_CandD_zoo_no_nas_par*0) # This y_d only has values on dates corresponding to EGARCH11's.
# let the portfolio weight attributed to the stock market index be
omega_EGARCH11_par = ifelse( (lead(as.matrix(y_hat_d_EGARCH11_par),1))==1 , 1, 0)
R_Active_EGARCH11_p_par = ((lag(omega_EGARCH11_par,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_EGARCH11_par*0))) +
((1-lag(omega_EGARCH11_par,1)) * (US_1MO_r_f_zoo-(y_hat_d_EGARCH11_par*0))))[2:length(y_hat_d_EGARCH11_par)]
R_Active_EGARCH11_p_cumulated_par = matrix(,nrow=length(R_Active_EGARCH11_p_par))
for (i in c(1:length(R_Active_EGARCH11_p_par)))
{R_Active_EGARCH11_p_cumulated_par[i] = prod((1+R_Active_EGARCH11_p_par)[1:i])}
R_cumulated_EGARCH11_par = matrix(,nrow=length(R_Active_EGARCH11_p_par))
for (i in c(1:length(R_Active_EGARCH11_p_par)))
{R_cumulated_EGARCH11_par[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))+(US_1MO_r_f_zoo - (R_Active_EGARCH11_p_par*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.26.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.26.RData")
# corresponding directional forecast and realised direction
y_hat_d_EGARCH11SVI_ser = ifelse(EGARCH11SVI_CandD_zoo_no_nas_ser>p,1,0)
y_d_EGARCH11SVI_ser = y_d - (EGARCH11SVI_CandD_zoo_no_nas_ser*0) # This y_d only has values on dates corresponding to EGARCH11SVI's.
# let the portfolio weight attributed to the stock market index be
omega_EGARCH11SVI_ser = ifelse( (lead(as.matrix(y_hat_d_EGARCH11SVI_ser),1))==1 , 1, 0)
R_Active_EGARCH11SVI_p_ser = ((lag(omega_EGARCH11SVI_ser,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_EGARCH11SVI_ser*0))) +
((1-lag(omega_EGARCH11SVI_ser,1)) * (US_1MO_r_f_zoo-(y_hat_d_EGARCH11SVI_ser*0))))[2:length(y_hat_d_EGARCH11SVI_ser)]
R_Active_EGARCH11SVI_p_cumulated_ser = matrix(,nrow=length(R_Active_EGARCH11SVI_p_ser))
for (i in c(1:length(R_Active_EGARCH11SVI_p_ser)))
{R_Active_EGARCH11SVI_p_cumulated_ser[i] = prod((1+R_Active_EGARCH11SVI_p_ser)[1:i])}
R_cumulated_EGARCH11SVI_ser = matrix(,nrow=length(R_Active_EGARCH11SVI_p_ser))
for (i in c(1:length(R_Active_EGARCH11SVI_p_ser)))
{R_cumulated_EGARCH11SVI_ser[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")]))+(US_1MO_r_f_zoo - (R_Active_EGARCH11SVI_p_ser*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.27.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.27.RData")
# corresponding directional forecast and realised direction
y_hat_d_EGARCH11SVI_par = ifelse(EGARCH11SVI_CandD_zoo_no_nas_par>p,1,0)
y_d_EGARCH11SVI_par = y_d - (EGARCH11SVI_CandD_zoo_no_nas_par*0) # This y_d only has values on dates corresponding to EGARCH11SVI's.
# let the portfolio weight attributed to the stock market index be
omega_EGARCH11SVI_par = ifelse( (lead(as.matrix(y_hat_d_EGARCH11SVI_par),1))==1 , 1, 0)
R_Active_EGARCH11SVI_p_par = ((lag(omega_EGARCH11SVI_par,1) * ((zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + zoo(US_1MO_r_f)) - (y_hat_d_EGARCH11SVI_par*0))) +
((1-lag(omega_EGARCH11SVI_par,1)) * (US_1MO_r_f_zoo-(y_hat_d_EGARCH11SVI_par*0))))[2:length(y_hat_d_EGARCH11SVI_par)]
R_Active_EGARCH11SVI_p_cumulated_par = matrix(,nrow=length(R_Active_EGARCH11SVI_p_par))
for (i in c(1:length(R_Active_EGARCH11SVI_p_par)))
{R_Active_EGARCH11SVI_p_cumulated_par[i] = prod((1+R_Active_EGARCH11SVI_p_par)[1:i])}
R_cumulated_EGARCH11SVI_par = matrix(,nrow=length(R_Active_EGARCH11SVI_p_par))
for (i in c(1:length(R_Active_EGARCH11SVI_p_par)))
{R_cumulated_EGARCH11SVI_par[i] = prod((1+(zoo(df[, c("SPX_R")], index(df[, c("SPX_R")])) + (US_1MO_r_f_zoo - (R_Active_EGARCH11SVI_p_par*0))))[1:i])}
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.28.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.28.RData")
Let's see how many $\widehat{y_{d, j, t}}$ observations there are:
length(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))
3791
Let's have a look at the cumulative returns of our trader following each of the strategies with estimates computed in series
# Plot all 2d plots
par(bg = "grey30", col.axis = "white", col.lab = "white", col.main = "white", col.sub = "white")
plot(zoo(R_Active_GARCH11_p_cumulated_ser,
as.Date(zoo::index(y_hat_d_GARCH11_ser[2:length(y_hat_d_GARCH11_ser)]))),
cex.axis=1, type="l",col="red", xlab='Date', ylim=c(0.6,2.7), ylab='SPX strategy gain ($)', main=str_c("SPX Strategy_ser for psi ", p))
lines(zoo(R_cumulated,
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col = "white")
lines(zoo(R_Active_Naive_p_cumulated,
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col="black")
lines(zoo(R_Active_GARCH11SVI_p_cumulated_ser,
as.Date(zoo::index(y_hat_d_GARCH11SVI_ser[2:length(y_hat_d_GARCH11SVI_ser)]))),
col="orange")
lines(zoo(R_Active_EGARCH11SVI_p_cumulated_ser,
as.Date(zoo::index(y_hat_d_EGARCH11SVI_ser[2:length(y_hat_d_EGARCH11SVI_ser)]))),
col="purple")
lines(zoo(R_Active_GJRGARCH11_p_cumulated_ser,
as.Date(zoo::index(y_hat_d_GJRGARCH11_ser[2:length(y_hat_d_GJRGARCH11_ser)]))),
col="blue")
lines(zoo(R_Active_GJRGARCH11SVI_p_cumulated_ser,
as.Date(zoo::index(y_hat_d_GJRGARCH11SVI_ser[2:length(y_hat_d_GJRGARCH11SVI_ser)]))),
col="magenta")
lines(zoo(R_Active_EGARCH11_p_cumulated_ser,
as.Date(zoo::index(y_hat_d_EGARCH11_ser[2:length(y_hat_d_EGARCH11_ser)]))),
col="green")
legend(lty=1, cex=1,
"topleft", col = c("white", "black", "red", "orange", "blue", "magenta", "green", "purple"), text.col = c("white"),
legend=c("Buy and hold", "Naive", "GARCH", "GARCH-SVI", "GJRGARCH", "GJRGARCH-SVI", "EGARCH", "EGARCH-SVI"))
To get a narrower view of this graph, we can have a look at the difference between the (active) Buy & Hold model's output and the others':
par(bg = "grey30", col.axis = "white", col.lab = "white", col.main = "white", col.sub = "white")
plot(zoo(R_Active_GARCH11_p_cumulated_ser - R_cumulated[(length(R_cumulated)-length(R_Active_GARCH11_p_cumulated_ser)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GARCH11_ser[2:length(y_hat_d_GARCH11_ser)]))),
cex.axis=1, type="l",col="red", xlab='Date', ylim=c(-0.1,0.5), ylab='SPX strategy gain ($)', main=str_c("SPX Strategy_ser for psi ", p))
lines(zoo(R_Active_GJRGARCH11SVI_p_cumulated_ser - R_cumulated[(length(R_cumulated)-length(R_Active_GJRGARCH11SVI_p_cumulated_ser)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GJRGARCH11SVI_ser[2:length(y_hat_d_GJRGARCH11SVI_ser)]))),
col="magenta")
lines(zoo(R_cumulated - R_cumulated,
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col = "white")
lines(zoo(R_Active_Naive_p_cumulated - R_cumulated[(length(R_cumulated)-length(R_Active_Naive_p_cumulated)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col="black")
lines(zoo(R_Active_GARCH11SVI_p_cumulated_ser - R_cumulated[(length(R_cumulated)-length(R_Active_GARCH11SVI_p_cumulated_ser)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GARCH11SVI_ser[2:length(y_hat_d_GARCH11SVI_ser)]))),
col="orange")
lines(zoo(R_Active_EGARCH11SVI_p_cumulated_ser - R_cumulated[(length(R_cumulated)-length(R_Active_EGARCH11SVI_p_cumulated_ser)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_EGARCH11SVI_ser[2:length(y_hat_d_EGARCH11SVI_ser)]))),
col="purple")
lines(zoo(R_Active_GJRGARCH11_p_cumulated_ser - R_cumulated[(length(R_cumulated)-length(R_Active_GJRGARCH11_p_cumulated_ser)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GJRGARCH11_ser[2:length(y_hat_d_GJRGARCH11_ser)]))),
col="blue")
lines(zoo(R_Active_EGARCH11_p_cumulated_ser - R_cumulated[(length(R_cumulated)-length(R_Active_EGARCH11_p_cumulated_ser)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_EGARCH11_ser[2:length(y_hat_d_EGARCH11_ser)]))),
col="green")
legend(lty=1, cex=1,
"topleft", col = c("white", "black", "red", "orange", "blue", "magenta", "green", "purple"), text.col = c("white"),
legend=c("Buy and hold", "Naive", "GARCH", "GARCH-SVI", "GJRGARCH", "GJRGARCH-SVI", "EGARCH", "EGARCH-SVI"))
Note that the variance visible in the lines of the graph above does not represent their own variance, only the variance of their level difference with the (active) Buy & Hold model's output. The fact that some lines look flatter than others simply portrays how closely they follow the B&H model.
Let's have a look at the cumulative returns of our trader following each of the strategies with estimates computed in parallel
# Plot all 2d plots
par(bg = "grey30", col.axis = "white", col.lab = "white", col.main = "white", col.sub = "white")
plot(zoo(R_Active_GARCH11_p_cumulated_par,
as.Date(zoo::index(y_hat_d_GARCH11_par[2:length(y_hat_d_GARCH11_par)]))),
cex.axis=1, type="l",col="red", xlab='Date', ylim=c(0.6,2.7), ylab='SPX strategy gain ($)', main=str_c("SPX Strategy_par for psi ", p))
lines(zoo(R_cumulated,
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col = "white")
lines(zoo(R_Active_Naive_p_cumulated,
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col="black")
lines(zoo(R_Active_GARCH11SVI_p_cumulated_par,
as.Date(zoo::index(y_hat_d_GARCH11SVI_par[2:length(y_hat_d_GARCH11SVI_par)]))),
col="orange")
lines(zoo(R_Active_EGARCH11SVI_p_cumulated_par,
as.Date(zoo::index(y_hat_d_EGARCH11SVI_par[2:length(y_hat_d_EGARCH11SVI_par)]))),
col="purple")
lines(zoo(R_Active_GJRGARCH11_p_cumulated_par,
as.Date(zoo::index(y_hat_d_GJRGARCH11_par[2:length(y_hat_d_GJRGARCH11_par)]))),
col="blue")
lines(zoo(R_Active_GJRGARCH11SVI_p_cumulated_par,
as.Date(zoo::index(y_hat_d_GJRGARCH11SVI_par[2:length(y_hat_d_GJRGARCH11SVI_par)]))),
col="magenta")
lines(zoo(R_Active_EGARCH11_p_cumulated_par,
as.Date(zoo::index(y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)]))),
col="green")
legend(lty=1, cex=1,
"topleft", col = c("white", "black", "red", "orange", "blue", "magenta", "green", "purple"), text.col = c("white"),
legend=c("Buy and hold", "Naive", "GARCH", "GARCH-SVI", "GJRGARCH", "GJRGARCH-SVI", "EGARCH", "EGARCH-SVI"))
To get a narrower view of this graph, we can have a look at the difference between the (active) Buy & Hold model's output and the others':
par(bg = "grey30", col.axis = "white", col.lab = "white", col.main = "white", col.sub = "white")
plot(zoo(R_Active_GARCH11_p_cumulated_par - R_cumulated[(length(R_cumulated)-length(R_Active_GARCH11_p_cumulated_par)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GARCH11_par[2:length(y_hat_d_GARCH11_par)]))),
cex.axis=1, type="l",col="red", xlab='Date', ylim=c(-0.1,0.5), ylab='SPX strategy gain ($)', main=str_c("SPX Strategy_par for psi ", p))
lines(zoo(R_Active_GJRGARCH11SVI_p_cumulated_par - R_cumulated[(length(R_cumulated)-length(R_Active_GJRGARCH11SVI_p_cumulated_par)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GJRGARCH11SVI_par[2:length(y_hat_d_GJRGARCH11SVI_par)]))),
col="magenta")
lines(zoo(R_cumulated - R_cumulated,
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col = "white")
lines(zoo(R_Active_Naive_p_cumulated - R_cumulated[(length(R_cumulated)-length(R_Active_Naive_p_cumulated)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))),
col="black")
lines(zoo(R_Active_GARCH11SVI_p_cumulated_par - R_cumulated[(length(R_cumulated)-length(R_Active_GARCH11SVI_p_cumulated_par)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GARCH11SVI_par[2:length(y_hat_d_GARCH11SVI_par)]))),
col="orange")
lines(zoo(R_Active_EGARCH11SVI_p_cumulated_par - R_cumulated[(length(R_cumulated)-length(R_Active_EGARCH11SVI_p_cumulated_par)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_EGARCH11SVI_par[2:length(y_hat_d_EGARCH11SVI_par)]))),
col="purple")
lines(zoo(R_Active_GJRGARCH11_p_cumulated_par - R_cumulated[(length(R_cumulated)-length(R_Active_GJRGARCH11_p_cumulated_par)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_GJRGARCH11_par[2:length(y_hat_d_GJRGARCH11_par)]))),
col="blue")
lines(zoo(R_Active_EGARCH11_p_cumulated_par - R_cumulated[(length(R_cumulated)-length(R_Active_EGARCH11_p_cumulated_par)+1):length(R_cumulated)],
as.Date(zoo::index(y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)]))),
col="green")
legend(lty=1, cex=1,
"topleft", col = c("white", "black", "red", "orange", "blue", "magenta", "green", "purple"), text.col = c("white"),
legend=c("Buy and hold", "Naive", "GARCH", "GARCH-SVI", "GJRGARCH", "GJRGARCH-SVI", "EGARCH", "EGARCH-SVI"))
Note that the variance visible in the lines of the graph above does not represent their own variance, only the variance of their level difference with the (active) Buy & Hold model's output. Similarly, sharp and instantaneous decreases or increases are not reflective of $\mathcal{R}_{SPX, j, t}$. The fact that some lines look flatter than others simply portrays how closely they follow the B&H model.
Saving our data thus far to help with debugging if needed:
save.image(file = "IDaSRP_work_space_p5.29.RData")
# # To restore your workspace, type this:
# load("IDaSRP_work_space_p5.29.RData")
This section investigates the financial implications of my work via the lens of a single profit maximising agent deciding on whether to invest at the risk-free rate of the index host's country (the U.S. in this instance) or in the index itself; here the investor chooses between the One-Month United States Treasury Bill and the S&P 500.
For completeness, Sharpe-Ratios of such $\mathcal{R}_{SPX, j, t}$ are also provided. They are defined as:
$$ \frac{ \mathcal{R}_{SPX, j, t} - \mathcal{R}_{f_{SPX}, t} }{ \sqrt{Var(\mathcal{R}_{SPX, j, t})} } $$where $\mathcal{R}_{f_{SPX}, t}$ is the cumulative counterpart of $r_{f_{SPX},t}$, equivalent to $\mathcal{R}_{SPX, j, t}$ above for a like-for-like comparison, such that $\mathcal{R}_{f_{SPX}, t} = \prod_{i=1}^{t} (1 + r_{f_{SPX},i}) \text{ .}$
Example of the computation of Sharpe-Ratios with $\mathcal{R}_{SPX, EGARCH11}$
- 1st: We need to compute $\mathcal{R}_{f_{SPX}}$ via $ \text{ } \mathbf{cumulative\_R\_f} =$ $\left[ \begin{matrix} \mathcal{R}_{f_{SPX}, 1} \\ \mathcal{R}_{f_{SPX}, 2} \\ \vdots\\ \mathcal{R}_{f_{SPX}, {T_{cumulative\_R\_f}}} \end{matrix} \right] \text{ }$ where $t \in \mathbb{Z}$ and $ 1 \le t \le T_{cumulative\_R\_f}$.
one_m_r_f = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30)
# cumulative gross risk-free return up to each date, aligned with the dates from the 2nd observation onwards
cumulative_R_f = matrix(,nrow=(length(one_m_r_f)-1))
for (i in c(2:(length(one_m_r_f))))
{cumulative_R_f[i-1] = prod((1 + one_m_r_f[2:i]))}
cumulative_R_f_zoo = zoo(cumulative_R_f,
as.Date(index(one_m_r_f)[2:(length(one_m_r_f))]))
- 2nd: We need to know when the latest value of our $\mathcal{R}_{SPX, j, t}$ was recorded, to match it with the equivalent value in $\mathcal{R}_{f_{SPX}, t}$:
zoo(R_Active_EGARCH11_p_cumulated_par,
as.Date(zoo::index(y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)])))[length(R_Active_EGARCH11_p_cumulated_par)]
2019-03-13
2.383941
time(zoo(R_Active_EGARCH11_p_cumulated_par,
as.Date(zoo::index(y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)])))[length(R_Active_EGARCH11_p_cumulated_par)])
2019-03-13
cumulative_R_f_zoo[time(zoo(R_Active_EGARCH11_p_cumulated_par,
as.Date(zoo::index(y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)])))[length(R_Active_EGARCH11_p_cumulated_par)])]
2019-03-13
1.168543
- 3rd: We can put it all together:
EGARCH11_Sharpe_Ratio1 = (
(
as.numeric(
zoo(
R_Active_EGARCH11_p_cumulated_par,
as.Date(
zoo::index(
y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)]
)
)
)[length(R_Active_EGARCH11_p_cumulated_par)]
) - as.numeric(cumulative_R_f_zoo[time(
zoo(
R_Active_EGARCH11_p_cumulated_par,
as.Date(
zoo::index(
y_hat_d_EGARCH11_par[2:length(
y_hat_d_EGARCH11_par
)]
)
)
)[length(R_Active_EGARCH11_p_cumulated_par)])])
)/(sd(R_Active_EGARCH11_p_cumulated_par)))
EGARCH11_Sharpe_Ratio1
2.81687593057198
Cumulated_Sharpe_Ratio = function(cumulative_R_zoo, r_f_zoo){
# Compute cumulative_R_f
cumulative_R_f = matrix(,nrow=(length(r_f_zoo)-1))
for (i in c(2:(length(r_f_zoo))))
{cumulative_R_f[i-1] = prod((1 + r_f_zoo[2:i]))}
cumulative_R_f_zoo = zoo(cumulative_R_f,
as.Date(index(r_f_zoo)[2:(length(r_f_zoo))]))
# Calculate the cumulated_R_f
cumulated_R_f = cumulative_R_f_zoo[time(cumulative_R_zoo[length(cumulative_R_zoo)])]
# Calculate Cumulated_Sharpe_Ratio:
Cumulated_Sharpe_Ratio = (
(as.numeric(
cumulative_R_zoo[length(
cumulative_R_zoo
)]) - as.numeric(
cumulated_R_f
)
)/(sd(cumulative_R_zoo)))
# return the cumulated Sharpe-Ratio
return(Cumulated_Sharpe_Ratio)}
EGARCH11_Sharpe_Ratio = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_EGARCH11_p_cumulated_par,as.Date(zoo::index(y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)]))),
r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
EGARCH11_Sharpe_Ratio
Sharpe-Ratios of our $\mathcal{R}_{SPX, j}$
R_cumulated_Sharpe_Ratio = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_cumulated, as.Date(zoo::index(y_hat_d_Naive[2:length(y_hat_d_Naive)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GARCH11_Sharpe_Ratio_par = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GARCH11_p_cumulated_par,as.Date(zoo::index(y_hat_d_GARCH11_par[2:length(y_hat_d_GARCH11_par)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GARCH11SVI_Sharpe_Ratio_par = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GARCH11SVI_p_cumulated_par,as.Date(zoo::index(y_hat_d_GARCH11SVI_par[2:length(y_hat_d_GARCH11SVI_par)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GJRGARCH11_Sharpe_Ratio_par = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GJRGARCH11_p_cumulated_par,as.Date(zoo::index(y_hat_d_GJRGARCH11_par[2:length(y_hat_d_GJRGARCH11_par)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GJRGARCH11SVI_Sharpe_Ratio_par = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GJRGARCH11SVI_p_cumulated_par,as.Date(zoo::index(y_hat_d_GJRGARCH11SVI_par[2:length(y_hat_d_GJRGARCH11SVI_par)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
EGARCH11_Sharpe_Ratio_par = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_EGARCH11_p_cumulated_par,as.Date(zoo::index(y_hat_d_EGARCH11_par[2:length(y_hat_d_EGARCH11_par)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
EGARCH11SVI_Sharpe_Ratio_par = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_EGARCH11SVI_p_cumulated_par,as.Date(zoo::index(y_hat_d_EGARCH11SVI_par[2:length(y_hat_d_EGARCH11SVI_par)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GARCH11_Sharpe_Ratio_ser = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GARCH11_p_cumulated_ser,as.Date(zoo::index(y_hat_d_GARCH11_ser[2:length(y_hat_d_GARCH11_ser)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GARCH11SVI_Sharpe_Ratio_ser = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GARCH11SVI_p_cumulated_ser,as.Date(zoo::index(y_hat_d_GARCH11SVI_ser[2:length(y_hat_d_GARCH11SVI_ser)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GJRGARCH11_Sharpe_Ratio_ser = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GJRGARCH11_p_cumulated_ser,as.Date(zoo::index(y_hat_d_GJRGARCH11_ser[2:length(y_hat_d_GJRGARCH11_ser)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
GJRGARCH11SVI_Sharpe_Ratio_ser = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_GJRGARCH11SVI_p_cumulated_ser,as.Date(zoo::index(y_hat_d_GJRGARCH11SVI_ser[2:length(y_hat_d_GJRGARCH11SVI_ser)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
EGARCH11_Sharpe_Ratio_ser = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_EGARCH11_p_cumulated_ser,as.Date(zoo::index(y_hat_d_EGARCH11_ser[2:length(y_hat_d_EGARCH11_ser)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
EGARCH11SVI_Sharpe_Ratio_ser = Cumulated_Sharpe_Ratio(cumulative_R_zoo = zoo(R_Active_EGARCH11SVI_p_cumulated_ser,as.Date(zoo::index(y_hat_d_EGARCH11SVI_ser[2:length(y_hat_d_EGARCH11SVI_ser)]))), r_f_zoo = ZCB_YTM_Implied_r_f(CMR = TRUS1MT_raw, Maturity = (1/12), D = 30))
rbind(as.data.table(list(Data_Set = "Buy_and_Hold", Sharpe_Ratio = R_cumulated_Sharpe_Ratio,
Cumulative_Return = R_cumulated[length(R_cumulated)])),
as.data.table(list(Data_Set = "GARCH11_par", Sharpe_Ratio = GARCH11_Sharpe_Ratio_par,
Cumulative_Return = R_Active_GARCH11_p_cumulated_par[length(R_Active_GARCH11_p_cumulated_par)])),
as.data.table(list(Data_Set = "GARCH11SVI_par", Sharpe_Ratio = GARCH11SVI_Sharpe_Ratio_par,
Cumulative_Return = R_Active_GARCH11SVI_p_cumulated_par[length(R_Active_GARCH11SVI_p_cumulated_par)])),
as.data.table(list(Data_Set = "GARCH11_ser", Sharpe_Ratio = GARCH11_Sharpe_Ratio_ser,
Cumulative_Return = R_Active_GARCH11_p_cumulated_ser[length(R_Active_GARCH11_p_cumulated_ser)])),
as.data.table(list(Data_Set = "GARCH11SVI_ser", Sharpe_Ratio = GARCH11SVI_Sharpe_Ratio_ser,
Cumulative_Return = R_Active_GARCH11SVI_p_cumulated_ser[length(R_Active_GARCH11SVI_p_cumulated_ser)])),
as.data.table(list(Data_Set = "GJRGARCH11_par", Sharpe_Ratio = GJRGARCH11_Sharpe_Ratio_par,
Cumulative_Return = R_Active_GJRGARCH11_p_cumulated_par[length(R_Active_GJRGARCH11_p_cumulated_par)])),
as.data.table(list(Data_Set = "GJRGARCH11SVI_par", Sharpe_Ratio = GJRGARCH11SVI_Sharpe_Ratio_par,
Cumulative_Return = R_Active_GJRGARCH11SVI_p_cumulated_par[length(R_Active_GJRGARCH11SVI_p_cumulated_par)])),
as.data.table(list(Data_Set = "GJRGARCH11_ser", Sharpe_Ratio = GJRGARCH11_Sharpe_Ratio_ser,
Cumulative_Return = R_Active_GJRGARCH11_p_cumulated_ser[length(R_Active_GJRGARCH11_p_cumulated_ser)])),
as.data.table(list(Data_Set = "GJRGARCH11SVI_ser", Sharpe_Ratio = GJRGARCH11SVI_Sharpe_Ratio_ser,
Cumulative_Return = R_Active_GJRGARCH11SVI_p_cumulated_ser[length(R_Active_GJRGARCH11SVI_p_cumulated_ser)])),
as.data.table(list(Data_Set = "EGARCH11_par", Sharpe_Ratio = EGARCH11_Sharpe_Ratio_par,
Cumulative_Return = R_Active_EGARCH11_p_cumulated_par[length(R_Active_EGARCH11_p_cumulated_par)])),
as.data.table(list(Data_Set = "EGARCH11SVI_par", Sharpe_Ratio = EGARCH11SVI_Sharpe_Ratio_par,
Cumulative_Return = R_Active_EGARCH11SVI_p_cumulated_par[length(R_Active_EGARCH11SVI_p_cumulated_par)])) ,
as.data.table(list(Data_Set = "EGARCH11_ser", Sharpe_Ratio = EGARCH11_Sharpe_Ratio_ser,
Cumulative_Return = R_Active_EGARCH11_p_cumulated_ser[length(R_Active_EGARCH11_p_cumulated_ser)])),
as.data.table(list(Data_Set = "EGARCH11SVI_ser", Sharpe_Ratio = EGARCH11SVI_Sharpe_Ratio_ser,
Cumulative_Return = R_Active_EGARCH11SVI_p_cumulated_ser[length(R_Active_EGARCH11SVI_p_cumulated_ser)])))
Overall, the SVI_par models were always the best performing, bar the EGARCHSVI_ser, which performed exceptionally well. It is also interesting to note that the SVI models always outperformed their non-SVI counterparts, both in terms of returns and Sharpe-Ratios.
This article replicated part of the work of CPV (Chronopoulos et al. (2018)) on the SPX, empirically investigating the explanatory and predictive powers of Google Trends data as a proxy for information demand in a series of GARCH models and through the C\&D (Christoffersen and Diebold (2006)) framework. In the interest of time, I empirically studied the financial implications of my findings, but not their economic significance, inferring the latter instead from financial and graphical analyses.
$$ \\ $$Before moving any further, it must be noted that I externally investigated the effect of using different Google Trends draws (SVIs) and found that they may impact results; however, the conclusions and the relative performance of each model remained the same. I would also like to note that further external analysis indicated that investors clued into such forecasting models' effectiveness and, by using them, reduced their performance over time - in a fashion similar to Simon (1955).
Findings above also highlighted how the inclusion of a single external variable in recursive models - namely $\Delta$SVI - is extremely limiting, as it does not allow for the inclusion of new variables that may become relevant due to structural changes in domestic economies. It is unrealistic to expect a priori knowledge of all variables affecting the sign of asset price changes. One may attempt to use other factors in combination with dummy variables to anticipate when they should apply ex ante (Pesaran and Timmermann (2000)), such as in Vlastakis and Markellos (2012), where a dummy variable is attached to $\Delta$SVI for low and high economic states (when returns dropped or rose by more than one standard deviation in a week).
$$ \\ $$These particular predictive challenges - however - do not seem to be caused by the model constructions. Original tests on the use of Google Trends data in SPX excess return variance forecasts were positive. SPX excess return sign forecasts compiled via C\&D's framework also proved more useful to a profit maximising agent trading daily than comparable buy-and-hold and naive model forecasts.
One must - however - be careful when interpreting these results. Graphical analyses indicated that the models tend not to foresee economic crashes, and that an application of my strategy may not be superior to buy-and-hold's at all times.
Moreover, it would be interesting to investigate the optimal risk-aversion level one ought to have to maximise their profits; i.e.: the optimal level of $\psi$.
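A hedged sketch of what such a search might look like, assuming hypothetical, pre-aligned numeric vectors pi_hat (one-step-ahead probability forecasts), R_m (realised market returns) and r_f (risk-free returns), none of which are objects defined above:
# grid-search the probability threshold psi that maximises terminal wealth;
# pi_hat, R_m and r_f are hypothetical, pre-aligned numeric vectors
best_psi = function(pi_hat, R_m, r_f, grid = seq(0, 1, by = 0.01)) {
  terminal = sapply(grid, function(psi) {
    omega = as.numeric(pi_hat > psi)              # 1 = hold the index that day
    prod(1 + omega * R_m + (1 - omega) * r_f)     # terminal wealth of the strategy
  })
  grid[which.max(terminal)]
}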
Furthermore, SPX SVI-implementing strategies worked remarkably well despite their variance forecasting models performing at a level that could not be distinguished from that of their SVI-free counterparts. This could indicate that SVI implementation is useful to C\&D's models in surprising ways - not by improving variance forecasts themselves, but by channelling the SVI's predictive power through them, possibly by reducing only Type I or Type II errors. Further studies on this aspect of the results above would be enlightening.
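One hedged way to begin such a study - assuming the realised directions y_d and a model's directional forecasts y_hat_d are 0/1 vectors aligned on the same dates - would be to split the forecast mistakes into the two error types:
# split a directional forecast's mistakes into Type I and Type II rates;
# y_d and y_hat_d are assumed to be aligned 0/1 vectors
direction_errors = function(y_d, y_hat_d) {
  list(type_I  = mean(y_hat_d == 1 & y_d == 0),   # predicted up, realised down
       type_II = mean(y_hat_d == 0 & y_d == 1))   # predicted down, realised up
}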
$$ \\ $$In conclusion, further studies are needed to propose with statistical significance that Google Trends can be used in forecasting the sign of index excess returns at a daily frequency. Indeed, Google Trends data may have changed in minute but substantial ways - possibly since the introduction of Caffeine (Google's search-indexing system) on the 8th of June 2010. It may be enlightening to look into structural changes in $\Delta$SVI at that time before implementing it in this study.
Moreover, it would be compelling to continue the work above with a draw of SVI for other indices restricted to their respective country regions - e.g.: the FTSE and the U.K. Extending the logic in Preis et al. (2013), one may find more robust results this way.
Furthermore, only AR and GARCH models of order one were used in my (and CPV's) paper. It may be revealing to see how $R_t$ or $\Delta$SVI at further lags may pick up on the cyclical trends (visible in the graph of our SVI). One may attempt to find their best orders via Information Criteria analyses akin to Brooks and Burke (1998).
Following from the study in Fleming, Kirby, and Ostdiek (2003), models using Realised Volatility instead of the Moving Average components of the GARCH models ($\sigma_{t-1}$) could also be a novel implementation to improve volatility forecasts and observe their repercussions in C\&D's model.
In this study, excess returns from our strategy were computed using a risk-free rate based on the change in the same bond's market price from one day to the next; this is not quite conventional. In order to produce findings more comparable with others in similar fields, one ought to re-create this study using a YTM-implied daily risk-free rate.
Investigating the different forecasting models via the Henriksson and Merton (1981) test of market timing would be extremely interesting as well.
An attempt at the above profit maximising strategies over other periods of time may also be revealing. They proved useful over our time period, but that does not constitute proof that this holds for any period. These strategies only showed their use following the 2008 crisis and did not foresee significant economic crashes due to their short forecasting horizon. This leads one to believe that longer horizons with lower frequency data - such as weekly data - could be easier to forecast and thus more accurate, as per Preis et al. (2013). Such a strategy would also allow for lower trading costs and improved overall performance.
Finally, and most obviously, one ought to complete this article’s work on the SPX and investigate the economic significance of the findings above in an empirical manner using Granger and Pesaran (2000)'s framework, implementing trading costs in the analyses.
Whilst a family of GARCH models is used in this article to remedy the particular issue of heteroscedasticity modelling, the task of predicting asset prices remains notoriously difficult; often, "even the best prediction models have no out-of-sample forecasting powers" (Bossaerts and Hillion (1999), p.405) (e.g.: Kostakis, Magdalinos, and Stamatogiannis (2014)). This is what motivated C&D to challenge such difficulties by demonstrating how correlations between asset price volatilities and returns arise given non-zero expected returns, irrespective of the shape of their distribution. When expected returns are zero, Christoffersen, Diebold, Mariano, Tay, and Tse (2006) (CDMTT hereon) highlighted that sign prediction was still possible with asymmetrically distributed returns - in alignment with SFFR (i).
Market participants try to inform themselves optimally prior to trading (Simon (1955)). Risk aversion and information demand are therefore closely related, and so are investment activities, which in turn cause asset price changes. While model specifications such as C&D's are naturally of paramount importance, numerous economists have also investigated the explanatory powers of exogenous variables in forecasting models since Keynes' study on Business Cycles (e.g.: Fama and French (1996), famously).
CPV novelly incorporated such a variable - information demand - in these models to study its forecasting powers. While a plethora of studies investigated how asset prices are affected by/correlated with information demand indexed via Wikipedia (Moat et al. (2013); Rubin and Rubin (2010)), online message boards (Antweiler and Frank (2004)), et cetera (Vlastakis and Markellos (2012)), Google Trends stood out as particularly useful in finance (Da, Engelberg, and Gao (2011); Preis, Moat, and Stanley (2013)). The prevalence of Google as the internationally dominant search engine allowed for the use of Google Trends in academia with varying degrees of accuracy in science (Pelat, Turbelin, Bar-Hen, Flahault, and Valleron (2009)), economics (Askitas and Zimmermann (2009)) and finance (Choi and Varian (2012); Curme, Preis, Stanley, and Moat (2014); Jiang (2016)). It is best to note that Google is not strictly dominant in all countries; notable exceptions exist in South Korea (with Naver), Russia (with Yandex) and China (with Baidu). Vlastakis and Markellos (2012) found that financial information networks (e.g.: Reuters, Bloomberg, ...) tend to share articles outside their platforms, leading to information arriving in traditional and non-traditional information channels at approximately the same time; Google Trends can thus be used to measure signals of investor interest in certain topics/assets, and there is a relationship between the information demand for an asset and the volatility of its returns.
CPV thus used Google Trends data as a proxy for information demand in the form of a Search Vector Index (SVI). They found evidence of distribution asymmetry in the SPX excess returns - allowing for the application of CDMTT's work - as well as a clear and statistically significant relationship between those returns and the first difference in SVI - $\Delta$SVI - using a family of GARCH models, which were then used in C&D's framework. This proved highly successful: when testing their findings using Granger and Pesaran (2000)'s framework over the period from the 1st of January 2004 to the 31st of December 2016 (01/01/2004 - 31/12/2016), their scenario analysis resulted in higher Sharpe-ratios (Sharpe (1966)) when implementing SVIs in an active strategy than under a naive or a buy-and-hold (B&H) strategy, for a utility maximising investor with Constant Absolute Risk Aversion (i.e.: with a negative exponential utility function), both with and without short-selling. It is also interesting to note that short- and long-term interest rates were shown to have smaller - but still significant - predictive powers when using monthly data (comparing CPV's findings to Nyberg (2011)'s and Chevapatrakul (2013)'s), suggesting that higher frequency data - such as $\Delta$SVI - may be of better use.
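For readers wanting to reproduce this kind of specification without the hand-coded estimators of Part 3, the sketch below uses the rugarch package (an assumption; it is not the implementation used earlier in this article) to add $\Delta$SVI to a GARCH(1,1) variance equation, then computes an annualised Sharpe-ratio; `excess_ret`, `dSVI` and `strategy_excess` are hypothetical vectors.

```r
# Minimal sketch using the 'rugarch' package (an assumption - earlier parts
# of this article hand-code the models): a GARCH(1,1) with the first
# difference of the SVI as an external regressor in the variance equation.
# 'excess_ret' and 'dSVI' are hypothetical, equally long numeric vectors.
library(rugarch)

spec_svi <- ugarchspec(
  variance.model = list(model = "sGARCH", garchOrder = c(1, 1),
                        external.regressors = matrix(dSVI, ncol = 1)),
  mean.model = list(armaOrder = c(0, 0)))
fit_svi <- ugarchfit(spec = spec_svi, data = excess_ret)

# Annualised Sharpe-ratio (Sharpe (1966)) of a strategy's daily excess
# returns ('strategy_excess' is hypothetical):
sharpe_ratio <- sqrt(252) * mean(strategy_excess) / sd(strategy_excess)
```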
- Anand, A., Irvine, P., Puckett, A., & Venkataraman, K. (2011). Performance of institutional trading desks: An analysis of persistence in trading costs. The Review of Financial Studies, 25 (2), 557-598.
- Andersen, T. G., Bollerslev, T., Diebold, F. X., & Ebens, H. (2001). The distribution of realized stock return volatility. Journal of financial economics, 61 (1), 43-76.
- Antweiler, W., & Frank, M. Z. (2004). Is all that talk just noise? the information content of internet stock message boards. The Journal of finance, 59 (3), 1259-1294.
- Askitas, N., & Zimmermann, K. F. (2009). Google econometrics and unemployment forecasting. SSRN Electronic Journal.
- Awartani, B. M., & Corradi, V. (2005). Predicting the volatility of the s&p-500 stock index via garch models: the role of asymmetries. International Journal of Forecasting, 21 (1), 167-183.
- Barndorff-Nielsen, O. E., & Shephard, N. (2002). Econometric analysis of realized volatility and its use in estimating stochastic volatility models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 64 (2), 253-280.
- Bollerslev, T. (1986). Generalized autoregressive conditional heteroskedasticity. Journal of econometrics, 31 (3), 307-327.
- Booth, G., & Gurun, U. (2004). Financial archaeology: Capitalism, financial markets, and price volatility. Michigan State University.
- Bossaerts, P., & Hillion, P. (1999). Implementing statistical criteria to select return forecasting models: what do we learn? The Review of Financial Studies, 12 (2), 405-428.
- Brier, G. W. (1950). Verification of forecasts expressed in terms of probability. Monthly weather review, 78 (1), 1-3.
- Brooks, C., & Burke, S. P. (1998). Forecasting exchange rate volatility using conditional variance models selected by information criteria. Economics Letters, 61 (3), 273-278.
- Chevapatrakul, T. (2013). Return sign forecasts based on conditional risk: Evidence from the uk stock market index. Journal of Banking & Finance, 37 (7), 2342-2353.
- Choi, H., & Varian, H. (2012). Predicting the present with google trends. Economic Record, 88 , 2-9.
- Christoffersen, P., & Diebold, F. X. (2006). Financial asset returns, direction-of-change forecasting, and volatility dynamics. Management Science, 52 (8), 1273-1287.
- Christoffersen, P., Diebold, F. X., Mariano, R. S., Tay, A. S., & Tse, Y. K. (2006). Direction-of-change forecasts based on conditional variance, skewness and kurtosis dynamics: international evidence.
- Chronopoulos, D. K., Papadimitriou, F. I., & Vlastakis, N. (2018). Information demand and stock return predictability. Journal of International Money and Finance, 80 , 59-74. Retrieved from https://www.sciencedirect.com/science/article/pii/S0261560617301912?via%3Dihub
- Curme, C., Preis, T., Stanley, H. E., & Moat, H. S. (2014). Quantifying the semantics of search behavior before stock market moves. Proceedings of the National Academy of Sciences, 111 (32), 11600-11605.
- Da, Z. H., Engelberg, J., & Gao, P. (2011). In search of attention. The Journal of Finance, 66 (5), 1461-1499.
- Davis, D. (2017). Google's caffeine update: Better indexing & fresher search results. Search Engine Journal. Retrieved 29/08/2019, from https://www.searchenginejournal.com/google-algorithm-history/caffeine-update/
- Dickey, D. A., & Fuller, W. A. (1979). Distribution of the estimators for autoregressive time series with a unit root. Journal of the American statistical association, 74 (366a), 427-431.
- Diebold, F. X., & Mariano, R. S. (1995). Comparing predictive accuracy. Journal of Business and Economic Statistics, 13 (3), 253-263.
- The Economist. (2018, February 8). Bets on low market volatility went spectacularly wrong: Vexed about vix. The Economist (Print edition, Finance and economics). Retrieved 19th of August 2019, from https://www.economist.com/finance-and-economics/2018/02/08/bets-on-low-market-volatility-went-spectacularly-wrong
- Engle, R. F. (1982). Autoregressive conditional heteroscedasticity with estimates of the variance of united kingdom inflation. Econometrica: Journal of the Econometric Society, 987-1007.
- Fama, E. F., & French, K. R. (1996). Multifactor explanations of asset pricing anomalies. The journal of finance, 51 (1), 55-84.
- Fleming, J., Kirby, C., & Ostdiek, B. (2003). The economic value of volatility timing using "realized" volatility. Journal of Financial Economics, 67 (3), 473-509.
- Glosten, L. R., Jagannathan, R., & Runkle, D. E. (1993). On the relation between the expected value and the volatility of the nominal excess return on stocks. The journal of finance, 48 (5), 1779-1801.
- Granger, C. W., & Pesaran, M. H. (2000). Economic and statistical measures of forecast accuracy. Journal of Forecasting, 19 (7), 537-560.
- Harrison, P. (1998). Similarities in the distribution of stock market price changes between the eighteenth and twentieth centuries. The Journal of Business, 71 (1), 55-79.
- Henriksson, R. D., & Merton, R. C. (1981). On market timing and investment performance. ii. statistical procedures for evaluating forecasting skills. Journal of business, 513-533.
- Jiang, W. (2016). Stock market valuation using internet search volumes: Us-china comparison.
- Kandel, S., & Stambaugh, R. F. (1996). On the predictability of stock returns: an asset-allocation perspective. The Journal of Finance, 51 (2), 385-424.
- Kostakis, A., Magdalinos, T., & Stamatogiannis, M. P. (2014). Robust econometric inference for stock return predictability. The Review of Financial Studies, 28 (5), 1506-1553.
- Lesmond, D. A., Ogden, J. P., & Trzcinka, C. A. (1999). A new estimate of transaction costs. The Review of Financial Studies, 12 (5), 1113-1141.
- Lo, A. W. (2005). Reconciling efficient markets with behavioral finance: the adaptive markets hypothesis. Journal of investment consulting, 7 (2), 21-44.
- Mandelbrot, B. (1963). The variation of certain speculative prices. The Journal of Business, 36 (4), 394-419.
- Mitchell, H., Brown, R., & Easton, S. (2002). Old volatility-arch effects in 19th century consol data. Applied Financial Economics, 12 (4), 301-307.
- Moat, H. S., Curme, C., Avakian, A., Kenett, D. Y., Stanley, H. E., & Preis, T. (2013). Quantifying wikipedia usage patterns before stock market moves. Scientific Reports, 3 (1), 269.
- Wold, H. (1939). A study in analysis of stationary time series. Journal of the Royal Statistical Society, 102 (2), 295.
- Nelson, D. B. (1991). Conditional heteroskedasticity in asset returns: A new approach. Econometrica: Journal of the Econometric Society, 347-370.
- Nyberg, H. (2011). Forecasting the direction of the us stock market with dynamic binary probit models. International Journal of Forecasting, 27 (2), 561-578.
- Pelat, C., Turbelin, C., Bar-Hen, A., Flahault, A., & Valleron, A.-J. (2009). More diseases tracked by using google trends. Emerging infectious diseases, 15 (8), 1327.
- Pesaran, M. H., & Timmermann, A. (1995). Predictability of stock returns: Robustness and economic significance. The Journal of Finance, 50 (4), 1201-1228.
- Pesaran, M. H., & Timmermann, A. (2000). A recursive modelling approach to predicting uk stock returns. The Economic Journal, 110 (460), 159-191.
- Phillips, P. C., & Perron, P. (1988). Testing for a unit root in time series regression. Biometrika, 75 (2), 335-346.
- Poon, S.-H., & Granger, C. W. (2003). Forecasting volatility in financial markets: A review. Journal of economic literature, 41 (2), 478-539.
- Preis, T., Moat, H. S., & Stanley, H. E. (2013). Quantifying trading behavior in financial markets using google trends. Scientific Reports, 3, 1684.
- Rapach, D., & Zhou, G. (2013). Forecasting stock returns. In Handbook of economic forecasting (Vol. 2, pp. 328-383). Elsevier.
- Rubin, A., & Rubin, E. (2010). Informed investors and the internet. Journal of Business Finance & Accounting, 37 (7-8), 841-865.
- Sharpe, W. F. (1966). Mutual fund performance. The Journal of business, 39 (1), 119-138.
- Silk, M. J. (2012). Link between s&p 500 and ftse 100 and the comparison of that link before and after the s&p 500 peak in october 2007. Lingnan Journal of Banking, Finance and Economics, 3 (1), 3.
- Simon, H. A. (1955). A behavioral model of rational choice. The quarterly journal of economics, 69 (1), 99-118.
- Taylor, S. J. (2005). Asset price dynamics, volatility, and prediction. Princeton university press.
- Timmermann, A., & Granger, C. W. (2004). Efficient market hypothesis and forecasting. International Journal of forecasting, 20 (1), 15-27.
- Tinbergen, J. (1933). Statistiek en wiskunde in dienst van het konjunktuuronderzoek: Statistics and mathematics in the service of business cycle research. Netherlands School of Economics.
- Tong, H. (1990). Non-linear time series: a dynamical system approach. Oxford University Press.
- Verbeek, M. (2008). A guide to modern econometrics (2nd ed.). John Wiley & Sons.
- Vlastakis, N., & Markellos, R. N. (2012). Information demand and stock market volatility. Journal of Banking & Finance, 36 (6), 1808-1821.
- Whittle, P. (1951). Hypothesis testing in time series analysis.
- Yule, G. U. (1927). On a method of investigating periodicities in disturbed series, with special reference to wolfer's sunspot numbers. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 226 (636-646), 267-298.