Chapter 1

Time domain approach

The time domain approach is generally motivated by the presumption that correlation between adjacent points in time is best explained in terms of a dependence of the current value on past values.

Multiplicative models and additive models

ARIMA (autoregressive integrated moving average)

Frequency Domain Approach

This approach assumes the primary characteristics of interest in time series analyses relate to periodic or systematic sinusoidal variations found naturally in most data.


Time Series Statistical Models

Using discrete random variables to represent sampled time series data.

The selection of the sampling interval matters.

White Noise

A white noise process is one with mean zero and no correlation between its values at different times.


A simple kind of generated series might be a collection of uncorrelated random variables, $$w_t$$, with mean 0 and finite variance $$\sigma_w^2$$.

The time series generated from uncorrelated variables is used as a model for noise in engineering applications, where it is called white noise; we shall sometimes denote this process as $$w_t\sim wn(0,\sigma_w^2)$$.

Note that the variables should be independent so that classical statistical methods suffice.
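As a minimal sketch (assuming NumPy; the seed and sample size `n = 500` are arbitrary choices, not from the text), such a series can be generated directly:

```python
import numpy as np

# Draw n iid Gaussian values, w_t ~ wn(0, sigma_w^2) with sigma_w = 1.
# The seed and n = 500 are arbitrary, chosen only for reproducibility.
rng = np.random.default_rng(0)
n = 500
w = rng.normal(loc=0.0, scale=1.0, size=n)
```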

Moving Averages

We might replace the white noise series $$w_t$$ by a moving average that smooths the series.

$$v_t=\tfrac{1}{3}(w_{t-1}+w_{t}+w_{t+1})$$

Inspecting the series shows a smoother version of the first series, reflecting the fact that the slower oscillations are more apparent and some of the faster oscillations are taken out.
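A minimal sketch of computing this moving average, reusing the array `w` from the white noise sketch above:

```python
# v_t = (w_{t-1} + w_t + w_{t+1}) / 3, via convolution with a
# length-3 uniform filter; mode="valid" drops the two endpoints
# where a centered three-point average is undefined.
v = np.convolve(w, np.ones(3) / 3, mode="valid")
```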

Autoregressions

Suppose we consider the white noise series $$w_t$$ of Example 1.8 as input and calculate the output using the second-order equation

$$x_t=x_{t-1}-0.9x_{t-2}+w_{t} \tag{1.2}$$

Equation (1.2) represents a regression or prediction of the current value $$x_t$$ of a time series as a function of the past two values of the series, and, hence, the term autoregression is suggested for this model.
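A minimal simulation of (1.2), again reusing `w`; starting the recursion from zero initial values is an ad hoc choice for this sketch:

```python
# Simulate x_t = x_{t-1} - 0.9 x_{t-2} + w_t (equation 1.2).
# x[0] and x[1] are left at zero as arbitrary initial conditions.
x = np.zeros(n)
for t in range(2, n):
    x[t] = x[t - 1] - 0.9 * x[t - 2] + w[t]
```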

Random Walk with Drift

$$x_t=\delta + x_{t-1}+w_t$$ for $$t = 1, 2, \ldots$$, with initial condition $$x_0 = 0$$, and where $$w_t$$ is white noise. The constant $$\delta$$ is called the drift, and when $$\delta=0$$ the model is called simply a random walk. Solving the recursion shows the process is a cumulative sum of white noise variates: $$x_t = \delta t+\sum_{j=1}^t w_j$$
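The cumulative-sum form suggests a direct simulation (again reusing `w`; the value of $$\delta$$ is an arbitrary choice for illustration):

```python
# x_t = delta * t + sum_{j=1}^t w_j; np.cumsum gives the running sum.
# delta = 0.2 is arbitrary; delta = 0 yields a plain random walk.
delta = 0.2
x_rw = delta * np.arange(1, n + 1) + np.cumsum(w)
```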


Measures of Dependence: Autocorrelation and Cross-Correlation

The mean function

$$\mu_{xt}=E(x_t)=\int_{-\infty}^{\infty}xf_t(x)dx$$
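For example, applying this definition to the random walk with drift above, since $$E(w_j)=0$$,

$$\mu_{xt}=E(x_t)=\delta t+\sum_{j=1}^{t}E(w_j)=\delta t$$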

Autocovariance function

$$\gamma_x(s,t)=cov(x_s,x_t)=E[(x_s-\mu_s)(x_t-\mu_t)]$$

for all $$s$$ and $$t$$. When no possible confusion exists about which time series we are referring to, we will drop the subscript and write $$\gamma(s,t)$$.

The autocovariance measures the linear dependence between two points on the same series observed at different times.

Recall from classical statistics that if $$\gamma_x(s,t) = 0$$, $$x_s$$ and $$x_t$$ are not linearly related, but there still may be some dependence structure between them. If, however, $$x_s$$ and $$x_t$$ are bivariate normal, $$\gamma_x(s,t) = 0$$ ensures their independence. It is clear that, for $$s = t$$, the autocovariance reduces to the (assumed finite) variance, because

$$\gamma_x(t,t)=E[(x_t-\mu_t)^2]=var(x_t)$$
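For example, for the white noise series $$w_t$$, the autocovariance is

$$\gamma_w(s,t)=cov(w_s,w_t)=\begin{cases}\sigma_w^2 & s=t,\\ 0 & s\neq t.\end{cases}$$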

Autocorrelation function

$$\rho(s,t)=\frac{\gamma(s,t)}{\sqrt{\gamma(s,s)\gamma(t,t)}}$$

The ACF measures the linear predictability of the series at time $$t$$, say $$x_t$$, using only the value $$x_s$$.

Cross-covariance function

The cross-covariance function between two series, $$x_t$$ and $$y_t$$, is $$\gamma_{xy}(s,t)=cov(x_s,y_t)=E[(x_s-\mu_{xs})(y_t-\mu_{yt})]$$

Cross-correlation function

$$\rho_{xy}(s,t)=\frac{\gamma_{xy}(s,t)}{\sqrt{\gamma_x(s,s)\gamma_y(t,t)}}$$

Stationary Time Series

Definition 1.6 Strictly stationary

A strictly stationary time series is one for which the probabilistic behavior of every collection of values {$$x_{t_1},x_{t_2},...,x_{t_k}$$} is identical to that of the time shifted set {$$x_{t_1+h},x_{t_2+h},...,x_{t_k+h}$$}.

Definition 1.7 Weakly stationary

A weakly stationary time series, $$x_t$$, is a finite variance process such that (i) the mean value function, $$\mu_t$$, defined above, is constant and does not depend on time $$t$$, and (ii) the autocovariance function, $$\gamma(s,t)$$, defined above, depends on $$s$$ and $$t$$ only through their difference $$|s-t|$$.
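For example, white noise is weakly stationary: $$\mu_{wt}=0$$ for all $$t$$, and the autocovariance computed above is $$\sigma_w^2$$ when $$s=t$$ and $$0$$ otherwise, so it depends on $$s$$ and $$t$$ only through $$|s-t|$$.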

Mean

$$\mu_t=\mu$$

Autocovariance function

$$\gamma (t+h,t)=cov(x_{t+h},x_t)=cov(x_h,x_0)=\gamma (h,0)$$

Definition 1.8 Autocovariance function of a stationary time series

$$\gamma(h)=cov(x_{t+h},x_t)=E[(x_{t+h}-\mu)(x_t-\mu)]$$

Definition 1.9 The autocorrelation function (ACF) of a stationary time series

$$\rho(h)=\frac{\gamma(t+h,t)}{\sqrt{\gamma(t+h,t+h)\gamma(t,t)}}=\frac{\gamma(h)}{\gamma(0)}$$
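A sketch of the corresponding sample estimate, $$\hat\rho(h)=\hat\gamma(h)/\hat\gamma(0)$$ with $$\hat\gamma(h)=n^{-1}\sum_{t=1}^{n-h}(x_{t+h}-\bar{x})(x_t-\bar{x})$$ (the function name is ours, not from the text):

```python
def sample_acf(x, max_lag):
    # Sample ACF: rho_hat(h) = gamma_hat(h) / gamma_hat(0), h = 0..max_lag.
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()
    gamma0 = np.sum((x - xbar) ** 2) / n
    return np.array([
        np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n / gamma0
        for h in range(max_lag + 1)
    ])
```

Applied to the AR(2) series `x` simulated earlier, `sample_acf(x, 20)` should show the damped oscillation characteristic of that model.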

Properties of stationary time series

1. $$\gamma(0)=E[(x_t-\mu)^2]$$
2. $$|\gamma(h)|\le \gamma(0)$$
3. $$\gamma(h)=\gamma(-h)$$

Definition 1.10 jointly stationary

Two time series, say, $$x_t$$ and $$y_t$$ , are said to be jointly stationary if they are each stationary, and the cross-covariance function $$\gamma_{xy}(h)=cov(x_{t+h},y_t)=E[(x_{t+h}-\mu_x)(y_t-\mu_y)]$$ is a function only of lag $$h$$.

Definition 1.11 cross-correlation function (CCF) of jointly stationary time series

The cross-correlation function (CCF) of jointly stationary time series $$x_t$$ and $$y_t$$ is defined as $$\rho_{xy}(h)=\frac{\gamma_{xy}(h)}{\sqrt{\gamma_x(0)\gamma_y(0)}}$$
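A matching sample estimate for a single lag $$h \ge 0$$ (the helper name is ours; negative lags follow from $$\gamma_{xy}(-h)=\gamma_{yx}(h)$$):

```python
def sample_ccf(x, y, h):
    # Sample CCF rho_hat_xy(h): correlate x_{t+h} with y_t,
    # normalized by the sample variances of each series.
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, xbar, ybar = len(x), x.mean(), y.mean()
    gamma_xy = np.sum((x[h:] - xbar) * (y[:n - h] - ybar)) / n
    gamma_x0 = np.sum((x - xbar) ** 2) / n
    gamma_y0 = np.sum((y - ybar) ** 2) / n
    return gamma_xy / np.sqrt(gamma_x0 * gamma_y0)
```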
