Series Correlation Analysis: Autocovariance

This article comes from the WeChat official account "科文路" (kewenlu); you are welcome to follow and interact. Please credit the source when sharing.

This article belongs to the "Time Series Analysis" series, a summary of my studies from my school days. To avoid translation ambiguity, the main text is written in English. The current topic is split into three parts: autocovariance, autocorrelation function, and partial autocorrelation function.

In time series analysis, we always focus on the correlation structure of the series itself. Since there is no other series to compare against, we define the autocovariance, the autocorrelation function, and the partial autocorrelation function based on the characteristics of the time series alone.

The prefix auto means the analysis is performed on the series itself.

1 Autocovariance

/wiki/Autocovariance

1.1 Introduction

In probability theory and statistics, given a stochastic process, the autocovariance is a function that gives the covariance of the process with itself at pairs of time points. Autocovariance is closely related to the autocorrelation of the process in question.

The autocovariance can be thought of as a measure of the similarity between a signal and a delayed copy of itself.

1.2 Definition

With the usual notation $\operatorname{E}$ for the expectation operator, if the stochastic process $\{X_{t}\}$ has the mean function $\mu_{t}=\operatorname{E}[X_{t}]$, then the autocovariance is given by
$$
\begin{aligned}
\mathbf{K}_{XX}\left(t_{1}, t_{2}\right) &= \operatorname{cov}\left[X_{t_{1}}, X_{t_{2}}\right] \\
&= \mathrm{E}\left[\left(X_{t_{1}}-\mu_{t_{1}}\right)\left(X_{t_{2}}-\mu_{t_{2}}\right)\right] \\
&= \mathrm{E}\left[X_{t_{1}} X_{t_{2}}\right]-\mu_{t_{1}} \mu_{t_{2}},
\end{aligned}
$$
where $t_{1}$ and $t_{2}$ are two moments in time.

More commonly, $\gamma$ is used as the notation,
$$
\gamma(i, j)=E\left[\left(X_{i}-\mu_{i}\right)\left(X_{j}-\mu_{j}\right)\right].
$$
More specifically, if the process is second-order stationary, i.e. a weakly stationary (WSS) process, then by definition the autocovariance and autocorrelation do not depend on $t$ but only on the lag $k$:
$$
\gamma(k)=\operatorname{cov}\left(X_{t+k}, X_{t}\right)=E\left[\left(X_{t}-\mu\right)\left(X_{t+k}-\mu\right)\right],
$$
where $k$ is the lag, i.e. the amount of time by which the series has been shifted.

1.3 Calculation

$$
\begin{aligned}
\gamma(k) &= \mathrm{E}\left[\left(X_{t}-\mu\right)\left(X_{t-k}-\mu\right)\right] \\
&= \mathrm{E}\left[X_{t} X_{t-k}\right]-\mu^{2} \\
&\approx \frac{1}{N-k}\sum_{t=k+1}^{N} X_{t} X_{t-k} - \overline{X}^{2},
\end{aligned}
$$
where the last line is the sample estimate computed from $N$ observations.
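The sample estimator above can be sketched in Python with NumPy (a minimal sketch; the function name `sample_autocov` is my own, not from the original post):

```python
import numpy as np

def sample_autocov(x, k):
    """Sample autocovariance at lag k:
    gamma(k) ~ (1/(N-k)) * sum_{t=k+1..N} x_t * x_{t-k} - mean(x)^2."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if not 0 <= k < n:
        raise ValueError("lag k must satisfy 0 <= k < len(x)")
    # x[k:] pairs each x_t with x_{t-k}; the dot product has n-k terms
    return np.dot(x[k:], x[:n - k]) / (n - k) - x.mean() ** 2
```

For a constant series the estimate is zero at every lag, while an alternating series gives a negative value at lag 1, matching the intuition that the series anti-correlates with its one-step-shifted copy.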

1.4 Normalization

For a WSS process with variance $\sigma^{2}$, the autocovariance can be normalized into the autocorrelation function, i.e. the Pearson correlation coefficient between the series and its lag-$k$ copy,
$$
\rho(k)=\frac{\gamma(k)}{\gamma(0)}=\frac{\gamma(k)}{\sigma^{2}}
$$
where $\gamma(0)=\sigma^{2}$.
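The normalization can be checked numerically (a Python sketch under the same estimator as above; the helper name `autocorr` is my own):

```python
import numpy as np

def autocorr(x, k):
    """rho(k) = gamma(k) / gamma(0): autocovariance normalized by the variance."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mu = x.mean()
    gamma_k = np.dot(x[k:], x[:n - k]) / (n - k) - mu ** 2
    gamma_0 = np.dot(x, x) / n - mu ** 2  # gamma(0) = sigma^2
    return gamma_k / gamma_0
```

By construction $\rho(0)=1$, and $\rho(k)$ always lies in $[-1, 1]$ for the true (population) quantities.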

1.5 MATLAB code for Autocovariance Matrix

% autocov_m
% Author: xlindo
% Date: 2019.7.24
% License: Apache License 2.0
%
% function: compute the autocovariance matrix of a weakly stationary process
%
% INPUT: x, a 1*n series vector
% OUTPUT: ac_m, an n*n matrix
%
function ac_m = autocov_m(x)

len_x = length(x);
gamma = zeros(1, len_x);

mu_x = mean(x);

% gamma(k) holds the sample autocovariance at lag k-1;
% the sum x(1:len_x-k+1).*x(k:len_x) has len_x-k+1 terms
for k = 1:len_x-1
    gamma(k) = sum(x(1:len_x-k+1) .* x(k:len_x)) / (len_x-k+1) - mu_x^2;
end
gamma(len_x) = 0; % the largest lag has only one sample; treat it as zero

% build the symmetric Toeplitz autocovariance matrix row by row:
% entry (i, j) is the autocovariance at lag |i - j|
ac_m = zeros(len_x, len_x);
ac_m(1, :) = gamma;

for i = 2:len_x
    ac_m(i, 1:len_x) = [gamma(i:-1:2), gamma(1:len_x-i+1)];
end
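For readers without MATLAB, a rough NumPy port of the same matrix construction might look like this (my own translation, not the author's code; it mirrors the row-shifting loop above by building a Toeplitz matrix from the lag sequence):

```python
import numpy as np

def autocov_matrix(x):
    """Symmetric Toeplitz matrix whose (i, j) entry is the sample
    autocovariance at lag |i - j| (the largest lag is treated as 0)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mu = x.mean()
    gamma = np.zeros(n)
    for k in range(n - 1):  # lags 0 .. n-2
        gamma[k] = np.dot(x[k:], x[:n - k]) / (n - k) - mu ** 2
    # gamma[n-1] stays 0, as in the MATLAB version
    idx = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    return gamma[idx]
```

The fancy-indexing step `gamma[idx]` replaces the explicit row loop of the MATLAB version; both produce the same symmetric Toeplitz structure.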


Series Correlation Analysis: Autocovariance
https://xlindo.com/kewenlu2022/posts/f3d96356/
Author: xlindo
Posted on: 2022-02-07
Updated on: 2023-05-10