With two or more random variables, the variables can be stacked into a random vector whose $i$-th element is the $i$-th random variable. The variances and covariances can then be placed in a covariance matrix, in which the $(i, j)$ element is the covariance between the $i$-th and the $j$-th random variable. Likewise, the correlations can be placed in a correlation matrix.
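For example, for a random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\mathsf{T}}$ with mean vector $\boldsymbol{\mu} = \operatorname{E}[\mathbf{X}]$, the entries of the covariance matrix $\Sigma$ and the correlation matrix $R$ can be written as follows (the symbols $\Sigma$ and $R$ are chosen here only for illustration):

\[
\Sigma_{ij} = \operatorname{cov}(X_i, X_j) = \operatorname{E}\bigl[(X_i - \mu_i)(X_j - \mu_j)\bigr],
\qquad
R_{ij} = \operatorname{corr}(X_i, X_j) = \frac{\Sigma_{ij}}{\sigma_i \, \sigma_j},
\]

where $\sigma_i = \sqrt{\Sigma_{ii}}$ is the standard deviation of $X_i$.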
In the case of a time series which is stationary in the wide sense, both the means and variances are constant over time ($\operatorname{E}(X_{n+m}) = \operatorname{E}(X_n) = \mu_X$ and $\operatorname{var}(X_{n+m}) = \operatorname{var}(X_n)$, and likewise for the variable $Y$). In this case the cross-covariance and cross-correlation are functions of the time difference $m$:
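\[
\sigma_{XY}(m) = \operatorname{E}\bigl[(X_n - \mu_X)(Y_{n+m} - \mu_Y)\bigr],
\qquad
\rho_{XY}(m) = \frac{\sigma_{XY}(m)}{\sigma_X \, \sigma_Y},
\]

where $\sigma_{XY}(m)$ denotes the cross-covariance at lag $m$, $\rho_{XY}(m)$ the cross-correlation, and $\sigma_X$, $\sigma_Y$ the standard deviations of $X$ and $Y$ (these are the standard definitions for wide-sense stationary processes; the particular symbols are chosen here for illustration).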
If Y is the same variable as X, the above expressions are called the autocovariance and autocorrelation:
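\[
\sigma_{XX}(m) = \operatorname{E}\bigl[(X_n - \mu_X)(X_{n+m} - \mu_X)\bigr],
\qquad
\rho_{XX}(m) = \frac{\sigma_{XX}(m)}{\sigma_X^{2}},
\]

obtained from the expressions above by setting $Y = X$, so that $\mu_Y = \mu_X$ and $\sigma_Y = \sigma_X$.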
Weisstein, Eric W. "Covariance". MathWorld. /wiki/Eric_W._Weisstein ↩
Weisstein, Eric W. "Statistical Correlation". MathWorld. /wiki/Eric_W._Weisstein ↩