In probability theory and statistics, a covariance matrix (also known as auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix) is a square matrix giving the covariance between each pair of elements of a given random vector.
Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions. As an example, the variation in a collection of random points in two-dimensional space cannot be characterized fully by a single number, nor would the variances in the $x$ and $y$ directions contain all of the necessary information; a $2 \times 2$ matrix would be necessary to fully characterize the two-dimensional variation.
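A minimal sketch of this idea, assuming NumPy (the point cloud and its parameters are illustrative, not from the article): estimating the $2 \times 2$ covariance matrix of random points in the plane.

```python
import numpy as np

# Illustrative example: draw random 2D points with a known covariance,
# then estimate the 2x2 sample covariance matrix from the data.
rng = np.random.default_rng(0)
points = rng.multivariate_normal(mean=[0.0, 0.0],
                                 cov=[[3.0, 1.0], [1.0, 2.0]],
                                 size=1000)  # shape (1000, 2)

# rowvar=False: each column of `points` is one variable (x or y).
sample_cov = np.cov(points, rowvar=False)
print(sample_cov)
# The diagonal entries are the variances in the x and y directions;
# the off-diagonal entry is the covariance between x and y. No single
# number, and no pair of per-axis variances, carries all of this.
```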
Any covariance matrix is symmetric and positive semi-definite, and its main diagonal contains variances (i.e., the covariance of each element with itself).
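These three properties can be checked numerically; a small sketch, again assuming NumPy and synthetic data chosen only for illustration:

```python
import numpy as np

# Check that a sample covariance matrix is symmetric, positive
# semi-definite, and carries the per-coordinate variances on its diagonal.
rng = np.random.default_rng(1)
data = rng.standard_normal((500, 3))  # 500 samples of a 3-dimensional vector

C = np.cov(data, rowvar=False)

assert np.allclose(C, C.T)                      # symmetric
assert np.all(np.linalg.eigvalsh(C) >= -1e-12)  # eigenvalues >= 0 (PSD, up to round-off)
assert np.allclose(np.diag(C), data.var(axis=0, ddof=1))  # diagonal = variances
```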
The covariance matrix of a random vector $\mathbf{X}$ is typically denoted by $\operatorname{K}_{\mathbf{X}\mathbf{X}}$, $\Sigma$, or $S$.
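For reference, the standard definition underlying this notation is

$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{cov}[\mathbf{X},\mathbf{X}] = \operatorname{E}\!\left[(\mathbf{X}-\operatorname{E}[\mathbf{X}])(\mathbf{X}-\operatorname{E}[\mathbf{X}])^{\mathsf{T}}\right],$$

whose $(i,j)$ entry is $\operatorname{K}_{X_iX_j} = \operatorname{cov}[X_i,X_j] = \operatorname{E}\left[(X_i-\operatorname{E}[X_i])(X_j-\operatorname{E}[X_j])\right]$, the covariance between the $i$-th and $j$-th elements of $\mathbf{X}$.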