A complex random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$ on the probability space $(\Omega, \mathcal{F}, P)$ is a function $\mathbf{Z} \colon \Omega \to \mathbb{C}^n$ such that the vector $(\Re(Z_1), \Im(Z_1), \ldots, \Re(Z_n), \Im(Z_n))^T$ is a real random vector on $(\Omega, \mathcal{F}, P)$, where $\Re(z)$ denotes the real part of $z$ and $\Im(z)$ denotes the imaginary part of $z$.[1]: p. 292
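In code, such a vector can be realized by drawing a real random vector of dimension $2n$ and pairing its components into real and imaginary parts. The following is a minimal sketch, assuming independent standard Gaussian components; the function name and setup are illustrative, not from the source.

```python
import numpy as np

def sample_complex_vector(n, rng):
    """One realization of a complex random vector Z in C^n, obtained by
    pairing the 2n components of a real random vector into real and
    imaginary parts."""
    real_part = rng.standard_normal(n)  # Re(Z_1), ..., Re(Z_n)
    imag_part = rng.standard_normal(n)  # Im(Z_1), ..., Im(Z_n)
    return real_part + 1j * imag_part

rng = np.random.default_rng(0)
z = sample_complex_vector(3, rng)
print(z)               # a point in C^3
print(z.real, z.imag)  # the underlying real random vector in R^6
```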
The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $P(Z \leq 1+3i)$ make no sense. However, expressions of the form $P(\Re(Z) \leq 1, \Im(Z) \leq 3)$ do make sense. Therefore, the cumulative distribution function $F_{\mathbf{Z}} \colon \mathbb{C}^n \to [0,1]$ of a random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$ is defined as

$$F_{\mathbf{Z}}(\mathbf{z}) = P(\Re(Z_1) \leq \Re(z_1), \Im(Z_1) \leq \Im(z_1), \ldots, \Re(Z_n) \leq \Re(z_n), \Im(Z_n) \leq \Im(z_n)) \quad \text{(Eq.1)}$$
where $\mathbf{z} = (z_1, \ldots, z_n)^T$.
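The defining probability in Eq.1 can be estimated by Monte Carlo: count how often the real and imaginary parts of every component fall below the corresponding thresholds. A minimal sketch, assuming i.i.d. complex Gaussian samples (names are illustrative):

```python
import numpy as np

def empirical_cdf(samples, z):
    """Estimate F_Z(z) = P(Re(Z_k) <= Re(z_k), Im(Z_k) <= Im(z_k) for all k)
    from an array of samples with shape (num_samples, n)."""
    below = (samples.real <= z.real) & (samples.imag <= z.imag)
    return np.mean(np.all(below, axis=1))

rng = np.random.default_rng(0)
samples = (rng.standard_normal((100_000, 2))
           + 1j * rng.standard_normal((100_000, 2)))
print(empirical_cdf(samples, np.array([1 + 3j, 1 + 3j])))
```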
As in the real case, the expectation (also called the expected value) of a complex random vector is taken component-wise.[2]: p. 293
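Concretely, the mean vector is simply the vector of component means, so a sample average of complex draws approximates it directly. A small illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_true = np.array([1 + 2j, 0, -1j])
# 100,000 samples of a 3-component complex random vector with mean mean_true
samples = mean_true + (rng.standard_normal((100_000, 3))
                       + 1j * rng.standard_normal((100_000, 3)))

# The expectation is taken component-wise: averaging complex samples
# averages the real and imaginary parts separately.
print(samples.mean(axis=0))  # close to [1+2j, 0, -1j]
```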
See also: Covariance matrix § Complex random vector
The covariance matrix (also called the second central moment) $\operatorname{K}_{\mathbf{Z}\mathbf{Z}}$ contains the covariances between all pairs of components. The covariance matrix of an $n \times 1$ random vector is an $n \times n$ matrix whose $(i,j)$th element is the covariance between the $i$th and the $j$th random variables.[3]: p. 372 Unlike in the case of real random variables, the covariance between two complex random variables involves the complex conjugate of one of the two. Thus the covariance matrix is a Hermitian matrix.[4]: p. 293
$$\operatorname{K}_{\mathbf{Z}\mathbf{Z}} = \operatorname{cov}[\mathbf{Z}, \mathbf{Z}] = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])^H] = \operatorname{E}[\mathbf{Z}\mathbf{Z}^H] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{Z}^H]$$
The pseudo-covariance matrix (also called the relation matrix) is defined by replacing Hermitian transposition with ordinary transposition in the definition above.
$$\operatorname{J}_{\mathbf{Z}\mathbf{Z}} = \operatorname{cov}[\mathbf{Z}, \overline{\mathbf{Z}}] = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])^T] = \operatorname{E}[\mathbf{Z}\mathbf{Z}^T] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{Z}^T]$$
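Both matrices can be estimated from data by replacing expectations with sample averages; the only difference is whether the second factor is conjugated. A sketch under an assumed circularly symmetric Gaussian input (function and variable names are illustrative):

```python
import numpy as np

def cov_and_pseudocov(samples):
    """Plug-in estimates of K_ZZ and J_ZZ from samples of shape (N, n),
    one complex sample vector per row."""
    centered = samples - samples.mean(axis=0)
    N = samples.shape[0]
    K = centered.T @ centered.conj() / N  # (i,j): E[(Z_i - m_i) conj(Z_j - m_j)]
    J = centered.T @ centered / N         # (i,j): E[(Z_i - m_i) (Z_j - m_j)]
    return K, J

rng = np.random.default_rng(0)
samples = (rng.standard_normal((50_000, 2))
           + 1j * rng.standard_normal((50_000, 2)))
K, J = cov_and_pseudocov(samples)
print(np.round(K, 2))  # close to 2*I for this circularly symmetric example
print(np.round(J, 2))  # close to the zero matrix
```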
The covariance matrix is a Hermitian matrix, i.e.[5]: p. 293

$$\operatorname{K}_{\mathbf{Z}\mathbf{Z}}^H = \operatorname{K}_{\mathbf{Z}\mathbf{Z}}.$$

The pseudo-covariance matrix is a symmetric matrix, i.e.

$$\operatorname{J}_{\mathbf{Z}\mathbf{Z}}^T = \operatorname{J}_{\mathbf{Z}\mathbf{Z}}.$$

The covariance matrix is a positive semidefinite matrix, i.e.

$$\mathbf{a}^H \operatorname{K}_{\mathbf{Z}\mathbf{Z}} \mathbf{a} \geq 0 \quad \text{for all } \mathbf{a} \in \mathbb{C}^n.$$
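These three properties can be verified numerically on sample-based estimates. The following self-contained check is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
samples = (rng.standard_normal((50_000, 3))
           + 1j * rng.standard_normal((50_000, 3)))
centered = samples - samples.mean(axis=0)
K = centered.T @ centered.conj() / len(samples)  # covariance estimate
J = centered.T @ centered / len(samples)         # pseudo-covariance estimate

print(np.allclose(K, K.conj().T))               # Hermitian: K^H == K
print(np.allclose(J, J.T))                      # symmetric: J^T == J
print(np.all(np.linalg.eigvalsh(K) >= -1e-12))  # positive semidefinite
```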
See also: Complex random variable § Covariance matrix of real and imaginary parts
By decomposing the random vector $\mathbf{Z}$ into its real part $\mathbf{X} = \Re(\mathbf{Z})$ and imaginary part $\mathbf{Y} = \Im(\mathbf{Z})$ (i.e. $\mathbf{Z} = \mathbf{X} + i\mathbf{Y}$), the pair $(\mathbf{X}, \mathbf{Y})$ has a covariance matrix of the form:

$$\begin{bmatrix} \operatorname{K}_{\mathbf{X}\mathbf{X}} & \operatorname{K}_{\mathbf{X}\mathbf{Y}} \\ \operatorname{K}_{\mathbf{Y}\mathbf{X}} & \operatorname{K}_{\mathbf{Y}\mathbf{Y}} \end{bmatrix}$$
The matrices $\operatorname{K}_{\mathbf{Z}\mathbf{Z}}$ and $\operatorname{J}_{\mathbf{Z}\mathbf{Z}}$ can be related to the covariance matrices of $\mathbf{X}$ and $\mathbf{Y}$ via the following expressions:

$$\begin{aligned} \operatorname{K}_{\mathbf{Z}\mathbf{Z}} &= \operatorname{K}_{\mathbf{X}\mathbf{X}} + \operatorname{K}_{\mathbf{Y}\mathbf{Y}} + i(\operatorname{K}_{\mathbf{Y}\mathbf{X}} - \operatorname{K}_{\mathbf{X}\mathbf{Y}}) \\ \operatorname{J}_{\mathbf{Z}\mathbf{Z}} &= \operatorname{K}_{\mathbf{X}\mathbf{X}} - \operatorname{K}_{\mathbf{Y}\mathbf{Y}} + i(\operatorname{K}_{\mathbf{Y}\mathbf{X}} + \operatorname{K}_{\mathbf{X}\mathbf{Y}}) \end{aligned}$$
Conversely:

$$\begin{aligned} \operatorname{K}_{\mathbf{X}\mathbf{X}} &= \tfrac{1}{2}\Re(\operatorname{K}_{\mathbf{Z}\mathbf{Z}} + \operatorname{J}_{\mathbf{Z}\mathbf{Z}}) \\ \operatorname{K}_{\mathbf{Y}\mathbf{Y}} &= \tfrac{1}{2}\Re(\operatorname{K}_{\mathbf{Z}\mathbf{Z}} - \operatorname{J}_{\mathbf{Z}\mathbf{Z}}) \\ \operatorname{K}_{\mathbf{X}\mathbf{Y}} &= \tfrac{1}{2}\Im(\operatorname{J}_{\mathbf{Z}\mathbf{Z}} - \operatorname{K}_{\mathbf{Z}\mathbf{Z}}) \\ \operatorname{K}_{\mathbf{Y}\mathbf{X}} &= \tfrac{1}{2}\Im(\operatorname{J}_{\mathbf{Z}\mathbf{Z}} + \operatorname{K}_{\mathbf{Z}\mathbf{Z}}) \end{aligned}$$
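These conversion formulas hold exactly for the corresponding sample estimates as well, which gives a quick numerical sanity check. An illustrative sketch with correlated real and imaginary parts:

```python
import numpy as np

rng = np.random.default_rng(0)
# correlated real and imaginary parts, so the cross blocks are nonzero
X = rng.standard_normal((100_000, 2))
Y = 0.5 * X + rng.standard_normal((100_000, 2))
Z = X + 1j * Y

def block_cov(A, B):
    """Sample covariance matrix cov[A, B] for real data, rows = samples."""
    Ac, Bc = A - A.mean(axis=0), B - B.mean(axis=0)
    return Ac.T @ Bc / len(A)

Kxx, Kyy = block_cov(X, X), block_cov(Y, Y)
Kxy, Kyx = block_cov(X, Y), block_cov(Y, X)

Zc = Z - Z.mean(axis=0)
K = Zc.T @ Zc.conj() / len(Z)
J = Zc.T @ Zc / len(Z)

print(np.allclose(K, Kxx + Kyy + 1j * (Kyx - Kxy)))  # True
print(np.allclose(J, Kxx - Kyy + 1j * (Kyx + Kxy)))  # True
print(np.allclose(Kxy, 0.5 * np.imag(J - K)))        # True
```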
The cross-covariance matrix between two complex random vectors $\mathbf{Z}, \mathbf{W}$ is defined as:

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}[\mathbf{Z}, \mathbf{W}] = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^H] = \operatorname{E}[\mathbf{Z}\mathbf{W}^H] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}^H]$$
And the pseudo-cross-covariance matrix is defined as:

$$\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{cov}[\mathbf{Z}, \overline{\mathbf{W}}] = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^T] = \operatorname{E}[\mathbf{Z}\mathbf{W}^T] - \operatorname{E}[\mathbf{Z}]\operatorname{E}[\mathbf{W}^T]$$
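The plug-in estimators mirror the single-vector case, with the second argument drawn from $\mathbf{W}$. A sketch with illustrative names and an assumed linear dependence between $\mathbf{Z}$ and $\mathbf{W}$:

```python
import numpy as np

def cross_cov(Z, W):
    """Estimates of K_ZW (conjugated form) and J_ZW (transpose form)
    from paired samples, rows = joint samples of (Z, W)."""
    Zc, Wc = Z - Z.mean(axis=0), W - W.mean(axis=0)
    K_zw = Zc.T @ Wc.conj() / len(Z)  # E[(Z - m_Z)(W - m_W)^H]
    J_zw = Zc.T @ Wc / len(Z)         # E[(Z - m_Z)(W - m_W)^T]
    return K_zw, J_zw

rng = np.random.default_rng(0)
Z = rng.standard_normal((50_000, 2)) + 1j * rng.standard_normal((50_000, 2))
W = Z @ np.array([[1, 2], [0, 1j]])  # W depends linearly on Z
K_zw, J_zw = cross_cov(Z, W)
print(np.round(K_zw, 2), np.round(J_zw, 2), sep="\n")
```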
Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if

$$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0.$$
Main article: Independence (probability theory)
Two complex random vectors $\mathbf{Z} = (Z_1, \ldots, Z_m)^T$ and $\mathbf{W} = (W_1, \ldots, W_n)^T$ are called independent if

$$F_{\mathbf{Z}, \mathbf{W}}(\mathbf{z}, \mathbf{w}) = F_{\mathbf{Z}}(\mathbf{z}) \cdot F_{\mathbf{W}}(\mathbf{w}) \quad \text{for all } \mathbf{z}, \mathbf{w}$$
where $F_{\mathbf{Z}}(\mathbf{z})$ and $F_{\mathbf{W}}(\mathbf{w})$ denote the cumulative distribution functions of $\mathbf{Z}$ and $\mathbf{W}$ as defined in Eq.1 and $F_{\mathbf{Z},\mathbf{W}}(\mathbf{z},\mathbf{w})$ denotes their joint cumulative distribution function. Independence of $\mathbf{Z}$ and $\mathbf{W}$ is often denoted by $\mathbf{Z} \perp\!\!\!\perp \mathbf{W}$. Written component-wise, $\mathbf{Z}$ and $\mathbf{W}$ are called independent if

$$F_{Z_1,\ldots,Z_m,W_1,\ldots,W_n}(z_1,\ldots,z_m,w_1,\ldots,w_n) = F_{Z_1,\ldots,Z_m}(z_1,\ldots,z_m) \cdot F_{W_1,\ldots,W_n}(w_1,\ldots,w_n) \quad \text{for all } z_1,\ldots,z_m,w_1,\ldots,w_n.$$
A complex random vector $\mathbf{Z}$ is called circularly symmetric if for every deterministic $\varphi \in [-\pi, \pi)$ the distribution of $e^{i\varphi}\mathbf{Z}$ equals the distribution of $\mathbf{Z}$.[6]: pp. 500–501
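For example, a complex Gaussian vector with i.i.d. real and imaginary parts is circularly symmetric: rotating every sample by a fixed phase leaves its statistics unchanged. An illustrative sketch (the phase 0.7 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
# Real and imaginary parts i.i.d. Gaussian -> circularly symmetric Z
Z = rng.standard_normal((100_000, 2)) + 1j * rng.standard_normal((100_000, 2))
Z_rot = np.exp(1j * 0.7) * Z  # multiply every sample by e^{i*phi}

for data in (Z, Z_rot):
    Zc = data - data.mean(axis=0)
    K = Zc.T @ Zc.conj() / len(data)
    print(np.round(K, 2))  # same covariance before and after rotation
# A consequence of circular symmetry: E[Z] = 0 and the pseudo-covariance vanishes.
```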
A complex random vector $\mathbf{Z}$ is called proper if the following three conditions are all satisfied:[9]: p. 293

- $\operatorname{E}[\mathbf{Z}] = 0$ (zero mean)
- $\operatorname{var}[Z_1] < \infty, \ldots, \operatorname{var}[Z_n] < \infty$ (all components have finite variance)
- $\operatorname{E}[\mathbf{Z}\mathbf{Z}^T] = 0$ (vanishing pseudo-covariance)
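Properness can be probed empirically by testing whether the sample mean and the sample pseudo-covariance are near zero. The following heuristic check is a sketch only; the tolerance and names are illustrative assumptions:

```python
import numpy as np

def is_approximately_proper(samples, tol=0.02):
    """Heuristic empirical check of properness from samples (rows = samples):
    near-zero mean and near-zero pseudo-covariance E[Z Z^T]."""
    mean = samples.mean(axis=0)
    J = samples.T @ samples / len(samples)  # E[Z Z^T] estimate (mean ~ 0 checked)
    return np.abs(mean).max() < tol and np.abs(J).max() < tol

rng = np.random.default_rng(0)
proper = (rng.standard_normal((200_000, 2))
          + 1j * rng.standard_normal((200_000, 2)))
improper = proper.real + 1j * 0.0  # purely real: E[Z Z^T] != 0
print(is_approximately_proper(proper))    # True
print(is_approximately_proper(improper))  # False
```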
Two complex random vectors $\mathbf{Z}, \mathbf{W}$ are called jointly proper if the composite random vector $(Z_1, Z_2, \ldots, Z_m, W_1, W_2, \ldots, W_n)^T$ is proper.
The Cauchy–Schwarz inequality for complex random vectors is

$$\left|\operatorname{E}[\mathbf{Z}^H \mathbf{W}]\right|^2 \leq \operatorname{E}[\mathbf{Z}^H \mathbf{Z}] \operatorname{E}[\mathbf{W}^H \mathbf{W}].$$
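The inequality also holds exactly for sample averages, so a quick numerical check is possible. An illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((10_000, 3)) + 1j * rng.standard_normal((10_000, 3))
W = 2 * Z + (rng.standard_normal((10_000, 3))
             + 1j * rng.standard_normal((10_000, 3)))

inner = np.mean(np.sum(Z.conj() * W, axis=1))     # estimate of E[Z^H W]
norm_Z = np.mean(np.sum(np.abs(Z) ** 2, axis=1))  # estimate of E[Z^H Z]
norm_W = np.mean(np.sum(np.abs(W) ** 2, axis=1))  # estimate of E[W^H W]
print(abs(inner) ** 2 <= norm_Z * norm_W)  # True
```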
The characteristic function of a complex random vector $\mathbf{Z}$ with $n$ components is a function $\mathbb{C}^n \to \mathbb{C}$ defined by:[14]: p. 295

$$\varphi_{\mathbf{Z}}(\boldsymbol{\omega}) = \operatorname{E}\left[e^{i \Re(\boldsymbol{\omega}^H \mathbf{Z})}\right] = \operatorname{E}\left[e^{i\left(\Re(\boldsymbol{\omega})^T \Re(\mathbf{Z}) + \Im(\boldsymbol{\omega})^T \Im(\mathbf{Z})\right)}\right]$$
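The characteristic function can be approximated by averaging $e^{i\Re(\boldsymbol{\omega}^H \mathbf{Z})}$ over samples. For a vector with i.i.d. standard Gaussian real and imaginary parts the closed form is $e^{-\|\boldsymbol{\omega}\|^2/2}$, which the sketch below uses as a reference (names are illustrative):

```python
import numpy as np

def char_fn(samples, omega):
    """Monte Carlo estimate of the characteristic function
    phi_Z(omega) = E[exp(i * Re(omega^H Z))], rows of `samples` = samples."""
    return np.mean(np.exp(1j * np.real(samples @ omega.conj())))

rng = np.random.default_rng(0)
# Z with i.i.d. standard Gaussian real and imaginary parts
Z = rng.standard_normal((200_000, 2)) + 1j * rng.standard_normal((200_000, 2))
omega = np.array([0.5 + 0.5j, 1.0])

estimate = char_fn(Z, omega)
exact = np.exp(-np.sum(np.abs(omega) ** 2) / 2)  # closed form for this Z
print(estimate, exact)  # estimate close to exact, imaginary part near 0
```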
Lapidoth, Amos (2009). A Foundation in Digital Communication. Cambridge University Press. ISBN 978-0-521-19395-5.
Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
Tse, David (2005). Fundamentals of Wireless Communication. Cambridge University Press.