A complex random variable $Z$ on the probability space $(\Omega, \mathcal{F}, P)$ is a function $Z \colon \Omega \to \mathbb{C}$ such that both its real part $\Re(Z)$ and its imaginary part $\Im(Z)$ are real random variables on $(\Omega, \mathcal{F}, P)$.
Consider a random variable that may take only the three complex values $1+i$, $1-i$, and $2$, with probabilities $1/4$, $1/4$, and $1/2$, respectively. This is a simple example of a complex random variable.
The expectation of this random variable may be calculated directly:

$$\operatorname{E}[Z] = \frac{1}{4}(1+i) + \frac{1}{4}(1-i) + \frac{1}{2}\cdot 2 = \frac{3}{2}.$$
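The calculation above can be checked numerically. The following is a minimal Python sketch using the built-in complex type, with the values and probabilities taken from the example:

```python
# Expectation of the discrete complex random variable from the example above.
values = [1 + 1j, 1 - 1j, 2 + 0j]
probs = [0.25, 0.25, 0.5]

# E[Z] is the probability-weighted sum over the support.
expectation = sum(z * p for z, p in zip(values, probs))
print(expectation)  # (1.5+0j), i.e. 3/2
```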
Another example of a complex random variable is the uniform distribution over the filled unit disk, i.e. the set $\{z \in \mathbb{C} \mid |z| \le 1\}$. This random variable is an example of a complex random variable for which the probability density function is defined: since the disk has area $\pi$, the density is the constant $f_Z(z) = 1/\pi$ for $|z| \le 1$ and $0$ otherwise.
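One standard way to draw from this distribution is rejection sampling from the enclosing square; the sketch below (not from the source) uses only the Python standard library:

```python
import random

def sample_unit_disk():
    """Draw one sample from the uniform distribution on {z : |z| <= 1}
    by rejecting points of the square [-1, 1]^2 that fall outside the disk."""
    while True:
        x, y = random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            return complex(x, y)

samples = [sample_unit_disk() for _ in range(100_000)]
# By symmetry the mean should be close to 0, the centre of the disk.
print(sum(samples) / len(samples))
```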
Main article: Complex normal distribution
Complex Gaussian random variables are often encountered in applications. They are a straightforward generalization of real Gaussian random variables.
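A standard (circularly symmetric) complex Gaussian with zero mean and unit variance can be generated from two independent real Gaussians. The normalization below is one common convention, assumed here for illustration:

```python
import random

def sample_complex_gaussian():
    """One draw with E[Z] = 0 and Var[Z] = E[|Z|^2] = 1:
    real and imaginary parts are independent N(0, 1/2)."""
    s = 0.5 ** 0.5  # standard deviation of each part
    return complex(random.gauss(0.0, s), random.gauss(0.0, s))

zs = [sample_complex_gaussian() for _ in range(100_000)]
print(sum(abs(z) ** 2 for z in zs) / len(zs))  # close to 1
```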
The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $P(Z \le 1+3i)$ make no sense. However, expressions of the form $P(\Re(Z) \le 1, \Im(Z) \le 3)$ do make sense. Therefore, the cumulative distribution function $F_Z \colon \mathbb{C} \to [0,1]$ of a complex random variable is defined via the joint distribution of its real and imaginary parts:

$$F_Z(z) = F_{\Re(Z),\Im(Z)}(\Re(z), \Im(z)) = P(\Re(Z) \le \Re(z), \Im(Z) \le \Im(z)).$$
The probability density function of a complex random variable is defined as $f_Z(z) = f_{\Re(Z),\Im(Z)}(\Re(z), \Im(z))$, i.e. the value of the density function at a point $z \in \mathbb{C}$ is defined to be equal to the value of the joint density of the real and imaginary parts of the random variable evaluated at $(\Re(z), \Im(z))$.
An equivalent definition is given by $f_Z(z) = \frac{\partial^2}{\partial x \,\partial y} P(\Re(Z) \le x, \Im(Z) \le y)$, where $x = \Re(z)$ and $y = \Im(z)$.
As in the real case, the density function may not exist.
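The joint-CDF definition above translates directly into a Monte Carlo estimate. The sketch below evaluates $F_Z(1+3i)$ for an arbitrarily chosen standard complex Gaussian (the distribution is an illustrative assumption, not fixed by the source):

```python
import random

def sample_z():
    # Illustrative choice: circularly symmetric complex Gaussian, Var[Z] = 1.
    s = 0.5 ** 0.5
    return complex(random.gauss(0.0, s), random.gauss(0.0, s))

n = 100_000
zs = [sample_z() for _ in range(n)]
# F_Z(1 + 3i) = P(Re(Z) <= 1, Im(Z) <= 3), estimated as an empirical frequency.
f_hat = sum(1 for z in zs if z.real <= 1 and z.imag <= 3) / n
print(f_hat)
```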
The expectation of a complex random variable is defined in terms of the expectations of its real and imaginary parts: [3]: p. 112

$$\operatorname{E}[Z] = \operatorname{E}[\Re(Z)] + i\operatorname{E}[\Im(Z)].$$
Note that the expectation of a complex random variable does not exist if $\operatorname{E}[\Re(Z)]$ or $\operatorname{E}[\Im(Z)]$ does not exist.
If the complex random variable $Z$ has a probability density function $f_Z(z)$, then the expectation is given by $\operatorname{E}[Z] = \iint_{\mathbb{C}} z \cdot f_Z(z)\,dx\,dy$, where $z = x + iy$.
If the complex random variable $Z$ has a probability mass function $p_Z(z)$, then the expectation is given by $\operatorname{E}[Z] = \sum_{z} z \cdot p_Z(z)$, where the sum runs over the support of $Z$.
Whenever the expectation of a complex random variable exists, taking the expectation commutes with complex conjugation:

$$\overline{\operatorname{E}[Z]} = \operatorname{E}[\overline{Z}].$$
The expected value operator $\operatorname{E}[\cdot]$ is linear in the sense that

$$\operatorname{E}[aZ + bW] = a\operatorname{E}[Z] + b\operatorname{E}[W]$$
for any complex coefficients $a, b$, even if $Z$ and $W$ are not independent.
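Both properties are identities of the underlying expectations, so they hold exactly for empirical means over the same sample. A quick check, with deliberately dependent $Z$ and $W$:

```python
import random

random.seed(0)
n = 100_000
zs = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
# W is a function of Z plus noise, so Z and W are not independent.
ws = [2 * z + complex(random.gauss(0, 1), 0) for z in zs]

def mean(vs):
    return sum(vs) / len(vs)

# Linearity: E[aZ + bW] = a E[Z] + b E[W]
a, b = 1 + 2j, 3 - 1j
lhs = mean([a * z + b * w for z, w in zip(zs, ws)])
rhs = a * mean(zs) + b * mean(ws)
print(abs(lhs - rhs))  # ~0 (floating-point rounding only)

# Conjugation commutes with expectation.
print(abs(mean(zs).conjugate() - mean([z.conjugate() for z in zs])))  # ~0
```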
The variance is defined in terms of absolute squares as: [4]: p. 117

$$\operatorname{Var}[Z] = \operatorname{E}\left[\left|Z - \operatorname{E}[Z]\right|^2\right] = \operatorname{E}\left[|Z|^2\right] - \left|\operatorname{E}[Z]\right|^2.$$
The variance is always a nonnegative real number. It is equal to the sum of the variances of the real and imaginary parts of the complex random variable:

$$\operatorname{Var}[Z] = \operatorname{Var}[\Re(Z)] + \operatorname{Var}[\Im(Z)].$$
The variance of a linear combination of complex random variables may be calculated using the following formula:

$$\operatorname{Var}\left[\sum_{k=1}^{N} a_k Z_k\right] = \sum_{i=1}^{N}\sum_{j=1}^{N} a_i \overline{a_j} \operatorname{Cov}[Z_i, Z_j].$$
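This formula is again exact over an empirical distribution. A sketch verifying it for two correlated variables and $N = 2$ (the empirical covariance helper is an illustrative implementation):

```python
import random

random.seed(1)
n = 100_000
zs1 = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
zs2 = [z + complex(random.gauss(0, 1), random.gauss(0, 1)) for z in zs1]

def mean(vs):
    return sum(vs) / len(vs)

def cov(xs, ys):
    """Empirical Cov[X, Y] = E[(X - E[X]) * conj(Y - E[Y])]."""
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my).conjugate() for x, y in zip(xs, ys)])

a = [1 + 1j, 2 - 1j]
data = [zs1, zs2]
combo = [a[0] * z1 + a[1] * z2 for z1, z2 in zip(zs1, zs2)]

lhs = cov(combo, combo).real  # Var of the linear combination
rhs = sum(a[i] * a[j].conjugate() * cov(data[i], data[j])
          for i in range(2) for j in range(2)).real
print(lhs, rhs)  # agree up to rounding
```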
The pseudo-variance is a special case of the pseudo-covariance. It is defined in terms of ordinary complex squares (rather than absolute squares), here denoted $\operatorname{J}_{ZZ}$:

$$\operatorname{J}_{ZZ} = \operatorname{E}\left[(Z - \operatorname{E}[Z])^2\right].$$
Unlike the variance of $Z$, which is always real and nonnegative, the pseudo-variance of $Z$ is in general complex.
Further information: Complex random vector § Covariance matrices of real and imaginary parts
For a general complex random variable, the pair $(\Re(Z), \Im(Z))$ has a covariance matrix of the form:

$$\begin{pmatrix} \operatorname{Var}[\Re(Z)] & \operatorname{Cov}[\Re(Z), \Im(Z)] \\ \operatorname{Cov}[\Im(Z), \Re(Z)] & \operatorname{Var}[\Im(Z)] \end{pmatrix}$$
The matrix is symmetric, so $\operatorname{Cov}[\Re(Z), \Im(Z)] = \operatorname{Cov}[\Im(Z), \Re(Z)]$.
Its elements can be expressed through the variance and the pseudo-variance:

$$\operatorname{Var}[\Re(Z)] = \tfrac{1}{2}\left(\operatorname{Var}[Z] + \Re(\operatorname{J}_{ZZ})\right)$$
$$\operatorname{Var}[\Im(Z)] = \tfrac{1}{2}\left(\operatorname{Var}[Z] - \Re(\operatorname{J}_{ZZ})\right)$$
$$\operatorname{Cov}[\Re(Z), \Im(Z)] = \tfrac{1}{2}\Im(\operatorname{J}_{ZZ})$$
Conversely:

$$\operatorname{Var}[Z] = \operatorname{Var}[\Re(Z)] + \operatorname{Var}[\Im(Z)]$$
$$\operatorname{J}_{ZZ} = \operatorname{Var}[\Re(Z)] - \operatorname{Var}[\Im(Z)] + 2i\operatorname{Cov}[\Re(Z), \Im(Z)].$$
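These relations can be verified numerically. The sketch below builds a variable whose parts are correlated with unequal variances, then recovers the real and imaginary second moments from $\operatorname{Var}[Z]$ and $\operatorname{J}_{ZZ}$:

```python
import random

random.seed(2)
n = 200_000
# Correlated parts with unequal variances, so every term below is non-trivial.
zs = [complex(x, 0.5 * x + random.gauss(0, 2))
      for x in (random.gauss(0, 1) for _ in range(n))]

def mean(vs):
    return sum(vs) / len(vs)

mz = mean(zs)
var = mean([abs(z - mz) ** 2 for z in zs])   # Var[Z]
pvar = mean([(z - mz) ** 2 for z in zs])     # pseudo-variance J_ZZ

# Recovered from Var[Z] and J_ZZ:
print(0.5 * (var + pvar.real))  # Var[Re(Z)]
print(0.5 * (var - pvar.real))  # Var[Im(Z)]
print(0.5 * pvar.imag)          # Cov[Re(Z), Im(Z)]

# Direct computation for comparison:
res, ims = [z.real for z in zs], [z.imag for z in zs]
mr, mi = mean(res), mean(ims)
print(mean([(r - mr) ** 2 for r in res]),
      mean([(i - mi) ** 2 for i in ims]),
      mean([(r - mr) * (i - mi) for r, i in zip(res, ims)]))
```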
The covariance between two complex random variables $Z, W$ is defined as: [5]: p. 119

$$\operatorname{Cov}[Z, W] = \operatorname{E}\left[(Z - \operatorname{E}[Z])\overline{(W - \operatorname{E}[W])}\right].$$
Notice the complex conjugation of the second factor in the definition.
In contrast to the case of real random variables, we also define a pseudo-covariance (also called complementary covariance):

$$\operatorname{J}_{ZW} = \operatorname{E}\left[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])\right].$$
The second-order statistics of complex random variables are fully characterized by the covariance and the pseudo-covariance.
The covariance has the following properties:

- Conjugate symmetry: $\operatorname{Cov}[Z, W] = \overline{\operatorname{Cov}[W, Z]}$
- Sesquilinearity: $\operatorname{Cov}[aZ, W] = a\operatorname{Cov}[Z, W]$ and $\operatorname{Cov}[Z, aW] = \overline{a}\operatorname{Cov}[Z, W]$ for any complex constant $a$, together with additivity in each argument
- $\operatorname{Cov}[Z, Z] = \operatorname{Var}[Z]$
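A sketch of the empirical covariance and pseudo-covariance, checking conjugate symmetry and sesquilinearity in the second argument (both identities hold exactly over a common sample):

```python
import random

random.seed(3)
n = 100_000
zs = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
ws = [0.5 * z + complex(random.gauss(0, 1), random.gauss(0, 1)) for z in zs]

def mean(vs):
    return sum(vs) / len(vs)

def cov(xs, ys):
    """Covariance: the second factor is conjugated."""
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my).conjugate() for x, y in zip(xs, ys)])

def pcov(xs, ys):
    """Pseudo-covariance: no conjugation."""
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])

# Conjugate symmetry: Cov[Z, W] = conj(Cov[W, Z])
print(abs(cov(zs, ws) - cov(ws, zs).conjugate()))  # ~0
# Sesquilinearity: Cov[Z, aW] = conj(a) Cov[Z, W]
a = 2 + 1j
print(abs(cov(zs, [a * w for w in ws]) - a.conjugate() * cov(zs, ws)))  # ~0
print(cov(zs, ws), pcov(zs, ws))  # covariance vs. pseudo-covariance
```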
Circular symmetry of complex random variables is a common assumption used in the field of wireless communication. A typical example of a circularly symmetric complex random variable is the complex Gaussian random variable with zero mean and zero pseudo-variance.
A complex random variable $Z$ is circularly symmetric if, for any deterministic $\phi \in [-\pi, \pi]$, the distribution of $e^{i\phi}Z$ equals the distribution of $Z$.
By definition, a circularly symmetric complex random variable has $\operatorname{E}[Z] = \operatorname{E}[e^{i\phi}Z] = e^{i\phi}\operatorname{E}[Z]$ for any $\phi$.
Thus the expectation of a circularly symmetric complex random variable can only be either zero or undefined.
Additionally, $\operatorname{E}[ZZ] = \operatorname{E}[e^{i\phi}Z \, e^{i\phi}Z] = e^{2i\phi}\operatorname{E}[ZZ]$ for any $\phi$.
Thus the pseudo-variance of a circularly symmetric complex random variable can only be zero.
If $Z$ and $e^{i\phi}Z$ have the same distribution for all $\phi$, the phase of $Z$ must be uniformly distributed over $[-\pi, \pi]$ and independent of the amplitude of $Z$. [6]
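These consequences are easy to observe empirically for the circularly symmetric Gaussian. A sketch (the quadrant count is a crude stand-in for a full uniformity test):

```python
import cmath
import random

random.seed(4)
n = 200_000
s = 0.5 ** 0.5
zs = [complex(random.gauss(0, s), random.gauss(0, s)) for _ in range(n)]

def mean(vs):
    return sum(vs) / len(vs)

print(abs(mean(zs)))                   # ~0: the mean vanishes
print(abs(mean([z * z for z in zs])))  # ~0: the pseudo-variance vanishes

# The phase should be uniform on [-pi, pi]; one quadrant holds ~1/4 of the mass.
phases = [cmath.phase(z) for z in zs]
print(sum(1 for p in phases if 0 <= p < cmath.pi / 2) / n)  # ~0.25
```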
The concept of proper random variables is unique to complex random variables and has no counterpart for real random variables.
A complex random variable $Z$ is called proper if the following three conditions are all satisfied:

- $\operatorname{E}[Z] = 0$
- $\operatorname{Var}[Z] = \operatorname{E}[|Z|^2] < \infty$
- $\operatorname{E}[Z^2] = 0$
This definition is equivalent to the following conditions; that is, a complex random variable is proper if, and only if:

- $\operatorname{E}[Z] = 0$
- $\operatorname{E}[\Re(Z)^2] = \operatorname{E}[\Im(Z)^2] < \infty$
- $\operatorname{E}[\Re(Z)\Im(Z)] = 0$
Theorem. Every circularly symmetric complex random variable with finite variance is proper.
For a proper complex random variable, the covariance matrix of the pair $(\Re(Z), \Im(Z))$ has the following simple form:

$$\begin{pmatrix} \tfrac{1}{2}\operatorname{Var}[Z] & 0 \\ 0 & \tfrac{1}{2}\operatorname{Var}[Z] \end{pmatrix}$$

That is, $\operatorname{Var}[\Re(Z)] = \operatorname{Var}[\Im(Z)] = \tfrac{1}{2}\operatorname{Var}[Z]$ and $\operatorname{Cov}[\Re(Z), \Im(Z)] = 0$.
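A numerical illustration, using the circularly symmetric Gaussian (which is proper by the theorem above):

```python
import random

random.seed(5)
n = 200_000
s = 0.5 ** 0.5
# Zero-mean circularly symmetric Gaussian: proper, with Var[Z] = 1.
zs = [complex(random.gauss(0, s), random.gauss(0, s)) for _ in range(n)]

def mean(vs):
    return sum(vs) / len(vs)

res, ims = [z.real for z in zs], [z.imag for z in zs]
var_z = mean([abs(z) ** 2 for z in zs])  # mean is ~0, so centring is omitted

print(mean([r * r for r in res]), var_z / 2)    # Var[Re(Z)] ~ Var[Z] / 2
print(mean([i * i for i in ims]), var_z / 2)    # Var[Im(Z)] ~ Var[Z] / 2
print(mean([r * i for r, i in zip(res, ims)]))  # Cov[Re(Z), Im(Z)] ~ 0
```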
The Cauchy–Schwarz inequality for complex random variables, which can be derived using the triangle inequality and Hölder's inequality, is

$$\left|\operatorname{E}\left[Z\overline{W}\right]\right|^2 \le \operatorname{E}\left[|Z|^2\right]\operatorname{E}\left[|W|^2\right].$$
The characteristic function of a complex random variable is a function $\varphi_Z \colon \mathbb{C} \to \mathbb{C}$ defined by

$$\varphi_Z(\omega) = \operatorname{E}\left[e^{i\Re(\overline{\omega}Z)}\right] = \operatorname{E}\left[e^{i(\Re(\omega)\Re(Z) + \Im(\omega)\Im(Z))}\right].$$
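An empirical characteristic function is a direct transcription of this definition. For the unit-variance circularly symmetric Gaussian the exact value $e^{-|\omega|^2/4}$ is available for comparison (a standard result for this particular distribution, which is chosen here purely for illustration):

```python
import cmath
import math
import random

random.seed(6)
n = 200_000
s = 0.5 ** 0.5
zs = [complex(random.gauss(0, s), random.gauss(0, s)) for _ in range(n)]

def ecf(omega):
    """Empirical characteristic function: E[exp(i * Re(conj(omega) * Z))]."""
    return sum(cmath.exp(1j * (omega.conjugate() * z).real) for z in zs) / n

w = 1 + 1j
print(ecf(w))                      # Monte Carlo estimate
print(math.exp(-abs(w) ** 2 / 4))  # exact value for this Gaussian
```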
Eriksson, Jan; Ollila, Esa; Koivunen, Visa (2009). "Statistics for complex random variables revisited". 2009 IEEE International Conference on Acoustics, Speech and Signal Processing. Taipei, Taiwan: Institute of Electrical and Electronics Engineers. pp. 3565–3568. doi:10.1109/ICASSP.2009.4960396.
Lapidoth, A. (2009). A Foundation in Digital Communication. Cambridge University Press. ISBN 9780521193955.
Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
Schreier, Peter J.; Scharf, Louis L. (2011). Statistical Signal Processing of Complex-Valued Data. Cambridge University Press. ISBN 9780511815911.