In statistics, the algebra of random variables provides rules for the symbolic manipulation of sums, products, and functions of random variables, without requiring the full measure-theoretic machinery of probability. This algebra is used to determine probability distributions, expectations, variances, and covariances of combined variables. Although the operations resemble those of elementary algebra, their effect on probability distributions is more involved, which motivates specialized sets of rules (expectation, variance, covariance, and moment algebras) that go beyond standard symbolic manipulation.
Elementary symbolic algebra of random variables
Considering two random variables $X$ and $Y$, the following algebraic operations are possible:
- Addition: $Z = X + Y = Y + X$
- Subtraction: $Z = X - Y = -Y + X$
- Multiplication: $Z = XY = YX$
- Division: provided $Y \neq 0$, $Z = X/Y = X \cdot (1/Y) = (1/Y) \cdot X$
- Exponentiation: $Z = X^{Y} = e^{Y \ln(X)}$, well defined when $X > 0$
In all cases, the variable $Z$ resulting from each operation is also a random variable. All commutative and associative properties of conventional algebraic operations are also valid for random variables. If any of the random variables is replaced by a deterministic variable or by a constant value, all the previous properties remain valid.
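As a concrete illustration, the following sketch (a minimal Python example using NumPy; the uniform distributions and the seed are arbitrary choices) represents each random variable by an array of Monte Carlo samples, on which the operations above act element-wise:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative distributions, chosen so that X > 0 (for X**Y) and Y != 0 (for X/Y).
X = rng.uniform(0.5, 2.5, size=n)
Y = rng.uniform(1.0, 3.0, size=n)

# Each operation acts sample-by-sample and yields another random variable.
assert np.allclose(X + Y, Y + X)                   # addition commutes
assert np.allclose(X - Y, -Y + X)                  # subtraction, as written above
assert np.allclose(X * Y, Y * X)                   # multiplication commutes
assert np.allclose(X / Y, (1.0 / Y) * X)           # division via the reciprocal
assert np.allclose(X ** Y, np.exp(Y * np.log(X)))  # exponentiation identity
```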
Expectation algebra for random variables
The expected value $\operatorname{E}[Z]$ of the random variable $Z$ resulting from an algebraic operation between two random variables can be calculated using the following set of rules:
- Addition: $\operatorname{E}[Z] = \operatorname{E}[X + Y] = \operatorname{E}[X] + \operatorname{E}[Y] = \operatorname{E}[Y] + \operatorname{E}[X]$
- Subtraction: $\operatorname{E}[Z] = \operatorname{E}[X - Y] = \operatorname{E}[X] - \operatorname{E}[Y] = -\operatorname{E}[Y] + \operatorname{E}[X]$
- Multiplication: $\operatorname{E}[Z] = \operatorname{E}[XY] = \operatorname{E}[YX]$. In particular, if $X$ and $Y$ are independent of each other, then $\operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y] = \operatorname{E}[Y] \cdot \operatorname{E}[X]$.
- Division: $\operatorname{E}[Z] = \operatorname{E}[X/Y] = \operatorname{E}[X \cdot (1/Y)] = \operatorname{E}[(1/Y) \cdot X]$. In particular, if $X$ and $Y$ are independent of each other, then $\operatorname{E}[X/Y] = \operatorname{E}[X] \cdot \operatorname{E}[1/Y] = \operatorname{E}[1/Y] \cdot \operatorname{E}[X]$.
- Exponentiation: $\operatorname{E}[Z] = \operatorname{E}[X^{Y}] = \operatorname{E}[e^{Y \ln(X)}]$
If any of the random variables is replaced by a deterministic variable or by a constant value $k$, the previous properties remain valid, since then $\Pr(X = k) = 1$ and, therefore, $\operatorname{E}[X] = k$.
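A quick Monte Carlo check of the expectation rules (a sketch; the exponential and uniform distributions are arbitrary, independent examples):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

X = rng.exponential(scale=2.0, size=n)  # E[X] = 2
Y = rng.uniform(1.0, 3.0, size=n)       # E[Y] = 2, Y > 0, independent of X

print(np.mean(X + Y), np.mean(X) + np.mean(Y))        # E[X+Y] = E[X] + E[Y]
print(np.mean(X - Y), np.mean(X) - np.mean(Y))        # E[X-Y] = E[X] - E[Y]
print(np.mean(X * Y), np.mean(X) * np.mean(Y))        # independence: E[XY] = E[X]E[Y]
print(np.mean(X / Y), np.mean(X) * np.mean(1.0 / Y))  # independence: E[X/Y] = E[X]E[1/Y]
```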
If $Z$ is defined as a general non-linear algebraic function $f$ of a random variable $X$, then:

$$\operatorname{E}[Z] = \operatorname{E}[f(X)] \neq f(\operatorname{E}[X])$$
Some examples of this property include:
- $\operatorname{E}[X^{2}] \neq \operatorname{E}[X]^{2}$
- $\operatorname{E}[1/X] \neq 1/\operatorname{E}[X]$
- $\operatorname{E}[e^{X}] \neq e^{\operatorname{E}[X]}$
- $\operatorname{E}[\ln(X)] \neq \ln(\operatorname{E}[X])$
The exact value of the expectation of the non-linear function will depend on the particular probability distribution of the random variable $X$.
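For instance, taking $X \sim U(1, 3)$ (an arbitrary choice that keeps $1/X$ and $\ln(X)$ well defined), the gap between $\operatorname{E}[f(X)]$ and $f(\operatorname{E}[X])$ is easy to see numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(1.0, 3.0, size=1_000_000)  # E[X] = 2, X > 0

print(np.mean(X**2),      np.mean(X)**2)       # ~4.33 vs 4.00
print(np.mean(1.0 / X),   1.0 / np.mean(X))    # ~0.549 vs 0.500
print(np.mean(np.exp(X)), np.exp(np.mean(X)))  # ~8.68 vs ~7.39
print(np.mean(np.log(X)), np.log(np.mean(X)))  # ~0.648 vs ~0.693
```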
Variance algebra for random variables
The variance $\operatorname{Var}[Z]$ of the random variable $Z$ resulting from an algebraic operation between random variables can be calculated using the following set of rules:
- Addition: $\operatorname{Var}[Z] = \operatorname{Var}[X + Y] = \operatorname{Var}[X] + 2\operatorname{Cov}[X, Y] + \operatorname{Var}[Y]$. In particular, if $X$ and $Y$ are independent of each other, then $\operatorname{Var}[X + Y] = \operatorname{Var}[X] + \operatorname{Var}[Y]$.
- Subtraction: $\operatorname{Var}[Z] = \operatorname{Var}[X - Y] = \operatorname{Var}[X] - 2\operatorname{Cov}[X, Y] + \operatorname{Var}[Y]$. In particular, if $X$ and $Y$ are independent of each other, then $\operatorname{Var}[X - Y] = \operatorname{Var}[X] + \operatorname{Var}[Y]$. That is, for independent random variables the variance is the same for additions and subtractions: $\operatorname{Var}[X + Y] = \operatorname{Var}[X - Y] = \operatorname{Var}[Y - X] = \operatorname{Var}[-X - Y]$.
- Multiplication: $\operatorname{Var}[Z] = \operatorname{Var}[XY] = \operatorname{Var}[YX]$. In particular, if $X$ and $Y$ are independent of each other, then:
$$\begin{aligned}\operatorname{Var}[XY] &= \operatorname{E}[X^{2}] \cdot \operatorname{E}[Y^{2}] - \left(\operatorname{E}[X] \cdot \operatorname{E}[Y]\right)^{2} \\ &= \operatorname{Var}[X] \cdot \operatorname{Var}[Y] + \operatorname{Var}[X] \cdot \left(\operatorname{E}[Y]\right)^{2} + \operatorname{Var}[Y] \cdot \left(\operatorname{E}[X]\right)^{2}.\end{aligned}$$
- Division: $\operatorname{Var}[Z] = \operatorname{Var}[X/Y] = \operatorname{Var}[X \cdot (1/Y)] = \operatorname{Var}[(1/Y) \cdot X]$. In particular, if $X$ and $Y$ are independent of each other, then:
$$\begin{aligned}\operatorname{Var}[X/Y] &= \operatorname{E}[X^{2}] \cdot \operatorname{E}[1/Y^{2}] - \left(\operatorname{E}[X] \cdot \operatorname{E}[1/Y]\right)^{2} \\ &= \operatorname{Var}[X] \cdot \operatorname{Var}[1/Y] + \operatorname{Var}[X] \cdot \left(\operatorname{E}[1/Y]\right)^{2} + \operatorname{Var}[1/Y] \cdot \left(\operatorname{E}[X]\right)^{2}.\end{aligned}$$
- Exponentiation: $\operatorname{Var}[Z] = \operatorname{Var}[X^{Y}] = \operatorname{Var}[e^{Y \ln(X)}]$
where $\operatorname{Cov}[X, Y] = \operatorname{Cov}[Y, X]$ denotes the covariance between the random variables $X$ and $Y$.
The variance of a random variable can also be expressed directly in terms of the covariance or in terms of the expected value:

$$\operatorname{Var}[X] = \operatorname{Cov}(X, X) = \operatorname{E}[X^{2}] - \operatorname{E}[X]^{2}$$
If any of the random variables is replaced by a deterministic variable or by a constant value $k$, the previous properties remain valid, since then $\Pr(X = k) = 1$, $\operatorname{E}[X] = k$, $\operatorname{Var}[X] = 0$ and $\operatorname{Cov}[Y, k] = 0$. Special cases are the addition and multiplication of a random variable with a deterministic variable or a constant, where:
- $\operatorname{Var}[k + Y] = \operatorname{Var}[Y]$
- $\operatorname{Var}[kY] = k^{2}\operatorname{Var}[Y]$
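The variance rules can be checked the same way (a sketch with arbitrary independent normal examples):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

X = rng.normal(1.0, 2.0, size=n)  # Var[X] = 4, E[X] = 1
Y = rng.normal(3.0, 1.0, size=n)  # Var[Y] = 1, E[Y] = 3, independent of X

# Independence: the variance is the same for sums and differences.
print(np.var(X + Y), np.var(X - Y), np.var(X) + np.var(Y))  # all ~5

# Product rule for independent variables (both forms, each ~41).
print(np.var(X * Y))
print(np.var(X)*np.var(Y) + np.var(X)*np.mean(Y)**2 + np.var(Y)*np.mean(X)**2)

# Constants: Var[k+Y] = Var[Y] and Var[kY] = k^2 Var[Y].
k = 2.5
print(np.var(k + Y), np.var(Y))
print(np.var(k * Y), k**2 * np.var(Y))
```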
If $Z$ is defined as a general non-linear algebraic function $f$ of a random variable $X$, then:

$$\operatorname{Var}[Z] = \operatorname{Var}[f(X)] \neq f(\operatorname{Var}[X])$$

The exact value of the variance of the non-linear function will depend on the particular probability distribution of the random variable $X$.
Covariance algebra for random variables
The covariance $\operatorname{Cov}[Z, X]$ between the random variable $Z$ resulting from an algebraic operation and the random variable $X$ can be calculated using the following set of rules:
- Addition: $\operatorname{Cov}[Z, X] = \operatorname{Cov}[X + Y, X] = \operatorname{Var}[X] + \operatorname{Cov}[X, Y]$. If $X$ and $Y$ are independent of each other, then $\operatorname{Cov}[X + Y, X] = \operatorname{Var}[X]$.
- Subtraction: $\operatorname{Cov}[Z, X] = \operatorname{Cov}[X - Y, X] = \operatorname{Var}[X] - \operatorname{Cov}[X, Y]$. If $X$ and $Y$ are independent of each other, then $\operatorname{Cov}[X - Y, X] = \operatorname{Var}[X]$.
- Multiplication: $\operatorname{Cov}[Z, X] = \operatorname{Cov}[XY, X] = \operatorname{E}[X^{2}Y] - \operatorname{E}[XY]\operatorname{E}[X]$. If $X$ and $Y$ are independent of each other, then $\operatorname{Cov}[XY, X] = \operatorname{Var}[X] \cdot \operatorname{E}[Y]$.
- Division (covariance with respect to the numerator): $\operatorname{Cov}[Z, X] = \operatorname{Cov}[X/Y, X] = \operatorname{E}[X^{2}/Y] - \operatorname{E}[X/Y]\operatorname{E}[X]$. If $X$ and $Y$ are independent of each other, then $\operatorname{Cov}[X/Y, X] = \operatorname{Var}[X] \cdot \operatorname{E}[1/Y]$.
- Division (covariance with respect to the denominator): $\operatorname{Cov}[Z, X] = \operatorname{Cov}[Y/X, X] = \operatorname{E}[Y] - \operatorname{E}[Y/X]\operatorname{E}[X]$. If $X$ and $Y$ are independent of each other, then $\operatorname{Cov}[Y/X, X] = \operatorname{E}[Y] \cdot \left(1 - \operatorname{E}[X] \cdot \operatorname{E}[1/X]\right)$.
- Exponentiation (covariance with respect to the base): $\operatorname{Cov}[Z, X] = \operatorname{Cov}[X^{Y}, X] = \operatorname{E}[X^{Y+1}] - \operatorname{E}[X^{Y}]\operatorname{E}[X]$.
- Exponentiation (covariance with respect to the power): $\operatorname{Cov}[Z, X] = \operatorname{Cov}[Y^{X}, X] = \operatorname{E}[XY^{X}] - \operatorname{E}[Y^{X}]\operatorname{E}[X]$.
The covariance of two random variables can also be expressed directly in terms of the expected value:

$$\operatorname{Cov}(X, Y) = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y]$$
If any of the random variables is replaced by a deterministic variable or by a constant value $k$, the previous properties remain valid, since then $\operatorname{E}[k] = k$, $\operatorname{Var}[k] = 0$ and $\operatorname{Cov}[X, k] = 0$.
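A sketch verifying the addition, subtraction, and multiplication rules on a deliberately correlated pair (the linear construction of Y below is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000

# Correlated pair: Y = 0.5*X + noise, so Cov[X, Y] ~ 0.5 and Var[X] ~ 1.
X = rng.normal(1.0, 1.0, size=n)
Y = 0.5 * X + rng.normal(0.0, 1.0, size=n)

def cov(a, b):
    # Sample version of Cov(a, b) = E[ab] - E[a]E[b].
    return np.mean(a * b) - np.mean(a) * np.mean(b)

print(cov(X + Y, X), np.var(X) + cov(X, Y))  # addition rule, both ~1.5
print(cov(X - Y, X), np.var(X) - cov(X, Y))  # subtraction rule, both ~0.5
print(cov(X * Y, X),                         # multiplication rule
      np.mean(X**2 * Y) - np.mean(X * Y) * np.mean(X))
```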
If $Z$ is defined as a general non-linear algebraic function $f$ of a random variable $X$, then:

$$\operatorname{Cov}[Z, X] = \operatorname{Cov}[f(X), X] = \operatorname{E}[Xf(X)] - \operatorname{E}[f(X)]\operatorname{E}[X]$$

The exact value of the covariance of the non-linear function will depend on the particular probability distribution of the random variable $X$.
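For example, with $f(x) = e^{x}$ and $X \sim N(0, 1)$ the formula gives $\operatorname{Cov}[e^{X}, X] = \operatorname{E}[Xe^{X}] - \operatorname{E}[e^{X}]\operatorname{E}[X] = e^{1/2}$, which a sample estimate reproduces:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(0.0, 1.0, size=1_000_000)

f_X = np.exp(X)
print(np.mean(X * f_X) - np.mean(f_X) * np.mean(X))  # ~1.6487
print(np.sqrt(np.e))                                 # exact value e^{1/2}
```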
Approximations by Taylor series expansions of moments
If the moments of a certain random variable $X$ are known (or can be determined by integration if the probability density function is known), then it is possible to approximate the expected value of any general non-linear function $f(X)$ as a Taylor series expansion of the moments, as follows:
$$f(X) = \sum_{n=0}^{\infty} \frac{1}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} \left(X - \mu\right)^{n},$$

where $\mu = \operatorname{E}[X]$ is the mean value of $X$. Hence,

$$\begin{aligned}\operatorname{E}[f(X)] &= \operatorname{E}\left[\sum_{n=0}^{\infty} \frac{1}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} \left(X - \mu\right)^{n}\right] \\ &= \sum_{n=0}^{\infty} \frac{1}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} \operatorname{E}\left[\left(X - \mu\right)^{n}\right] \\ &= \sum_{n=0}^{\infty} \frac{1}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} \mu_{n}(X),\end{aligned}$$

where $\mu_{n}(X) = \operatorname{E}[(X - \mu)^{n}]$ is the $n$-th moment of $X$ about its mean. Note that, by definition, $\mu_{0}(X) = 1$ and $\mu_{1}(X) = 0$. The first-order term always vanishes but is kept to obtain a closed-form expression.
Then,
$$\operatorname{E}[f(X)] \approx \sum_{n=0}^{n_{\max}} \frac{1}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} \mu_{n}(X),$$

where the Taylor expansion is truncated after the $n_{\max}$-th moment.
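The truncated expansion is straightforward to implement with symbolic derivatives (a sketch using SymPy; the function name and the $U(1, 3)$ example are illustrative choices):

```python
import sympy as sp

def expectation_taylor(f_expr, x, mu, central_moments):
    """Approximate E[f(X)] from central moments, central_moments[n] = E[(X-mu)^n],
    via the truncated Taylor expansion around the mean mu."""
    total = 0
    for n, mu_n in enumerate(central_moments):
        total += sp.diff(f_expr, x, n).subs(x, mu) / sp.factorial(n) * mu_n
    return total

# Example: X ~ U(1, 3), so mu = 2 and central moments 1, 0, 1/3, 0, 1/5.
x = sp.Symbol('x')
approx = expectation_taylor(sp.exp(x), x, 2, [1, 0, sp.Rational(1, 3), 0, sp.Rational(1, 5)])
exact = (sp.exp(3) - sp.exp(1)) / 2  # E[e^X] = (e^3 - e)/2 for U(1, 3)
print(float(approx), float(exact))   # ~8.682 vs ~8.684
```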
Particularly for functions of normal random variables, it is possible to obtain a Taylor expansion in terms of the moments of the standard normal distribution:[1]

$$f(X) = \sum_{n=0}^{\infty} \frac{\sigma^{n}}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} Z^{n},$$

where $X \sim N(\mu, \sigma^{2})$ is a normal random variable and $Z = (X - \mu)/\sigma \sim N(0, 1)$ is standard normal, so that $X - \mu = \sigma Z$. Thus,

$$\operatorname{E}[f(X)] \approx \sum_{n=0}^{n_{\max}} \frac{\sigma^{n}}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} \mu_{n}(Z),$$

where the moments of the standard normal distribution are given by:

$$\mu_{n}(Z) = \begin{cases}\prod_{i=1}^{n/2}(2i - 1), & \text{if } n \text{ is even} \\ 0, & \text{if } n \text{ is odd}\end{cases}$$
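In code, the standard normal moments reduce to a double factorial, so the expansion for normal $X$ needs only $\mu$, $\sigma$, and the derivatives of $f$ (a sketch; the helper names and the cubic example are illustrative):

```python
import sympy as sp
from math import prod, factorial

def std_normal_moment(n):
    # mu_n(Z) = 1*3*...*(n-1) for even n, 0 for odd n; the empty product is 1.
    return 0 if n % 2 else prod(2*i - 1 for i in range(1, n//2 + 1))

def expectation_taylor_normal(f_expr, x, mu, sigma, n_max):
    """Truncated expansion of E[f(X)] for X ~ N(mu, sigma^2)."""
    return sum(sigma**n / factorial(n)
               * sp.diff(f_expr, x, n).subs(x, mu)
               * std_normal_moment(n)
               for n in range(n_max + 1))

# Example: E[X^3] for X ~ N(1, 2^2) is mu^3 + 3*mu*sigma^2 = 13; exact once n_max >= 3.
x = sp.Symbol('x')
print(expectation_taylor_normal(x**3, x, 1, 2, 4))  # 13
```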
Similarly for normal random variables, it is also possible to approximate the variance of the non-linear function as a Taylor series expansion as:

$$\operatorname{Var}[f(X)] \approx \sum_{n=1}^{n_{\max}} \left(\frac{\sigma^{n}}{n!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu}\right)^{2} \operatorname{Var}[Z^{n}] + \sum_{n=1}^{n_{\max}} \sum_{m \neq n} \frac{\sigma^{n+m}}{n!\,m!} \left(\frac{d^{n}f}{dX^{n}}\right)_{X=\mu} \left(\frac{d^{m}f}{dX^{m}}\right)_{X=\mu} \operatorname{Cov}[Z^{n}, Z^{m}],$$

where

$$\operatorname{Var}[Z^{n}] = \begin{cases}\prod_{i=1}^{n}(2i - 1) - \prod_{i=1}^{n/2}(2i - 1)^{2}, & \text{if } n \text{ is even} \\ \prod_{i=1}^{n}(2i - 1), & \text{if } n \text{ is odd},\end{cases}$$

and

$$\operatorname{Cov}[Z^{n}, Z^{m}] = \begin{cases}\prod_{i=1}^{(n+m)/2}(2i - 1) - \prod_{i=1}^{n/2}(2i - 1)\prod_{j=1}^{m/2}(2j - 1), & \text{if } n \text{ and } m \text{ are even} \\ \prod_{i=1}^{(n+m)/2}(2i - 1), & \text{if } n \text{ and } m \text{ are odd} \\ 0, & \text{otherwise}\end{cases}$$
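A sketch of this variance approximation (helper names illustrative). Since $\operatorname{Cov}[Z^{n}, Z^{m}] = \operatorname{E}[Z^{n+m}] - \operatorname{E}[Z^{n}]\operatorname{E}[Z^{m}]$, the diagonal $n = m$ terms of a single double sum supply the $\operatorname{Var}[Z^{n}]$ contributions:

```python
import numpy as np
import sympy as sp
from math import prod, factorial

def moment_Z(n):
    # E[Z^n] for Z ~ N(0,1): (n-1)!! for even n, 0 for odd n.
    return 0 if n % 2 else prod(2*i - 1 for i in range(1, n//2 + 1))

def cov_Zn_Zm(n, m):
    # Cov[Z^n, Z^m] = E[Z^{n+m}] - E[Z^n] E[Z^m]; matches the case formulas above.
    return moment_Z(n + m) - moment_Z(n) * moment_Z(m)

def variance_taylor_normal(f_expr, x, mu, sigma, n_max):
    """Truncated Taylor approximation of Var[f(X)] for X ~ N(mu, sigma^2)."""
    c = [sigma**n / factorial(n) * float(sp.diff(f_expr, x, n).subs(x, mu))
         for n in range(n_max + 1)]
    return sum(c[n] * c[m] * cov_Zn_Zm(n, m)
               for n in range(1, n_max + 1) for m in range(1, n_max + 1))

# Example: Var[X^2] for X ~ N(1, 1) is 4*mu^2*sigma^2 + 2*sigma^4 = 6 (exact at n_max = 2).
x = sp.Symbol('x')
print(variance_taylor_normal(x**2, x, 1.0, 1.0, 2))  # 6.0

# Monte Carlo cross-check.
X = np.random.default_rng(6).normal(1.0, 1.0, size=1_000_000)
print(np.var(X**2))  # ~6.0
```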
Algebra of complex random variables
In the algebraic axiomatization of probability theory, the primary concept is not that of probability of an event, but rather that of a random variable. Probability distributions are determined by assigning an expectation to each random variable. The measurable space and the probability measure arise from the random variables and expectations by means of well-known representation theorems of analysis. One of the important features of the algebraic approach is that apparently infinite-dimensional probability distributions are not harder to formalize than finite-dimensional ones.
Random variables are assumed to have the following properties:
- complex constants are possible realizations of a random variable;
- the sum of two random variables is a random variable;
- the product of two random variables is a random variable;
- addition and multiplication of random variables are both commutative; and
- there is a notion of conjugation of random variables, satisfying (XY)* = Y*X* and X** = X for all random variables X,Y and coinciding with complex conjugation if X is a constant.
This means that random variables form complex commutative *-algebras. If X = X* then the random variable X is called "real".
An expectation E on an algebra A of random variables is a normalized, positive linear functional. What this means is that
- E[k] = k where k is a constant;
- E[X*X] ≥ 0 for all random variables X;
- E[X + Y] = E[X] + E[Y] for all random variables X and Y; and
- E[kX] = kE[X] if k is a constant.
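These axioms can be illustrated numerically by modeling a complex random variable as an array of complex samples, with conjugation acting element-wise and the expectation taken as the sample mean (a minimal sketch; the Gaussian real and imaginary parts are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

X = rng.normal(size=n) + 1j * rng.normal(size=n)
Y = rng.normal(size=n) + 1j * rng.normal(size=n)
E = np.mean  # expectation as the sample mean
k = 2.0 + 1.0j

assert np.allclose((X * Y).conj(), Y.conj() * X.conj())  # (XY)* = Y*X*
assert np.allclose(X.conj().conj(), X)                   # X** = X
assert E(X.conj() * X).real >= 0                         # positivity: E[X*X] >= 0
print(E(X + Y), E(X) + E(Y))                             # linearity
print(E(k * X), k * E(X))                                # E[kX] = k E[X]
```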
One may generalize this setup, allowing the algebra to be noncommutative. This leads to other areas of noncommutative probability such as quantum probability, random matrix theory, and free probability.
See also
- Relationships among probability distributions
- Ratio distribution
- Inverse distribution
- Product distribution
- Mellin transform
- Sum of normally distributed random variables
- List of convolutions of probability distributions – the probability measure of the sum of independent random variables is the convolution of their probability measures.
- Law of total expectation
- Law of total variance
- Law of total covariance
- Law of total cumulance
- Taylor expansions for the moments of functions of random variables
- Delta method
Further reading
- Whittle, Peter (2000). Probability via Expectation (4th ed.). New York, NY: Springer. ISBN 978-0-387-98955-6. Retrieved 24 September 2012.
- Springer, Melvin Dale (1979). The Algebra of Random Variables. Wiley. ISBN 0-471-01406-0. Retrieved 24 September 2012.
- "Measure algebra", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
References
1. Hernandez, Hugo (2016). "Modelling the effect of fluctuation in nonlinear systems using variance algebra - Application to light scattering of ideal gases". ForsChem Research Reports. 2016-1. doi:10.13140/rg.2.2.36501.52969.