In statistics, the Q-function is the tail distribution function of the standard normal distribution. In other words, $Q(x)$ is the probability that a standard normal random variable takes a value larger than $x$; equivalently, it is the probability that a normal (Gaussian) random variable takes a value more than $x$ standard deviations above its mean.
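The definition above can be sketched numerically. The following Python snippet uses only the standard library, relying on the identity $Q(x) = \tfrac{1}{2}\operatorname{erfc}(x/\sqrt{2})$ (the error-function relation discussed later in this article); the function name `q_function` is chosen here for illustration.

```python
import math

def q_function(x: float) -> float:
    """Tail probability P(X > x) for a standard normal X,
    via the complementary error function: Q(x) = (1/2) erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Q(0) = 1/2: half of the probability mass lies above the mean.
print(q_function(0.0))  # 0.5

# Q is decreasing in x, and by symmetry Q(-x) = 1 - Q(x).
print(q_function(1.0))   # approximately 0.1587
print(q_function(-1.0))  # approximately 0.8413
```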
If $Y$ is a Gaussian random variable with mean $\mu$ and variance $\sigma^2$, then $X = \frac{Y - \mu}{\sigma}$ is standard normal and

$$P(Y > y) = P(X > x) = Q(x),$$

where $x = \frac{y - \mu}{\sigma}$.
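The standardization above can be checked empirically. This sketch picks hypothetical parameters ($\mu = 3$, $\sigma = 2$, $y = 5$, chosen only for illustration) and compares $Q\!\left(\frac{y-\mu}{\sigma}\right)$ against a Monte Carlo estimate of $P(Y > y)$:

```python
import math
import random

def q_function(x: float) -> float:
    # Q(x) = (1/2) erfc(x / sqrt(2))
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Hypothetical parameters, for illustration only.
mu, sigma, y = 3.0, 2.0, 5.0

x = (y - mu) / sigma        # standardize: x = (y - mu) / sigma
analytic = q_function(x)    # P(Y > y) = Q(x)

# Monte Carlo estimate from samples of Y ~ N(mu, sigma^2).
random.seed(0)
n = 200_000
hits = sum(random.gauss(mu, sigma) > y for _ in range(n))
print(analytic, hits / n)   # both should be near Q(1), about 0.1587
```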
Other definitions of the Q-function, all of which are simple transformations of the normal cumulative distribution function, are also used occasionally.
Because of its relation to the cumulative distribution function of the normal distribution, the Q-function can also be expressed in terms of the error function, which is an important function in applied mathematics and physics.
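Written out explicitly, with $\Phi$ denoting the standard normal cumulative distribution function and $\operatorname{erf}$, $\operatorname{erfc}$ the error function and its complement, these standard identities are:

```latex
Q(x) \;=\; 1 - \Phi(x)
     \;=\; \frac{1}{2}\operatorname{erfc}\!\left(\frac{x}{\sqrt{2}}\right)
     \;=\; \frac{1}{2} - \frac{1}{2}\operatorname{erf}\!\left(\frac{x}{\sqrt{2}}\right).
```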