A mixed Poisson distribution is a univariate discrete probability distribution in probability theory. It arises when the conditional distribution of a random variable, given the value of the rate parameter, is a Poisson distribution, while the rate parameter itself is treated as a random variable. It is therefore a special case of a compound probability distribution. Mixed Poisson distributions are used in actuarial mathematics as a general approach to modelling the distribution of the number of claims, and they are also examined as epidemiological models. They should not be confused with the compound Poisson distribution or the compound Poisson process.
Definition
A random variable X follows a mixed Poisson distribution with mixing density π(λ) if its probability distribution is given by[3]
\operatorname{P}(X=k)=\int_0^\infty \frac{\lambda^{k}}{k!}\,e^{-\lambda}\,\pi(\lambda)\,d\lambda.
If the probability mass function of the Poisson distribution with rate λ is denoted by q_λ(k), this can equivalently be written as
\operatorname{P}(X=k)=\int_0^\infty q_\lambda(k)\,\pi(\lambda)\,d\lambda.
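This two-stage description translates directly into both a numerical evaluation of the integral and a sampling recipe. The following minimal sketch (not from the source; the log-normal mixing density and all parameter values are arbitrary illustrative choices) compares the pmf obtained by numerical quadrature with the empirical frequencies from two-stage sampling:

```python
# Minimal numerical illustration of the defining integral, using a log-normal
# mixing density pi(lambda); all parameter values are arbitrary example choices.
import numpy as np
from scipy import integrate, stats

mixing = stats.lognorm(s=0.5, scale=2.0)          # pi(lambda)

def mixed_poisson_pmf(k):
    # P(X = k) = integral over lambda of Poisson(k; lambda) * pi(lambda)
    integrand = lambda lam: stats.poisson.pmf(k, lam) * mixing.pdf(lam)
    return integrate.quad(integrand, 0, np.inf)[0]

# Equivalent two-stage sampling: draw lambda from pi, then X ~ Poisson(lambda).
rng = np.random.default_rng(0)
lam = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=200_000)
x = rng.poisson(lam)

for k in range(5):
    print(k, round(mixed_poisson_pmf(k), 4), round(np.mean(x == k), 4))
```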
Properties
- The variance is always larger than the expected value (provided the mixing distribution is not degenerate), a property known as overdispersion. This is in contrast to the Poisson distribution, whose mean and variance are equal. (A short simulation illustrating this follows this list.)
- In practice, almost exclusively the densities of gamma, log-normal and inverse Gaussian distributions are used as the mixing density π(λ). Choosing the density of a gamma distribution yields the negative binomial distribution, which is why the latter is also called the Poisson–gamma distribution.
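As a quick empirical illustration of overdispersion (not from the source; the gamma mixing density and its parameters are arbitrary choices), the sample variance of a mixed Poisson sample clearly exceeds the sample mean:

```python
# Overdispersion check with a gamma mixing density (mean 6, variance 12).
import numpy as np

rng = np.random.default_rng(1)
lam = rng.gamma(shape=3.0, scale=2.0, size=1_000_000)  # random rates
x = rng.poisson(lam)                                   # mixed Poisson sample

print("mean    ", x.mean())   # close to 6
print("variance", x.var())    # close to 18, i.e. clearly larger than the mean
```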
In the following, let

\mu_\pi = \int_0^\infty \lambda\,\pi(\lambda)\,d\lambda

denote the expected value of the mixing density π(λ), and let

\sigma_\pi^2 = \int_0^\infty (\lambda-\mu_\pi)^2\,\pi(\lambda)\,d\lambda

denote its variance.
Expected value
The expected value of the mixed Poisson distribution is
\operatorname{E}(X)=\mu_\pi.
Variance
The variance of the mixed Poisson distribution is

\operatorname{Var}(X)=\mu_\pi+\sigma_\pi^2.
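Both formulas follow by conditioning on the rate, using the laws of total expectation and total variance together with the fact that a Poisson variable has conditional mean and variance equal to λ:

\operatorname{E}(X)=\operatorname{E}\bigl[\operatorname{E}(X\mid\lambda)\bigr]=\operatorname{E}[\lambda]=\mu_\pi,
\qquad
\operatorname{Var}(X)=\operatorname{E}\bigl[\operatorname{Var}(X\mid\lambda)\bigr]+\operatorname{Var}\bigl[\operatorname{E}(X\mid\lambda)\bigr]=\mu_\pi+\sigma_\pi^2.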
Skewness
The skewness can be represented as
\operatorname{v}(X)=\bigl(\mu_\pi+\sigma_\pi^2\bigr)^{-3/2}\left[\int_0^\infty (\lambda-\mu_\pi)^3\,\pi(\lambda)\,d\lambda + 3\sigma_\pi^2 + \mu_\pi\right].
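A quick Monte Carlo sanity check of this expression is sketched below (not from the source; it assumes a gamma mixing density, and the shape and scale parameters are arbitrary illustrative choices):

```python
# Monte Carlo check of the skewness expression for a gamma mixing density.
import numpy as np
from scipy import stats

a, s = 3.0, 2.0               # gamma shape and scale (arbitrary choices)
mu = a * s                    # mu_pi
var = a * s**2                # sigma_pi^2
m3 = 2 * a * s**3             # third central moment of the gamma density

formula = (m3 + 3 * var + mu) / (mu + var) ** 1.5

rng = np.random.default_rng(7)
lam = rng.gamma(a, s, size=2_000_000)
x = rng.poisson(lam)

print(round(formula, 3), round(stats.skew(x), 3))  # the two values should roughly agree
```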
Characteristic function
The characteristic function has the form
\varphi_X(s)=M_\pi(e^{is}-1),

where M_π is the moment-generating function of the mixing density π.
Probability generating function
For the probability generating function, one obtains[5]
m_X(s)=M_\pi(s-1).
Moment-generating function
The moment-generating function of the mixed Poisson distribution is
M_X(s)=M_\pi(e^{s}-1).
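As an illustration (anticipating the gamma example below): if π is the density of a Γ(r, p/(1−p)) distribution, its moment-generating function is M_π(s) = (1 − ((1−p)/p)s)^{−r}, and substituting gives

M_X(s)=M_\pi(e^{s}-1)=\left(\frac{p}{1-(1-p)e^{s}}\right)^{r},

which is the moment-generating function of the negative binomial distribution NegB(r, p).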
Examples
Theorem. Compounding a Poisson distribution with a rate parameter distributed according to a gamma distribution yields a negative binomial distribution.[6]
Proof. Let

\pi(\lambda)=\frac{\left(\frac{p}{1-p}\right)^{r}}{\Gamma(r)}\,\lambda^{r-1}e^{-\frac{p}{1-p}\lambda}

be the density of a Γ(r, p/(1−p))-distributed random variable, i.e. a gamma distribution with shape r and rate p/(1−p). Then
\begin{aligned}
\operatorname{P}(X=k) &= \frac{1}{k!}\int_0^\infty \lambda^{k}e^{-\lambda}\,\frac{\left(\frac{p}{1-p}\right)^{r}}{\Gamma(r)}\,\lambda^{r-1}e^{-\frac{p}{1-p}\lambda}\,d\lambda\\
&= \frac{p^{r}(1-p)^{-r}}{\Gamma(r)\,k!}\int_0^\infty \lambda^{k+r-1}e^{-\frac{\lambda}{1-p}}\,d\lambda\\
&= \frac{p^{r}(1-p)^{-r}}{\Gamma(r)\,k!}\,(1-p)^{k+r}\underbrace{\int_0^\infty \lambda^{k+r-1}e^{-\lambda}\,d\lambda}_{=\,\Gamma(r+k)}\\
&= \frac{\Gamma(r+k)}{\Gamma(r)\,k!}\,(1-p)^{k}p^{r}.
\end{aligned}
Therefore, X ∼ NegB(r, p).
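The result can also be checked numerically; the short sketch below (not part of the original proof; the values of r and p are arbitrary) evaluates the mixing integral by quadrature and compares it with SciPy's negative binomial pmf:

```python
# Numerical check: gamma-mixed Poisson pmf vs. the negative binomial pmf.
import numpy as np
from scipy import integrate, stats

r, p = 2.5, 0.4
rate = p / (1 - p)                     # gamma rate parameter as in the proof

def mixed_pmf(k):
    integrand = lambda lam: stats.poisson.pmf(k, lam) * stats.gamma.pdf(lam, a=r, scale=1/rate)
    return integrate.quad(integrand, 0, np.inf)[0]

for k in range(6):
    print(k, round(mixed_pmf(k), 6), round(stats.nbinom.pmf(k, r, p), 6))
```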
Theorem. Compounding a Poisson distribution with a rate parameter distributed according to an exponential distribution yields a geometric distribution.
Proof. Let

\pi(\lambda)=\frac{1}{\beta}e^{-\frac{\lambda}{\beta}}

be the density of an Exp(1/β)-distributed random variable, i.e. an exponential distribution with mean β. Integrating by parts k times yields

\begin{aligned}
\operatorname{P}(X=k) &= \frac{1}{k!}\int_0^\infty \lambda^{k}e^{-\lambda}\,\frac{1}{\beta}e^{-\frac{\lambda}{\beta}}\,d\lambda\\
&= \frac{1}{k!\,\beta}\int_0^\infty \lambda^{k}e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,d\lambda\\
&= \frac{1}{k!\,\beta}\cdot k!\left(\frac{\beta}{1+\beta}\right)^{k}\int_0^\infty e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,d\lambda\\
&= \left(\frac{\beta}{1+\beta}\right)^{k}\left(\frac{1}{1+\beta}\right).
\end{aligned}

Therefore, X ∼ Geo(1/(1+β)).
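Again, this can be verified empirically; the sketch below (not from the source; the value of β is an arbitrary choice) compares the empirical frequencies of a two-stage sample with the derived geometric pmf:

```python
# Empirical check: exponential-mixed Poisson frequencies vs. the geometric pmf.
import numpy as np

rng = np.random.default_rng(42)
beta = 1.5
lam = rng.exponential(beta, size=500_000)   # lambda ~ Exp with mean beta
x = rng.poisson(lam)

q = beta / (1 + beta)                        # failure probability of Geo(1/(1+beta))
for k in range(5):
    print(k, round(np.mean(x == k), 4), round(q**k * (1 - q), 4))
```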
Table of mixed Poisson distributions
| mixing distribution | mixed Poisson distribution[7] |
|---|---|
| Dirac | Poisson |
| gamma, Erlang | negative binomial |
| exponential | geometric |
| inverse Gaussian | Sichel |
| Poisson | Neyman |
| generalized inverse Gaussian | Poisson-generalized inverse Gaussian |
| generalized gamma | Poisson-generalized gamma |
| generalized Pareto | Poisson-generalized Pareto |
| inverse-gamma | Poisson-inverse gamma |
| log-normal | Poisson-log-normal |
| Lomax | Poisson–Lomax |
| Pareto | Poisson–Pareto |
| Pearson's family of distributions | Poisson–Pearson family |
| truncated normal | Poisson-truncated normal |
| uniform | Poisson-uniform |
| shifted gamma | Delaporte |
| beta with specific parameter values | Yule |
Further reading
- Grandell, Jan (1997). Mixed Poisson Processes. London: Chapman & Hall. ISBN 0-412-78700-8.
- Britton, Tom (2019). Stochastic Epidemic Models with Inference. Springer. doi:10.1007/978-3-030-30900-8.
References
1. Willmot, Gordon E.; Lin, X. Sheldon (2001). "Mixed Poisson distributions". Lundberg Approximations for Compound Distributions with Insurance Applications. Lecture Notes in Statistics, vol. 156. New York, NY: Springer. pp. 37–49. doi:10.1007/978-1-4613-0111-0_3. ISBN 978-0-387-95135-5. Retrieved 2022-07-08.
2. Willmot, Gord (1986). "Mixed Compound Poisson Distributions". ASTIN Bulletin. 16 (S1): S59–S79. doi:10.1017/S051503610001165X. ISSN 0515-0361.
3. Willmot, Gord (2014-08-29). "Mixed Compound Poisson Distributions". ASTIN Bulletin. 16: 5–7. doi:10.1017/S051503610001165X. S2CID 17737506.
4. Willmot, Gord (2014-08-29). "Mixed Compound Poisson Distributions". ASTIN Bulletin. 16: 5–7. doi:10.1017/S051503610001165X. S2CID 17737506.
5. Willmot, Gord (2014-08-29). "Mixed Compound Poisson Distributions". ASTIN Bulletin. 16: 5–7. doi:10.1017/S051503610001165X. S2CID 17737506.
6. Willmot, Gord (2014-08-29). "Mixed Compound Poisson Distributions". ASTIN Bulletin. 16: 5–7. doi:10.1017/S051503610001165X. S2CID 17737506.
7. Karlis, Dimitris; Xekalaki, Evdokia (2005). "Mixed Poisson Distributions". International Statistical Review. 73 (1): 35–58. doi:10.1111/j.1751-5823.2005.tb00250.x. ISSN 0306-7734. JSTOR 25472639. S2CID 53637483.