In statistical decision theory, a minimax estimator $\delta^M$ is an estimator which performs best in the worst possible case allowed in a problem. For problems of estimating a deterministic parameter (vector) $\theta \in \Theta$ from observations $x \in \mathcal{X}$, an estimator (estimation rule) $\delta^M$ is called minimax if its maximal risk is minimal among all estimators of $\theta$.
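As a sketch of this defining condition, assuming the usual risk function $R(\theta, \delta)$ (the expected loss of $\delta$ when the true parameter is $\theta$), the minimax property can be written as

$$
\sup_{\theta \in \Theta} R(\theta, \delta^M) \;=\; \inf_{\delta}\,\sup_{\theta \in \Theta} R(\theta, \delta),
$$

i.e. the worst-case risk of $\delta^M$ over $\Theta$ is no larger than the worst-case risk of any other estimator $\delta$.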