In statistics, sufficient dimension reduction (SDR) is a paradigm for analyzing data that combines the ideas of dimension reduction with the concept of sufficiency.
Dimension reduction has long been a primary goal of regression analysis. Given a response variable $y$ and a $p$-dimensional predictor vector $\mathbf{x}$, regression analysis aims to study the distribution of $y \mid \mathbf{x}$, the conditional distribution of $y$ given $\mathbf{x}$. A dimension reduction is a function $R(\mathbf{x})$ that maps $\mathbf{x}$ to a subset of $\mathbb{R}^k$, $k < p$, thereby reducing the dimension of $\mathbf{x}$. For example, $R(\mathbf{x})$ may be one or more linear combinations of $\mathbf{x}$.
A dimension reduction $R(\mathbf{x})$ is said to be sufficient if the distribution of $y \mid R(\mathbf{x})$ is the same as that of $y \mid \mathbf{x}$. In other words, no information about the regression is lost in reducing the dimension of $\mathbf{x}$ if the reduction is sufficient.
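The definition can be illustrated with a small simulation. The sketch below is a hypothetical example (the direction `beta` and the link function are assumptions, not from the source): the response $y$ is generated from $\mathbf{x}$ only through a single linear combination $\boldsymbol{\beta}^{\top}\mathbf{x}$, so by construction $R(\mathbf{x}) = \boldsymbol{\beta}^{\top}\mathbf{x}$ is a sufficient reduction from $p = 5$ dimensions down to $k = 1$.

```python
import numpy as np

rng = np.random.default_rng(0)

p, n = 5, 1000
# hypothetical true direction: y is assumed to depend on x
# only through the single linear combination beta' x
beta = np.array([1.0, -2.0, 0.0, 0.0, 0.0])

x = rng.normal(size=(n, p))
y = np.sin(x @ beta) + 0.1 * rng.normal(size=n)

# the reduction R(x) = beta' x maps R^5 to R^1, so k = 1 < p = 5
r = x @ beta

# since y was generated from r alone (plus independent noise),
# the distribution of y | R(x) equals that of y | x:
# this reduction is sufficient by construction
print(r.shape)  # (1000,)
```

Here sufficiency holds because the data-generating mechanism itself uses only $\boldsymbol{\beta}^{\top}\mathbf{x}$; in practice the goal of SDR methods is to recover such a reduction from data without knowing it in advance.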