A multiple linear regression model can be written as

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_k x_k + u,$$

where $y$ is the dependent variable, $x_1, x_2, \dots, x_k$ are the independent variables, $u$ is the error, and $\beta_0, \beta_1, \dots, \beta_k$ are unknown coefficients to be estimated. Given observations $\left\{ y_i, x_{i1}, x_{i2}, \dots, x_{ik} \right\}_{i=1}^{n}$, we have a system of $n$ linear equations that can be expressed in matrix notation,[3]
$$\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} =
\begin{bmatrix} 1 & x_{11} & x_{12} & \dots & x_{1k} \\ 1 & x_{21} & x_{22} & \dots & x_{2k} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_{n1} & x_{n2} & \dots & x_{nk} \end{bmatrix}
\begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{bmatrix} +
\begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix},$$

or

$$\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{u},$$
where $\mathbf{y}$ and $\mathbf{u}$ are each a vector of dimension $n \times 1$, $\mathbf{X}$ is the design matrix of order $n \times (k+1)$, and $\boldsymbol{\beta}$ is a vector of dimension $(k+1) \times 1$. Under the Gauss–Markov assumptions, the best linear unbiased estimator of $\boldsymbol{\beta}$ is the linear least squares estimator $\mathbf{b} = \left(\mathbf{X}^{\mathsf{T}}\mathbf{X}\right)^{-1}\mathbf{X}^{\mathsf{T}}\mathbf{y}$, involving the two moment matrices $\mathbf{X}^{\mathsf{T}}\mathbf{X}$ and $\mathbf{X}^{\mathsf{T}}\mathbf{y}$, defined as
$$\mathbf{X}^{\mathsf{T}}\mathbf{X} =
\begin{bmatrix} n & \sum x_{i1} & \sum x_{i2} & \dots & \sum x_{ik} \\ \sum x_{i1} & \sum x_{i1}^{2} & \sum x_{i1}x_{i2} & \dots & \sum x_{i1}x_{ik} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \sum x_{ik} & \sum x_{ik}x_{i1} & \sum x_{ik}x_{i2} & \dots & \sum x_{ik}^{2} \end{bmatrix}$$

and

$$\mathbf{X}^{\mathsf{T}}\mathbf{y} =
\begin{bmatrix} \sum y_i \\ \sum x_{i1}y_i \\ \vdots \\ \sum x_{ik}y_i \end{bmatrix},$$
where $\mathbf{X}^{\mathsf{T}}\mathbf{X}$ is a square normal matrix of dimension $(k+1) \times (k+1)$, and $\mathbf{X}^{\mathsf{T}}\mathbf{y}$ is a vector of dimension $(k+1) \times 1$.
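As a concrete illustration, the following short Python sketch (assuming NumPy and a small synthetic dataset; the data, seed, and variable names are illustrative only) forms the design matrix, the two moment matrices $\mathbf{X}^{\mathsf{T}}\mathbf{X}$ and $\mathbf{X}^{\mathsf{T}}\mathbf{y}$, and the least squares estimator $\mathbf{b}$:

```python
import numpy as np

# Illustrative synthetic dataset: n = 50 observations, k = 2 regressors.
rng = np.random.default_rng(0)
n, k = 50, 2
x = rng.normal(size=(n, k))                  # independent variables
beta_true = np.array([1.0, 2.0, -0.5])       # [beta_0, beta_1, beta_2] (assumed for the example)
u = 0.1 * rng.normal(size=n)                 # error term
y = beta_true[0] + x @ beta_true[1:] + u     # dependent variable

# Design matrix X of order n x (k+1): a leading column of ones for the intercept.
X = np.column_stack([np.ones(n), x])

# The two moment matrices.
XtX = X.T @ X        # (k+1) x (k+1) square normal matrix
Xty = X.T @ y        # (k+1) x 1 vector

# Least squares estimator b = (X^T X)^{-1} X^T y.
# Solving the normal equations is numerically preferable to forming the inverse explicitly.
b = np.linalg.solve(XtX, Xty)

# Cross-check against a standard least squares routine.
b_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(b, b_lstsq)
print(b)
```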
Lasserre, Jean-Bernard (2010). Moments, Positive Polynomials and Their Applications. London: Imperial College Press. ISBN 978-1-84816-446-8. OCLC 624365972.
Goldberger, Arthur S. (1964). "Classical Linear Regression". Econometric Theory. New York: John Wiley & Sons. pp. 156–212. ISBN 0-471-31101-4.
Huang, David S. (1970). Regression and Econometric Methods. New York: John Wiley & Sons. pp. 52–65. ISBN 0-471-41754-8.