Additive model
Statistical regression model

In statistics, an additive model (AM) is a nonparametric regression method. It was suggested by Jerome H. Friedman and Werner Stuetzle (1981)[1] and is an essential part of the ACE algorithm. The AM uses a one-dimensional smoother to build a restricted class of nonparametric regression models. Because of this, it is less affected by the curse of dimensionality than a p-dimensional smoother. Furthermore, the AM is more flexible than a standard linear model, while remaining more interpretable than a general regression surface, at the cost of approximation error. As with many other machine-learning methods, the AM is subject to problems of model selection, overfitting, and multicollinearity.


Description

Given a data set $\{y_i,\, x_{i1}, \ldots, x_{ip}\}_{i=1}^{n}$ of $n$ statistical units, where $\{x_{i1}, \ldots, x_{ip}\}$ represent the predictors and $y_i$ is the outcome, the additive model takes the form

$$\mathrm{E}[y_i \mid x_{i1}, \ldots, x_{ip}] = \beta_0 + \sum_{j=1}^{p} f_j(x_{ij})$$

or

$$Y = \beta_0 + \sum_{j=1}^{p} f_j(X_j) + \varepsilon$$

where $\mathrm{E}[\varepsilon] = 0$, $\mathrm{Var}(\varepsilon) = \sigma^2$, and $\mathrm{E}[f_j(X_j)] = 0$. The functions $f_j$ are unknown smooth functions that are estimated from the data. Fitting the AM (i.e., the functions $f_j$) can be done using the backfitting algorithm proposed by Andreas Buja, Trevor Hastie and Robert Tibshirani (1989).[2]
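
Backfitting cycles through the predictors, smoothing the partial residuals of the response against each predictor in turn until the component functions stabilize. The following is a minimal sketch in Python with NumPy; the running-mean smoother, the names `knn_smooth` and `backfit`, and the toy data are illustrative assumptions, not taken from the references.

```python
import numpy as np

def knn_smooth(x, y, k=15):
    """Simple running-mean smoother: average y over a window of
    roughly k nearest neighbors in the sorted x-order.
    (Illustrative stand-in for a spline or local-regression smoother.)"""
    n = len(x)
    order = np.argsort(x)
    y_sorted = y[order]
    fitted_sorted = np.empty(n)
    half = k // 2
    for i in range(n):
        lo = max(0, i - half)
        hi = min(n, i + half + 1)
        fitted_sorted[i] = y_sorted[lo:hi].mean()
    fitted = np.empty(n)
    fitted[order] = fitted_sorted  # undo the sort
    return fitted

def backfit(X, y, k=15, max_iter=100, tol=1e-6):
    """Fit E[y|x] = beta0 + sum_j f_j(x_j) by backfitting."""
    n, p = X.shape
    beta0 = y.mean()
    f = np.zeros((n, p))  # f[:, j] holds f_j evaluated at the data points
    for _ in range(max_iter):
        f_old = f.copy()
        for j in range(p):
            # partial residual: remove intercept and all other components
            r = y - beta0 - f.sum(axis=1) + f[:, j]
            f[:, j] = knn_smooth(X[:, j], r, k)
            f[:, j] -= f[:, j].mean()  # enforce the constraint E[f_j] = 0
        if np.max(np.abs(f - f_old)) < tol:
            break
    return beta0, f

# Toy usage: y = 2 + sin(x1) + x2^2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 2))
y = 2 + np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.3, 500)
beta0, f = backfit(X, y)
# beta0 equals the sample mean of y; because each f_j is centered,
# it absorbs the means of the uncentered component functions.
print(beta0)
```

In practice the one-dimensional smoother would typically be a cubic smoothing spline or a local linear regression; any linear smoother can be substituted without changing the outer loop, which is the point of the backfitting formulation.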

References

  1. Friedman, J. H.; Stuetzle, W. (1981). "Projection Pursuit Regression". Journal of the American Statistical Association 76: 817–823. doi:10.1080/01621459.1981.10477729.

  2. Buja, A.; Hastie, T.; Tibshirani, R. (1989). "Linear Smoothers and Additive Models". The Annals of Statistics 17(2): 453–555. JSTOR 2241560.