Wacky regression



I used to maintain a list of regression methods that were almost, but not quite, nonparametric. Fun as that category was, I was not actually using it, so I broke it up into more conventional categories.

See bagging and boosting methods, neural networks, functional data analysis, Gaussian process regression, and randomised regression.
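For a flavour of the first of those families, here is a minimal sketch of gradient boosting with squared-error loss: fit each small tree to the current residuals and add it with a shrinkage factor. This assumes scikit-learn is available for the base trees and uses a toy dataset invented for illustration; for real work you would reach for a proper boosting library.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy 1-d regression problem: a noisy sine curve (illustrative only).
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

n_trees, learning_rate = 100, 0.1
prediction = np.full_like(y, y.mean())  # initialise at the constant fit
trees = []
for _ in range(n_trees):
    residual = y - prediction           # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

def predict(X_new):
    """Sum the shrunken tree predictions on top of the constant base fit."""
    out = np.full(len(X_new), y.mean())
    for tree in trees:
        out += learning_rate * tree.predict(X_new)
    return out

print("train RMSE:", np.sqrt(np.mean((y - prediction) ** 2)))
```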

