I used to maintain a list of regression methods that were almost nonparametric, but, as fun as that category was, I was not actually using it, so I broke it apart into more conventional categories.
See bagging and boosting methods, neural networks, functional data analysis, Gaussian process regression and randomised regression.