# Wacky regression

September 23, 2015 — May 2, 2019

classification · functional analysis · model selection · nonparametric · optimization · regression

I used to maintain a list of regression methods that were *almost* nonparametric, but as fun as that category was, I was not actually using it, so I broke it apart into more conventional categories.

See bagging and boosting methods, neural networks, functional data analysis, Gaussian process regression, and randomised regression.
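Of the methods filed under those headings, gradient boosting (Friedman 2001, 2002, in the references below) is perhaps the easiest to demystify in a few lines: for squared-error loss, the negative gradient is just the residual, so boosting reduces to repeatedly fitting a weak learner to the current residuals and adding a shrunken copy of it to the ensemble. A minimal sketch with hand-rolled decision stumps — everything here (function names, toy data, hyperparameters) is illustrative, not taken from any particular library:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(x, r):
    """Least-squares decision stump: pick the split threshold that best
    predicts the residuals r by a constant on each side."""
    xs = np.sort(x)
    best_sse, best_stump = np.inf, None
    for t in (xs[:-1] + xs[1:]) / 2:  # candidate splits: midpoints
        left = x <= t
        lo, hi = r[left].mean(), r[~left].mean()
        sse = ((r - np.where(left, lo, hi)) ** 2).sum()
        if sse < best_sse:
            best_sse, best_stump = sse, (t, lo, hi)
    return best_stump

def predict_stump(stump, x):
    t, lo, hi = stump
    return np.where(x <= t, lo, hi)

# toy data: a noisy sinusoid
x = rng.uniform(0, 2 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.normal(size=x.size)

# gradient boosting for squared loss: fit stumps to residuals
nu = 0.1                  # shrinkage / learning rate
f = np.zeros_like(y)      # current ensemble prediction
stumps = []
for _ in range(200):
    stump = fit_stump(x, y - f)  # residual = negative gradient of squared loss
    f += nu * predict_stump(stump, x)
    stumps.append(stump)

print("training MSE:", np.mean((y - f) ** 2))
```

With a couple of hundred shrunken stumps, the training error drops well below the variance of the raw targets; swapping in deeper trees, other losses (whence other "gradients"), or subsampling of the data recovers the stochastic variant of Friedman (2002).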

## 1 References

Fomel. 2000. “Inverse B-Spline Interpolation.”

Friedman, Jerome H. 2001. “Greedy Function Approximation: A Gradient Boosting Machine.” *The Annals of Statistics*.

Friedman, Jerome H. 2002. “Stochastic Gradient Boosting.” *Computational Statistics & Data Analysis*, Nonlinear Methods and Data Mining.

Friedman, Jerome, Hastie, and Tibshirani. 2000. “Additive Logistic Regression: A Statistical View of Boosting (With Discussion and a Rejoinder by the Authors).” *The Annals of Statistics*.

Johnson, and Zhang. 2014. “Learning Nonlinear Functions Using Regularized Greedy Forest.” *IEEE Transactions on Pattern Analysis and Machine Intelligence*.

Jones. 1992. “A Simple Lemma on Greedy Approximation in Hilbert Space and Convergence Rates for Projection Pursuit Regression and Neural Network Training.” *The Annals of Statistics*.

Scornet. 2014. “On the Asymptotics of Random Forests.” *arXiv:1409.2090 [Math, Stat]*.

Scornet, Biau, and Vert. 2014. “Consistency of Random Forests.” *arXiv:1405.2881 [Math, Stat]*.

Tropp. 2004. “Greed Is Good: Algorithmic Results for Sparse Approximation.” *IEEE Transactions on Information Theory*.

Vanli, and Kozat. 2014. “A Comprehensive Approach to Universal Piecewise Nonlinear Regression Based on Trees.” *IEEE Transactions on Signal Processing*.