I used to maintain a list of regression methods that were almost nonparametric, but as fun as that category was, I was not actually using it very often, so I broke it up.
Fomel, Sergey. 2000. “Inverse B-Spline Interpolation.” SEP Report. http://www.reproducibility.org/RSF/book/sep/bspl/paper.pdf.
Friedman, Jerome H. 2001. “Greedy Function Approximation: A Gradient Boosting Machine.” The Annals of Statistics 29 (5): 1189–1232. https://doi.org/10.1214/aos/1013203451.
———. 2002. “Stochastic Gradient Boosting.” Computational Statistics & Data Analysis, Nonlinear Methods and Data Mining, 38 (4): 367–78. https://doi.org/10.1016/S0167-9473(01)00065-2.
Friedman, Jerome, Trevor Hastie, and Robert Tibshirani. 2000. “Additive Logistic Regression: A Statistical View of Boosting (with Discussion and a Rejoinder by the Authors).” The Annals of Statistics 28 (2): 337–407. https://doi.org/10.1214/aos/1016218223.
Johnson, R., and Tong Zhang. 2014. “Learning Nonlinear Functions Using Regularized Greedy Forest.” IEEE Transactions on Pattern Analysis and Machine Intelligence 36 (5): 942–54. https://doi.org/10.1109/TPAMI.2013.159.
Jones, Lee K. 1992. “A Simple Lemma on Greedy Approximation in Hilbert Space and Convergence Rates for Projection Pursuit Regression and Neural Network Training.” The Annals of Statistics 20 (1): 608–13. http://www.jstor.org/stable/2242184.
Scornet, Erwan. 2014. “On the Asymptotics of Random Forests,” September. http://arxiv.org/abs/1409.2090.
Scornet, Erwan, Gérard Biau, and Jean-Philippe Vert. 2014. “Consistency of Random Forests,” May. http://arxiv.org/abs/1405.2881.
Tropp, J. A. 2004. “Greed Is Good: Algorithmic Results for Sparse Approximation.” IEEE Transactions on Information Theory 50 (10): 2231–42. https://doi.org/10.1109/TIT.2004.834793.
Vanli, N. D., and S. S. Kozat. 2014. “A Comprehensive Approach to Universal Piecewise Nonlinear Regression Based on Trees.” IEEE Transactions on Signal Processing 62 (20): 5471–86. https://doi.org/10.1109/TSP.2014.2349882.