Density estimation

Especially non- or semiparametrically

June 6, 2016 — October 16, 2019

convolution
density
functional analysis
nonparametric
probability
statistics

A statistical estimation problem where you are not trying to estimate a functional of the distribution of random observations, but the distribution itself. In a sense, all of statistics implicitly does density estimation, but it is usually instrumental, a step on the way to some actual parameter of interest. (Although if you do Bayesian statistics, you may care a great deal about the shape of the posterior density in particular.)

Estimating a distribution nonparametrically is, then, a function approximation problem with constraints: we wish to find a density function \(f:\mathcal{X}\to\mathbb{R}\) such that \(\int_{\mathcal{X}}f(x)\,dx=1\) and \(f(x)\geq 0\) for all \(x \in \mathcal{X}\).

We might set ourselves different loss functions than in standard statistical regression problems: instead of, say, expected \(L_p\) prediction error, we might use a traditional function-approximation \(L_p\) loss, or a probability divergence.

The most common distribution estimate, which we use implicitly all the time, does not work with densities as such: we take the empirical distribution as the estimate, i.e. the data as a model for itself. This has various non-useful features: it is rough, and being a sum of point masses it has no density at all to visualize.
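
A minimal sketch of that implicit baseline, assuming nothing beyond NumPy (the helper name `ecdf` is mine):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(200)

def ecdf(data):
    """Empirical CDF: the sorted data, each observation carrying mass 1/n."""
    xs = np.sort(data)
    ys = np.arange(1, len(xs) + 1) / len(xs)
    return xs, ys

xs, ys = ecdf(x)  # a step function; there is no density here to plot
```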

Question: When would I actually want to estimate, specifically, a density?

Visualization, sure. Nonparametric regression without any better ideas. As latent parameters in a deep probabilistic model.

What about non-parametric conditional density estimation? Are there any general ways to do this?

1 Divergence measures/contrasts

There are many choices of loss function between densities here; any of the probability metrics will do. For reasons of tradition or convenience, when the object of interest is the density itself, certain choices dominate (a numerical sketch of a few of these follows the list):

  • \(L_2\) distance between densities with respect to Lebesgue measure on the state space, whose expectation over samples is the mean integrated squared error (MISE); this works out nicely for convolution kernels.
  • KL divergence. (May not do what you want if you care about performance where the density is near zero; see Hall (1987).)
  • Hellinger distance.
  • Wasserstein distances.
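
For concreteness, a numerical sketch of a few of these contrasts, assuming the target and the estimate are both available on a common grid (the densities here are stand-ins):

```python
import numpy as np
from scipy.stats import norm, wasserstein_distance

grid = np.linspace(-6, 6, 2001)
dx = grid[1] - grid[0]
f = norm.pdf(grid)                       # target density
g = norm.pdf(grid, loc=0.3, scale=1.2)   # a deliberately biased estimate

ise = ((f - g) ** 2).sum() * dx          # squared L2 distance (ISE)
kl = (f * np.log(f / g)).sum() * dx      # KL(f || g)
hellinger = np.sqrt(0.5 * ((np.sqrt(f) - np.sqrt(g)) ** 2).sum() * dx)
w1 = wasserstein_distance(grid, grid, u_weights=f, v_weights=g)
```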

But having chosen the divergence you wish to minimise, you still have to choose the sense in which to minimise it. Minimax? In probability? In expectation? Every combination is a different publication. Hmf.

2 Minimising Expected (or whatever) MISE

This works fine for kernel density estimators, where the MISE-optimal estimator turns out to be a Wiener filter and the problem reduces to choosing a bandwidth. How do you do this for other estimators, though?
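
One generic, if not MISE-optimal, recipe is to pick the bandwidth by cross-validated held-out log-likelihood. A sketch with scikit-learn; the grid bounds are arbitrary:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KernelDensity

x = np.random.default_rng(2).standard_normal((500, 1))

# GridSearchCV scores held-out log-likelihood, a stand-in for the MISE criterion.
search = GridSearchCV(
    KernelDensity(kernel="gaussian"),
    {"bandwidth": np.logspace(-1.5, 0.5, 30)},
    cv=5,
)
search.fit(x)
h = search.best_params_["bandwidth"]
```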

3 Connection to point processes

There is a connection between spatial point process intensity estimation and density estimation. See Densities and intensities.

4 Spline/wavelet estimations

🏗

5 Mixture models

See mixture models.
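
As a placeholder until then, a minimal sketch of mixture density estimation via scikit-learn (two components, chosen by eye):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(-2.0, 0.5, 300),
                    rng.normal(1.0, 1.0, 700)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2).fit(x)
grid = np.linspace(-5, 5, 400).reshape(-1, 1)
density = np.exp(gmm.score_samples(grid))  # score_samples returns log-density
```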

6 Gaussian processes

Gaussian processes can provide posteriors over densities, e.g. via logistic Gaussian process priors, in which a GP-distributed function is exponentiated and normalised into a random density (Tokdar 2007; Lenk 2003).
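
A minimal sketch of a draw from such a prior, discretized on a grid and assuming a squared-exponential kernel with a hand-picked length scale:

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 200)
dx = grid[1] - grid[0]

# A latent GP draw with squared-exponential covariance (length scale 0.1).
K = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 0.1) ** 2)
g = rng.multivariate_normal(np.zeros(len(grid)), K + 1e-8 * np.eye(len(grid)))

# Exponentiate and normalise: a random function becomes a random density.
f = np.exp(g) / (np.exp(g).sum() * dx)
```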

7 Normalizing flow models

A.k.a. measure transport etc. One parameterizes the density as the pushforward of a simple base density through an invertible map, evaluates it via the change-of-variables formula, and fits the map using reparameterization. 🏗
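
The basic mechanism, with a fixed rather than learned map: push a base density through an invertible transform and correct by the Jacobian.

```python
import numpy as np
from scipy.stats import norm

# Y = exp(X) with X standard normal. Change of variables gives
#   f_Y(y) = f_X(log y) * |d(log y)/dy| = phi(log y) / y,
# i.e. the log-normal density.
y = np.linspace(0.01, 5, 500)
f_y = norm.pdf(np.log(y)) / y
```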

8 k-NN estimates

Filed here because the topic is too small to warrant its own notebook.

Bruce E. Hansen notes:

To use nearest neighbour methods, the integer k must be selected. This is similar to bandwidth selection, although here k is discrete, not continuous. K.C. Li (Annals of Statistics, 1987) showed that for the knn regression estimator under conditional homoskedasticity, it is asymptotically optimal to pick k by Mallows, Generalized CV, or CV. Andrews (Journal of Econometrics, 1991) generalised this result to the case of heteroskedasticity, and showed that CV is asymptotically optimal.
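
Hansen's remark concerns k-NN regression, but the density analogue is simple: the estimate at \(x\) is \(k\) divided by \(n\) times the volume of the ball reaching the \(k\)-th neighbour. A 1-D sketch with scipy; the default k=10 is arbitrary:

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_density(query, data, k=10):
    """1-D k-NN density estimate: f(x) ~ k / (n * 2 * r_k(x))."""
    tree = cKDTree(data.reshape(-1, 1))
    dist, _ = tree.query(query.reshape(-1, 1), k=k)
    r_k = dist[:, -1]                  # distance to the k-th nearest neighbour
    return k / (len(data) * 2 * r_k)

data = np.random.default_rng(4).standard_normal(1000)
fhat = knn_density(np.linspace(-3, 3, 100), data)
```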

9 Kernel density estimators

A.k.a. kernel smoothing.
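
The workhorse, for reference; scipy's `gaussian_kde` chooses its bandwidth by Scott's rule unless told otherwise:

```python
import numpy as np
from scipy.stats import gaussian_kde

data = np.random.default_rng(5).standard_normal(300)
kde = gaussian_kde(data)          # bandwidth: Scott's rule by default
grid = np.linspace(-4, 4, 200)
density = kde(grid)               # the usual convolution estimate
```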

9.1 Fancy ones

HT Gery Geenens for a lecture he just gave on convolution kernel density estimation, in which he drew a parallel between additive noise in KDE and multiplicative noise for non-negative-valued variables.
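
I have not reproduced his estimator, but one standard trick in that multiplicative spirit is to smooth on the log scale and correct by the Jacobian; a hedged sketch:

```python
import numpy as np
from scipy.stats import gaussian_kde

z = np.random.default_rng(6).lognormal(size=300)  # positive-valued data
kde_log = gaussian_kde(np.log(z))                 # additive smoothing of log z

def density(y):
    """Back-transform to the original scale: Jacobian of the log map is 1/y."""
    y = np.asarray(y, dtype=float)
    return kde_log(np.log(y)) / y
```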

10 References

Andrews. 1991. “Asymptotic Optimality of Generalized CL, Cross-Validation, and Generalized Cross-Validation in Regression with Heteroskedastic Errors.” Journal of Econometrics.
Arnold, Castillo, and Sarabia. 1999. Conditional Specification of Statistical Models.
Barron, and Sheu. 1991. “Approximation of Density Functions by Sequences of Exponential Families.” The Annals of Statistics.
Bashtannyk, and Hyndman. 2001. “Bandwidth Selection for Kernel Conditional Density Estimation.” Computational Statistics & Data Analysis.
Battey, and Liu. 2013. “Smooth Projected Density Estimation.” arXiv:1308.3968 [Stat].
Berman, and Diggle. 1989. “Estimating Weighted Integrals of the Second-Order Intensity of a Spatial Point Process.” Journal of the Royal Statistical Society. Series B (Methodological).
Birgé. 2008. “Model Selection for Density Estimation with L2-Loss.” arXiv:0808.1416 [Math, Stat].
Bosq. 1998. Nonparametric Statistics for Stochastic Processes: Estimation and Prediction. Lecture Notes in Statistics 110.
Cox. 1965. “On the Estimation of the Intensity Function of a Stationary Point Process.” Journal of the Royal Statistical Society: Series B (Methodological).
Cunningham, Shenoy, and Sahani. 2008. “Fast Gaussian Process Methods for Point Process Intensity Estimation.” In Proceedings of the 25th International Conference on Machine Learning. ICML ’08.
Devroye, and Lugosi. 2001. Combinatorial Methods in Density Estimation. Springer Series in Statistics.
Dinh, Ho, Nguyen, et al. 2016. “Fast Learning Rates with Heavy-Tailed Losses.” In NIPS.
Efromovich. 2007. “Conditional Density Estimation in a Regression Setting.” The Annals of Statistics.
Eilers, and Marx. 1996. “Flexible Smoothing with B-Splines and Penalties.” Statistical Science.
Ellis. 1991. “Density Estimation for Point Processes.” Stochastic Processes and Their Applications.
Giesecke, Kakavand, and Mousavi. 2008. “Simulating Point Processes by Intensity Projection.” In Simulation Conference, 2008. WSC 2008. Winter.
Gu. 1993. “Smoothing Spline Density Estimation: A Dimensionless Automatic Algorithm.” Journal of the American Statistical Association.
Hall. 1987. “On Kullback-Leibler Loss and Density Estimation.” The Annals of Statistics.
Hall, Racine, and Li. 2004. “Cross-Validation and the Estimation of Conditional Probability Densities.” Journal of the American Statistical Association.
Hansen. 2004. “Nonparametric Conditional Density Estimation.” Unpublished Manuscript.
Hasminskii, and Ibragimov. 1990. “On Density Estimation in the View of Kolmogorov’s Ideas in Approximation Theory.” The Annals of Statistics.
Ibragimov. 2001. “Estimation of Analytic Functions.” In Institute of Mathematical Statistics Lecture Notes - Monograph Series.
Koenker, and Mizera. 2006. “Density Estimation by Total Variation Regularization.” Advances in Statistical Modeling and Inference.
Kooperberg, and Stone. 1991. “A Study of Logspline Density Estimation.” Computational Statistics & Data Analysis.
———. 1992. “Logspline Density Estimation for Censored Data.” Journal of Computational and Graphical Statistics.
Lee, Ge, Ma, et al. 2017. “On the Ability of Neural Nets to Express Distributions.” In arXiv:1702.07028 [Cs].
Lenk. 2003. “Bayesian Semiparametric Density Estimation and Model Verification Using a Logistic–Gaussian Process.” Journal of Computational and Graphical Statistics.
Li. 1987. “Asymptotic Optimality for \(C_p, C_L\), Cross-Validation and Generalized Cross-Validation: Discrete Index Set.” The Annals of Statistics.
Norets. 2010. “Approximation of Conditional Densities by Smooth Mixtures of Regressions.” The Annals of Statistics.
Panaretos, and Zemel. 2016. “Separation of Amplitude and Phase Variation in Point Processes.” The Annals of Statistics.
Papangelou. 1974. “The Conditional Intensity of General Point Processes and an Application to Line Processes.” Zeitschrift Für Wahrscheinlichkeitstheorie Und Verwandte Gebiete.
Reynaud-Bouret, Rivoirard, and Tuleau-Malot. 2011. “Adaptive Density Estimation: A Curse of Support?” Journal of Statistical Planning and Inference.
Sardy, and Tseng. 2010. “Density Estimation by Total Variation Penalized Likelihood Driven by the Sparsity ℓ1 Information Criterion.” Scandinavian Journal of Statistics.
Schoenberg. 2005. “Consistent Parametric Estimation of the Intensity of a Spatial–Temporal Point Process.” Journal of Statistical Planning and Inference.
Schuster, Mollenhauer, Klus, et al. 2019. “Kernel Conditional Density Operators.” arXiv:1905.11255 [Cs, Math, Stat].
Shimazaki, and Shinomoto. 2010. “Kernel Bandwidth Optimization in Spike Rate Estimation.” Journal of Computational Neuroscience.
Sriperumbudur, Fukumizu, Gretton, et al. 2017. “Density Estimation in Infinite Dimensional Exponential Families.” Journal of Machine Learning Research.
Sugiyama, Takeuchi, Suzuki, et al. 2010. “Conditional Density Estimation via Least-Squares Density Ratio Estimation.” In International Conference on Artificial Intelligence and Statistics.
Tabak, E. G., and Turner. 2013. “A Family of Nonparametric Density Estimation Algorithms.” Communications on Pure and Applied Mathematics.
Tabak, Esteban G., and Vanden-Eijnden. 2010. “Density Estimation by Dual Ascent of the Log-Likelihood.” Communications in Mathematical Sciences.
Tokdar. 2007. “Towards a Faster Implementation of Density Estimation With Logistic Gaussian Process Priors.” Journal of Computational and Graphical Statistics.
van Lieshout. 2011. “On Estimation of the Intensity Function of a Point Process.” Methodology and Computing in Applied Probability.
Zeevi, and Meir. 1997. “Density Estimation Through Convex Combinations of Densities: Approximation and Estimation Bounds.” Neural Networks: The Official Journal of the International Neural Network Society.