“Efficiently approximating fields made up of many decaying sources.”
Barnes–Hut algorithms, fast Gauss transforms, generalized multipole methods.
Not something I intend to worry about right now, but I needed to clear these refs out of my overcrowded
Mercer kernel approximation notebook.
Fast multipole methods can also approximate certain Mercer kernels,
in the sense of rapidly evaluating the field strength at given points
to within a controlled error,
rather than approximating the kernels themselves with something simpler.
See also H-matrices.
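To fix ideas, the computational primitive in question is evaluating a weighted kernel sum \(G(y_j) = \sum_{i=1}^N q_i K(y_j, x_i)\) at \(M\) target points, which costs \(O(NM)\) done directly. Here is a minimal sketch of that direct evaluation for a Gaussian kernel (the function name and bandwidth convention are illustrative, not any particular library's API):

```python
import numpy as np

def direct_gauss_field(sources, targets, weights, h):
    """Naive O(N*M) evaluation of G(y_j) = sum_i q_i exp(-|y_j - x_i|^2 / h^2).

    sources: (N, d) array, targets: (M, d) array, weights: (N,) array.
    This is the exact reference computation that the fast summation schemes
    below approximate in roughly O(N + M) time.
    """
    # Pairwise squared distances between every target and every source.
    d2 = ((targets[:, None, :] - sources[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / h**2) @ weights
```

Everything that follows is about doing the same job approximately, without ever forming the \(M \times N\) kernel matrix.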
Overview on Vikas Raykar’s thesis page: FAST SUMMATION ALGORITHMS:
The art of getting ‘good enough’ solutions ‘as fast as possible’.
Huge data sets containing millions of training examples with a large number
of attributes (tall fat data) are relatively easy to gather.
However, one of the bottlenecks for successful inference of useful information
from the data is the computational complexity of machine learning algorithms.
Most state-of-the-art nonparametric machine learning algorithms have a
computational complexity of either \(O(N^2)\) or \(O(N^3)\), where N
is the number of training examples.
This has seriously restricted the use of massive data sets.
The bottleneck computational primitive at the heart of various algorithms is
the multiplication of a structured matrix with a vector, which we refer to as
the matrix-vector product (MVP) primitive.
The goal of my thesis is to speed up these MVP primitives by fast
approximate algorithms that scale as \(O(N)\)
and also provide high accuracy guarantees.
I use ideas from computational physics, scientific computing, and
computational geometry to design these algorithms.
Currently the proposed algorithms have been applied in kernel density
estimation, optimal bandwidth estimation, projection pursuit, Gaussian
process regression, implicit surface fitting, and ranking.
Implementation
figtree (C++, with MATLAB and Python bindings) does Gaussian fields in the inexact case.
References
Barnes, Josh, and Piet Hut. 1986.
“A Hierarchical O(N log N) Force-Calculation Algorithm.” Nature 324 (6096): 446–49.
https://doi.org/10.1038/324446a0.
Board, John, and Klaus Schulten. 2000.
“The Fast Multipole Algorithm.” Computing in Science & Engineering 2 (1): 76–79.
https://doi.org/10.1109/5992.814662.
Dongarra, Jack, and Francis Sullivan. 2000.
“Guest Editors’ Introduction: The Top 10 Algorithms.” Computing in Science & Engineering 2 (1): 22–23.
https://doi.org/10.1109/MCISE.2000.814652.
Greengard, L., and J. Strain. 1991.
“The Fast Gauss Transform.” SIAM Journal on Scientific and Statistical Computing 12 (1): 79–94.
https://doi.org/10.1137/0912004.
Raykar, Vikas C., and Ramani Duraiswami. 2005.
“The Improved Fast Gauss Transform with Applications to Machine Learning.” Presented at NIPS.
http://www.umiacs.umd.edu/users/vikas/publications/IFGT_slides.pdf.
Rokhlin, V. 1985.
“Rapid Solution of Integral Equations of Classical Potential Theory.” Journal of Computational Physics 60 (2): 187–207.
https://doi.org/10.1016/0021-9991(85)90002-6.
Schwab, Christoph, and Radu Alexandru Todor. 2006.
“Karhunen–Loève Approximation of Random Fields by Generalized Fast Multipole Methods.” Journal of Computational Physics (Uncertainty Quantification in Simulation Science) 217 (1): 100–122.
https://doi.org/10.1016/j.jcp.2006.01.048.
Simoncini, V., and D. Szyld. 2003.
“Theory of Inexact Krylov Subspace Methods and Applications to Scientific Computing.” SIAM Journal on Scientific Computing 25 (2): 454–77.
https://doi.org/10.1137/S1064827502406415.
Vorst, Henk A. van der. 2000.
“Krylov Subspace Iteration.” Computing in Science & Engineering 2 (1): 32–37.
https://doi.org/10.1109/5992.814655.
Yang, Changjiang, Ramani Duraiswami, and Larry S. Davis. 2004.
“Efficient Kernel Machines Using the Improved Fast Gauss Transform.” In
Advances in Neural Information Processing Systems, 1561–68.
http://machinelearning.wustl.edu/mlpapers/paper_files/NIPS2005_439.pdf.
Yang, Changjiang, Ramani Duraiswami, Nail A. Gumerov, and Larry Davis. 2003.
“Improved Fast Gauss Transform and Efficient Kernel Density Estimation.” In Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV ’03), Volume 2, 464. Washington, DC: IEEE Computer Society.
https://doi.org/10.1109/ICCV.2003.1238383.