Function approximation and interpolation

June 9, 2016


On constructing an approximation of some arbitrary function, and measuring the badness thereof.

THIS IS CHAOS RIGHT NOW. I need to break out the sampling/interpolation problem for regular data, for one thing.

1 Choosing the best approximation

In what sense best? Most compact? Easiest to code?

If we are not interpolating, how much smoothing do we do?

We can use cross-validation, especially so-called “generalized” cross-validation, to choose this smoothing parameter efficiently, in some sense.
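A minimal numpy sketch of GCV for a linear smoother, where the basis (Chebyshev) and the second-difference roughness penalty are my own illustrative choices, standing in for a spline basis:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Linear smoother y_hat = S(lam) @ y, penalizing roughness of the
# coefficients in a Chebyshev basis (illustrative stand-in for splines).
X = np.polynomial.chebyshev.chebvander(2 * x - 1, 9)
D = np.diff(np.eye(X.shape[1]), n=2, axis=0)  # second-difference penalty
P = D.T @ D

def gcv(lam):
    """Generalized cross-validation score: n * RSS / (n - tr S)^2."""
    S = X @ np.linalg.solve(X.T @ X + lam * P, X.T)
    resid = y - S @ y
    n = y.size
    return n * (resid @ resid) / (n - np.trace(S)) ** 2

lams = np.logspace(-8, 2, 41)
best = min(lams, key=gcv)  # grid search on the GCV criterion
```

The appeal is that `gcv` needs only one fit per candidate `lam`, rather than `n` leave-one-out refits.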

Or you might have noisy data, in which case you have a joint function-approximation and inference problem, with error from both the approximation itself and the sampling noise. Compressive sensing can provide finite-sample guarantees for some of these settings.

To discuss: loss functions.

An interesting problem is how you align the curves that are your objects of study; that is a problem of warping.

2 Spline smoothing of observations

The classic.

Special superpowers: Easy to differentiate and integrate.

Special weakness: many free parameters, not so easy to do in high dimension.

See splines.
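A minimal sketch of the superpower, assuming SciPy is available (the target function and smoothing level are my own illustrative choices): once the smoothing spline is fitted, its derivative and integral are exact, cheap operations on the spline itself.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)

spl = UnivariateSpline(x, y, s=0.5)  # s trades smoothing vs. interpolation
dspl = spl.derivative()              # exact derivative of the fitted spline
area = spl.integral(0, np.pi)        # exact integral of the fitted spline
```

Here `area` should land near 2 (the integral of sin over [0, π]) and `dspl` near cos, up to the noise and smoothing.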

3 Polynomial bases

See polynomial bases.
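For flavour, a tiny numpy example (target function my own choice): for smooth functions the Chebyshev coefficients decay fast, and calculus on the fitted polynomial is exact.

```python
import numpy as np

x = np.linspace(-1, 1, 400)
f = np.exp(x)

# Degree-10 least-squares Chebyshev fit; near machine precision for exp.
cheb = np.polynomial.Chebyshev.fit(x, f, deg=10)
dcheb = cheb.deriv()  # derivative is again a Chebyshev series
```

Both `cheb` and `dcheb` should track exp to high accuracy on [-1, 1].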

4 Fourier bases

๐Ÿ—๏ธ

5 Radial basis function approximation

I actually care about this mostly for densities, so see mixture models for what information I do have.
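Still, for the record, a bare-bones 1-D Gaussian RBF interpolant in numpy (all names and the shape parameter are my own): solve the kernel system at the centres, then evaluate the weighted kernel sum anywhere.

```python
import numpy as np

def rbf_fit(centres, y, eps=8.0):
    """Solve K w = y for the RBF weights at the centres."""
    K = np.exp(-(eps * (centres[:, None] - centres[None, :])) ** 2)
    return np.linalg.solve(K, y)

def rbf_eval(x, centres, w, eps=8.0):
    """Evaluate sum_j w_j * exp(-(eps * |x - c_j|)^2)."""
    K = np.exp(-(eps * (x[:, None] - centres[None, :])) ** 2)
    return K @ w

c = np.linspace(0, 1, 15)
y = np.sin(2 * np.pi * c)
w = rbf_fit(c, y)
```

The shape parameter `eps` matters: too flat and the kernel matrix becomes numerically singular, too peaked and the fit is bumpy between centres.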

6 Rational approximation

Padé approximation is the method I’ve heard of. Are there others? Easy to differentiate. OK to integrate if you cheat using a computational algebra package.
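A small sketch with SciPy’s `pade` helper (target function my own choice): build the [4/4] Padé approximant of exp from its Taylor coefficients, and differentiate it via the quotient rule on the polynomial parts.

```python
import numpy as np
from math import factorial
from scipy.interpolate import pade

# Taylor coefficients of exp: 1 + x + x^2/2! + ...
an = [1 / factorial(k) for k in range(9)]
p, q = pade(an, 4)  # [4/4] approximant as a pair of np.poly1d

approx = p(1.0) / q(1.0)  # should be close to e

# Differentiation is just the quotient rule on two polynomials.
dp, dq = p.deriv(), q.deriv()
deriv_at_1 = (dp(1.0) * q(1.0) - p(1.0) * dq(1.0)) / q(1.0) ** 2
```

For exp the derivative equals the function, so both values should sit near e ≈ 2.71828.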
