Fourier interpolation

June 19, 2019

feature construction
functional analysis
linear algebra
networks
probability
signal processing
sparser than thou
statistics

\[\renewcommand{\vv}[1]{\boldsymbol{#1}} \renewcommand{\mm}[1]{\mathrm{#1}} \renewcommand{\mmm}[1]{\mathrm{#1}} \renewcommand{\cc}[1]{\mathcal{#1}} \renewcommand{\ff}[1]{\mathfrak{#1}} \renewcommand{\oo}[1]{\operatorname{#1}}\]

Figure 1: Jezzamon’s Fourier hand

a.k.a. spectral resampling/differentiation/integration.

Rick Lyons, How to Interpolate in the Time-Domain by Zero-Padding in the Frequency Domain. Also more classic Rick Lyons: FFT Interpolation Based on FFT Samples: A Detective Story With a Surprise Ending.
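To make Lyons’ zero-padding trick concrete, here is a NumPy sketch (the article doesn’t give an implementation; the function name `fft_upsample` and the even-length assumption are mine). The Nyquist bin is split between the positive- and negative-frequency halves of the padded spectrum, which keeps the result real and matches the cosine interpolant discussed below:

```python
import numpy as np

def fft_upsample(y, factor):
    """Upsample real samples y (even length N, factor >= 2) by zero-padding
    the spectrum. The Nyquist bin Y[N/2] is split evenly between bins
    +N/2 and -N/2 of the longer spectrum so the result stays real."""
    N = len(y)
    M = N * factor
    Y = np.fft.fft(y)
    Z = np.zeros(M, dtype=complex)
    Z[:N // 2] = Y[:N // 2]              # positive frequencies
    Z[N // 2] = Y[N // 2] / 2            # half the Nyquist bin goes high...
    Z[M - N // 2] = Y[N // 2] / 2        # ...and half goes low
    Z[M - N // 2 + 1:] = Y[N // 2 + 1:]  # negative frequencies
    # ifft divides by M, the original fft implied 1/N, so rescale by factor.
    return factor * np.fft.ifft(Z).real

# A pure sinusoid upsamples exactly onto the finer grid:
N = 8
y = np.sin(2 * np.pi * np.arange(N) / N)
fine = fft_upsample(y, 4)
assert np.allclose(fine, np.sin(2 * np.pi * np.arange(N * 4) / (N * 4)))
```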

Steven Johnson’s Notes on FFT-based differentiation is all I need; it points out a couple of subtleties about DTFT-based differentiation of functions.

A common numerical technique is to differentiate some sampled function y(x) via fast Fourier transforms (FFTs). Equivalently, one differentiates an approximate Fourier series. Equivalently, one differentiates a trigonometric interpolation. These are also known as spectral differentiation methods. However, the implementation of such methods is prey to common confusions due to the aliasing phenomenon inherent in sampling, and the treatment of the maximum-frequency (Nyquist) component is especially tricky. In this note, we review the basic ideas of spectral differentiation (for equally spaced samples)…

Specifically, if I have some function sampled on \([0,L]\) at discrete equally spaced points

\[y_n = y(nL/N)\] then I will take the DTFT

\[\begin{aligned} Y_k&=\cc{F}(\vv{y})_k\\ &=\frac{1}{N}\sum_{n=0}^{N-1}y_n\exp\left(-\frac{2\pi i}{N}nk\right) \end{aligned}\]

and I will invert it

\[\begin{aligned} y_n&=\cc{F}^{-1}(\vv{Y})_n\\ &=\sum_{k=0}^{N-1}Y_k\exp\left(\frac{2\pi i}{N}nk\right). \end{aligned}\]
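As a sanity check on the normalisation: with the \(1/N\) on the forward transform, the inversion needs no further factor. A NumPy sketch (NumPy’s own `fft`/`ifft` pair puts the \(1/N\) on the inverse instead, hence the explicit division):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16
y = rng.standard_normal(N)

# Forward transform with the 1/N normalisation used above
# (NumPy's fft carries no 1/N, so divide explicitly).
Y = np.fft.fft(y) / N

# The inverse sum then needs no normalisation at all:
n, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
y_back = (Y[None, :] * np.exp(2j * np.pi * n * k / N)).sum(axis=1)
assert np.allclose(y_back, y)
```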

Note that I will also be assuming, informally put, that \(y\) is periodic with period \(L\), or, equivalently, assuming continuity of the function and (enough of) its derivatives across the boundary between \(L\) and \(0\). In practice I rarely have such a given period for a function that I wish to interpolate in this fashion, so I enforce the boundary conditions by windowing the function with a von Hann window or similar. (Failure to do so will still “work”; it will just have abysmal convergence properties, thanks to the implicitly super-Nyquist frequencies introduced at the discontinuity that I won’t be handling properly, which we call the Gibbs phenomenon.) Windowing will, of course, in general change the function. If I don’t want to change it over the region of interest, I can use a flattish window such as the Tukey window over a longer signal than I intend to interpolate. If I could sample at arbitrary points I might instead use Chebyshev interpolation to make the problem effectively periodic. But this is already wandering off-topic, so I ignore it for now.
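A quick NumPy sketch of the windowing point (the particular signal and window are mine, purely for illustration): a sinusoid that does not complete a whole number of cycles over the interval has an implicit wrap-around discontinuity, and its energy leaks across the whole spectrum; a von Hann window suppresses most of that leakage:

```python
import numpy as np

N = 256
t = np.arange(N) / N
# 3.3 cycles over [0, 1): the endpoints don't match, so the DFT's
# implicit periodic extension has a jump discontinuity.
y = np.sin(2 * np.pi * 3.3 * t)

raw = np.abs(np.fft.rfft(y))                   # severe spectral leakage
windowed = np.abs(np.fft.rfft(y * np.hanning(N)))  # leakage suppressed

# Far from the peak near bin 3, the windowed spectrum is much smaller.
assert windowed[20:].sum() < raw[20:].sum()
```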

1 Minimum curvature interpolant

So far so normal. When we wish to interpolate and/or differentiate, things get a little less obvious: the interpolant \(\hat{y}\) is under-determined by the DTFT components, thanks to aliasing, so we resolve the ambiguity by choosing the interpolant of minimum curvature.

For a trivial example of the pathology, every choice of integers \(m_k\in\mathbb{Z}\) in the following gives an equally valid DTFT for the same sample vector, in the sense of agreeing at the sample points,

\[Y_k=\cc{F}(\vv{y})_k=\frac{1}{N}\sum_{n=0}^{N-1}y_n\exp\left(-\frac{2\pi i}{N}n(k+m_kN)\right)\] but the implied interpolants, with frequencies \((k+m_kN)/L\), behave quite differently between the sample points.
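This is easy to check numerically (a NumPy sketch; the particular \(N\), \(k\), and \(m_k\) values are arbitrary):

```python
import numpy as np

N, k = 8, 3
n = np.arange(N)

# At the sample points, frequency k and frequency k + m*N are identical:
for m in (1, -2, 5):
    assert np.allclose(
        np.exp(-2j * np.pi * n * k / N),
        np.exp(-2j * np.pi * n * (k + m * N) / N),
    )

# ...but between samples the corresponding exponentials differ:
t = 0.5  # halfway between samples, in units of the sample spacing
assert not np.isclose(
    np.exp(2j * np.pi * k * t / N),
    np.exp(2j * np.pi * (k + N) * t / N),
)
```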

Some calculation reveals that the minimum mean-square-derivative interpolant has a relatively simple form for even \(N\):

\[\hat{y}(t)=Y_0+\sum_{0< k < N/2} \left[ Y_k\exp\left(\frac{2\pi i}{L} k t\right)+ Y_{N-k}\exp\left(-\frac{2\pi i}{L} k t\right) \right]+ Y_{N/2}\cos\left(\frac{\pi N}{L} t\right) \] This is also a strictly real interpolant for real-valued input, since in that case \(Y_{N-k}=\overline{Y_{k}}\) and in fact

\[\begin{aligned} \hat{y}(t)&=Y_0+\sum_{0< k < N/2} \left[ Y_k \exp\left(\frac{2\pi i}{L} k t\right)+ \overline{Y_{k}} \exp\left(-\frac{2\pi i}{L} k t\right) \right]+ Y_{N/2}\cos\left(\frac{\pi N}{L} t\right)\\ &=Y_0+\sum_{0< k < N/2} 2\Re\left[ Y_k\exp\left(\frac{2\pi i}{L} k t\right) \right]+ Y_{N/2}\cos\left(\frac{\pi N}{L} t\right) \end{aligned}\]
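Here is a direct NumPy implementation of that interpolant (the function name and the naive loop-over-\(k\) evaluation are mine). Note that the Nyquist term must oscillate \(N/2\) times over \([0,L]\), i.e. be \(\cos(\pi N t/L)\), for the interpolant to pass through the samples:

```python
import numpy as np

def trig_interp(y, L, t):
    """Evaluate the minimum-curvature trigonometric interpolant of real
    samples y (length N, N even, equally spaced on [0, L)) at times t."""
    N = len(y)
    Y = np.fft.fft(y) / N  # the forward transform, normalised as above
    t = np.asarray(t, dtype=float)
    out = np.real(Y[0]) * np.ones_like(t)          # Y_0 is real for real y
    for k in range(1, N // 2):
        out += 2 * np.real(Y[k] * np.exp(2j * np.pi * k * t / L))
    out += np.real(Y[N // 2]) * np.cos(np.pi * N * t / L)  # Nyquist term
    return out

# The interpolant reproduces the samples exactly:
N, L = 16, 2.0
x = np.arange(N) * L / N
y = np.random.default_rng(1).standard_normal(N)
assert np.allclose(trig_interp(y, L, x), y)
```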

2 Derivatives

Now, suppose at some equally-spaced points \(rm,m\in\mathbb{Z}\) we wish to interpolate this sampled \(y\) using its DTFT coefficients. (What follows will also assume \(r\leq 1\) so I can avoid talking about the Nyquist component.) The implied interpolant is

\[\begin{aligned} \hat{y}(rm) &=Y_0+\sum_{0< k < N/2} 2\Re\left[ Y_k\exp\left(\frac{2\pi i}{L} k rm\right) \right]+ Y_{N/2}\cos\left(\frac{\pi N}{L} rm\right)\\ \end{aligned}\]

Unless we know something special about \(r\) I can’t see any shortcuts for evaluating these sums; but if we are going to take the trouble to evaluate them anyway, the derivatives of the interpolant come almost for free, since each term differentiates in closed form.
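At the sample points themselves there is the standard cheap route: multiply each \(Y_k\) by \(2\pi i k/L\) and invert. Johnson’s note points out that for odd-order derivatives the Nyquist bin should be zeroed, since the \(\cos(\pi N t/L)\) term has zero slope at every sample point. A NumPy sketch (function name mine):

```python
import numpy as np

def spectral_derivative(y, L):
    """First derivative of the trigonometric interpolant of real samples y
    (equally spaced on [0, L)), evaluated at the sample points."""
    N = len(y)
    k = np.fft.fftfreq(N, d=1.0 / N)  # integer frequencies 0..N/2-1, -N/2..-1
    ik = 2j * np.pi * k / L
    if N % 2 == 0:
        # cos(pi*N*t/L) has zero slope at t = n*L/N, so the Nyquist bin
        # contributes nothing to the first derivative there.
        ik[N // 2] = 0.0
    return np.real(np.fft.ifft(ik * np.fft.fft(y)))

# Sanity check against a function whose derivative we know:
N, L = 16, 2 * np.pi
x = np.arange(N) * L / N
assert np.allclose(spectral_derivative(np.sin(x), L), np.cos(x))
```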

🏗

3 References

Johnson, Steven G. 2011. “Notes on FFT-Based Differentiation.”