{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Interpolation and Gaussian quadrature\n", "\n", "\n", "\n", "_Polynomial interpolation_ is the process of finding a polynomial that equals data at a precise set of points.\n", "_Quadrature_ is the act of approximating an integral by a weighted sum:\n", "$$\n", "\\int_a^b f(x) w(x) {\\rm d}x ≈ \\sum_{j=1}^n w_j f(x_j).\n", "$$\n", "In these notes we see that the two concepts are intrinsically linked: interpolation leads naturally\n", "to quadrature rules. Moreover, by using a set of points $x_j$ linked to orthogonal polynomials we get\n", "significantly more accurate rules, and in fact can use quadrature to compute expansions in orthogonal polynomials that\n", "interpolate. That is, we can mirror the link between the Trapezium rule, Fourier series, and interpolation but\n", "now for polynomials.\n", "\n", "\n", "1. Polynomial Interpolation: we describe how to interpolate a function by a polynomial.\n", "2. Truncated Jacobi matrices: we see that truncated Jacobi matrices are diagonalisable\n", "in terms of orthogonal polynomials and their zeros. \n", "3. Interpolatory quadrature rules: polynomial interpolation leads naturally to ways to integrate\n", "functions, but is only realisable in the simplest cases.\n", "4. Gaussian quadrature: using roots of orthogonal polynomials and truncated Jacobi matrices \n", "leads naturally to an efficiently\n", "computable interpolatory quadrature rule. The _miracle_ is that it is exact for twice as many polynomials as\n", "expected.\n", "\n", "\n", "\n", "## 1. Polynomial Interpolation\n", "\n", "We already saw a special case of polynomial interpolation, where we saw that the polynomial\n", "$$\n", "f(z) ≈ ∑_{k=0}^{n-1} f̂_k^n z^k\n", "$$\n", "equaled $f$ at evenly spaced points on the unit circle: ${\\rm e}^{{\\rm i} 2π j/n}$. 
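This can be checked numerically. The following is a minimal sketch, assuming the trapezium-rule definition $f̂_k^n = {1 \over n} ∑_{j=0}^{n-1} f(z_j) z_j^{-k}$ from the earlier notes; the choices `f = z -> exp(z)` and `n = 8` are purely illustrative:

```julia
# Sketch: interpolation at the n-th roots of unity via the (naive) DFT.
f = z -> exp(z)                                    # illustrative analytic function
n = 8
z = [exp(2π*im*j/n) for j = 0:n-1]                 # evenly spaced points on the unit circle
f̂ = [sum(f(z[j+1])*exp(-2π*im*j*k/n) for j = 0:n-1)/n for k = 0:n-1]
p = w -> sum(f̂[k+1]*w^k for k = 0:n-1)             # degree n-1 polynomial
maximum(abs(p(zⱼ) - f(zⱼ)) for zⱼ in z)            # ≈ 0: p interpolates f at the grid
```

The interpolation property holds for any $f$ here: it is the inverse-DFT identity, so the error at the grid is pure round-off.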
\n", "But here we consider the following:\n", "\n", "**Definition (interpolatory polynomial)** Given $n$ distinct points $x_1,…,x_n ∈ ℝ$ \n", "and $n$ _samples_ $f_1,…,f_n ∈ ℝ$, a degree $n-1$\n", "_interpolatory polynomial_ $p(x)$ satisfies\n", "$$\n", "p(x_j) = f_j, \\qquad j = 1,…,n\n", "$$\n", "\n", "The easiest way to solve this problem is to invert the Vandermonde system:\n", "\n", "**Definition (Vandermonde)** The _Vandermonde matrix_ associated with $n$ distinct points $x_1,…,x_n ∈ ℝ$\n", "is the matrix\n", "$$\n", "V := \\begin{bmatrix} 1 & x_1 & ⋯ & x_1^{n-1} \\\\\n", " ⋮ & ⋮ & ⋱ & ⋮ \\\\\n", " 1 & x_n & ⋯ & x_n^{n-1}\n", " \\end{bmatrix}\n", "$$\n", "\n", "**Proposition (interpolatory polynomial uniqueness)** \n", "The interpolatory polynomial is unique, and the Vandermonde matrix is invertible.\n", "\n", "**Proof**\n", "Suppose $p$ and $p̃$ are both interpolatory polynomials. Then $p(x) - p̃(x)$ is a polynomial of degree at most $n-1$ that vanishes at $n$ distinct points $x_j$. By the fundamental theorem of\n", "algebra it must be zero, i.e., $p = p̃$.\n", "\n", "For the second part, if $V 𝐜 = 0$ for $𝐜 ∈ ℝ^n$ then for $q(x) = c_1 + c_2 x + ⋯ + c_n x^{n-1}$ we have\n", "$$\n", "q(x_j) = 𝐞_j^⊤ V 𝐜 = 0\n", "$$\n", "hence $q$, having degree at most $n-1$, vanishes at $n$ distinct points and is therefore 0, i.e., $𝐜 = 0$.\n", "\n", "∎\n", "\n", "Thus a quick-and-dirty way to do interpolation is to invert the Vandermonde matrix\n", "(which we saw in the least squares setting with more samples than coefficients):" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "execution": { "iopub.execute_input": "2022-05-02T19:02:18.919692Z", "iopub.status.busy": "2022-05-02T19:02:18.480126Z", "iopub.status.idle": "2022-05-02T19:02:36.306593Z", "shell.execute_reply": "2022-05-02T19:02:36.306016Z" } }, "outputs": [], "source": [ "using Plots, LinearAlgebra\n", "f = x -> cos(10x)\n", "n = 5\n", "\n", "x = range(0, 1; length=n) # evenly spaced points (BAD for interpolation)\n", "V = x .^ (0:n-1)' # Vandermonde matrix\n", "c = V \\ f.(x) # coefficients of interpolatory polynomial\n", "p = x -> dot(c, x .^ (0:n-1))\n", "\n", "g = range(0,1; length=1000) # plotting grid\n", "plot(g, f.(g); label=\"function\")\n", "plot!(g, p.(g); label=\"interpolation\")\n", "scatter!(x, f.(x); label=\"samples\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "But it turns out we can also construct the interpolatory polynomial directly.\n", "We will use the following polynomials, which equal $1$ at one grid point\n", "and are zero at the others:\n", "\n", "**Definition (Lagrange basis polynomial)** The _Lagrange basis polynomial_ is defined as\n", "$$\n", "ℓ_k(x) := ∏_{j ≠ k} {x-x_j \\over x_k - x_j} = {(x-x_1) ⋯(x-x_{k-1})(x-x_{k+1}) ⋯ (x-x_n) \\over (x_k - x_1) ⋯ (x_k - x_{k-1}) (x_k - x_{k+1}) ⋯ (x_k - x_n)}\n", "$$\n", "\n", "Plugging in the grid points verifies the following:\n", "\n", "**Proposition (delta interpolation)**\n", "$$\n", "ℓ_k(x_j) = δ_{kj}\n", "$$\n", "\n", "We can use these to construct the interpolatory polynomial:\n", "\n", "**Theorem (Lagrange interpolation)**\n", "The unique polynomial of degree at most $n-1$ that interpolates $f$ at $n$ distinct\n", "points $x_j$ is:\n", "$$\n", "p(x) = f(x_1) ℓ_1(x) + ⋯ + f(x_n) ℓ_n(x)\n", "$$\n", "\n", "**Proof**\n", "Note that\n", "$$\n", "p(x_j) = ∑_{k=1}^n f(x_k) ℓ_k(x_j) = f(x_j)\n", "$$\n", "so we just need to show it is unique. Suppose $p̃(x)$ is a polynomial\n", "of degree at most $n-1$\n", "that also interpolates $f$. 
Then $p̃ - p$ is of degree at most $n-1$ and vanishes at $n$ distinct points.\n", "Thus by the fundamental theorem of algebra it must be zero.\n", "\n", "\n", "∎\n", "\n", "**Example** We can interpolate $\\exp(x)$ at the points $0,1,2$:\n", "$$\n", "\\begin{align*}\n", "p(x) &= ℓ_1(x) + {\\rm e} ℓ_2(x) + {\\rm e}^2 ℓ_3(x) =\n", "{(x - 1) (x-2) \\over (-1)(-2)} + {\\rm e} {x (x-2) \\over (-1)} +\n", "{\\rm e}^2 {x (x-1) \\over 2} \\\\\n", "&= (1/2 - {\\rm e} +{\\rm e}^2/2)x^2 + (-3/2 + 2 {\\rm e} - {\\rm e}^2 /2) x + 1\n", "\\end{align*}\n", "$$\n", "\n", "\n", "**Remark** Interpolating at evenly spaced points is a really **bad** idea:\n", "such interpolation is inherently ill-conditioned. \n", "The problem sheet asks you to explore\n", "this experimentally.\n", "\n", "## 2. Roots of orthogonal polynomials and truncated Jacobi matrices\n", "\n", "We now consider roots (zeros) of orthogonal polynomials $p_n(x)$. This is important as we shall\n", "see they are useful for interpolation and quadrature. For interpolation to be well-defined we\n", "first need to guarantee that the roots are distinct.\n", "\n", "**Lemma** An orthogonal polynomial $p_n(x)$ has exactly $n$ distinct roots.\n", "\n", "**Proof**\n", "\n", "Suppose $x_1, …,x_j$ are the roots where $p_n(x)$ changes sign, that is,\n", "$$\n", "p_n(x) = c_k (x-x_k)^{2p+1} + O((x-x_k)^{2p+2})\n", "$$\n", "for $c_k ≠ 0$, $k = 1,…,j$, and some non-negative integer $p$ (depending on $k$), as $x → x_k$. Then\n", "$$\n", "p_n(x) (x-x_1) ⋯(x-x_j)\n", "$$\n", "does not change sign: it behaves like $c_k (x-x_k)^{2p+2} + O((x-x_k)^{2p+3})$ as $x → x_k$.\n", "In other words:\n", "$$\n", "⟨p_n,(x-x_1) ⋯(x-x_j) ⟩ = \\int_a^b p_n(x) (x-x_1) ⋯(x-x_j) w(x) {\\rm d} x ≠ 0,\n", "$$\n", "where $w(x)$ is the weight of orthogonality.\n", "This is only possible if $j = n$ as $p_n(x)$ is orthogonal w.r.t. 
all lower degree\n", "polynomials.\n", "Since $p_n$ has degree $n$ it can have no more than $n$ roots, so the $n$ sign changes give exactly $n$ distinct roots.\n", "\n", "∎\n", "\n", "**Definition (truncated Jacobi matrix)** Given a symmetric Jacobi matrix $X$,\n", "(or the weight $w(x)$ whose orthonormal polynomials are associated with $X$)\n", " the _truncated Jacobi matrix_ is\n", "$$\n", "X_n := \\begin{bmatrix} a_0 & b_0 \\\\\n", " b_0 & ⋱ & ⋱ \\\\\n", " & ⋱ & a_{n-2} & b_{n-2} \\\\\n", " && b_{n-2} & a_{n-1} \\end{bmatrix} ∈ ℝ^{n × n}\n", "$$\n", "\n", "\n", "\n", "**Lemma (zeros)** The zeros $x_1, …,x_n$ of an orthonormal polynomial $q_n(x)$\n", "are the eigenvalues of the truncated Jacobi matrix $X_n$.\n", "More precisely,\n", "$$\n", "X_n Q_n = Q_n \\begin{bmatrix} x_1 \\\\ & ⋱ \\\\ && x_n \\end{bmatrix}\n", "$$\n", "for the orthogonal matrix\n", "$$\n", "Q_n = \\begin{bmatrix}\n", "q_0(x_1) & ⋯ & q_0(x_n) \\\\\n", "⋮ & ⋱ & ⋮ \\\\\n", "q_{n-1}(x_1) & ⋯ & q_{n-1}(x_n)\n", "\\end{bmatrix} \\begin{bmatrix} α_1^{-1} \\\\ & ⋱ \\\\ && α_n^{-1} \\end{bmatrix}\n", "$$\n", "where $α_j = \\sqrt{q_0(x_j)^2 + ⋯ + q_{n-1}(x_j)^2}$.\n", "\n", "**Proof**\n", "\n", "We construct the eigenvector (noting $b_{n-1} q_n(x_j) = 0$):\n", "$$\n", "X_n \\begin{bmatrix} q_0(x_j) \\\\ ⋮ \\\\ q_{n-1}(x_j) \\end{bmatrix} =\n", "\\begin{bmatrix} a_0 q_0(x_j) + b_0 q_1(x_j) \\\\\n", " b_0 q_0(x_j) + a_1 q_1(x_j) + b_1 q_2(x_j) \\\\\n", "⋮ \\\\\n", "b_{n-3} q_{n-3}(x_j) + a_{n-2} q_{n-2}(x_j) + b_{n-2} q_{n-1}(x_j) \\\\\n", "b_{n-2} q_{n-2}(x_j) + a_{n-1} q_{n-1}(x_j) + b_{n-1} q_n(x_j)\n", "\\end{bmatrix} = x_j \\begin{bmatrix} q_0(x_j) \\\\\n", " q_1(x_j) \\\\\n", "⋮ \\\\\n", "q_{n-1}(x_j)\n", "\\end{bmatrix}\n", "$$\n", "The result follows from normalising the eigenvectors. Since $X_n$ is symmetric\n", "the eigenvector matrix is orthogonal.\n", "\n", "∎\n", "\n", "**Example (Chebyshev roots)** Consider $T_n(x) = \\cos n {\\rm acos}\\, x$. The roots \n", "are $x_j = \\cos θ_j$ where $θ_j = (j-1/2)π/n$ for $j = 1,…,n$ are the roots of $\\cos n θ$\n", "that are inside $[0,π]$. 
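The lemma can be verified numerically. A minimal sketch, using the symmetric Chebyshev Jacobi matrix with zero diagonal and off-diagonals $1/\sqrt{2}, 1/2, 1/2, …$ (constructed in the example below); the choice `n = 10` is illustrative:

```julia
# Sketch: zeros of T_n vs. eigenvalues of the n×n truncated Chebyshev Jacobi matrix.
using LinearAlgebra
n = 10
Xₙ = SymTridiagonal(zeros(n), [1/sqrt(2); fill(1/2, n-2)])  # truncated Jacobi matrix
λ = eigvals(Xₙ)                            # eigenvalues, sorted ascending
x = sort([cos((j-1/2)*π/n) for j = 1:n])   # roots of T_n, sorted to match
norm(λ - x)                                # ≈ 0
```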
\n", "\n", "Consider the $n = 3$ case where we have\n", "$$\n", "x_1,x_2,x_3 = \\cos(π/6),\\cos(π/2),\\cos(5π/6) = \\sqrt{3}/2,0,-\\sqrt{3}/2\n", "$$\n", "We also have from the 3-term recurrence:\n", "$$\n", "\\begin{align*}\n", "T_0(x) &= 1 \\\\\n", "T_1(x) &= x \\\\\n", "T_2(x) &= 2x T_1(x) - T_0(x) = 2x^2-1 \\\\\n", "T_3(x) &= 2x T_2(x) - T_1(x) = 4x^3-3x\n", "\\end{align*}\n", "$$\n", "We orthonormalise by rescaling\n", "$$\n", "\\begin{align*}\n", "q_0(x) &= 1/\\sqrt{π} \\\\\n", "q_k(x) &= T_k(x) \\sqrt{2}/\\sqrt{π}\n", "\\end{align*}\n", "$$\n", "so that the Jacobi matrix is symmetric:\n", "$$\n", "x [q_0(x)|q_1(x)|⋯] = [q_0(x)|q_1(x)|⋯] \\underbrace{\\begin{bmatrix} 0 & 1/\\sqrt{2} \\\\\n", " 1/\\sqrt{2} & 0 & 1/2 \\\\\n", " &1/2 & 0 & 1/2 \\\\\n", " & & 1/2 & 0 & ⋱ \\\\\n", " & && ⋱ & ⋱\n", "\\end{bmatrix}}_X\n", "$$\n", "We can then confirm that we have constructed an eigenvector/eigenvalue of the $3 × 3$ truncation of the Jacobi matrix,\n", "e.g. at $x_2 = 0$:\n", "$$\n", "\\begin{bmatrix} \n", "0 & 1/\\sqrt{2} \\\\\n", "1/\\sqrt{2} & 0 & 1/2 \\\\\n", " & 1/2 & 0\\end{bmatrix} \\begin{bmatrix} q_0(0) \\\\ q_1(0) \\\\ q_2(0) \n", " \\end{bmatrix} = {1 \\over \\sqrt π} \\begin{bmatrix} \n", "0 & 1/\\sqrt{2} \\\\\n", "1/\\sqrt{2} & 0 & 1/2 \\\\\n", " & 1/2 & 0\\end{bmatrix} \\begin{bmatrix} 1 \\\\ 0 \\\\ -\\sqrt{2}\n", " \\end{bmatrix} =\\begin{bmatrix} 0 \\\\ 0 \\\\ 0\n", " \\end{bmatrix}\n", "$$\n", "\n", "\n", "\n", "## 3. 
Interpolatory quadrature rules\n", "\n", "**Definition (interpolatory quadrature rule)** Given a weight $w(x)$ and a set of distinct points $𝐱 = [x_1,…,x_n]$\n", "the interpolatory quadrature rule is:\n", "$$\n", "Σ_n^{w,𝐱}[f] := ∑_{j=1}^n w_j f(x_j)\n", "$$\n", "where\n", "$$\n", "w_j := ∫_a^b ℓ_j(x) w(x) {\\rm d} x\n", "$$\n", "\n", "\n", "**Proposition (interpolatory quadrature is exact for polynomials)** \n", "Interpolatory quadrature is exact for all polynomials $p$ of degree at most $n-1$:\n", "$$\n", "∫_a^b p(x) w(x) {\\rm d}x = Σ_n^{w,𝐱}[p]\n", "$$\n", "\n", "**Proof**\n", "The result follows since, by uniqueness of the interpolatory polynomial:\n", "$$\n", "p(x) = ∑_{j=1}^n p(x_j) ℓ_j(x)\n", "$$\n", "and integrating both sides against $w(x)$ gives the result by the definition of the weights $w_j$.\n", "\n", "∎\n", "\n", "**Example (arbitrary points)** Find the interpolatory quadrature rule for $w(x) = 1$ on $[0,1]$ with points $[x_1,x_2,x_3] = [0,1/4,1]$.\n", "We have:\n", "$$\n", "\\begin{align*}\n", "w_1 &= \\int_0^1 w(x) ℓ_1(x) {\\rm d}x = \\int_0^1 {(x-1/4)(x-1) \\over (-1/4)(-1)}{\\rm d}x = -1/6 \\\\\n", "w_2 &= \\int_0^1 w(x) ℓ_2(x) {\\rm d}x = \\int_0^1 {x(x-1) \\over (1/4)(-3/4)}{\\rm d}x = 8/9 \\\\\n", "w_3 &= \\int_0^1 w(x) ℓ_3(x) {\\rm d}x = \\int_0^1 {x(x-1/4) \\over 3/4}{\\rm d}x = 5/18\n", "\\end{align*}\n", "$$\n", "That is, we have\n", "$$\n", "Σ_n^{w,𝐱}[f] = -{f(0) \\over 6} + {8f(1/4) \\over 9} + {5 f(1) \\over 18}\n", "$$\n", "This is indeed exact for polynomials up to degree $2$ (and no more):\n", "$$\n", "Σ_n^{w,𝐱}[1] = 1, Σ_n^{w,𝐱}[x] = 1/2, Σ_n^{w,𝐱}[x^2] = 1/3, Σ_n^{w,𝐱}[x^3] = 7/24 ≠ 1/4.\n", "$$\n", "\n", "**Example (Chebyshev roots)** Find the interpolatory quadrature rule for $w(x) = 1/\\sqrt{1-x^2}$ on $[-1,1]$ with points equal to the\n", "roots of $T_3(x)$. This is a special case of Gaussian quadrature which we will approach in another way below. We use:\n", "$$\n", "\\int_{-1}^1 w(x) {\\rm d}x = π, \\int_{-1}^1 xw(x) {\\rm d}x = 0, \\int_{-1}^1 x^2 w(x) {\\rm d}x = {π \\over 2}\n", "$$\n", "Recall from before that $x_1,x_2,x_3 = \\sqrt{3}/2,0,-\\sqrt{3}/2$. 
Thus we have:\n", "$$\n", "\\begin{align*}\n", "w_1 &= \\int_{-1}^1 w(x) ℓ_1(x) {\\rm d}x = \\int_{-1}^1 {x(x+\\sqrt{3}/2) \\over (\\sqrt{3}/2) \\sqrt{3} \\sqrt{1-x^2}}{\\rm d}x = {π \\over 3} \\\\\n", "w_2 &= \\int_{-1}^1 w(x) ℓ_2(x) {\\rm d}x = \\int_{-1}^1 {(x-\\sqrt{3}/2)(x+\\sqrt{3}/2) \\over (-3/4)\\sqrt{1-x^2}}{\\rm d}x = {π \\over 3} \\\\\n", "w_3 &= \\int_{-1}^1 w(x) ℓ_3(x) {\\rm d}x = \\int_{-1}^1 {(x-\\sqrt{3}/2) x \\over (-\\sqrt{3})(-\\sqrt{3}/2) \\sqrt{1-x^2}}{\\rm d}x = {π \\over 3}\n", "\\end{align*}\n", "$$\n", "(It's not a coincidence that they are all the same but this will differ for roots of other OPs.) \n", "That is, we have\n", "$$\n", "Σ_n^{w,𝐱}[f] = {π \\over 3}(f(\\sqrt{3}/2) + f(0) + f(-\\sqrt{3}/2))\n", "$$\n", "This is indeed exact for polynomials up to degree $n-1=2$, but it goes all the way up to $2n-1 = 5$:\n", "$$\n", "\\begin{align*}\n", "Σ_n^{w,𝐱}[1] &= π, Σ_n^{w,𝐱}[x] = 0, Σ_n^{w,𝐱}[x^2] = {π \\over 2}, \\\\\n", "Σ_n^{w,𝐱}[x^3] &= 0, Σ_n^{w,𝐱}[x^4] = {3 π \\over 8}, Σ_n^{w,𝐱}[x^5] = 0 \\\\\n", "Σ_n^{w,𝐱}[x^6] &= {9 π \\over 32} ≠ {5 π \\over 16}\n", "\\end{align*}\n", "$$\n", "We shall explain this miracle next.\n", "\n", "\n", "\n", "\n", "\n", "## 4. 
Gaussian quadrature\n", "\n", "Gaussian quadrature is the interpolatory quadrature rule corresponding\n", "to the grid $x_j$ defined as the roots of the orthonormal polynomial $q_n(x)$.\n", "We shall see that it is exact for polynomials up to degree $2n-1$, i.e., double\n", "the degree achieved by interpolatory quadrature rules on other grids.\n", "\n", "\n", "\n", "**Definition (Gauss quadrature)** Given a weight $w(x)$, the Gauss quadrature rule is:\n", "$$\n", "∫_a^b f(x)w(x) {\\rm d}x ≈ \\underbrace{∑_{j=1}^n w_j f(x_j)}_{Σ_n^w[f]}\n", "$$\n", "where $x_1,…,x_n$ are the roots of the orthonormal polynomial $q_n(x)$ and \n", "$$\n", "w_j := {1 \\over α_j^2} = {1 \\over q_0(x_j)^2 + ⋯ + q_{n-1}(x_j)^2}.\n", "$$\n", "Equivalently, $x_1,…,x_n$ are the eigenvalues of $X_n$ and\n", "$$\n", "w_j = \\left(∫_a^b w(x) {\\rm d}x\\right) Q_n[1,j]^2.\n", "$$\n", "(Note we have $q_0(x)^2 ∫_a^b w(x) {\\rm d} x = 1$ as $q_0$ is constant.)\n", "\n", "In analogy to how Fourier modes are orthogonal with respect to the Trapezium rule,\n", "orthogonal polynomials are orthogonal with respect to Gaussian quadrature:\n", "\n", "**Lemma (Discrete orthogonality)**\n", "For $0 ≤ ℓ,m ≤ n-1$, the orthonormal polynomials $q_n(x)$ satisfy\n", "$$\n", "Σ_n^w[q_ℓ q_m] = δ_{ℓm}\n", "$$\n", "\n", "**Proof**\n", "$$\n", "Σ_n^w[q_ℓ q_m] = ∑_{j=1}^n {q_ℓ(x_j) q_m(x_j) \\over α_j^2}\n", "= \\left[q_ℓ(x_1)/ α_1 | ⋯ | {q_ℓ(x_n)/ α_n}\\right] \n", "\\begin{bmatrix}\n", "q_m(x_1)/α_1 \\\\\n", "⋮ \\\\\n", "q_m(x_n)/α_n \\end{bmatrix} = 𝐞_{ℓ+1}^⊤ Q_n Q_n^⊤ 𝐞_{m+1} = δ_{ℓm}\n", "$$\n", "\n", "∎\n", "\n", "Just as approximating Fourier coefficients using the Trapezium rule gives a way of\n", "interpolating at the grid, so does Gaussian quadrature:\n", "\n", "**Theorem (interpolation via quadrature)**\n", "For the orthonormal polynomials $q_n(x)$,\n", "$$\n", "f_n(x) = ∑_{k=0}^{n-1} c_k^n q_k(x)\\hbox{ for } c_k^n := Σ_n^w[f q_k]\n", "$$\n", "interpolates $f(x)$ at the Gaussian quadrature points $x_1,…,x_n$.\n", "\n", "**Proof**\n", "\n", "Consider the 
Vandermonde-like matrix:\n", "$$\n", "Ṽ := \\begin{bmatrix} q_0(x_1) & ⋯ & q_{n-1}(x_1) \\\\\n", " ⋮ & ⋱ & ⋮ \\\\\n", " q_0(x_n) & ⋯ & q_{n-1}(x_n) \\end{bmatrix}\n", "$$\n", "and define\n", "$$\n", "Q_n^w := Ṽ^⊤ \\begin{bmatrix} w_1 \\\\ &⋱ \\\\&& w_n \\end{bmatrix} = \\begin{bmatrix} w_1 q_0(x_1) & ⋯ & w_n q_0(x_n) \\\\\n", " ⋮ & ⋱ & ⋮ \\\\\n", " w_1 q_{n-1}(x_1) & ⋯ & w_n q_{n-1}(x_n) \\end{bmatrix}\n", "$$\n", "so that\n", "$$\n", "\\begin{bmatrix}\n", "c_0^n \\\\\n", "⋮ \\\\\n", "c_{n-1}^n \\end{bmatrix} = Q_n^w \\begin{bmatrix} f(x_1) \\\\ ⋮ \\\\ f(x_n) \\end{bmatrix}.\n", "$$\n", "Note that if $p(x) = [q_0(x) | ⋯ | q_{n-1}(x)] 𝐜$ then\n", "$$\n", "\\begin{bmatrix}\n", "p(x_1) \\\\\n", "⋮ \\\\\n", "p(x_n)\n", "\\end{bmatrix} = Ṽ 𝐜\n", "$$\n", "But by discrete orthogonality we see that (similar to the Fourier case)\n", "$$\n", "Q_n^w Ṽ = \\begin{bmatrix} Σ_n^w[q_0 q_0] & ⋯ & Σ_n^w[q_0 q_{n-1}]\\\\\n", " ⋮ & ⋱ & ⋮ \\\\\n", " Σ_n^w[q_{n-1} q_0] & ⋯ & Σ_n^w[q_{n-1} q_{n-1}]\n", " \\end{bmatrix} = I_n\n", "$$\n", "Hence $Q_n^w = Ṽ^{-1}$, so the samples of $f_n$ at the grid are $Ṽ Q_n^w [f(x_1),…,f(x_n)]^⊤ = [f(x_1),…,f(x_n)]^⊤$,\n", "i.e., $f_n$ interpolates $f$.\n", "\n", "∎\n", "\n", "**Corollary** Gaussian quadrature is an interpolatory quadrature rule\n", "with the interpolation points equal to the roots of $q_n$:\n", "$$\n", "Σ_n^w[f] = Σ_n^{w,𝐱}[f]\n", "$$\n", "**Proof**\n", "We want to show it's the same as integrating the interpolatory polynomial $f_n$. Since $q_0$ is a constant,\n", "$$\n", "\\int_a^b f_n(x) w(x) {\\rm d}x = {1 \\over q_0} \\sum_{k=0}^{n-1} c_k^n \\int_a^b q_k(x) q_0(x) w(x) {\\rm d}x\n", "= {c_0^n \\over q_0} = Σ_n^w[f],\n", "$$\n", "where the last equality follows from $c_0^n = Σ_n^w[f q_0] = q_0 Σ_n^w[f]$.\n", "∎\n", "\n", "\n", "\n", "A consequence of being an interpolatory quadrature rule is that it is exact for all\n", "polynomials of degree at most $n-1$. The _miracle_ of Gaussian quadrature is that it is exact for twice\n", "as many!\n", "\n", "\n", "\n", "**Theorem (Exactness of Gauss quadrature)** If $p(x)$ is a degree $2n-1$ polynomial then\n", "Gauss quadrature is exact:\n", "$$\n", "∫_a^b p(x)w(x) {\\rm d}x = Σ_n^w[p].\n", "$$\n", "\n", "**Proof**\n", "Using the polynomial division algorithm (e.g. 
by matching terms) we can write\n", "$$\n", "p(x) = q_n(x) s(x) + r(x)\n", "$$\n", "where $s$ and $r$ have degree at most $n-1$ and $q_n(x)$ is the degree $n$ orthonormal polynomial. Then we have:\n", "$$\n", "\\begin{align*}\n", "Σ_n^w[p] &= \\underbrace{Σ_n^w[q_n s]}_{\\hbox{$0$ since evaluating $q_n$ at its zeros}} + Σ_n^w[r] = ∫_a^b r(x) w(x) {\\rm d}x \\\\\n", "&= \\underbrace{∫_a^b q_n(x)s(x) w(x) {\\rm d}x}_{\\hbox{$0$ since $s$ is degree $< n$}} + ∫_a^b r(x) w(x) {\\rm d}x = ∫_a^b p(x) w(x) {\\rm d}x.\n", "\\end{align*}\n", "$$\n", "\n", "∎" ] } ], "metadata": {}, "nbformat": 4, "nbformat_minor": 2 }
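To illustrate the theorem, a minimal numerical sketch, assuming the Chebyshev weight $w(x) = 1/\sqrt{1-x^2}$ with the Jacobi matrix entries from the example above; the choice `n = 5` is illustrative:

```julia
# Sketch: Gauss quadrature from the eigendecomposition of the truncated Jacobi
# matrix, for the Chebyshev weight w(x) = 1/sqrt(1-x^2) on [-1,1].
using LinearAlgebra
n = 5
Xₙ = SymTridiagonal(zeros(n), [1/sqrt(2); fill(1/2, n-2)])
λ, Q = eigen(Xₙ)                  # Gauss points are the eigenvalues
w = π * Q[1,:].^2                 # w_j = (∫w dx) Q_n[1,j]², with ∫w dx = π here
Σ = f -> sum(w .* f.(λ))
# exact for degree ≤ 2n-1 = 9, e.g. ∫ x⁴ w(x) dx = 3π/8:
abs(Σ(x -> x^4) - 3π/8)           # ≈ 0
```

For the Chebyshev weight the computed weights all come out as $π/n$, matching the $n = 3$ example above.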