{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "## Homework 4\n", "\n", "## Linear Algebra and Data Modeling\n", "
\n", "This notebook is arranged in cells. Texts are usually written in the markdown cells, and here you can use html tags (make it bold, italic, colored, etc). You can double click on this cell to see the formatting.
\n", "
\n", "The ellipsis (...) are provided where you are expected to write your solution but feel free to change the template (not over much) in case this style is not to your taste.
\n", "
\n", "Hit \"Shift-Enter\" on a code cell to evaluate it. Double click a Markdown cell to edit.
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "***\n", "### Link Okpy" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "from client.api.notebook import Notebook\n", "ok = Notebook('hw4.ok')\n", "_ = ok.auth(inline = True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Imports" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import numpy as np\n", "from scipy.integrate import quad\n", "#For plotting\n", "import matplotlib.pyplot as plt\n", "%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "***" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Problem 1 - Gaussian Elimination\n", "\n", "Consider the following circuit of resistors.
\n", "![alt text](Circuit.png \"Title\")\n", "
\n", "All the resistors have the same resistance $R$. The power rail at the top is at voltage $V_+$ = 5 V. What are the other four voltages, $V_1$ to $V_4$?\n", "To answer this question we use Ohm’s law and the Kirchhoff current law, which says that the total net current flow out of (or into) any junction in a circuit must be zero. Thus for the junction at voltage $V_1$, for instance, we have\n", "$$ \\frac{V_1 - V_2}{R} + \\frac{V_1 - V_3}{R} + \\frac{V_1 - V_4}{R} + \\frac{V_1 - V_+}{R} = 0 $$\n", "
or equivalently
\n", "\\begin{align}\n", "4V_1 - V_2 - V_3 - V_4 & = V_+ \\\\\n", "-V_1 + 3V_3 & = V_+ \\\\\n", "-V_1 + 3V_2 - V_4 & = 0 \\\\\n", "-V_1 - V_2 + 4V_4 & = 0 \\\\\n", "\\end{align}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " 1. Write the above system of equations into a matrix form $Ax = b$. Define the coefficient matrix $A$ and the constant matrix $b$. (Note: $x$ is a variable matrix: [$V_1, V_2, V_3, V_4$])
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "A = ...\n", "b = ..." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " 2. Write a program to solve the four resulting equations using Gaussian elimination and hence find the four voltages. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Gaussian Elimination\n", "\n", "def GaussianElimination(A, b):\n", " \n", " ...\n", " \n", " return ...\n", "\n", "x = GaussianElimination(A, b)\n", "\n", "print('V1 =', x[0], ', V2 =', x[1], ', V3 =', x[2], ', V4 =', x[3])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " 3. Modify Part 2 to incorporate partial pivoting. (Make sure that you get the same answer.)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def GaussianElimination_pivot(A, b):\n", " \n", " ...\n", " \n", " return ...\n", "\n", "x = GaussianElimination_pivot(A, b)\n", "\n", "print('V1 =', x[0], ', V2 =', x[1], ', V3 =', x[2], ', V4 =', x[3])" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "***" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Problem 2 - Solving Least Squares Using Normal Equations and SVD\n", "\n", "(Reference - NR 15.4) We fit a set of 50 data points $(x_i, y_i)$ to a polynomial $y(x) = a_0 + a_1x + a_2x^2 + a_3x^3$. (Note that this problem is linear in $a_i$ but nonlinear in $x_i$). The uncertainty $\\sigma_i$ associated with each measurement $y_i$ is known, and we assume that the $x_i$'s are known exactly. To measure how well the model agrees with the data, we use the chi-square merit function:
\n", "$$ \\chi^2 = \\sum_{i=0}^{N-1} \\big( \\frac{y_i-\\sum_{k=0}^{M-1}a_k x^k}{\\sigma_i} \\big)^2. $$\n", "
\n", "where N = 50 and M = 4. Here, $1, x, ... , x^3$ are the basis functions.\n", "

\n", " 1. Plot data (make sure to include error bars). (Hint - https://matplotlib.org/api/_as_gen/matplotlib.axes.Axes.errorbar.html)
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Load a given 2D data\n", "data = np.loadtxt(\"HW4_Problem2_data.dat\")\n", "x = data[:,0]\n", "y = data[:,1]\n", "sig_y = data[:,2]" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Make plot\n", "plt.figure(figsize = (10, 7))\n", "# Scatter plot\n", "..." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We will pick as best parameters those that minimize $\\chi^2$.

\n", "First, let $\\bf A$ be a matrix whose $N \\times M$ components are constructed from the $M$ basis functions evaluated at the $N$ abscissas $x_i$, and from the $N$ measurement errors $\\sigma_i$, by the prescription\n", "$$ A_{ij} = \\frac{X_j(x_i)}{\\sigma_i} $$\n", "
where $X_0(x) = 1,\\ X_1(x) = x,\\ X_2(x) = x^2,\\ X_3(x) = x^3$. We call this matrix $\\bf A$ the design matrix.\n", "

\n", "Also, define a vector $\\bf b$ of length $N$ by\n", "$$ b_i = \\frac{y_i}{\\sigma_i} $$\n", "
and denote the $M$ vector whose components are the parameters to be fitted ($a_0, a_1, a_2, a_3$) by $\\bf a$.\n", "

\n", " 2. Define the design matrix A. (Hint: Its dimension should be NxM = 50x4.) Also, define the vector b.
\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Define A\n", "A = ...\n", "# Define b\n", "b = ..." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Minimize $\\chi^2$ by differentiating it with respect to all $M$ parameters $a_k$ vaishes. This condition yields the matrix equation
\n", "$$ \\sum_{j=0}^{M-1} \\alpha_{kj}a_j = \\beta_k$$ \n", "
where $\\bf \\boldsymbol \\alpha = A^T \\cdot A$ and $\\bf \\boldsymbol \\beta = A^T \\cdot b$ ($\\boldsymbol \\alpha$ is an $M \\times M$ matrix, and $\\boldsymbol \\beta$ is a vector of length $M$). This is the normal equation of the least squares problem. In matrix form, the normal equations can be written as:\n", "$$ \\bf \\boldsymbol \\alpha \\cdot a = \\boldsymbol \\beta. $$\n", "

\n", "This can be solved for the vector of parameters $\\bf a$ by linear algebra numerical methods.\n", "

\n", " 3. Define the matrix alpha and vector beta.
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Transpose of the matrix A\n", "A_transpose = ...\n", "\n", "# alpha matrix\n", "alpha = ...\n", "# beta vector\n", "beta = ..." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " 4. We have $ \\bf \\boldsymbol \\alpha \\cdot a = \\boldsymbol \\beta. $ Solve for $\\bf a$ using (1) \"GaussianElimination_pivot\" from Part 1 and (2) LU decomposition and forward subsitution and backsubstitution. Plot the best-fit line on top of the data.
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Using the Gaussian elimination with partial pivoting\n", "a = ...\n", "\n", "print('Using Gaussian Elimination:')\n", "print('a0 =', a[0], ', a1 =', a[1], ', a2 =', a[2], ', a3 =', a[3])" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# \"lu\" does LU decomposition with pivot. Reference - https://docs.scipy.org/doc/scipy-0.14.0/reference/generated/scipy.linalg.lu_factor.html\n", "from scipy.linalg import lu\n", "\n", "def solve_lu_pivot(A, B):\n", " # LU decomposition with pivot\n", " L, U = lu(A, permute_l=True)\n", " \n", " N = len(B)\n", " \n", " # forward substitution: We have Ly = B. Solve for y\n", " ...\n", "\n", " # backward substitution: We have y = Ux. Solve for x.\n", " ...\n", " \n", " return ...\n", "\n", "a = ...\n", "\n", "print('Using LU Decomposition:')\n", "print('a0 =', a[0], ', a1 =', a[1], ', a2 =', a[2], ', a3 =', a[3])" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Make plot\n", "..." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The inverse matrix $\\bf C = \\boldsymbol \\alpha^{-1}$ is called the covariance matrix, which is closely related to the probable uncertainties of the estimated parameters $\\bf a$. To estimate these uncertainties, we compute the variance associated with the estimate $a_j$. Following NR p.790, we obtain:

\n", "$$ \\sigma^2(a_j) = \\sum_{k=0}^{M-1} \\sum_{l=0}^{M-1} C_{jk} C_{jl} \\alpha_{kl} $$\n", "
\n", " 5. Compute the error (standard deviation - square root of the variance) on the fitted parameters.
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "from scipy.linalg import inv\n", "# Covariance matrix\n", "C = inv(alpha)\n", "\n", "...\n", "\n", "sigma_a0 = ...\n", "sigma_a1 = ...\n", "sigma_a2 = ...\n", "sigma_a3 = ...\n", "\n", "print('Error: on a0 =', sigma_a0, ', on a1 =', sigma_a1, ', on a2 =', sigma_a2, ', on a3 =', sigma_a3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, instead of using the normal equations, we use singular value decomposition (SVD) to find the solution of least squares. Please read Ch. 15 of NR for more details. Remember that we have the $N \\times M$ design matrix $\\bf A$ and the vector $\\bf b$ of length $N$. We wish to mind $\\bf a$ which minimizes $\\chi^2 = |\\bf A \\cdot a - b|^2$.\n", "

\n", "Using SVD, we can decompose $\\bf A$ as the product of an $N \\times M$ column-orthogonal matrix $\\bf U$, an $M \\times M$ diagonal matrix $\\bf S$ (with positive or zero elements - the \"singular\" values), and the transpose of an $M \\times M$ orthogonal matrix $\\bf V$. ($\\bf A = USV^{T}$).
\n", "Let $\\bf U_{(i)}$ and $\\bf V_{(i)}$ denote the columns of $\\bf U$ and $\\bf V$ respectively (Note: We get $M$ number of vectors of length $M$.) $\\bf S_{(i,i)}$ are the $i$th diagonal elements (singular values) of $\\bf S$. Then, the solution of the above least squares problem can be written as:\n", "
\n", "$$ \\bf a = \\sum_{i=1}^M \\big( \\frac{U_{(i)} \\cdot b}{S_{(i,i)}} \\big) V_{(i)}. $$\n", "

\n", "The variance in the estimate of a parameter $a_j$ is given by:\n", "$$ \\sigma^2(a_j) = \\sum_{i=1}^M \\big( \\frac{V_{ji}}{S_{ii}} \\big)^2 $$\n", "
\n", "and the covariance:\n", "$$ \\mathrm{Cov}(a_j, a_k) = \\sum_{i=1}^M \\big( \\frac{V_{ji}V_{ki}}{S_{ii}^2} \\big). $$\n", "

\n", " 6. Decompose the design matrix A using SVD. Estimate the parameter $a_i$'s and its variance.
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Reference - https://docs.scipy.org/doc/numpy-1.13.0/reference/generated/numpy.linalg.svd.html\n", "from scipy.linalg import svd\n", "\n", "# Decompose A\n", "# Note: S, in this case, is a vector of length M, which contains the singular values.\n", "U, S, VT = svd(A, full_matrices=False)\n", "V = V.T\n", "\n", "# Solve for a\n", "...\n", "a_from_SVD = ...\n", "\n", "print('Using SVD:')\n", "print('a0 =', a_from_SVD[0], ', a1 =', a_from_SVD[1], ', a2 =', a_from_SVD[2], ', a3 =', a_from_SVD[3])" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Error on a\n", "...\n", "sigma_a_SVD = ...\n", "\n", "\n", "print('Error: on a0 =', sigma_a_SVD[0], ', on a1 =', sigma_a_SVD[1], ', on a2 =', sigma_a_SVD[2], ', on a3 =', sigma_a_SVD[3])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Suppose that you are only interested in the parameters $a_0$ and $a_1$. We can plot the 2-dimensional confidence region ellipse for these parameters by building the covariance matrix:\n", "$$ \\mathrm{C'} = \\binom{\\sigma({a_0})^2\\ \\ \\ \\ \\ \\ \\mathrm{Cov}({a_0, a_1})}{\\mathrm{Cov}({a_0, a_1}) \\ \\ \\ \\ \\ \\ \\sigma({a_1})^2} $$\n", "

\n", "The lengths of the ellipse axes are the square root of the eigenvalues of the covariance matrix, and we can calculate the counter-clockwise rotation of the ellipse with the rotation angle:\n", "$$ \\theta = \\frac{1}{2} \\mathrm{arctan}\\Big( \\frac{2\\cdot \\mathrm{Cov}({a_0, a_1})}{\\sigma({a_0})^2-\\sigma({a_1})^2} \\Big) $$\n", "
\n", "Then, we multiply the axis lengths by some factor depending on the confidence level we are interested in. For 68%, this scale factor is $\\sqrt{\\Delta \\chi^2} \\approx 1.52$.\n", "

\n", " 7. Compute the covariance between $a_0$ and $a_1$. Plot the 68% confidence region of the parameter $a_0$ and $a_1$.
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Compute the covariance\n", "...\n", "sigma_01 = ...\n", "\n", "# Build the covariance matrix\n", "CovM = np.array([[sigma_a_SVD[0], sigma_01],[sigma_01, sigma_a_SVD[1]]])\n", "\n", "from numpy.linalg import eigvals\n", "axis1 = 1.52*eigvals(CovM)[0]\n", "axis2 = 1.52*eigvals(CovM)[1]\n", "\n", "theta = np.arctan(2*sigma_01/(sigma_a_SVD[0]**2-sigma_a_SVD[1]**2))/2." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Plot the 1-sigma confidence region (https://stackoverflow.com/questions/32371996/python-matplotlib-how-to-plot-ellipse)\n", "from matplotlib.patches import Ellipse\n", "import matplotlib as mpl\n", "\n", "ell = mpl.patches.Ellipse(xy=[a[0], a[1]], width=axis1, height=axis2, angle = theta*180/np.pi)\n", "fig, ax = plt.subplots(figsize=(7,7))\n", "\n", "ax.add_patch(ell)\n", "ax.set_aspect('equal')\n", "ax.autoscale()\n", "plt.xlim(-0.7, 0.7)\n", "plt.ylim(1.9, 3.3)\n", "plt.grid(True)\n", "plt.xlabel('$a_0$')\n", "plt.ylabel('$a_1$')\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "***" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Problem 3 - Rational Function\n", "\n", "The method of finding least-squares fits using polynomials is widely used, but in some cases rational approximations behave better; they may work where the Taylor approximations do not converge.\n", "

\n", "Consider an exponential function $e^{-x}$. The Maclaurin series of this function is $1-x+x^2/2-x^3/6+x^4/24+O(x^5)$. However, this is a poor approximation for large $x$. To resolve this issue, we can instead write it as the rational function:\n", "$e^{-x} = \\frac{e^{-x/2}}{e^{x/2}} = \\frac{1-x/2+x^2/8}{1+x/2+x^2/8}$. (For more detail, read NR 5.12 - Pade Approximant)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAD8CAYAAAB0IB+mAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xl8VNX9//HXmZkkk30PCQn7aiCsYXcBUUQQtYBVRK1b\nKSpqv36rqG21/qy1tmpb+0Wtda8LLriggguIsihI2Al7WLPveybJzJzfHzekYU3IdmeSz1PnMZm5\nZ+793NwH79y599xzldYaIYQQHZ/F7AKEEEK0Dwl8IYToJCTwhRCik5DAF0KITkICXwghOgkJfCGE\n6CQk8IUQopOQwBdCiE5CAl8IIToJm1kLjoqK0j179jRr8UII4ZU2bdqUr7WObs5nTQv8nj17kpKS\nYtbihRDCKymljjT3s3JIRwghOolGA18p9apSKlcptbORdqOUUk6l1OzWK08IIURracoe/uvA1LM1\nUEpZgaeAr1uhJiGEEG2g0WP4WuvVSqmejTS7G1gCjGpJMbW1taSnp+NwOFoym07DbreTkJCAj4+P\n2aUIIbxAi0/aKqXigZ8Bk2hh4KenpxMcHEzPnj1RSrW0tA5Na01BQQHp6en06tXL7HKEEF6gNU7a\n/h1YqLV2N9ZQKTVPKZWilErJy8s7ZbrD4SAyMlLCvgmUUkRGRsq3ISFEk7VGt8xkYHFdSEcB05RS\nTq31Jyc31Fq/BLwEkJycfNpbbUnYN538roQQ56LFga+1rj+eoJR6Hfj8dGEvhBCihba81aKPNxr4\nSql3gYlAlFIqHXgU8AHQWr/YoqV7mIKCAiZPngxAdnY2VquV6GjjgraffvoJX19fM8sTQnRmbhd8\n+8cWzaIpvXTmNHVmWuubW1SNySIjI9m6dSsAf/jDHwgKCuI3v/nNCW201mitsVjkmjUhRDtK+xbK\nslo0C0mtJjhw4ACJiYnMnTuXQYMGcezYMcLCwuqnL168mNtvvx2AnJwcZs6cSXJyMqNHj2b9+vVm\nlS2E6Ei2vAX+ES2ahWlj6TTmsc9S2ZVZ2qrzTOwawqMzBjXrs3v27OHNN98kOTkZp9N5xnb33HMP\nDzzwAGPHjuXw4cNcccUV7Nx51ouUhRDi7CoKYM8XMPqXwJ+bPRuPDXxP06dPH5KTkxttt2LFCvbu\n3Vv/uqioiKqqKvz9/duyPCFER7bjA3DXwrC5dMjAb+6eeFsJDAys/9lisaD1f3uVNuwLr7WWE7xC\niNa19S2IGwaxg1s0GzmG3wwWi4Xw8HD279+P2+3m448/rp92ySWXsGjRovrXx08CCyFEs2Rtg+wd\nMPyGFs9KAr+ZnnrqKS677DLGjx9PQkJC/fuLFi1i3bp1DBkyhMTERP7973+bWKUQwutteRusfpDU\n8oGIVcNDE+0pOTlZn3wDlN27d3PeeeeZUo+3kt+ZEB2YsxqeGQB9LobZrwKglNqktW78hOJpyB6+\nEEJ4qt2fQVVRqxzOAQl8IYTwXJteh7Ae0Gtiq8xOAl8IITxRQRocXgMjboRWurJfAl8IITzR5jdA\nWWFY6xzOAQl8IYTwPM4a2PoO9J8KIXGtNlsJfCGE8DR7l0FFHoy8uVVn67FX2pqhtYZHdjqdREVF\nUVxc3Ga1CiE6sM1vQEgC9J3cqrOVwG+gKcMjtwWn04nNJptCCAEUHYa0VXDRQrBYW3XWckiniWbM\nmMHIkSMZNGgQL7/8MgAvvfTSCX8QXnjhBe6///4TPud2u7nvvvsYPHgwSUlJfPjhh4AxyNrEiRO5\n4oorSEpKar8VEUJ4ts1vGs+t1Pe+Ic/drVz+oDF+RGuKTYLLmzfS3BtvvEFERASVlZUkJycza9Ys\nrrvuOoYPH86f//xnbDYbr732Gm+88cYJn/vggw/YvXs327ZtIy8vj1GjRnHhhRcCkJKSwq5du+je\nvXuLV00I0QE4a4zA738ZhHVr9dnLHn4T/e1vf2Po0KGMGzeO9PR00tLSCAkJ4cILL2T58uWkpqZi\ntVpPGeZg7dq1zJkzB6vVSmxsLOeffz7Hh5QYN26chL0Q4r/2fGacrB11e5vM3nP38Ju5J94WVqxY\nwerVq1m/fj3+/v6cf/759UMi33777Tz77LP07NmTW2655Zzm23DIZSGEYOMrxpW1fVr3ZO1xsoff\nBCUlJURERODv709qaiobN26snzZhwgTS0tL44IMPuPbaa0/57AUXXMDixYtxu93k5OSwbt26Jt1I\nRQjRyeTuhiPrIPnWVruy9mSeu4fvQaZPn85LL71EYmIiAwYMYMyYMSdMnz17Nnv27CE0NPSUz86e\nPZv169czZMgQlFI8++yzxMTEtFfpQghvkfIqWH3b5GTtcY0Oj6yUehW4AsjVWp9yuxWl1FxgIaCA\nMuAOrfW2xhbckYZHnjp1Kg899BAXXXRRuy/bW39nQogGqsvhmYEwcBrMfOmsTdt6eOTXgalnmX4I\nuEhrnQQ8Dpy92g6koKCAfv36ER4ebkrYCyE6iB0fQE0ZJN/Wpotp9JCO1nq1UqrnWab/0ODleiDh\nTG07msjISPbv3292GUIIb6a1cbK2y2DoNrpNF9XaZwZuA5afaaJSap5SKkUplZKXl9fKixZCCC90\n9EfI2WF0xVSqTRfVaoGvlJqEEfgLz9RGa/2S1jpZa518fIwaIYTo1Db8C+xhMOTUXn6trVUCXyk1\nBHgZuEprXdAa8xRCiA6vJN24jeGIm8A3oM0X1+LAV0p1Bz4CbtRa72t5SUII0UlsfAXQbXZl7cka\nDXyl1LvAj8AApVS6Uuo2pdR8pdT8uiaPAJHA80qprUqplDPOzAtYrVaGDRvG4MGDmTFjRqNDHBcX\nF/P888/Xv87MzGT27NmtXtfEiRM5uRurEMKL1VYZ96wdMA3Ce7TLIhsNfK31HK11nNbaR2udoLV+\nRWv9otb6xbrpt2utw7XWw+oeXn0Zqb+/P1u3bmXnzp1ERESwaNGis7Y/OfC7du1aPyKmEEKc0c4l\nUFUIY37VbouUoRXOYty4cWRkZABQX
l7O5MmTGTFiBElJSXz66acAPPjgg6SlpTFs2DDuv/9+Dh8+\nzODBxvVpDoeDW265haSkJIYPH86qVasAeP3115k5cyZTp06lX79+PPDAA/XLvOOOO0hOTmbQoEE8\n+uij7bzGQoh2oTVseBFiEqHnBe22WI8dWuGpn55iT+GeVp3nwIiBLBx9xk5EJ3C5XKxcuZLbbjMu\nhLDb7Xz88ceEhISQn5/P2LFjufLKK/nzn//Mzp0762+ccvjw4fp5LFq0CKUUO3bsYM+ePUyZMoV9\n+4zTHFu3bmXLli34+fkxYMAA7r77brp168YTTzxBREQELpeLyZMns337doYMGdKqvwchhMmO/GAM\n/z7jH23eFbMh2cM/SVVVFcOGDSM2NpacnBwuvfRSALTWPPzwwwwZMoRLLrmEjIwMcnJyzjqvtWvX\ncsMNxrgYAwcOpEePHvWBP3nyZEJDQ7Hb7SQmJnLkyBEA3n//fUaMGMHw4cNJTU1l165dbbi2QghT\nrH8e/CMg6eftuliP3cNv6p54azt+DL+yspLLLruMRYsWcc899/D222+Tl5fHpk2b8PHxoWfPnvVD\nJDeHn59f/c9WqxWn08mhQ4d4+umn2bhxI+Hh4dx8880tWoYQwgMVpMGeL+DC37RLV8yGZA//DAIC\nAnjuued45plncDqdlJSUEBMTg4+PD6tWrarfIw8ODqasrOy087jgggt4++23Adi3bx9Hjx5lwIAB\nZ1xmaWkpgYGBhIaGkpOTw/LlZ7xoWQjhrTa8CFYfGPXLdl+0BP5ZDB8+nCFDhvDuu+8yd+5cUlJS\nSEpK4s0332TgwIGAMZ7OhAkTGDx48Cn3s73zzjtxu90kJSVx7bXX8vrrr5+wZ3+yoUOHMnz4cAYO\nHMj111/PhAkT2nT9hBDtrKoItrwFSddAcJd2X3yjwyO3lY40PLKZ5HcmhBdZ8yysfAzmr4PYU0ab\nb5K2Hh5ZCCFESzlr4KeXoPekZod9S0ngCyFEe0j9GMqyYNwC00rwuMA36xCTN5LflRBeQmv44Z8Q\nPRD6ts0NypvCowLfbrdTUFAgQdYEWmsKCgqw2+1mlyKEaMyBlcaY9+PvadcLrU7mUf3wExISSE9P\nR26O0jR2u52EhE5zgzEhvNe6v0NIvNE7x0QeFfg+Pj706tXL7DKEEKL1pG+Cw2tgyhNg8zW1FI86\npCOEEB3Our+BPRRG/sLsSiTwhRCizeTvh92fG1fV+gWbXY0EvhBCtJkfngOrL4yZ33jbdiCBL4QQ\nbaE0C7YthuFzISja7GoACXwhhGgbP/wT3C6YcK/ZldSTwBdCiNZWkQ8pr8KQayG8p9nV1JPAF0KI\n1vbjInA64IL7zK7kBBL4QgjRmqqK4Kd/w6CrIaqf2dWcoNHAV0q9qpTKVUrtPMN0pZR6Til1QCm1\nXSk1ovXLFEIIL7HhJagpgwt+Y3Ylp2jKHv7rwNSzTL8c6Ff3mAe80PKyhBDCC1WXGferHTDNtCGQ\nz6bRwNdarwYKz9LkKuBNbVgPhCml4hqbb3r+ftwuV9MrFUIIT7fxZXAUe+TePbTOMfx44FiD1+l1\n751CKTVPKZWilEopUTX8/cO7W2HxQgjhAarLYN0/oO+lkDDS7GpOq11P2mqtX9JaJ2utk/3dincq\nV7NiwwftWYIQQrSNDf8yTthOfMjsSs6oNQI/A+jW4HVC3XtnFR/aiyC35q87HiM7/1hjzYUQwnM5\nSo0Lrfpd5rF799A6gb8UuKmut85YoERrndXYh/x8/bm7113k2ODhJdfI8XwhhPfa8KJx7H6S5+7d\nQ9O6Zb4L/AgMUEqlK6VuU0rNV0odHw1oGXAQOAD8G7izqQufdfGdXK0Gs9FewZ/evaUZ5QshhMmq\niuHH/zN65nQdbnY1Z9XoDVC01nMama6Bu5pbwO/m/odDr4xnid9mBq16kZ9N8oxR5YQQoknWvwCO\nEpj4oNmVNMr0K21tNh+enPkhMU74+8F/svfQFrNLEkKIpqksNPrdD7wC4oaaXU2jTA98gK7RPXhw\n6GNUWhS/++YWKh0VZpckhBCNW/us0R3z4t+ZXUmTeETgA0waNYtfhExlj5+Lh/9zldnlCCHE2ZVm\nGmPmDL0OYs4zu5om8ZjAB1gw62kuqY1jpW8Of39fLsoSQniw758yxrv34H73J/OowAd48sZPGFRt\n4z8Vq/hi7etmlyOEEKcqSIPN/4HkWyC8h9nVNJnHBb7dL4Anp79LuEvz171/Ze+RrWaXJIQQJ1r1\nBNj84ML7za7knHhc4AP0ih/IQ0mPUWFRPPzVLyirKDa7JCGEMGRth51LYOwdEBRjdjXnxCMDH2Dy\nmNncGjaDfX5ufvP2DLkSVwhhPq3hm9+DfziMv8fsas6ZxwY+wB0/e5Ir3L35wa+Yx9++wexyhBCd\n3YGVcPA7uGgh+IeZXc058+jAB3j8xg8Z5QhgiXsHr37+/8wuRwjRWbldxt59eC9Ivs3saprF4wPf\nZvPhmes+p3eNhX/lvceqlI/NLkkI0RltfQdyd8Elj4LN1+xqmsXjAx8gPDSaJy5+mQA3/HHb76Tn\njhCifdVUGD1z4pMh8Wqzq2k2rwh8gEF9R7PwvIcps8DCr2+isCTX7JKEEJ3Fj89DWRZM+SMoZXY1\nzeY1gQ8wdfxc5kdewyEfN//z3hXU1taYXZIQoqMrzTTGzDlvBvQYZ3Y1LeJVgQ9w65V/4OfW4Wz2\nq+KBN2eYXY4QoqNb8Ri4nXDp42ZX0mJeF/gAD899g8k1MaywZfLHt282uxwhREd1bCNsXwzjFkBE\nL7OraTGvDHxlsfCXXywj2eHP+7UpvPjpb80uSQjR0bjd8OWDENQFLrjP7GpahVcGPoCvrx/PzlnG\ngBorLxV9ykffvWh2SUKIjmTHB5CRApMfBb9gs6tpFV4b+ADhIVH8ddp7xNbCXw/+kzVbPze7JCFE\nR1BdBiseNe5RO/Ssd3n1Kl4d+AA9Ewby+IR/4qfhkU0PsuPABrNLEkJ4u+//YnTDvPyvYPH6mKzX\npDVRSk1VSu1VSh1QSp1yp16lVKhS6jOl1DalVKpS6pbWL/XMRiZO4veDHqFaae7/7nYOZ+5tz8UL\nITqS3D3GfWqH3wjdRpldTatqNPCVUlZgEXA5kAjMUUolntTsLmCX1nooMBF4RinVrtceTx57Lf/b\nbT75Vs2vv/g5+cXZ7bl4IURHoDUs+w34BsElfzC7mlbXlD380cABrfVBrXUNsBg4+aazGghWSikg\nCCgEnK1aaRPMuvRu7gz/GYd9XNz1/nTKKkvauwQhhDfbuQQOr4HJj0BglNnVtLqmBH48cKzB6/S6\n9xr6P+A8IBPYAdyrtXa3SoXn6NarH+dmvwvY5VfDnW9NwVFTZUYZQghvU10GX/8O4obByJvNrqZN\ntNbZiMuArUBXYBjwf0qpkJMbKaXmKaVSlFIpeXl5rbToU/16zgvMZShb/SpZ8MYUnM7aNluWEK
KD\n+PYJKMuG6c+AxWp2NW2iKYGfAXRr8Dqh7r2GbgE+0oYDwCFg4Mkz0lq/pLVO1lonR0dHN7fmJnnw\nF28x09mHDb7F3PPGNLljlhDizDI2wU//glG3QUKy2dW0maYE/kagn1KqV92J2OuApSe1OQpMBlBK\ndQEGAAdbs9Dm+MMtHzG9Oo41tmx+/eaVaK3NLkkI4WlcTvjsXgiMMY7dd2CNBr7W2gksAL4CdgPv\na61TlVLzlVLz65o9DoxXSu0AVgILtdb5bVV0UymLhSduXcYURySrLEf53zevltAXQpxo/fOQvQOm\n/QXsoWZX06aUWQGYnJysU1JS2mVZztpa7n/lYlb4F3O5GshTN76P8uIxrYUQraToMCwaC70nwpx3\nvWKse6XUJq11s447dZxLyM7C5uPDX25dwaTKYJbrPTz89vWypy9EZ6c1fH6fcYJ2+tNeEfYt1SkC\nH8DH14+/3LKCiyoD+dy1k4feniOhL0RntuUtSFtpDI4WmmB2Ne2i0wQ+gN0ewNO3fMukygC+cKXy\noIS+EJ1TaSZ89VvoMQFG3W52Ne2mUwU+GKH/l1tWMakygGWuVO5/+1oJfSE6E63h8/8BVw1c+c8O\nNThaYzrPmjZwPPQvrgzkK9du7n1rJm5zLgwWQrS37e/Dvi9h8u8hso/Z1bSrThn48N/Qv6wyhFXu\nAyx480pcbrk4S4gOrSwblj8ACaNhzPzG23cwnTbwAfzs/jx527fMqIxgDUf41RvTcLpkGAYhOiSt\n4dMF4KyGq5/vsMMnnE2nDnwweu88/suVzKyMYYMlk5vfuBRHrQy4JkSHk/IqHPgGpjwOUf3MrsYU\nnT7wAaw2G4/O+5rrq7qxzVrAjW9eTJmj1OyyhBCtpSDNGAmzz8WdqlfOySTw61isVh6c9zm31pzH\nfmsZc9++mLyyHLPLEkK0lMsJH80Dqy9ctahTXGB1JhL4DSiLhV/f/h53qDFkWqqY+/5UDubtM7ss\nIURLrHkaMlLgimchpKvZ1ZhKAv8kSil+9YtXuNd+OeWqhps/m82mw+vNLksI0RxHfoTvn4Ih18Lg\nWWZXYzoJ/DO4cc7TLIy6GV/t5M5Vv2TFzs/MLkkIcS6qimDJ7RDe07ipiZDAP5urrryf3/ZaSIzT\nyQMpD/GfH14wuyQhRFNoDUvvgfJsmPUK+AWbXZFHkMBvxKSLf8Fjw/5OosPFX/Y/z5PLF8pQDEJ4\nuk2vwe6lxsBo8SPMrsZjSOA3wYhRl/H/Ji9mYrmbd3KXce8HN+B0O80uSwhxOlnb4cuHjC6Y4xaY\nXY1HkcBvot79hvLI7K/4WamNVVXbueGtqZRWl5hdlhCiIUcJvH8T+IfDz17qVAOjNYX8Ns5BdJcE\nHrz1e24riWCvO5vZ71zC4SLTb90rhIC6oRPuguKjMPs1CIo2uyKPI4F/jgICQ7j7zpXc60iiUlcw\n55OrWXf4e7PLEkKsfx52fwaXPgY9xpldjUeSwG8Gq83GzfPf5Tf+M4ly1rDguwW8uv55s8sSovM6\n8iN88wgMvEKO25+FBH4LXD3njzzY+3ckV9Xwt70vcP9nd1DrltE2hWhXJRnw/o0Q1qPTD53QmCYF\nvlJqqlJqr1LqgFLqwTO0maiU2qqUSlVKdZpjHBMm38ADF73F7BIXXxau5fp3r6CgqsDssoToHGod\n8N4NxvOcd8E/zOyKPFqjga+UsgKLgMuBRGCOUirxpDZhwPPAlVrrQcA1bVCrx+qXOIoFc1ZyV0EA\nh2rS+dl7U9iWvcXssoTo2I7fqjBzM8z8F0QPMLsij9eUPfzRwAGt9UGtdQ2wGLjqpDbXAx9prY8C\naK1zW7dMzxcZHcctd33P/1YMIsBZyc1f3sRbW18zuywhOq4NL8K2d+CiB2HgdLOr8QpNCfx44FiD\n1+l17zXUHwhXSn2nlNqklLqptQr0Jn5+dq676z3uC76RUVUOntr2LPctu4Mqp9xQRYhWte9r+Oph\nGDAdLlpodjVeo7VO2tqAkcB04DLg90qp/ic3UkrNU0qlKKVS8vLyWmnRnkUpxZRrH+bOYf/khqJq\nvslby+z3pnGk9IjZpQnRMeSkwoe3QpdBMFMurjoXTflNZQDdGrxOqHuvoXTgK611hdY6H1gNDD15\nRlrrl7TWyVrr5Ojojn1RxLBxU7npmq95MM+f0uocZn90FcvTvjC7LCG8W1kOvHMt+AXBnPeMZ9Fk\nTQn8jUA/pVQvpZQvcB2w9KQ2nwLnK6VsSqkAYAywu3VL9T5x8T2ZtWA1v6kcSf/qSh5Y+yC/+/YB\nHE6H2aUJ4X1qKmHx9VBZYPTICT35yLJoTKOBr7V2AguArzBC/H2tdapSar5San5dm93Al8B24Cfg\nZa31zrYr23vY7XauWvAf5sX8mrnFFXx6bDmzP5jBwRIZkkGIJnM5YcltkLEJZv4bug43uyKvpMwa\n6jc5OVmnpKSYsmyz7EvdxNYvbub/olxUWHx4aOxvmTXgGpRcKCLEmWkNn/8aNr0O056G0b80uyJT\nKaU2aa2Tm/NZOdvRjvoPGsll89eysHggwx2VPLbhce7+6lcUO4rNLk0Iz7X6r0bYn39fpw/7lpLA\nb2ehIaFM+58lXB++gLsLyliX/QNXfTiNHzN/NLs0ITxPymuw6gkYej1MfsTsaryeBL4JlFJcfO29\nXHjZRzyZZSW0qpB538zjTz88Ln32hThux4fGlbT9psCVz8kYOa1AAt9EAweNYPzd67irYjRzS8p4\nd//7zProKnbk7TC7NCHMtWcZfDQPekyAn78JVh+zK+oQJPBNFhIUxJR732B83yd5OrOcmrJ0blg2\nl39s+js1rhqzyxOi/R38Dj64GeKGwvWLwcff7Io6DAl8D6CU4sJpcxh4w/f8NjeOK8vKeHnnK8z+\n5Gp25kvvVtGJHFoD786ByD5wwxLwCza7og5FAt+D9OjWnfPv/5rzQ+bzTFYx5cWHmfvF9Tyb8qwc\n2xcd3+G18M7PIaw73PQpBESYXVGHI4HvYXxsVi67aSFxV3zGHzODubqsjNdSX2Pmx1fxU9ZPZpcn\nRNs4vBbevsYI+198BkExZlfUIUnge6ikwcMY8pvVjPaZw/OZBeiSdG77+jYeXfeI9NsXHcvB74yw\nD+0mYd/GJPA9WJC/H9N/9Sd8L/mIP2YGcmtxCZ/s/5gZH09nadpSzLpKWohWs2eZEfbhPeHmzyXs\n25gEvhcYM2os/f53Db19b+A/GfkklBbw27W/5favbiOtOM3s8oRonh0fGrcnjE2Cm7+QsG8HEvhe\nIjTQzlV3/ImKqZ9xb3YUv88vJDUrhVlLZ/FsyrNU1FaYXaIQTbfxFVhyO3QfJydo25EEvpcZNzKZ\nQQ98iyX0bt49WsgVpaW8lvoaV348g2UHl8lhHuHZtIaVj8MX9xlX0N7woXS9bEcS+F4o2N+X2bc/\nSME133JB8RD+k5lNaEkuC9cs5Bdf/oLUglSzSxTiV
K5a+PQuWPM0jLgJrntHLqpqZxL4Xiw5sT8X\nLVzCvr5/58/pLh7LK+BQznbmfD6HR9Y9Ql5lx7yNpPBCjhLjTlVb34aJD8GM58BqM7uqTkcC38vZ\nfaz8/Jq5qF+toYqfs+RwFnNLyvnswKdM/3g6L2x7gcraSrPLFJ1Z4SF4ZQoc+t4I+okPykBoJpEb\noHQgbrdm2bqN+K38PQNtKTwVFcv3/hZi/GO4a/hdXNnnSmwW2asS7ejID7B4Lmi3MQha74vMrsjr\nyQ1QBAAWi+KKC0Yz8v7P+LL70yzIVLyRmUNkeRGP/vAos5fOZtXRVXJiV7Q9rY2x7N+40uiB88tv\nJew9gAR+BxQR6Msvb7qF6tu+Y4PtNl44ksMzOfnUlGRwz6p7uGn5TWzM3mh2maKjqnXA0gXGbQl7\nXQi3rzAGQxOmk0M6HZzbrflkfSrl3/yFWe4vWBoSxL9jYshzOxgXN467h99NUnSS2WWKjqL4KLx3\nI2RthQsfMI7XW6xmV9WhtOSQjgR+J1HqqOXNZd/TY8vTXGpbz1th0bwREUax28FFCRdxx9A7GBQ1\nyOwyhTfb/bnR7VK74Wf/goHTzK6oQ2rzY/hKqalKqb1KqQNKqQfP0m6UUsqplJrdnGJE2wmx+7Bg\n5iUk3vMhf41bxMD8aL48uJ/55S62ZG3gui+u4+6Vd5OaL334xTmqdcCy++G9uRDeA+Z9J2HvoRrd\nw1dKWYF9wKVAOrARmKO13nWadt8ADuBVrfWHZ5uv7OGba92BfD7/+G2uK3udPtZDvB7dnXdC/Chz\nOZgQP4FfDfkVw2OGm12m8HS5u+GjX0L2Dhh7F1zyKNj8zK6qQ2vrPfzRwAGt9UGtdQ2wGLjqNO3u\nBpYAuc0pRLSvCX2j+OP/3sO+GZ/ymHUhk7PdfJ22nwVVNnblbOGm5Tdxy5e3sDZjrfTqEadyu+CH\nf8K/LoLSLJizGKb+ScLewzUl8OOBYw1ep9e9V08pFQ/8DHih9UoTbc1qUVwzqjuPL3yAVZM+4g/u\ne5mYWcOXB/ZyX5WVo4V7uWPFHVzz2TUsP7Qcp9tpdsnCExQegjdmwNe/g76XwJ3rYcDlZlclmqC1\nrsL5O7CSF9hcAAAYqklEQVRQa+1WZ7mCTik1D5gH0L1791ZatGgpu4+VOyf1p2j0b3nhu5+Tvf59\n7spawg3ZqXwW3Z3XffN4YPUDdA3sytzz5jKr/ywCfQLNLlu0N5cTNrwA3z4BFhtc9TwMu16umvUi\nTTmGPw74g9b6srrXDwForZ9s0OYQcHyrRwGVwDyt9Sdnmq8cw/dcWSVVPPfNXgq3LOVO2yckqTRW\nhcfxZmwCmx05BPkEMavfLOacN4f4oPjGZyi8X9Z2WHq30d2y/+Uw/RkIlW1vhjbtlqmUsmGctJ0M\nZGCctL1ea33a7hxKqdeBz+Wkrfc7nF/BP1bsI2f7NyzwWcp4tYMdASG82T2Rb2qy0cDF3S5m7nlz\nGdllJGf7die8VFUxrHoCNr4MAVEw7S+QeLXs1ZuozfvhK6WmYRy2sWL0wHlCKTUfQGv94kltX0cC\nv0PZn1PG31fu59DOH5nvs5zplh/ItVp4r+dQPlDllDor6RfejzkD5zC913QCfALMLlm0lNttjGy5\n4g9QVQijbodJD4N/uNmVdXpy4ZVoF/tyynhu5X427djJrb4ruMG2CtxlLOvan8Uhwexx5BLsE8yM\nPjO4pv819A3va3bJojnSvoWvH4GcHdBtLEz7K8QNMbsqUUcCX7Sr/TllLFp1gK+3HWK2bR13BX5L\njOMgW4MjWJzQn29qcql1OxkeM5zZ/WdzaY9L8bfJjS48XtZ2Y48+bSWEdYfJj8LgWXL4xsNI4AtT\nHM6v4PnvDvDxlnRGsJeFkWsYXr6GYlx82i2RD+02jtQUEuQTxLRe05jZfyaJEYlyrN/T5OyC756E\n3UvBHgoX3g+j50mfeg8lgS9MlVlcxStrD/HuT0fxrylkYdxmZji/wV52mJTgcD7u2pevnYVUu2vp\nF96Pq/pcxfTe04nyjzK79M4taxus/RukfgK+QTDuThh7J/iHmV2ZOAsJfOERiipq+M/6I7zxw2EK\nKqq5vssx7ghdT0LW15S5qlge04OloeFsrynAqqyM7zqeGX1mMLHbRDnk0160hsNrjKBP+xZ8g2H0\nL2H83ca49cLjSeALj+KodbFkczovrznEofwK+oXCb3vv4/yKFdiOruOgj5Wlsb35wm4l21lBgC2A\nS3pcwuW9LmdM3Bh8LD5mr0LHU1MJOz6An/5tnIwNjIGxd0DyrbJH72Uk8IVHcrs1K/fk8srag6w/\nWEiAr5WbB/txa9hmog59ijtrG5vsfnwW04MVPm7K3DWE+4UzpecULut5GSNiRmCVsdRbJm8vbH7T\n6GJZVQQxg2DMPBhyHfjYza5ONIMEvvB4OzNKeG3dYT7blkmNy80F/aKYn2RhnGMNlt2fUJO1jTUB\n/iyPiOV7X3BoF1H+UVzS/RKm9Jwi4X8uqoph16ew5S1I/8kYBmHgdBj9K+gxXnrdeDkJfOE18sur\nWfzTUf6z/gg5pdXEh/kzd2x35vTThB9bAXu+oPLoj6z29+XrkDDW2H1x4CbCL4xJ3SczuftkxsSN\nwdfqa/aqeJaaCtj3JexYAge+AVcNRA2AETcae/NB0WZXKFqJBL7wOrUuNyt25fDmj0f48WABvlYL\nUwfHcv2Y7oyJtaAOfgv7vqYy7RvW6kpWBAawOjCQCgUBFl8mdB3PpJ5TuDDhQkL9Qs1eHXOU58Le\n5bB3GaStAlc1BMUafeeTZkHXEbI33wFJ4Auvtj+njLc3HOWjzemUOpz0iQ5kzuju/Gx4PJEBPpC9\nDdJWUX1wJRtyt7HKbuO7gADybVYswLCg7lzU7WIu6DuDvuH9Om4//9oqOPaT0bsmbaVx0xGA0O7G\nIZuB041DNnLoq0OTwBcdQlWNiy92ZPHOhiNsPlqMj1UxJTGWn4/qxvl9o7BalHE7vYwU3IfWsvPo\nt6wuTWO1n43dfsYhnlhsnB/Uk/O7jmdM3ysJiurvvXu5ZdmQsRmObYCjPxo/u2uNY/LdxkKfSdBv\nCsQmee86inMmgS86nH05Zby38RgfbU6nqLKWuFA7M0fEM3tkN3pFNRiL3+WE3F1kH1zB2vQ1rC0/\nxHpLLRUWC1atGVrrYpwtgnFhAxgUOxJb9ACI7AshCWBtrdtBtJCzGgoPQu4u45aBObsgcwuUZRrT\nLTboOtzYe+8+HnpOAL9gc2sWppHAFx1WtdPFyt25fJByjO/35eHWMLJHODNHxHPFkK6E+p/aZ7+2\nsoit+5fyY/r3/FC0l13OUrSCILeb5CoHYx0ORlU76RsQiyW8lzGue3BXCOkKQV0gMBoCIyEg0rgw\nydKUG8OdgbMaKgugIh8q86E0E0oyoDQDig4bd48qOQbU/TtUVuMPUtxQI+TjR0DsEPCVEUiFQQJf\ndArZJQ4+
3pLBks3pHMgtx9dm4ZLzYrhqWDwTB0TjZzv9sesiRxEbsjaw4dj3rM9aT7ojH4BwbCS7\nLYysrCK5JI9+NTWnueenMvam/ULAxx9sdmOMGauPMe34oRRXrdEzxlULtRVQXQ415eB0nH5lAqON\nAcoi+kBEb4jsAzHnQWQ/6R8vzkoCX3QqWmt2ZpSyZHM6n23LpKCihlB/H6YlxTJjaFfG9Io0jvef\nQUZ5BhuzN9Y/siqyAAj2CWJ4aB+GB8Qz3CeCwfjiV1MJ1aXgKDXC21ltPLtqjhdjPFt9wOprPPsE\ngF+QMT6NPcS4cUhglPGNITjO+CYhA5OJZpLAF52W0+Vm7YF8PtmSwde7cqiscRET7McVQ7pyxdA4\nhncLa7TXTmZ5JptyNpGSk8KW3C0cKjkEgM1iIzEikSHRQxgaM5ShUUOJDYztuL2AhFeQwBcCqKxx\n8u2eXJZuzeS7vXnUuNzEh/kzLSmWaUlxDGtC+AMUOgrZmruVrXlb2Za7jdSCVKpd1QBE+UcxOGow\nQ6KGMChyEIOiBnXe6wCEKSTwhThJqaOWFbty+GJ7Fqv351Hr0nQNtXPZ4FguHxzHyB7hZz3s01Ct\nu5Z9hfvYnr+dHXk72JG/g8Olh+unJwQlkBiZyHmR55EYkcjAyIFE2GXkSdE2JPCFOIuSqlpW7s5h\n2Y5sVu/Po8bpJirIl0sTuzBlUCzj+0Se8YTvmZTWlLKrYBep+amkFqSyu2A36eXp9dNj/GMYEDGA\ngRED6R/en/7h/eke0h2bxUO6ggqvJYEvRBOVOWr5bm8eX6Vm893ePMqrnQT6WrloQDSXJnZh0oAY\nwgKaN05PSXUJewr3sKdwD3sL97KnaA+Hig/h1E4AfC2+9A7rTd+wvvWP3qG96RrUVQaGE00mgS9E\nM1Q7XfyQVsA3u3JYsSuH3LJqLAqSe0Rw8XkxTB4YQ9+YoBadpK1x1XCo5BD7ivaxr2gf+4v3c6Do\nADmVOfVt/Kx+9AzpSe/Q3vQK7UWvsF70CulFt+BuBPhI/3txojYPfKXUVOAfgBV4WWv955OmzwUW\nAgooA+7QWm872zwl8IUncbs12zNKWLk7h5W7c9mVVQpAfJg/kwZGM2lADOP6RBLg2zqHZEprSjlY\nfJCDJQdJK04jrSSNwyWHySzPRPPff5MxATH0COlB9+DudA/pTo/gHiQEJ8gfg06sTQNfKWUF9gGX\nAunARmCO1npXgzbjgd1a6yKl1OXAH7TWY842Xwl84ckyi6v4dk8u3+3NY92BfKpqXfhaLYzuFcFF\n/aO5sH80/bu0bO//dBxOB0dKj9Q/Dpce5kjpEY6VHaPQUXhC2wh7BAlBCcQHxRMfHE98UDxdA7sS\nFxRHXGAcdptcwNURtXXgj8MI8MvqXj8EoLV+8gztw4GdWuv4s81XAl94i2qni58OFfL93jxW789j\nX045AF1C/Di/bzQX9ItiQt8oooPb9mKqspoyjpYd5VjZMdLL0jlWdoyMsgzSy9PJrsjGpV0ntI+w\nRxAXGEdsYCxxgXF0CehCl8AudAnoQnRANDEBMfhZ5QIwb9OSwG/K99N44FiD1+nA2fbebwOWN6cY\nITyRn83KBf2iuaCfcRORzOIq1uzPY83+fFbuyWHJZqN3zoAuwUzoG8WEvpGM6hVBiL11780b7Bts\n9P2PHHTKNKfbSW5lLpnlmWRVZJFRnkF2RTbZldkcLjnM+qz1VNRWnPK5ML8wovyjiPaPJjogmmj/\naCL9I4nyjyLSHkmkfySR9khC/EKwqBaMKSQ8QlP28GcDU7XWt9e9vhEYo7VecJq2k4DngfO11gWn\nmT4PmAfQvXv3kUeOHGn5GghhIpdbk5pZwroDBfyQls9PhwqpdrqxKEiKD2VcnyjG9I5gVM8IgvzM\n7ZJZXlNOTmUO2RXZ5Fbm1j/yqvLIr8qvf3a6nad81qZshNnDCLeHE+EXQbg9nDA/43WoXyhhfmH1\njxC/EEJ8Qwj2DZY/Em3AIw7pKKWGAB8Dl2ut9zW2YDmkIzoiR62LLUeL+TEtnx/SCth6rBinW2O1\nKAbHhzKmVwSjexp/AEIDWvcbQGvQWlNaU1r/B6DIUUSho5CCqgIKHYUUOgopchRRVF1EcXUxJdUl\nZ5yXQhHsG1wf/iG+IQT5BhHkE0SwbzDBvsEE+gQS5BNEoE8gAT4BBPoEGg9bIP4+/gTYAvC3+ctw\nFg20deDbME7aTgYyME7aXq+1Tm3QpjvwLXCT1vqHpixYAl90BpU1TjYfKWb9wQI2HCpg27ESalxu\nlDIOASX3DGdUzwhG9ggnPsz7gs3pdlJaU1of/iXVJRRXF1NaXUppTSkl1SWU15ZTWlNKaXUp5bXl\nlNWUUV5bftpDTGfib/M/4WG32rHb7PjZ/LBb7fhZ/bDb7PhafI1nqy++Fl/j2eqLj8Wn/j0fqw8+\nFuNhs9hOfCjj2aqsWC1W47nuZ4uyYFVWlFLGMwqLsmBRFhTKGDz1+H9127Hhz2D8QdXo/z6jMf7X\nuLUbt3af8LNbu3Fp1wmv44Pj2+4YvtbaqZRaAHyF0S3zVa11qlJqft30F4FHgEjg+bqVcza3ICE6\nkgBfG+f3i+L8flGA8Q1g27FifjpUyMYjRXyyJZO31h8FIDbEzsge4YzoEc6I7mEkdg055yuA25vN\nYiPCHtGsoSTc2k1lbWV9+B9/VNZWUumsrH+uqK2gyllV/6h2Vdf/XOwoptpVbTyc1VS7q6lx1eBw\nOk7o3ioMcuGVECZyuTV7sktJOVzEpiPGI6O4CgBfm4XBXUMY3j2cYd3CGNYtjIRw7/sWYAatNU7t\npMZVU/+odddS666lxlWD0+2sf+10O//70E5c2oXL7cLpdtbvYbvcLlzahUbjcrtO2RM/Za8d6vfc\nT3a6bwMNvy0cP+9hVdYT3jv+emb/mXKlrRAdRXaJgy1Hi9hyrJjNR4rYkVFCtdMNQGSgL0MSQhmS\nEMbQbsZzVJB0rexM2rpbphCiHcWG2rk8KY7Lk+IAqHW52ZtdxtZjxWw9Vsz29GK+25dXf++VuFA7\nSfGhJMWHMjg+lEHxIcQEy0VX4lSyhy+EF6qodrIzo4Qdxx/pJRzM/+9J0OhgPwZ1DWFQ1xAS40JJ\n7BpCj4gALE0cElp4LtnDF6KTCfSzMaZ3JGN6R9a/V+aoZXdWGTsySkjNKGFXVilr9ufjchs7dQG+\nVgbEBnNeXAjnxQYzMC6EAbHBrX6BmPBcsocvRAfmqHWxL6eM3Vml7M4qY1dWKbuzSilz/Pfiqq6h\ndgbEBtM/Npj+McH07xJM35gg/H09u4dQZyV7+EKI07L7WBmSEMaQhLD697TWZJU42Jtdxu7sUvZl\nl7Enu4x1BwqocRknh5WC7hEB9IsJok9MEP1ijD8CfaIDCZZvBF5LAl+ITkYpRdcwf7qG+TNpYEz9\n+7UuN0cKKtiXU87+nHL25ZZxIKec1fvy6/8QgDFoXJ/oIHpHB9I7y
njuEx1E1zD/Jt82UphDAl8I\nAYCP1ULfmGD6xgRD0n/fd7rcHC2s5EBuOWl5FXXP5Szdmklpg0NDvlYLPSID6BUVSK+oQHpEBtIz\nKoCekYHEhtjlhLEHkMAXQpyVzWqhd3QQvaODTnhfa01+eQ0H88o5lF/BofwK0vKM5+/2GfcOPs7X\nZqFHRAA9IgPoHhFY9xxAt4gAEsL9sfvI+YL2IIEvhGgWpRTRwX5EB/ud0FsIjCuIs0qqOJxfyZHC\nCo4UVHIov4JjhZWsO1BAVe2JY/d3CfGjW7gR/t0iAugWHkB8uD8J4f7Ehfrja5NRN1uDBL4QotVZ\nLYqE8AASwgM4n6gTph3/ZnC0sIJjhVUcLazkaGElxwor2Xi4iKXbMnE36DyoFMQE+xEf5k98eABd\nw+zEh/nTNdSfuDA7XUP9CQvwkSEnmkACXwjRrhp+MxjZ49TptS432SUOjhVVklFURXpRFRnFVWQW\nV7E9vZivdjpOOIkM4O9jJS7UTlyYndgQf+JC7XQJtRMXYic21E6XEDuRgb6d/jyCBL4QwqP4WC3G\nYZ2I09+k3e3W5FdUk1XsILO4iswSB1nFVWSVOMgqqeLHtHxyyqrrLzg7zmZRxAT7ERNip0uIH11C\njD8E0cF+xvvBxs8Rgb4dtreRBL4QwqtYLIqYYDsxwXaGdgs7bRuXW5NfXk12iYOsEgc5pccf1eSU\nOjiUX8H6g4WUVNWe8lmrRREZ6Et0sB9RQcbD+Nl4LzLQj6hgX6KC/AgP8K4/DhL4QogOx2pR9Xvw\nQ7uduZ2j1kVeWTW5ZQ5yS6vJLasmr+6RW+Ygv7yGfTll5JdXU+s6zVDHCiICfIkM8iUi0JfIQOMb\nQkTgf9+LCPAlvO698ABfU09AS+ALITotu4/1rIePjtNaU1JVS355Dfnl1eSXV1NYUVP/urC8hoKK\nanZnl1JQXnPabw7HBfnZCAvwITzAt/45PMCHsAavQwN8CPOve8/fhxB/n1b5JiGBL4QQjVBK1QWy\nL31jghpt73S5KaqspbCihsKKGooq654raiiqrK1/XVxVy9HCSooqak64iO10gv1sLb4PsgS+EEK0\nMpvVUt8TqamcLjelDifFlcYfhdKqWoqraiiqqKWkyniUVtWyriV1teCzQgghWonNaqk//n82f7uu\n+cuQy9eEEKKTkMAXQohOokmBr5SaqpTaq5Q6oJR68DTTlVLqubrp25VSI1q/VCGEEC3RaOArpazA\nIuByIBGYo5RKPKnZ5UC/usc84IVWrlMIIUQLNWUPfzRwQGt9UGtdAywGrjqpzVXAm9qwHghTSsW1\ncq1CCCFaoCmBHw8ca/A6ve69c22DUmqeUipFKZWSl5d3rrUKIYRogXY9aau1fklrnay1To6Ojm7P\nRQshRKfXlMDPABqORpFQ9965thFCCGEipfWpAwKd0EApG7APmIwR4huB67XWqQ3aTAcWANOAMcBz\nWuvRjcy3DNjbouo9WxSQb3YRbUjWz3t15HWDjr9+A7TWwc35YKNX2mqtnUqpBcBXgBV4VWudqpSa\nXzf9RWAZRtgfACqBW5qw7L1a6+TmFO0NlFIpsn7eqyOvX0deN+gc69fczzZpaAWt9TKMUG/43osN\nftbAXc0tQgghRNuTK22FEKKTMDPwXzJx2e1B1s+7deT168jrBrJ+Z9ToSVshhBAdgxzSEUKITqLN\nA7+jD7zWhPWbqJQqUUptrXs8YkadzaGUelUplauU2nmG6d6+7RpbP2/edt2UUquUUruUUqlKqXtP\n08Zrt18T18+bt59dKfWTUmpb3fo9dpo25779tNZt9sDoxpkG9AZ8gW1A4kltpgHLAQWMBTa0ZU0m\nrN9E4HOza23m+l0IjAB2nmG61267Jq6fN2+7OGBE3c/BGNfSdKR/e01ZP2/efgoIqvvZB9gAjG3p\n9mvrPfyOPvBaU9bPa2mtVwOFZ2nizduuKevntbTWWVrrzXU/lwG7OXV8K6/dfk1cP69Vt03K6176\n1D1OPuF6ztuvrQO/1QZe81BNrX183Veu5UqpQe1TWrvw5m3XVF6/7ZRSPYHhGHuJDXWI7XeW9QMv\n3n5KKatSaiuQC3yjtW7x9pN72ra9zUB3rXW5Umoa8AnGfQOE5/P6baeUCgKWAL/WWpeaXU9ra2T9\nvHr7aa1dwDClVBjwsVJqsNb6tOebmqqt9/A7+sBrjdautS49/tVMG1cs+yilotqvxDblzduuUd6+\n7ZRSPhhh+LbW+qPTNPHq7dfY+nn79jtOa10MrAKmnjTpnLdfWwf+RqCfUqqXUsoXuA5YelKbpcBN\ndWecxwIlWuusNq6rtTS6fkqpWKWUqvt5NMbvvKDdK20b3rztGuXN266u7leA3VrrZ8/QzGu3X1PW\nz8u3X3Tdnj1KKX/gUmDPSc3Oefu16SEd3XYDr3mEJq7fbOAOpZQTqAKu03Wn2D2dUupdjJ4OUUqp\ndOBRjJNHXr/toEnr57XbDpgA3AjsqDsODPAw0B06xPZryvp58/aLA95Qxi1mLcD7WuvPW5qdcqWt\nEEJ0EnKlrRBCdBIS+EII0UlI4AshRCchgS+EEJ2EBL4QQnQSEvhCCNFJSOALIUQnIYEvhBCdxP8H\nx5L89RnWxHIAAAAASUVORK5CYII=\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "x = np.linspace(0, 3, 100)\n", "plt.plot(x, np.exp(-x), label = \"True\")\n", "plt.plot(x, 1-x+x**2/2-x**3/6+x**4/24, label = \"Taylor\")\n", "plt.plot(x, (1-x/2+x**2/8)/(1+x/2+x**2/8), label = \"Rational\")\n", "plt.xlim(0, 3)\n", "plt.legend()\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, suppose that we have measured 100 pairs of values $(x_1, y_1), ... , (x_{100}, y_{100})$ of two variables $x, y$.\n", "

\n", " 1. Plot data (make sure to include error bars). (Hint - https://matplotlib.org/api/_as_gen/matplotlib.axes.Axes.errorbar.html)
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Load a given 2D data\n", "data = np.loadtxt(\"HW4_Problem3_data.dat\")\n", "x = data[:,0]\n", "y = data[:,1]\n", "sig_y = data[:,2]" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Make plot\n", "..." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We find that $y \\sim 0$ as $x$ gets large. Hence, it is better to use rational approximations rather than polynomials in this case. We come up with the rational function model for the data:\n", "$$ y = \\frac{a_0 + a_1 x + a_2x^2 + a_3x^3}{1 + a_4 x + a_5x^2 + a_6x^3} $$\n", "
where $a_i$'s are the unknown parameters.\n", "

\n", "Multiply both sides by the denominator of the rational function:\n", "$$ y = a_0 + a_1 x + a_2x^2 + a_3x^3 - a_4 xy + a_5x^2y + a_6x^3y $$\n", "
Again, to measure how well the model agrees with the data, we use the chi-square merit function: $$ \\chi^2 = \\sum_{i=0}^{N-1} \\big( \\frac{y_i-(a_0 + a_1 x_i + a_2x_i^2 + a_3x_i^3 - a_4 x_iy_i - a_5x_i^2y_i - a_6x_i^3y_i)}{\\sigma_i} \\big)^2. $$\n", "<br>

\n", "Wish pick the parameters which minimize $\\chi^2$.

\n", "Let us solve this problem using normal equations.

\n", "First, let $\\bf A$ be the $N \\times M$ ($N$ = 100, $M$ = 7) design matrix:\n", "\n", "\\begin{bmatrix}\n", " 1/\\sigma_{y0} & x_0/\\sigma_{y0} & x_0^2/\\sigma_{y0} & x_0^3/\\sigma_{y0} & -y_0x_0/\\sigma_{y0} & -y_0x_0^2/\\sigma_{y0} & -y_0x_0^3/\\sigma_{y0} \\\\\n", " 1/\\sigma_{y1} & x_1/\\sigma_{y1} & x_1^2/\\sigma_{y1} & x_1^3/\\sigma_{y1} & -y_1x_1/\\sigma_{y1} & -y_1x_1^2/\\sigma_{y1} & -y_1x_1^3/\\sigma_{y1} \\\\\n", " ....&....&....&....&....&....&.... \\\\\n", " 1/\\sigma_{yN} & x_N/\\sigma_{yN} & x_N^2/\\sigma_{yN} & x_N^3/\\sigma_{yN} & -y_Nx_N/\\sigma_{yN} & -y_Nx_N^2/\\sigma_{yN} & -y_Nx_N^3/\\sigma_{yN}\n", "\\end{bmatrix}\n", "

\n", "Also, define a vector $\\bf b$ of length $N$ by\n", "$$ b_i = \\frac{y_i}{\\sigma_i} $$\n", "
and denote the $M$ vector whose components are the parameters to be fitted ($a_0, a_1, a_2, a_3, a_4, a_5, a_6$) by $\\bf a$.\n", "

\n", "Minimize $\\chi^2$ by differentiating it with respect to all $M$ parameters $a_k$ vaishes. This condition yields the matrix equation
\n", "$$ \\bf \\boldsymbol \\alpha \\cdot a = \\boldsymbol \\beta. $$\n", "
where $\\bf \\boldsymbol \\alpha = A^T \\cdot A$ and $\\bf \\boldsymbol \\beta = A^T \\cdot b$ ($\\boldsymbol \\alpha$ is an $M \\times M$ matrix, and $\\boldsymbol \\beta$ is a vector of length $M$). \n", "

\n", " 2. We have $ \\bf \\boldsymbol \\alpha \\cdot a = \\boldsymbol \\beta. $ Solve for $\\bf a$ using \"GaussianElimination_pivot\" from Part 1.
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Define A\n", "A = ...\n", "# Define b\n", "b = ...\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Transpose of the matrix A\n", "A_transpose = ...\n", "\n", "# alpha matrix\n", "alpha = ...\n", "# beta vector\n", "beta = ..." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Using the Gaussian elimination with partial pivoting\n", "a = ...\n", "\n", "print('Using Gaussian Elimination:')\n", "print('a0 =', a[0], ', a1 =', a[1], ', a2 =', a[2], ', a3 =', a[3], ', a4 =', a[4], ', a5 =', a[5], ', a6 =', a[6])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " 3. Compute the error (standard deviation - square root of the variance) on the fitted parameters..
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "from scipy.linalg import inv\n", "# Covariance matrix\n", "C = inv(alpha)\n", "\n", "...\n", "\n", "sigma_a0 = ...\n", "sigma_a1 = ...\n", "sigma_a2 = ...\n", "sigma_a3 = ...\n", "sigma_a4 = ...\n", "sigma_a5 = ...\n", "sigma_a6 = ...\n", "\n", "print('Error: on a0 =', sigma_a0, ', on a1 =', sigma_a1, ', on a2 =', sigma_a2, ', on a3 =', sigma_a3, ', on a4 =', sigma_a4, ', on a5 =', sigma_a5, ', on a6 =', sigma_a6)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " 4. Plot the best-fit line on top of the data.
" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Make plot\n", "..." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "***" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## To Submit\n", "Execute the following cell to submit.\n", "If you make changes, execute the cell again to resubmit the final copy of the notebook, they do not get updated automatically.
\n", "__We recommend that all the above cells should be executed (their output visible) in the notebook at the time of submission.__
\n", "Only the final submission before the deadline will be graded. \n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "_ = ok.submit()" ] } ], "metadata": { "kernelspec": { "display_name": "Python 2", "language": "python", "name": "python2" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 2 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython2", "version": "2.7.13" } }, "nbformat": 4, "nbformat_minor": 2 }