{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Numerical Methods\n", "\n", "# Lecture 7: Numerical Linear Algebra III" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Learning objectives:\n", "\n", "* Introduce ill-conditioned matrices (via matrix norms and condition number)\n", "* Consider direct vs iterative/indirect methods\n", "* Example iterative algorithm: the Jacobi and Gauss-Seidel methods\n", "* Sparse matrices and a pointer to more advanced algorithms (supplementary readings)" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [], "source": [ "%matplotlib inline\n", "import matplotlib.pyplot as plt\n", "\n", "import numpy as np\n", "import scipy.linalg as sl" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Ill-conditioned matrices\n", "\n", "The conditioning (or lack of, i.e. the ill-conditioning) of matrices we are trying to invert (to obtain the inverse, or to find the solution to a linear matrix system) is incredibly important for the success of any algorithm.\n", "\n", "When we started talking about matrices we noted that as long as the matrix is non-singular, i.e. $\\det(A)\\ne 0$, then an inverse exists, and a linear system with that $A$ has a unique solution.\n", "\n", "But what happens when we consider a matrix that is nearly singular, i.e. $\\det(A)$ is very small?\n", "\n", "Well smallness is a relative term and so we need to ask the question of how large or small $\\det(A)$ is compared to something.\n", "\n", "That something is the *norm* of the matrix.\n", "\n", "#### Vector norms\n", "\n", "Just as for vectors $\\pmb{v}$ (assumed a $n\\times 1$ column vector) where we have multiple possible norms to help us decide quantify the magnitude of a vector:\n", "\n", "\\begin{align}\n", "\\|\\pmb{v}\\|_2 & = \\sqrt{v_1^2 + v_2^2 + \\ldots + v_n^2} = \\left(\\sum_{i=1}^n v_i^2 \\right)^{1/2}, &\\quad{\\textrm{the two-norm or Euclidean norm}}\\\\\n", "\\|\\pmb{v}\\|_1 & = |v_1| + |v_2| + \\ldots + |v_n| = \\sum_{i=1}^n |v_i|, &\\quad{\\textrm{the one-norm or taxi-cab norm}}\\\\\n", "\\|\\pmb{v}\\|_{\\infty} &= \\max\\{|v_1|,|v_2|, \\ldots, |v_n| = \\max_{i=1}^n |v_i|, &\\quad{\\textrm{the max-norm or infinity norm}}\n", "\\end{align}\n", "\n", "#### Matrix norms\n", "\n", "We can define measures of the size of matrices, e.g. for $A$ which for complete generality we will assume is of shape $m\\times n$:\n", "\n", "\\begin{align}\n", "\\|A\\|_F & = \\left(\\sum_{i=1}^m \\sum_{j=1}^n A_{ij}^2 \\right)^{1/2}, &\\quad{\\textrm{the matrix Euclidean or Frobenius norm}}\\\\\n", "\\|A\\|_{\\infty} & = \\max_{i=1}^m \\sum_{j=1}^n|A_{i,j}|, &\\quad{\\textrm{the maximum absolute row-sum norm}}\\\\\n", "\\end{align}\n", "\n", "Note that while these norms give different results (in both the vector and matrix cases), they are consistent or equivalent in that they are always within a constant factor of one another (a result that is true for finite-dimensional or discrete problems as here). This means we don't really need to worry too much about which norm we're using.\n", "\n", "Let's evaluate some examples.\n", "\n" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[10. 2. 1.]\n", " [ 6. 5. 4.]\n", " [ 1. 4. 
7.]]\n" ] } ], "source": [ "A = np.array([[10., 2., 1.],[6., 5., 4.],[1., 4., 7.]])\n", "print(A)" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "15.748015748023622\n" ] } ], "source": [ "print(sl.norm(A))" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "15.748015748023622\n" ] } ], "source": [ "# the Frobenius norm - the default\n", "print(sl.norm(A,'fro'))" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "15.0\n" ] } ], "source": [ "# the maximum absolute row-sum\n", "print(sl.norm(A,np.inf))" ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "17.0\n" ] } ], "source": [ "# the maximum absolute column-sum\n", "print(sl.norm(A,1))" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "13.793091098640064\n" ] } ], "source": [ "# the two-norm - note not the same as the Frobenius norm - also termed the spectral norm\n", "print(sl.norm(A,2))" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "13.793091098640064\n" ] } ], "source": [ "# which is defined as (!!!!)\n", "print(np.sqrt(np.real((np.max(sl.eigvals( A.T @ A))))))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise 7.1: matrix norms\n", "\n", "Write some code to explicitly compute the two matrix norms defined mathematically above (i.e. the Frobenius and the maximum absolute row-sum norms) and compare against the values found above using in-built scipy functions.\n", "\n", "Also, based on the above code and comments, what is the mathematical definition of the 1-norm and the 2-norm?\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Matrix conditioning\n", "\n", "The (ill-)conditioning of a matrix is measured with the matrix condition number:\n", "\n", "$$\\textrm{cond}(A) = \\|A\\|\\|A^{-1}\\|$$\n", "\n", "If this is close to one then $A$ is termed well-conditioned; the value increases with the degree of ill-conditioning, reaching infinity for a singular matrix.\n", "\n", "Let's evaluate the condition number for the matrix above.\n" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[10. 2. 1.]\n", " [ 6. 5. 4.]\n", " [ 1. 4. 
7.]]\n", "10.713371881346792\n", "10.713371881346786\n", "12.463616561943589\n", "12.463616561943587\n" ] } ], "source": [ "A = np.array([[10., 2., 1.],[6., 5., 4.],[1., 4., 7.]])\n", "\n", "print(A)\n", "print(np.linalg.cond(A)) # let's use the in-built condition number function\n", "print(sl.norm(A,2)*sl.norm(sl.inv(A),2)) # so the default condition number uses the matrix two-norm\n", "print(np.linalg.cond(A,'fro'))\n", "print(sl.norm(A,'fro')*sl.norm(sl.inv(A),'fro'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The condition number is expensive to compute, and so in practice the relative size of the determinant of the matrix can be gauged based on the magnitude of the entries of the matrix.\n", "\n", "#### Example\n", "\n", "We know that a singular matrix does not result in a unique solution to its corresponding linear matrix system. But what are the consequences of near-singularity (ill-conditioning)?\n", "\n", "Consider the following example\n", "\n", "\n", "$$\n", "\\left(\n", " \\begin{array}{cc}\n", " 2 & 1 \\\\\n", " 2 & 1 + \\epsilon \\\\\n", " \\end{array}\n", "\\right)\\left(\n", " \\begin{array}{c}\n", " x \\\\\n", " y \\\\\n", " \\end{array}\n", "\\right) = \\left(\n", " \\begin{array}{c}\n", " 3 \\\\\n", " 0 \\\\\n", " \\end{array}\n", "\\right)\n", "$$\n", "\n", "Clearly when $\\epsilon=0$ the two columns/rows are not linear independent, and hence the determinant of this matrix is zero, the condition number is infinite, and the linear system does not have a solution (as the two equations would be telling us the contradictory information that $2x+y$ is equal to 3 and is also equal to 0)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise 7.2: Ill-conditioned matrix\n", "\n", "For the example above, consider a range of small values for $\\epsilon$ and calculate the matrix determinant and condition number." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You should find for $\\epsilon=0.001$ that $\\det(A)=0.002$ (i.e. quite a lot smaller than the other coefficients in the matrix) and $\\textrm{cond}(A)\\approx 5000$.\n", "\n", "Using `sl.inv(A) @ b` you should also be able to compute the solution $\\pmb{x}=(1501.5,-3000.)^T$.\n", "\n", "What happens when you make a very small change to the coefficients of the matrix (e.g. set $\\epsilon=0.002$)?\n", "\n", "You should find that this change of just $0.1\\%$ in one of the coefficients of the matrix results in a $100\\%$ change in both components of the solution!\n", "\n", "This is the consequence of the matrix being ill-conditioned - we should not trust the numerical solution to ill-conditioned problems.\n", "\n", "A way to see this is to recognise that computers do not perform arithmetic exactly - they necessarily have to [truncate numbers](http://www.mathwords.com/t/truncating_a_number.htm) at a certain number of significant figures, performing multiple operations with these truncated numbers can lead to an erosion of accuracy. Often this is not a problem, but these so-called [roundoff](http://mathworld.wolfram.com/RoundoffError.html) errors in algorithms generating $A$, or operating on $A$ as in Gaussian elimination, will lead to small inaccuracies in the coefficients of the matrix. Hence, in the case of ill-conditioned problems, will fall foul of the issue seen above where a very small error in an input to the algorithm led to a far larger error in an output." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Roundoff errors\n", "\n", "Note that in this course we have largely ignored the limitations of the floating point arithmetic performed by computers, including round-off errors. \n", "\n", "This is often the topic of the first lecture of courses, or first chapter of books, on Numerical Methods or Numerical Analysis - do take a look at some examples if you are interested. \n", "\n", "Also take a look at *D. Goldberg 1991: What every computer scientist should know about floating-point arithmetic, ACM Computing Surveys 23, Pages 5-48*.\n", "\n", "For some examples of catastrophic failures due to round off errors see and and [the sinking of the Sleipner A offshore platform](http://www.ima.umn.edu/~arnold/disasters/sleipner.html).\n", "\n", "As an example, consider the mathematical formula\n", "\n", "$$f(x)=(1-x)^{10}.$$\n", "\n", "We can of course relatively easily expand this out by hand\n", "\n", "$$f(x)=1- 10x + 45x^2 - 120x^3 + 210x^4 - 252x^5 + 210x^6 - 120x^7 + 45x^8 - 10x^9 + x^{10}.$$\n", "\n", "Mathematically these two expressions for $f(x)$ are identical; when evaluated by a computer different operations will be performed, which (we hope) should give the same answer. For numbers $x$ away from $1$ these two expressions do return (pretty much) the same answer. \n", "\n", "However, for $x$ close to 1 the second expression adds and subtracts individual terms of increasing size which should largely cancel out, but they don't to sufficient accuracy due to round off errors; these errors accumulate with more and more operations, leading a loss of significance " ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "0.00010485760000000006 0.00010485760000436464 4.1623815505431594e-11\n", "1.0239999999999978e-07 1.0240001356576212e-07 1.3247813024364063e-07\n", "9.765625000000086e-14 1.2378986724570495e-13 0.21111273343425307\n" ] } ], "source": [ "def f1(x):\n", " return (1. - x)**10\n", "\n", "def f2(x):\n", " return (1. - 10.*x + 45.*x**2 - 120.*x**3 +\n", " 210.*x**4 - 252.*x**5 + 210.*x**6 -\n", " 120.*x**7 + 45.*x**8 - 10.*x**9 + x**10)\n", "\n", "x=0.6\n", "print(f1(x),f2(x),1.-f1(x)/f2(x)) # values computed in different ways and their relative difference\n", "x=0.8\n", "print(f1(x),f2(x),1.-f1(x)/f2(x)) \n", "x=0.95\n", "print(f1(x),f2(x),1.-f1(x)/f2(x)) " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Algorithm stability\n", "\n", "The susceptibility for a numerical algorithm to dampen (inevitable) errors, rather than to magnify them as we have seen in examples above, is termed *stability*. This is a concern for numerical linear algebra as considered here, as well as for the numerical solution of differential equations as you will see in NM2. In that case you don't want small errors to grow and accumulate as you propagate the solution to an ODE or PDE forward in time say.\n", "\n", "If your algorithm is not inherently stable, or has other limitation, you need to understand and appreciate this, as it can cause catastrophic failures! \n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Direct vs iterative methods\n", "\n", "Two types/families of methods exist to solve matrix systems. These are termed *direct* methods and *iterative* (or *indirect*) methods.\n", "\n", "Direct methods perform operations on the linear equations (the matrix system), e.g. 
"An advantage of direct methods is that they provide the exact solution (assuming exact arithmetic, i.e. ignoring the round off related issues mentioned above) in a finite number of operations.\n", "\n", "However, as pointed out previously, $n$ could be billions for hard-core applications such as numerical weather forecasting. In this case the $n^3$ operations required by a direct algorithm such as Gaussian elimination are completely prohibitive. In an attempt to further reduce this cost, *iterative* algorithms were devised.\n", "\n", "These algorithms start with an initial guess at the solution ($\\pmb{x}_0$), and *iteratively* improve this, producing a series of approximate answers $\\pmb{x}_k$. For the *exact* answer to the matrix system $A\\pmb{x} = \\pmb{b}$, we know that the residual vector $\\pmb{r} = A\\pmb{x}-\\pmb{b}$ is zero. For our iterative procedure, we can use the norm of the residual vector $\\pmb{r}_k = A\\pmb{x}_k-\\pmb{b}$, based on the approximate solution $\\pmb{x}_k$, as a measure of how close we are to solving the equation (the norm $\\|\\pmb{r}_k\\|$ expresses this as a single number). As we iterate further, we hope to drive down this number, and we may stop the iterations at some small (non-zero) residual norm tolerance level. The final iteration gives us an answer $\\pmb{x}_k$ which is still an approximation to the solution, and not the exact solution we would obtain with direct methods. The residual norm tolerance stopping criterion therefore needs to be thought about carefully, e.g. depending on how accurate a solution $\\pmb{x}$ we require.\n", "\n", "We have already considered Gaussian elimination (and back substitution) as examples of direct solution methods. We'll consider an example of an iterative method now." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Iterative methods - Jacobi's method\n", "\n", "Consider our matrix system\n", "\n", "$$A\\pmb{x}=\\pmb{b} \\quad \\iff \\quad \\sum_{j=1}^nA_{ij}x_j=b_i,\\quad \\textrm{for}\\quad i=1,2,\\ldots, n.$$\n", "\n", "Let's rewrite this by pulling out the term involving $x_i$ (i.e. 
for each row $i$ pull out the diagonal from the summation):\n", "\n", "$$A_{ii}x_i + \\sum_{\\substack{j=1\\\\ j\\ne i}}^nA_{ij}x_j=b_i,\\quad i=1,2,\\ldots, n.$$\n", "\n", "We can then come up with a formula for our unknown $x_i$:\n", "\n", "$$x_i = \\frac{1}{A_{ii}}\\left(b_i- \\sum_{\\substack{j=1\\\\ j\\ne i}}^nA_{ij}x_j\\right),\\quad i=1,2,\\ldots, n.$$\n", "\n", "Now of course for each individual $x_i$, all the other components of $\\pmb{x}$ appearing on the RHS are also unknown and so this is an example of an implicit formula which doesn't help us directly, but does suggest the following iterative scheme:\n", "\n", "\n", "* Starting from a guess at the solution $\\pmb{x}^{(0)}$\n", "\n", "\n", "\n", "* iterate for $k>0$\n", "$$x_i^{(k)} = \\frac{1}{A_{ii}}\\left(b_i- \\sum_{\\substack{j=1\\\\ j\\ne i}}^nA_{ij}x_j^{(k-1)}\\right),\\quad i=1,2,\\ldots, n.$$\n", "\n", "\n", "Note that for this iteration, for a fixed $k$, it does not matter in which order we perform the operations over $i$ as the right hand side only contains the entries of $\\pmb{x}$ at the previous iteration." ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEWCAYAAABxMXBSAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+17YcXAAAgAElEQVR4nO3dd3hUZfr/8fedhAQJGKT33qQICKICUlSaiqDrrmDZFQu2iGv5rrpucau7v921YmPFBRuuXcSCqFRBughIR5DQiyAdAvfvjxnWMSaQwEzOZObzuq65mHPmnOfch4Hcecp5HnN3RERECisl6ABERKRkUeIQEZEiUeIQEZEiUeIQEZEiUeIQEZEiUeIQEZEiUeIQEcysnpm5maUFHYvEPyUOiXtmdoWZzTKzXWa23sw+MLPOQceVrMzsATN7Meg4JDhKHBLXzOxO4BHgr0BVoA7wJNAvyLgi6bd0STZKHBK3zCwL+CNwq7u/6e673f2gu7/r7v8XPibDzB4xs3Xh1yNmlhH+rJuZ5ZjZXWa2KVxbGRT+7Cwz22BmqRHXu8TMvgy/TzGze81shZltNbNXzaxC+LMjzTrXmdk3wKfh/T83s9Xh439rZqvM7PwilPcLM/vGzLaY2f0RcaWa2a/D5+40s9lmVjv8WTMzG2dm28xsiZn97Ch/nxPM7EEzm2FmO8zsnSMx5HNsDTMbHS53uZndEN7fG/g1cHm4BjjvuL5cKdGUOCSenQ2UBt46yjH3A2cBbYDWQAfgNxGfVwOygJrAdcATZnaKu38O7AbOjTj2CuDl8PshQH+gK1AD+BZ4Is+1uwKnAr3MrDmhmtCVQPWIax5RmPI6A02B84Dfmdmp4f13AgOBC4CTgWuBPWaWCYwLx1wlfMyTZtaiwL8t+Hn4/BpALvBYAceNAnLCx10G/NXMznP3DwnV/v7r7mXdvfVRriWJyt310isuX4R+CG84xjErgAsitnsBq8LvuwF7gbSIzzcBZ4Xf/xl4Lvy+HKFEUje8vQg4L+K86sBBIA2oBzjQIOLz3wGjIrbLAAeA84tQXq2Iz2cAA8LvlwD98rn3y4HJefY9A/y+gL+rCcDfIrabh2NMjYghDagNHALKRRz7IDAi/P4B4MWg/33oFdxLbbMSz7YClcwszd1zCzimBrA6Ynt1eN//yshz7h6gbPj9y8BUM7sZuBSY4+5HyqoLvGVmhyPOPUSon+WINXni+N+2u+8xs60RnxemvA0FxFmbUILMqy5wppltj9iXBryQz7H5xbwaKAVUynNMDWCbu+/Mc2z7o5QrSURNVRLPpgH7CDXxFGQdoR+gR9QJ7zsmd/+K0A/EPvywmQpCP2D7uHv5iFdpd18bWUTE+/VArSMbZnYSULGI5RVkDdCwgP0T85RZ1t1vPkpZtSPe1yFU69mS55h1QAUzK5fn2COxakrtJKfEIXHL3XcQagJ6wsz6m1kZMytlZn3M7P+FDxsF/MbMKptZpfDxRRkq+jKh/ocuwGsR+58G/mJmdQHC5R9tJNfrQF8z62hm6cAfADuB8iI9C/zJzBpbyGlmVhEYAzQxs6vDfy+lzOyMiL6R/FxlZs3NrAyhgQevu/uhyAPcfQ0wFXjQzEqb2WmE+odeCh+yEahnZvr5kaT0xUtcc/eHCHUO/wbYTOi37Gzg7fAhfwZmAV8C84E54X2FNYpQX8in7h75m/ejwGjgIzPbCXwOnHmUOBcCtwGvEKp97CTUn7L/eMrL4yHgVeAj4DtgOHBSuCmpJzCAUC1hA/B3IOMoZb0AjAgfW5pQ0szPQEL9HusIDU74vbuPC392JMFuNbM5hbwHSSDmrlqnSLSZWVlgO9DY3b8OOh4IDccl1Kn9bNCxSMmmGodIlJhZ33BzWibwT0I1oFXBRiUSfUocItHTj1DTzjqgMaHhtKrSS8JRU5WIiBRJ3Nc4zKyBmQ03s9eDjkVERAKqcZjZc8BFwCZ3bxmxvzeh0SepwLPu/reIz15398sKU36lSpW8Xr160Q1aRCTBzZ49e4u7Vz7WcUE9OT4CGAo8f2RHeLK5J4AehObImWlmo8MPaRVJvXr1mDVrVpRCFRFJDma2+thHBdRU5e6TgG15dncAlrv7Snc/QGg8fNxMnS0iIiHx1MdRkx/Oo5MD1DSzimb2NNDWzO4r6GQzG2yhxX5mbd68OdaxiogkrXia5NDy2efuvhW46Vgnu/swM1sP9E1PT28X9ehERASIrxpHDj+cgK0WhZysTkREik88JY6ZQGMzqx+eJG4Aobl9Cs1DK8MNzsrKikmAIiISUOIws1GEpsxuaqGlPa8Lr5mQDYwltOjNq
+GJ44pSbl8zG7Zjx47oBy0iIkCCPjnevn1713BcEZGiMbPZ7n7MBbviqanqhJ1ojeOjhRsYOXUVW3ftP/bBIiJJSjWOCP/32jxem51DaorRpXEl+retSY/mVSmTHk+Dz0REYqOwNY6EShxm1hfo26hRoxuWLVt2XGUs3vAdb89dx+gv1rJuxz7KpKfSq0U1+retSaeGFUlLTahKmojI/yRl4jgiGn0chw87M1Zt4+25a3lv/np27sulUtkM+rauTv82NTmtVhZm+T16IiJSMilxRLFzfN/BQ0xYsom3567j08WbOHDoMA0qZdKvTU36t61B3YqZUbuWiEhQkjJxRKOp6lh27DnIBwvW89bctUz/OjTd1oWnVeePF7egYtmjLfUsIhLfkjJxHFFcw3HXbt/LKzO+4emJKzi5dCn+3L8lfVpVj/l1RURiISmH4xa3muVP4q6eTXn3ts5UL1+am1+aw5BRc/l294GgQxMRiRkljihoVu1k3rqlE3f2aML789fT4+FJfLRwQ9BhiYjEREIljiCnHCmVmsKQ8xozOrszlctlMPiF2dzx3y/Yvke1DxFJLOrjiIEDuYd5Yvxynhi/nAqZ6Tx4aSvOO7VqYPGIiBSG+jgClJ6Wwh09mvD2rZ2okJnOdSNncder89ix92DQoYmInDAljhhqWTOLd7I7kd29EW9/sZZeD09i/JJNQYclInJClDhiLCMtlbt7NeXNmztSrnQag/4zk1+9Po9tGnklIiVUQiWOeF6Po3Xt8rx7W2du7taQ12fn0PUf43lqwgr2HTwUdGgiIkWizvEALN24k79/sJhPFm+iRlZp7u7VlP5tapKSormvRCQ46hyPY02qlmP4NWfw8g1nUrFsBne+Oo+LHp/ClGVbgg5NROSYlDgC1LFhJd65tROPDmjDd/sOctXw6fziuRks3vBd0KGJiBRIiSNgKSlGvzY1+eSurtx/wanM/eZbLnh0Mr96fR4bduwLOjwRkR+J+z4OM8sEngQOABPc/aVjnRPvfRxHs33PAZ4Yv5yRU1eTkgLXd27AjV0bUK50qaBDE5EEF9d9HGb2nJltMrMFefb3NrMlZrbczO4N774UeN3dbwAuLvZgi1n5Muncf2FzPrmrKz2bV2Po+OV0+8cEXp25JujQRESA4JqqRgC9I3eYWSrwBNAHaA4MNLPmQC3gyE/NpBm7WrtCGR4b2JbR2Z1oWKUsv3rjS4ZNWhF0WCIiwSQOd58EbMuzuwOw3N1XuvsB4BWgH5BDKHlAEvbJnFarPC9ffyYXnladv76/mJFTVwUdkogkubSgA4hQk+9rFhBKGGcCjwFDzexC4N2CTjazwcBggDp16sQwzOKXlprCI5e34UDuYX4/eiEZaSkM6JBY9ygiJUc8JY78nn5zd98NDDrWye4+zMzWA33T09PbRT26gJVKTWHoFW0Z/Pxs7ntrPhmlUrikba1jnygiEmXx1PSTA9SO2K4FrAsolriUkZbKM1e34+wGFbnr1Xm89+X6oEMSkSQUT4ljJtDYzOqbWTowABhdlALc/V13H5yVlRWTAONB6VKpPPuL9pxe5xRuf2Uu477aGHRIIpJkghqOOwqYBjQ1sxwzu87dc4FsYCywCHjV3RcWsdy4neQwmsqkp/GfQWfQomYWt740h4lLNwcdkogkkbh/APB4lOQHAItix56DDPz356zYvIv/DDqDjg0rBR2SiJRgcf0AYKwkS43jiKwypXjhug7UrViG60bMYtaqvCOcRUSiL6ESRzL0ceRVsWwGL15/JtWzSnPNf2byxZrtQYckIgkuoRJHstU4jqhSrjQv3XAmp2SW4ufDp7NwXXLdv4gUr4RKHMlY4ziietZJvHz9WZTNSOPq4TNYunFn0CGJSIJKqMSR7GpXKMNLN5xFWopx5bPTWb5pV9AhiUgCSqjEkaxNVZHqV8rkpevP5PBh5+KhUxg14xsSceSciAQnoRJHMjdVRWpctRzv3taZtnXKc9+b87l+5Cw27dSiUCISHQmVOOR7NcqfxAvXnsnvLmrOlOVb6P3IZD5csCHosEQkAShxJLCUFOPazvUZc1tnapQvzU0vzubu1+axc9/BoEMTkRIsoRKH+jjy17hqOd68uRO3nduIN+fk0PuRyXy+cmvQYYlICZVQiUN9HAVLT0vhrp5Nee2mjpRKNQb++3P++v4i9ucmzaKKIhIlCZU45Nja1T2F94acw8AOdRg2aSX9hn7GovXfBR2WiJQgShxJKDMjjb9e0or/XHMGW3cf4OKhU3hqwgoOHdawXRE5NiWOJNa9WRXG/rIL559alb9/uJgBw6axZdf+oMMSkTiXUIlDneNFVyEznSevPJ2Hftaa+Wt3cM1/ZrBrf27QYYlIHEuoxKHO8eNjZlx6ei2evPJ0Fq3fyY0vzFKnuYgUKKESh5yYc5tV5e8/OY3Plm/lrlfncVh9HiKSj7SgA5D4clm7WmzZtZ+/fbCYSmUz+H3f5phZ0GGJSBxR4pAfubFLAzbv3M/wKV9TuVwGt3ZvFHRIIhJH4j5xmFkD4H4gy90vCzqeZGBm3H/BqWzZtZ9/jF1CpbLpXH5GnaDDEpE4EdM+DjN7zsw2mdmCPPt7m9kSM1tuZvcerQx3X+nu18UyTvmxlBTjH5e15pzGlbjvzfmM+2pj0CGJSJyIdef4CKB35A4zSwWeAPoAzYGBZtbczFqZ2Zg8ryoxjk+OIj0thaevakermllkvzyHWau2BR2SiMSBmCYOd58E5P1p0wFYHq5JHABeAfq5+3x3vyjPa1Nhr2Vmg81slpnN2rx5cxTvIrllZqTx3DVnUKP8SVw7YqaWpBWRQIbj1gTWRGznhPfly8wqmtnTQFszu6+g49x9mLu3d/f2lStXjl60QsWyGTx/bQdKl0rl58NnsHb73qBDEpEABZE48hvbWeADA+6+1d1vcveG7v7gUQvWk+MxU7tCGUZe24Hd+3P5+fDpfLv7QNAhiUhAgkgcOUDtiO1awLoA4pAiOrX6yfz7F+1Z8+1eBo2YyZ4DmppEJBkFkThmAo3NrL6ZpQMDgNHRKFhTjsTeWQ0q8tiANnyZs51bX5rDwUOHgw5JRIpZrIfjjgKmAU3NLMfMrnP3XCAbGAssAl5194VRup6aqopB75bV+XP/Voxfspk7/vsFO/ZoKVqRZGLuiTcfUfv27X3WrFlBh5HwnpywnH+MXcLJpUuR3b0RV59dl9KlUoMOS0SOk5nNdvf2xzouoSY5VI2jeN3SrRHvDzmHNrXL85f3F3Hevyby1twcTY4okuBU45Co+Gz5Fh78YBEL1n5H8+onc98FzTinsYZFi5QkqnFIserUqBKjb+3MowPa8N2+g1w9fAZXD5/OgrX6LkQSjWocEnX7cw/x4uff8Piny9ix9yD929Tkrp5NqHVKmaBDE5GjKGyNQ4lDYmbH3oM8PXEFz035Gnf4Rce63Nq9EeXLpAcdmojkIykTh5n1Bfo2atTohmXLlgUdjoSt276Xh8ct5fU5OZTLSOO6zg248qw6VCqbEXRoIhIhKRPHEapxxKfFG77j
Hx8u4ZPFm0hPTaFv6xoM6lSPljX1wKZIPFDiUOKIWys272Lk1FW8PjuHPQcO0b7uKVzTqR69WlSjVGpCjdcQKVGUOJQ44t53+w7y2qwcRk5dxTfb9lA9qzRXnVWXgR3qUCFT/SAixS0pE4f6OEqmQ4ed8Ys3MWLqKqYs30J6Wgr929Tgmo71aV7j5KDDE0kaSZk4jlCNo+RaunEnI6eu4s05a9l78BBn1q/AoE716Nm8Gikp+c3ILyLRosShxFGi7dhzkP/O+oaRU1ezdvtezj+1Cg9d3oaTS5cKOjSRhJWUT45L4sgqU4rBXRoy6Vfd+d1FzZmwZDP9hn6mpWtF4oASh8S11BTj2s71GTX4LHbtz6X/E58x5kut+yUSpIRKHJqrKnGdUa8CY27rzKnVTyb75bn89f1F5GoRKZFAJFTi0AqAia3qyaUZdcNZ/PzsugybtJKrh89g6679QYclknQSKnFI4ktPS+GP/Vryz5+2Zs4339L38SnMW7M96LBEkooSh5RIl7WrxRs3dyQlxfjp09P478xvgg5JJGkocUiJ1bJmFu9md+bMBhW454353PfmfPbnHgo6LJGEVyISh5n1N7N/m9k7ZtYz6HgkfpySmc6IQR24pVtDRs34hp898znrtu8NOiyRhBbzxGFmz5nZJjNbkGd/bzNbYmbLzezeo5Xh7m+7+w3ANcDlMQxXSqDUFONXvZvx9FXtWLFpF30fn8LU5VuCDkskYRVHjWME0Dtyh5mlAk8AfYDmwEAza25mrcxsTJ5XlYhTfxM+T+RHeresxtu3dqJ8mVJc8ex0Ln9mGh/MX69huyJRVixTjphZPWCMu7cMb58NPODuvcLb9wG4+4MFnG/A34Bx7v5xAccMBgYD1KlTp93q1aujfBdSUuzen8tL01f/b7qSGlmlufrsegw4ozanaNZdkQLF+5QjNYE1Eds54X0FuQ04H7jMzG7K7wB3H+bu7d29feXKlaMXqZQ4mRlp/5uu5Jmr21G3YiZ//3AxZz34Cfe+8SWL1n8XdIgiJVra0T40s51AflUSA9zdj3fO6/ymOS2w6uPujwGPHbPQ76dVP86wJJGkphi9WlSjV4tqLN7wHSOnruatuTm8MnPN/2bdPf/UqqRp8SiRIjlq4nD3cjG6bg5QO2K7FqAJiCRmmlU7mQcvbcU9vZvy35lreH7aam56cQ41y5/EVWfVVTOWSBEUqY8j3FFd+si2uxfqqat8+jjSgKXAecBaYCZwhbsvLHQwR6Fp1eVYDh12Pl60kRGfrWLayq1kpKVwc7eG3H5eY0JdaiLJp7B9HEetcUQUdjHwL6AGsAmoCywCWhTi3FFAN6CSmeUAv3f34WaWDYwFUoHnopE01FQlhZW3GevxT5fzyMfL2L7nIL/v21zJQ+QoClXjMLN5wLnAx+7e1sy6AwPdfXCsAzweqnFIUbk7f35vEcOnfM3ADnX4S/+WWnFQkk5UaxzAQXffamYpZpbi7uPN7O8nGGPUqcYhx8vM+M2Fp5KRlsKTE1awP/cQ/7isNalKHiI/UtjEsd3MygKTgJfMbBOQG7uwjo+7vwu82759+xuCjkVKHjPj/3o1pXSpVB4at5QDuYd5+PI2lNKoK5EfKGzi6AfsA+4ArgSygD/GKqjjpRqHnCgzY8h5jclIS+HBDxZzIPcwj1/Rloy01KBDE4kbxfLkeHFTH4dEw4jPvuaBd7+ie9PKPHVVO0qXUvKQxBbVJ8fNbKeZfRd+7TOzQ2amx28loV3TqT5/vaQVE5Zu5rqRM9lzIO5aZ0UCUajE4e7l3P3k8Ks08BNgaGxDKzqtOS7RdsWZdfjnZa2ZtmIr1zw3k137lTxEjqvXz93fJjQ8N65ozXGJhZ+0q8WjA9oy+5tvuXr4dHbsPRh0SCKBKuwDgJdGbKYA7TnK3FIiiaZv6xqkp6WQ/fIcrnz2c1649kxNUSJJq7A1jr4Rr17ATkIjreKKmqoklnq1qMawq9uzdOMuBv77czbv3B90SCKB0KgqkSL6bPkWrh85ixrlSzP0itM5tfrxThItEl8KO6rqqInDzB7n6NOdDzm+8GJLiUNibcbX27h+5Ey+25dLj+ZVye7eiNa1ywcdlsgJidZw3FnAbEIz4p4OLAu/2gCHTjRIkZKqQ/0KTP7VudxxfhNmfL2Nfk98xtXDpzPj621BhyYSc4Wd5HA80NPdD4a3SwEfuXv3GMd3XFTjkOK0a38uL36+mmcnr2TLrgN0qF+B285tROdGlTTLrpQoUWmqiihsCXC2u28Lb58CfO7uTU840iiKmHLkhmXLlgUdjiSZvQcO8d+Z3/D0xJVs+G4frWuXJ7t7I84/tYoSiJQI0U4cg4AHgPHhXV2BB9x95IkEGSuqcUiQ9uce4s05a3lywnLWbNtLs2rlyD63EX1aVtdsuxLXopo4wgVWA84Mb0539w0nEF9MKXFIPMg9dJh3v1zH0E+Xs2LzbhpUzmTIuY3p16aGaiASl6LSOW5mzcJ/nk5o9b814VeN8D4RKUBaagqXtK3FR3d05ckrTycjLZVf/vcL7np1HvsOamyJlFzHenL8TmAwoWVj83LicNoRkXiTmmJc0Ko6vVtU48kJy/nXuKUs2biTZ65uR61TygQdnkiR6QFAkWL26eKN3P7KF6SlGE9ccTodG1UKOiQRIPrTqv/UzMqF3//GzN40s7YnGmQhr32qmT1tZq+b2c3FcU2RWDq3WVVGZ3emUtkMrho+nWcnryQRf4GTxFXYuap+6+47zawzobmqRgJPH+skM3vOzDaZ2YI8+3ub2RIzW25m9x6tDHdf5O43AT8jNLmiSIlXv1Imb93aiV4tqvHn9xZx+ytfsPeA+j2kZChs4jjyL/pC4Cl3fwcozNSgI4DekTvMLBV4AugDNAcGmllzM2tlZmPyvKqEz7kYmAJ8Ush4ReJe2Yw0nrzydP6vV1Pe/XIdlz41lTXb9gQdlsgxFfY5jjHAWuB8oB2wF5jh7q0LcW49YIy7twxvn03oGZBe4e37ANz9wUKU9Z67X1jAZ4MJdeRTp06ddqtXrz7mfYnEi/FLNnH7qLmkpBiPD2zLOY0rBx2SJKGo9nEQaiYaC/R29+1ABeD/jjO2moSG9B6RE96XLzPrZmaPmdkzwPsFHefuw4A/AHPS07VOgpQs3ZtWYXR2Z6qWK80vnpvBMxNXqN9D4lZhl47dA2wCOod35RKa7PB45Pfk09Fm4J3g7kPc/UZ3f+IYcWoFQCmx6lXK5M1bOtKnZXUe/GAx2aPmap1ziUuFHVX1e+Ae4L7wrlLAi8d5zRygdsR2LWDdcZb1A1rISUq6zIw0hl7Rlnv7NOOD+eu59MmprNqyO+iwRH6gsE1VlwAXA7sB3H0dUO44rzkTaGxm9c0sHRgAjD7Osn5ANQ5JBGbGTV0bMmJQB9bv2EevRybx8LilGnUlcaOwieOAhxpcHcDMMgtzkpmNAqYBTc0sx8yuc/dcIJtQn8ki4FV3X1j00PO9nmockjC6NKnMB7efQ4/mVXn0k2Wc/9BE3p+/Xn0fErjCjqq6G2gM9AA
eBK4FRrn7Y7EN7/joyXFJNJ+v3MoDoxeyeMNOzmpQgQcubkGzalqyVqIrFrPj9gB6EurcHuvu404sxOjTehySyHIPHWbUzDX866MlfLf3IFefVZc7ejShfBmNIpToiHriyFN4KjDA3V86nuBiTTUOSWTf7j7AQ+OW8tL01WSdVIq7ejZlYIc6WutDTli0plU/2czuM7OhZtbTQrKBlYSe7Ygr6uOQZHBKZjp/6t+S94acQ5Oq5fjN2wu46PEpTF+5NejQJEkctcZhZu8A3xLq4D4POIXQVCO3u/sXxRLhcVCNQ5KFu/Pe/PX89b1FrNuxj76ta3Bfn2bUKH9S0KFJCRSVpiozm+/urcLvU4EtQB133xm1SGNAiUOSzd4Dh3hq4gqembiCFDNu6NKAwV0aUDbjWEvuiHwvWlOOHDzyxt0PAV/Hc9JQU5Ukq5PSU7mzRxM+vrMr5zarwmOfLKPbP8bz/LRVHDx0OOjwJMEcq8ZxiPBDf4RGU50E7Am/d3ePy/GAqnFIsvtizXYefH8R07/eRr2KZbi7V1MubFVda53LUcV0VFW8U+IQCfV/jF+yib99sJilG3fRulYW9/Y5lbMbVgw6NIlTSZk49ByHyI8dOuy8MSeHh8ctZf2OfXRvWpl7+jTTA4TyI0mZOI5QjUPkx/YdPMR/PlvFkxOWs2t/Lj85vRZ39miiEVjyP0ocShwi+fp29wGenLCckVNXg8GgjvW4pVsjssqUCjo0CVi0F3ISkQRxSmY691/YnE/v7spFraozbPJKuv5zPGMXbgg6NCkhlDhEklStU8rw0OVtGHNbZ2qfUoYbX5jNb99ewL6Dmr5dji6hEoee4xApuhY1snjj5o7ccE59Xvh8Nf2GfsbSjXH7uJbEgYRKHFrISeT4pKelcP+FzRkx6Ay27t7PxUOn8PL0b7T2h+QroRKHiJyYbk2r8P7t53BGvQr8+q353PLSHHbsOXjsEyWpKHGIyA9UKVeakYM68OsLmjHuq430eXQSM1dtCzosiSNKHCLyIykpxuAuDXnj5o6USkvh8mem8ejHyzh0WE1XosQhIkfRunZ53htyDv3a1OThj5cy8N+fs2773qDDkoCViMRhZplmNtvMLgo6FpFkUzYjjYcvb8NDP2vNwrU76PPoZD5coGc+kllME4eZPWdmm8xsQZ79vc1siZktN7N7C1HUPcCrsYlSRArj0tNrMWbIOdSpUIabXpzNXa/OY822PUGHJQGI6ZQjZtYF2AU87+4tw/tSgaVADyAHmAkMBFKBB/MUcS1wGlAJKA1scfcxx7quphwRiZ0DuYd5+OOlDJ/8NYfdufT0mmR3b0ydimWCDk1OUNzMVWVm9YAxEYnjbOABd+8V3r4PwN3zJo0j5/8FyASaA3uBS9z9RyvTmNlgYDBAnTp12q1evTrq9yIi39uwYx9PT1zBqBnfkHvYuaRtTW7t3oj6lTKDDk2OU2ETRxDrStYE1kRs5wBnFnSwu98PYGbXEKpx5LucmbsPM7P1QN/09PR20QtXRPJTLas0D1zcglu6NeSZSSt5afpq3pyTQ/82Nbn13EY0rFw26BAlRoLoHM9vCbJjVnvcfcSxmqn05LhI8atycml+e1FzJv2qO9d1rs8HCzbQ46GJ3P7KXJZv0tQliSiIxN3ZFtsAAA+MSURBVJED1I7YrgWsi0bBmqtKJDhVypXm/gubM/me7tzQpQHjvtpIj4cnkf3yHM19lWCC6ONII9Q5fh6wllDn+BXuvjBa11TnuEjwtu0+wL8nr+T5qavYc/AQF7Sszj29m6kTPY7FxXocZjYKmAY0NbMcM7vO3XOBbGAssAh4NVpJQzUOkfhRITOde3o3Y8o953Jrt0ZMXLqZCx+fzIcL1gcdmpyghFoBUGuOi8SvNdv2kD1qLvPWbOeajvW474JmZKSlBh2WRIiLGkdxU+e4SPyqXaEMr914Ntd2qs+Iqav42dPT9ABhCZVQiUNE4lt6Wgq/69ucp69qx8otu7nwsclasrYESqjEoT4OkZKhd8tqvD/kHOpVyuTGF2bzpzFfcSA330e0JA4lVOJQU5VIyVG7Qhleu+lsrulYj+FTvuZnz0wj51s1XZUECZU4RKRkyUhL5YGLW/DklaezYtMuLnh0MuO+2hh0WHIMCZU41FQlUjJd0Ko6Y4Z0pk7FMtzw/Cz+8t5XHDykpqt4lVCJQ01VIiVX3YqZvHFzR35+dl3+PTnUdLVWi0bFpYRKHCJSsmWkpfLHfi0ZekVblm0MNV09O3kl+3MPBR2aREioxKGmKpHEcNFpNXj3ts60qpnFn99bxLn/nMibc3K05nmcSKgnx4/QXFUiiWPKsi38/cPFzF+7g2bVynFP72Z0a1oZs/wm2pYTkZRPjotI4uncuBLv3NqJxwe2Ze/BQwwaMZMBwz5n7jffBh1a0lLiEJG4l5Ji9G1dg3F3dOVP/VqwYvMuLnlyKje9MJsVm3cFHV7SUVOViJQ4u/fnMnzK1zwzcQX7cg/zs/a1uP28JlTLKh10aCVa3Kw5Xpw0O65Ictm6az9Dxy/nxc9Xk5piDOpUn5u6NiTrpFJBh1YiJWXiOEI1DpHksmbbHh4at5S3v1hLxcx0/ty/Fb1bVgs6rBJHneMikjRqVyjDw5e34d3szlQ9uTQ3vTibIaPm8u3uA0GHlpCUOEQkYbSsmcXbt3bijvOb8P789fR4eBIfadr2qFPiEJGEUio1hdvPb8zo7M5UKZfB4Bdm88tX5rJ9j2of0aLEISIJqXmNk3n71k7cfl5jxnwZqn18rJl3oyLuE4eZdTOzyWb2tJl1CzoeESk50tNSuKNHE97J7kTFzHSuf34Wd/73C3bsORh0aCVaTBOHmT1nZpvMbEGe/b3NbImZLTeze49RjAO7gNJATqxiFZHE1aJGFqOzOzPk3Ea8M28dPR+ZyKeLVfs4XjEdjmtmXQj90H/e3VuG96UCS4EehBLBTGAgkAo8mKeIa4Et7n7YzKoCD7n7lce6robjikhB5ufs4O7X5rFk404ua1eL317UXM99hMXFcFx3nwRsy7O7A7Dc3Ve6+wHgFaCfu89394vyvDa5+5HVXL4FMgq6lpkNNrNZZjZr8+bNMbkfESn5WtXKYvRtncju3oi35q6l18OTtOpgEQXRx1ETWBOxnRPely8zu9TMngFeAIYWdJy7DwP+AMxJT0+PUqgikogy0lK5u1dT3rqlI1knleKG52dx/ciZrNmmNc8LI4jEkd9cyAW2l7n7m+5+o7tf7u4TjlawVgAUkaI4rVZ5xgzpzK8vaMbUFVs5/6GJDP10mRaOOoYgEkcOUDtiuxawLhoFayEnESmqUqkpDO7SkE/u6sp5p1bhnx8tpc8jk5mybEvQocWtIBLHTKCxmdU3s3RgADA6gDhERP6netZJPHllO0Ze24HD7lw1fDrZL89h43f7gg4t7sR6OO4oYBrQ1MxyzOw6d88FsoGxwCLgVXdfGI3rqalKRE5U1yaV+fCXXbjj/CZ89NVGzvvXRJ6dvJLcQ4ePfXKSSKjZcTWtuohE0+qtu3lg9E
LGL9lMs2rl+HP/lrSvVyHosGJG06rrOQ4RiQJ3Z+zCjfzx3YWs27GPn7arxb19mlGxbIFPB5RYcfEch4hISWdm9G5ZjY/v6spNXRvy1ty1nPuvibwwbRWHDifeL96FkVCJQ6OqRCRWyqSncW+fZnxw+zk0r34yv31nIX0fn8LMVXmfcU58aqoSESkid+f9+Rv4y3tfsW7HPi5pW5P7+jSjyskle83zpOzjUOe4iBSnPQdyeXL8CoZNWkmpVGPIeY0Z1Kk+6WklszEnKRPHEapxiEhxWr11N38a8xUfL9pEg8qZPNC3BV2aVA46rCJT57iISDGpWzGTZ39xBv+55gwOH3Z+/twMBj8/K2HnvlLiEBGJku7NqjD2ji78qndTJi/bwvkPTeShcUvZeyCx5r5KqMShUVUiErSMtFRu6daIT+/uSs8W1Xjsk2Wc/9BExi7cEHRoUaM+DhGRGJq2YisPjF7Iko07ufC06vzh4hZUitOHB9XHISISB85uWJExQzpzd88mjFu4kZ4PT2L0vHWU5F/alThERGKsVGoK2ec2ZsyQztSuUIYho+Yy+IXZbCqhM+8qcYiIFJMmVcvxxk1n8+sLmjFp6WbOf2gir8/OKXG1j4RKHOocF5F4lxZeOOqD28+habVy3P3aPAaNmMm67XuDDq3Q1DkuIhKQw4ed56et4u8fLiE1xbj/wlMZcEZtzPJbYTv21DkuIhLnUlKMazrVZ+wvu3BarSzue3M+Vw2fHvcPDipxiIgErE7FMrx0/Zn89ZJWzFuzg16PTGLk1FUcjtNp25U4RETigJlxxZl1GHtHF9rXq8DvRy/kkqemMnv1t0GH9iNKHCIicaRm+ZMYOegMHr68Neu37+UnT01lyKi5rI2jzvO0oAM4FjNLAf4EnAzMcveRAYckIhJTZsYlbWvRs3k1npm4gmcmrWTswg3c2KUBN3ZtSGZGsD+6Y1rjMLPnzGyTmS3Is7+3mS0xs+Vmdu8xiukH1AQOAjmxilVEJN5kZqRxZ8+mfHp3N3q1qMZjny7n3H9N4I3ZOYH2f8R0OK6ZdQF2Ac+7e8vwvlRgKdCDUCKYCQwEUoEH8xRxbfj1rbs/Y2avu/tlx7quhuOKSCKavXobf3z3K+bl7KB1rSx+17c57epWiFr5cTEc190nAXkX5O0ALHf3le5+AHgF6Ofu8939ojyvTYSSy5HeoQLnJjazwWY2y8xmbd68ORa3IyISqHZ1K/DWLZ14+PLWbPxuPz95ahrZL88h59viHb4bROd4TWBNxHZOeF9B3gR6mdnjwKSCDnL3YcAfgDnp6enRiFNEJO6kpIT6Pz69uyu3n9eYjxdt5Lx/TeSfY5ewe39u8cRQLFf5ofweiSywvczd97j7de5+m7s/cbSC3f1ddx+clZV1wkGKiMSzMulp3NGjCZ/e1Y3eLasxdPxyuv9zArNX523kib4gEkcOUDtiuxawLhoFa64qEUk2NcqfxKMD2vLGzR1pWq0cdStmxvyaQYzpmgk0NrP6wFpgAHBFAHGIiCSMdnVP4YXrziyWa8V6OO4oYBrQ1MxyzOw6d88FsoGxwCLgVXdfGI3rqalKRCT2YlrjcPeBBex/H3g/2tczs75A30aNGkW7aBERCUuoKUdU4xARib2EShzqHBcRib2EShyqcYiIxF5CJQ4REYm9hEocaqoSEYm9hEocaqoSEYm9mM6OGxQz2wysPs7TKwFbohhOSZPM9697T17JfP+R917X3Ssf64SETBwnwsxmFWZa4USVzPeve0/Oe4fkvv/jufeEaqoSEZHYU+IQEZEiUeL4sWFBBxCwZL5/3XvySub7L/K9q49DRESKRDUOEREpEiUOEREpEiWOCGbW28yWmNlyM7s36HiKk5mtMrP5ZvaFmc0KOp5YM7PnzGyTmS2I2FfBzMaZ2bLwn6cEGWOsFHDvD5jZ2vD3/4WZXRBkjLFiZrXNbLyZLTKzhWZ2e3h/snz3Bd1/kb5/9XGEmVkqsBToQWh525nAQHf/KtDAiomZrQLau3tSPARlZl2AXcDz7t4yvO//Advc/W/hXxxOcfd7gowzFgq49weAXe7+zyBjizUzqw5Ud/c5ZlYOmA30B64hOb77gu7/ZxTh+1eN43sdgOXuvtLdDwCvAP0CjklixN0nAdvy7O4HjAy/H0noP1TCKeDek4K7r3f3OeH3OwmtQlqT5PnuC7r/IlHi+F5NYE3Edg7H8RdagjnwkZnNNrPBQQcTkKruvh5C/8GAKgHHU9yyzezLcFNWQjbVRDKzekBbYDpJ+N3nuX8owvevxPE9y2dfMrXjdXL304E+wK3h5gxJHk8BDYE2wHrgX8GGE1tmVhZ4A/ilu38XdDzFLZ/7L9L3r8TxvRygdsR2LWBdQLEUO3dfF/5zE/AWoaa7ZLMx3AZ8pC14U8DxFBt33+juh9z9MPBvEvj7N7NShH5ovuTub4Z3J813n9/9F/X7V+L43kygsZnVN7N0YAAwOuCYioWZZYY7yjCzTKAnsODoZyWk0cAvwu9/AbwTYCzF6sgPzbBLSNDv38wMGA4scveHIj5Kiu++oPsv6vevUVURwkPQHgFSgefc/S8Bh1QszKwBoVoGQBrwcqLfu5mNAroRmlJ6I/B74G3gVaAO8A3wU3dPuE7kAu69G6FmCgdWATceafNPJGbWGZgMzAcOh3f/mlA7fzJ89wXd/0CK8P0rcYiISJGoqUpERIpEiUNERIpEiUNERIpEiUNERIpEiUNERIpEiUPkKMxsV/jPemZ2RZTL/nWe7anRLF8kVpQ4RAqnHlCkxBGecflofpA43L1jEWMSCYQSh0jh/A04J7xWwR1mlmpm/zCzmeGJ4W4EMLNu4fUOXib0kBVm9nZ48siFRyaQNLO/ASeFy3spvO9I7cbCZS8Ir5FyeUTZE8zsdTNbbGYvhZ8EFilWaUEHIFJC3Avc7e4XAYQTwA53P8PMMoDPzOyj8LEdgJbu/nV4+1p332ZmJwEzzewNd7/XzLLdvU0+17qU0FO8rQk93T3TzCaFP2sLtCA0j9pnQCdgSvRvV6RgqnGIHJ+ewM/N7AtC01VUBBqHP5sRkTQAhpjZPOBzQhNpNuboOgOjwpPObQQmAmdElJ0TnozuC0JNaCLFSjUOkeNjwG3uPvYHO826AbvzbJ8PnO3ue8xsAlC6EGUXZH/E+0Po/7AEQDUOkcLZCZSL2B4L3ByeohozaxKeWTivLODbcNJoBpwV8dnBI+fnMQm4PNyPUhnoAsyIyl2IRIF+WxEpnC+B3HCT0wjgUULNRHPCHdSbyX+50Q+Bm8zsS2AJoeaqI4YBX5rZHHe/MmL/W8DZwDxCs5X+yt03hBOPSOA0O66IiBSJmqpERKRIlDhERKRIlDhERKRIlDhERKRIlDhERKRIlDhERKRIlDhERKRI/j+E7mlNAsK8xwAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" }, { "name": "stdout", "output_type": "stream", "text": [ "[-0.16340811 -0.01532703 0.27335262 0.36893551]\n", "[-0.16340816 -0.01532706 0.27335264 0.36893555]\n" ] } ], "source": [ "A = np.array([[10., 2., 3., 5.],[1., 14., 6., 2.],[-1., 4., 16., -4],[5. ,4. ,3. ,11. ]])\n", "b = np.array([1., 2., 3., 4.])\n", "\n", "# an initial guess at the solution - here just a vector of zeros of length the number of rows in A\n", "x = np.zeros(A.shape[0]) \n", "\n", "tol = 1.e-6 # iteration tolerance\n", "it_max = 1000 # upper limit on iterations if we don't hit tolerance\n", "residuals=[] # store residuals\n", "\n", "for it in range(it_max):\n", " x_new = np.zeros(A.shape[0]) # initialise the new solution vector\n", " for i in range(A.shape[0]):\n", " x_new[i] = (1./A[i, i]) * (b[i] - np.dot(A[i, :i], x[:i]) \n", " - np.dot(A[i, i + 1:], x[i + 1:]))\n", "\n", " residual = sl.norm(A @ x - b) # calculate the norm of the residual r=Ax-b for this latest guess\n", " residuals.append(residual) # store it for later plotting\n", " if (residual < tol): # if less than our required tolerance jump out of the iteration and end.\n", " break\n", "\n", " x = x_new # update old solution\n", "\n", "# plot the log of the residual against iteration number \n", "fig = plt.figure(figsize=(6, 4))\n", "ax1 = plt.subplot(111)\n", "ax1.semilogy(residuals) # plot the log of the residual against iteration number \n", "ax1.set_xlabel('Iteration')\n", "ax1.set_ylabel('Residual')\n", "ax1.set_title('Convergence plot')\n", "plt.show()\n", "\n", "print(x_new) # our solution vector\n", "print(sl.inv(A) @ b) # check against scipy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Iterative methods - Gauss-Seidel's method\n", "\n", "We can make a small improvement to Jacobi's method using the updated components of the solution vector as soon as they become available:\n", "\n", "\n", "* Starting from a guess at the solution $\\pmb{x}^{(0)}$\n", "\n", "\n", "\n", "* iterate for $k>0$\n", "$$x_i^{(k)} = \\frac{1}{A_{ii}}\\left(b_i- \\sum_{\\substack{j=1\\\\ j< i}}^nA_{ij}x_j^{(k)} - \\sum_{\\substack{j=1\\\\ j> i}}^nA_{ij}x_j^{(k-1)}\\right),\\quad i=1,2,\\ldots, n.$$\n", "\n", "Note that as opposed to Jacobi, we can overwrite the entries of $\\pmb{x}$ as they are updated, with Jacobi we need to store both the new as well as the old iteration (i.e. not overwrite the old entries until we have finished with them - which was not until the end of every iteration).\n", "\n", "As we are using updated knowledge immediately, the Gauss-Seidel algorithm should converge faster than Jacobi, but note that this convergence can only be *guaranteed* for matrices which are diagonally dominant (for every row, the magnitude of value on the main diagonal is greater than the sum of the magnitudes of all the other entries in that row), or if the matrix is *symmetric positive definite* (a property we won't define in this course). \n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise 7.3: Implement Gauss-Seidel's method.\n", "\n", "Generalise the Jacobi code to solve the matrix problem using Gauss-Seidel's method." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Sparse matrices\n", "\n", "Note that the matrices which result from the numerical solution of differential equations are generally *sparse* () which means that most entries are zero (the alternative is termed *dense*). 
, { "cell_type": "markdown", "metadata": {}, "source": [ "## Sparse matrices\n", "\n", "Note that the matrices which result from the numerical solution of differential equations are generally *sparse*, which means that most entries are zero (the alternative is termed *dense*). Knowing which entries are zero means that we can devise more efficient matrix storage methods, as well as more efficient implementations of the above algorithms (e.g. by not bothering to do operations that we know involve multiplications by zero - we know the answer will be zero).\n", "\n", "As an example, for the two iterative methods shown above (Jacobi and Gauss-Seidel), the cost of *each* iteration is quadratically dependent on the number of unknowns $n$, since we need to loop through all the entries of the $n\\times n$ matrix $A$. For a fixed number of iterations the computational cost of these methods therefore scales as $n^2$. If we know that each row only contains a fixed, small number of non-zero entries, however (as, for example, for the matrix in the example below), we can simply skip the zero entries and the cost *per iteration* becomes linear in $n$. These scalings of $n^2$ for *dense* and $n$ for *sparse* matrices for the cost per iteration are typical for iterative methods. Unfortunately this does not mean that the overall cost of an iterative method is also $n^2$ or $n$, as the number of iterations needed to achieve a certain accuracy quite often also increases with problem size. The number of required iterations typically increases only very slowly, however, so that the cost of the method is still considerably cheaper than that of direct methods, in particular for very large problems.\n", "\n", "A huge range of iterative solution methods exists, and the literature on this topic is massive. Below is an example of using scipy to access the Conjugate Gradient algorithm, a popular method suitable for matrices which result from the numerical solution of differential equations." ] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[-2. 0.01579799 0. ... 0. 0.\n", " 0. ]\n", " [ 0.01579799 -2. 0.96907927 ... 0. 0.\n", " 0. ]\n", " [ 0. 0.96907927 -2. ... 0. 0.\n", " 0. ]\n", " ...\n", " [ 0. 0. 0. ... -2. 0.293628\n", " 0. ]\n", " [ 0. 0. 0. ... 0.293628 -2.\n", " 0.02000908]\n", " [ 0. 0. 0. ... 0. 0.02000908\n", " -2. 
]]\n", "This is the same matrix but now stored in a sparse matrix data structure.\n", " (0, 0)\t-2.0\n", " (0, 1)\t0.01579798887096051\n", " (1, 0)\t0.01579798887096051\n", " (1, 1)\t-2.0\n", " (1, 2)\t0.9690792657586554\n", " (2, 1)\t0.9690792657586554\n", " (2, 2)\t-2.0\n", " (2, 3)\t0.4426848628892317\n", " (3, 2)\t0.4426848628892317\n", " (3, 3)\t-2.0\n", " (3, 4)\t0.33287797242853423\n", " (4, 3)\t0.33287797242853423\n", " (4, 4)\t-2.0\n", " (4, 5)\t0.8349961890669807\n", " (5, 4)\t0.8349961890669807\n", " (5, 5)\t-2.0\n", " (5, 6)\t0.2759243129018931\n", " (6, 5)\t0.2759243129018931\n", " (6, 6)\t-2.0\n", " (6, 7)\t0.5899642313423469\n", " (7, 6)\t0.5899642313423469\n", " (7, 7)\t-2.0\n", " (7, 8)\t0.5338100213881314\n", " (8, 7)\t0.5338100213881314\n", " (8, 8)\t-2.0\n", " :\t:\n", " (41, 41)\t-2.0\n", " (41, 42)\t0.6990321498971347\n", " (42, 41)\t0.6990321498971347\n", " (42, 42)\t-2.0\n", " (42, 43)\t0.44120676550363935\n", " (43, 42)\t0.44120676550363935\n", " (43, 43)\t-2.0\n", " (43, 44)\t0.6627012118552\n", " (44, 43)\t0.6627012118552\n", " (44, 44)\t-2.0\n", " (44, 45)\t0.08354089874916015\n", " (45, 44)\t0.08354089874916015\n", " (45, 45)\t-2.0\n", " (45, 46)\t0.06076481090638697\n", " (46, 45)\t0.06076481090638697\n", " (46, 46)\t-2.0\n", " (46, 47)\t0.2600948682739955\n", " (47, 46)\t0.2600948682739955\n", " (47, 47)\t-2.0\n", " (47, 48)\t0.29362800321791604\n", " (48, 47)\t0.29362800321791604\n", " (48, 48)\t-2.0\n", " (48, 49)\t0.02000907848336786\n", " (49, 48)\t0.02000907848336786\n", " (49, 49)\t-2.0\n", "Now we execute the CG algorithm on our problem, with our callback function returning information on the residual at each iteration.\n", "1 2.3562251696335856\n", "2 1.0239182056504734\n", "3 0.401992541842747\n", "4 0.223770573282838\n", "5 0.10784772944319773\n", "6 0.04528275512406776\n", "7 0.016783187792529596\n", "8 0.007841550759271124\n", "9 0.003427366089593852\n", "10 0.0013261858767582826\n", "11 0.0005783870772007964\n", "12 0.0002434899030345161\n", "13 7.049175749160568e-05\n", "14 2.7297491399353727e-05\n", "15 9.758147321408194e-06\n", "16 2.1383778140400358e-06\n", "17 8.453340360899756e-07\n", "18 4.027234283292924e-07\n", "19 1.244334213076751e-07\n", "20 3.765897211891967e-08\n", "21 1.380157208085994e-08\n", "22 3.2245429787673083e-09\n", "23 1.211810534395043e-09\n", "24 3.697853422438397e-10\n" ] } ], "source": [ "import scipy.sparse.linalg\n", "\n", "n = 50\n", "main_diag = np.ones(n) #just a vector of ones\n", "off_diag = np.random.random(n-1) # to make it a bit more interesting make the off-diagonals random\n", "A = np.diag(-2*main_diag,0) + np.diag(1.*off_diag,1) + np.diag(1.*off_diag,-1)\n", "# A random RHS vector\n", "b = np.random.random(A.shape[0])\n", "\n", "print(A) # print our A in \"dense\" matrix format\n", "\n", "sA = scipy.sparse.csr_matrix(A) # The same matrix in a \"sparse\" matrix data structure where only non-zeros stored\n", "print('This is the same matrix but now stored in a sparse matrix data structure.')\n", "print(sA)\n", "\n", "# now use a scipy iterative algorithm (Conjugate Gradient) to solve\n", "\n", "# First define a (callback) function which we are allowed to pass to the solver; here this is coded such that it will store and print the iteration numbers and residuals - basically a method to output some diagnostic information on the solver as it executes\n", "def gen_callback_cg():\n", " diagnostics = dict(it=0, residuals=[]) \n", " def callback(xk): # xk is the solution computed by CG at each iteration\n", " 
diagnostics[\"it\"] += 1\n", " diagnostics[\"residuals\"].append(sl.norm(A @ xk - b))\n", " print(diagnostics[\"it\"], sl.norm(A @ xk - b))\n", " return callback \n", "\n", "print('Now we execute the CG algorithm on our problem, with our callback function returning information on the residual at each iteration.')\n", "x_sol = scipy.sparse.linalg.cg(A,b,x0=None, tol=1e-10, maxiter=1000, callback=gen_callback_cg())" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "## An example of a sparse matrix\n", "\n", "Let us consider an electric circuit arranged in a regular grid of $n$ rows and $m$ columns. The nodes in the grid are numbered from 0 to $nm-1$ as indicated in the diagram below.\n", "\n", "![bla](images/circuit.png)\n", "\n", "We want to calculate the electric potential $V_i$ in all of the nodes $i$. A node $i$ somewhere in the middle of the circuit is connected via resistor to nodes $i-1$ and $i+1$ to the left and right respectively, and to the nodes $i-m$ and $i+m$ in the rows above and below. For simplicity we assume that all resistors have the same resistance value $R$. The first and last node of the circuit (0 and $nm-1$) to a battery via two additional resistors, with the same resistance value $R$.\n", "\n", "The sum of the currents coming into a node is zero (if we use a sign convention where a current coming into a node is positive and a current going out is negative. The currents between two nodes can be calculated using Ohm's law: $I=V/R$ where $R$ is the resistance of the resistor, and $V$ is the potential difference between two nodes, say $V=V_i-V_{i-1}$. Therefore we can write:\n", "\n", "\\begin{eqnarray}\n", " 0 &=& I_{i-1\\to i} + I_{i+1\\to i} + I_{i-m\\to i} + I_{i+m\\to i} \\\\\n", " &=& V_{i-1\\to i}/R + V_{i+1\\to i}/R + V_{i-m\\to i}/R + V_{i+m\\to i}/R \\\\\n", " &=& (V_{i}-V_{i-1})/R + (V_{i}-V_{i+1})/R + (V_{i}-V_{i-m})/R + (V_{i}-V_{i+m})/R \\\\\n", " &=& (4V_{i}-V_{i-1}-V_{i+1}-V_{i-m}-V_{i+m})/R\n", "\\end{eqnarray}\n", "\n", "This gives us one linear equation for each node in the circuit (with slight modifications for nodes that are not in the interior). These can be combined into a linear system $Ax=b$ which is assembled in the code below:" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[ 3. -1. 0. -1. 0. 0. 0. 0. 0. 0. 0. 0.]\n", " [-1. 3. -1. 0. -1. 0. 0. 0. 0. 0. 0. 0.]\n", " [ 0. -1. 2. 0. 0. -1. 0. 0. 0. 0. 0. 0.]\n", " [-1. 0. 0. 3. -1. 0. -1. 0. 0. 0. 0. 0.]\n", " [ 0. -1. 0. -1. 4. -1. 0. -1. 0. 0. 0. 0.]\n", " [ 0. 0. -1. 0. -1. 3. 0. 0. -1. 0. 0. 0.]\n", " [ 0. 0. 0. -1. 0. 0. 3. -1. 0. -1. 0. 0.]\n", " [ 0. 0. 0. 0. -1. 0. -1. 4. -1. 0. -1. 0.]\n", " [ 0. 0. 0. 0. 0. -1. 0. -1. 3. 0. 0. -1.]\n", " [ 0. 0. 0. 0. 0. 0. -1. 0. 0. 2. -1. 0.]\n", " [ 0. 0. 0. 0. 0. 0. 0. -1. 0. -1. 3. -1.]\n", " [ 0. 0. 0. 0. 0. 0. 0. 0. -1. 0. -1. 3.]]\n" ] } ], "source": [ "n = 4 # number of rows\n", "m = 3 # number of columns\n", "V_battery = 5.0 # voltage on the right of the battery\n", "\n", "A = np.zeros((n*m, n*m))\n", "for row in range(n):\n", " for column in range(m):\n", " i = row*m + column # node number\n", " if column>0: # left neighbour\n", " A[i,i-1] += -1.0\n", " A[i,i] += 1.0\n", " if column0: # neighbour above\n", " A[i,i-m] += -1.0\n", " A[i,i] += 1.0\n", " if row