{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [],
"source": [
"from traitlets.config.manager import BaseJSONConfigManager\n",
"cm = BaseJSONConfigManager()\n",
"cm.update('livereveal', {\n",
" 'theme': 'sky',\n",
" 'transition': 'zoom',\n",
" 'start_slideshow_at': 'selected',\n",
" 'scroll': True\n",
"})"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"# Lecture 1: Floating-point arithmetic, vector norms"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Syllabus\n",
"**Week 1:** floating point, vector norms, matrix multiplication\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Today\n",
"- Fixed- and floating-point arithmetic\n",
"- The concept of **backward** and **forward** stability of algorithms\n",
"- How to measure accuracy: vector norms"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Representation of numbers\n",
"\n",
"Real numbers represent quantities: probabilities, velocities, masses, ...\n",
"\n",
"It is important to know how they are represented in a computer (which only knows about bits)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Fixed point\n",
"\n",
"The most straightforward format for the representation of real numbers is **fixed point** representation,\n",
"\n",
"also known as **Qm.n** format.\n",
"\n",
"A **Qm.n** number is in the range $[-(2^m), 2^m - 2^{-n}]$, with resolution $2^{-n}$.\n",
"\n",
"Total storage is $m + n + 1$ bits.\n",
"\n",
"The range of numbers represented is fixed."
]
},
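{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"A minimal sketch of **Qm.n** encoding (the helper names are hypothetical; Q4.4 is chosen just for illustration): a real number is stored as an integer count of units of $2^{-n}$.\n",
"```python\n",
"m, n = 4, 4  # Q4.4: resolution 2**-4 = 0.0625\n",
"\n",
"def to_fixed(x):\n",
"    return round(x * 2**n)  # the stored integer\n",
"\n",
"def from_fixed(q):\n",
"    return q / 2**n\n",
"\n",
"print(from_fixed(to_fixed(3.3)))  # 3.3125, the nearest representable value\n",
"```"
]
},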
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Representation of numbers\n",
"The numbers in computer memory are typically represented as **floating point numbers** \n",
"\n",
"A floating point number is represented as \n",
"\n",
"$$\\textrm{number} = \\textrm{significand} \\times \\textrm{base}^{\\textrm{exponent}},$$\n",
"\n",
"where the *significand* is an integer, the *base* is a positive integer, and the *exponent* is an integer (possibly negative), e.g.\n",
"\n",
"$$ 1.2 = 12 \\cdot 10^{-1}.$$"
]
},
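{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"The decomposition can be inspected directly in Python (the base is 2 in hardware), e.g. with `math.frexp`:\n",
"```python\n",
"import math\n",
"\n",
"f, e = math.frexp(1.2)  # 1.2 == f * 2**e with 0.5 <= f < 1\n",
"print(f, e)             # 0.6 1\n",
"print(f * 2**e == 1.2)  # True\n",
"```"
]
},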
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Fixed vs Floating\n",
"\n",
"**Q**: What are the advantages/disadvantages of the fixed and floating points?"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"**A**: In most cases, they work just fine.\n",
"\n",
"However, fixed point represents numbers within a specified range and controls **absolute** accuracy.\n",
"\n",
"Floating point represents numbers with **relative** accuracy, and is suitable for computations in which the numbers have varying scale \n",
"\n",
"(i.e., $10^{-1}$ and $10^{5}$).\n",
"\n",
"In practice, if speed is of no concern, use float32 or float64."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## IEEE 754\n",
"In modern computers, floating point arithmetic is governed by the [IEEE 754 standard](https://en.wikipedia.org/wiki/IEEE_floating_point), published in **1985**; before that, different computers handled floating point numbers differently. \n",
"\n",
"IEEE 754 has:\n",
"- Floating point representation (as described above), $(-1)^s \\times c \\times b^q$.\n",
"- Two infinities, $+\\infty$ and $-\\infty$\n",
"- Two kinds of **NaN**: a quiet NaN (**qNaN**) and signalling NaN (**sNaN**) \n",
"- Rules for **rounding**\n",
"- Rules for $\\frac{0}{0}, \\frac{1}{-0}, \\ldots$\n",
"\n",
"$ 0 \\leq c \\leq b^p - 1, \\quad 1 - e_{\\max} \\leq q + p - 1 \\leq e_{\\max}$"
]
},
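{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"A small illustration of the special values (a sketch with numpy; plain Python floats follow the same standard):\n",
"```python\n",
"import numpy as np\n",
"\n",
"zero = np.float64(0.0)\n",
"with np.errstate(divide='ignore', invalid='ignore'):\n",
"    print(np.float64(1.0) / -zero)  # -inf: the sign of zero matters\n",
"    print(zero / zero)              # nan\n",
"print(np.inf - np.inf)              # nan\n",
"print(np.nan == np.nan)             # False: NaN compares unequal to everything\n",
"```"
]
},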
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## The two most common formats: single & double\n",
"\n",
"The two most common formats are **binary32** and **binary64** (also called **single** and **double** precision).\n",
"\n",
"| Name | Common Name | Base | Digits | Emin | Emax |\n",
"|------|----------|----------|-------|------|------|\n",
"|binary32| single precision | 2 | 24 | -126 | +127 | \n",
"|binary64| double precision | 2 | 53 | -1022 | +1023 | \n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Accuracy and memory\n",
"The **relative accuracy** of single precision is about $10^{-7}$, \n",
"\n",
"while for double precision it is about $10^{-16}$.\n",
"\n",
" Crucial note 1: A **float32** takes **4 bytes**; a **float64**, or double precision, takes **8 bytes**.\n",
"\n",
" Crucial note 2: These are the only two floating-point types widely supported in hardware.\n",
"\n",
" Crucial note 3: You should use **double precision** in CSE and **single precision** on GPUs and in data science.\n"
]
},
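{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"These numbers can be queried with `np.finfo`; the machine epsilon is the relative spacing of the floats:\n",
"```python\n",
"import numpy as np\n",
"\n",
"print(np.finfo(np.float32).eps)  # ~1.19e-07, i.e. 2**-23\n",
"print(np.finfo(np.float64).eps)  # ~2.22e-16, i.e. 2**-52\n",
"a32 = np.ones(10, dtype=np.float32)\n",
"a64 = np.ones(10, dtype=np.float64)\n",
"print(a32.nbytes, a64.nbytes)    # 40 80, i.e. 4 and 8 bytes per element\n",
"```"
]
},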
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Some demos (accuracy of elementary operations)"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.1040364727377892\n",
"-5.96046e-08\n"
]
}
],
"source": [
"import numpy as np\n",
"import random\n",
"#c = random.random()\n",
"#print(c)\n",
"c = np.float32(0.925924589693)\n",
"a = np.float32(8.9)\n",
"b = np.float32(c / a)\n",
"print('{0:10.16f}'.format(b))\n",
"print(a * b - c)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.0000001468220603\n"
]
}
],
"source": [
"#a = np.array(1.585858585887575775757575e-5, dtype=np.float64)\n",
"a = np.array(5.0, dtype=np.float32)\n",
"b = np.sqrt(a)\n",
"print('{0:10.16f}'.format(b ** 2 - a))"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.0\n"
]
}
],
"source": [
"a = np.array(2.28827272710, dtype=np.float32)\n",
"b = np.exp(a)\n",
"print(np.log(b) - a)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"\n",
"- For some values the inverse functions give exact answers.\n",
"- The relative accuracy of the basic operations is guaranteed by the IEEE standard.\n",
"- This guarantee does not hold on many modern GPUs."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Loss of significance\n",
"\n",
"Many operations lead to the loss of significant digits: see [loss of significance](https://en.wikipedia.org/wiki/Loss_of_significance).\n",
"\n",
"\n",
"For example, it is a bad idea to subtract two big numbers that are close: the difference will have fewer correct digits.\n",
"\n",
"This is related to algorithms and their properties (forward/backward stability), which we will discuss later."
]
},
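{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"A classical single-precision illustration: computing $1 - \\cos x$ for small $x$ directly cancels all correct digits, while the mathematically equivalent $2\\sin^2(x/2)$ does not.\n",
"```python\n",
"import numpy as np\n",
"\n",
"x = np.float32(1e-4)\n",
"naive = np.float32(1.0) - np.cos(x)  # cancellation: no correct digits left\n",
"stable = 2 * np.sin(x / 2)**2        # the same quantity, computed stably\n",
"print(naive, stable)                 # 0.0 vs roughly 5.0e-09\n",
"```"
]
},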
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Summation algorithm\n",
"\n",
"However, the rounding errors can depend on the algorithm. \n",
"\n",
"Consider the simplest problem: given $n$ floating point numbers $x_1, \\ldots, x_n$ \n",
"\n",
"compute their sum\n",
"\n",
"$$S = \\sum_{i=1}^n x_i = x_1 + \\ldots + x_n.$$\n",
"\n",
"The simplest algorithm is to add the numbers one by one. \n",
"\n",
"What is the actual error of such an algorithm? "
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Naive algorithm\n",
"\n",
"The naive algorithm adds the numbers one by one, \n",
"\n",
"$$y_1 = x_1, \\quad y_2 = y_1 + x_2, \\quad y_3 = y_2 + x_3, \\ldots$$\n",
"\n",
"The worst-case error grows as $\\mathcal{O}(n\\varepsilon)$, while the **mean-squared** error grows as $\\mathcal{O}(\\sqrt{n}\\varepsilon)$, where $\\varepsilon$ is the machine precision.\n",
"\n",
"The **Kahan algorithm** reduces the worst-case bound to $\\mathcal{O}(\\varepsilon)$ (i.e., independent of $n$). \n",
"\n",
" Can you find the $\\mathcal{O}(\\log n)$ algorithm? "
]
},
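{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"One answer to the question above is **pairwise (recursive) summation**: split the array in half, sum the halves recursively, and add the results; the error then grows only as $\\mathcal{O}(\\log n)$. A sketch (numpy's own `np.sum` uses this trick internally):\n",
"```python\n",
"import numpy as np\n",
"\n",
"def pairwise_sum(x):\n",
"    n = len(x)\n",
"    if n <= 2:\n",
"        return x.sum()\n",
"    return pairwise_sum(x[:n // 2]) + pairwise_sum(x[n // 2:])\n",
"\n",
"x = np.full(2**16, np.float32(1e-4), dtype=np.float32)\n",
"print(pairwise_sum(x))  # close to the true sum 6.5536\n",
"```"
]
},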
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Kahan summation\n",
"The following algorithm gives $2 \\varepsilon + \\mathcal{O}(n \\varepsilon^2)$ error, where $\\varepsilon$ is the machine precision.\n",
"```python\n",
"s = 0\n",
"c = 0\n",
"for i in range(len(x)):\n",
" y = x[i] - c\n",
" t = s + y\n",
" c = (t - s) - y\n",
" s = t\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Error in sum: 1.9e-04, \n",
"kahan: -1.3e-07, \n",
"dumb_sum: -1.0e-02. \n"
]
}
],
"source": [
"import math\n",
"n = 10 ** 8\n",
"sm = 1e-10\n",
"x = np.ones(n, dtype=np.float32) * sm\n",
"x[0] = 1.0\n",
"true_sum = 1.0 + (n - 1)*sm\n",
"approx_sum = np.sum(x)\n",
"math_sum = math.fsum(x)\n",
"\n",
"from numba import jit\n",
"@jit\n",
"def dumb_sum2(x):\n",
" s = np.float32(0.0)\n",
" for i in range(len(x)):\n",
" s = s + x[i]\n",
" return s\n",
"@jit\n",
"def kahan_sum(x):\n",
" s = np.float32(0.0)\n",
" c = np.float32(0.0)\n",
" for i in range(len(x)):\n",
" y = x[i] - c\n",
" t = s + y\n",
" c = (t - s) - y\n",
" s = t\n",
" return s\n",
"k_sum = kahan_sum(x)\n",
"d_sum = dumb_sum2(x)\n",
"print('Error in sum: {0:3.1e}, \\nkahan: {1:3.1e}, \\ndumb_sum: {2:3.1e}. '.format(approx_sum - true_sum, k_sum - true_sum, d_sum - true_sum))"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## More complicated example"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"2.0 0.0 0.0\n"
]
}
],
"source": [
"import numpy as np\n",
"print(math.fsum([1, 1e20, 1, -1e20]), np.sum([1, 1e20, 1, -1e20]), 1 + 1e20 + 1 - 1e20)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Summary of floating-point \n",
"You should be careful with floating point arithmetic, since it may give incorrect answers due to round-off errors.\n",
"\n",
"For many standard algorithms, the stability is well-understood and problems can be easily detected."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Vectors\n",
"In NLA we typically work not with **numbers**, but with **vectors**. \n",
"\n",
"Recall that a vector in a fixed basis of size $n$ can be represented as a 1D array with $n$ numbers. \n",
"\n",
"Typically, it is considered as an $n \\times 1$ matrix (**column vector**).\n",
"\n",
"**Example:** \n",
"Polynomials with degree $\\leq n$ form a linear space. \n",
"Polynomial $ x^3 - 2x^2 + 1$ can be considered as a vector $\\begin{bmatrix}1 \\\\ -2 \\\\ 0 \\\\ 1\\end{bmatrix}$ in the basis $\\{x^3, x^2, x, 1\\}$."
]
},
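{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"The correspondence can be checked numerically; `np.polyval` uses exactly this coefficient ordering:\n",
"```python\n",
"import numpy as np\n",
"\n",
"p = np.array([1, -2, 0, 1])  # x**3 - 2*x**2 + 1 in the basis {x**3, x**2, x, 1}\n",
"q = np.array([0, 0, 1, 0])   # the polynomial x\n",
"print(np.polyval(p, 2.0))    # 1.0  (= 8 - 8 + 0 + 1)\n",
"# Adding polynomials is adding their coefficient vectors:\n",
"print(np.polyval(p + q, 2.0) == np.polyval(p, 2.0) + np.polyval(q, 2.0))  # True\n",
"```"
]
},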
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Vector norm\n",
"Vectors typically provide an (approximate) description of a physical (or some other) object. \n",
"\n",
"One of the main questions is **how accurate** the approximation is (1%? 10%?). \n",
"\n",
"What counts as an acceptable accuracy depends, of course, on the particular application. For example:\n",
"- In partial differential equations, accuracies of $10^{-5} - 10^{-10}$ are typical\n",
"- In data mining, sometimes an error of $80\\%$ is ok, since the interesting signal is corrupted by huge noise."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Distances and norms\n",
"A norm is a **quantitative measure of the smallness of a vector** and is typically denoted $\\Vert x \\Vert$.\n",
"\n",
"The norm should satisfy certain properties:\n",
"\n",
"- $\\Vert \\alpha x \\Vert = |\\alpha| \\Vert x \\Vert$,\n",
"- $\\Vert x + y \\Vert \\leq \\Vert x \\Vert + \\Vert y \\Vert$ (triangle inequality),\n",
"- If $\\Vert x \\Vert = 0$ then $x = 0$.\n",
"\n",
"The distance between two vectors is then defined as\n",
"$$\n",
" d(x, y) = \\Vert x - y \\Vert.\n",
"$$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Standard norms\n",
"The most well-known and widely used norm is the **Euclidean norm**:\n",
"$$\\Vert x \\Vert_2 = \\sqrt{\\sum_{i=1}^n |x_i|^2},$$\n",
"which corresponds to the distance in everyday life (the vectors may have complex elements, hence the modulus)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## $p$-norm\n",
"The Euclidean norm, or $2$-norm, is a member of an important class of $p$-norms:\n",
"$$\n",
" \\Vert x \\Vert_p = \\Big(\\sum_{i=1}^n |x_i|^p\\Big)^{1/p}.\n",
"$$\n",
"There are two very important special cases:\n",
"- The infinity norm, or Chebyshev norm, defined as the maximal absolute value of the elements: $\\Vert x \\Vert_{\\infty} = \\max_i | x_i|$\n",
"- The $L_1$ norm (or **Manhattan distance**), defined as the sum of absolute values of the elements of $x$: $\\Vert x \\Vert_1 = \\sum_i |x_i|$ \n",
" \n"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"We will give examples where the Manhattan norm is very important: it all relates to the **compressed sensing** methods \n",
"that emerged in the mid-00s as one of the most popular research topics."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Equivalence of the norms\n",
"All norms are equivalent in the sense that\n",
"$$\n",
" C_1 \\Vert x \\Vert_* \\leq \\Vert x \\Vert_{**} \\leq C_2 \\Vert x \\Vert_*\n",
"$$ \n",
"for some constants $C_1(n), C_2(n)$ and all $x \\in \\mathbb{R}^n$, for any pair of norms $\\Vert \\cdot \\Vert_*$ and $\\Vert \\cdot \\Vert_{**}$. The equivalence of norms basically means that if a vector is small in one norm, it is small in any other norm. However, the constants can be large."
]
},
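{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"For the $p$-norms the constants are known explicitly, e.g. $\\Vert x \\Vert_\\infty \\leq \\Vert x \\Vert_2 \\leq \\sqrt{n}\\, \\Vert x \\Vert_\\infty$ and $\\Vert x \\Vert_2 \\leq \\Vert x \\Vert_1 \\leq \\sqrt{n}\\, \\Vert x \\Vert_2$, which is easy to check numerically:\n",
"```python\n",
"import numpy as np\n",
"\n",
"np.random.seed(0)\n",
"n = 1000\n",
"x = np.random.randn(n)\n",
"n1, n2, ninf = (np.linalg.norm(x, p) for p in (1, 2, np.inf))\n",
"print(ninf <= n2 <= np.sqrt(n) * ninf)  # True\n",
"print(n2 <= n1 <= np.sqrt(n) * n2)      # True\n",
"```"
]
},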
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Computing norms in Python\n",
"The numpy package has all you need for computing norms (the `np.linalg.norm` function)\n"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Relative error: 0.00343922107489\n"
]
}
],
"source": [
"import numpy as np\n",
"n = 100\n",
"a = np.ones(n)\n",
"b = a + 1e-3 * np.random.randn(n)\n",
"print('Relative error:', np.linalg.norm(a - b, np.inf) / np.linalg.norm(b, np.inf))"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Unit disks in different norms\n",
"A unit disk is the set of points such that $\\Vert x \\Vert \\leq 1$. For the Euclidean norm it is the usual disk; for other norms the \"disks\" look different."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"data": {
"text/plain": [
""
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYYAAAEKCAYAAAAW8vJGAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xmc1PWd7/vXuzcWJdIsstqNRMagkqC0gEnGZBI14pkI\nxkSFTDRnXGIm3nnMmZtzJJo4uSTx4My5J7m5caJonCxHUKOimJFjXEMSBe1mQLYo2NJNC7I0BbII\nvX3OH/UrU91Udde+fp6PR0HVr36/+n1+3V31qe8uM8M555yLqMh3AM455wqLJwbnnHO9eGJwzjnX\niycG55xzvXhicM4514snBuecc714YnDOOdeLJwbnnHO9eGIoc5I2Sfp0Gsf/XNL3k3ktSdslXZSL\n+FI9by4UWjzORXhiKHKSTNIZfbZ9V9L/SuR4MzvbzF4Kjkvrgyr6tTIl1dcstA/dQounWEi6RVKj\npOOSfj7AviMkLZd0RFKLpAU5CrPkVOU7AOdc7kiqMrOufMeRhJ3A94HPAUMG2PduoAMYA0wH/l3S\nejPblN0QS4+XGEpc8E31m5Jel3RQ0sOSBvd5/iJJvwLqgKckHZb03+K83rmS1ko6JOlh4ITXinp8\nq6R3gn3fkPTZGK/3EUlvS7qmn/gvSuRaoo7p71qm9/OzGC/pMUl7g5j+foCf67ckbZYUkvRvsWJJ\nJ54Y5+vv9zhV0kuSDgTVb5f3OfZWSa8DRyRVBdv+a/B6RyT9TNIYSSuD39dzkmrjXX+f2L4s6eUg\npncl7ZA0J5FjB2Jmj5vZE0D7ADGcBFwJfMfMDpvZH4AVwFcyEUfZMTO/FfENMOCMPtu+C/yv4P52\n4FVgPDAC2ALcHLXvduCivvfjnKsGaAH+C1ANfBHoBL4f47XOBHYA44PHk4APR+8HnAe0An/dzzn7\nxhf3WuId12dbzOMJf0lqAu4IrnMy0Ax8rp/X3wicFrzWHyM/h0zEE+f4eLFXA9uA24LYPwMcAs6M\nOnZdEOuQqG2rCX+7ngDsAdYC5wKDgBeAf0rwb3Ax8D5wVRDLN4GWGPv9BjgQ5/abAc7xfeDn/Tx/\nLvB+n23fBJ7K93u0GG9eYigPPzaznWa2H3iKcDE7FbMJv/F/ZGadZvYo8FqcfbsJf8CcJanazLab\n2VtRz/8l4W9015nZb5KIId1riXf8+cBoM1tkZh1m1gzcB8QsyQR+YmY7gtf6ATA/yVj6iyeZfWcD\nJwOLg9hfIPwhPL/PsTvM7P2obf+/me02s3eA3wNrzOw/zOw4sJzwh20ipgE/NLNHzKwT+CVQ17f0\nY2Z/bWbD49z+OsFzxXMycLDPtoPAsDRftyx5Yih+3YQ/rKNVE/4mH/Fu1P2jhN9EqRgPvGPB17FA\nS6wdzWwb8A+ESy97JD0kaXzULjcDL5vZi0nGkO61xDu+HhgfVMUckHSA8DfwMf281o6o+y2Efz6R\nqpXDwW1livEks+94YIeZ9fSJZ0KcWCN2R91/P8bjRH+204BHox6fChw2s2MJHp8Jh4EP9dn2IcIl\nJ5ckTwzFr5VwNU2004nzgT2AgRbn2AVMkKSobXVxX8xsqZl9kvCHrgF3RT19M+FvlT9MIc5EJLvQ\nyA7g7T7fYoeZ2WX9HHNa1P06wg2lmNmDZnZycIvUtWdz4ZOdwGmSot/PdcA7UY+zcn5Jwwn/HPZG\nbf4icEJCDNovDse5DZRAB/ImUCVpStS2jwHe8JwCTwzF72Hg25ImSqoIGmo/T+9vcInaTbhuPZ5X\ngC7g74MGzC8AM2PtKOlMSZ+RNAg4RvgbaHfULoeAS4ELJS1OIdaBDHQtfb0KvBc00g6RVCnpHEnn\n93PMN4Kf+wjCpYuHMxhPMtYAR4D/Jqla4XEfnwceytQJFB6v8vMYT00j/HtdEPxN/Cfg7wiXFHsx\nszlRCbPvLWZjdfCag4FKoFLSYEkn9KY0syPA
48AiSSdJ+gQwF/hVipdc1jwxFL9FwMvAH4AQ8M/A\nl81sYwqv9d8JJ5kDkr7Z90kz6wC+AHw1ONfVhN+MsQwi3Ci5j3AVyKmEPzyjX+8AcDEwR9L3Uoi3\nP/1eS19m1k34w3Q68HYQ9/3AKf0cthT4LeFG6mbCDaQZiScZwe/lcmAO4bj/FbjWzP6UwdOcRriB\nva9pwIPABYT/Jv4fYJ6Zbc7Qeb9N+EvFQuBvgvvfhg9KINF/U39HuEvrHmAZ8HXzrqopUe/qYudc\nIiRtB24ws+fyHUu2SaoB1gMfDRqXo5/7KfCmmWWrStDlgZcYnHP9Cno6Te2bFALTCHeddSXEE4Nz\nLh3nAJmssnIFwKuSnHPO9eIlBuecc70U5SR6o0aNskmTJuU7DOecKypNTU37zGz0QPsVZWKYNGkS\njY2N+Q7DOeeKiqSEBr56VZJzzrlePDE455zrxRODc865XjwxOOec6yUjiUHSA5L2SIo5P4/Cfixp\nW7Bi1HlRz10naWtwuy4T8TjnnEtdpkoMPyc8U2Y8c4Apwe0m4KcQXrwb+CdgFuFZOv8p0eUEnXPO\nZUdGEoOZrQL297PLXOCXFrYaGC5pHOEFvp81s/1mFgKepf8E41xBa2oJcfeL22hqCeU7FOdSlqtx\nDBPovYJUW7At3nbnik5TS4gv37+ajq4eaqoqePCG2cyo9wKwKz65SgyKsc362X7iC0g3Ea6Goq4u\n7qJhzuVcU0uI1c3t7DzwPh1dPfQYdHb1sLq53RODK0q5Sgxt9F4GcSLh5QjbgE/32f5SrBcwsyXA\nEoCGhgaf+c8VhKaWEFff+zJdPVApqKqsoLOrBwMOvR9rlmrnCl+uuquuAK4NeifNBg6a2S7gGeAS\nSbVBo/MlwTbnisI9v3uLrp7w/W6DQZXCgB6De1Y1c+MvG729wRWdjJQYJC0j/M1/lKQ2wj2NqgHM\n7B7gaeAyYBtwFPjPwXP7gyUdXwteapGZ9deI7VxBeXvv4V6PDx3v7vX42c27eeFPe/je3HNYMMur\nQF1xyEhiMLP5AzxvwDfiPPcA8EAm4nAuV5paQtz7u7d4a++RAfft7jFuf2IDgCcHVxSKcnZV5/Kp\nqSXE/PvCvY8SZYYnB1c0fEoM55LQ1BJi0VObkkoKEZHk4G0OrtB5YnAuAU0tIW5bvoFrlrzC+raD\nCR9XU9m7R7YZ3LVyS6bDcy6jPDE4N4DIwLVla1rp7E6up3Ss/V/dHmLx054cXOHyxOBcP5paQtz6\n6HqOdfbEHnkZQ1XUuyreMfesambpmtZ0w3MuK7zx2bk4mlpCfOmnL5Nsa0KizQ8/fv5NNu48yJXn\nTfQR0q6geInBuTgWr9ySdFJIxrvvHWfpmlauuudlLz24guIlBuf6aGoJ8fjaNl7bfmLvIRG/eihV\n3QZ3PLmRM8cO85KDKwheYnAuSqSh+cE43+D7SwoVxJ4VMhFdPcY3HmzykoMrCJ4YnAtExigc70yt\nAqmHP08ZPHNSLfOmj0/q+HffO85tyzd4cnB551VJztF7ltR0GbC29QDDh9akdPwDf3yb0NEOZk8e\n6VVLLi+8xOAc8NjatowkhYiuHuO3m3endOy2PYf5l2feYP59q32UtMsLTwyu7DW1hHj17cKb1Lej\nq4fH17blOwxXhrwqyZWtppYQj61t49GmtpTmPsqF57fs5uzxp/jEey6nPDG4stTUEmL+klfoSHKK\ni1yLNEi3th9h4WVT8x2OKxNeleTK0mNr2wo+KUS7Z1UzN/lqcC5HMpIYJF0q6Q1J2yQtjPH8DyWt\nC25vSjoQ9Vx31HMrMhGPc/E0tYS4+8Vt7Dt0PN+hJO23m3dzzZJXPDm4rEu7KklSJXA3cDHQBrwm\naYWZbY7sY2b/JWr//ws4N+ol3jez6enG4dxAIoPXOrp6qEh1JFoW1VSKqkpxtCN+e0dnt/H42jbv\nxuqyKhMl
hpnANjNrNrMO4CFgbj/7zweWZeC8ziVs6ZpW/u9H1nGss4ceS3yiu4FUVypjSaaj2/pN\nChFrmtu91OCyKhOJYQKwI+pxW7DtBJLqgdOBF6I2D5bUKGm1pHkZiMe5XpauaeW25RvY3n404699\n7mnDufxjyY1wTsZJNZUnbNu29whX3/uKj5B2WZOJXkmxvi/Fa9W7BnjUzLqjttWZ2U5Jk4EXJG0w\ns7dOOIl0E3ATQF2dd91ziVu5cVfWXvvV7SEge9/ej3R0x9ze1WN854kNPvGey4pMlBjagNOiHk8E\ndsbZ9xr6VCOZ2c7g/2bgJXq3P0Tvt8TMGsysYfTo0enG7MpEU0uIwdUnfusuBPUjhqY86R6EZ2Vd\n9NQmr1ZyGZeJxPAaMEXS6ZJqCH/4n9C7SNKZQC3wStS2WkmDgvujgE8Am/se61yylq5p5ZOLn+fK\nn77MsylOTZFtrfuPUj9yaFqvsb7tIF++36fOcJmVdmIwsy7gFuAZYAvwiJltkrRI0uVRu84HHjKz\n6GqmqUCjpPXAi8Di6N5MzqUi0qbQduBYvkPplwG730s/xo7OHlY3t6cfkHOBjIx8NrOngaf7bLuj\nz+PvxjjuZWBaJmJwDsJVR0tWndBEVbDeT3GK76ljh7Hl3UNAeLrvdTsO0NQS8vYGlxE+8tmVjMg4\nhWz0Pio0u/sM0Ht2826fjdVljM+V5EpCU0uIHz33ZsqL7GTCqJNr2He4Iyfn2n/kxPN0dPXwnSc2\nMPLkQcw5Z5xPvOdS5onBFb3IhHid3Zbx9ZgTUQFI5Cwp9GfzrkPAIX6/dZ9PvOdS5lVJruhFJsTL\ndVKoEMybPp4zxw6jEOfju3dVs1ctuZR4icEVrch6Ck+/Hm/YTHb1GKxYv5OeAkwKEO71tOipTZwz\n4RS+cN5Eb5h2CfPE4IrS0jWtfOeJDXn/pl6oSSFifdtB1rcd5NdNbSy7cbYnB5cQr0pyRWfpmla+\nXQBJoZh0dPlYB5c4LzG4orL46S3cu6o5L43Mxa52aE2+Q3BFwhODKxpL17Ryz6rmfIdRtJaseovQ\n0Q5mTx7pVUquX16V5IpCU0uIHz//ZlbPoaCX0ccmnpL0sVWFuPJPH9vbj/Ivz7zBfF8Fzg3AE4Mr\neJFxCu++l93lOM3gqfU7uWDyyKSP7Sr0VugoHd3Gvb8rnmlDXO55YnAFraklxKKnNtGRo5bmboPf\n5Kn7ay49u3k3c3/yB1/sx8XkbQyuYDW1hJh/X3iN5lwq9FlZE1FZAd39/NiMSFfWDQA+fYbrxUsM\nrmCtbm7PeVIoFf0lhb7+xzN/8jYH14snBldwmlpC3P3iNrbuPpTvUMrC/qOd3iDtevGqJFdQIlNn\nH0twltSTB1Vy+HjsdZFd4jq6jcfXtnk3VgdkqMQg6VJJb0jaJmlhjOe/KmmvpHXB7Yao566TtDW4\nXZeJeFxxikydHS8pDBt04trNnhQy56FXW73U4IAMlBgkVQJ3AxcDbcBrklbEWKLzYTO7pc+xI4B/\nAhoIt4c1Bcf6X2eZiZQU+ltP4ZAngazqNvjGg038/Wf/whujy1wmSgwzgW1m1mxmHcBDwNwEj/0c\n8KyZ7Q+SwbPApRmIyRWZx9a2cayzx6e6yLN33zvObcs3cNU9L3vpoYxlIjFMAHZEPW4LtvV1paTX\nJT0q6bQkj3UlbOmaVh561fvTF5JXt4e46l5PDuUqE4kh1lwAfb/4PQVMMrOPAs8Bv0ji2PCO0k2S\nGiU17t27N+VgXeFoaglx2/INfOfJjQU/fXU56u6BxSu35DsMlweZSAxtwGlRjycCvYaOmlm7mUXm\nM7gPmJHosVGvscTMGsysYfTo0RkI2+VTZJqLpWta6c5zVhhxks86Gk/j9hBL17Ry94vbvPRQRjLR\nXfU1YIqk04F3gGuABdE7SBpnZruCh5cDka8hzwB3Sor0kbsE+FYGYnIFbv
HKLTmb5mIgRTD/Xd4Y\ncPvy8OjoQdUVPHiDL/ZTDtJODGbWJekWwh/ylcADZrZJ0iKg0cxWAH8v6XKgC9gPfDU4dr+k7xFO\nLgCLzGx/ujG5wrZ0TSuvbS+cb5/7DnfkO4SCFknfHZ3hxX48MZS+jAxwM7Ongaf7bLsj6v63iFMS\nMLMHgAcyEYcrfE0tIZas8pk9i1FFhZidwsyzrvj4yGeXM4WyTrNLzQ2fPN1LC2XCE4PLiaVrWrl9\n+YYTupxNGjmU7e1H8xKTS84Df3yb5n1HGD1sEF84b6IniRLmicFlXVNLiDue3BizH/Lu94p/iuty\n0dFt/HbzbgB+3dTGshu9IbpUeWJwWdPUEuKxtW1seudg3BXO3k9wsjxXWDq7evjRc2/yDxf9hSeH\nEuSJwWVFZJxCoXRJdZllwO+37mNNczvLbrrAk0OJ8cTgsmJ1c7snhTLQ0W3MX/IKl00bx5Qxw5g9\neaQniRLgicFlRe3Q0h1NLGDC8MElsQRoJnR0G0+sC09YUFNV4W0PJcBXcHMZt3RNK0tWvRVzIqxS\nYMA7nhRi6ujq4fG1bfkOw6XJSwwuI5paQjy+to0/bN1Hy/4/dz8V4SknSq1WqcQuJ6NWbtjF2eNP\n8TUdipiXGFzamlpCXL3kFR5c09orKQBUVMDnPzbe5yPKs1y+0fcf7eS25Ru49mdrcnhWl0meGFza\n7lq5ha44RYLuHnhi3c6Cn1a7urK0M1c+fvyrtu5j8dM+bXcx8sTg0rJ0TSuvZnBCvMFV+fmT7Eyj\nrmve9PFcctYYThmaWs1sLlJSvvLyI407Bt7JFRxvY3ApibQpPL9ld0Zf91hX8Q14W7F+JyL1dpTq\nSpVs1979RztZ/PQWFl42Nd+huCR4YnBJa2oJMf++1XQU4Yd4NqRbTVaqSSHinlXNAJ4ciognBpe0\n1c3tdHpScEm4Z1Uzh453+eR7RcLbGFzSZk8eSYX/5bgkPbimlS/fv9qXCC0CGXl7S7pU0huStkla\nGOP5f5S0WdLrkp6XVB/1XLekdcFtRSbicdlX6L2MXGE63ukD4IpB2lVJkiqBu4GLgTbgNUkrzGxz\n1G7/ATSY2VFJXwf+Gbg6eO59M5uebhwud1Y3t2OeGDJiSHUFg6oqOPB+V75DyQkDHm7cgQFXerVS\nwcpEiWEmsM3Mms2sA3gImBu9g5m9aGaRkU+rgYkZOK/Lg8VPb+Fnf2jOdxgl43hXT9kkhYiubmPp\nmlbm37ea25Zv8KqlApSJxDABiO6s3BZsi+d6YGXU48GSGiWtljQv3kGSbgr2a9y7d296EbuENbWE\nuPvFbTS1hFj89BbuWdXM/iOdWTtfqmMBilUxVMnNmz6ev5wyKuPjLTq6esIJYskrnhwKTCbehbH+\nXmL+uUv6G6AB+FTU5joz2ylpMvCCpA1mdsJq8Wa2BFgC0NDQUARvp+IX6Zba2dVDVaWoysG8FgeP\nlte352KwYt1Oxg8fnLVBch3dxmNr27xaqYBkosTQBpwW9XgisLPvTpIuAm4HLjez45HtZrYz+L8Z\neAk4NwMxuQy493dv0dHVgxEeGTzQams1lfI5kUpQD2R9ivGXt+37oGTq8i8TieE1YIqk0yXVANcA\nvXoXSToXuJdwUtgTtb1W0qDg/ijgE0B0o7XLk6aWEM8mOaq5o9uKomokn8qtqixR29uP8i/PvMHV\n977syaEApP1XamZdkm4BngEqgQfMbJOkRUCjma0A/gU4Gfi1JIBWM7scmArcK6mHcJJa3Kc3k8uh\nppYQq5vbqR1awwN/fLtseh5VVojuHGW0QqoqqynAqTi6emDxyi38+uaP5zuUsiYrwnd/Q0ODNTY2\n5juMktLUEuLL94enuSi3b/2R2q8yu+yCducV03w9hyyQ1GRmDQPt5+NXHQCPrW3jWGf5JQUIJ4Qy\nvOyCdseTG71KKY88MZS5pWtamfuTP7
BsTWu+Q3EZlqcZzDOiq8f40XNvenLIkyL+03HpWrqmlduW\nb2B920H/xlyCin2ew99v3edzK+WJJ4Yy9vBrXkpwha2jq4fVze35DqPseN+5MrR0TSsrN+7y9RRc\nUZg9eWS+Qyg7nhjKTKT6yLlkVVWIrn56J1RXKq0lUmPpMXjj3UO88e4hVm7cxZxzxnlvpRzwxFBm\nVm7cle8QXAERiffI6i8pQHrrZvfnfz77BvsOdwDhdgfAk0OWeRtDmTl73Idych751BhJGz4k99/T\nDBiUh+5LyZwxkhQilqx6yxuks8xLDGWgqSXE42vb2HPoOHvey+6cNxEyHxuQrHxNv308D21N6Zxx\ne/tRrr73FRbNPcdLDlniiaHERWZIzXVDc6JnS6YqIxMqBQU2C0RRkSiIqVK6eoxvPxFuK/PkkHle\nlVTiVje3F3Tvo1x/xnhSSM+EUwbnO4QP9Bjc/sQGbvxlo1ctZZgnhhJXO7Qm5WMHmkL7jNEnMXOS\nz6FfTrI9/XayzODZzbt9sZ8M88RQoppaQtz4y0Z+/PybKR0/YfhgPjt1TL/71J5Uw61zplLpf0Uu\nzyKL/bjM8NlVS1BTS4ir730561MiVFWI8+qGA9C4PUQP4VJGOU7E506k4JariszKCvHI1y7wleD6\nkejsqt74XEIi6yms33EgJ/PkdPUYr27vXXz3pFC+KvhzEpg0cihVFWLb3iM5O393j/G4LxGaEZ4Y\nSkRkPYVjAyy/6Vy2GH8uMW5vP5qXGLbuPpSX85aajNQOS7pU0huStklaGOP5QZIeDp5fI2lS1HPf\nCra/IelzmYin3DS1hPjRc2/mPSmcVFOZ1/O7/DLyX2J8rSXEUp9CPm1pJwZJlcDdwBzgLGC+pLP6\n7HY9EDKzM4AfAncFx55FeI3os4FLgX8NXs8lKDJOITJVQD4d6ejOdwiuxJ01bhhfnlVHVUW4/aKC\n3qPszeA7T27ktuUbvJdSGjJRYpgJbDOzZjPrAB4C5vbZZy7wi+D+o8BnFV78eS7wkJkdN7O3gW3B\n67kEPb62raDHKTiXSX/adYituw/RE4ysj3R4iE4O3T3GsjWtvpZDGjKRGCYAO6IetwXbYu5jZl3A\nQWBkgscCIOkmSY2SGvfu3ZuBsEvD3kPHE9qvskIMLuYlvVxZmjB8MBNqh3zwuAd4dXuoV5VVdw+c\nX1/ba9yNAcc7fS2HVGXikyLWMKi+NY3x9knk2PBGsyVm1mBmDaNHj04yxNI1atighPbr7jGOdfXE\nHLQW2VRZIT428RSmTzwl5i/GuVx758Ax3gm9P+B+r/VJFhD+IElngGc5y0RiaANOi3o8EdgZbx9J\nVcApwP4Ej3X9uPK8idRUCkFCA81iNQ5GNnX3GOvbDrJp13tcdNYYLjlrjCcIVxRifZsUEDraEeMZ\nN5BMJIbXgCmSTpdUQ7gxeUWffVYA1wX3vwi8YOGRdSuAa4JeS6cDU4BXMxBT2ZhRX8uymy7gm587\nk6vPz8xkYp3dxrObd/P8n/b4DKnuA2ecenK+Q0iYgEHVFb76W4rSHsdgZl2SbgGeASqBB8xsk6RF\nQKOZrQB+BvxK0jbCJYVrgmM3SXoE2Ax0Ad8wM+/akqQZ9bXMqK+lqSXEo4076MjQTHHdcfoefmhw\nFe8dy88U0S4/KgS1Q6vzHcaAJJj7sfFMGTOM2ZNH+mC3FPmUGCWmqSXEoqc2sb7tYMLHRKqLiuEv\nIdfTdLv05PL3JWD+rDruvGJajs5YfBKdEsO7qZSYGfW13PH5s6mM0zgw+uSaExqgjeL5sM1GnN5Z\nK3s+MnZYzs5VWRFuc3Pp87dECZpRX8v35k2L+cvdf6RjwFlTy40PA8mOCsGwwbmbdefs8ad41VGG\neGIoUQtm1TE/xspW3RYuctfEK1JE8R5JLh09xgmTLGZTpjpfOE8MJe0L501kcPWJv+Ld7x3jbz9x\neq
/RorGcP6nWk4MrWDdfOJlLzhrDxyaewp1XTPMlPjPIZ1ctYTPqa3nwhtl8e/kGtrz751kn17cd\nZNPO9wZcu3f88CH84IqJ3LZ8Q5YjzZyTaip9zqYyceh4F0uuHbAd1aXAE0OJm1Ffy/udJ35QdvXY\ngIvqPLFuJ/9747tZjC6z5k0fz84D7+e0+sKl5qSaSo52dKfVmaBYOkwUI69KKgOXnj32hG0SnDlm\n4B4jxzLUMjuoKvuVUk+s2+lJoUgc6ZMUBlpfvC/hPZCyyRNDGVh42VQunDKq1zYzelUvJao6gUbr\nWI53+fc7F19VkpnhaxdO9h5IWeRVSWXil9fPYumaVlZu3MW+Q8dTSgqjhtUwuLKCtgPHshChK2eJ\njNY/ZUgVH504nDnnjPOG5izzxFBGFsyqY8GsOub+5A8pHb/vkE9I5vLn1kunekLIEa9KKkPe39sV\nm5mTaj0p5JCXGMpQ5A32P575E/uPduY5Guf6V1khbp0zNd9hlBUvMZSpBbPquO+686kJJgqqUHjO\nIP+DcIXk1GE1PPK1C7yhOce8xFDGZtTXsuzG2axubv9g3vp7f/cWv9+6L+bYB+dy7aKzxnpSyANP\nDGUuspYDwNI1rfx28+48R+RcWE2lfKxCnnhicB9YuXFXvkMoeCI8OLC/EeMudSI8RmHYkGpfaCeP\n0qpSljRC0rOStgb/n/BblDRd0iuSNkl6XdLVUc/9XNLbktYFt+npxOPSM+eccfkOoeAZnhSyyYBh\nQ6r5xl+d4Ukhj9ItMSwEnjezxZIWBo9v7bPPUeBaM9sqaTzQJOkZMzsQPP9fzezRNONwGRDprbRy\n4y6GVFd6tZLLuaoKfJ3mApBuYpgLfDq4/wvgJfokBjN7M+r+Tkl7gNHAAVzBiQyCa2oJ8dyW3f7t\n2GWdgPG1Qzhr3Ie4+VMf9pJCAUi3d+IYM9sFEPx/an87S5oJ1ABvRW3+QVDF9ENJg/o59iZJjZIa\n9+7dm2bYbiAz6mv5/rzyXju3UJf8HDm0Oi/nrRCMyPC5JwwfzKNf/zh/vPUz3HdtgyeFAjHgn76k\n5yRtjHGbm8yJJI0DfgX8ZzOLTNn5LeAjwPnACE6shvqAmS0xswYzaxg9enQyp3YpWhAsrJ7sBGel\nolCX/GzP06DEyz82nlCGz/3pM0/1ZFCABqxKMrOL4j0nabekcWa2K/jg3xNnvw8B/w5828xWR712\npBvMcUn/Bnwzqehd1i2YVceZY4ex6KlNrG87mO9wXJomjRzKnkPHOZrCYkZPrNuZ0VgqK8QXvDtq\nQUq3sLyCsVFcAAARYUlEQVQCuC64fx3wZN8dJNUAy4Ffmtmv+zw3LvhfwDxgY5rxuCyYUV/LHZ8/\n20dFl4DRwwYltN53sgSM/VDcmuATVFWI7809x0sLBSrd9/pi4GJJW4GLg8dIapB0f7DPVcCFwFdj\ndEt9UNIGYAMwCvh+mvG4LJlRX8uvv/5xpo4deHEfl7h508czcfjgnJ3vte0hDrzflfHX/cspozh5\nUFVCa4T/5ZRRPPy1C3xSvAImG2jh3wLU0NBgjY2N+Q6jbF30P3/Htj2H8x1GSaipqqCru6dsen9V\nVYiHfe6jvJHUZGYDLpTttQMuaX/7idPzHULJ6OjKTlKoLKAOAwpCqaoQi7z6qCj4lBguadED4Uae\nVJPxRkmXvtEn1fDuoeP5DoOZk2q5dc7UDyZq9KRQHDwxuJREBsJFeHIoLIWQFACGD63pNVGjKw5e\nleTSNmXMsIQaHV35GTUs8Z5KrnB4YnBpmz15JIOqK6gUniAcVRXhvwOfNrt4eVWSS9uM+loevGE2\nj61t46FXWynCjm4uQ+ZNH89XLpjkbQpFzhODy4gZ9bWsbm4vm26X7kQCvnLBJG9TKAFeleQyZvbk\nkQXVTdLl3urm9nyH4DLAE4PLmBn1tXxv7jmeHMpUdVWFr6VQIrwq
yWVUZNK91c3trN9xwBf7KRIj\nhlZz6HgXnd2p1QWeOqyGn/6NT5tdKjwxuIyL1DH7Yj/FY//RzrR6lHlSKC1eleSyJrLYT6YrlgQ+\nmV8WpJq/77ximieFEuOJwWXVgll1PPr1j3PGqSfH3adS4akTkvGn3YfSDc2lqULhpOCzpJYer0py\nWTejvpa7rvwoV9/7Cl0x6pX+Ysww5p07kVe3hxJ6PfvgH5cqkdqPcOrYYZxXX4sBV5430UsKJcoT\ng8uJGfW1LJp7Dnc8ufGE5LDl3UPctnxDniIrT6nm1fPqa/nBFeW9Fng58KoklzMLZtXx8NcuYHie\nFrN3yTl/Ui0zJ9VSEUx1UlNV4Utxlom0SgySRgAPA5OA7cBVZnZCfYCkbsKrtAG0mtnlwfbTgYeA\nEcBa4Ctm1pFOTK6wzaiv5ZqG07hnVXO+Q3H9qACaWkLUVFXw/XnTCB3t8Ckuyki6JYaFwPNmNgV4\nPngcy/tmNj24XR61/S7gh8HxIeD6NONxRWDhZVO5+cLJDKryAmshUvBPj0FnVw+hox1846/O8KRQ\nRtJ9Z84FfhHc/wUwL9EDJQn4DPBoKse74rbwsqksvXE2NZXyGVkLzLDBVVRViEr5aOZylW5iGGNm\nuwCC/0+Ns99gSY2SVkuKfPiPBA6YWWRl8jZgQrwTSbopeI3GvXv3phm2KwQz6mtZdtMFzJ9V58mh\ngLx3rAskrp5Zx4M3zPaSQhkasI1B0nPA2BhP3Z7EeerMbKekycALkjYA78XYL25nCTNbAiwBaGho\n8M6KJSIyK6uET9edYal2SQXo6uphwvAhnhTK1ICJwcwuivecpN2SxpnZLknjgD1xXmNn8H+zpJeA\nc4HHgOGSqoJSw0TA14csQ7Mnj6SmqoJjnT35DsUFKirkVUhlLN2qpBXAdcH964An++4gqVbSoOD+\nKOATwGYzM+BF4Iv9He9KX2Shn/5GR7vk9S0tDE6wsb+qQiyae46XFspYuolhMXCxpK3AxcFjJDVI\nuj/YZyrQKGk94USw2Mw2B8/dCvyjpG2E2xx+lmY8rkhFRkdXV4ZbG0q9zSEfcz0d7+6/RCb9eayJ\nT3NR3mRFWLHb0NBgjY2N+Q7DZUFTS+iDZSHfePcQP37+Td5973i+w8q4edPHs2L9zgFnnq0I2l4y\n9S6dOnYYe48cZ9+h3sOFBPzA5z0qeZKazKxhoP18SgxXUKKXhZxRX0tr+5GSHAz3xLrEmtMyPWX5\nG7sPEe4p/mcVgu/P86Tg/swTgytom3bF6rzmUmUGPVG1BJ4UXCw+9NQVtDnnjMt3CEWnv5VVqytF\nTVUFFQo3MntScLF4icEVtAWz6mhtP8K9v2/GLL2++aUgcv0CPjJ2GNv3HeH9rt6NyvGqnwR89/Jz\nPlh61ec+cvF4YnAFb+FlU7n47LGsbm6ndmgN33lyI91lul6oRf2/5d3kFisyIHS0o1c7jnOxeGJw\nRSH6w6xUG6SzrbrSB625xHgbgys6w4b4eg7x1FTGb2D49JmneknBJcQTgys6syePpKq/FtYy1tEd\nv4rt1GGDchiJK2aeGFzRiSwTqhi5IdY256uvueR4YnBFacGsOi6eOuaE7UU4kD+rKgVfnlXHsht9\n+myXOG98dkXra5/6MC+9uZeOLp+VNZYKwfd8nIJLgZcYXNGaUV/Lshtns2BWXVFVIVXm6F13zcw6\nTwouJV5icEUt0o31Q4OqiqYL6wCTnGbE4OoKrvQ2BZciTwyuJCy8bCp1I0/izqc3c/h4d77DyZv6\nEUO56vzTfFSzS4snBlcyItUmty3fkOdI8ufMscP4xl+dke8wXJHzNgZXUhbMquPis07srVQuRvtY\nBZcBaSUGSSMkPStpa/D/CWVXSX8laV3U7ZikecFzP5f0dtRz09OJxzmAmz/1YQZXV5TNtx4FNx+r\n4DIlrRXcJP0zsN/MFktaCNSa
2a397D8C2AZMNLOjkn4O/MbMHk3mvL6CmxtIZCW45f/xDtv2HM53\nOFl184WTGTak2tsV3IBytYLbXODTwf1fAC8RXsc5ni8CK83saJrnda5fkd5KtUNrSrrNYcTQahZe\nNjXfYbgSk25pe4yZ7QII/j91gP2vAZb12fYDSa9L+qGkuBWkkm6S1Cipce/evelF7crGgll13HnF\nNM4YfRJFNNQhYVc1nJbvEFwJGrAqSdJzwNgYT90O/MLMhkftGzKzmGVZSeOA14HxZtYZte1doAZY\nArxlZosGCtqrklwqlq5p5TtPbKCfeeZyrqoCUh24PXNSLY/c/PHMBuRKWsaqkszson5OslvSODPb\nFXzI7+nnpa4ClkeSQvDau4K7xyX9G/DNgeJxLlULZtVx5thh3ProerbtPZLR1545qZbDx7vYvCu5\nxXNSTQpVleLWOV6F5LIj3aqkFcB1wf3rgCf72Xc+faqRgmSCJAHzgI1pxuNcv2bU1zIzC4vVvLo9\nxJYkk0KyBJxx6slcctYYHr7pAm9odlmTbuPzYuARSdcDrcCXACQ1ADeb2Q3B40nAacDv+hz/oKTR\nhP/m1wE3pxmPcwO68ryJPPxqa8arlLJdQ/XRiafw5C2fzPJZnEszMZhZO/DZGNsbgRuiHm8HJsTY\n7zPpnN+5VMyor+V786Zx+/INWf8wz6Srz/cJ8VxulMsYIOd6WTCrjh9cMY1iWQju5gsn+0ypLmd8\nriRXtiKN0ff+7i1+u3l3vsM5wYiTaphRX8vNn/qwtye4nPISgytrM+prWXJtA3deMa3gxjl885Iz\nue/aBk8KLuc8MThHuPTwtQsnx3zuQ4Mq+z1WwMk1/e+TitDRjoy/pnOJ8MTgXGDYkOpepYZI+8N7\nA6zvYMDhjsyuAVFVKWZnoVutc4nwNgbnArMnj2RQdQWdXT1IoqsnP32WZk6q5dY5U70KyeWNJwbn\nAjPqa3nwhtmsbm6ndmgNdzy5MafJQYIfzJvmvY9c3nlicC5KZFZWgNb2IzlZR1pAZYVYNPccTwqu\nIHhicC6OYUOqs34OEW74/sJ5E73qyBUMTwzOxTF78khqqiroSHWmuwR87cLJvp6CKzjeK8m5OGbU\n17LsxtnMnJT5b/JnjD6JO6+Y5knBFSQvMTjXjxn14TUPLvp/X0p7qu4po09i7PAhzDlnnLcluILm\nJQbnEvC3n4w9+C0Zb+07wj9c9BeeFFzB8xKDcwmIfJiv3LiLY53dvLY9lPRrmMHq5nZvZHYFz0sM\nziVowaw6fnX9LBbOmcrg6oqk5lYSMKi6wkczu6LgJQbnktR3IFzoaAfrdhzg2TgztFYI5s/0Lqmu\neKSVGCR9CfguMBWYGSzQE2u/S4H/D6gE7jezxcH204GHgBHAWuArZuYzh7mCFz0QDqCpJcQLW3bH\nXBVOwPjhQzwpuKKRblXSRuALwKp4O0iqBO4G5gBnAfMlnRU8fRfwQzObAoSA69OMx7m8UdSqP9WV\noqZSVAqqq7wKyRWXdJf23AIg9VvbOhPYZmbNwb4PAXMlbQE+AywI9vsF4dLHT9OJybl8WN3cTk8w\nr5KALzWcxpXnTWR1czuzJ4/00oIrKrloY5gA7Ih63AbMAkYCB8ysK2r7CetCR0i6CbgJoK7Ou/u5\nwhIZJd3Z1UN1VQVXBu0JnhBcMRowMUh6Dhgb46nbzezJBM4Rqzhh/WyPycyWAEsAGhoaimkNd1cG\nohukvYTgit2AicHMLkrzHG3AaVGPJwI7gX3AcElVQakhst25ouQlBFcqcjGO4TVgiqTTJdUA1wAr\nzMyAF4EvBvtdByRSAnHOOZdFaSUGSVdIagMuAP5d0jPB9vGSngYISgO3AM8AW4BHzGxT8BK3Av8o\naRvhNoefpROPc8659Cn8xb24NDQ0WGNjzCETzjnn4pDUZGYNA+3nU2I455zrxRODc865XjwxOO
ec\n66Uo2xgk7QVa0nyZUYS7zBY7v47C4tdRWPw6eqs3s9ED7VSUiSETJDUm0ghT6Pw6CotfR2Hx60iN\nVyU555zrxRODc865Xso5MSzJdwAZ4tdRWPw6CotfRwrKto3BOedcbOVcYnDOOReDJwbnnHO9lE1i\nkPQlSZsk9UiK2+1L0nZJGyStk1RwEzIlcR2XSnpD0jZJC3MZYyIkjZD0rKStwf8x56uW1B38LtZJ\nWpHrOOMZ6OcraZCkh4Pn10ialPsoB5bAdXxV0t6o38EN+YizP5IekLRH0sY4z0vSj4NrfF3SebmO\nMREJXMenJR2M+l3ckbVgzKwsbsBU4EzgJaChn/22A6PyHW861wFUAm8Bk4EaYD1wVr5j7xPjPwML\ng/sLgbvi7Hc437Gm8vMF/g64J7h/DfBwvuNO8Tq+Cvwk37EOcB0XAucBG+M8fxmwkvDiYLOBNfmO\nOcXr+DTwm1zEUjYlBjPbYmZv5DuOdCV4HR+ss21mHcBDwNzsR5eUuYTX+Sb4f14eY0lWIj/f6Ot7\nFPisBlgcPQ+K4e9kQGa2Ctjfzy5zgV9a2GrCC4SNy010iUvgOnKmbBJDEgz4raSmYJ3pYhRrne24\n62nnyRgz2wUQ/H9qnP0GS2qUtFpSoSSPRH6+H+xj4TVJDhJec6SQJPp3cmVQBfOopNNiPF/oiuH9\nkKgLJK2XtFLS2dk6yYBLexaTDKxPDfAJM9sp6VTgWUl/CjJ5zmRxne2c6u86kniZuuD3MRl4QdIG\nM3srMxGmLJGfb0H8DgaQSIxPAcvM7LikmwmXgj6T9cgyqxh+F4lYS3iuo8OSLgOeAKZk40QllRgs\n/fWpMbOdwf97JC0nXNzOaWLIwHXEW2c7p/q7Dkm7JY0zs11BsX5PnNeI/D6aJb0EnEu4XjyfEvn5\nRvZpk1QFnEKBVBNEGfA6zKw96uF9wF05iCvTCuL9kC4zey/q/tOS/lXSKDPL+CSBXpUURdJJkoZF\n7gOXADF7CBS4mOts5zmmvlYQXucb4qz3LalW0qDg/ijgE8DmnEUYXyI/3+jr+yLwggUtiAVkwOvo\nUxd/OeHleYvNCuDaoHfSbOBgpBqzmEgaG2mnkjST8Od3e/9HpSjfLfG5ugFXEP7mcBzYDTwTbB8P\nPB3cn0y4Z8Z6YBPhqpu8x57sdQSPLwPeJPztuhCvYyTwPLA1+H9EsL0BuD+4/3FgQ/D72ABcn++4\n+/v5AouAy4P7g4FfA9uAV4HJ+Y45xev478F7YT3wIvCRfMcc4xqWAbuAzuC9cT1wM3Bz8LyAu4Nr\n3EA/vRIL/DpuifpdrAY+nq1YfEoM55xzvXhVknPOuV48MTjnnOvFE4NzzrlePDE455zrxRODc865\nXjwxOOec68UTg3POuV7+Dwj0uUvZXhpwAAAAAElFTkSuQmCC\n",
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"%matplotlib inline\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
    "p = 1. # which p-norm to use\n",
    "M = 40000 # number of random sample points\n",
"a = np.random.randn(M, 2)\n",
"b = []\n",
"for i in range(M):\n",
" if np.linalg.norm(a[i, :], p) <= 1:\n",
" b.append(a[i, :])\n",
"b = np.array(b)\n",
"plt.plot(b[:, 0], b[:, 1], '.')\n",
"plt.axis('equal')\n",
"plt.title('Unit disk in the p-th norm, $p={0:}$'.format(p))"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
    "## Why the $L_1$-norm can be important\n",
    "The $L_1$ norm, as was discovered relatively recently, plays an important role in **compressed sensing**.\n",
    "\n",
    "The simplest formulation is as follows:\n",
    "- You have some observations $f$\n",
    "- You have a linear model $Ax = f$, where $A$ is a **known** $n \\times m$ matrix\n",
    "- The number of equations $n$ is less than the number of unknowns $m$\n",
    "\n",
    "The question: can we find the solution?"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
    "The solution is obviously non-unique, so a natural approach is to find the solution that is minimal in a certain sense:\n",
    "\n",
    "$$ \\Vert x \\Vert \\rightarrow \\min, \\quad \\mbox{subject to } Ax = f$$\n",
    "\n",
    "The typical choice $\\Vert x \\Vert = \\Vert x \\Vert_2$ leads to the **linear least squares problem** (and has been used for ages).\n",
    "\n",
    "The choice $\\Vert x \\Vert = \\Vert x \\Vert_1$ leads to [**compressed sensing**](https://en.wikipedia.org/wiki/Compressed_sensing), and it typically yields the **sparsest solution**.\n",
    "\n",
    "[A short demo](tv-denoising-demo.ipynb)"
]
},
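   {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "To make this concrete, here is a small sketch (an illustration added here, not the linked demo): the $L_1$ problem above can be rewritten as a linear program by introducing auxiliary variables $t$ with $-t \\leq x \\leq t$ and minimizing $\\sum_i t_i$. We solve it with ``scipy.optimize.linprog`` (assuming SciPy is available) on a random underdetermined system whose right-hand side comes from a sparse vector; the $L_1$ solution typically recovers that sparse vector."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from scipy.optimize import linprog\n",
    "np.random.seed(0)\n",
    "n, m = 20, 50  # fewer equations than unknowns\n",
    "A = np.random.randn(n, m)\n",
    "x_true = np.zeros(m)\n",
    "x_true[[3, 17, 41]] = [1.0, -2.0, 0.5]  # sparse ground truth\n",
    "f = A.dot(x_true)\n",
    "# min sum(t) subject to -t <= x <= t and A x = f, variables z = [x, t]\n",
    "c = np.concatenate([np.zeros(m), np.ones(m)])\n",
    "I = np.eye(m)\n",
    "A_ub = np.block([[I, -I], [-I, -I]])\n",
    "b_ub = np.zeros(2 * m)\n",
    "A_eq = np.hstack([A, np.zeros((n, m))])\n",
    "res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=f, bounds=(None, None))\n",
    "x_l1 = res.x[:m]\n",
    "print(np.linalg.norm(x_l1 - x_true))  # small if the sparse vector is recovered"
   ]
  },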
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
    "## What is a stable algorithm?\n",
    "\n",
    "We finish the lecture with the concept of **stability**.\n",
    "\n",
    "Let $x$ be an object (for example, a vector), and let $f(x)$ be the function (functional) you want to evaluate.\n",
    "\n",
    "You also have a **numerical algorithm** ``alg(x)`` that actually computes an **approximation** to $f(x)$.\n",
    "\n",
    "The algorithm is called **forward stable** if $$\\Vert alg(x) - f(x) \\Vert \\leq \\varepsilon. $$\n",
    "\n",
    "The algorithm is called **backward stable** if for any $x$ there is a close vector $x + \\delta x$ such that\n",
    "\n",
    "$$alg(x) = f(x + \\delta x)$$\n",
    "\n",
    "and $\\Vert \\delta x \\Vert$ is small."
]
},
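   {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "As a minimal illustration (a sketch added here, not from the original notes): a single floating-point addition in single precision is backward stable. The computed sum equals the exact sum of slightly perturbed inputs, with a perturbation on the order of machine epsilon relative to the data."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "a, b = np.float32(0.1), np.float32(0.2)\n",
    "s = a + b                              # alg(x): addition performed in single precision\n",
    "exact = np.float64(a) + np.float64(b)  # f(x): the exact sum of the same inputs\n",
    "# Backward error: s is the exact sum of the perturbed data (a + da, b),\n",
    "# where da = s - exact is tiny compared to the inputs.\n",
    "da = np.float64(s) - exact\n",
    "print(abs(da) / max(abs(a), abs(b)))   # on the order of machine epsilon for float32"
   ]
  },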
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
    "## Classical example\n",
    "A classical example is the **solution of linear systems of equations** using LU factorization.\n",
    "\n",
    "We consider the **Hilbert matrix** with elements\n",
    "\n",
    "$$A = \\{a_{ij}\\}, \\quad a_{ij} = \\frac{1}{i + j + 1}, \\quad i,j = 0, \\ldots, n-1,$$\n",
    "\n",
    "and the linear system\n",
    "\n",
    "$$Ax = f.$$\n",
    "\n",
    "(We will look at matrices in more detail in the next lecture, and at linear systems in the upcoming weeks, but for now you can actually **see** a linear system.)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"scrolled": false,
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"74.7752592502\n"
]
}
],
"source": [
    "import numpy as np\n",
    "n = 500\n",
    "a = [[1.0 / (i + j + 1) for i in range(n)] for j in range(n)] # Hilbert matrix\n",
    "a = np.array(a)\n",
    "rhs = np.random.random(n)\n",
    "sol = np.linalg.solve(a, rhs)\n",
    "print(np.linalg.norm(a.dot(sol) - rhs) / np.linalg.norm(rhs)) # relative residual ||Ax - f|| / ||f||\n",
    "#plt.plot(sol)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"5.75686455918e-07\n"
]
}
],
"source": [
    "rhs = np.ones(n)\n",
    "sol = np.linalg.solve(a, rhs)\n",
    "print(np.linalg.norm(a.dot(sol) - rhs) / np.linalg.norm(rhs)) # relative residual ||Ax - f|| / ||f||\n",
    "#plt.plot(sol)"
]
},
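   {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "An aside added for illustration: the erratic residuals are tied to the conditioning of $A$. The Hilbert matrix is famously ill-conditioned, and its condition number grows very quickly with $n$ (we will discuss condition numbers properly in later lectures)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "for k in [5, 10, 15]:\n",
    "    h = np.array([[1.0 / (i + j + 1) for i in range(k)] for j in range(k)])\n",
    "    print(k, np.linalg.cond(h))  # grows roughly exponentially with k"
   ]
  },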
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Take home message\n",
"- Floating point (double, single, number of bytes), rounding error\n",
"- Norms are measures of smallness, used to compute the accuracy\n",
    "- $L_1$, $L_p$, and Euclidean norms\n",
"- $L_1$ is used in compressed sensing as a surrogate for sparsity (later lectures) \n",
"- Forward/backward error (and stability of algorithms) (later lectures)"
]
}
],
"metadata": {
"anaconda-cloud": {},
"celltoolbar": "Slideshow",
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.0"
},
"nav_menu": {},
"toc": {
"navigate_menu": true,
"number_sections": false,
"sideBar": true,
"threshold": 6,
"toc_cell": false,
"toc_section_display": "block",
"toc_window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 1
}