{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"# Lecture 2: Floating point errors, vector norms"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Syllabus\n",
"**Week 1:** Intro week, floating point, vector norms, matrix multiplication\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Recap of the previous lecture\n",
"- Basic course structure.\n",
"- Some Python intro (more today by Evgeny Frolov)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Today \n",
"- Floating point arithmetic; concept of **backward** and **forward** stability of algorithms\n",
"- How to measure accuracy: vector norms"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Representation of numbers\n",
"Numbers in computer memory are typically represented as **floating point numbers** (as opposed to **fixed point** numbers). \n",
"\n",
"A floating point number is represented as \n",
"\n",
"$$\\textrm{number} = \\textrm{significand} \\times \\textrm{base}^{\\textrm{exponent}},$$\n",
"\n",
"where the $\\textrm{significand}$ is an integer, the $\\textrm{base}$ is a positive integer, and the $\\textrm{exponent}$ is an integer (possibly negative), e.g.\n",
"\n",
"$$ 1.2 = 12 \\cdot 10^{-1}.$$"
]
},
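{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"As a quick aside (a sketch, not part of the lecture code): the standard `math.frexp` function exposes exactly this decomposition for base $2$.\n",
"\n",
"```python\n",
"import math\n",
"\n",
"x = 1.2\n",
"m, e = math.frexp(x)            # x = m * 2**e with 0.5 <= |m| < 1\n",
"significand = int(m * 2**53)    # scale up to an integer significand (53 bits in a double)\n",
"exponent = e - 53\n",
"print(significand, exponent)    # x == significand * 2**exponent exactly\n",
"```"
]
},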
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## IEEE 754\n",
"In modern computers, the floating point representation is governed by the [IEEE 754 standard](https://en.wikipedia.org/wiki/IEEE_floating_point), published in **1985**; before that, different computers handled floating point arithmetic differently. \n",
"\n",
"IEEE 754 has:\n",
"- Floating point representation (as described above), $(-1)^s \\times c \\times b^q$.\n",
"- Two infinities, $+\\infty$ and $-\\infty$\n",
"- Two kinds of **NaN**: a quiet NaN (**qNaN**) and signalling NaN (**sNaN**) \n",
"- Rules for **rounding**\n",
"\n",
"$$ 0 \\leq c \\leq b^p - 1, \\qquad 1 - e_{\\max} \\leq q + p - 1 \\leq e_{\\max}, $$\n",
"\n",
"where $p$ is the number of digits in the significand $c$."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## The two most common formats: single & double\n",
"\n",
"The two most common formats are **binary32** and **binary64** (also called the **single** and **double** formats).\n",
"\n",
"| Name | Common Name | Base | Digits | Emin | Emax |\n",
"|------|----------|----------|-------|------|------|\n",
"|binary32| single precision | 2 | 24 | -126 | +127 | \n",
"|binary64| double precision | 2 | 53 | -1022 | +1023 | \n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Accuracy and memory\n",
"The **relative accuracy** of single precision is $10^{-7}$ - $10^{-8}$, \n",
"\n",
"while for double precision it is $10^{-14}$ - $10^{-16}$.\n",
"\n",
" Crucial note 1: A **float32** takes **4 bytes**; a **float64**, or double precision, takes **8 bytes.**\n",
"\n",
" Crucial note 2: These are the only two floating point types natively supported in hardware.\n",
"\n",
" Crucial note 3: You should use **double precision** in 99% of cases.\n",
"\n",
"Can you guess how the bits are split between the sign, the significand and the exponent?"
]
},
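{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"The answer can be checked with NumPy's `np.finfo` (the attribute names are NumPy's own): `nexp` is the number of exponent bits, `nmant` the number of stored significand bits, and `eps` the machine epsilon.\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"for t in [np.float32, np.float64]:\n",
"    info = np.finfo(t)\n",
"    # bits: total storage, nexp: exponent bits, nmant: stored significand bits,\n",
"    # eps: distance from 1.0 to the next representable number\n",
"    print(info.bits, info.nexp, info.nmant, info.eps)\n",
"```"
]
},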
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"Some demo..."
]
},
{
"cell_type": "code",
"execution_count": 127,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.0860972627997398\n",
"0.0\n"
]
}
],
"source": [
"import numpy as np\n",
"import random\n",
"c = random.random()\n",
"c = np.float32(c)\n",
"a = np.float32(9)\n",
"b = np.float32(c / a)\n",
"print('{0:10.16f}'.format(b))\n",
"print a * b - c"
]
},
{
"cell_type": "code",
"execution_count": 129,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0.0\n"
]
}
],
"source": [
"a = np.array(1.585858585887575775757575e-5, dtype=np.float64)\n",
"b = np.sqrt(a)\n",
"print(b ** 2 - a)"
]
},
{
"cell_type": "code",
"execution_count": 137,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"4.23516473627e-22\n"
]
}
],
"source": [
"a = np.array(1e-6, dtype=np.float64)\n",
"b = np.log(a)\n",
"print(np.exp(b) - a)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Loss of significance\n",
"\n",
"Many operations lead to the loss of significant digits, see [loss of significance](https://en.wikipedia.org/wiki/Loss_of_significance).\n",
"\n",
"For example, it is a bad idea to subtract two big numbers that are close to each other: the difference will have few correct digits."
]
},
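{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"A minimal sketch of this effect: for small $x$, computing $(1 - \\cos x)/x^2$ directly subtracts two nearly equal numbers, while the mathematically equivalent form $2 \\sin^2(x/2)/x^2$ does not.\n",
"\n",
"```python\n",
"import math\n",
"\n",
"x = 1e-8\n",
"naive = (1.0 - math.cos(x)) / x**2          # cos(x) rounds to exactly 1.0: all digits are lost\n",
"stable = 2.0 * math.sin(x / 2.0)**2 / x**2  # the same quantity without the subtraction\n",
"print(naive, stable)\n",
"```\n",
"\n",
"The naive formula returns $0$, while the rewritten one returns the correct limit value of about $1/2$."
]
},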
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Summation algorithm\n",
"\n",
"However, the rounding errors can depend on the algorithm. \n",
"\n",
"Consider the simplest problem: given $n$ floating point numbers $x_1, \\ldots, x_n$, \n",
"\n",
"compute their sum\n",
"\n",
"$$S = \\sum_{i=1}^n x_i = x_1 + \\ldots + x_n.$$\n",
"\n",
"The simplest algorithm is to add one-by-one. \n",
"\n",
"What is the actual error of such an algorithm? "
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Naive algorithm\n",
"\n",
"The naive algorithm adds the numbers one by one, \n",
"\n",
"$$y_1 = x_1, \\quad y_2 = y_1 + x_2, \\quad y_3 = y_2 + x_3, \\ldots$$\n",
"\n",
"The worst-case error then grows as $\\mathcal{O}(n)$ in the machine precision, while the **mean-squared** error grows as $\\mathcal{O}(\\sqrt{n})$.\n",
"\n",
"The **Kahan algorithm** gives a worst-case error bound of $\\mathcal{O}(1)$ (i.e., independent of $n$). \n",
"\n",
" Can you find the $\\mathcal{O}(\\log n)$ algorithm? "
]
},
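{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"One possible answer to the question above (a sketch, assuming recursion is acceptable): **pairwise summation** splits the array in halves and adds the partial sums, so each element passes through only $\\mathcal{O}(\\log n)$ additions.\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"def pairwise_sum(x):\n",
"    # sum each half recursively: the error grows as O(log n) instead of O(n)\n",
"    n = len(x)\n",
"    if n <= 2:\n",
"        return sum(x)\n",
"    return pairwise_sum(x[:n // 2]) + pairwise_sum(x[n // 2:])\n",
"\n",
"x = np.full(10**5, 0.1, dtype=np.float32)\n",
"s = pairwise_sum(x)\n",
"print(s)\n",
"```"
]
},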
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Kahan summation\n",
"The following algorithm computes the sum with relative error $2 \\varepsilon + \\mathcal{O}(n \\varepsilon^2)$, where $\\varepsilon$ is the machine precision.\n",
"```python\n",
"s = 0\n",
"c = 0\n",
"for i in range(len(x)):\n",
" y = x[i] - c\n",
" t = s + y\n",
" c = (t - s) - y\n",
" s = t\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 139,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Error in sum: 0.0e+00, kahan: 0.0e+00, dumb_sum: 0.0e+00 \n"
]
}
],
"source": [
"n = 10 ** 6\n",
"x = 100 * np.random.randn(n)\n",
"x32 = np.array(x, dtype=np.float32)  # single precision copy\n",
"x = np.array(x32, dtype=np.float64)  # the same values, but in double precision\n",
"true_sum = sum(x)\n",
"sum_32 = sum(x32)\n",
"\n",
"from numba import jit\n",
"@jit\n",
"def dumb_sum(x):\n",
"    s = 0.0\n",
"    for i in range(len(x)):\n",
"        s = s + x[i]\n",
"    return s\n",
"@jit\n",
"def kahan_sum(x):\n",
"    s = 0.0\n",
"    c = 0.0\n",
"    for i in range(len(x)):\n",
"        y = x[i] - c\n",
"        t = s + y\n",
"        c = (t - s) - y\n",
"        s = t\n",
"    return s\n",
"k_sum = kahan_sum(x32)\n",
"d_sum = dumb_sum(x32)\n",
"print('Error in sum: {0:3.1e}, kahan: {1:3.1e}, dumb_sum: {2:3.1e}'.format(sum_32 - true_sum, k_sum - true_sum, d_sum - true_sum))\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## More complicated example"
]
},
{
"cell_type": "code",
"execution_count": 142,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [],
"source": [
"import math\n",
"math.fsum([1, 1e20, 1, -1e20] * 10000)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Vectors\n",
"In NLA we typically work not with **numbers** but with **vectors**. \n",
"\n",
"Recall that a vector is a 1D array of $n$ numbers. Typically, it is considered as an $n \\times 1$ matrix (a **column vector**)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Vector norm\n",
"Vectors typically provide an (approximate) description of a physical (or some other) object. One of the main questions is **how accurate** the approximation is (1%, 10%). What accuracy is acceptable, of course, depends on the particular application. For example:\n",
"- In partial differential equations, accuracies of $10^{-5}$ - $10^{-10}$ are typical\n",
"- In data mining, an error of $80\\%$ is sometimes fine, since the interesting signal is corrupted by a huge noise."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Distances and norms\n",
"A norm is a **quantitative measure of smallness of a vector**, typically denoted by $\\Vert x \\Vert$.\n",
"\n",
"A norm should satisfy certain properties:\n",
"\n",
"- $\\Vert \\alpha x \\Vert = |\\alpha| \\Vert x \\Vert$,\n",
"- $\\Vert x + y \\Vert \\leq \\Vert x \\Vert + \\Vert y \\Vert$ (triangle inequality),\n",
"- If $\\Vert x \\Vert = 0$ then $x = 0$.\n",
"\n",
"The distance between two vectors is then defined as\n",
"$$\n",
" d(x, y) = \\Vert x - y \\Vert.\n",
"$$"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Standard norms\n",
"The most well-known and widely used norm is the **Euclidean norm**:\n",
"$$\\Vert x \\Vert_2 = \\sqrt{\\sum_{i=1}^n |x_i|^2},$$\n",
"which corresponds to the distance in our real life (the vectors may have complex elements, hence the modulus here)."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## $p$-norm\n",
"The Euclidean norm, or $2$-norm, belongs to an important class of $p$-norms:\n",
"$$\n",
" \\Vert x \\Vert_p = \\Big(\\sum_{i=1}^n |x_i|^p\\Big)^{1/p}.\n",
"$$\n",
"There are two very important special cases:\n",
"- Infinity norm, or Chebyshev norm, defined as the maximal absolute value of the elements: $\\Vert x \\Vert_{\\infty} = \\max_i | x_i|$\n",
"- $L_1$ norm (or **Manhattan distance**), defined as the sum of the absolute values of the elements of $x$: $\\Vert x \\Vert_1 = \\sum_i |x_i|$ \n",
" \n"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"We will give examples where the Manhattan norm is very important: it all relates to the **compressed sensing** methods that emerged in the mid-2000s as one of the most popular research topics."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"## Equivalence of the norms\n",
"All norms are equivalent in the sense that\n",
"$$\n",
" C_1 \\Vert x \\Vert_* \\leq \\Vert x \\Vert_{**} \\leq C_2 \\Vert x \\Vert_*\n",
"$$ \n",
"for some constants $C_1(n), C_2(n)$ and all $x \\in \\mathbb{R}^n$, for any pair of norms $\\Vert \\cdot \\Vert_*$ and $\\Vert \\cdot \\Vert_{**}$. The equivalence of norms basically means that if a vector is small in one norm, it is small in any other norm. However, the constants can be large."
]
},
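{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"For instance, for the $1$- and $2$-norms the constants are known explicitly: $\\Vert x \\Vert_2 \\leq \\Vert x \\Vert_1 \\leq \\sqrt{n} \\Vert x \\Vert_2$. A quick numerical check:\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"np.random.seed(0)\n",
"n = 100\n",
"x = np.random.randn(n)\n",
"n1 = np.linalg.norm(x, 1)\n",
"n2 = np.linalg.norm(x, 2)\n",
"# the chain ||x||_2 <= ||x||_1 <= sqrt(n) * ||x||_2 holds for every x\n",
"print(n2 <= n1 <= np.sqrt(n) * n2)\n",
"```"
]
},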
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"## Computing norms in Python\n",
"The NumPy package has all you need for computing norms (the ```np.linalg.norm``` function)\n",
]
},
{
"cell_type": "code",
"execution_count": 145,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Relative error: 0.00299119819916\n"
]
}
],
"source": [
"import numpy as np\n",
"n = 100\n",
"a = np.ones(n)\n",
"b = a + 1e-3 * np.random.randn(n)\n",
"print('Relative error: {0}'.format(np.linalg.norm(a - b, np.inf) / np.linalg.norm(b, np.inf)))"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"## Unit disks in different norms\n",
"A unit disk is the set of points such that $\\Vert x \\Vert \\leq 1$. For the Euclidean norm the unit disk is the usual disk; for other norms the \"disks\" look different."
]
},
{
"cell_type": "code",
"execution_count": 152,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "fragment"
}
},
"outputs": [
{
"data": {
"text/plain": [
"(-1.0, 1.0, -1.0, 1.0)"
]
},
"execution_count": 152,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": [
"iVBORw0KGgoAAAANSUhEUgAAAX0AAAECCAYAAAASDQdFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\n",
"AAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xd4FNUax/Fvek8gZBNAehu6UgTpCohUBQEVROwXRQXB\n",
"BogFewNFQSmCgiACCkgT6b2H3iaEBEhCEkIqpLFJ9v6RgEAgkLI5W97P89znZmd2Zn7jZN8czp45\n",
"42AymRBCCGEfHFUHEEIIUXqk6AshhB2Roi+EEHZEir4QQtgRKfpCCGFHpOgLIYQdKVbR1zStpaZp\n",
"G26yvJemabs1TduuadoLxTmGEEKIklPkoq9p2tvAdMDthuUuwATgQaAD8D9N0wKLE1IIIUTJKE5L\n",
"PxR4FHC4YXk9IFTX9WRd143AVqB9MY4jhBCihBS56Ou6vgjIuskqXyD5mtcXAb+iHkcIIUTJMccX\n",
"ucmAzzWvfYBEMxxHCCFEITmbYZ8ngNqappUFUsnt2vm6oA1MJpPJweHGXiIh1Nq9ey8tWjRXHUOI\n",
"ghS6cJZE0TcBaJo2APDWdX26pmkjgX/J/ZfEDF3XowvagYODA3FxF0sgimUyGHzk/KzQuK+mMm1i\n",
"RVxdfW7/Zitlq9fuCns4v8JysJBZNk22fmHk/KyH0Whk5T8rWLAlmodbV6Ruzerc3bix6lhmYWvX\n",
"7kZ2cH6FbunLzVlCXCM6Jpo2bVvy4guDcSlTk5AzCXz/41TVsYQoMVL0hbjGspVrOB0eSpkKGhci\n",
"DhMWk8a5uBSOHDlMTk4O4z79nKysmw1aE8I6SNEXIo/JZGLlymUAJEXrRB3bSBplMGbD4wMH8vyw\n",
"UWTihbOzOcY/CFE65LdXCGDHrl24OjuzffO/AFRu0ImIo+twcfdCq1GJI7uiCY+9TM/O1dQGFaKY\n",
"pKUv7J7JZGL06Lfx8HAHwNHJhfOn9wNw+sBKDhw8yF112+Hha2D9npOkXEwB4NixY8oyC1FUUvSF\n",
"3Zv35990fuhhevXqAuS28jNTE6h576MAxJ7aRULUcS6nXyTFpTYffD2djIwMPvhiIrHnz6uMLkSh\n",
"SdEXdi0tLY0Fi5fx/OABXLyYO7TvzKFVAKRfvAAOuR+R1KRzJMWcJDvbSKJbQ4a+MRbHoBY8/9Ir\n",
"7A0OVpZfiMKSoi/s0juj3mD79q08/eLLbF+7kNZt7s33nnMnNoMpB/+76gEQcWQNkUfX4ejkQvDB\n",
"o+z46xMi49L4e+mS0o4vRJFJ0Rd25/jxEyxesoRFixexac1iPH3KkXrp1jfwJEQdv/rzia1zuJyW\n",
"RHTINjIuxZOaFMPUnyYCsGfffo4eO36r3QhhEWT0jrAbaWlp/DxzJhvWraJOq4HMnjWR2rU1Tp7U\n",
"b7mNn6EqyXFnrlu2esozAATVbMHltNwJZQcPHkCLdj1IiIuiapVKeHvb7tQNwrpJ0Rd2ITIygnLl\n",
"Avj0k7GYcnJwdNoOUGDBB/IV/GtlZaaRGJ27/apVK6hY7wFmfvcpVatUokXzZtSrW7fkTkCIEiLd\n",
"O8LmLVj8N02bNqDfEwMw5eQA4OJe/JZ4fOSR617P/PZNAFau+Jsdew8Ve/9CmIMUfWHzTFkZVKhU\n",
"k+TsMgD4+JUjM9V8j3gIDw9n7NsvA7Bs5T9mO44QRSFFX9i8FcuXMXDAAEJ2LwIgI/2SWY93Okwn\n",
"K8vI5cuX+W3JVlasXm/W4wlRGFL0hU37afrPlClfi/Fff3J1WV4Pj9mNGTsaytbjz7VHORsRWToH\n",
"FeI2pOgLmxUZdY6Fi5YSvO36LpbsrEzK125lnoM6/PeRmv3rdFITzxFy4hBf//Q72dnZ5jmmEIUg\n",
"RV/YlLMRZwHIyclhwvQFXEq+wMmQ/GPnY07uME8A0/X/jIgO2Ua2MZ31a5bzyCPdiY+PN89xhbhD\n",
"MmRT2AyTycSQ4aNpVOcuzkWdJTz6IqdDj9x+QzOKPLaBoJotuLfPe6z8ri+7gw/SrUtHpZmEfZOi\n",
"L2xCdnY2W3fs4uCedQRvTVMd5yo3zzLEntrNOX0rAE8P6s2mzbupVbMmLi4uitMJeyTdO8ImrFu3\n",
"hv6PdiUr8/qC7+ruqShRrsy0JAAO/PMt3Xv1o2GnIXRo34KDB/crzSXslxR9YdX0kNw7Yuf8Pvem\n",
"6y9nWE6rf+WyPwms2gSAvo/154L07wsFpOgLqzVp0kTatb2XdRs2smrl36rj3JH1M1+iTfsHycrK\n",
"4tnXxhIbK/Pxi9IlRV9YpTNnz7JlZzCNm7VlwOMPq45TKNs2r8GYcZGw0OOM/vJnIqPOqY4k7IgU\n",
"fWF1kpKT+GLKX0TFJpAQb70t5bjT+4i8kM7Hkxaydu0a1XGEnZCiL6xKcnIS48bPJMHoQ8jBzUSe\n",
"DlEdqVg8/coTlWDktWEvc0SeuStKgRR9YTVSU1Pp1LkDJ6MusmHWSNVxSsT2+aM5e3g14z7+ivc/\n",
"+ZKNmzarjiRsnBR9YTXa3t+BHEdPdi8brzpKiYoJ3cnG4NM8cP8DDBz0GIv/lscvCvORoi8snslk\n",
"4rf5S3AtV5fI8KOq45jF4eMnOR+fRFZmGkNeHMyBQ2rvJBa2S4q+sGgmk4mhr7zI6gNJhO9bqjqO\n",
"2ejb5jJt4jgAarfsz5PPPs/e/QcVpxK2SIq+sGhTp03FwcGZTfPHqY5idqac3Fk4L8ZH4B1Qk49/\n",
"mM/OPcGKUwlbI0VfWKTlK1dx8NBB9gQHc+DQYTIuxVOpQcfrpi62VTGhO8lMS2bH8kkMHzuesR99\n",
"oTqSsCEy4ZqwSFHRsTz3zGN4enqRlpYKgAMO+aYutlXn9C0AnA/fxwZXT5avWk3Prl0UpxK2wPab\n",
"TcKqpKQks2r1ajZu2w1wteADRBxdpyqWMjk5RgKq3s1zg/uxeNlK1XGEDZCWvrAYoafCWLJsBQv+\n",
"mEWayVd1HIuQnhLHjgXvUrtxexYuXoqrmzs9ZD5+UQzS0hcWY82mnfw64wdysrM5H75XdRyLEh0Z\n",
"RpPGDZi9aANL/1mrOo6wYtLSFxZh/ORf+HLccNUxLNalhEjWbgmmd88uzN8QjtH4D30f7qY6lrBC\n",
"0tIXSqWkJBMaepJZc24+H774z/4tiyjj5UI5t1T+3hXPvL+WkZWVpTqWsDJS9IUyR44cJjEpmZ79\n",
"BxFzarfqOFZh+KvPk5pwllOHNjD85Sfp3bcPG7dsVR1LWBEp+kIZgyGQ6dN+IiHquOooVmXV33NI\n",
"TYqmXGAlygTVZPFf81VHElZEir5QxtnZmWnTJquOYXWMlzM4p28l/nwkq5fMRI/JZsovv7NZWvzi\n",
"DsgXuaJUmUwmADIzMxk25gscnVzJyb6sOJV1c3L3Y+7fm3E1nuf3OrUICiqvOpKwYNLSF6Vq6vTp\n",
"BAX5cU/Tu1mzeCpBNZurjmT19qycTLUmPTgXE8sTTz1z9Q+rEDcjRV+UqtCYy5StWI+EC9E07fk2\n",
"0SHbVUeyCf9OfpL4iMOUa9Sfp18YIoVf3JIUfVEqDh4+zKK/l7H74AmyMlNxcfNm3/KvVMeyOR4+\n",
"Bo6GxfHZhJ9ITExQHUdYICn6wuxSUpKJi4vjpRef5MSW2Xj4BWLMvKQ6lk3avXAUEUfXsT74LC3u\n",
"u5ecHPuYoE7cOSn6wuy8vLwZ+ETvq6/Ph8kUC+aSGBcJwMX4s1Sq0ZCPvvlJCr+4jhR9YRZR584x\n",
"7dffSUxM4OTJUwA4u3oqTmU/Tu9fTus2HQjLqMb7X04iOztbdSRhIaToC7OYPncp26MCGPLeNNq3\n",
"zx2hk3U5DQ9fg+Jk9mP69x+SY0wjIqs2Y7/4AaPRqDqSsABFGqevaZoj8CPQGMgEXtB1/dQ160cA\n",
"zwNxeYuG6LoeUsyswgoYjUam/zyN4GPnKOdxio1/Tr1ufXpK3C22FCXN3cMbp9hNmAytOJVVjnc/\n",
"n8Qno17B1dVVdTShUFFvzuoNuOq63lrTtJbA+LxlVzQFntJ1fX9xAwrrcfnyZT77dipzZ07gckYG\n",
"6empt99ImE1G+iUOHA2lXq0Mjoecxr3bCEZ/+gOfjBqKh4eH6nhCkaJ277QBVgHour4LuPEOm2bA\n",
"GE3TtmiaNqoY+YQVmThlFuu37GD+H39JwbcQ0Se3s/6fBcSe2s2JrXOId23EkOGjOHv2jOpoQpGi\n",
"Fn1fIOWa19l5XT5XzAOGAB2Btpqm9SjicYSV2LknmL+Xr+DE7uV0feh+1XFEnhrNHr768+kDKwhd\n",
"+zleHi58M3UBly5dVJhMqFLU7p0UwOea1466rl87LmyiruspAJqmrQCaACsK2qHB4FPQaqtny+eX\n",
"np7OZ19+TsiB9VeXBVZvLk+/sgBhwUuvex1yNBg/Hw8C73uId7+cwvgPhlKhgm3P1WPLn72iKGrR\n",
"3wb0AhZqmnYfcOjKCk3T/IBDmqbVB9LIbe3PuN0O4+Jst9VhMPjY7PkZjUbatG/F6VPXf08vBd9y\n",
"7dm5FXZupUKdNvTWjzJ7ygT8/f1VxzILW/7sQdH+oBW1e2cxkKFp2jZyv8QdoWnaAE3TXtR1PRkY\n",
"BWwANgNHdF1fVcTjCAuWlZVFlarl8xV8YfkcnVyIO70fn9o9GDv+V+IuXFAdSZQSBwuZmMlk63+N\n",
"be38Dh05wpeTZrFm0dTbv1lYHEcnZ3Kys3Bx86LVY59icEvm/WEDqFDetrp6bPGzdy2DwcehsNvI\n",
"zVmiUEwmEzPnLOCdz6ZfV/B9AqqpCyUKLSc799m65SrWZPNvr3N0/xbe+PA7IiKjFCcT5iZFX9yx\n",
"E7rOWx9PZEOoG+lZ1zcwLl44rSaUKJaY8Nyv404d3kqz2v4MGf4Wp0/LcE5bJt07pcAW/on565zf\n",
"eXvkSwC4evpxOS1ZcSJRkgIMgVyIO4+3n4GgitWYNvl7GjVsoDpWsdnCZ68g0r0jzGL/gUP88N1X\n",
"PDJ4NIAUfBuUkBDP8hXrqRgUgLMpjU4dW3HshK46ljADKfqiQGlpaTwxsD9xCcnUqVJWdRxhJjnZ\n",
"2Ywf/wVbt+7iycEvAvDpj4s4fPSY4mSipEnRFwVyd3enRecnybgUz9efvK06jjCDqnd3pVGnl9iw\n",
"fjWBgb6kp6fj4WPg1PFgvvttIwcOHVEdUZSgot6cJeyAyWRi7fqNpLtUUh1FmNGZg7m30Tg4OmPK\n",
"yeLzj0djqHoP3oG1iYgIZ+JcE0ONWdzb7B7FSUVJkKIvburSpYucCj/NoIG9b/9mYRNMOVlXf447\n",
"cwCfgKq4+5QjLimdyQt38L/sLFq3uHFuRWFtpHtH5GMymVi0dAUPdmqjOopQKCz4bxJObUPf+BMJ\n",
"F2KZtmgvm7fvUh1LFJO09MVV2dnZLF72D5v3hXHuksy3LiAmIncEz86lE+j++l/MXH6YLGMWHTtI\n",
"g8BaSdEXAGzesoV+fWUGbHFr2ZmpOLhXYPriPRizs3moY3vVkUQRSPeOAKB9u3ZM+H4GfoE1VEcR\n",
"FurfnwazatIALmebmLXiECtWr7/9RsLiSNEXJCYmsGfvHgwB/iSfD1MdR1i4LfM/IMfBnQWbIliy\n",
"YrXqOKKQpHvHjiUnJ/Hv+k3ExyfywZhhquMIK+Hl48/62SNp3XcMi7Y7YjSupH/v7qpjiTskRd+O\n",
"+fr6kXwxg9+WbFIdRViR1IsJ+Pr6sf2vzwiocjfZWc+Rlb2MAX17qY4m7oB079ixhIQE3n3zRUJ3\n",
"/6U6irAizXq9Q0pKMvcP+Ah3d3ccnV2Zt+YE302erjqauANS9O1QRkYGr7z2GvXqVVcdRVihQ6sn\n",
"4+Luw8Z579OnWwfq+sUQuvsvgqN9+H7KbZ+MKhSTom9nkpOTGDFyGDExkaqjCCtlzLxEuUq50y7/\n",
"MPErJn4zjnK+rvw77X+s3x/L1F/nKU4oCiJ9+nbEZDIx6IVXOHPuAjEnd6iOI6xYTOjO617rxw7Q\n",
"b9ArJJ0/wrRp/5CZeZlhQ55WlE4URFr6dsTBwYFePXpIwRdm8eecyaxdvZyE6DD2HjnF+MkzsJCH\n",
"NIlrSNG3YTk5ORw5epi0tDQgt6U/46evFacStu7eDg+TGn2IsLBTjBn3qRR+CyPdOzbM0dGRX+Yt\n",
"JTrdDx83SAzbQnj4KdWxhA1zcfNi48q5ea/W07Lj48z7cwkD+/dRmkv8R1r6Nq5Tu+Y4ZSWzduF4\n",
"Nq7/R3UcYeOMmanXvb6c48SipSvJzMxUlEjcSFr6Ni71Ygr/zvtSdQxhZ8qUr41jTgZl3LP5efJ4\n",
"3NzcVEcSeaSlb6PCwsP56KvveGXo86qjCDuUFHOShPMRbFg1nz8WLCQrK+v2G4lSIUXfBh05eoSR\n",
"I15l0jfvq44iBO+OHkHlKhVo0vI+pv8yS3UcuydF38YYjUbeef9Ttm/fwj3tHlUdRwgAsrMyiQo/\n",
"RuiJQxw/cZyzEWdVR7JbUvRtxNmzZxk6bDh33VWOPVtWAHBgyyLFqYS43i+/TKdD+5as+Gc1qamp\n",
"t99AlDgHCxlDa4qLu6g6g9kYDD6Y8/x27NzJIw93Mdv+hSgqD18D5Wu1IjM1Aa+yd3Fy53wAfA3V\n",
"cHXMYsO6jQQFBprt+Ob+7KlmMPg4FHYbaelbuezsbBrUr6c6hhA3lZ4Sx/nwvQRUvQcXN6+ry1Pi\n",
"TuNfsx37DhxWmM4+yZBNKxcVFcniJTI1srBcqYnnOLR6Ur7lIdvnkf5Md4L376d61Sr4+5dTkM7+\n",
"SEvfyiUkJPHpJx+qjiHEHQuoejcArp5+jPnoCyZO/wOL6GS2E9LSt0LJyUkEHzhE6OlIxr71kuo4\n",
"QtyxoJotiD21G4DLackkpCWz6s+jlC/jyleffaw4nX2Qom+FzsclMObd0YSFSH+osC5XCv6N0jIu\n",
"l3IS+yVF30ocPXaMuPhEvL08WbNlH0nxMaojCVEs3v6VqFCnDQmRR1kw5yeiIiNYNH8uDg6FHpAi\n",
"CkGGbJaCkhg2lpOTQ/nyZajf4Tnc0kI4dTqSlLjTJRNQCMW8/SuRlnKeqtU1tm/ejJOTU4nsV4Zs\n",
"5idf5FoJR0dH3v3ga45tmsn+PVtvWvCdXNxLP5gQxeAXWANnN08uJUSSk3WZ8JOHqVChLH369SXq\n",
"XJTqeDZJir6FS09P57WRbzFvwZ98Ou6tfOsdHP9rEWUbM0ozmhDFlnw+jKzM3If8VKr/wNXl2zav\n",
"ISJCir45SPdOKSjOPzFPhobSuVNb0tPTSjiVEJbJ1dOPy2nJABw/EY6vjy+ZmRl4e/sUel/SvZOf\n",
"tPQtUGRUJN16PERgoC9tWjeVgi/syuW0ZBydXQF4882R9BowhGPHQxSnsh1S9C3MgQP76TfoRTzr\n",
"9FMdRQhlcrJyh3AeOXKYfZv/ZPUaeepbSZGib2HuuacJrg6ZbJn7BgCefuUpX+s+xamEKF2u7j6M\n",
"GvMRXXr04656HZg6bRp169UgPj5edTSrJ336peBO+xVTU1MZP+U3QhL9WD1tSCkkE8J6BFZvjqez\n",
"kU/GfUiXzp3uaBvp089Pbs6yEDv3BDNr8VZiLiSzbcnbAJStWJfEcycUJxNCPZ+AqjTu8gpuOcnk\n",
"UDJj+O2VdO9YgJiYaP5dt4kqQZ6E7VsOQJmy5aTgC5Hnng5P4OZZBgefaszbcJY//16pOpLVkqKv\n",
"2KrVa3B2duGd14cScuIYOdlGAJISpe9SCIDytVtxMT6KkPXfcilyL47ed7F0TzK//i5TiheFFH3F\n",
"snLgyRdHcHfLdmxYvYhatWrh7etPw/t6qo4mhEWIObmDAxvncD4mmrCj2whZOx7jpRjWn8hh0vTZ\n",
"WMj3klZD+vQVCD0VxunT4eTgyHtjRhIVeQYARydnDh/cy6WUBI7sXK44pRCW477+H2PKycEvsDqJ\n",
"0SEc2bWChKjjpLZ5gKHPD5JJ2gqhSKN3NE1zBH4EGgOZwAu6rp+6Zn0v4D0gC5ip6/rPt9mlXY3e\n",
"iYw6R9fuD3E++ozCVEJYlzLl6+ATUBWA1KRzJEQeJbBGc1o3a8gPE77Gzc0t3zYyeie/onbv9AZc\n",
"dV1vDYwCxl9ZoWmaCzABeBDoAPxP0zTzPfnYCoWfPkvVOk1UxxDCqiTFhOBdtiJVGnbC0zeQDoO/\n",
"p1L9jmSW78ZrH05h9h+LyMrKUh3T4hW16LcBVgHour4LaH7NunpAqK7rybquG4GtQPtipbQRqamp\n",
"vPL6G3z7+xZSHfxxcfPGzassjk6uqqMJYRWOb5nFzr8+xNu/MhmpCVSs0xonFzey/BqxNsSNwa+9\n",
"z/xFy8jJyVEd1WIVtej7AinXvM7O6/K5si75mnUXAb8iHsemeHl5MXLYKySdXEN5tySqV61EZmoi\n",
"Odny1CAh7lS2MYMTW2ez668PyQpbTNaZVRxeNo5DSz/kfMQJfpr2M43vrs+JE8dVR7VIRf0iNwW4\n",
"dso7R13Xr/xpTb5hnQ+QeLsdGgyFn0HPmlw5P4PhbnZtW8cn43/G5N+IkJBxipMJYR3a39+RzRvX\n",
"X7ds1ZJZ172+r8U9vDFiOK1bt7q6zNZrS2EVtehvA3oBCzVNuw84dM26E0BtTdPKAqnkdu18fbsd\n",
"2viXLVfPz2QysSd4H8YsB07tWQyAh4+BgCqNiTi6TmVMISzajQX/WpUrV+WlV97g3mZNqFZNu/p5\n",
"s4Mvcgu9TVGL/mLgQU3TtuW9flbTtAGAt67r0zVNGwn8S2730Qxd16OLeBybk5mZydS5K8kOaIF/\n",
"rY54VWzK8c2/SsEXohiq1ruPhIuXqV6tKi4uLqrjWDSZcK0U3NjaOHz0GP0HPUtCVG6fY8NOQziy\n",
"bqqqeEJYpZaPfkCDig4Meao302bMIC09k4njv7nuPXbQ0i/0kE0p+qXgxl88k8lEy1YtyHDyJ+bk\n",
"DiC3iyf9YpyqiEJYhUoNOlKv3WDSUy6w9fc3AahRpyH9+g3gzddfy/d+Kfr5SdEvBbf6xXv/y0ns\n",
"DUlm77IvFaQSwjrcVbcDcWf2ozXvzuEtf+RbP2XGAh7t1fWm20rRz0/m3lHonVefpWqQJx2e/gFn\n",
"Fw/VcYSwSFEnNuFVpgKxkaH4BdW+urxNj/8xZ/7yWxZ8cXMy945CXl5eDH2yK8+9/iHVmvaknJ8n\n",
"6TEHOHTooOpoQliUwBrNSYo5STlfF2rWeAi/oFoMf6EvrVs0v/3G4jrSvVMKbvdPzCUr/sXT3YV/\n",
"tx4mNsmIV7rO4kXzSzGhEGqUv6saMVGnKVtBIzFav27dA09PJHTvUpxdPQjfvxxPL19aP9ATk6s/\n",
"o199ksYNG9x2/9K9k58U/VJwp794JpOJOfOXMG/xSvZukKIv7E9QzRbUafUEZ/ctwoskvvr6Ox4b\n",
"MAAPv/LERxymUuWqfPXVBDp3evCO9idFPz/p3rEgDg4OPPVEH5zJlKIv7FLsqd3Entp99XXPHp2p\n",
"fd/jRB3fiJefgezsLA4dO0W7tu1vOqumuD35ItcCdWjfjtjYZL6cMI3qNetgCKqkOpIQpcLbvxIV\n",
"tXZXX/vf1YCTO+eTlhzLsy8MY83qzYx87SUp+MUgLX0LYzKZ+HfNOrbtP5k3+6YjcbGRqmMJYXaB\n",
"1ZtxPjyYSwn//b4nRB0FICTkDGXKlFUVzaZIS9/CODg48OzTgxncvyeOrl54V3+ARp1fvu495So3\n",
"UpROCPPwKlOR8+HB+ZYHVahMZOQFKfglSIq+hWrfpiWPdKjP4bU/oW+bi6Hqfw9diY84rDCZECXH\n",
"3bscZSpopCadu7rM0y8IyP1D0KFTDxITbztJrygEKfoWrF3btrz57leAA3Fn9quOI0SJy7gUT9IN\n",
"QzXTkmPxNVRj7Hsf8cVH7xEUFKQonW2SIZuloLjDxi5eTKFmzWu/zHUALOK6CVHiWrXvypKF80vk\n",
"YecyZDM/aelbAU9PLyIjLxBy8ixOTs5IwRe2qnbdxnTt3puUlOTbv1kUiRR9K+Dk5ISrqytl/Mrw\n",
"0suvUr1GTdWRhChxzi6ulG/yOPHxF/D2lqddmYsUfSvzwfsf8cvMudzT9N586wJryDwkwnq1a3c/\n",
"Fw4u4MXBT+Dk5KQ6js2Som+F6tevz9w5+e/YPR+2F7+gWgoSCVF0D3bvl/v/XbqzaeMWgoICFSey\n",
"bXJzlpVyd3Ol9+PPcyHDm92rf+FyegoAybGhipMJUThrVv7JsWNhBAQEqI5iF6ToWykfH1+m/fAt\n",
"qampLF/VlNdeflp1JCGKZOKkGVLwS5F071g5Ly8vHu/bh0WLV9Kj77Oq4whRKDN+mcuAx/qrjmFX\n",
"pKVvI9q2aYuvXxlCjwWjHz+kOo4QBXrvwy/o0qkT1atXVx3F7kjRtyGNGjTgr4WLCD8TQa8eD6iO\n",
"I8RNTZk2i0d791Edw25J0bchRqOR2b/P42hYPHWbdyUxMeG6ucmFUKlHrz78MmOW6hh2T6ZhKAUq\n",
"bgVPS0ujWrXypXpMIQqycdNO6terX6rHlGkY8pOWvg0ymUw0alxPdQwhAGjYbgDPPd6t1Au+uDkp\n",
"+jYoKyuLcR99zsKla/Av68vO3XtJiTuNMcN2WzzCslSu3YSIk/sJqFCTj958nratWqiOJPJI904p\n",
"UPVPzKSkROpo1cGUU+rHFqJb/1f4eNQrVKms7nGf0r2Tn4zTt2Fubu4cPXKSNeu24+5dTnUcYUf8\n",
"yhoY+EgHpQVf3Jx079gwDw8PPDw82H/wINmXU1XHEXZk147d+PtLQ8MSSUvfDnTp3JlDB4+rjiHs\n",
"QI2aGsePh0vBt2BS9O2Ep6cnM2bOu6P3+gRUM28YYZNq1G1Ct5598Pf3Vx1FFEC+yC0FlvJlUlZW\n",
"FtHR51i9cRf/7IvHmBjKjpXTVMcSNmDKlBm0bdsBg8EAUCKPOiwJlvLZMxcZpy8K5OzsjMEQiJND\n",
"Fs8+VANfr/qMN6aQ5V2b3Ys/Vh1PWKkKd1XD08sXg8FgMcVe3JoUfTvj7u7OM4MGXH1dzhDEiHc/\n",
"VZhIWLtmTZvQ9aGHVMcQd0j69O3ciRPHiQ4NBsDdw5OWj36gOJGwFsdPhLNnzyE8PT3Izs5WHUfc\n",
"IenTLwWW3q8447cFrNobi/ulw/y77A/VcYQVOBkagZ+vn+oYt2Xpn73ikj59USTxF86TfGY7p5NS\n",
"VEcRFs7VzY0PPv7WKgq+uDkp+oJnBj2Ou5cfM2dMwdPXQKXa9xISvFJ1LGGBjh07ha+Pr+oYohik\n",
"T18QaDAsL5S+AAAQ6ElEQVQw7H9PsWfrRp5+8gmSow6ojiQsQECVxjg6/dcuHD5ilBR8GyB9+qXA\n",
"mvoVjUYj6zZsYNpvizi6ZzWJCRdURxIWoFxgZQ4E78PNzU11lEKxps9eUUifvig2FxcXunTuzNFj\n",
"x4g7vV+KvuDNMV9wIemi1RV8cXPS0i8F1trayMnJYdL0Ocyc8wfn9K2q4wgFwsKi8Pb2UR2jyKz1\n",
"s3enpKUvSpSjoyPDhgymdo3KzPv3Ic5HhbJvnTzj1F70fep1Dhw6RNvWbVRHESVIir4oUGxsLHv2\n",
"7GTV7M8JMAQSWL0Z58ODVccSZlSmgkazjoN4oG0tKfg2SEbviAIFBQXx/pjRHDkSysiRozBmplGu\n",
"ciPVsYQZ3d28Pe8824nH+vRQHUWYgbT0xR0JDAzk7ibN+ODdACIiI/lxcgzpKXGqY4kS1r3vC3z2\n",
"/htUrFBRdRRhJlL0xR27t2kT7m3ahOSUZMZ/PkZ1HFHCvP0rkpVlxL+szIdvy6R7RxTa6jXrmbdg\n",
"BV+P/1F1FFFC3L3L0W/gS0z+5mPc3d1VxxFmVOghm5qmeQBzAANwEXha1/ULN7xnItAmb70J6K3r\n",
"ekETu8iQTSu1au0GBg98RHUMUQJCQyPwtbE5dWz5swdFG7JZlJb+y8BBXdfbA7OBsTd5T1Ogi67r\n",
"D+i63vE2BV9Ysa6dH2DmzF9VxxDF4OXrz6w5f9lcwRc3V5Si3wZYlffzKqDztSs1TXMEagPTNU3b\n",
"qmnas8WLKCydh9f1N+9orQcqSiJuJ7B6s+teu7h58P13k+nW5UFFiURpK/CLXE3Tngdev2FxLHCl\n",
"5X4RuLF54Al8D0zI2/8GTdP26rp+uPhxhSXq0K4VgUEVqNywC8HrZqFv/111JHEL58ODKesfQGLC\n",
"BV7433Datm1B964yNNOeFFj0dV2fAcy4dpmmaX8BV5p2PkDSDZulAd/rup6R9/71wN1AgUXfYLDe\n",
"W73vhC2fX1qaE8OGvcaqnRE0bdGOfbu3qI4kCpCYcIF3xn7O+6OH4enpqTqO2dnyZ68oijJkcxvQ\n",
"HdgDdAM237BeA+ZpmtYUcALaAr/ebqc2/mWLzZ/fw736UbnSfpZvCSQzM4OjB/eojiXyNGl6L/v3\n",
"5V4PQ1AlPvjoKx7r05PU1Gw8PeWzZ82K8getKEX/J2CWpmlbgExgIICmaSOAUF3Xl2maNhvYARiB\n",
"X3VdP16E4wgrUrZsWRau2k5ypiu167fA2SGbgwf2qY4lgP379uDg4ECXvkOpUKEC/XtLd449k1k2\n",
"S4E9tDbi4i4SF3eBd7+cRnqOG1++NZAlS5fy4XtvqY4ngAcfeQ6XcnX58OVuVKta9epye/ndtFWl\n",
"NWRTiJsyGAL4+r2hlPEw8f7Ev6hfvxFDho5UHcvulTNUwKHiA7Rr4H9dwRf2SYq+KFF+fmX4YszL\n",
"+LhmMmNlCLEXEqlcpbrqWHYpsHwlnn/1A1o88R3+2Sd59sl+qiMJCyBFX5Q4Ly8vPh/zCuXdEzh0\n",
"Kp6Is+GqI9mR3H/tOzu7YMzK4czlqgRkH+fTUUNwdJSPu5CiL8zE3d2dz8a8RtVAd1o99inV75ab\n",
"f0qHCSdnF8pXrETTh8ei+cby5dhh+Fjx069EyZKiL8zGxcWF32dOpUkVR7S2z9Bp0GdX11Vr0pMy\n",
"5WsrTGe7WrZsRblqLWhTy8T7bw3F2Vkm0xX/kaIvzMrJyYn333yZur5RpGZm4+kXBMDp/ctJijmp\n",
"OJ3tqVC9IW6V2vHCk4/w6gtP4eBQ6MEdwsZJ0Rdm5+joyOjXhxBEOBXrtieoZgvVkWyOm5sbz730\n",
"FilJCQzp35bH+/RUHUlYKCn6olQ4ODgwffJ3PNm7I64evnj7V1IdyWYYqjXl5dfeISwyjgXz5tGx\n",
"vTzXVtya3JxVCuzgBpFCnV+3Rx7FIbAFackxHNs404zJ7IN/+RpcSj7Pzu17qXRX4R5zKL+b1k1u\n",
"zhJWYdb0KdxbNYdjG2dSr/0z3FWvg+pIVqlVu9wRUQkxYbz62huFLvjCPsnX+qLUBQYGMm7sGHJM\n",
"jmw+eI6yFeuSmnhOvtgthOo16+LhE0DVqtX5+++VVKx4l+pIwkpIS18o8/F7o3iia1MunDlIrZb9\n",
"VcexGp4+AZQLMODplM7u3Qek4ItCkZa+UOrlF56hfFAgrw59Hhc3b4yZl1RHskjX/rfp0fVBoqJj\n",
"mPHzLBmSKQpNWvpCuT69ujNj5lwp+AW48t/my/FTSb3swKI/F0vBF0UiRV9YhK4PduT1t8YBUKfV\n",
"AMVpLM/TL43hpWHvsmnbTqZN/l7m0RFFJr85wmKMeWsEP0z6mcupcaqjWASt8X/j7dMvJRIRm8yP\n",
"Ez7H1dVVYSph7aRPX1iUxx97jPIVKtC/71rVUZSq0+BezoSdAKBqjbqEnT7DnBk/2sUzbYV5SUtf\n",
"WJwWzZuzfOV67ntwEFemCrY3p0IO4+lXng6de7Fj63YW/jYD/7L+qmMJGyBFX1gcDw8PWjRvzpRv\n",
"3qPnoLfxC7K/2TizjRmUKVOWBXPn4OzsjJeXl+pIwkZI0RcWq2KFCnw15iU6d39cdZRS5ejkzIRJ\n",
"s5j543gZoSNKnBR9YdECAsoxdFBX1TFKVZfufRn0WB/q16unOoqwQfJFrrB4jRo1ZsfOfbS6r6nq\n",
"KGa3Zt12GtSrqzqGsGHS0hdWoWaNWixYuEx1DLPq3H8kdzdqKE+6EmYlRV9Yjfs7dCAs7ByNmtwH\n",
"YBOPWzRUa4p3mSCWLluLszFedRxhB6RJIayKt7c3q1f+Q4UKZfPNyuno5EJOtlFRsqKJO72PqKh4\n",
"XFxcaNrkHkwmk3x5K8xKWvrC6jg5OTF33pJ8y929rW8ce+UaDXBxcQHA1dVVCr4wOyn6wio92Kkj\n",
"sbHJNGjy31QFacmxChMV3ptvv0fF8gYuXbLdJzsJyyNFX1gtBwcHNvz7z01bx1cmbbOUp3K5eZXN\n",
"tyw1I4uZ02fg7e2jIJGwV9KnL6xeTEwSQUF+V18H1mjOxfizAEQd36Qq1nUyUxOv/vzOe1+yZ+8+\n",
"Bj3RH4PBoDCVsEdS9IXVc3Bw4NtJsxjx6tMAnA/bC0Ddtk9xYutvKqPl88OkqTz+2AAuXbooLXyh\n",
"hHTvCJvQ75HufP3ttKuv67YbTOTxjeoC5XFx+2/OnE2bd3PhQu6wTCn4QhUp+sImuLm58fSTTxAT\n",
"kwTAiS2zafnoh0ozuXr4kWXMAODw4ZPUq1uXZ595TmkmIaToC5vi6OjI2bPnAVg3/XkqN+ysLMvl\n",
"9GRMOdm88OpYgoKCAGQ+fKGcFH1hc9zd3dm79zAAppxsgmq2VJbll1nzGf7SM8qOL8SNpOgLm1Sl\n",
"SlVGjxlH5LENxJ7aRVDNFqU+bcN3E3+iR7duBAUGlupxhSiIFH1hs4YPG860n3NH72SmJVOp/gN0\n",
"ebl0RvOMfPNdHuzcpVSOJURhSNEXNsvR0ZHeDz/C5CmzSYrWSTyn4+rhg6GaeadoHj78Dbp07iRj\n",
"8IVFknH6wub1f7Q3p8LGMeGrDwiq1YLUpGgcHBwxmXKKvE8Xdx+MGfmnT5j80yz69+1TnLhCmJW0\n",
"9IVdGPXmCB7q0Y99y78mLSmahp2GFGt/Nyv4Q4cOo88jvYq1XyHMTYq+sBu//TKToAqVADh7eC0A\n",
"nn5BuHv6FbRZPi5u3vmWNWv9EGPHfigPQBEWT4q+sCu7dwbn/ZSDZ5kKePtXJiMt+Y63bzvwG4yZ\n",
"l65bVtY/gIcf7iMFX1gFKfrCrnh4eBAbm8yl+LMEVGlMfOQRBjx15109idEn8i0rG1CeJ/v1KMmY\n",
"QpiNNE2E3XFwcOB0+DmqVAmiWpOepKalArmPLow7va/AbY9u+Pm6138uXs2ePbvx9S1cF5EQqkhL\n",
"X9glNzc3wsPPUbOiNzu3bwbg7rZ3Nuqmeq36ALw04mPat7mPoUNeNFtOIUqaFH1htzw9PZk9bRLV\n",
"a9Sh7ROfcTnpDPe0efjq+hrNHr7pdhdiI3h9xDs80fshILfLSAhrIUVf2DVnZ2cWzvudyr6ZpKZn\n",
"4le9PQA9ew8gNm9e/mv5lS2HX1kDo0eNoX69uqUdV4hik6Iv7J67uzsfv/0CWZdicfergKenN99P\n",
"GI+rx3/99L6G6jR44EWSE+P55LNv5AHmwmoVuehrmtZH07S5t1j3oqZpezRN26FpmgxrEBbP19eP\n",
"336ejOul4zRt0QYnJye6P/oMhsr16TzkF1LiwjHGn2Dt+h1076JuumYhiqtIo3c0TZsIdAH232Rd\n",
"eeA1oBngAWzVNG2NruuXixNUCHMrHxTE6Jf7s2mTgdBTJ6kY4M1dVWri5ORC38efoXu3bjRu2EB1\n",
"TCGKpahDNrcBi4GbDXBuAWzTdd0IGDVNCwUaA/k7SIWwMLVqVKdGtapkZmZSJeQ07i4O+GaF8/13\n",
"3+Lk5KQ6nhDFVmDR1zTteeD1GxY/o+v6Ak3T7r/FZj7Atbc4XgRkELOwGo6Ojnh4eHB/m5b8s3IZ\n",
"9zevKQVf2IwCi76u6zOAGYXcZwq5hf8KHyCxkPsQQrmgoCAmf/8VHu7SZhG2wxx35O4GPtU0zQ1w\n",
"B+oBR26zjYPB4HObt1g3OT/rZMA2z+tatnrtrrD18yus4hR9U97/ANA0bQQQquv6Mk3Tvge2kDs6\n",
"aIx8iSuEEJbBwWQy3f5dQgghbILcnCWEEHZEir4QQtgRKfpCCGFHpOgLIYQdUfoQFU3T+gD9dF1/\n",
"8ibrJgJtyL25ywT01nU9pZQjFsttzu9F4H9AFvCJrusrSjtfUWia5gHMAQzkXpundV2/cMN7rO7a\n",
"aZrmCPxI7t3jmcALuq6fumZ9L+A9cq/XTF3Xf77pjizUHZzfCOB5IC5v0RBd10NKPWgxaJrWEvhC\n",
"1/UHblhu1dfuigLOr1DXTlnRL2j+njxNgS66rieUXqqSY8PzE70MHNR1/SNN0x4HxpL/rm1rvHa9\n",
"AVdd11vnfbjG5y1D0zQXYALQHEgDtmmatlTX9fPK0hbeLc8vT1PgKV3Xb/V5tGiapr0NDAIu3bDc\n",
"Fq7dLc8vT6GuncrunW3kFpB8c9TmtUpqA9M1TduqadqzpR2uBNzy/LhmfqK8FvCV+YmsQRtgVd7P\n",
"q4Drppy04mt39bx0Xd9FbpG4oh6596Ak580ptRVoX/oRi6Wg84PcBsgYTdO2aJo2qrTDlYBQ4FHy\n",
"f95s4drBrc8PCnntzN7SL+L8PZ7A9+T+hXYGNmiatlfX9cPmS1o0tjw/0S3OLZbcqTbg5rmt5trd\n",
"wJf/zgsgW9M0R13Xc/LWWfz1uo2Czg9gHjCZ3HNbrGlaD2vpcgTQdX2RpmnVbrLKFq5dQecHhbx2\n",
"Zi/6RZy/Jw34Xtf1DABN09YDdwMWVzhseX6im52bpml/8V92HyDphs2s5trd4MZrcm1BTMYKrtdt\n",
"FHR+ABOvfO+iadoKoAlgNUW/ALZw7W6nUNfOUkfvaOT2czvm9cm1BYIVZypJu4F2mqa5aZrmx53N\n",
"T2QptgHd837uBmy+Yb21Xrur56Vp2n3AoWvWnQBqa5pWVtM0V3K7B3aUfsRiueX55f0OHtY0zUvT\n",
"NAegI7YzFbotXLtbKsq1Uzp6h4Ln75lN7sUxAr/qun5cUcbisMX5iX4CZmmatoXcUSADwSau3WLg\n",
"QU3TtuW9flbTtAGAt67r0zVNGwn8S+71mqHrerSqoEV0u/MbBWwg95qu1XV91a12ZOFMADZ27a51\n",
"s/Mr1LWTuXeEEMKOWGr3jhBCCDOQoi+EEHZEir4QQtgRKfpCCGFHpOgLIYQdkaIvhBB2RIq+EELY\n",
"ESn6QghhR/4PxzNJDpHDX+wAAAAASUVORK5CYII=\n"
],
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"%matplotlib inline\n",
"import seaborn as sns\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"p = 1 # which p-norm we use\n",
"M = 40000 # number of sampling points\n",
"a = np.random.randn(M, 2)\n",
"b = a[np.linalg.norm(a, p, axis=1) <= 1] # keep only the points inside the unit ball\n",
"plt.fill(b[:, 0], b[:, 1])\n",
"plt.axis('equal')"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Why $L_1$-norm can be important\n",
"$L_1$ norm, as it was discovered quite recently, plays an important role in **compressed sensing**. The simplest formulation is as follows:\n",
"- You have some observations $f$ \n",
"- You have a linear model $Ax = f$, where $A$ is an $n \\times m$ matrix, $A$ is **known**\n",
"- The number of equations, $n$, is less than the number of unknowns, $m$\n",
"\n",
"The question: can we find the solution?"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"The solution is obviously non-unique, so a natural approach is to find the solution that is minimal in a certain sense:\n",
"$$ \\Vert x \\Vert \\rightarrow \\min, \\quad \\mbox{subject to } Ax = f$$\n",
"\n",
"Typical choice of $\\Vert x \\Vert = \\Vert x \\Vert_2$ leads to the **linear least squares problem** (and has been used for ages). \n",
"The choice $\\Vert x \\Vert = \\Vert x \\Vert_1$ leads to [**compressed sensing**](https://en.wikipedia.org/wiki/Compressed_sensing), and, remarkably, it typically yields the **sparsest solution**. \n",
"[A short demo](tv-denoising-demo.ipynb)"
]
},
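{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "fragment"
}
},
"source": [
"The $L_1$ problem above can be cast as a linear program: introduce auxiliary variables $t_i \\geq |x_i|$ and minimize $\\sum_i t_i$. A minimal sketch (assuming `scipy` is available; `l1_min` is our own helper, not part of the lecture code):\n",
"\n",
"```python\n",
"import numpy as np\n",
"from scipy.optimize import linprog\n",
"\n",
"def l1_min(A, f):\n",
"    # minimize ||x||_1 subject to A x = f as a linear program:\n",
"    # variables [x, t], minimize sum(t) with -t <= x <= t and A x = f\n",
"    n, m = A.shape\n",
"    c = np.concatenate([np.zeros(m), np.ones(m)])\n",
"    I = np.eye(m)\n",
"    A_ub = np.block([[I, -I], [-I, -I]])   # x - t <= 0 and -x - t <= 0\n",
"    b_ub = np.zeros(2 * m)\n",
"    A_eq = np.hstack([A, np.zeros((n, m))])\n",
"    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=f,\n",
"                  bounds=(None, None), method='highs')\n",
"    return res.x[:m]\n",
"\n",
"np.random.seed(0)\n",
"n, m = 20, 50\n",
"A = np.random.randn(n, m)\n",
"x0 = np.zeros(m)\n",
"x0[[3, 17, 31]] = [1.0, -2.0, 0.5]   # sparse ground truth\n",
"f = A.dot(x0)\n",
"x = l1_min(A, f)\n",
"print(np.linalg.norm(A.dot(x) - f))  # residual: the constraint is satisfied\n",
"```\n",
"\n",
"On random instances like this one, the $L_1$ solution typically recovers the sparse vector exactly."
]
},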
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## What is a stable algorithm?\n",
"\n",
"We finish the lecture with the concept of **stability**.\n",
"\n",
"Let $x$ be an object (for example, a vector). Let $f(x)$ be the function (functional) you want to evaluate. \n",
"\n",
"You also have a **numerical algorithm** ``alg(x)`` that actually computes **approximation** to $f(x)$. \n",
"\n",
"The algorithm is called **forward stable** if $$\\Vert alg(x) - f(x) \\Vert \\leq \\varepsilon$$ for some small $\\varepsilon$. \n",
"\n",
"The algorithm is called **backward stable**, if for any $x$ there is a close vector $x + \\delta x$ such that\n",
"\n",
"$$alg(x) = f(x + \\delta x)$$\n",
"\n",
"and $\\Vert \\delta x \\Vert$ is small."
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Classical example\n",
"A classical example is the **solution of linear systems of equations** using the LU factorization.\n",
"\n",
"We consider the **Hilbert matrix** with the elements\n",
"\n",
"$$a_{ij} = 1/(i + j + 1), \\quad i = 0, \\ldots, n-1, \\quad j = 0, \\ldots n-1.$$\n",
"\n",
"And consider a linear system\n",
"\n",
"$$Ax = f.$$\n",
"\n",
"(We will look at matrices in more detail in the next lecture, and at linear systems in the upcoming weeks, but here you already **see** a linear system.)"
]
},
{
"cell_type": "code",
"execution_count": 168,
"metadata": {
"collapsed": false,
"scrolled": false,
"slideshow": {
"slide_type": "slide"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"2.08661626788\n"
]
}
],
"source": [
"import numpy as np\n",
"n = 20\n",
"a = [[1.0/(i + j + 1) for i in range(n)] for j in range(n)] # Hilbert matrix\n",
"a = np.array(a)\n",
"rhs = np.random.rand(n)\n",
"sol = np.linalg.solve(a, rhs)\n",
"print(np.linalg.norm(a.dot(sol) - rhs)) # residual norm ||Ax - f||\n",
"#print(sol)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Take home message\n",
"- Floating point (double, single, number of bytes), rounding error\n",
"- Norms are measures of smallness, used to measure accuracy\n",
"- $1$-, $p$- and Euclidean norms \n",
"- $L_1$ is used in compressed sensing as a surrogate for sparsity (later lectures) \n",
"- Forward/backward error (and stability of algorithms) (later lectures)"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"## Next lecture\n",
"- Matrices\n",
"- Matrix multiplication\n",
"- Matrix norms, operator norms\n",
"- Unitary matrices, unitary invariant norms\n",
"- Concept of block algorithms for NLA: why and how.\n",
"- Complexity of matrix multiplication"
]
},
{
"cell_type": "markdown",
"metadata": {
"slideshow": {
"slide_type": "slide"
}
},
"source": [
"##### Questions?"
]
},
{
"cell_type": "code",
"execution_count": 111,
"metadata": {
"collapsed": false,
"slideshow": {
"slide_type": "skip"
}
},
"outputs": [
{
"data": {
"text/html": [
"\n",
"\n",
"\n",
"\n",
"\n"
],
"text/plain": [
""
]
},
"execution_count": 111,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from IPython.core.display import HTML\n",
"def css_styling():\n",
" styles = open(\"./styles/custom.css\", \"r\").read()\n",
" return HTML(styles)\n",
"css_styling()"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 2",
"language": "python",
"name": "python2"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.9"
}
},
"nbformat": 4,
"nbformat_minor": 0
}