{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "slideshow": {
     "slide_type": "skip"
    }
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "{'theme': 'sky',\n",
       " 'transition': 'zoom',\n",
       " 'start_slideshow_at': 'selected',\n",
       " 'scroll': True}"
      ]
     },
     "execution_count": 1,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "from traitlets.config.manager import BaseJSONConfigManager\n",
    "cm = BaseJSONConfigManager()\n",
    "cm.update('livereveal', {\n",
    "              'theme': 'sky',\n",
    "              'transition': 'zoom',\n",
    "              'start_slideshow_at': 'selected',\n",
    "            'scroll': True\n",
    "})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "# Lecture 1: Floating-point arithmetic, vector norms"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Syllabus\n",
    "**Week 1:** floating point, vector norms, matrix multiplication\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Today \n",
    "- Fixed vs. floating point arithmetic\n",
    "- Concept of **backward** and **forward** stability of algorithms\n",
    "- How to measure accuracy: vector norms"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Representation of numbers\n",
    "\n",
    "Real numbers represent quantities: probabilities, velocities, masses, ...\n",
    "\n",
    "It is important to know, how they are represented in the computer, which only knows about bits."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Fixed point\n",
    "\n",
    "- The most straightforward format for the representation of real numbers is **fixed point** representation, also known as **Qm.n** format.\n",
    "\n",
    "- A **Qm.n** number is in the range $[-(2^m), 2^m - 2^{-n}]$, with resolution $2^{-n}$.\n",
    "\n",
    "- Total storage is $m + n + 1$ bits.\n",
    "\n",
    "- The range of numbers represented is fixed."
   ]
  },
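  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "For intuition, here is a minimal sketch of **Qm.n** quantization (a hypothetical helper, not a library function): scale by $2^n$, round to the nearest integer, and clip to the representable range.\n",
    "```python\n",
    "def to_fixed_point(x, m=4, n=4):\n",
    "    # Quantize x to Qm.n: scale by 2**n, round, clip to the (m+n+1)-bit range\n",
    "    scaled = round(x * 2**n)\n",
    "    lo, hi = -(2**(m + n)), 2**(m + n) - 1  # integer bounds of the stored word\n",
    "    return max(lo, min(hi, scaled)) / 2**n\n",
    "\n",
    "print(to_fixed_point(3.14159))  # 3.125: resolution is 2**(-4) = 0.0625\n",
    "print(to_fixed_point(100.0))    # 15.9375: clipped to 2**4 - 2**(-4)\n",
    "```"
   ]
  },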
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Floating point\n",
    "The numbers in computer memory are typically represented as **floating point numbers** \n",
    "\n",
    "A floating point number is represented as  \n",
    "\n",
    "$$\\textrm{number} = \\textrm{significand} \\times \\textrm{base}^{\\textrm{exponent}},$$\n",
    "\n",
    "where *significand* is integer, *base* is positive integer  and *exponent* is integer (can be negative), i.e.\n",
    "\n",
    "$$ 1.2 = 12 \\cdot 10^{-1}.$$"
   ]
  },
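  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "A quick illustration of the base-2 version of this decomposition: ```np.frexp``` splits a float into a significand in $[0.5, 1)$ and an integer exponent, and ```float.hex``` shows that $1.2$ is not exactly representable in base 2.\n",
    "```python\n",
    "import numpy as np\n",
    "m, e = np.frexp(1.2)\n",
    "print(m, e)         # 0.6 1, i.e. 1.2 = 0.6 * 2**1\n",
    "print((1.2).hex())  # 0x1.3333333333333p+0: an infinite binary fraction, rounded\n",
    "```"
   ]
  },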
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Fixed vs Floating\n",
    "\n",
    "**Q**: What are the advantages/disadvantages of the fixed and floating points?"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "**A**:  In most cases, they work just fine.\n",
    "\n",
    "- However, fixed point represents numbers within specified range and controls **absolute** accuracy.\n",
    "\n",
    "- Floating point represent numbers with **relative** accuracy, and is suitable for the case when numbers in the computations have varying scale (i.e., $10^{-1}$ and $10^{5}$).\n",
    "\n",
    "- In practice, if speed is of no concern, use float32 or float64."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## IEEE 754\n",
    "In modern computers, the floating point representation is controlled by [IEEE 754 standard](https://en.wikipedia.org/wiki/IEEE_floating_point) which was published in **1985** and before that point different computers behaved differently with floating point numbers. \n",
    "\n",
    "IEEE 754 has:\n",
    "- Floating point representation (as described above), $(-1)^s \\times c \\times b^q$.\n",
    "- Two infinities, $+\\infty$ and $-\\infty$\n",
    "- Two kinds of **NaN**: a quiet NaN (**qNaN**) and signalling NaN (**sNaN**) \n",
    "    - qNaN does not throw exception in the level of floating point unit (FPU), until you check the result of computations\n",
    "    - sNaN value throws exception from FPU if you use corresponding variable. This type of NaN can be useful for initialization purposes\n",
    "    - C++11 proposes [standard interface](https://en.cppreference.com/w/cpp/numeric/math/nan) for creating different NaNs \n",
    "- Rules for **rounding**\n",
    "- Rules for $\\frac{0}{0}, \\frac{1}{-0}, \\ldots$\n",
    "\n",
    "Possible values are defined with\n",
    "- base $b$\n",
    "- accuracy $p$ - number of digits\n",
    "- maximum possible value $e_{\\max}$\n",
    "\n",
    "and have the following restrictions \n",
    "- $ 0 \\leq c \\leq b^p - 1$\n",
    "- $1 - e_{\\max} \\leq q + p - 1 \\leq e_{\\max}$ "
   ]
  },
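  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "A quick NumPy illustration of these special values (NumPy produces quiet NaNs; signalling NaNs are not directly exposed in Python):\n",
    "```python\n",
    "import numpy as np\n",
    "with np.errstate(divide='ignore', invalid='ignore'):\n",
    "    print(np.float64(1.0) / 0.0)   # inf\n",
    "    print(np.float64(-1.0) / 0.0)  # -inf\n",
    "    print(np.float64(0.0) / 0.0)   # nan (a quiet NaN)\n",
    "print(np.nan == np.nan)            # False: NaN is not equal even to itself\n",
    "print(np.isnan(np.nan + 1.0))      # True: a qNaN silently propagates\n",
    "```"
   ]
  },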
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## The two most common format, single & double\n",
    "\n",
    "The two most common formats, called **binary32** and **binary64** (called also **single** and **double** formats).\n",
    "\n",
    "| Name | Common Name | Base | Digits | Emin | Emax |\n",
    "|------|----------|----------|-------|------|------|\n",
    "|binary32| single precision | 2 | 11 | -14 | + 15 |  \n",
    "|binary64| double precision | 2 | 24 | -126 | + 127 |  \n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Accuracy and memory\n",
    "The **relative accuracy** of single precision is $10^{-7}-10^{-8}$,  \n",
    "\n",
    "while for double precision is $10^{-14}-10^{-16}$.\n",
    "\n",
    "<font color='red'> Crucial note 1: </font> A **float32** takes **4 bytes**, **float64**, or double precision, takes **8 bytes.**\n",
    "\n",
    "<font color='red'> Crucial note 2: </font> These are the only two floating point-types supported in hardware.\n",
    "\n",
    "<font color='red'> Crucial note 3: </font> You should use **double precision** in CSE and **float** on GPU/Data Science.\n"
   ]
  },
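  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "These numbers can be queried directly from NumPy:\n",
    "```python\n",
    "import numpy as np\n",
    "for t in (np.float32, np.float64):\n",
    "    info = np.finfo(t)\n",
    "    # eps is the machine epsilon: the relative spacing of floats around 1.0\n",
    "    print(t.__name__, '| eps =', info.eps, '| bytes =', np.dtype(t).itemsize)\n",
    "```"
   ]
  },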
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "### Division accuracy demo"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.9259246\n",
      "0.1040364727377892\n",
      "-5.9604645e-08\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "import random\n",
    "#c = random.random()\n",
    "#print(c)\n",
    "c = np.float32(0.925924589693)\n",
    "print(c)\n",
    "a = np.float32(8.9)\n",
    "b = np.float32(c / a)\n",
    "print('{0:10.16f}'.format(b))\n",
    "print(a * b - c)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "### Square root accuracy demo"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.0000001468220603\n"
     ]
    }
   ],
   "source": [
    "#a = np.array(1.585858585887575775757575e-5, dtype=np.float)\n",
    "a = np.float32(5.0)\n",
    "b = np.sqrt(a)\n",
    "print('{0:10.16f}'.format(b ** 2 - a))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "### Exponent accuracy demo"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.0\n"
     ]
    }
   ],
   "source": [
    "a = np.array(2.28827272710, dtype=np.float32)\n",
    "b = np.exp(a)\n",
    "print(np.log(b) - a)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Summary of demos\n",
    "\n",
    "- For some values the inverse functions give exact answers\n",
    "- The relative accuracy should be preserved due to the IEEE standard\n",
    "- Does not hold for many modern GPU\n",
    "- More details about adoptation of IEEE 754 standard for GPU you can find [here](https://docs.nvidia.com/cuda/floating-point/index.html#considerations-for-heterogeneous-world) "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Loss of significance\n",
    "\n",
    "- Many operations lead to the loss of digits [loss of significance](https://en.wikipedia.org/wiki/Loss_of_significance)\n",
    "- For example, it is a bad idea to subtract two big numbers that are close, the difference will have fewer correct digits\n",
    "- This is related to algorithms and their properties (forward/backward stability), which we will discuss later"
   ]
  },
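  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "A small demonstration of such cancellation (illustrative values chosen by us, not lecture code); the last two lines show the classic workaround of rewriting the formula so that the subtraction of close numbers disappears:\n",
    "```python\n",
    "import numpy as np\n",
    "a = np.float32(1e8)\n",
    "b = np.float32(1e8 + 1.0)  # the +1 is already lost: float32 has ~7 decimal digits\n",
    "print(b - a)               # 0.0 instead of 1.0\n",
    "\n",
    "# sqrt(x + 1) - sqrt(x) == 1 / (sqrt(x + 1) + sqrt(x)), but only the\n",
    "# right-hand side avoids subtracting two close numbers\n",
    "x = np.float32(1e7)\n",
    "print(np.sqrt(x + 1) - np.sqrt(x))        # inaccurate direct evaluation\n",
    "print(1 / (np.sqrt(x + 1) + np.sqrt(x)))  # stable equivalent form\n",
    "```"
   ]
  },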
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Summation algorithm\n",
    "\n",
    "However, the rounding errors can depend on the algorithm.\n",
    "\n",
    "- Consider the simplest problem: given $n$ floating point numbers $x_1, \\ldots, x_n$  \n",
    "\n",
    "- Compute their sum\n",
    "\n",
    "$$S = \\sum_{i=1}^n x_i = x_1 + \\ldots + x_n.$$\n",
    "\n",
    "- The simplest algorithm is to add one-by-one \n",
    "\n",
    "- What is the actual error for such algorithm? "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Na&iuml;ve algorithm\n",
    "\n",
    "Na&iuml;ve algorithm adds numbers one-by-one: \n",
    "\n",
    "$$y_1 = x_1, \\quad y_2 = y_1 + x_2, \\quad y_3 = y_2 + x_3, \\ldots.$$\n",
    "\n",
    "- The **worst-case** error is then proportional to $\\mathcal{O}(n)$, while **mean-squared** error is $\\mathcal{O}(\\sqrt{n})$.\n",
    "\n",
    "- The **Kahan algorithm** gives the worst-case error bound $\\mathcal{O}(1)$ (i.e., independent of $n$).  \n",
    "\n",
    "- <font color='red'> Can you find the $\\mathcal{O}(\\log n)$ algorithm? </font>"
   ]
  },
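  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "One candidate answer is **pairwise (recursive) summation**: split the array in half, sum each half recursively, and add the two results, so rounding errors accumulate over $\\mathcal{O}(\\log n)$ levels rather than $n$ steps. A minimal sketch (```np.sum``` itself uses a blocked version of this strategy):\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def pairwise_sum(x):\n",
    "    # Each element passes through only O(log n) additions\n",
    "    if len(x) <= 2:\n",
    "        return np.sum(x)\n",
    "    mid = len(x) // 2\n",
    "    return pairwise_sum(x[:mid]) + pairwise_sum(x[mid:])\n",
    "```"
   ]
  },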
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Kahan summation\n",
    "The following algorithm gives $2 \\varepsilon + \\mathcal{O}(n \\varepsilon^2)$ error, where $\\varepsilon$ is the machine precision.\n",
    "```python\n",
    "s = 0\n",
    "c = 0\n",
    "for i in range(len(x)):\n",
    "    y = x[i] - c\n",
    "    t = s + y\n",
    "    c = (t - s) - y\n",
    "    s = t\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Error in np sum: 1.9e-04\n",
      "Error in Kahan sum: -1.3e-07\n",
      "Error in dumb sum: -1.0e-02\n",
      "Error in math fsum: 1.3e-10\n"
     ]
    }
   ],
   "source": [
    "import math\n",
    "from numba import jit\n",
    "\n",
    "n = 10 ** 8\n",
    "sm = 1e-10\n",
    "x = np.ones(n, dtype=np.float32) * sm\n",
    "x[0] = 1.0\n",
    "true_sum = 1.0 + (n - 1)*sm\n",
    "approx_sum = np.sum(x)\n",
    "math_fsum = math.fsum(x)\n",
    "\n",
    "\n",
    "@jit(nopython=True)\n",
    "def dumb_sum(x):\n",
    "    s = np.float32(0.0)\n",
    "    for i in range(len(x)):\n",
    "        s = s + x[i]\n",
    "    return s\n",
    "\n",
    "@jit(nopython=True)\n",
    "def kahan_sum(x):\n",
    "    s = np.float32(0.0)\n",
    "    c = np.float32(0.0)\n",
    "    for i in range(len(x)):\n",
    "        y = x[i] - c\n",
    "        t = s + y\n",
    "        c = (t - s) - y\n",
    "        s = t\n",
    "    return s\n",
    "k_sum = kahan_sum(x)\n",
    "d_sum = dumb_sum(x)\n",
    "print('Error in np sum: {0:3.1e}'.format(approx_sum - true_sum))\n",
    "print('Error in Kahan sum: {0:3.1e}'.format(k_sum - true_sum))\n",
    "print('Error in dumb sum: {0:3.1e}'.format(d_sum - true_sum))\n",
    "print('Error in math fsum: {0:3.1e}'.format(math_fsum - true_sum))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## More complicated example"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "2.0\n",
      "0.0\n",
      "0.0\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "test_list = [1, 1e20, 1, -1e20]\n",
    "print(math.fsum(test_list))\n",
    "print(np.sum(test_list))\n",
    "print(1 + 1e20 + 1 - 1e20)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Summary of floating-point \n",
    "You should be really careful with floating point, since it may give you incorrect answers due to rounding-off errors.\n",
    "\n",
    "For many standard algorithms, the stability is well-understood and problems can be easily detected."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Vectors\n",
    "- In NLA we typically work not with **numbers**, but with **vectors**\n",
    "- Recall that a vector in a fixed basis of size $n$ can be represented as a 1D array with $n$ numbers \n",
    "- Typically, it is considered as an $n \\times 1$ matrix (**column vector**)\n",
    "\n",
    "**Example:** \n",
    "Polynomials with degree $\\leq n$ form a linear space. \n",
    "Polynomial $ x^3 - 2x^2 + 1$ can be considered as a vector $\\begin{bmatrix}1 \\\\ -2 \\\\ 0 \\\\ 1\\end{bmatrix}$ in the basis $\\{x^3, x^2, x, 1\\}$"
   ]
  },
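  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "As a quick check of the example above, ```np.polyval``` evaluates a polynomial given its coefficient vector in exactly this highest-degree-first basis:\n",
    "```python\n",
    "import numpy as np\n",
    "p = np.array([1, -2, 0, 1])  # x^3 - 2x^2 + 1 in the basis {x^3, x^2, x, 1}\n",
    "print(np.polyval(p, 2.0))    # 1.0, since 8 - 8 + 0 + 1 = 1\n",
    "```"
   ]
  },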
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Vector norm\n",
    "\n",
    "- Vectors typically provide an (approximate) description of a physical (or some other) object \n",
    "\n",
    "- One of the main question is **how accurate** the approximation is (1%, 10%)\n",
    "\n",
    "- What is an acceptable representation, of course, depends on the particular applications. For example:\n",
    "    - In partial differential equations accuracies $10^{-5} - 10^{-10}$ are the typical case\n",
    "    - In data-based applications sometimes an error of $80\\%$ is ok, since the interesting signal is corrupted by a huge noise"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Distances and norms\n",
    "\n",
    "- Norm is a **qualitative measure of smallness of a vector** and is typically denoted as $\\Vert x \\Vert$.\n",
    "\n",
    "The norm should satisfy certain properties:\n",
    "\n",
    "- $\\Vert \\alpha x \\Vert = |\\alpha| \\Vert x \\Vert$\n",
    "- $\\Vert x + y \\Vert \\leq \\Vert x \\Vert + \\Vert y \\Vert$ (triangle inequality)\n",
    "- If $\\Vert x \\Vert = 0$ then $x = 0$\n",
    "\n",
    "The distance between two vectors is then defined as\n",
    "\n",
    "$$\n",
    "   d(x, y) = \\Vert x - y \\Vert.\n",
    "$$"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Standard norms\n",
    "The most well-known and widely used norm is **euclidean norm**:\n",
    "\n",
    "$$\\Vert x \\Vert_2 = \\sqrt{\\sum_{i=1}^n |x_i|^2},$$\n",
    "\n",
    "which corresponds to the distance in our real life. If the vectors have complex elements, we use their modulus."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## $p$-norm\n",
    "Euclidean norm, or $2$-norm, is a subclass of an important class of $p$-norms:\n",
    "\n",
    "$$\n",
    " \\Vert x \\Vert_p = \\Big(\\sum_{i=1}^n |x_i|^p\\Big)^{1/p}.\n",
    "$$\n",
    "\n",
    "There are two very important special cases:\n",
    "- Infinity norm, or Chebyshev norm is defined as the maximal element: \n",
    "\n",
    "$$\n",
    "\\Vert x \\Vert_{\\infty} = \\max_i | x_i|\n",
    "$$\n",
    "\n",
    "<img src=\"pics/chebyshev.jpeg\" style=\"height: 1%\">\n",
    "\n",
    "- $L_1$ norm (or **Manhattan distance**) which is defined as the sum of modules of the elements of $x$: \n",
    "\n",
    "$$\n",
    "\\Vert x \\Vert_1 = \\sum_i |x_i|\n",
    "$$\n",
    "  \n",
    "<img src=\"pics/manhattan.jpeg\" style=\"height\">  \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "We will give examples where $L_1$ norm is very important: it all relates to the **compressed sensing** methods \n",
    "that emerged in the mid-00s as one of the most popular research topics."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Equivalence of the norms\n",
    "All norms are equivalent in the sense that\n",
    "\n",
    "$$\n",
    "   C_1 \\Vert x \\Vert_* \\leq  \\Vert x \\Vert_{**} \\leq C_2 \\Vert x \\Vert_*\n",
    "$$  \n",
    "\n",
    "for some positive constants $C_1(n), C_2(n)$, $x \\in \\mathbb{R}^n$ for any pairs of norms $\\Vert \\cdot \\Vert_*$ and $\\Vert \\cdot \\Vert_{**}$. The equivalence of the norms basically means that if the vector is small in one norm, it is small in another norm. However, the constants can be large."
   ]
  },
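  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "A quick numerical check of the $1$-norm vs. $2$-norm bound above on a random vector:\n",
    "```python\n",
    "import numpy as np\n",
    "n = 1000\n",
    "x = np.random.randn(n)\n",
    "# Verify ||x||_2 <= ||x||_1 <= sqrt(n) * ||x||_2\n",
    "print(np.linalg.norm(x, 2) <= np.linalg.norm(x, 1) <= np.sqrt(n) * np.linalg.norm(x, 2))\n",
    "```"
   ]
  },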
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Computing norms in Python\n",
    "\n",
    "The NumPy package has all you need for computing norms: ```np.linalg.norm``` function.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Relative error in L1 norm: 0.0007329368243860864\n",
      "Relative error in L2 norm: 0.0009161454481284762\n",
      "Relative error in Chebyshev norm: 0.0025611753793460725\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "n = 100\n",
    "a = np.ones(n)\n",
    "b = a + 1e-3 * np.random.randn(n)\n",
    "print('Relative error in L1 norm:', np.linalg.norm(a - b, 1) / np.linalg.norm(b, 1))\n",
    "print('Relative error in L2 norm:', np.linalg.norm(a - b) / np.linalg.norm(b))\n",
    "print('Relative error in Chebyshev norm:', np.linalg.norm(a - b, np.inf) / np.linalg.norm(b, np.inf))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Unit disks in different norms\n",
    "\n",
    "- A unit disk is a set of point such that $\\Vert x \\Vert \\leq 1$\n",
    "- For the euclidean norm a unit disk is a usual disk\n",
    "- For other norms unit disks look very different"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Text(0.5, 1.0, 'Unit disk in the p-th norm, $p=1$')"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYYAAAEKCAYAAAAW8vJGAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzt3XucVPWd5//Xu6q7QRS1uYvQjaghoCYIHcBkRk0iibozYrwC7sbsT4Ps6m8emdnMSmLGyY8xMyTz2012Nu4IcTLJzIpiVJRk9OH9kqx2h24UuUWB1m4aEJqmQS7a18/+Uaexuqnqru66V32ej0c9uuqcU+d8TlVXfep7Od+vzAznnHOuRyjbATjnnMstnhicc8714onBOedcL54YnHPO9eKJwTnnXC+eGJxzzvXiicE551wvnhicc8714onBxSRpi6TLk3j+LyTdN5h9SXpf0hWZiG+ox82EXIvHFR9PDAVKkkk6r8+y70v634k838wuMLNXgucl9UUVva9UGeo+c+1LN9fiyReS7pJUK6lN0i+yHU+hKcl2AM657JNUYmad2Y5jEPYA9wFfBU7JciwFx0sMRSr4pfptSW9LOixpjaThfdZfIelfgQrg15KOSvqvcfZ3saQNko5IWgOctK+ox3dL2h1s+46kL8fY36clvSdpYT/xX5HIuUQ9p79zmdnPazFR0uOSmoOY/myA1/U7krZKapX0z7FiSSaeGMfr732cLukVSYeC6rdr+jz3bklvA8cklQTL/jLY3zFJ/yRpvKRngvfrBUnl8c6/T2y3SHo9iOkDSbskXZXIcwdiZk+Y2ZNASyr25/owM78V4A0w4Lw+y74P/O/g/vvA74GJwChgG7A0atv3gSv63o9zrDKgAfhzoBS4AegA7ouxr2nALmBi8HgKcG70dsAsoBH4k36O2Te+uOcS73l9lsV8PpEfT3XAvcF5TgXqga/2s//NwORgX/+n53VIRTxxnh8v9lJgB/DdIPYvAUeAaVHPfSuI9ZSoZdXAeOBsYD+wAbgYGAa8BPx1gv+DK4CPgJuCWL4NNMTY7jfAoTi33wxwjPuAX2T781ZoNy8xFLd/MLM9ZnYQ+DUwc4j7mUfkg/8TM+sws8eA9XG27SLyBTNDUqmZvW9mO6PW/zGwDrjVzH4ziBiSPZd4z/8cMNbMlptZu5nVAz8DYpZkAj81s13Bvn4ALBpkLP3FM5ht5wGnASuC2F8i8iW8qM9zd5nZR1HL/qeZ7TOz3cBvgRoze9PM2oC1RJJEIi4Cfmxmj5pZB/AvQEXf0o+Z/YmZnRnn9icJHsulkCeGwtVF5Ms6WimRX/I9Poi6f5zIl8hQTAR2W/ATLtAQa0Mz2wF8i0jpZb+kRyRNjNpkKfC6mb08yBiSPZd4z68EJgZVMYckHSLyC3x8P/vaFXW/gcjr01O1cjS4PTPEeAaz7URgl5l194nn7Dix9tgXdf+jGI8TfW0vAh6LejwOOGpmHyf4fJclnhgKVyORappo5xDnC3sAA03asRc4W5KillXE3ZnZajP7IyJfugb8MGr1UiK/Kn88hDgTMdgJSHYB7/X5FTvSzK7u5zmTo+5XEGkoxcweMrPTgltPXXs6J0TZA0yWFP05rwB2Rz1Oy/ElnUnkdWiOWnwDcFJCDNovjsa5DZRAXRp4Yihca4DvSZokKRQ01P4pvX/BJWofkbr1eN4AOoE/CxowrwPmxNpQ0jRJX5I0DPiYyC/QrqhNjgBXApdKWjGEWAcy0Ln09Xvgw6CR9hRJYUkXSvpcP8+5M3jdRxEpXaxJYTyDUQMcA/6rpFJFrvv4U+CRVB1AketVfhFj1UVE3tfFwf/EvwP+M5GSYi9mdlVUwux7i9lYHexzOBAGwpKGS/JeliniiaFwLQdeB34HtAI/Am4xs81D2NffEUkyhyR9u+9KM2sHrgO+ERzrZuCJOPsaRqRR8gCRKpBxRL48o/d3CJgPXCXpb4YQb3/6PZe+zKyLyJfpTOC9IO4HgTP6edpq4DkijdT1RBpIUxLPYATvyzXAVUTi/l/A183sDyk8zGQiDex9XQQ8BFxC5H/i/wOuNbOtKTru94j8qFgG/Pvg/vdStO+ip97Vws65ZEh6H7jdzF7IdizpJqkM2Ah8Jmhcjl73j8C7ZpauKkGXRl5icM4NSdDTaXrfpBC4iEjXWZeHPDE459LhQiCVVVYug7wqyTnnXC9eYnDOOddLXnbvGjNmjE2ZMiXbYTjnXF6pq6s7YGZjB9ouLxPDlClTqK2tzXYYzjmXVyQldIGrVyU555zrxRODc865XjwxOOec68UTg3POuV5Skhgk/VzSfkkxx+FRxD9I2hHMDDUrat2tkrYHt1tTEY9zzrmhS1WJ4RdERsSM5yrg/OC2BPhHgGD0yb8G5hIZjfOvE5020DnnXHqkJDGY2WvAwX42WQD8i0VUA2dKOovIRN7Pm9lBM2sFnqf/BONczqpraOX+l3dQ19Ca7VCcS0qmrmM4m94zRTUFy+ItP4mkJURKG1RUxJ0DxrmsqGtoZdHPquno7Ka0JMTD35zH7Eov/Lr8lKnGZ8VYZv0sP3mh2SozqzKzqrFjB7xwz7mMemJDE+2d3RjQ3tnNExuash2Sc0OWqcTQRO/pDicRmXYw3nLn8sLqmkYW/PR3/Nvbvf9ta9476FVKLm9lqippHXCXpEeINDQfNrO9kp4F/jaqwfkrwHcyFJNzQ1bX0MrKV3fy3NZ9Mdfv2H+Uhave4MaqyVw/a5JXK7m8kpLEIOlh4HJgjKQmIj2NSgHM7AHgaeBqYAdwHPiPwbqDwdSN64NdLTez/hqxncu6uoZWbnmwmo87uvvdrqPLWF3TyBMbmnjodm9zcPkjJYnBzBYNsN6AO+Os+znw81TE4VwmPL6hibYBkkK0jzu6WfnqTlZ9vSqNUTmXOn7ls3ODUNfQymN1TbF7SPTjua37WF3TmJaYnEs1TwzODcLKV3fS3pl4aSHamvWeGFx+yMv5GJzLpJ6G5i17DrP70MdD3s/404dT19BKdX0L86aO9jYHl7M8MTjXj7qGVhaueoOOruTmRhdw+bRx3PJgNe2d3ZSVhLxB2uUsr0pyrh8rX92ZdFIAuGLGeFqPt9Pe2U23RRqkb/vlelY8vS0FUTqXWl5icC6OFU9vi3udwmC9vvMAovel/oeOd/DAa/VsaGzl7qume+nB5QwvMTgXw+qaRh54rT5l+zvW1sVzW/cRq/Dx+/dbWbTqDb9S2uUMTwzO9VHX0Mo9azdl9JjtXcbjPr6SyxGeGJzrY+WrOwd9nUIqbNl92EsNLid4YnCOT+ZSWF3TyPMpalcYrLebDnPzyjf8QjiXdd747IpeXUOkjr+n91E2Sgs9x+3sNu59ajPTJoz0xmiXNV5icEXvgVd30t5lGNlLCtE6u43lv97i1UouazwxuKJW19DKS3/Yn+0wTrKx6TA3/OPrfp2DywpPDK6oPb6hia7uXCgnnMyAB16r9+TgMs7bGFzR6RmvqHxEGb+q3TX
wE7LsgdfqOdLWyXU+4Y/LEE8MrqjUNbSy6GfVdHR2EwopZ0sLfa2uaeRxn/DHZUhKqpIkXSnpHUk7JC2Lsf7Hkt4Kbu9KOhS1ritq3bpUxONcPE9saKK9sxuDvEkKEKlW6ujsprq+JduhuCKQdIlBUhi4H5gPNAHrJa0zs60925jZn0dt//8CF0ft4iMzm5lsHM71p6f66N19R7IdypAICIdD7D70EXUNrV5qcGmViqqkOcAOM6sHkPQIsADYGmf7RUTmhHYuI3rmaB7qBDuJGlYi2jrTUwqJlHC6eeT3Poe0S79UVCWdDUS34DUFy04iqRI4B3gpavFwSbWSqiVdG+8gkpYE29U2NzenIGxXLKrrW04Mdx2r9kjA0kunJvRhmHD6sLjr0pUUenR1R+Jv7/AqJZdeqUgMirEs3idkIfCYmXVFLaswsypgMfATSefGeqKZrTKzKjOrGjt2bHIRu6JSPqIsZkLoYcCTb+4mkfLE/g/bEj7usHCsj0byuoF/e3sPC376Ox8+w6VFKqqSmoDJUY8nAXvibLsQuDN6gZntCf7WS3qFSPvDzhTE5Ryraxr5u2fi1Wp+4oMjiX3hD6Yyqm2IE/yceUoJx9q7+p0gaOveSFvJxqbIKLCL51YM6VjOxZKKxLAeOF/SOcBuIl/+i/tuJGkaUA68EbWsHDhuZm2SxgBfAH6UgphcEatraOXxDU282dDKtg/yr7H50EedlITFacPCHG3rGnD7+1/e7onBpVTSicHMOiXdBTwLhIGfm9kWScuBWjPr6YK6CHjEzKJ/Bk0HVkrqJlKttSK6N5Nzg9UzIF57CqbjzKbOLuNoV++kIGLX0e4+9DGX/ehl/vvNM71B2qWEen9P54eqqiqrra3Ndhgux6yuaWTVazt5v+V4tkPJinAIHr3j854cXFyS6oI23X75WEmuIKyuaeS7azcVbVKASK+lla9685xLnicGVxCe2bw32yHkhOe27uNbj7yZ7TBcnvPE4PJeXUMrbR0DN9IOhdLT43TQRo0oTXjbJ9/awxX/7RXvyuqGzAfRc3mtZ1C8dF3VnCtNcAePdwxq+x3Nx/juWu/K6obGSwwub9U1tPKTF95N+1AX+cyr2NxQeGJwealn/KPfbj+Q7VBy2tbdh/nu2k0+TagbFK9KcnlldU0ja9Y3cqytk487MldSKAlBPhZMWo53sLqmkcdqd/Hwkku8K6tLiCcGlze+9cibPPlWvNFW0isfk0K0ji6jur7FE4NLiFclubyw4ultWUsK/TnzlBLCoRzputQPARt3HfIqJZcQTwwu59U1tLLqt/XZDiOmQx915sVMcN1ErnG48YHXvRurG5BXJbmcVtfQyvJfb+l32GyXuG6De9ZuYu2bTXxq/EiumzXJq5fcSbzE4HJWXUMrN618nY1Nh7MdSkExYP37rTxU08iin1V79ZI7iScGl5PqGlq5+7GNdOV5o2+u6+j02eDcybwqyeWc1TWN/NVTm/Oi7j7vKTLDnXPRvMTgcsrqmkbuWbvJk0KGmMH312326iTXi5cYXE6oa2jlgVd38sLWfXEnDO8xcniYIx+nZ9C8TJg+YSQdXd3saD6W7VAAaO8yVjyzjcunjWPe1NHeGO1SU2KQdKWkdyTtkLQsxvpvSGqW9FZwuz1q3a2Stge3W1MRj8svdQ2t3LzydZ5PICkAHEtgustc9ocPjvQ7n3M2rH+/lb9/9h1uWvmGd2d1yZcYJIWB+4H5QBOwXtK6GFN0rjGzu/o8dxTw10AVkc4SdcFzvVxbRB7f0DSoK4vzvZbJgIaDuTmhUFe3cY+Pylr0UlFimAPsMLN6M2sHHgEWJPjcrwLPm9nBIBk8D1yZgphcHsn964aLiwH3PuXtDsUsFYnhbGBX1OOmYFlf10t6W9JjkiYP8rlIWiKpVlJtc3NzCsJ2ueKCiWdkOwTXR1e38ZMX3vXkUKRSkRhi/eDrW9j/NTDFzD4DvAD8chDPjSw0W2VmVWZWNXbs2CEH63JHXUMr3127iXvXbc52KK4PA367/QCLVr3hyaEIpSIxNAGTox5PAnqNdmZmLWbWFjz8GTA70ee6wrS6ppGbg4bOzhxriHWfaO8yHt/QlO0wXIalorvqeuB8SecAu4GFwOLoDSSdZWY9U0ldA2wL7j8L/K2knv5xXwG+k4KYXI6qa2hl5as7E+6B5E4m4hSr0+TAkbaBN3IFJenEYGadku4i8iUfBn5uZlskLQdqzWwd8GeSrgE6gYPAN4LnHpT0N0SSC8ByMzuYbEwuN9U1tLJw1Rs511UzX4wcHuZ4e3fGL/6rP3CMuoZWv76hiMhyZbbzQaiqqrLa2tpsh+EG6Z61m3jI+8jnpeGlIR66fZ4nhzwnqc7MqgbazofEcBmz36sk8lZ7R7f3UioinhhcRtQ1tPLKO/uzHUZRXTMxoiycsn11E+mldJNP9FMUfKwkl3araxpZ9drOnGhbGFYS4uN8n8A5QcfbUz90SJfB99ZuYsuew1ww8Qxaj7f7+EoFyBODS5ueHkjPbd2X7VBOKJakkE7d0KutqKwkxMPf9PaHQuJVSS4t6hpaueXB6pxKCi492ju7WfnqzmyH4VLIE4NLi+r6Fto6/Nd5sXhx2z5vmC4gnhhcytU1tLJx1yG/gK2IdBnea6mAeBuDS6m6hlYW/ayadq/LLzq/3X6AN3a2sHzBhT5kd57zxOBSYnVNI2vWN9JytM2TQhHr7Db+6slIr6XrZk3yBuk85YnBJW3F09t44LX6bIfhckSXRXotPb6hya+WzlPexuCSsrqm0ZOCi6nNr5bOW54Y3JDVNbTyV09uynYYLkMmnD5sUNsb8LvtB7jlwWpPDnnGE4MbkrqGVn7ywrvkwMXMLo5UD//xwYeDH+vKiJQcqutbUhyNSydvY3CDUtfQyuMbmni0dpdPsJPjcuXdMWDjrkM+dHce8cTgEra6ppF7n9pMZ4bnA3D577mt+3hte7M3RucJTwwuIatrGvnek5vwnOCGqq2jm+W/3sIFZ5/B9d6VNaelpI1B0pWS3pG0Q9KyGOv/QtJWSW9LelFSZdS6LklvBbd1qYjHpZYnBSdg1KmlSe3DgI1Nh1ld08iiVW94g3QOSzoxSAoD9wNXATOARZJm9NnsTaDKzD4DPAb8KGrdR2Y2M7hdk2w8LrXqGlq596nN/SaFa2dO5LOTzshcUC7jwiExfuTwlO2vvct84L0clooSwxxgh5nVm1k78AiwIHoDM3vZzI4HD6uBSSk4rkuzFU9v47Zfru+3TeHUsjDvHTjGjuajGYzMZVpnt/HuviMp3edzW/f5pD85KhWJ4WxgV9TjpmBZPLcBz0Q9Hi6pVlK1pGvjPUnSkmC72ubm5uQidgPquZr50PGOfrc71t7FxqbDHGtL/aQwLrekoxPa3/xmqyeHHJSKxudY3aVj/gtJ+vdAFXBZ1OIKM9sjaSrwkqRNZnZSGdPMVgGrAKqqqry2O43qGlp5pHZXr2Uid7o/usLxUUcX310buUjSB97LHakoMTQBk6MeTwL29N1I0hXAPcA1ZnbiShkz2x
P8rQdeAS5OQUxuiHom2OlbUjBAQ7xiSgz+qlmXfaelcM7ogTyzeW/GjuUGlorEsB44X9I5ksqAhUCv3kWSLgZWEkkK+6OWl0saFtwfA3wB2JqCmNwQ9TfBTmkoNKRGZmNoV8267Dqahjmj49l18DgLfvo7r1bKEUlXJZlZp6S7gGeBMPBzM9siaTlQa2brgL8HTgN+pcjPzsagB9J0YKWkbiJJaoWZeWLIgrqGVqrrW9i+70jcKqP2rm42Nh3OaFyuOLzfEumbsrFpE40tx1h29fQsR1TcZJZ/NcdVVVVWW1ub7TAKxoqnt7HytXpvQ3A54/H/9Hm/AC4NJNWZWdVA2/kgekWuZ9hsTwoul/yXR9/yaqUs8sRQxOoaWvmHF9/Ndhgux5xSkv2vhfdbjvPdtZv41iNvZjuUopT9/wCXFatrGrnpgde9Udid5KMcmpr1ybf2eMkhCzwxFKGeCXZ81GyXK8KCW+ZWsPTSqSet866smeejqxah6voWTwoup3QZHGvrpPlIG6eVhXt1lR19alkWIytOXmIoQttTPOZNokTqZxVz6ZPp9+rJt/bw3NZ9J10/8Zu39/pIrBnmJYYiUdfQyspXd7Jlz2F2H/o4KzF4ISW/5Mr71dltLP/1Fi48+wyu83kcMsITQxHw+RQKX0gU9Pu7sekwG5sOs6Z2FzdXTfYEkWZelVTgEplPweW/yeUjsh1CRnR2GQ/5RD9p54mhwD2xocnnaC4CDQePD7xRAWnvMh7f0JTtMAqWJ4YCVtfQyq/6DJ/tXKHYsvuwlxrSxBNDgeqZfa3d+6W6ArWx6TCLflbtySENvPG5wNQ1tPLDZ7bx+/f9w+IKX3tnN09saPKG6BTzxFBAVtc0cu9Tm71NwRW088adxo79n8wxXvPewROlhur6FuZNHe2JIkmeGAqED3PhisWFE0/vlRh27D/KzaveICTR2dVNWUmIh26f58khCd7GUCAe39CUF0nBr3yO7cxTShk5LHNTaeYrAe8dOHbS8s4uo6Ozm26Djs5uqutbMh9cAUlJYpB0paR3JO2QtCzG+mGS1gTrayRNiVr3nWD5O5K+mop4itGBI7FHSQ0p/pscPUTFUOdzHqw8yF1ZceijDo60ZW4qzXxlwKbdsWcR7PkfDoVE+Ygy7n95hzdMD1HSVUmSwsD9wHygCVgvaV2fKTpvA1rN7DxJC4EfAjdLmkFkjugLgInAC5I+ZWb+CRmEuoZW9n0Ye5gLM1g8t4J39x1hfZ8GaeuznXP5IF4TWs/yji7jnrWbkPBqpSFKRYlhDrDDzOrNrB14BFjQZ5sFwC+D+48BX1Zk8ucFwCNm1mZm7wE7gv25BNU1tHLLg9W8HWcu5tKwONbWeVJSyBfDc2DSGJd/jEiiaOvwaqWhSMWn7mwg+iqqpmBZzG3MrBM4DIxO8LkASFoiqVZSbXNzcwrCLgzV9S20d3af+PV/+vBPCoEhoHLUCJ58a09WYkuFj3No0hiXfwwoH+HDdg9WKhJDrNrpvoW9eNsk8tzIQrNVZlZlZlVjx44dZIiFa97U0ZSEP3kbP/y488T9bmB788kNdc7loukTRvb6YZMqrcfbU77PQpeKxNAETI56PAno+xP1xDaSSoAzgIMJPtf1Y3ZlOTfMnpTtMJxL2rYPjvT6YZMKJSExb+rolO6zGKQiMawHzpd0jqQyIo3J6/pssw64Nbh/A/CSmVmwfGHQa+kc4Hzg9ymIqahcP2sSZWHvCOpcDxFJCssXXOgNz0OQdLnNzDol3QU8C4SBn5vZFknLgVozWwf8E/CvknYQKSksDJ67RdKjwFagE7jTeyQN3uzKch5ecgkrX93J81v3eZdQN2inDQtztEC6y86ZUs5l08b5FdBJkOVhP8Wqqiqrra3Ndhg55/6Xd/D3z76TseMNLwlRWiJOLS3hgzjXUTiXSSVhsWbJJZ4Q4pBUZ2ZVA23nQ2IUkHlTRxMWGbsC+uPObj7uhCMfF8YvTZcaoaBWM9NDdn120hnc+6cXeFJIAe8kXkBmV5bzN9delLGrmJ2L5dRhJRlPCmUlIU8KKeQlhgKzeG4FQNrneBY+vIWL7UiKexb1JyxYOKfC54BOMS8xFKDFcyu479qLKAmJEFAWFueNPTWlx4hOCl5AcdnSbTDxzFM8KaSYlxgK1OK5FUybMPLE+PQrX93JjjRd7OYlB5ctpWG/TiEdPDEUsNmV5Sd+SY0ZOSyrsXxlxngAdh08zrYPjmQ1Fpcf+quunDOlnPPGj+R6r0JKC08MReL6WZN4tHYXnVmYtEHA5dPGsXhuBQt++ruMH9/lp3j/qbfMreAHX7soo7EUG29jKBKzK8u5qWrywBumgQH3rN1E1X3PszHOKLDOJaIsLK6b5UPApJsnhiJy/axJlIQG11Tc37DXg9mVAQeO+mBmxSh6Qqhk3Vg12auOMsATQxGZXVnO8gUXMphhleINez3h9GFc89mJcT/wg8w/roCNOa2M8acn38ZVVhLy0kKGeBtDkenprXT3YxuT6qX0qfEj+53nodv8WgcX0dxPSbEkBANNuSFg/ozx3HHZuV5ayBBPDEVodmU5c6eOTioxJDKXrieF9CmUpNs3KYSIzCPSY/qEkdz3tYs8IWSYVyUVqetmTaIkiaG6j7Xn5vhIp5aFsx1CRhRCUohl6rjTTtwPCf7ksxM9KWSBJ4YiNbuynDVLLmHOlNz60CXbNDFj4unevpFF4SRf/B37j1JWEiKsSJuCX7yWHT7stmN1TSNr1jcyrCTEGSPKeO/AMXbsPzrg86ZPGMl7B475vMwupebPGM/MyWf6fApp4MNuu4QtnlvB4rkV1DW0csuD1bR19P9FP2ZkGS1H2nPmCuZwCLo8N+W0EWVhjidY/Thu5DDu/OJ5aY7I9SepqiRJoyQ9L2l78Pek9C5ppqQ3JG2R9Lakm6PW/ULSe5LeCm4zk4nHJae6voX2zu4B668PHGkfVB33hJHDOHNEaTKh9as7jUnBa6VS43h7FxPiDMsyc9IZJ+57l9TckGyJYRnwopmtkLQseHx3n22OA183s+2SJgJ1kp41s0PB+r80s8eSjMOlwLypoykrCdHR2U23pa6B84MjbZxSOvBvkJCGNrlLOitD86+iNXfFmuXv2pkT+cnCi6lraD0x4KNXH2VfsolhAXB5cP+XwCv0SQxm9m7U/T2S9gNjgUO4nDK7spyHbp9HdX0La9/cnVA7Q6I+GqB6CoaWFPp2b3T55enNH/AfGlp7Dfjosi/ZXknjzWwvQPB3XH8bS5oDlAE7oxb/IKhi+rGkuJdHSloiqVZSbXNzc5Jhu3hmV5Zz5xfP4//5wjnZDiUh6UoKw/oZCsSlTkdnN9X1LdkOw/Ux4H+/pBckbY5xWzCYA0k6C/hX4D+aWc/n+TvAp4HPAaM4uRrqBDNbZWZVZlY1duzYwRzaDcHiuRUsvXRq0daxt3lPq7SJ/tIp9S6pOWnAqiQzuyLeOkn7JJ1lZnuDL/79cbY7Hfg34HtmVh21773B3TZJ/wx8e1DRu7RadvV0KkafmvZpQl1xWXLpVD5s6
0TgU3LmqGTbGNYBtwIrgr9P9d1AUhmwFvgXM/tVn3U9SUXAtcDmJONxKdYzh/Q9azelrSF2qI3OLj+NPKWUZVdPz3YYrh/JVqSuAOZL2g7MDx4jqUrSg8E2NwGXAt+I0S31IUmbgE3AGOC+JONxabB4bgXzgxnY0sGTQvHwq5nzg1/57BJS19DKjQ+8ntCX+N9+7SK27DnMQzWN6Q+sj3Aocl1D/v1Xp15YkOkJ++IN7hcSXDHdR0jNtkSvfPauFy4hsyvLue/aixJqjH75nf1cN2tSxhuuK0eN8KQQpSSc2Y935agRzJ8xnrIYgzOawWcnn+lJIU94YnAJWxzMtTvQoKz7P/yY2ZXlKZmcZTAaDh73pBAl0z2rGg4e57mt++iyyDha0UrC8iqkPOJjJblB6Znop7q+hSMfdbDqt/UnVS+dM+ZUVtc08sGHJ1/p6rIv3XM5dHUb7+w7cuI4wqfkzDceid23AAAQtElEQVSeGNygRV+lGqs762/e3svm3YdTftzhJSE6urt9wLw4wiFhZgO2A2WiVNVtUBLEU1oS4nof/yiveGJwSenpzhqdHLq6jZ1JzA4Xjw/v3b/ubmP+jPE8v3Vf1qvUwiGxfMGFtB5v9/GP8pAnBpe0nuRw71Ob6TYjJNHpfVAzzoCa91sIhURXgq9/uqqVvvTpcSf+L1z+8cTgUiK67aF8RBnfXbsp2yEVpcPHO4HEv/BPGcQ8CeePPZXt/ZQEe/oklIbF0svOTWifLjd5YnAp09P2UNfQ6lczZ1miL32iSQFge/MxhoVFW5yLI+bPGM9nfea1guCJwaVcdX0LeXjdpEtAvKQAMMZnXisYfh2DS7l5U0czrDRUtCOz5qLhaR5GvCSE9zwqIF5icCkXPeHP9n1HePKtPdkOqeilq0fXnCnlnDd+JNf7KKkFxRODS4ue9ob7X97h7Q0FKiy4bNo4rz4qQF6V5NKqZx5pr1YqLMIn2SlknhhcWvVUKy2eW4HPllkYQop0T37o9nlefVSgvCrJpV1PtdJ1syax8tWdPLd1X7ZDKlpnnlLCoY86k9rHwjmRwRRd4fLfcC5jZleW89nJZ2Y7jKKWbFIoC8t7HxWBpBKDpFGSnpe0Pfgbs1wpqStq9rZ1UcvPkVQTPH9NMA2oK2Dzpo72KqU89dlJZ/Dwkku8+qgIJPsRXQa8aGbnAy8Gj2P5yMxmBrdropb/EPhx8PxW4LYk43E5bnZlOWvu+DzzZ4xn7Gn+OyAfhIDhpSHu/dMLPCkUiaSm9pT0DnC5me2VdBbwiplNi7HdUTM7rc8yAc3ABDPrlHQJ8H0z++pAx/WpPQvHjQ+8zvr3W7Mdhovj9OEl3HHZuT7MRYHI1NSe481sL0Dwd1yc7YZLqpVULenaYNlo4JCZ9VR6NgFnxzuQpCXBPmqbm5uTDNvlik+NHxl33all4QxG4mJZPKeCO794nieFIjNgryRJLwATYqy6ZxDHqTCzPZKmAi9J2gR8GGO7uMUXM1sFrIJIiWEQx3Y57LpZk/hVXRPtMa7MPTaIAd7cJ2KNrHpaWZijg3g9BSyYOZFlV09PZWguTwyYGMzsinjrJO2TdFZUVdL+OPvYE/ytl/QKcDHwOHCmpJKg1DAJ8LETiszsynIe/uYnw2c8s/mDjM9VnGsG+yWeiMHsb86Ucu6+arqXEopYslVJ64Bbg/u3Ak/13UBSuaRhwf0xwBeArRZp3HgZuKG/57vCN7uynDu/eB4/WXgxn5l0RrbDybpkk0IyxenpE0by6NLPe1IocskmhhXAfEnbgfnBYyRVSXow2GY6UCtpI5FEsMLMtgbr7gb+QtIOIm0O/5RkPC7PxapScplzn1+45kjyymczawG+HGN5LXB7cP91IOZ/m5nVA3OSicEVlps/V8HGJp/9LR3CIejqJ++OPa3MSwoO8CExXI7pmSf4/3/2Dxw83pHlaApL36TQt5H6z+ef1NPcFSlPDC7n9MwffeMDr/tw3Wn0uSnlXHvxJJ7ZvJerLjzrRFJ2zgcncDlpdmU59117Uc7/g+bicOKlCb5o548fyeK5FfzrbXM9Kbhecv1z54rY4rkVLMrxL6xcLNB0JNB+XxoW1/lgeC4OTwwup103axJlPupeSgm4sWqyNzS7uPwT53JazwVwt8yt4Lyxpya9PwETRg5LPrAcNCyBBCpgWGnIh852/fLGZ5fzeib6qWto5eZVb9DZ1bsCR8AfnT8GAa9tP9Dvvgz44Ehb2mLNpv6uGJ8yegRXXjCBkaeU+oB4bkCeGFzemF1Zzpoll7DimW29RmQNh+CqC8/i3qc2ZzG63FUSEv/tppmeDFzCvCrJ5ZXZleVcPm1cr95AN3+ugi17DtNZxH1bwyEIC8Kh3v2kJFi+4EJPCm5QPDG4vDNv6miGlYYIKzKBzAUTz+BXtbtOrA/lYh/SNLtw4hn8xVem8eVP9x75fv708d4V1Q2aVyW5vDO7spyHbo+MyDpv6miq61t6lRaGlYT4KJE+mwXkkqmjufOL51HX0Mor7zbT0dlNaUmIOy47N9uhuTzkicHlpZ4G6R4lIdEeNEoXW1IAWPXbeipGn8riuRUnhjH3RmY3VJ4YXN6bXVnOjVWTeaimMduhZE23wV89uYlpE0aelDSdGyxvY3AF4bpZk4qybSFal0F1fUu2w3AFwBODKwg9YysVeW5g3tTR2Q7BFQBPDK5gLJ5bwWP/6fN8Zcb4jJYeJpyeG1dSTw+qkZxLVlKJQdIoSc9L2h78Pem/UtIXJb0VdftY0rXBul9Iei9q3cxk4nFudmU5q75exaI5meui2Xy0jXAOFFUu9qTgUiTZEsMy4EUzOx94MXjci5m9bGYzzWwm8CXgOPBc1CZ/2bPezN5KMh7ngEibw/DSUEaqlrq6YVL5iAwcKb6SED7+kUuZZBPDAuCXwf1fAtcOsP0NwDNmdjzJ4zrXr55rHT4z6YyMHK/hYHb+pcMhMX/GeNbc8XmvRnIpk2x31fFmthfAzPZKGjfA9guB/95n2Q8k3UtQ4jCzmCOcSVoCLAGoqPArOd3AZleWc8HZZ7Cx6fCA2/ad5jJVwiHRlaahOm6ZW8F1syZ5QnApN2CJQdILkjbHuC0YzIEknQVcBDwbtfg7wKeBzwGjgLvjPd/MVplZlZlVjR07djCHdkXs+lmTKEugASBdoyx9atxpadnv0kun8oOvXeRJwaXFgCUGM7si3jpJ+ySdFZQWzgL297Orm4C1ZnZihvee0gbQJumfgW8nGLdzCZldWc7DSy6hur6FjbsO8dzWfRk9/rYPjqRkPyGBWWRQvCV/PJVlV09PyX6diyXZqqR1wK3AiuDvU/1su4hICeGEqKQiIu0TPm6yS7no+Rxefmc/HV35NQprWVh8/5oLaT3e7sNcuIxINjGsAB6VdBvQCNwIIKkKWGpmtwePpwCTgVf7PP8hSWOJVPG+BSxNMh7n4uoZOmN1Hg2dMX/GeJZedq4nA5dRSSUGM2sBvhxjeS1we9Tj94GzY2z3pWSO79xgXT9rEk9saKK9o5tcH2pP4EnBZYUPoueKSvSQ
3eUjymg93p6VtodEGJGxjzwxuEzzxOCKTt/RR+saWnlx2z5yremhrCTkYx+5rPCxklzRm11Zzpenj892GCdI8JUZ43n4m/O8tOCywksMzgF3XHYuL72zn84kig1nnlLKoY86Bt6wHwJ+cO1FPh2nyyovMThHpNRwU9XkpPaRbFIAuOPSqZ4UXNZ5YnAucP2sSZSVpPYjIeCU0sT2OX/GeL9wzeUETwzOBWZXlvPwN+cxZ0rq6vWN/uegFhAWDC8NsfSyc1N2XOeS4YnBuSizK8u5bNpAY0GmTjgsbp5TwUO3e0Ozyx2eGJzrY97U0SmvUorHuo2zzzzFk4LLKZ4YnOujp0rplrkVnDf21LQdJyQo9WsVXA7y7qrOxRA98N4ND7yOpeHity+cN4ZvXfEpLy24nOMlBuf6MbuynDv+eGrK91tWEvKk4HKWlxicG8Cyq6dTMfpU1qxvZNPuwyQzIdt5405j7jmjfOY1l9M8MTiXgMVzK2g93s6m3SdPE9ozP1zffKFgcp2ebYaVhvjh9Z/xhOBynlclOZegnt5KIUXmcv7clHJKw4o5Laj4JGGEgD86f4x3SXV5w0sMziUoesjueVNHU13fQl1DK3ByaWFEWZguMzo6uyn19gSXZ5JKDJJuBL4PTAfmBBP0xNruSuB/AGHgQTNbESw/B3gEGAVsAP6DmbUnE5Nz6dR3yO6QRHeMLkvH2rtYeulURp5S6tNxuryTbFXSZuA64LV4G0gKA/cDVwEzgEWSZgSrfwj82MzOB1qB25KMx7mMmV1ZzvIFF1ISUsz1W/Z+yJ1fPM+Tgss7SSUGM9tmZu8MsNkcYIeZ1QelgUeABZIEfAl4LNjul8C1ycTjXKYtnlvBmjsu4Y/PH3PSuqsuPCsLETmXvEw0Pp8N7Ip63BQsGw0cMrPOPstjkrREUq2k2ubm5rQF69xgza4s51tXfIrhpSFE5IrmpT58tstjA7YxSHoBmBBj1T1m9lQCx4hVzrZ+lsdkZquAVQBVVVU5NgmjK3Z9G6a9+sjlswETg5ldkeQxmoDoGVAmAXuAA8CZkkqCUkPPcufyUt+GaefyVSaqktYD50s6R1IZsBBYZ2YGvAzcEGx3K5BICcQ551waJZUYJH1NUhNwCfBvkp4Nlk+U9DRAUBq4C3gW2AY8amZbgl3cDfyFpB1E2hz+KZl4nHPOJU+WjmEj06yqqspqa2NeMuGccy4OSXVmVjXQdj4khnPOuV48MTjnnOvFE4Nzzrle8rKNQVIz0JDELsYQ6S6b7wrlPKBwzsXPI7f4efRWaWZjB9ooLxNDsiTVJtIAk+sK5TygcM7FzyO3+HkMjVclOeec68UTg3POuV6KNTGsynYAKVIo5wGFcy5+HrnFz2MIirKNwTnnXHzFWmJwzjkXhycG55xzvRRFYpB0o6Qtkrolxe3yJel9SZskvSUp5wZjGsR5XCnpHUk7JC3LZIyJkjRK0vOStgd/Y45XLakreD/ekrQu03HGMtDrK2mYpDXB+hpJUzIf5cASOI9vSGqOev1vz0acA5H0c0n7JW2Os16S/iE4z7clzcp0jIlI4Dwul3Q46v24N23BmFnB34DpwDTgFaCqn+3eB8ZkO95kzgMIAzuBqUAZsBGYke3YY8T5I2BZcH8Z8MM42x3NdqyDfX2B/ww8ENxfCKzJdtxDPI9vAD/NdqwJnMulwCxgc5z1VwPPEJkcbB5Qk+2Yh3gelwO/yUQsRVFisMTmps55CZ5HzDm20x/doC0gMs835Nd834m8vtHn9hjw5WCO81ySL/8nAzKz14CD/WyyAPgXi6gmMkFYzk3IncB5ZExRJIZBMOA5SXWSlmQ7mCGKN8d2rhlvZnsBgr/j4mw3PJjru1pSLiSPRF7fE9tYZD6Sw0TmG8klif6fXB9UvzwmaXKM9fkgXz4TibhE0kZJz0i6IF0HGXBqz3yRgrmpAb5gZnskjQOel/SHIItnTBrn2M64/s5lELupCN6TqcBLkjaZ2c7URDgkiby+OfMe9CORGH8NPGxmbZKWEikFfSntkaVePrwfidhAZKyjo5KuBp4Ezk/HgQomMVjyc1NjZnuCv/slrSVS3M5oYkjBecSbYzvj+jsXSfsknWVme4Ni/f44++h5T+olvQJcTKRuPFsSeX17tmmSVAKcQY5UEUQZ8DzMrCXq4c+AH2YgrnTImc9EMszsw6j7T0v6X5LGmFnKBwn0qqSApFMljey5D3wFiNk7IMfFnGM7yzHFso7IPN8QZ75vSeWShgX3xwBfALZmLMLYEnl9o8/tBuAlC1oPc8iA59GnHv4aIlPz5qN1wNeD3knzgMM91Zj5RNKEnrYqSXOIfH+39P+sIcp2S3wmbsDXiPxqaAP2Ac8GyycCTwf3pxLpmbER2EKk6ibrsQ/2PILHVwPvEvllnXPnEcQ4GngR2B78HRUsrwIeDO5/HtgUvCebgNuyHXe81xdYDlwT3B8O/ArYAfwemJrtmId4Hn8XfBY2Ai8Dn852zHHO42FgL9ARfD5uA5YCS4P1Au4PznMT/fRMzPHzuCvq/agGPp+uWHxIDOecc714VZJzzrlePDE455zrxRODc865XjwxOOec68UTg3POuV48MTjnnOvFE4Nzzrle/i9tTOAiMv1ylgAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "%matplotlib inline\n",
    "import matplotlib.pyplot as plt\n",
    "import numpy as np\n",
    "p = 1 # Which norm do we use\n",
    "M = 40000 # Number of sampling points\n",
    "a = np.random.randn(M, 2)\n",
    "b = []\n",
    "for i in range(M):\n",
    "    if np.linalg.norm(a[i, :], p) <= 1:\n",
    "        b.append(a[i, :])\n",
    "b = np.array(b)\n",
    "plt.plot(b[:, 0], b[:, 1], '.')\n",
    "plt.axis('equal')\n",
    "plt.title('Unit disk in the p-th norm, $p={0:}$'.format(p))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Why $L_1$-norm can be important?\n",
    "\n",
    "$L_1$ norm, as it was discovered quite recently, plays an important role in **compressed sensing**. \n",
    "\n",
    "The simplest formulation of the considered problem is as follows:\n",
    "\n",
    "- You have some observations $f$ \n",
    "- You have a linear model $Ax = f$, where $A$ is an $n \\times m$ matrix, $A$ is **known**\n",
    "- The number of equations, $n$, is less than the number of unknowns, $m$\n",
    "\n",
    "The question: can we find the solution?"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "The solution is obviously non-unique, so a natural approach is to find the solution that is minimal in the certain sense:\n",
    "\n",
    "\\begin{align*}\n",
    "& \\Vert x \\Vert \\rightarrow \\min_x \\\\\n",
    "\\mbox{subject to } & Ax = f\n",
    "\\end{align*}\n",
    "\n",
    "- Typical choice of $\\Vert x \\Vert = \\Vert x \\Vert_2$ leads to the **linear least squares problem** (and has been used for ages).  \n",
    "\n",
    "- The choice $\\Vert x \\Vert = \\Vert x \\Vert_1$ leads to the [**compressed sensing**](https://en.wikipedia.org/wiki/Compressed_sensing)\n",
    "- It typically yields the **sparsest solution**  \n",
    "\n",
    "[A short demo](tv-denoising-demo.ipynb)"
   ]
  },
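  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "fragment"
    }
   },
   "source": [
    "Below is a small self-contained sketch of this effect (our own illustration with made-up sizes, not lecture code): the $L_1$-minimal solution of an underdetermined system, computed via the standard linear-programming reformulation $\\min \\sum_i t_i$ subject to $-t \\leq x \\leq t$, $Ax = f$, recovers a sparse vector, while the $L_2$-minimal (least squares) solution is dense.\n",
    "```python\n",
    "import numpy as np\n",
    "from scipy.optimize import linprog\n",
    "\n",
    "np.random.seed(0)\n",
    "n, m = 20, 50  # n equations, m unknowns, n < m\n",
    "A = np.random.randn(n, m)\n",
    "x_true = np.zeros(m)\n",
    "x_true[[3, 17, 31]] = [1.0, -2.0, 1.5]  # sparse ground truth\n",
    "f = A @ x_true\n",
    "\n",
    "# L2: minimum-norm least squares solution (dense)\n",
    "x_l2 = np.linalg.lstsq(A, f, rcond=None)[0]\n",
    "\n",
    "# L1: variables z = [x, t]; minimize sum(t) s.t. -t <= x <= t, Ax = f\n",
    "c = np.concatenate([np.zeros(m), np.ones(m)])\n",
    "I = np.eye(m)\n",
    "A_ub = np.vstack([np.hstack([I, -I]), np.hstack([-I, -I])])\n",
    "A_eq = np.hstack([A, np.zeros((n, m))])\n",
    "res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * m), A_eq=A_eq, b_eq=f,\n",
    "              bounds=(None, None))\n",
    "x_l1 = res.x[:m]\n",
    "\n",
    "print('nonzeros: L2 =', np.sum(np.abs(x_l2) > 1e-6),\n",
    "      ', L1 =', np.sum(np.abs(x_l1) > 1e-6))\n",
    "```"
   ]
  },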
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## What is a stable algorithm?\n",
    "\n",
    "And we finalize the lecture by the concept of **stability**.\n",
    "\n",
    "- Let $x$ be an object (for example, a vector) \n",
    "- Let $f(x)$ be the function (functional) you want to evaluate \n",
    "\n",
    "You also have a **numerical algorithm** ``alg(x)`` that actually computes **approximation** to $f(x)$.  \n",
    "\n",
    "The algorithm is called **forward stable**, if $$\\Vert alg(x) - f(x) \\Vert  \\leq \\varepsilon $$  \n",
    "\n",
    "The algorithm is called **backward stable**, if for any $x$ there is a close vector $x + \\delta x$ such that\n",
    "\n",
    "$$alg(x) = f(x + \\delta x)$$\n",
    "\n",
    "and $\\Vert \\delta x \\Vert$ is small."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Classical example\n",
    "A classical example is the **solution of linear systems of equations** using Gaussian elimination which is similar to LU factorization (more details later)\n",
    "\n",
    "We consider the **Hilbert matrix** with the elements\n",
    "\n",
    "$$A = \\{a_{ij}\\}, \\quad a_{ij} = \\frac{1}{i + j + 1}, \\quad i,j = 0, \\ldots, n-1.$$\n",
    "\n",
    "And consider a linear system\n",
    "\n",
    "$$Ax = f.$$\n",
    "\n",
    "We will look into matrices in more details in the next lecture, and for linear systems in the upcoming weeks"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {
    "scrolled": false,
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "34.458965526869264\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "n = 500\n",
    "a = [[1.0/(i + j + 1) for i in range(n)] for j in range(n)] # Hilbert matrix\n",
    "A = np.array(a)\n",
    "rhs =  np.random.random(n)\n",
    "sol = np.linalg.solve(A, rhs)\n",
    "print(np.linalg.norm(A.dot(sol) - rhs)/np.linalg.norm(rhs))\n",
    "#plt.plot(sol)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "1.8802880144962684e-07\n"
     ]
    }
   ],
   "source": [
    "rhs =  np.ones(n)\n",
    "sol = np.linalg.solve(A, rhs)\n",
    "print(np.linalg.norm(A.dot(sol) - rhs)/np.linalg.norm(rhs))\n",
    "#plt.plot(sol)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## More examples of instability\n",
    "\n",
    "How to compute the following functions in numerically stable manner?\n",
    "\n",
    "- $\\log(1 - \\tanh^2(x))$\n",
    "- $SoftMax(x)_j = \\dfrac{e^{x_j}}{\\sum\\limits_{i=1}^n e^{x_i}}$  "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Original function: -inf\n",
      "Attempt imporove stability with add small constant: -13.815510557964274\n",
      "Use more numerically stable form: -598.6137056388801\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/alex/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:3: RuntimeWarning: divide by zero encountered in log\n",
      "  This is separate from the ipykernel package so we can avoid doing imports until\n"
     ]
    }
   ],
   "source": [
    "u = 300\n",
    "eps = 1e-6\n",
    "print(\"Original function:\", np.log(1 - np.tanh(u)**2))\n",
    "eps_add = np.log(1 - np.tanh(u)**2 + eps)\n",
    "print(\"Attempt imporove stability with add small constant:\", eps_add)\n",
    "print(\"Use more numerically stable form:\", np.log(4) - 2 * np.log(np.exp(-u) + np.exp(u)))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[nan  0.  0.  0.  0.]\n",
      "[1. 0. 0. 0. 0.]\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/Users/alex/anaconda3/lib/python3.6/site-packages/ipykernel_launcher.py:4: RuntimeWarning: invalid value encountered in true_divide\n",
      "  after removing the cwd from sys.path.\n"
     ]
    }
   ],
   "source": [
    "n = 5\n",
    "x = np.random.randn(n)\n",
    "x[0] = 1000\n",
    "print(np.exp(x) / np.sum(np.exp(x)))\n",
    "print(np.exp(x - np.max(x)) / np.sum(np.exp(x - np.max(x))))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "slideshow": {
     "slide_type": "slide"
    }
   },
   "source": [
    "## Take home message\n",
    "- Floating point  (double, single, number of bytes), rounding error\n",
    "- Norms are measures of smallness, used to compute the accuracy\n",
    "- $1$, $p$ and Euclidean norms \n",
    "- $L_1$ is used in compressed sensing as a surrogate for sparsity (later lectures) \n",
    "- Forward/backward error (and stability of algorithms)  (later lectures)"
   ]
  }
 ],
 "metadata": {
  "anaconda-cloud": {},
  "celltoolbar": "Slideshow",
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.5"
  },
  "nav_menu": {},
  "toc": {
   "navigate_menu": true,
   "number_sections": false,
   "sideBar": true,
   "threshold": 6,
   "toc_cell": false,
   "toc_section_display": "block",
   "toc_window_display": false
  }
 },
 "nbformat": 4,
 "nbformat_minor": 1
}