{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes_classification.ipynb)" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "try:\n", " # Check if notebook is running in Google Colab\n", " import google.colab\n", " # Get additional files from Github\n", " !wget https://raw.githubusercontent.com/krasserm/bayesian-machine-learning/dev/gaussian-processes/gaussian_processes_util.py\n", "except:\n", " pass" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Gaussian processes for classification\n", "\n", "This article gives an introduction to Gaussian processes for classification and provides a minimal implementation with NumPy. Gaussian processes for regression are covered in a [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb?flush_cache=true) and a brief recap is given in the next section.\n", "\n", "## Regression recap\n", "\n", "A Gaussian process (GP) for regression is a [random process](https://en.wikipedia.org/wiki/Stochastic_process) where any point $\\mathbf{x} \\in \\mathbb{R}^d$ is assigned a random variable $f(\\mathbf{x})$ and where the joint distribution of a finite number of these variables $p(f(\\mathbf{x}_1),...,f(\\mathbf{x}_N))$ is itself Gaussian:\n", "\n", "$$\n", "p(\\mathbf{f} \\mid \\mathbf{X}) = \\mathcal{N}(\\mathbf{f} \\mid \\boldsymbol\\mu, \\mathbf{K})\n", "\\tag{1}\n", "$$\n", "\n", "A GP is a prior over functions whose shape (smoothness, ...) is defined by $\\mathbf{K} = \\kappa(\\mathbf{X}, \\mathbf{X})$ where $\\kappa$ is a parameteric kernel function. It is common to set $\\boldsymbol\\mu = \\mathbf{0}$. 
Given observed noisy function values $\\mathbf{y}$ at points $\\mathbf{X}$ we want to predict a noise-free function value $f_*$ at point $\\mathbf{x}_*$. The joint distribution of observed values $\\mathbf{y}$ and prediction $f_*$ is also a Gaussian:\n", "\n", "$$\n", "p(\\mathbf{y}, f_* \\mid \\mathbf{X},\\mathbf{x}_*) = \n", "\\mathcal{N} \\left(\n", "\\begin{pmatrix}\\mathbf{y} \\\\ f_*\\end{pmatrix} \\middle| \\ \\boldsymbol{0},\n", "\\begin{pmatrix}\\mathbf{K}_y & \\mathbf{k}_* \\\\ \\mathbf{k}_*^T & k_{**}\\end{pmatrix}\n", "\\right)\n", "\\tag{2}\n", "$$\n", "\n", "where $\\mathbf{K}_y = \\mathbf{K} + \\sigma_y^2\\mathbf{I}$, $\\mathbf{k}_* = \\kappa(\\mathbf{X},\\mathbf{x}_*)$ and $k_{**} = \\kappa(\\mathbf{x}_*,\\mathbf{x}_*)$. $\\sigma_y^2$ models noise in the observed function values $\\mathbf{y}$. Turning the joint distribution $(2)$ into a conditional distribution we obtain a predictive distribution\n", "\n", "$$\n", "p(f_* \\mid \\mathbf{x}_*, \\mathbf{X}, \\mathbf{y}) = \\mathcal{N}(f_* \\mid \\boldsymbol\\mu_*, \\boldsymbol\\Sigma_*)\n", "\\tag{3}\n", "$$\n", "\n", "with\n", "\n", "\n", "\\begin{align*}\n", "\\boldsymbol{\\mu_*} &= \\mathbf{k}_*^T \\mathbf{K}_y^{-1} \\mathbf{y}\\tag{4} \\\\\n", "\\boldsymbol{\\Sigma_*} &= k_{**} - \\mathbf{k}_*^T \\mathbf{K}_y^{-1} \\mathbf{k}_*\\tag{5}\n", "\\end{align*}\n", "\n", "\n", "In contrast to the notation in the [previous article](https://nbviewer.jupyter.org/github/krasserm/bayesian-machine-learning/blob/dev/gaussian-processes/gaussian_processes.ipynb?flush_cache=true), I'm using here a single test input $\\mathbf{x}_*$ to be consistent with the notation in the following sections. 
However, the implementation further below is vectorized so that predictions can be made for multiple test inputs $\\mathbf{X}_*$ in a single operation.\n", "\n", "\n", "## Binary classification\n", "\n", "In the context of binary classification, we have a discrete target variable $t \\in \\{0, 1 \\}$ that follows a Bernoulli distribution. We are interested in the probability $p(t=1 \\mid a) = \\sigma(a)$ where $\\sigma$ is the logistic sigmoid function taking logit $a \\in \\mathbb{R}$ as argument. $p(t=0 \\mid a)$ is given by $1 - p(t=1 \\mid a)$.\n", "\n", "### Predictive distribution\n", "\n", "Given observed targets $\\mathbf{t}$ at points $\\mathbf{X}$, our goal is to predict target $t_*$ at point $\\mathbf{x}_*$ using the predictive distribution $p(t_*=1 \\mid \\mathbf{x}_*, \\mathbf{X}, \\mathbf{t})$. Making the conditioning on input variables implicit, the notation of the predictive distribution simplifies to $p(t_*=1 \\mid \\mathbf{t})$.\n", "\n", "In the following, I will only outline the high-level steps needed to derive an expression for the predictive distribution and only present those results that are relevant for a minimal implementation. A detailed derivation of these results is given in [1], [2] and [3]. The predictive distribution can be defined as:\n", "\n", "$$\n", "p(t_*=1 \\mid \\mathbf{t}) = \\int{p(t_*=1 \\mid a_*) p(a_* \\mid \\mathbf{t}) d a_*}\n", "\\tag{6}\n", "$$\n", "\n", "This integral is analytically intractable. Two approximations are needed to make it tractable. First, $p(a_* \\mid \\mathbf{t})$ must be approximated with a Gaussian distribution. Second, $p(t_*=1 \\mid a_*) = \\sigma(a_*)$ must be approximated with the inverse probit function $\\Phi(a_*)$ (see also [2], section 4.3.5). 
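As a quick numerical sanity check (a sketch, not part of the derivation), we can verify that the logistic sigmoid is closely matched by $\Phi(\lambda a)$ with $\lambda^2 = \pi / 8$, the scaling that matches the slopes of both functions at $a = 0$:

```python
import numpy as np
from scipy.special import expit as sigmoid
from scipy.stats import norm

# Compare sigmoid(a) with the probit approximation Phi(lambda * a),
# where lambda^2 = pi / 8 matches the slope of the sigmoid at a = 0.
a = np.linspace(-6, 6, 601)
lam = np.sqrt(np.pi / 8)
max_abs_err = np.max(np.abs(sigmoid(a) - norm.cdf(lam * a)))
print(max_abs_err)  # small, on the order of 0.02
```

The approximation error stays small over the whole range, which is why swapping $\sigma$ for $\Phi$ changes the predictive probabilities only slightly.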
Let's start with $p(a_* \\mid \\mathbf{t})$ which can be defined as:\n", "\n", "$$\n", "p(a_* \\mid \\mathbf{t}) = \\int{p(a_* \\mid \\mathbf{a}) p(\\mathbf{a} \\mid \\mathbf{t}) d\\mathbf{a}}\n", "\\tag{7}\n", "$$\n", "\n", "The first term inside the integral, $p(a_* \\mid \\mathbf{a})$, is a Gaussian distribution that can be obtained using a GP for regression. The joint distribution over logits $p(\\mathbf{a}, a_* \\mid \\mathbf{X}, \\mathbf{x}_*)$ is given by Equation $(2)$ which can be turned into a conditional distribution $p(a_* \\mid \\mathbf{x}_*, \\mathbf{X}, \\mathbf{a})$ using Equation $(3)$. Making the conditioning on input variables implicit gives $p(a_* \\mid \\mathbf{a})$. Instead of $\\mathbf{K}_y$ used in Equation $(2)$ we use $\\mathbf{K}_a = \\mathbf{K} + \\sigma_a^2\\mathbf{I}$. Assuming that training data are correctly labeled, we could set the noise parameter $\\sigma_a^2$ to zero, but for reasons of numerical stability it is convenient to set it to a small value. Using the Laplace approximation (see [2], section 4.4), the posterior distribution $p(\\mathbf{a} \\mid \\mathbf{t})$ can be approximated with a Gaussian distribution $q(\\mathbf{a})$:\n", "\n", "$$\n", "q(\\mathbf{a}) = \\mathcal{N}(\\mathbf{a} \\mid \\hat{\\mathbf{a}}, \\mathbf{H}^{-1})\n", "\\tag{8}\n", "$$\n", "\n", "where $\\mathbf{H} = \\mathbf{W} + \\mathbf{K}_a^{-1}$. $\\mathbf{W}$ is a diagonal matrix with elements $\\sigma(a_n)(1 - \\sigma(a_n))$ with $a_n$ being the elements of $\\mathbf{a}$. Written in vector notation the diagonal is $\\boldsymbol\\sigma(\\mathbf{1}-\\boldsymbol\\sigma)$. The mean $\\hat{\\mathbf{a}}$ can be obtained iteratively with the following update equation:\n", "\n", "$$\n", "\\mathbf{a}^{\\text{new}} = \\mathbf{K}_a (\\mathbf{I} + \\mathbf{W}\\mathbf{K}_a)^{-1}(\\mathbf{t} - \\boldsymbol\\sigma + \\mathbf{W}\\mathbf{a})\n", "\\tag{9}\n", "$$\n", "\n", "At convergence $\\hat{\\mathbf{a}} = \\mathbf{a}^{\\text{new}}$. 
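The iteration of Equation $(9)$ can be sketched as follows. This is a minimal sketch; `posterior_mode` and its argument names are illustrative, not the names used in the implementation below:

```python
import numpy as np
from scipy.special import expit as sigmoid

def posterior_mode(K_a, t, max_iter=100, tol=1e-9):
    """Iterate Equation (9) until the mode a_hat of the Laplace
    approximation q(a) stops changing. t contains the 0/1 targets."""
    a = np.zeros_like(t, dtype=float)
    I = np.eye(len(t))
    for _ in range(max_iter):
        s = sigmoid(a)
        W = np.diag(s * (1 - s))
        # a_new = K_a (I + W K_a)^{-1} (t - sigma + W a), Equation (9)
        a_new = K_a @ np.linalg.solve(I + W @ K_a, t - s + W @ a)
        converged = np.max(np.abs(a_new - a)) < tol
        a = a_new
        if converged:
            break
    return a
```

At the mode, setting the gradient of the log posterior to zero gives $\hat{\mathbf{a}} = \mathbf{K}_a(\mathbf{t} - \boldsymbol\sigma)$, which provides a convenient convergence check.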
With two Gaussians inside the integral of Equation $(7)$ the result is also a Gaussian and can be obtained analytically. The Gaussian approximation of $p(a_* \\mid \\mathbf{t})$ is therefore given by\n", "\n", "\n", "$$\n", "p(a_* \\mid \\mathbf{t}) \\approx \\mathcal{N}(a_* \\mid \\mu_{a_*}, \\sigma_{a_*}^2)\n", "\\tag{10}\n", "$$\n", "\n", "with \n", "\n", "\n", "\\begin{align*}\n", "\\mu_{a_*} &= \\mathbf{k}_*^T(\\mathbf{t} - \\boldsymbol\\sigma)\\tag{11} \\\\\n", "\\sigma_{a_*}^2 &= k_{**} - \\mathbf{k}_*^T (\\mathbf{W}^{-1} + \\mathbf{K}_a)^{-1} \\mathbf{k}_*\\tag{12}\n", "\\end{align*}\n", "\n", "\n", "Finally, we approximate $p(t_*=1 \\mid a_*)$ in Equation $(6)$ with the inverse probit function $\\Phi(a_*)$ so that the predictive distribution can be approximated with:\n", "\n", "$$\n", "p(t_*=1 \\mid \\mathbf{t}) \\approx \\sigma(\\mu_{a_*} (1 + \\pi\\sigma_{a_*}^2 / 8)^{-1/2})\n", "\\tag{13}\n", "$$\n", "\n", "See [2], section 4.5.2 for further details about the probit approximation.\n", "\n", "### Kernel parameter optimization\n", "\n", "Kernel parameters $\\boldsymbol\\theta$ can be optimized by maximizing the log marginal likelihood $\\log p(\\mathbf{t} \\mid \\boldsymbol\\theta)$. Using again the Laplace approximation $q(\\mathbf{a})$ for the posterior $p(\\mathbf{a} \\mid \\mathbf{t})$, the log marginal likelihood can be approximated with:\n", "\n", "$$\n", "\\log p(\\mathbf{t} \\mid \\boldsymbol\\theta) \\approx \n", "\\mathbf{t}^T \\hat{\\mathbf{a}}\n", "-\\frac{1}{2} \\hat{\\mathbf{a}}^T \\mathbf{K}_a^{-1} \\hat{\\mathbf{a}}\n", "-\\frac{1}{2} \\log \\begin{vmatrix}\\mathbf{K}_a\\end{vmatrix}\n", "-\\frac{1}{2} \\log \\begin{vmatrix}\\mathbf{W} + \\mathbf{K}_a^{-1}\\end{vmatrix}\n", "- \\sum_{n=1}^{N} \\log(1 + e^{\\hat{a}_n})\n", "\\tag{14}\n", "$$\n", "\n", "where $N$ is the size of the training dataset. 
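Equation $(14)$ can be sketched directly in code. This is a minimal sketch under the assumption that $\hat{\mathbf{a}}$ has already converged; `log_marginal_likelihood` is an illustrative name, and numerically more robust formulations exist:

```python
import numpy as np
from scipy.special import expit as sigmoid

def log_marginal_likelihood(K_a, t, a_hat):
    """Approximate log marginal likelihood, Equation (14).
    a_hat is the converged mode obtained via Equation (9)."""
    s = sigmoid(a_hat)
    W = np.diag(s * (1 - s))
    return (t @ a_hat                                          # t^T a_hat
            - 0.5 * a_hat @ np.linalg.solve(K_a, a_hat)        # quadratic term
            - 0.5 * np.linalg.slogdet(K_a)[1]                  # -1/2 log|K_a|
            - 0.5 * np.linalg.slogdet(W + np.linalg.inv(K_a))[1]
            - np.sum(np.log1p(np.exp(a_hat))))                 # -sum log(1 + e^a)
```

Maximizing this quantity with respect to $\boldsymbol\theta$ (which enters through $\mathbf{K}_a$) is what the optimization step below does.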
Since $\\hat{\\mathbf{a}}$ also depends on $\\boldsymbol\\theta$, $\\hat{\\mathbf{a}}$ must be re-estimated using Equation $(9)$ at each optimization step.\n", "\n", "## Multi-class classification\n", "\n", "For multi-class classification, one option is to use several binary one-versus-rest classifiers. This approach, for example, is taken by [GaussianProcessClassifier](https://scikit-learn.org/stable/modules/generated/sklearn.gaussian_process.GaussianProcessClassifier.html) of scikit-learn and can also be applied to the implementation presented here. An extension to true multi-class classification is also possible [1],[3] but not covered in this article.\n", "\n", "## Implementation with NumPy\n", "\n", "This is the minimum we need to know for implementing GPs for binary classification. The following implementation vectorizes the computation of $p(t_*=1 \\mid \\mathbf{t})$ so that we can make predictions at multiple test inputs `X_test` in a single operation. The results are compared with those of GaussianProcessClassifier. The implementation presented here directly follows the equations defined above, which works well in many cases. Numerically more stable implementation options are described in [1] and [3]. These are also used by GaussianProcessClassifier, for example." 
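As a preview, Equations $(11)$-$(13)$ for a single test input translate almost directly into code. This is a minimal sketch with illustrative names; the vectorized implementation follows below:

```python
import numpy as np
from scipy.special import expit as sigmoid

def predict_pt(k_s, k_ss, K_a, t, a_hat):
    """Approximate p(t_*=1 | t) for one test input.
    k_s = kappa(X, x_*), k_ss = kappa(x_*, x_*), a_hat is the Laplace mode."""
    s = sigmoid(a_hat)
    W_inv = np.diag(1.0 / (s * (1 - s)))
    mu = k_s @ (t - s)                                      # Equation (11)
    var = k_ss - k_s @ np.linalg.solve(W_inv + K_a, k_s)    # Equation (12)
    return sigmoid(mu / np.sqrt(1.0 + np.pi * var / 8.0))   # Equation (13)
```

Note that Equation $(13)$ is written here in the equivalent form $\sigma\big(\mu_{a_*} / \sqrt{1 + \pi\sigma_{a_*}^2/8}\big)$.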
] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import matplotlib.pyplot as plt\n", "\n", "from scipy.optimize import minimize\n", "from scipy.stats import bernoulli\n", "from scipy.special import expit as sigmoid\n", "\n", "from sklearn.datasets import make_moons\n", "from sklearn.gaussian_process import GaussianProcessClassifier\n", "from sklearn.gaussian_process.kernels import ConstantKernel, RBF\n", "\n", "from gaussian_processes_util import (\n", " plot_data_1D,\n", " plot_data_2D,\n", " plot_pt_2D,\n", " plot_db_2D)\n", "\n", "%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 1D dataset\n", "\n", "Let's start with a simple 1D training dataset where logits $\\mathbf{a}$ are given by a sine function. Target values $\\mathbf{t}$ are sampled from a corresponding Bernoulli distribution. In the following we still assume that training data are correctly labeled." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAXsAAAEWCAYAAACHVDePAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/d3fzzAAAACXBIWXMAAAsTAAALEwEAmpwYAAAbeklEQVR4nO3de3RU5b3/8fcXguRU0SAoIhHDxaMeokGIt6NQijdEjleqyNJ6Q2utHqptf5bWdaQVa122Im1P67FqFasFtfjrWepPpa1WtLYlQULxgKgcqEG5xYLcL8n398czkVwnZJjMTOb5vNaalcye/ez9fXZmPrP3Mzt7zN0REZH81iXbBYiISMdT2IuIREBhLyISAYW9iEgEFPYiIhFQ2IuIREBhL3nNzN4xs1HpnndfmFmJmbmZFXT0ukTqKewlLczsZjOrMLMdZvZYk8dGmVmdmW1O3KrN7GkzOzHJ8tISiO4+xN1fS/e8mZLYdtX5sh7JHoW9pMtHwDTg0dYed/cDgB7AKcBSYJ6ZnZHqCrVnLLL3FPaSFu4+x93/L1DTxnzu7tXu/h/Aw8C9rcz6euLnhsTRwKlmdrWZvWlm082sBphqZoPM7A9mVmNm683sSTMrql+Ima0wszMTv09NHFHMNLNNiWGb8hTnHWZmbycee8bMZpvZtJY6YmZdzeyHifqWA+c1efwaM1uSWNZyM/tyYvr+wP8DDm9wVHS4mZ1kZm+Z2QYz+9jMfmpm+yXaWGL7rDWzT83sb2ZWmnise6KOv5vZGjN70Mz+qbX1JPs7SuejsJdsmgMMS4RNUyMTP4vc/QB3fytx/2RgOdAHuBsw4B7gcOBY4AhgapJ1ng/MAoqA/wZ+2t55E8H6HPAYcDDwa+CiJMu5HhgHnACUA+ObPL428fiBwDXAdDMb5u5bgHNJHBUlbh8BtcCtQG/gVOAM4KbEss4mbLt/Bg4CLmXPG/APEtOHAoOBfsB/JFmP5BGFvWTTR4SwLmpPG3f/ibvvdvdt7v6+u8919x3uvg64H/h8kvZvuPuL7l4LPAGUpTDvKUAB8GN33+Xuc4C/JlnOpcAD7v6hu39CeHP6jLu/4O4fJI56/gi8AoxobWHuXunuf05sgxXAfzXo8y7CUNkxgLn7Enf/2MwMuAG41d0/cfdNwPeBCUnqljyiMU/Jpn6AAxva0ebDhnfMrA8wgxCOPQg7MP9I0n51g9+3AoVmVuDuu/d2XsJRxCpvfBXBRnU1cXiTx1c26cO5wJ2Eve4uwOeAv7W2MDP7Z8KbWnli3gKgEsDd/2BmPwX+EzjSzOYA3wAKE/NWhtwPiwK6Jqlb8oj27CWbLgIWJIYRmmrtcqxNp38/Me04dz8QuIIQYh3pY6CfNUhNwvBRsvkbPt6//hcz6w78Bvgh0Mfdi4AX2dOHlrbDzwkfcB+V6PO3G8yPu//Y3YcD/0J4A/kmsB7YBgxx96LE7aDEh+atrUfyiMJe0sLMCsyskLCn2NXM6veCm85nZtbPzO4EJhGCqiXrgDpgYBur7gFsBjaaWT9CsHW0twjj5jcn+n0BcFKS+Z8G/t3Mis2sJ/CtBo/tB3Qn9Hd3Yi//7AaPrwF6mdlBDab1AD4FNpvZMcBX6h8wsxPN7GQz6wZsAbYDde5eB/yC8HnAoYl5+5nZOUnWI3lEYS/pcgdhz/FbhL3rbYlp9Q43s82EYJ4PHAeMcvdXWlqYu28lfAD7ZuKsk1NaWe93gWHARuAFwoe+HcrddwIXA9cRhqCuAJ4HdrTS5BfAy0AVsKBhjYmx838nvCH8A5hI+DC4/vGlhA+Alye2w+GEYZmJwKbEsmc3WNeBiWn/IAwX1QD3JR67HXgf+LOZfQr8Djg6yXokj5i+vERk35nZX4AH3f2X2a5FpCXasxdJgZl93swOSwzjXAUcD7yU7bpEWqOzcURSczRh6GV/wnn/49394+yWJNI6DeOIiERAwzgiIhHIyWGc3r17e0lJSbbLEBHpNCorK9e
7+yGtPZ6TYV9SUkJFRUW2yxAR6TTMbGWyxzWMIyISAYW9iEgEFPYiIhFQ2IuIRCB/wr7p/wt0xP8PpLKOTLTJVF2ZkKt1pSKf+tJeufq8T0WevB47POzN7NHEV6Qt7rCVTJ0Kt966ZwO5h/tTp2Z3HZlok6m6MiFX60pFPvWlvXL1eZ+KPHo9ZmLP/jFgTIct3R02bIAZM/ZssFtvDfc3bEjPO2Qq68hEm0zVlQm5Wlcq8qkv7ZWrz/tM9CXXX4/u3uE3oARYvLfzDx8+3Nulrs598mT3sGnCbfLkMD1dUllHJtpkqq5MyNW6UpFPfWmvXH3eZ6IvWXw9AhWeJFczcm0cMysBnnf30iTz3ED4jkz69+8/fOXKpP8f0Jw7dGlwoFJXB42+SCgNUllHJtpkqq5MyNW6UpFPfWmvXH3ep6KTvB7NrNLdy1t7PGc+oHX3h9y93N3LDzmk1f/4ba1xOPRpqOEYWHoKbP86MtEmU3VlQq7WlYp86kt75erzPhX59HpMttufrhsdOYzT8BCo/tCn6f19lco6MtEmU3VlQq7WlYp86kt75erzPhN9yfLrkTaGcXLy2jjtYgZFRTB5MkyfHu5Pnx4eKypKz2FdKuvIRJtM1ZUJuVpXKvKpL+2Vq8/7TPQlx1+PHT5mb2a/BkYBvQlfanynuz+SrE15ebm3+0Jo7o03TNP76ZDKOjLRJlN1ZUKu1pWKfOpLe+Xq8z4VneT12NaYfYfv2bv75R29DqDld9lcWEcm2mSqrkzI1bpSkU99aa9cfd6nIk9ejznzAa2IiHQchb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhIBhb2ISAQU9iIiEVDYi4hEQGEvIhKBgmwXICLSkl27dlFdXc327duzXUpOKSwspLi4mG7durWrncJeRHJSdXU1PXr0oKSkBDPLdjk5wd2pqamhurqaAQMGtKuthnFEJCdt376dXr16KegbMDN69eqV0tGOwl5EcpaCvrlUt4nCXkSkBatXr2bChAkMGjSI4cOHM3bsWJYtW8aKFSsoLS3tkHW+/vrrDBs2jIKCAp599tm0LlthLyL5wT35/XYtyrnooosYNWoUH3zwAZWVldxzzz2sWbNmH4tMrn///jz22GNMnDgx7ctW2ItI5zd1Ktx6656Adw/3p05NaXGvvvoq3bp148Ybb/xsWllZGSNGjGg034oVKxgxYgTDhg1j2LBh/OlPfwLg448/ZuTIkQwdOpTS0lLmzZtHbW0tV199NaWlpRx33HFMnz692XpLSko4/vjj6dIl/dGss3FEpHNzhw0bYMaMcH/69BD0M2bA5Mnh8XaOcy9evJjhw4e3Od+hhx7K3LlzKSws5L333uPyyy+noqKCp556inPOOYfvfOc71NbWsnXrVhYuXMiqVatYvHgxABs2bGhnR/eNwl5EOjezEPAQAr4+9CdPDtM78EPeXbt2cfPNN7Nw4UK6du3KsmXLADjxxBO59tpr2bVrFxdeeCFDhw5l4MCBLF++nFtuuYXzzjuPs88
+u8PqaomGcUSk82sY+PX2IeiHDBlCZWVlm/NNnz6dPn36UFVVRUVFBTt37gRg5MiRvP766/Tr14+rr76amTNn0rNnT6qqqhg1ahQPPvggkyZNSqm2VCnsRaTzqx+jb6jhGH47jR49mh07dvDQQw99Nm3RokXMmzev0XwbN26kb9++dOnShSeeeILa2loAVq5cSZ8+fbj++uuZNGkSCxYsYP369dTV1XHJJZcwbdo0FixYkFJtqVLYi0jnVh/09WP0dXXh54wZKQe+mfHcc8/xu9/9jkGDBjFkyBCmTJnCYYcd1mi+m266iccff5yysjKWLl3K/vvvD8Brr71GWVkZJ5xwArNnz2by5MmsWrWKUaNGMXToUK644gruueeeZuudP38+xcXFPPPMM3z5y19myJAhqW2Tlvrk+3B6UkcpLy/3ioqKbJchIlm0ZMkSjj322L2beerU8CFt/dBN/RtAUVHKZ+Tkspa2jZlVunt5a230Aa2IdH5TpzY+66Z+DF//gfsZDeOISH5oGuwK+kb2OuzNrKQD6xARkQ7Unj37OU0nmNkpaaxFREQ6SJthb2aXmtkPgB5mdqyZNWzzUGvtREQkd+zNB7RvAoXAJOB+4Ggz2wB8BGzruNJERCRd2tyzd/dV7j4TuMDdz3X3gcBZwJ3A6I4uUEQkG7JxieMdO3Zw2WWXMXjwYE4++WRWrFiRtmXv9amX7v5mg99rgJq0VSEisg+2boWnn4alS+GYY+DSS+Fzn0t9efWXOL7qqquYNWsWAFVVVaxZs4YjjjgiTVU398gjj9CzZ0/ef/99Zs2axe23387s2bPTsmydeikinVp1NZSVwTPPwIEHhp9lZWF6qrJ1iePf/va3XHXVVQCMHz+e3//+96TrH1/1T1Ui0qndfjtMmAB33bVn2h13hOlPPpnaMrN1ieNVq1Z9duRQUFDAQQcdRE1NDb17906tIw0o7EWkU3vuueZ78bfdBsXFHb9uXeJYRCRDCgogcWXhz+zcGaanKluXOO7Xrx8ffvghALt372bjxo306tUr9Y40oLAXkU7tsstg2rTG30h4111heqqydYnj888/n8cffxyAZ599ltGjR2NpuuyDhnFEpFP7wQ/gnHOgvBxOPx3eeAO6dIGXX059mfWXOP7a177GvffeS2FhISUlJTzwwAON5rvpppu45JJLmDlzJmPGjGl0ieP77ruPbt26ccABBzBz5kxWrVrFNddcQ11dHUCLlzi+7rrruPLKKxk8eDAHH3zwZ2cCpYMucSwiOak9lziurYW5c2HJEjj2WDj77BD4+UqXOBaRKHXtCmPGhJu0LI/f+0REpJ7CXkQkAgp7EclZufiZYraluk0U9iKSkwoLC6mpqVHgN+Du1NTUUFhY2O62+oBWRHJScXEx1dXVrFu3Ltul5JTCwkKKU/j3YIW9iOSkbt26MWDAgGyXkTc0jCMiEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiERAYS8iEgGFvYhIBBT2IiIRUNiLiESgINsFZMvGjfDEE7B
sGQwZAhMnQo8e2a4qd23ZArNmQVUVDBoEV14JBx+cvM3KlTBzJqxfD6NGwb/9GxSk+Rm3fTvMng2VlXDkkXDVVdC7d/I21dWhrjVr4LTT4KKLoFu35G3eegvmzIEuXeCLX4Ty8vT1od7q1aGuVavg5JNh/HjYb7/kbSoq4JlnoK4OLr4YTj01+fy7dsFzz8Gbb8Jhh8GXvgT9+iVvs25dqGvlShg+HC67DAoL29e3XPHOO/DUU7BtG4wbB1/4Aphlu6rMyMievZmNMbN3zex9M/tWJtaZzAcfwHHHwbx5MHAgvPwylJWFEJDmVq+GYcNCSAwcCPPnQ2kpLFnSepsXXwzBsG4dHHEE3HMPjB0LO3akr66aGjjppPDiHTAAFi8Odb39duttXn0Vhg4Nf+v+/WHGDDjjDNi6tfU2U6bA5ZeHnYHu3eHCC+Huu9PXDwhvJscfD++9ByUl8ItfwOmnw6eftt5m2rRQS2EhHHAATJgA3/526/Nv3QqjR4c+9+8Pf/97eN6/9lrrbd5+O2zTxYvDNn7yyfBG9MknKXY0ix56KPS/ri7sENx0E1x/Pbhnu7IMcfcOvQFdgQ+AgcB+QBXwL8naDB8+3DvSBRe433tv42l33OF+xRUdutpO64Yb3G+7rfG0n/zE/ayzWp5/5073ww93nzdvz7Tdu93PPNP9Zz9LX1233RZqq6vbM+2Xv3Q/9dSW56+tdR840P2llxpPu+AC9/vua7nN22+79+vnXlOzZ9rq1e6HHOK+bNk+diChrs69tNT92WcbT5s40f3OO1tu8+67oYbVq/dMq6kJ233hwpbb3Huv+4UXhj7Xe/FF98GDG09r6JRT3B9/vHFd11/v/vWv71XXcsbate5FRe4ffLBn2pYt7sce6/7yy9mrK52ACk+WxckeTMcNOBV4ucH9KcCUZG06Muxra90LCtw3bWo8ffVq9wMP7LDVdmqHHea+fHnjadu2uRcWhhdMU2+95V5W1nz6nDnuY8akr67Bg90XLWo8bdeu8Hdct675/IsWhTYN3xzc3V95xf3001tex3e/6/7NbzaffuON7vffn1rdTS1f7t63b/O63nzT/YQTWm7zox+5f+Urzad/4xvu3/tey21OO8197tzG0+rqwhvg4sXN51+71v2gg8IbdUNVVe5HHdXyOnLVr37lfvHFzaffd5/7V7+a+Xo6Qlthn4lhnH7Ahw3uVyemNWJmN5hZhZlVrFu3rsOKMQvjoNu2NZ6+ZUs4RJfmundvPsyxY0fYll27tj5/08PjdG/jlurauTMcprc0Bt+9e/i7t1RXa2PQLa2jvk26+tK9e9ieu3fv/TqS1dWevtTVhW3S0nq6dYPa2rBN93YduSoTf8ecl+ydIB03YDzwcIP7VwI/Tdamo4dxrr027BXV70nV1rp/6Uvukyd36Go7rSlT3MeP37OHV1cXDuMnTGh5/rq6cHj8xBN7pm3cGPb2Z89OX1133+1+3nlh2KjenXe6jxvXel3l5e4///meaZs3u598svujj7bcZvly94MPDsMm9aqq3Hv2bDyEsq8+/3n3H/5wz/1t29xHjXL/8Y9bnv+jj0INDY9sli4Ntf7v/7bc5uGHwxBXw6Oxn/3M/cQTW6/rvPPcp07dc3/nTvexY92///22epRbNm1y79Wr8dBidXUY9po/P3t1pRMaxmmupiaMRZaWuk+a5H7MMeGF9emnHbraTmvLFvdzznEfNChsr6FD3YcNc1+zpvU2Cxe6Fxe7f+EL7ldf7X7ooe633NJ8qGJfbN8exqBLStyvuy6EVmlpeBG3ZsmSMP+IEe7XXBOGTyZNan3M2j18DlBU5P7FL4ahgKKi9L5puYex5KOOCmF87bVh202cGIalWjNrVqjl4otDbUVF7o891vr8u3eHZfftG/o+YoT7gAHhTaI1H37oPmRI2LbXXed+5JHuF13kvmNHyl3NmldeCYE/blzYtj17Nn6D7ezaCnvzpse0aWZmBcAy4Ax
gFTAfmOju77TWpry83CsqKjq0Lnf44x/h3XfD2Qb/+q/xnIKVCnf4y19g0aJwRs7o0eE0xGR27Ahn5axfDyNHwtFHd0xtFRXh1MuSEjjzzJaHlhrauRNeeimcZXTaaeHU27asWwcvvBD6PG5c26edpmL37nBmWP2pl2Vlbbf55BN4/vnw9xk7Fg45pO0277yz59TLc89t+7TT2lqYOzecelleHs6y6qw2bQrba+tWGDOm7dNOOxMzq3T3Vk8K7vCwTxQxFniAcGbOo+6e9MS1TIS9iEg+aSvsM/JPVe7+IvBiJtYlIiLN6XIJIiIRUNiLiERAYS8iEgGFvYhIBDJyNk57mdk6YGWKzXsD69NYTmcSc98h7v6r7/Gq7/+R7t7qybc5Gfb7wswqkp1+lM9i7jvE3X/1Pc6+w973X8M4IiIRUNiLiEQgH8P+oWwXkEUx9x3i7r/6Hq+96n/ejdmLiEhz+bhnLyIiTSjsRUQikDdhn2tfap5JZvaoma01s8XZriXTzOwIM3vVzP7HzN4xs8nZrimTzKzQzP5qZlWJ/n832zVlmpl1NbO3zez5bNeSSWa2wsz+ZmYLzazNywTnxZi9mXUlXDP/LMLXHs4HLnf3/8lqYRliZiOBzcBMdy/Ndj2ZZGZ9gb7uvsDMegCVwIUR/e0N2N/dN5tZN+ANYLK7/znLpWWMmd0GlAMHuvu4bNeTKWa2Aih39736h7J82bM/CXjf3Ze7+05gFnBBlmvKGHd/Hfgk23Vkg7t/7O4LEr9vApbQwncc56vElxRtTtztlrh1/j24vWRmxcB5wMPZriXX5UvY79WXmkt+M7MS4ATgL1kuJaMSwxgLgbXAXHePqf8PAP8HqMtyHdngwCtmVmlmN7Q1c76EvUTOzA4AfgN8zd0/zXY9meTute4+FCgGTjKzKIbyzGwcsNbdK7NdS5ac7u7DgHOBryaGc1uVL2G/Cjiiwf3ixDSJQGKs+jfAk+4+J9v1ZIu7bwBeBcZkuZRMOQ04PzF2PQsYbWa/ym5JmePuqxI/1wLPEYazW5UvYT8fOMrMBpjZfsAE4L+zXJNkQOIDykeAJe5+f7bryTQzO8TMihK//xPhJIWlWS0qQ9x9irsXu3sJ4TX/B3e/IstlZYSZ7Z84IQEz2x84G0h6Nl5ehL277wZuBl4mfED3tLu/k92qMsfMfg28BRxtZtVmdl22a8qg04ArCXt1CxO3sdkuKoP6Aq+a2SLCTs9cd4/qFMRI9QHeMLMq4K/AC+7+UrIGeXHqpYiIJJcXe/YiIpKcwl5EJAIKexGRCCjsRUQioLAXEYmAwl5EJAIKexGRCCjsRfZC4pr5ZyV+n2ZmP8l2TSLtUZDtAkQ6iTuB75nZoYQra56f5XpE2kX/QSuyl8zsj8ABwKjEtfNFOg0N44jsBTM7jnAdmp0KeumMFPYibUh89eGThG8/22xmsVxCWPKIwl4kCTP7HDAH+Lq7LwHuIozfi3QqGrMXEYmA9uxFRCKgsBcRiYDCXkQkAgp7EZEIKOxFRCKgsBcRiYDCXkQkAv8fgNuJwJ8bAzMAAAAASUVORK5CYII=\n", "text/plain": [ "