{ "metadata": { "name": "02_Linear_Regression" }, "nbformat": 3, "nbformat_minor": 0, "worksheets": [ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "

\n", "
\n", "
• Bishop: 3.1.0-3.1.4\n", "
• Ng: Lecture 2 pdf, page 4, LMS algorithm\n", "
• Ng: Lecture 2 pdf, page 13, Locally weighted linear regression\n", "
• Bishop: 3.3.0-3.3.2\n", "
\n", "

Regression: Given the value of a $D$-dimensional input vector $\\mathbf{x}$, predict the value of one or more target variables.

\n", "

Linear: The models discussed in this section are linear with respect to the adjustable parameters, not \n", " necessarily with respect to the input variables.

" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

# Creating A Model

\n", "In this notebook, our objective is to construct models that can predict the value of some target variable, $t$, given some \n", "input vector, $\\mathbf{x}$, where the target value can occupy any value in some space - though here we'll only consider the space of \n", "real valued vectors. We want the models to allow for uncertainty in the accuracy of the model and/or noise on the observed data. \n", "We also want the model to provide some information on our confidence in a given prediction. \n", "\n", "The first step is to contruct a mathematical model that adequately represents the observations we wish to predict. \n", "The model we will use is described in the next two subsections. It is **important to note** that the model itself is independent \n", "of the use of a frequentist or Bayesian viewpoint. It is *how we obtain the free parameters* of the model that is affected by using\n", "frequentist or Bayesian approaches. However, if the model is a poor choice for a particular observation, then its predictive \n", "capability is likely to be poor whether we use a frequentist or Bayesian approach to obtain the parameters." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

## Gaussian Noise: Model Assumption 1

\n", "We will *assume* throughout this notebook that the target variable is described by

\n", " $t = y(\\mathbf{x},\\mathbf{w}) + \\epsilon$\n", "

\n", "where $y(\\mathbf{x},\\mathbf{w})$ is an as of yet undefined function of $\\mathbf{x}$ and $\\mathbf{w}$ and $\\epsilon$ is a Gaussian distributed noise component. \n", "\n", "**Gaussian Noise?** The derivations provided below all assume Gaussian noise on the target data. Is this a good assumption? In many cases yes. The argument hinges\n", "on the use of the [Central_Limit_Theorem](http://en.wikipedia.org/wiki/Central_limit_theorem) that basically says the the **sum** of many independent random\n", "variables behaves behaves like a Gaussian distributed random variable. The _noise_ term in this model, $\\epsilon$, can be thought of as the sum of features\n", "not included in the model function, $y(\\mathbf{x},\\mathbf{w})$. Assuming these features are themselves independent random variables then the Central Limit Theorom suggests a Gaussian model \n", "is appropriate, assuming there are many independent unaccounted for features. It is possible that there is only a small number of unaccounted for features\n", "or that there is genuine _non-Gauisian_ noise in our observation measurements, e.g. sensor shot noise that often has a Poisson distribution. In such cases, the assumption is no longer valid." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

## General Linear Model: Model Assumption 2

\n", "In order to proceed, we need to define a model for $y(\\mathbf{x},\\mathbf{w})$. We will use the *general linear regression* model defined as follows

\n", " $y(\\mathbf{x},\\mathbf{w}) = \\sum_{j=0}^{M-1} w_j\\phi_j(\\mathbf{x}) = \\mathbf{w}^T\\mathbf{\\phi}(\\mathbf{x})$

\n", " where $\\mathbf{x}$ is a $D$ dimensional input vector, $M$ is the number of free parameters in the model, $\\mathbf{w}$ is a column \n", "vector of the free parameters, and \n", "$\\phi(\\mathbf{x}) = \\\\{\\phi_0(\\mathbf{x}),\\phi_1(\\mathbf{x}), \\ldots,\\phi_{M-1}(\\mathbf{x})\\\\}$ with $\\phi_0(\\mathbf{x})=1$ is a set of basis functions where \n", " each $\\phi_i$ is in the real valued function space \n", " $\\\\{f \\in \\mathbf{R}^D\\Rightarrow\\mathbf{R}^1\\\\}$. It is important to note that the set of basis functions, $\\phi$, need\n", " not be linear with respect to $\\mathbf{x}$. Further, note that this model defines an entire class of models. In order to \n", " contruct an actual predictive model for some observable quantity, we will have to make a further assumption on the choice of the\n", " set of basis functions, $\\phi$. However, for the purposes of deriving general results, we can delay this choice.\n", "\n", "Note that that $\\mathbf{w}^T$ is an $1 \\times M$ vector and that $\\mathbf{\\phi}(\\mathbf{x})$ is a $M \\times 1$ vector so that the target, $y$ \n", " is a scalar. This will be exteneded to $K$ dimensional target variables below.\n", "\n", " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

# Frequentist View: Maximum Likelihood

\n", "Let's now embark on the path of obtaining the free parameters, $\\mathbf{w}$, of our model. We will begin using a *frequentist*, or \n", "*maximum likelihood*, approach. This approach assumes that we first obtain observation training data, $\\mathbf{t}$, and that the *best* \n", "value of $\\mathbf{w}$, is that which maximizes the likelihood function, $p(\\mathbf{t}|\\mathbf{w})$.\n", "\n", "

Under the Gaussian noise assumption, it can be shown that the likelihood function for the training data is

\n", " \n", " $p(\\mathbf{t}|\\mathbf{X},\\mathbf{w},\\sigma^2) = \\prod_{n=1}^N ND(t_n|\\mathbf{w}^T\\phi(\\mathbf{x}_n),\\sigma^2)$

\n", " \n", " $=\\frac{N}{2}\\ln\\frac{1}{\\sigma^2} -\\frac{N}{2}\\ln(2\\pi) - \\frac{1}{2\\sigma^2}\\sum_{n=1}^N\n", " \\{t_n -\\mathbf{w}^T\\phi(\\mathbf{x}_n)\\}^2$

\n", " \n", " where $\\mathbf{X}=\\{\\mathbf{x}_1,\\ldots,\\mathbf{x}_N\\}$ is the input value set for the corresponding $N$ oberved output values contained in the vector \n", " $\\mathbf{t}$, and $ND(\\mu,\\sigma^2)$ is the Normal Distribution (Gaussian). (I used ND instead of the standard N to avoid confusion \n", " with the product limit).\n", " \n", " Taking the logarithm of the maximum likelihood and setting the derivative with respect to $\\mathbf{w}$ equal to zero, one can obtain \n", " the maximum likelikhood parameters given by the normal equations:

\n", " $\\mathbf{w}_{ML} = \\left(\\mathbf{\\Phi}^T\\mathbf{\\Phi}\\right)^{-1}\\mathbf{\\Phi}^T\\mathbf{t}$

\n", " where $\\Phi$ is the $N \\times M$ design matrix with elements $\\Phi_{n,j}=\\phi_j(\\mathbf{x}_n)$, and $\\mathbf{t}$ is the $N \\times K$\n", " matrix of training set target values (for $K=1$, it is simply a column vector). Note that $\\mathbf{\\Phi}^T$ is a $M \\times N$ matrix, so that $\\mathbf{w}_{ML}=\\left(\\mathbf{\\Phi}^T \\mathbf{\\Phi}\\right)^{-1}\\mathbf{\\Phi}^T\\mathbf{t}$ is \n", "$(M \\times N)\\times(N \\times M)\\times(M\\times N)\\times(N \\times K) = M \\times K$, where $M$ is the number of free parameters and $K$ is the number of predicted \n", "target values for a given input.
\n", "

\n", "\n", "Note that the only term in the likelihood function that depends on $\\mathbf{w}$ is the last term. Thus, maximizing the likelihood\n", "function with respect to $\\mathbf{w}$ __under the assumption of Gaussian noise__ is equivalent to minimizing a \n", "sum-of-squares error function. \n", "\n", "

\n", " The quantity, $\\mathbf{\\Phi}^\\dagger=\\left(\\mathbf{\\Phi}^T\\mathbf{\\Phi}\\right)^{-1}\\mathbf{\\Phi}^T$ is known as the \n", " Moore-Penrose pseudo-inverse of $\\Phi$. When $\\Phi^T\\Phi$ is invertible, the pseudo-inverse is \n", " equivalent to the inverse. When this condition fails, the pseudo-inverse can be found with techniques such as singular value decomposition.\n", "

" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

### Example 1

\n", "

#### (a) Linear Data

\n", "

Let's generate data of the form $y = mx + b + \\epsilon$ where $\\epsilon$ is a random Gaussian component with zero mean. Given this data, let's apply the maximum likelihood \n", " solution to find values for the parameters $m$ and $b$. Since we know our data is linear, we choose basis functions $\\phi_0(x)=1$ and $\\phi_1(x)=x$. Thus, \n", " our model will be $y=\\theta_0\\phi_0(x) + \\theta_1\\phi_1(x)$, where presumably the solution should yield $\\theta_0 \\approx b$ and $\\theta_1 \\approx\n", " m$.\n", "
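A minimal sketch of this procedure (the true slope, intercept, and noise level below are arbitrary choices):

```python
import numpy as np

np.random.seed(0)
m_true, b_true = 2.0, -1.0                     # true parameters
x = np.linspace(0.0, 3.0, 101)
t = m_true * x + b_true + np.random.normal(0.0, 0.2, x.size)

# Basis functions phi_0(x) = 1 and phi_1(x) = x form the design matrix
Phi = np.column_stack([np.ones_like(x), x])

# Maximum likelihood parameters from the normal equations
theta = np.linalg.solve(Phi.T @ Phi, Phi.T @ t)
print(theta)   # theta[0] ~ b_true, theta[1] ~ m_true
```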

#### (b) Trigonometric Data

\n", "

One common misconception regarding linear regression is that the basis functions are required to be linear. This is not the case. Indeed, the basis\n", " functions need not even be polynomials. They must be linearly independent (though not necessarily orthogonal). It is only the dependence on the model parameters, $\\mathbf{w}$, that must be linear. \n", " Note that an example of a model that is nonlinear in its parameter would be $y=\\exp(a)\\sin(x)$ where $a$ is the free parameter.\n", " Let's generate trigonometric data of the form $y = a + b\\sin(x) + c\\cos(x) + \\epsilon$ where again $\\epsilon$ is a random Gaussian component with zero mean. \n", " Here we choose basis functions $\\phi_0=1$, $\\phi_1(x)=\\sin(x)$ and $\\phi_2(x)=\\cos(x)$. If you're wondering if we're cheating a bit here, the answer is yes.\n", " In reality, we may not know ahead of time what the appropriate basis functions for the observed data should be. The appropriate choice may be suggested\n", " by the data, knowledge of the problem, and other machine learning techniques (hopefully to be discussed later). In this example, \n", " our model will be $y=\\theta_0\\phi_0(x) + \\theta_1\\phi_1(x) + \\theta_2\\phi_2(x)$, where presumably the solution should yield $\\theta_0 \\approx a$,\n", " $\\theta_1 \\approx b$ and $\\theta_2 \\approx c$. \n", "
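The same normal-equation machinery works unchanged with the trigonometric basis (a sketch; the true coefficients and noise level are arbitrary choices):

```python
import numpy as np

np.random.seed(1)
a_true, b_true, c_true = 0.5, 2.0, -1.0
x = np.linspace(0.0, 2.0 * np.pi, 200)
t = a_true + b_true * np.sin(x) + c_true * np.cos(x) \
    + np.random.normal(0.0, 0.3, x.size)

# Basis functions are nonlinear in x, but the model is linear in theta
Phi = np.column_stack([np.ones_like(x), np.sin(x), np.cos(x)])
theta = np.linalg.solve(Phi.T @ Phi, Phi.T @ t)
print(theta)   # ~ [a_true, b_true, c_true]
```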

\n", "

In cases where the training data set is very large or data is received in a stream, a direct solution using the normal equations\n", " may not be possible. An alternative approach is the stochastic gradient descent algorithm. If the total error function, $TE$, is the sum of \n", " a given error function, $E$, evaluated at each of the $N$ training inputs, $TE = \\sum_{i=1}^N E(\\mathbf{x}_i)$, then the stochastic gradient descent\n", " algorithm is

\n", " $\\mathbf{w}^{\\tau + 1} = \\mathbf{w}^\\tau - \\eta \\bigtriangledown E_\\tau$

\n", " where ${\\tau}$ is the iteration number and $\\eta$ is a learning rate parameter. For this type of total error function, the order of evaluation\n", " does not change the result. If the error function is the sum-of-squares function, then the\n", " algorithm is

\n", " $\\mathbf{w}^{\\tau + 1} = \\mathbf{w}^\\tau + \\eta \\left(t_n - \\mathbf{w}^{(\\tau)T}\\phi_n\\right)\\phi_n$\n", "

" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

## Over and Under Fitting

\n", "

This section describes some of the techniques used to account for overfitting or underfitting a given model when using a\n", " maximum likelihood approach.

\n", "

### Regularized Least Squares

\n", "

Regularization attempts to address the problem of overfitting a given model to the training data and/or eliminating parameters that are not important\n", "based on the data. The general total error function with a regularization term is given by

\n", "$E_D(\\mathbf{w}) + \\lambda E_W(\\mathbf{w})$

\n", "where $\\lambda$ is the regularization coefficient and $E_W$ is the regularization term. A commonly used regularization term is the sum-of-squares of the\n", "model parameter elements

\n", " $E_W(\\mathbf{w}) = \\frac1{2}\\mathbf{w}^T\\mathbf{w}$

\n", " known as the weight decay regularizer. This regularization terms leads to the optimal solution, assuming a linear regression model with \n", " Gaussian noise on the training data, of

\n", " $\\mathbf{w} = \\left(\\lambda \\mathbf{I} + \\mathbf{\\Phi}^T \\mathbf{\\Phi}\\right)^{-1} \\mathbf{\\Phi}^T\\mathbf{t}$\n", "

" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

### Example 4

\n", "

In this example, we use the same training data as example 1(a), except that here, the model is erroneously chosen to be a 7th order polynomial. \n", "This example is somewhat contrived, but it illustrates the point that an overfit model can be corrected to some extent using regularization.

### Locally Weighted Linear Regression

\n", "

In attempting to predict a target value, $y$, for some new input $x$ and given a training set $\\mathbf{t}$, locally weighted linear regression weights \n", " training inputs relative to their distance from the new input $x$. The adjusted regression model is one of finding $\\theta$ to minimize

\n", " $\\sum_i w^{(i)} \\left(y^{(i)} - \\theta^Tx^{(i)}\\right)$

\n", " where

\n", "

\n", "

\n", "$w^{(i)} = \\exp \\left( -\\frac{\\left( x^{(i)} - x \\right)^2}{\\left( 2\\tau^2\\right)} \\right)$

\n", "The parameter $\\tau$, called the **bandwidth** controls how the weight of a training input, $x^{(i)}$, falls off with distance from the new input, $x$. Also, notice that the weights, \n", "$w^{(i)}$, require that the training data is available for each evaluation of a new input $x$, making it a so-called non-parametric model.\n", "\n", "The solution is obtained as

\n", "$\\mathbf{w}_{ML} = \\left(\\mathbf{\\Phi}^T\\mathbf{W}\\mathbf{\\Phi}\\right)^{-1}\\mathbf{\\Phi}^T\\mathbf{W}\\mathbf{t}$

\n", " where $\\mathbf{W}$ is the $N \\times N$ weighting matrix with elements $w_{i,j} = \\left[w^{(i)}\\right]^2$ for $i=j$ and 0 otherwise. \n", "

\n", "

# Bayesian View

\n", "Here we consider a fully Bayesian approach to the regression problem. First note, that our fundamental **model** assumptions still hold. Namely, \n", "we assume that the observation we are trying to predict is modeled by

\n", "$t = y(\\mathbf{x},\\mathbf{w}) + \\epsilon$\n", "

\n", "where $\\epsilon$ is a Gaussian distributed noise component. Additionally, the assumption of a linear dependence on the model parameters, $\\mathbf{w}$ also still holds:

\n", "$y(\\mathbf{x},\\mathbf{w}) = \\sum_{j=0}^{M-1} w_j\\phi_j(\\mathbf{x}) = \\mathbf{w}^T\\mathbf{\\phi}(\\mathbf{x})$

\n", "\n", "Recall that the Bayesian approach implies that the *best* model parameters, $\\mathbf{w}$, are those that maximize the **posterior** probability, which from Bayes' Theorem is given by:

\n", "$p(\\mathbf{w}|\\mathbf{t}) = \\frac{p(\\mathbf{t}|\\mathbf{w})p(\\mathbf{w})}{p(\\mathbf{t})}$

\n", "\n", "

## Prior Model

\n", "The first step in formulating a Bayesian model is to construct a model for the *prior* probability model, $p(\\mathbf{w})$. **NOTE:** In general any probability distribution model could be chosen to \n", "model the prior. The appropriate choice is dependent on different factors including prior knowledge of the problem and mathematical convenience. Often, the prior is chosen to be the *conjugate prior* of the \n", "likelihood function. This is a choice of mathematical convenience becuase it implies that the *posterior* can be derived analytically. It is often a reasonable choice, as is this case here for \n", "the problem of linear regression. Thus, given the Gaussian distribution used for the likelihood function above, we assume a Gaussian distribution for our *prior* of the form:

\n", "$p(\\mathbf{w}) = ND(\\mathbf{w}|\\mathbf{m}_0,\\mathbf{S}_0)$

\n", "where $\\mathbf{m}_0$ is the prior mean and $\\mathbf{S}_0$ is the prior covariance. We will *assume* $\\mathbf{m}_0 = 0$ and an infinitely broad prior so that $\\mathbf{S}_0=\\alpha^{-1}\\mathbf{I}$ where $\\alpha$\n", "is a precision parameter that we will have to choose. Given these choices of *prior* and *likelihood* functions the *posterior* probability is given by

\n", "$p(\\mathbf{w}|\\mathbf{t}) = N\\left(\\mathbf{w}|\\mathbf{m}_N, \\mathbf{S}_N\\right)$

\n", "where

\n", "$\\mathbf{m}_N = \\beta \\mathbf{S}_N \\mathbf{\\Phi}^T \\mathbf{t}$

\n", "$\\mathbf{S}_N^{-1} = \\alpha\\mathbf{I} + \\beta \\mathbf{\\Phi}^T\\mathbf{\\Phi}$\n", "\n", "where $\\beta = 1/\\sigma^2$ is the inverse variance of the random noise component associated with the target variable, $t = y(\\mathbf{x},\\mathbf{w}) + ND(0, \\sigma^2)$. \n", "Finally, the log of the *posterior* is seen to be the sum of the log likelihood **and** the log of the prior:

\n", "$\\ln p(\\mathbf{w}|\\mathbf{t}) = -\\frac{\\beta}{2}\\left[\\sum_{n=1}^N\\\\{t_n - \\mathbf{w}^T \\phi(\\mathbf{x}_n)\\\\}^2\\right] - \\frac{\\alpha}{2}\\mathbf{w}^T\\mathbf{w} + constant$

\n", "\n", "**NOTE:** Maximizing the posterior function with respect to $\\mathbf{w}$ __under the assumption of Gaussian noise and a Gaussian prior__ is equivalent to the least-squares error solution \n", "with the addition of a regulariztion term $\\lambda = \\alpha/\\beta$.\n", "\n", "Rather than find a point estimate for $\\mathbf{w}$ by maximizing the posterior and thereby make point predictions for the target variable $t$, it is more instructive to use the posterior \n", "to formulate a *predictive distribution* for $t$. For **our model assumptions** this is given by

\n", "$p(t|\\mathbf{t},\\alpha,\\beta) = \\int p(t|\\mathbf{w},\\beta) p(\\mathbf{w}|\\mathbf{t}, \\alpha, \\beta) d\\mathbf{w} =\n", "ND(t|\\mathbf{m}_N^T\\phi(\\mathbf{x}), \\frac{1}{\\beta} + \\phi(\\mathbf{x})^T \\mathbf{S}_N \\phi(\\mathbf{x})$

\n", "where a point estimate of $t$ is given my the mean $\\mu = \\mathbf{m}_N^T\\phi(\\mathbf{x})$ and an estimate of the uncertainty is given by the standard deviation \n", "$\\sigma_N^2(\\mathbf{x}) = \\frac{1}{\\beta} + \\phi(\\mathbf{x})^T \\mathbf{S}_N \\phi(\\mathbf{x})$\n", "\n", "There is one final issue with completing the Bayesian model, namely the determination of $\\alpha$ and $\\beta$. This can be done in a fully Bayesian manner by developing prior models but\n", "this tends to make the equations intractible. Instead the so called *evidence function* approach is used. See section 3.5 of Bishop for more detail." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "

### Example 6

\n", "Use the data provided in example 5 to formulate a Bayesian model. Here we know that $\\beta = (1/0.2) = 5$. For this example, we will simply choose $\\alpha = .4$ so that $\\alpha/\\beta = 0.08$ (See Bishop 3.5 for methods\n", "on determining $\\alpha$ and $\\beta$." ] }, { "cell_type": "code", "collapsed": false, "input": [ "import numpy as np\n", "from matplotlib import pyplot as plt\n", "#in order to compare between examples, set a seed in random\n", "seed = 123456789\n", "np.random.seed(seed)\n", "alpha = 0.4\n", "beta = 5.0\n", "def y(x,coefs, mu=0, sigma=1.0/beta): \n", " ans = 0\n", " for i in range(len(coefs)): ans += coefs[i]*math.pow(x,i)\n", " return ans + np.random.normal(mu,sigma,1)[0]\n", " \n", "#training data, with N = 101 data points\n", "N = 101\n", "M = 4\n", "t = np.empty(N)\n", "domain = np.empty(N)\n", "domain_bound = 3.0/N\n", "for i in range(N): domain[i] = i*domain_bound\n", "for i in range(N): t[i] = y(x=domain[i],coefs=[1.75, 0.25, -1.0])\n", "\n", "#Let's assume that we want to fit a 3rd order polynomial to the data even though we know its a second order\n", "#polynomial. Given the Bayesain approach, we should see the so that the unecessary terms are damped out. 
We have \n", "#y = phi_0 + phi_1 * x + phi_2 * x^2 + phi_3 * x^3\n", "#design matrix, phi, N x M where N = 101 and M = 4\n", "d2 = domain * domain\n", "phi = np.array([np.ones(N),domain, d2, d2 * domain]).T\n", "alphaI = alpha * np.eye(M)\n", "SN = np.linalg.inv(alphaI + beta * np.dot(phi.T,phi)) #posterior covariance\n", "mN = beta * np.dot(np.dot(SN, phi.T), t)\n", "point_estimates = [np.dot(mN, phi[i]) for i in range(N)]\n", "#predictive variance; the error bars below show one predictive standard deviation\n", "uncertain_t = [1.0/beta + np.dot(np.dot(phi[i].T, SN), phi[i]) for i in range(N)]\n", "plt.plot(domain,t)\n", "plt.errorbar(domain,point_estimates, np.sqrt(uncertain_t), ecolor = \"red\")\n", "plt.legend(('training','Bayes'),loc='lower left')" ], "language": "python", "metadata": {}, "outputs": [ { "output_type": "pyout", "prompt_number": 6, "text": [ "" ] }, { "output_type": "display_data", "png": "iVBORw0KGgoAAAANSUhEUgAAAXUAAAD5CAYAAADY+KXfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz
dV62E+fx+YCH2PjUJphy9vjuP0AhUxtiaukJ/+d81qV/aja7q8/SdyNWEZurEnx7wdz\nwLQm3dd2xS9hA5+MscMzoC66okVg925wcND+FIiPh0OHtKR/6BCMGAEbNkDbthAcrLV4bGwk2QuR\nQ2Ukd2a4Uk9MTKRjx4707NkzXULPCeqbeOKUmER0pa5E7fmWKFN7IpIfcsr6Hokl1rL6fCInmsVz\n++f63L0fSfLb+Sn/QyXckvNQuEwIrn+MZHFyQSbV2ccDG1tiScC36C2OHoWh5UH/STXufQS+9+bz\nyV5LGrR+SiClS8Ply9ovii++gLx5ta+tWkFiIpQsCV9+qSV/U1NJ9kLkYhlK6kopevfuTbly5fjg\ngw8yK6Ysw9XEDtfkolC2A8T/Bja1oGFvGP4H9B6F9+JafHuhNZarltC2pxVFEv+g5Xgnarr9Tti+\nuewPM8XU7SArnXbzzYpOXCtwgTzRFhRbtgjPdqH0rKSnUAk/7g7ehL/hG6outWdSSjGK/yuO9Udc\nWbWvD7MTwTztE5aW0KEDdOyonYhduVI7sdu3r5bUlYLff9f+cpDkLkSukKGkvmfPHhYvXkylSpXw\n8fEB4Msvv6RZs2aZElxW5+8PXw8byPG3rVkwJZqk/ouZsGQR/xsXBTc28dexUQyMOcToiHwwfiZq\n8CDueDhyuYUfl4Pe5aKpFSdidnO+9noeFg1i44OHrHf2pO2SnlQ2saWccyTl7pzns5UVIeY+b70F\nC4voMH1WUHo9nDsHo0bB4sXw4YcwYIA2CmfwYG20jlTyQuRYGUrqderUISUlJbNiyXa6d4dPPmrE\n+D4PaNE4gUSLP+hzEc6FWFFSwerVsLbHJQjXttehw9HEBkc3X3zDncGtm3aidGAg7N5NZD4TBlae\nREgeb2KqbmdOkbsc/KkJ4f63qZRkRWBUa+qeL8BYrxtUfhBF/v8KsHp1rU9vMMCaNTB/vvZ19Gg4\ndgy8vNJdbCUJXojsT64o/be0SS4wEG7ehIMH4epV2LcPFi3SRqkMGIBzSgrxJuex3OYDWxXmsREs\n8DGwY2FZ4uJKYmYGFV0jUpP6fymQpwDT7iyn1NIp/FCrKuUPnKLG4UNM8t1MyXtTCH67JuO+20Ov\nqMPc/dYdx9KKKtduUHXPRao6xlMlMRa753kjCwsoU0ZL9q1aad9XqqS1b7Zu1cbRS4IXIluSpP5v\naZNZmuGY6aSkgE4HOh15Tp/WTmRu2wazZ+PtCNHzfiO/5Uk22+rRzYiAqChtn7dvQ9GiEBkJERFP\n3HV+XTSffw5Dv3fj49iqRJtA96p3Md1SkBq1+9BySSy+M2dw/pIlUR8253DxshyKu8GoCnc4ur4B\nRXZ7UL0dFF0/i6sbKxB/YzK/pDzjggQHB2jYEAICtERft6528rVfP204pU5HopsH5iU8JNELkQ1I\nUn8ZJmlSZLly2temTaFpU5yAicFaMR+0EopXT/O65cu1qh+06njECBg6FO7cgS5dtAr54UMG3jUQ\nfK4WvWNGMGoWmJr8M7SpeIEIenidZNzY6kx9aEOZ/PXo0bQjfPg7ScuXccQ2hR7zFrPil/1Yek4i\nuv9tqk2pTcd8DtTNd5fqSQ+xfNrnsrWFd9/V2ja2thAczN5Je/hj3D669wBPgqRVI0QWJ0n9FXjr\nLZgwAapV+9cTnTtrN3jyXwGBgfDjj5iawnjP2dw6fJ1q88xBpWjVvsEAV68yXL+Hsj9WZ1AVF0qm\nebmZiRlX9pfD9XAUZ/bXx/TiOeY1n8i4cq2JaD6fj1xPcnqiA1WTHNGbudLo6i58ScbiKZ8jIhLe\nmN2EwdaHqbF5OFv8F+G9fxY0biy9eCGyqEy5+OipO8/mFx+9rIQEbVh56dIZ209iola8s3q1lvAL\nFYItWyAykqsPnbh0Ky8NWtlA2bIwbx5s2oTf4HL03taVrilL4dw5Ulq3xcv
yDBPb76X5nxOJ/WUR\neyYNYvuDv/ijSDJnQ49TK18ZGtfx58b7t2jd923075UDW1v8W4aT3yqB738vzR+LwujaMYE1xT+m\n1qHvYfhwbY6bevXSB50mwR8/rrXqdbqMHYcnuX8frKwyf79CZAVGm9DrP3eeS5P663DvHgS4bmVy\n060UKZMXZs3ivlNRjpzOQ43EXZiOGqHNP7NmDb9MCmHKyAj2lumFbu0a7UKlmBj48ksi3+5KUDVH\nFloksf7EZlSeexRLaYpfYCib7i3j1PpE8umrQVgYm8Yf5Z3/uXPmTkFsFkyDM2e0oZLdu2tXwT4a\n1gpw7jz0+VmPYbs+04v4Q4fgzTfhr78yd79CZBVGm9BLGE/evNBtjh8ND00k7mMD3LjBmBb7OdbW\ngGmD+v9seP8+b5w2UDJiP4HhlR7bT4EUS9paVeHajOks3tCNo/VnUURXg1ll8nKnR3GaBLZlvHcs\nJ26doFm1cPzsDzNmzL93UkAbmmkwwKZNRJ8OZe1aeL9cEDEfGTJ9grLNm7XfJ9HRmbZLIXIM6aln\nYx06wNq18PHHMHWqNgw9MFAPZfXaBtu2gYUFJjoY7fojofsi+NNBj87KEldnhb2lAW7eZOHe4lha\nQhf7regKtuWPiW2Js/kfZtMusOP8Vn4L7EubZW1Q9+/TuLkNs7bMIqC1DeWfEFOilS3NT01mQPej\nNPvlTWYk9KJFCx1mf/fgM6H/HhiotaWOH3+8+yNEbiftl2wuOlrrW7doASdOaPOCPU14uDa1e8iU\nlTwMPo5PFVNqRG3mx0uN6NrTHNddy2Dhwn9Gv4SGQmwsVK+OCg3l5LqfWLfmK2Z62nPr/kW6pzjR\n4e0JNPn+N/KUq8TDd/ozyHM9oe41Wb+3ICZuhdF7XGZsw+3UmdpFu7o1rZdI8PfuaXOetW+vnYhO\nsyaLEDmGtF9yMTs7rUKfOVMbjfgsBQtq1f3gnZ3wvzKGpaUNdDw3Hk9PHa6uQFwczJ6ttUuSktK9\nVqfTUTFfMb4IKcbFYQcp8ssOoq/WYnzQZJwLzqfx9V/wbLGW68l5WTjsdOqozx7t7/PdyUbaNwYD\nuLrC0qXa93/PZvkC7Zk9e7QLYevWhaNHn+slQuQq0n7JARo00FoSdeo8/2ucnWHuXDg5pCFubg0h\nP1rV/HdydXLSxmU+eAAPH6Z7rYUFLO3wF2NWDubyH5UwSznPkXIbcG09hd01d/PhzQZ0O/8hjU0U\nnVre5+OxdkQrW+1q10qVIH9+LZGPH6+N5vn7fZ9jiGRgIDRqBN7e2pTzQoj0pP0iHpc2wZ47B7t2\nQfHi2qyQ589rM5lFRkJyMmrqNK4FjKCgtxv5hvTlRrParOjhw9KEw1w8H0znKv6cWNmbgHVLePve\nDA7NP8lfY1fS5bQBy6XztfeZPx8+/1xbFrBMmfSx/CvBV6sGkyZpXwsW1IbvWzxtoL0Q2ZRR51MX\nOdDTKuVly7T5YUBL7uHh6EYbKBpzAiydAXBJsGSQawcGNZzGpdKO/FzXkV+9etHPKYHh3f+H6eEu\nlApviaEsjG1SjK5qh9YDTE7WhkQOG6bNLlm58j9X6z4SGamNeqlRQ/v9UqyYduXuoxUVhRBIUhcv\nomtX7QbaxGY7d2r3r1+HjRu16Q4iI1M394w1Y7jPYD6q8hlv1fiK+5/eZm9FX5JvptCl1jQmj6jP\n8ocDWf3vuWlmzNCW+CtdGi5c0O5bWHDSVE+tWnosH81z4O2tTTYpSV2If8iJUvFy3nzzn5Oq778P\npUppj0dFwYIF2uOPevFWVjp+iZzBun7TCGuwngFXnfgz5Rcuv1mefb4TGTzhEIo0f2oWLKj98tDr\ntbVbY2MBiF29hZEphtTWkI/P00+Wbt8OISGv4oMLkbVJpS4y7q23/rnv7q61ZkCbL2HyZK3p/Wg0\njaWJBZ1u2tOp+2+E/fQt391cyeTwN9h
skcRHSVXpFBPLH2G1aRhjQcG2em0WyYEDITSUS6NX06FP\nwdSef4dL8N1xPaB/LKRBg7TroSZPfrUfXYisRip1kbl694avvtKq7G7dtEtfQeuZGwxw5Ejqpq5m\nBZh4ryTrGl0kcv1Upt6MwvHLogS4nmHcrvvpdhviWR8DBpymG6BECfj5Z5ydweVsEOpfwyJDQ7Xz\nu7/+qq3oJ0RuIkldvBp6Pfz4I4wZo83q2L279viRIxAWpiXhM2cAaNnChKEFLah5YArb2p+iY75r\nzCgwkDpz67C0aCwJyQls2qyjGb9jaorWdzE3J+9EAz/k/5zL+Spq+3t0cnfzZm3ZVqXgzz+N8NmF\nMCJpv4hXr1Ej7Qawfr1WtYOW1K9cAYOBoS5ntTOe1YdRPdqBNWsu8N7uTcw+2JOP1zfC9mJ/Blkf\nA3qk23VPu3U8/P5HiD+hTVKmFDa/2/FWOz3Oznp+/VW7WEmI3EKSuni9WrfWbqD1SG7e1CZwGTYs\ndRNrswfUKBmHdUgH/gh048i8b6lxYC2f9V9D8Nq3+bBga/6emsysYjmWnnqX/xnegF69SD5/icuX\n9LRMDKJ0VBAbf0bmexe5irRfhPFcvw5//KG1Tq5fh7Nntfvh4TSvfIvNm7XN7p7zouqNWVycBiXt\nS9Jsdz8a17/K5gubqVU6nF+vV9fOwxYrxu5ib7KyggHr5BjcWvlgUAYuFdEb7zMK8ZrJFaUia0h7\nFevevdyxK8GSrY4Mtv6JIY1P4OBpx/DR5pCcTMKJYywb1pJJrR1Q0dGkbHwH/zbDGHZ/Ap9tro9p\ng3qM3d8Y7OxYd6UiBROuU6upLVhbS9UusgW5olRkf2mTbVAQBYOCUFsg3MQRz7Xf0OkNk9ShLBYm\n5vhfzc+b/Y6zZcEI/lflRz6/+SPx5lXZcLUbM5sDl52gRQvMCvRgacfZ1Kp1SEvq27bJMnwiR5P2\ni8h69Hp0BgMp9fQEO7fhYZIpzs6PnjMYUhfv1ul0NLX2Zk+IL33sfmX6jQROda/F1oT/EWmSAEDD\nhrAosSs3OvTXEviECdo6gfDPLJGZuICHEMYmlbrIsor31tO2kx5/f9A12A4Jj2aLPHhQm5LAYEhd\nAHXK0KoEuc2nmud+rtZeTYli6+hz9yFDkpvwZoFAJi5swLdLKmmra3zxhTb75NSpMGqU8T6gEK+A\nJHWRZTVsqOXsFi3Q5hdu0EB7IiYG2rUDPz9tgrETJ7C0hLXdl6MSEynddh5XV0cyoUAcpaeVppuf\nDfO372XQZSj29863boVvv9VaOikp2jBLc3NpyYhsT06UiixtwQLo1Any5UvzYNqTqteva7eqVeHa\nNW0RjjFjoEcPaNGC0Nb1mTDQh588kvCMfofti6fheO2uVunXqwdXr8LEidqFUj17pn9zSfDCSDKS\nOyWpi+ztX6NmSE7WlkXavRvefltL7pUrc2bCl1T+cR0WxX/kff0HfOzZE3u/NnD1KhErArn27Sq8\n907XllaKioKWLVmxQpunTC5eEq+bjH4Rude/Rs2kJvjbt2H1am1ysdhYyjgUYkK9H1gxsha3qwVS\nanUDBnmn0CA4Fv+BNYmJqMapm+C8dCkcOkTsHwc5PxXsqgF+SNUusg2p1EXOdPSotkK1q6u24MZP\nP5FQoTJtrX7naMEmtHnjOEfD/Tlc5DZvWvak4FJvbtfsyaIqU+DKFQIip3Dm9ytY21sQeLqwsT+N\nyGWkUhfi36KjYe1a7b6lJSxYgEWBdWwy+5Kzv0fx04xi2K6ZwerttswNfpftb0wnPjiZQDsXbMJT\n2LoTDpYfzqJdxVAGU3SXL2vrtubNK1W7yNKkUhc5X9q2zLhxMHQo3LsHP/8Mt27B5s3smTOKd6qb\ncTkkCo/9Qxnaz59eB/rhvmIyOw5Z49muElSpAkWLpt+3JHjxCkilLsSzpE2877wDhQtryyKtXp26\nSe3
YApz6aBPVmv7IqVpjWGq2mMpmtlR2D+fIEWs88+fXTrzWq6ctvP3995A/v1E+jhDPIkld5B5p\nK/b798HDQ7uA6dFCHiYmOv5omEzs5TdYV9adZhc+xqVeAkGHZ9Ep7T5WrdJaMfnyaePcdTqp2EWW\nIe0XkTulTfAXLsD+/do49fBwMDODKVOI6fc2vWzjWWcRyIhrNnz09o/kbdBEOwF78iSYmmpjHu/e\nNeYnETmQtF+EeFFpK+uEBK1yt7ODKdroFwBbZcH3Dg35Y9YETjSrTtkD/nzt+D1voNCBNqb9/n0w\nGAi79JDrF+9Rza+AVO3CqDKc1Ddv3swHH3xAcnIy77zzDsPSLHYgRLawd+8/VfulS9pJ1EdzvBeu\nfA/L+8WYdKg+l5vVY/Cu8fzQLprvw0/hVbs2Ko8V3+U3EPTrVnzjt1OhDlil/StAErx4zTLUfklO\nTqZ06dJs27YNV1dXqlWrxtKlSylbtqy2c2m/iOwmbULesAFcXFhytgpFH56nzqK+JNepzexG+RnV\nPA8dirUlpHdjbpXuyvKhh/j8/Uhqj/JjQPVgbcqCTp2e9U5CPJXR2i/BwcGUKFECDw8PALp27cra\ntWtTk7oQ2U7ayrp3b8iblzNTHDizYAF1AFMTU/qdsuKNBTvRT/2G833f4/vO9yj6sCRvO/7GZ/P8\nGHB1BQQGan33p+1biFckQ0k9LCwMd3f31O/d3Nw4cOBAum0MBkPqfb1ej17+UYvs4uJFCAqix3k4\nHH0LluzVlt9LSGD7VhfiV8zgd4sNDC09i/lx95jqZsbt0/BnvhpUqhyptXCmT4eCBaFzZ2N/GpGF\nBQUFEZRJ8/pnKKnrHs1l/Sxpk7oQ2cqjyjrvNVi1NYgeLkEARCbk4/Kgb9jV8SGua+LY13sfc1d8\nTosq3+DhO4RZ+zvzg9ujfQQFaVe3nj79xH0LAY8XvKNHj37pfWVo5SNXV1dCQkJSvw8JCcHNze0Z\nrxAi+3F3h506PdMdDXz6wEAN21NYGL7A9dtPwNwcE50J77i25uReb9xKRjKzfCeWW17TeqLu7tCk\niVa1nz0LtWtr9yWhi1ckQydKk5KSKF26NIGBgRQuXJjq1avLiVKRI33xhTYwprlVEL4PgihVCnT3\n72urJw0dqrVYli6FPXvwqvMrd2u9T9XKvkw75IibS2n46COoUQOKF4eSJbVhlObmcuGSeCKjnSg1\nMzNj2rRpNG3alOTkZHr37i0nSUWONG7c3/f0j25oc7cPHAhubtqY9aVLARjjnY+eU0+zreFUPCtO\n4f3r7fhGpWDi4ADdu0PLluDpqS2C7en5+j+MyNHkilIhXlba4Y83bsC+fdChA8THoyIiCRv9Ez/2\n+57Jrj/jVd2EnzaaUa7XMG16gfbttTlk8ueHyEiwt5eKXaSSlY+EyEp++kmbduCnn1BDPqLm6g8p\n/+l61oV+wKCinfm011zMS5bWKnVXV7C1hYcPjR21yEIkqQthbGmr9pAQrfquVAlOn2ZL/s4M3tWJ\nTaUb874+mut5Epn30x18lu/Shk02bw7Dh0NiovY6R0ep2nM5SepCZCVpE/yWLShrG+aeqUkRi5s0\nntKKRe4RfLz0bfpVf5/hzcZjUaCgVqkfP661ZI4fN2b0IguQpC5EVhUcDFZWBN6uyHttwji99Dhm\nbVpwo3wR+g4pxdWE2ywY/xfeIYkwZ442zKZfP23+mXv3tFE1UrXnOpLUhciqHlXtCpg/8TbelRQ+\nzZxg1izU7t0sigzi4+XvMKDlaD6zbYH5W721Sn3ZMlizRvsqch1J6kJkA3uXXaXLx+6cO2+CVXnP\n1BOloa429J7SgIi7YSxckUjZ3Wdh5EhtZaZOnSAmRtuBra1U7bmEzKcuRDZQy/kyk2zmcbgN1ImM\nhO++I8XalvCoSqzvvJm5G0dQt95XjDzwPf3LlOFS0cac9DJw8vs/A
Cg/sCEVCkPJR4stCfEkUqkL\n8RqdPQt16sCZfZFYF7bFPwCCVobjUNaJ7wdfxG1+a9q0t+XGBSus135H1eqVaH31ezyjj3EnTxHC\nQhW1ayRTs66ZVO05mLRfhMhGvmkbRKmwICIiwNJS0fHCRC50GsbobbUJivKmaLUClGjQl60mS/ih\n22I6LX80ha/BwNnmH1Bnz1ecuZwHBwfjfg7x6khSFyIbuXkTSpSAN9+EadO0pU4BHgT/yZmeY/E6\nuxzdL8s4sHkOPSpfpl6EDd/RDBvDl9CgAb9dq4ipoz3N6j8ACwswMZGqPYeRnroQ2Yizs3bNkaMj\n6HYEpY5pz3P7Nt4PD8BoA6Sk4PvAgaN9VzP4y3pU5keWXu9IVUdHqnVrSLnP23EgogXFV3ylXeQk\nxCOS1IUwAienR3eeVmEvWwbnzmFjacNc2rKcU7RY0oJPHFz4yPY+H34Ii75sgGHGjDQ7+499ilxB\n2i9CZBVpr0Q9cwYuXIBWreDKFfDw4MoHb9Hjy6pYF3JlZsAW6rumMOKzZN4d7QYODtqkYhYWRvwA\nIrNI+0WInOBpFfaj1cM88nuw40oDDO7J1FlUmfE1ijBmXhC6i0H0jopGN2bMPw36Z+1P5GhSqQuR\nFaWt2vfs0b7Wrg07dkC/fgRWK4j/j83oWro3e2f/gOf+5Sy61xGT8NvwwQewcqWxIheZQEa/CJFb\ndOmizdnepQu3qpWlx9t2JOTJQ8wHQ/nw2yYEWC2HAQNg0KD0r5OqPVuRpC5ETpa2al+1CsqUgfLl\nYeFCkn9dxbio9Xy/fizs2MiVOZ5Yt/eDixeJ2PMX1zadwntsJ2NGL16CJHUhcqNKlWDxYqhUicBS\nZrTqUojqD3qxffVSIg5colG1GGJv3eNivLNMK5DNyIlSIXKLtFX7rVvwaEhjo0uwq8tean7vT119\nPNEt7jKoyGYehh3hRl8bClvehaQkbfijtGJyNKnUhciuIiPBxgbMzLShjHFxGIZF8r/TE7Ct8Svb\nyg1mzXBnHrbrytdOk7RLWSdNMnbU4jlI+0WI3Cht1T5mDHz+OYkRMez9+Sq3d/fk/TW9GfBnFebu\n/4Mrzd9D9+dxaNIEkpO1aR5leoEsS5K6ELldcDBUrQqXL2uJ++JFzqycSftDn3AzrDtr89eknuVJ\nrVLv3Rtq1dK+iixJkroQuVnaij0yEhYt0oY0WloSsy8I38r5iIo5z0HTWrh9PQtatABzc/DxSb8f\nqdqzDDlRKkRuljYZx8RoFyl17gwbNmC7Zw8r31hFrQ9G41trMsuv+VPbxeWfSv2jj7Sv5coZ8xOI\nTGRi7ACEEJnI1lZL6GmUL6+jyMn+fBj2Ju1/ac8s6zP/PLl/v1bdixxD2i9C5CT/HvJ49SpUr86S\nTQVYHdmAL/da0W5Kderl9+L7D7dg4VUZfH3B3R3++gtKldLmj5FWjFFJT10I8Uz3v5xC9W+6MGSi\nCx33vEnPkieIdrLjs2nu6Ce+T56GtcDeXpsZ0t7e2OHmepLUhRCPS1u1Hz3K7VuK+Scq063qeVx6\n6NGHXWFP9M8MTp7OlI5W0Ly5doLVyuqffUjFbhSS1IUQz2X6dJgz8gpu7joibYpSP18TxlU6xrLu\nc+jcIECr1JOStBOpFy4YO9xcS0a/CCGe7VHV/h7glBJKnihLmrUqhOmv19lvuYh31/TmSpUHfKIU\nOqUgNtbYEYuXJJW6ELnNrVtai8XWFmrX5tR7U6k/0hG35p5Urv8GM03bYhHQCz7+GKKitJWXvL2l\nFfMaSftFCPH80vba584FPz9WHHAn7so51s6IJupBFKvGnMPh2h1tUY6RI7Wv4rWR9osQ4vmlrbg9\nPaFhQ8pGudGo0m12lDJj7o0R1Oh4gI13z1Py2DFtWOSjJfVQSps3Rqr2LEsqdSEEAD/m+4AvrL7h\nhy9jiF5ShBGt8rKi5BfU/WalV
" } ], "prompt_number": 6 }, { "cell_type": "code", "collapsed": true, "input": [], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 6 } ], "metadata": {} } ] }