{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "In this lab we'll go beyond standard data analysis and actually code a machine learning algorithm from scratch. As this is our first such exercise we'll start with a classic linear model, Logistic Regression. We'll accomplish this with the following steps:\n", "\n", "1. Build a class that can fit a Logistic Regression given training data as well as make predictions on examples without labels\n", "2. Write a unit test that confirms our class produces the correct estimates on synthetically generated data\n", "3. Benchmark our class against SkLearn for speed and accuracy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# A Logistic Class" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Before we do any programming, we need to first understand the optimization algorithm we'll use. As a reference, we'll be following the presentation presented here: https://tminka.github.io/papers/logreg/minka-logreg-old.pdf.\n", "\n", "The logistic loss is convex so we'll use a convex optimization process called gradient descent. In particular, we will use a variant called Newton's Method that uses the 2nd derivative of the loss function to set the learning rate. To start, here are some primitives:\n", "\n", "$L = \\sum y * ln(p) + (1-y)*ln(1-p)$\n", "\n", "(Note, we'll drop subscripts over the data instances to keep notation simple. All sums are over examples in the data set.)\n", "\n", " $g_j = \\nabla L_j = \\sum (y-p)*x_j$\n", "\n", "(This is the 1st derivative, where $j$ indicates feature dimension $j$)\n", "\n", " $H_{jk} = \\sum p*(1-p)*x_j*x_k$\n", "\n", "(This is the 2nd derivative w.r.t. features $j$ and $k$)\n", "\n", "Now the general form of Gradient Descent follows this function:\n", "\n", "$w_{new} = w_{old} - \\nu * g$\n", "\n", "Where $w$ is the weight and $\\nu$ is an appropriately chosen step size. In our case, we're going to use the inverse of the 2nd derviative matrix (the Hessian) as our learning rate. If we define $H$ as the Hessian matrix with entries $H_{jk}$ from above, and $G$ a gradient vector who's $jth$ entry is $g_j$, then our optimization problem becomes:\n", "\n", "$w_{new} = w_{old} - H^{-1} * G$\n", "\n", "This is what we are going to program here. Let's define a class that has the following methods:\n", "\n", "1. An __init__ method that takes in an error tolerence as a stopping criterion, as well as max number of iterations.\n", "2. A predict method that takes a given matrix X and predicts $p=(1+e^{(-X*B)})^{-1}$ for each entry\n", "3. A compute_gradient method that computes the gradient vector $G$\n", "4. A compute_hessian method that computes the Hessian. Note that the $H$ can be broken down to the following matrix multiplcation: $H=X^TQX$, where $X$ is the input matrix and $Q$ is a diagonal matrix where each entry $Q_{ii}=p_i*(1-p_i)$.\n", "5. An update_weights method that applies Newton's method to update the weights\n", "6. A check_stop method that checks whether the model has converged or the max iterations have been met\n", "7. A fit method that takes in the data and runs the gradient optimization\n" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#write a class with the following API\n", "import numpy as np\n", "\n", "\n", "class MyFirstLogReg(object):\n", " \n", " def __init__(self, tol = 10**-8, max_iterations = 20):\n", " \n", " self.tolerance = tol\n", " self.max_iterations = max_iterations\n", " self.beta = None\n", " self.alpha = 0\n", " \n", " def predict(self, Xint):\n", " '''\n", " Compute probs using the inverse logit\n", " - Inputs: The NxK X matrix\n", " - Outputs: Vector of probs of length N\n", " '''\n", " \n", " #First compute X*beta+alpha\n", " XB = Xint.dot(self.b) \n", " return (1+np.exp(-1*XB))**-1\n", " \n", " def compute_gradient(self, Xint, y, p):\n", " '''\n", " Computes the gradient vector\n", " -Inputs:\n", " - NxK X matrix\n", " - Nx1 y (label) vector\n", " - Nx1 ps vector of predictions\n", " -Outputs: 1xK vector of gradients\n", " '''\n", " return (y-p).dot(Xint)\n", " \n", " def compute_hessian(self, Xint, P):\n", " '''\n", " computes the Hessian matrix\n", " -inputs:\n", " - NxK X matrix\n", " - Nx1 vector of predictions\n", " -outputs:\n", " - KxK Hessian matrix H=X^T * Diag(Q) * X\n", " '''\n", " #Note, in first \n", " Q = np.diag(P*(1-P))\n", " return (Xint.T).dot(Q).dot(Xint)\n", "\n", "\n", " def update_weights(self, Xint, y, i):\n", " '''\n", " Updates existing weight vector\n", " -Inputs:\n", " -NxK X matrix\n", " -Nx1 y vector\n", " -updates weights by calling predict, compute_gradient and compute_hessian\n", " '''\n", " p = self.predict(Xint)\n", " g = self.compute_gradient(Xint, y, p)\n", " H = self.compute_hessian(Xint, p)\n", " \n", " #Store the current weights before updating so we can check for convergence\n", " self.prior_b = self.b\n", " \n", " #update the weights\n", " self.b = self.b + np.linalg.inv(H).dot(g)\n", " \n", " \n", " def check_stop(self):\n", " '''\n", " check to see if euclidean distance between old and new weights (normalized)\n", " is less than the tolerance\n", " \n", " returns: True or False on whether stopping criteria is met\n", " '''\n", " b_old_norm = self.prior_b / (np.sqrt(self.prior_b.dot(self.prior_b)))\n", " b_new_norm = self.b / (np.sqrt(self.b.dot(self.b)))\n", " diff = b_new_norm - b_old_norm\n", " return (np.sqrt(diff.dot(diff)) < self.tolerance)\n", " \n", " \n", " def fit(self, X, y):\n", " '''\n", " X is the Nx(K-1) data matrix\n", " Y is the labels, using {0,1} coding\n", " '''\n", " \n", " #set initial weights - add an extra dimension for the intercept\n", " self.b = np.zeros(X.shape[1] + 1)\n", " \n", " #Initialize the slope parameter to log(base rate/(1-base rate))\n", " self.b[-1] = np.log(y.mean() / (1-y.mean()))\n", " \n", " #create a new X matrix that includes a column of ones for the intercept\n", " Xint = np.hstack((X, np.ones((X.shape[0],1))))\n", "\n", " for i in range(self.max_iterations):\n", " #print(i)\n", " self.update_weights(Xint, y, i)\n", " self.beta = self.b[0:-1]\n", " self.alpha = self.b[-1]\n", " if self.check_stop():\n", " break\n", " \n", " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note about testing\n", "\n", "One way we can test this implementation is to generate some random data according to a logistic model, and see if our model returns the correct weights. To do this:\n", "\n", "\n", "* generate an NxK X matrix of random numbers\n", "* generate a 1xK weight vector called Beta\n", "* set alpha\n", "* Given Beta and Alpha, compute P(Y|X) using the logistic function\n", "* Generate an Nx1 random sequence in [0,1], and set Y=1 if R_i < P_i\n" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false }, "outputs": [], "source": [ "#Generate a dataset with N=10k and K=5\n", "def gen_logistic(N, K, Beta, Alpha):\n", " X = np.random.random((N,K))\n", " XB = X.dot(Beta) + Alpha * np.ones(N)\n", " P = (1 + np.exp(-1*XB))**-1\n", " Y = (np.random.random(N) < P)\n", " return X, Y\n", "\n", "K = 2\n", "\n", "Beta = 2*(np.random.random(K)-1)\n", "Alpha = -1\n", "\n", "X, Y = gen_logistic(1000, K, Beta, Alpha)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we have our dataset, let's test out our algorithm" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false }, "outputs": [], "source": [ "lr = MyFirstLogReg()\n", "lr.fit(X, Y)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "The real Betas and Alpha are:\n", "[-1.84180703 -0.20277473] -1\n", "\n", "The fitted Betas and Alpha are:\n", "[-1.64438904 -0.41202617] -1.1480464234681091\n" ] } ], "source": [ "print('The real Betas and Alpha are:')\n", "print(Beta, Alpha)\n", "print('')\n", "print('The fitted Betas and Alpha are:')\n", "print(lr.beta, lr.alpha)\n" ] }, { "cell_type": "markdown", "metadata": { "collapsed": false }, "source": [ "Now let's compare our fitted results to SkLearn's Logistic Regression implementation" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "(array([[-1.64438881, -0.41202617]]), array([-1.14804636]))" ] }, "execution_count": 5, "metadata": {}, "output_type": "execute_result" } ], "source": [ "from sklearn.linear_model import LogisticRegression\n", "\n", "LR_SK = LogisticRegression(C=10**10).fit(X,Y)\n", "LR_SK.coef_, LR_SK.intercept_" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can see that this is pretty close, though not exact at all digits of precision. We can also see that both systems produce estimates that are far from the truth. Does this mean that the code is correct? If it is indeed correct, why might we observe the above? We can write a more robust test by running multiple draws, and seeing if on average we get the right answer. We can also increase the sample size. Doing either will be more computationally expensive. Before going there let's profile our code to see if there is a way to optimize it." ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 226 function calls in 0.028 seconds\n", "\n", " Ordered by: standard name\n", "\n", " ncalls tottime percall cumtime percall filename:lineno(function)\n", " 5 0.000 0.000 0.000 0.000 :14(predict)\n", " 5 0.000 0.000 0.000 0.000 :25(compute_gradient)\n", " 5 0.000 0.000 0.022 0.004 :36(compute_hessian)\n", " 5 0.004 0.001 0.027 0.005 :50(update_weights)\n", " 5 0.000 0.000 0.000 0.000 :69(check_stop)\n", " 1 0.000 0.000 0.028 0.028 :82(fit)\n", " 1 0.000 0.000 0.028 0.028 :1()\n", " 2 0.000 0.000 0.000 0.000 _methods.py:48(_count_reduce_items)\n", " 2 0.000 0.000 0.000 0.000 _methods.py:58(_mean)\n", " 5 0.000 0.000 0.000 0.000 linalg.py:103(get_linalg_error_extobj)\n", " 5 0.000 0.000 0.000 0.000 linalg.py:108(_makearray)\n", " 10 0.000 0.000 0.000 0.000 linalg.py:113(isComplexType)\n", " 5 0.000 0.000 0.000 0.000 linalg.py:126(_realType)\n", " 5 0.000 0.000 0.000 0.000 linalg.py:141(_commonType)\n", " 5 0.000 0.000 0.000 0.000 linalg.py:200(_assertRankAtLeast2)\n", " 5 0.000 0.000 0.000 0.000 linalg.py:211(_assertNdSquareness)\n", " 5 0.000 0.000 0.001 0.000 linalg.py:468(inv)\n", " 1 0.000 0.000 0.000 0.000 numeric.py:156(ones)\n", " 5 0.000 0.000 0.000 0.000 numeric.py:433(asarray)\n", " 9 0.000 0.000 0.000 0.000 numeric.py:504(asanyarray)\n", " 2 0.000 0.000 0.000 0.000 shape_base.py:11(atleast_1d)\n", " 1 0.000 0.000 0.000 0.000 shape_base.py:236(hstack)\n", " 1 0.000 0.000 0.000 0.000 shape_base.py:283()\n", " 5 0.000 0.000 0.009 0.002 twodim_base.py:197(diag)\n", " 5 0.000 0.000 0.000 0.000 {built-in method builtins.abs}\n", " 1 0.000 0.000 0.028 0.028 {built-in method builtins.exec}\n", " 5 0.000 0.000 0.000 0.000 {built-in method builtins.getattr}\n", " 2 0.000 0.000 0.000 0.000 {built-in method builtins.hasattr}\n", " 4 0.000 0.000 0.000 0.000 {built-in method builtins.isinstance}\n", " 17 0.000 0.000 0.000 0.000 {built-in method builtins.issubclass}\n", " 7 0.000 0.000 0.000 0.000 {built-in method builtins.len}\n", " 14 0.000 0.000 0.000 0.000 {built-in method numpy.core.multiarray.array}\n", " 1 0.000 0.000 0.000 0.000 {built-in method numpy.core.multiarray.concatenate}\n", " 1 0.000 0.000 0.000 0.000 {built-in method numpy.core.multiarray.copyto}\n", " 1 0.000 0.000 0.000 0.000 {built-in method numpy.core.multiarray.empty}\n", " 6 0.008 0.001 0.008 0.001 {built-in method numpy.core.multiarray.zeros}\n", " 5 0.000 0.000 0.000 0.000 {method '__array_prepare__' of 'numpy.ndarray' objects}\n", " 2 0.000 0.000 0.000 0.000 {method 'append' of 'list' objects}\n", " 5 0.000 0.000 0.000 0.000 {method 'astype' of 'numpy.ndarray' objects}\n", " 1 0.000 0.000 0.000 0.000 {method 'disable' of '_lsprof.Profiler' objects}\n", " 40 0.014 0.000 0.014 0.000 {method 'dot' of 'numpy.ndarray' objects}\n", " 5 0.000 0.000 0.000 0.000 {method 'get' of 'dict' objects}\n", " 2 0.000 0.000 0.000 0.000 {method 'mean' of 'numpy.ndarray' objects}\n", " 2 0.000 0.000 0.000 0.000 {method 'reduce' of 'numpy.ufunc' objects}\n", "\n", "\n" ] } ], "source": [ "import cProfile\n", "cProfile.run('lr.fit(X, Y)')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "What method is taking the most amount of time? Looking more closely at the exact sequence of operations, what might be taking up a lot of memory or time? How can we achieve the same mathematic results, but do it in a more memory friendly way?\n", "\n", "Let's do a quick test, where we design an alternative, and hopefully faster way to compute the Hessian." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## This is a good place to break. Part 2 in the next lab" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def hessian_slow(X, P):\n", " '''\n", " Copy the operations used in the class above\n", " '''\n", " Q = np.diag(P*(1-P))\n", " return (X.T).dot(Q).dot(X)\n", " \n", " \n", "def hessian_fast(X, P):\n", " '''\n", " Rewrite this without using the np.diag function\n", " '''\n", " Q = P*(1-P)\n", " XQ = X.T * Q\n", " return XQ.dot(X)\n" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "100 loops, best of 3: 3.49 ms per loop\n" ] } ], "source": [ "P = 0.5 * np.ones(X.shape[0])\n", "%timeit hessian_slow(X, P)" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "The slowest run took 14.29 times longer than the fastest. This could mean that an intermediate result is being cached.\n", "10000 loops, best of 3: 21.4 µs per loop\n" ] } ], "source": [ "%timeit hessian_fast(X, P)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Does this make a difference? Now create a new class, same as above, but overwrite the compute_hessian method with the above faster version. We're going to do this in a very light and efficient way. Essentially we'll inherit MyFirstLogReg class, which means all of the methods in the base class are callable for this one. We can overwrite the compute_hessian method with our faster approach (note, in normal software development we'd likely just revise the original class, but we're doing it this way to show the example)." ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false }, "outputs": [], "source": [ "\n", "class MyFasterFirstLogReg(MyFirstLogReg):\n", " \n", " def compute_hessian(self, Xint, p):\n", " '''\n", " computes the Hessian matrix\n", " -inputs:\n", " - NxK X matrix\n", " - Nx1 vector of predictions\n", " -outputs:\n", " - KxK Hessian matrix H=X^T * Diag(Q) * X\n", " '''\n", " Q = p*(1-p)\n", " XintQ = Xint.T * Q\n", " return XintQ.dot(Xint)\n", "\n", "\n", " " ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false }, "outputs": [], "source": [ "#Now rerun\n", "lr = MyFasterFirstLogReg()\n", "lr.fit(X, Y)" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "(array([-1.64438904, -0.41202617]), -1.1480464234681091)" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "lr.beta, lr.alpha" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As one last speed test, let's compare our faster class to sklearn." ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "1000 loops, best of 3: 856 µs per loop\n" ] } ], "source": [ "%timeit LogisticRegression().fit(X,Y)" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "1000 loops, best of 3: 782 µs per loop\n" ] } ], "source": [ "%timeit MyFasterFirstLogReg().fit(X, Y)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we've got an efficient class written, let's perform 3 different unit tests:\n", "* Compare the results to SkLearn. This replicates a scenario where for some reason we can't use the SkLearn package and had to write our own code. Since SkLearn does exist, we can at least benchmark against it.\n", "* Run it once just on a much larger data set (which should come close to the truth value).\n", "* Run it many times over different draws from the same data generating distribution and look at the distribution of outcomes." ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": false }, "outputs": [], "source": [ "#First let's generate a dataset with 1000000 examples\n", "K = 4\n", "Beta = 2*(np.random.random(K)-1)\n", "Alpha = -2\n", "\n", "X, Y = gen_logistic(1000000, K, Beta, Alpha)" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": false }, "outputs": [], "source": [ "LR_SK = LogisticRegression(C=10**30).fit(X,Y)\n", "LR_Mine = MyFasterFirstLogReg()\n", "LR_Mine.fit(X, Y)" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Truth: beta=[[-0.44567075 -1.42752653 -0.22013092 -1.85578601]], Alpha=-2\n", "SkLearn: beta=[[-0.44567075 -1.42752653 -0.22013092 -1.85578601]], Alpha=-2\n", "Ours: beta=[-0.44566344 -1.42750999 -0.22013456 -1.85580747], Alpha=-2.0060087546764263\n" ] } ], "source": [ "print('Truth: beta=' + str(LR_SK.coef_) + ', Alpha=' +str(Alpha))\n", "print('SkLearn: beta=' + str(LR_SK.coef_) + ', Alpha=' +str(Alpha))\n", "print('Ours: beta=' + str(LR_Mine.beta) + ', Alpha=' +str(LR_Mine.alpha))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For this test we'll check whether on average our own LR class is correct. To do this, run a loop where in each iteration, generate a random data set of size N, using the same Beta and Alpha. Then fit the model and store the weights. After this runs we can look at the distributions of features." ] }, { "cell_type": "code", "execution_count": 29, "metadata": { "collapsed": false }, "outputs": [], "source": [ "betas = []\n", "alphas = []\n", "\n", "for i in range(10000):\n", " X, Y = gen_logistic(10000, K, Beta, Alpha)\n", " LR_Mine = MyFasterFirstLogReg()\n", " LR_Mine.fit(X, Y)\n", " betas.append(LR_Mine.beta)\n", " alphas.append(LR_Mine.alpha)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now plot the histograms of each parameter distribution" ] }, { "cell_type": "code", "execution_count": 30, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#First put it all in a dataframe\n", "import pandas as pd\n", "df = pd.DataFrame(betas, columns = ['b' + str(k) for k in range(K)])\n", "df['alpha'] = alphas" ] }, { "cell_type": "code", "execution_count": 31, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "(b0 -0.452829\n", " b1 -1.459697\n", " b2 -0.212411\n", " b3 -1.845400\n", " alpha -2.003525\n", " dtype: float64,\n", " array([-0.451672 , -1.45290954, -0.20970989, -1.84053181]),\n", " -2)" ] }, "execution_count": 31, "metadata": {}, "output_type": "execute_result" } ], "source": [ "#Let's check the means against the truth\n", "df.mean(), Beta, Alpha" ] }, { "cell_type": "code", "execution_count": 32, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAskAAAHiCAYAAAAatlGFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4zLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvIxREBQAAIABJREFUeJzs3XuYZFV97//3R0BMghGRCQeByaBi\nPJjE0UzUHKNB0YiYCJ6owXhBxYwmkOiJnjjiMZLECyZRfhojBsUA3oCgRhQ0IuIx5gg6EEQuGkdF\nmckAowKCIAp8f3/s1VBT9KW6u6q7uvv9ep56Ztfaa+/61u6aVd9ae+21U1VIkiRJuss9FjsASZIk\nadyYJEuSJEl9TJIlSZKkPibJkiRJUh+TZEmSJKmPSbIkSZLUxyR5CUhyZZInLnIMj03y9cWMYalL\n8qAkzrkoLXNj0mavTnJTkh0WM46lLskXkrxgsePQ4jBJXoaS/EGS7yT5UZJ/SbLbLLattt1N7XE9\nQFX9W1X9Uk+9oX0JJNk5yXuT/DDJ1Un+bMDtzm3x7tgX1y098X96wH3d1PO4o28fz5nj+9qc5IC5\nbDuH13pEkouS3Jzky0l+dYBtHpLk1iQn9ZQ9sb3/m+b7/iXNLMmeSc5M8l+tPVszy+3727ybkty/\nqr5bVbtU1e2t3ueSvHhIMT8+yXlJbkhy5Sy2+4v2Hp/YV/7E1n79qLWbzxpgX+/qeb8/SfLTnuef\nnMPbIsnre9vDUUry7CRfbG32Z2ax3Sn9n5OWyP+45/1fNoqYVyKT5GUmyUOBfwSeB+wB3Ay8c5a7\neVhrXHepql2HHeMkjgH2A34ReDzw50kOmm6DlrjtNMXq3+2J/7cHCaCn/i7Ad/v28YFJXn/Hu+9l\ncSTZGfgY8E/AfYEPAf+SZKrjM+GdwJcmKf9u7/GY7P1LGpo7gE8BvzePffxu3//Z/xpSbFP5EfBe\n4H8PukGSBwLPBLb2le8PfBB4DXAf4GHAhTPtr6pe2tNmvxE4ref9P2WS1x+bNrv5PvBW4G8H3aB1\nuqyZYvVLe97/Q+cfnsAkeSn59SSXJ7kuyT8ludcU9Z4DfLyqPl9VNwGvBf5nknvP58WTHJBkc1t+\nH7Aa+Hj71frn89k3cDjw11V1XVVdAbwbeME0sdwHeB0w39cdWOthOC3Jh5LcCDw3yfuTHNNT54kT\nvSpJPgTcH/hkO0Z/1lPv+a23ZFuSDUMI70Dgjqr6+6q6FTgO2Bn4rWnez3OBq4H/O4TXl3R3A7XZ\nVXVNVb0T+PIwXzzJmokzbUneADwWeEdrj94xn31X1Zeq6n3At2ax2T8ArwJ+0lf+f4B/rKpPVtVt\nVfX9qvrmfOKDu4a3JXlhku8Cn+5to3vqbW7fb79D953ynHaMehP1fZP8vyQ3JvlUZnF2dipV9emq\n+mf6fjRM8352At4G/Ml8X1uDM0leOp4DPBl4IPBguoZlMg8FvjLxpDU2P2nbkOSdSWbbs7ydqnoe\n2/e2/k3b9/XTPCZNBpPcF9izN+a2PN0v4TcCx9MleZP5QEtAP53kYbN9f9N4Ol2Px32A06arWFXP\nBv4LeEo7Rm/tWf0/gAfR/T3/Msl+AEmeN8MxvP8UL/dQ4JKe1y7gq0xxDJPsSvcj45VT7O/+Sa5J\n8q0kb0nys9O9V0mTGrTNnlaSDUk+MZ9Aquo1wL8BR7X26Ki270umaW/m9T3RK8kzgVur6uxJVj+6\n1flqkq2t82HeSWiPxwEPAZ46XaWq+gTwN8AH2jH6tZ7Vf0DXmbMH8HPAn7WYd5ihzZ6qjZ2LVwKf\nAaYaSvG3Sb7Xhl48boivu6KN2+kHTe0dVXUVQOsV+Hsmb3R3AW7oK7sBuDdAVf3xAK91UZI72vIp\nVfWngwQ4x6EZu7R/e2O+M95+SdYBjwFeBuw9SZXnABcBaXX+NclDqur6OcTW7wtV9fG2fEuSue7n\nmKr6Md1xvozu9OI3Ws/M++awv2n/5pN4A/CuqvqvSd7DRDxfB/YFTqE7HXjkHOKSVrJB2+xpVdWx\nA1T7lyS3teXPVdWhA+57xmsX5qudxXwj8KQpquxNNzzwt+k6Fk6mO1bDuhbidVV1c4tlrvs4saq+\n0fbxz3Sx0sZ7j3xIYpJfBF4EPGKKKq8ELgV+SnfczkryK1V15ahjW+7sSV46rupZ/g7dqfzJ3AT8\nfF/ZzwM3zuK1HlFVu7bHQAnyoLL9xRZH08U7ESM9y3eLN8k96MbRvqyqbutfD1BV/15Vt1TVzVX1\nJuB6utOMw3DVzFVmVlW9PeA3c9cPhRm1novtLtBhFn/z9iPjccDbp4hta1VdUVV3tLMQrwKeMWh8\nku40aJs9DIf2tNkDJciDSnJ0T3vzrjns4hjgfdMkbLcA/1RV/9mGCL4ROHhu0U5qGO32nNtsgCTv\n6TmGcxkm+Ha6ZH/S7/GqOr+qbqqqW6vqvcAFwN3GZWv2TJKXjn16llfT/eKezERPIABJHkA3PvU/\nhxzP3aYy60ve+h9Hw/YXW1TVG6vqOroxWb3DIh7G5KeUfh5YB5yW5GruGsO3OclUiXDR9SoPQ/97\n/hHQOxThv81Qf1pJDp/hGN6/qm6f5AKd/r95gF9h8mN4AF0P8VXtGL4c+P0kU42HHObxk1aSQdvs\nhTJZm33ZNO3NuwBaOz3R3rx0Dq97IPCn6WYuupruuJye5FVt/SV9sQ11msw2/GzCdm12uov57jfX\n156k06L/8ecthhf3HMO/mcPbOBB4azt+m1vZl5P8/hT1bbeHxOEWS8eRbVzazXRXAU81JvYDwBdb\n0ngR8FfAR6b6BToP1wAP6C1oVxnPxSnA/0mykW7M1x8CL5yk3g1s3xuzD93sDL8GbEuyupV9me4H\n4J8AuwP/DndeGXxeVQ2r8bgYOCrJm4B7Af297hPH6HOD7KyqTqY71ThbnwV2SHIk8B7gj+hOu012\nUd47gff3PH8V3TGdGKP4eGBTVV3Vjueb6GbOkDQ7g7bZpLuob2I+452T3KsNyRqmydrsOc2C0M7q\n3ZNuhqG0+O+oqv6L8qBL8Hpn2vky3ZjeiWna/gl4bZL30/XYbgDuHIOd7kK7Y6rqpLnE2udrwL2T\nPJmu3fyLvtiuAR6bJH3J9aTacIs5fe+lm796J7o87B7tGN42xVnSB3BXp+YOdInywcClbfz2OuDz\nwO3As4HfoPse0DzZk7x0fBD4NN3VxN8EXj9Zpaq6DHgpXbJ8Ld241DvHIbfhDnM5ZdbvTXSJ7TAu\nTngd3Xv6Dl1i97dV9SnYbkL81dW5euIBbGvbX9Ma53vTXdB3HbAFOIjuwrnvt3r7AP9vnrH2Ogm4\nosX9KeDUvvVvpLsw7/okLx/i626nfZkeAryYbnjJc4FDquqnAElem+Tjre7NfcfwR8AtVTVxLNcB\n5ye5GfgC3Q+t/zWq2KVlbKA2u7mFu4aefa09B+4c7jCneX/7vA14RrrZNiYdbjULj6OL8Wy6XvJb\n6N4rcGcP9XMA2mwVvW3O7cB1bWgFbXjAKXRDBL4D3ErrcEhyT7qe3vPnGS/tta6j6zw5me474gds\nP5TiNLrk/wdJJpsec5heSHfc/p5u6tNbgHfBdj3Uv9Hivrbn+F3Ttt9WVbfQJdpvpPs+3Eb3/X9I\nVW0acfwrQgb4sSQtC0neA/xzVf3rYsciSZpekt8EjmyzBUkLziRZkiRJ6uNwC0mSJKmPSbIkSZLU\nZ8YkOcm9knwpyVfaYPy/bOUnJfl2kovbY20rT5K3J9mU7m4+U01+LUmSJI2lQaaAuxV4QlXdlO7e\n4V/oudL2f1fVGX31nwLs1x6Poptt4FHDCliSJEkatRmT5DZX4MTUNDu1x3RX+x1CdyvjoptKatck\ne1bV1qk22H333WvNmjWDRy1JY+TCCy/8XlWtWuw4FopttqSlbNA2e6CbibRJry8EHgT8Q1VdkOSP\ngDck+QvgXGBDVd0K7MX2t4Hc3Mq29u1zPbAeYPXq1WzcuHGQUCRp7CT5zmLHsJDWrFljmy1pyRq0\nzR7owr12K9y1wN7AI5P8MvBq4CHArwO70d25a2BVdUJVrauqdatWrZgOGEmSJC0Bs5rdoqquB84D\nDqqqre0OaLfS3Vbyka3aFra/Z/3erUySJElaEgaZ3WJVkl3b8s8ATwK+lmTPVhbgUODStsmZwPPb\nLBePBm6YbjyyJEmSNG4GGZO8J3ByG5d8D+D0qvpEks8mWQUEuJjufuHQ3cv9YGATcDPd/cklSZKk\nJWOQ2S0uAR4+SfkTpqhfwJHzD02SJElaHN5xT5KWkWluALVvkgvajZ5OS3LPVr5ze76prV+zmPFL\n0rgwSZak5WXiBlAPA9YCB7XrQ94MHFdVDwKuA45o9Y8Armvlx7V6krTimSRrSVmz4SzWbDhrscOQ\nxlabdWiyG0A9AZi4Q+rJdBdcQ3cDqJPb8hnAge2CbGlZmPje8LtDs2WSLEnLTJIdklwMXAucA3wT\nuL6qbmtVJm7yBD03gGrrbwDut7ARS3NnAqxRGeiOe5KkpaOqbgfWtuk7P0p346d56b9LqrSYTIq1\nEEySJWmZqqrrk5wH/Aawa5IdW29x702eJm4AtTnJjsB9gO9Psq8TgBMA1q1bVwsRvzRXJtEaBpNk\nSVpG2vz1P20J8sQNoN5Md7fUZwCnAocDH2ubnNmef7Gt/2ybylNadnqT5yuPfeqUZRKYJEvScjPV\nDaAuB05N8nrgP4ATW/0Tgfcl2QT8ADhsMYKWpHFjkixJy8g0N4D6FvDIScp/DDxzAUKTpCXFJFmS\nJI09xxlroZkkS5KkJc8kWsPmPMmSJElSH3uStaJ4FbMkLR32Dmsx2ZMsSZIk9bEnWUuevcOSJGnY\nTJK17Hm6TpIkzZZJspYVe5UlSdIwOCZZkiRJ6mOSLEmSJPWZcbhFknsBnwd2bvXPqKrXJdkXOBW4\nH3Ah8Lyq+kmSnYFTgF8Dvg/8flVdOaL4JUmSZs3rVTSTQXqSbwWeUFUPA9YCByV5NPBm4LiqehBw\nHXBEq38EcF0rP67VkyRJkpaMGZPk6tzUnu7UHgU8ATijlZ8MHNqWD2nPaesPTJKhRSwNyZoNZ935\nkCRJ6jXQmOQkOyS5GLgWOAf4JnB9Vd3WqmwG9mrLewFXAbT1N9ANyZAkSZKWhIGS5Kq6varWAnsD\njwQeMt8XTrI+ycYkG7dt2zbf3UmSJElDM6t5kqvq+iTnAb8B7Jpkx9ZbvDewpVXbAuwDbE6yI3Af\nugv4+vd1AnACwLp162rub0GSJGm4nHdfM/YkJ1mVZNe2/DPAk4ArgPOAZ7RqhwMfa8tntue09Z+t\nKpNgSZIkLRmD9CTvCZycZAe6pPr0qvpEksuBU5O8HvgP4MRW/0TgfUk2AT8ADhtB3NJQTfQY9PYW\n2IsgSdLKNWOSXFWXAA+fpPxbdOOT+8t/DDxzKNFJ8+CsFZI0PuyM0FLjHfckSZKkPibJkiRJUp9Z\nzW4hjQuHUkiShs3vFvWyJ1mSlpEk+yQ5L8nlSS5L8rJWfkySLUkubo+De7Z5dZJNSb6e5MmLF70k\njQ97kiVpebkNeEVVXZTk3sCFSc5p646rqr/rrZxkf7pZiB4K3B/4TJIHV9XtCxq11MMeXY0Dk2SN\nPRtLaXBVtRXY2pZvTHIFsNc0mxwCnFpVtwLfbtN3PhL44siDlaQxZpIsSctUkjV0U3heADwGOCrJ\n84GNdL3N19El0Of3bLaZ6ZNqaSTsENG4cUyyJC1DSXYBPgy8vKp+CBwPPBBYS9fT/JZZ7m99ko1J\nNm7btm3o8UrSuDFJlqRlJslOdAnyB6rqIwBVdU1V3V5VdwDv5q6bQW0B9unZfO9Wtp2qOqGq1lXV\nulWrVo32DUjSGHC4hSQtI0kCnAhcUVVv7Snfs41XBng6cGlbPhP4YJK30l24tx/wpQUMWSuMwyq0\nVJgkSz1svLUMPAZ4HvDVJBe3sqOBZydZCxRwJfASgKq6LMnpwOV0M2Mc6cwW0vYmu6W2lj+TZEla\nRqrqC0AmWXX2NNu8AXjDyIKSpCXIMcmSJElSH5NkaR7WbDjLIRqSJC1DJsmSJElSH8ckS5KkkfBM\nm5Yyk2RpliZr9HvLvPpZkqSlzyRZY8neB0nSuLFDZGUxSdbYMDGWJEnjYsYL95Lsk+S8JJcnuSzJ\ny1r5MUm2JLm4PQ7u2ebVSTYl+XqSJ4/yDUiSJEnDNkhP8m3AK6rqoiT3Bi5Mck5bd1xV/V1v5ST7\nA4cBD6W7xelnkjzYOzhJkiRpqZixJ7mqtlbVRW35RuAKYK9pNjkEOLWqbq2qbwObgEcOI1hJkiRp\nIcxqnuQka4CHAxe0oqOSXJLkvUnu28r2Aq7q2Wwz0yfVkiRJ0lgZOElOsgvwYeDlVfVD4HjggcBa\nYCvwltm8cJL1STYm2bht27bZbCpJkiSN1EBJcpKd6BLkD1TVRwCq6pqqur2q7gDezV1DKrYA+/Rs\nvncr205VnVBV66pq3apVq+bzHiRJkqShGmR2iwAnAldU1Vt7yvfsqfZ04NK2fCZwWJKdk+wL7Ad8\naXghS5IkSaM1yOwWjwGeB3w1ycWt7Gjg2UnWAgVcCbwEoKouS3I6cDndzBhHOrOF+k3Miexk7JIk\naRzNmCRX1ReATLLq7Gm2eQPwhnnEJUmSJC0a77gnSZKGxrunarkwSZYGYKMvSdLKYpIsDVlvQu2Y\na0lanmzrl79Z3UxEkiRJWglMkiVJkqQ+JsmSJElSH8cka1F5QZw0XEn2AU4B9qCbx/6Eqnpbkt2A\n04A1dHPbP6uqrms3jHobcDBwM/CCqrpoMWKXpHFiT7IkLS+3Aa+oqv2BRwNHJtkf2ACcW1X7Aee2\n5wBPobsz6n7AeuD4hQ9ZWj7WbDjLDqBlwiRZWiA2nFoIVbV1oie4qm4ErgD2Ag4BTm7VTgYObcuH\nAKdU53xg1yR7LnDYkjR2TJIlaZlKsgZ4OHABsEdVbW2rrqYbjgFdAn1Vz2abW5kkrWiOSdaCsRdV\nWjhJdgE+DLy8qn7YDT3uVFUlqVnubz3dcAxWr149zFAlaSzZkyxJy0ySnegS5A9U1Uda8TUTwyja\nv9e28i3APj2b793KtlNVJ1TVuqpat2rVqtEFL0ljwiRZkpaRNlvFicAVVfXWnlVnAoe35cOBj/WU\nPz+dRwM39AzLkKQVy+EWkrS8PAZ4HvDVJBe3sqOBY4HTkxwBfAd4Vlt3Nt30b5vopoB74cKGK0nj\nySRZkpaRqvoCkClWHzhJ/QKOHGlQkrQEmSRLkqR58cJsLUeOSZYkSZL62JMsSZJmzd5jLXf2JEuS\nJEl9ZkySk+yT5Lwklye5LMnLWvluSc5J8o32731beZK8PcmmJJckecSo34QkSZI0TIP0JN8GvKKq\n9gceDRyZZH9gA3BuVe0HnNueAzwF2K891gPHDz1qSZIkaYRmTJKramtVXdSWbwSuAPYCDgFObtVO\nBg5ty4cAp1TnfGDXibs8SZIkSUvBrMYkJ1kDPBy4ANij565MVwN7tOW9gKt6NtvcyiRJkqQlYeAk\nOckuwIeBl1fVD3vXtcnoazYvnGR9ko1JNm7btm02m0qSJEkjNdAUcEl2okuQP1BVH2nF1yTZs6q2\ntuEU17byLcA+PZvv3cq2U1UnACcArFu3blYJtrRUOEWSJC1/tvXL04xJcpIAJwJXVNVbe1adCRwO\nHNv+/VhP+VFJTgUeBdzQMyxDK0BvY3HlsU9dxEgkSZLmZpCe5McAzwO+muTiVnY0XXJ8epIjgO8A\nz2rrzgYOBjYBNwMvHGrEkiRJY84Oo6VvxiS5qr4AZIrVB05Sv4Aj5xmXJEmStGi8454kSZLUxyRZ\nkiRJ6jPQ7BaSJGnlcnytViJ7kiVJkqQ+JsmSJElSH5NkSZIkqY9jkiVJ0t1MdRc57y6nlcKeZEmS\nJKmPSbIkSZLUx+EWGhpPwUnjIcl7gd8Brq2qX25lxwB/CGxr1Y6uqrPbulcDRwC3A39aVf+64EFL\ny5hT6C1NJsnSmLAR1RCdBLwDOKWv/Liq+rvegiT7A4cBDwXuD3wmyYOr6vaFCFSSxpVJskbK3mVp\n4VXV55OsGbD6IcCpVXUr8O0km4BHAl8cUXjSimaHyNLhmGRJWjmOSnJJkvcmuW8r2wu4qqfO5lYm\nSSuaPcnSIrKnXQvoeOCvgWr/vgV40aAbJ1kPrAdYvXr1KOKTpLFiT7IkrQBVdU1V3V5VdwDvphtS\nAbAF2Ken6t6trH/7E6pqXVWtW7Vq1egDlqRFZpIsSStAkj17nj4duLQtnwkclmTnJPsC+wFfWuj4\nJGncONxCkpaZJB8CDgB2T7IZeB1wQJK1dMMtrgReAlBVlyU5HbgcuA040pktJMkkWZKWnap69iTF\nJ05T/w3AG0YXkSQtPQ63kCRJkvrMmCS3qYKuTXJpT9kxSbYkubg9Du5Z9+okm5J8PcmTRxW4JEmS\nNCqDDLc4Ce/cpCk4hZkkSVqOZuxJrqrPAz8YcH933rmpqr4NTNy5SZIkSVoy5nPh3lFJng9sBF5R\nVdfR3aXp/J463rlJ6mPvu6RxZfsk3WWuF+4dDzwQWAtspbtz06wkWZ9kY5KN27Ztm2MYkiRJ0vDN\nKUme752b2j68e5MkSZLG0pySZO/cJEmSpOVsxjHJ3rlJkqTloXfM8ZXHPnURI5HG34xJsndukiRp\nefOCPenuvC21Zs3GVJIkLXfelloaQ2s2nOWPEUmSFpE9ydIYc/ygJEmLwyRZkqQVyLNV48MOkfHk\ncAtJkiSpj0myJEmS1MfhFhqIp+UkSdJKYk+yJEmS1MeeZEmSpEXgWdrxZk+yJEmS1MckWZIkSerj\ncAtJkqQx4ZzJ48OeZEmSJKmPSbIkLTNJ3pvk2iSX9pTtluScJN9o/963lSfJ25NsSnJJkkcsXuSS\nND4cbqG78Wpback7CXgHcEpP2Qbg3Ko6NsmG9vxVwFOA/drjUcDx7V9JWtHsSZakZaaqPg/8oK/4\nEODktnwycGhP+SnVOR/YNcmeCxOpJI0ve5KlJcgLOzQHe1TV1rZ8NbBHW94LuKqn3uZWtrWnjCTr\ngfUAq1evHm2kkjQGTJKlJcJhMBqWqqokNcttTgBOAFi3bt2stpWkpcgkWZJWhmuS7FlVW9twimtb\n+RZgn556e7cyLXH+sJbmZ8YxyV4lLUnLwpnA4W35cOBjPeXPb+33o4EbeoZlSNKKNciFeycBB/WV\nTVwlvR9wbnsO218lvZ7uKmlJ0gJK8iHgi8AvJdmc5AjgWOBJSb4BPLE9Bzgb+BawCXg38MeLELIk\njZ0Zh1tU1eeTrOkrPgQ4oC2fDHyObiqhO6+SBs5PsuvE6b1hBSxJml5VPXuKVQdOUreAI0cbkaS5\n8CLtxTXXKeBme5W0JEmStGTMe57k1gsx6yudk6xPsjHJxm3bts03DGnFWrPhLC/QkSRpyOY6u8W8\nr5J2OiFJkobLH8zS8My1J9mrpCVJkrRszdiT3K6SPgDYPclm4HV0V0Wf3q6Y/g7wrFb9bOBguquk\nbwZeOIKYNQL2PkiSJN1lkNktvEpakiRJK8q8L9yTJEmSlhuTZEmSJKnPXGe3kCRJ0iLyZiOjZZIs\nSdIS5oXX0miYJK9gNqySJEmTM0mWJEkacw6tWHheuCdJkiT1sSd5BXKYhSRJ0vTsSZYkSZL62JMs\nSZK0hHhGeGHYkyxJkiT1sSdZkqQlxp5E9XP2i+EzSV4hbFBXFhtLSZLmx+EWkiRJUh97kqVlYpCz\nBfYwS5I0GJNkSZKkZcQOkeEwSZakFSTJlcCNwO3AbVW1LsluwGnAGuBK4FlVdd1ixShJ48AxyZK0\n8jy+qtZW1br2fANwblXtB5zbnkvSimaSLEk6BDi5LZ8MHLqIsUjSWJjXcAtP20nSklPAp5MU8I9V\ndQKwR1VtbeuvBvbo3yjJemA9wOrVqxcqVvVxOk9p4QxjTPLjq+p7Pc8nTtsdm2RDe/6qIbyOpDmY\n6UvVCzxWnN+sqi1JfgE4J8nXeldWVbUEmr7yE4ATANatW3e39ZK03Iziwr1DgAPa8snA5zBJXjAm\nPJKmU1Vb2r/XJvko8EjgmiR7VtXWJHsC1y5qkNqOvcfS4pjvmOSJ03YXtlNxMMBpO0nSwkvyc0nu\nPbEM/DZwKXAmcHirdjjwscWJUNKwrdlwlj+05mi+PclzOm0Hjm+TpEWwB/DRJNC1/x+sqk8l+TJw\nepIjgO8Az1rEGFcszwRK42VeSfJ8Tts5vk2SFlZVfQt42CTl3wcOXPiIJGl8zXm4haftJEmStFzN\npyfZ03ZjzjFImo6fD2l8+f9TWnxzTpI9bSdJkrQ0OOZ99rzjniRJC8wZB6TxN4p5kiVJ0gBMlLUY\n7FUejEmypDvZcErSymK7PzWT5GXAnghJkqThckyyJEmS1MeeZEmSFoBn/aSlxSRZkiRJd3Kccsck\neYmxJ0KSJI2COcb2TJIlSRoiEw1peTBJljQpT7dJklYyk+QlwF4JSZKkhWWSPAbssdO4m/iM+vmU\nJK0UzpMsSZIk9bEnWdLAPOshSVopTJIX0CAJhuOPJWlpsv2WlheT5EViY6qlbqbPsD3NkrT0reQz\niCbJkiTNYCUnCtKElfb/wCR5ROwplqSlaaZEYKUlCtJKZZIsaSRMJLSUTNWxMVOHhx0i0vI1siQ5\nyUHA24AdgPdU1bGjeq2FNtWXv42lNDkT5vG3nNtsmHyub9tsaThm+v81zDMyC/l9MpIkOckOwD8A\nTwI2A19OcmZVXT6K1xuGYfyxJM2dNyxZPEuxzZa0uFZC/jOqnuRHApuq6lsASU4FDgGWXIO7Ej4E\n0kIa5unrYSTU9nIDY9hmD3LGbqa/12SfJdt0aXQGGba0lMb5jypJ3gu4quf5ZuBRw36Rqf4YEwfY\nxlBa3uaaUNs23M1YtNnD2O84fcFKmr3J2onF+n+dqhr+TpNnAAdV1Yvb8+cBj6qqo3rqrAfWt6e/\nBHx96IEMZnfge4v02jMZ19iMa3bGNS4Y39iWWly/WFWrFjqYYRlSm73U/maLbVzjgvGNzbhmb1xj\nW+y4BmqzR9WTvAXYp+f53q3sTlV1AnDCiF5/YEk2VtW6xY5jMuMam3HNzrjGBeMbm3EtuHm32eN6\nbIxr9sY1NuOavXGNbVzj6neEXzImAAAgAElEQVSPEe33y8B+SfZNck/gMODMEb2WJGl+bLMlqc9I\nepKr6rYkRwH/Sjed0Hur6rJRvJYkaX5ssyXp7kY2T3JVnQ2cPar9D9GiD/mYxrjGZlyzM65xwfjG\nZlwLbAht9rgeG+OavXGNzbhmb1xjG9e4tjOSC/ckSZKkpWxUY5IlSZKkJWtFJMlJnpnksiR3JJn0\nasok+yQ5L8nlre7LetYdk2RLkovb4+CFiqvVOyjJ15NsSrKhp3zfJBe08tPaBTdDkWS3JOck+Ub7\n976T1Hl8zzG5OMmPkxza1p2U5Ns969YuVFyt3u09r31mT/lIjtmAx2ttki+2v/klSX6/Z91Qj9dU\nn5me9Tu397+pHY81Pete3cq/nuTJ84ljDnH9Wfs/eEmSc5P8Ys+6Sf+mCxjbC5Js64nhxT3rDm9/\n+28kOXzYsY2DJH+b5Gvtb/PRJLtOUe/KJF9tx2hjT/lA/3dHFVsWp40f9JgtaBufwb4Tf6mvff9h\nkpe3dSM5XoPG1uot6OdswGO24J+xQWNr9Rb6czaWecSsVNWyfwD/nW5ez88B66aosyfwiLZ8b+A/\ngf3b82OAVy5SXDsA3wQeANwT+EpPXKcDh7XldwF/NMTY/gbY0JY3AG+eof5uwA+An23PTwKeMYJj\nNlBcwE1TlI/kmA0SF/BgYL+2fH9gK7DrsI/XdJ+Znjp/DLyrLR8GnNaW92/1dwb2bfvZYQHjenzP\nZ+iPJuKa7m+6gLG9AHjHJNvuBnyr/XvftnzfUcW6WA/gt4Ed2/Kbp/m/dyWw+yTls2pThh0bi9PG\nDxLXgrfxDPDdM0mMV9PNLTuy4zWb2Bb6czZIXIvxGZtFbIvxORvLPGI2jxXRk1xVV1TVtDcrqaqt\nVXVRW74RuILuLlSLGhc9t4utqp8ApwKHJAnwBOCMVu9k4NAhhndI2+eg+34G8MmqunmIMUxmtnHd\nacTHbMa4quo/q+obbfm/gGuBUdyAYtLPzDTxngEc2I7PIcCpVXVrVX0b2NT2tyBxVdV5PZ+h8+nm\n610IgxyzqTwZOKeqflBV1wHnAAeNKM5FU1Wfrqrb2tO5/G3m/H93GLEtUhs/yDFb8DZ+wO+eXgcC\n36yq7wzj9aczh9j6jeRzNq55xKCxsTi5xLjmEQNbEUnybKU79fxw4IKe4qPaKbP3DvM04QAmu13s\nXsD9gOt7GuCJ8mHZo6q2tuWrgT1mqH8Y8KG+sje0Y3Zckp0XOK57JdmY5PyJUzeM9pjN6ngleSTd\nr/lv9hQP63hN9ZmZtE47HjfQHZ9Bth1lXL2OAD7Z83yyv+mwDBrb77W/0RlJJm6+McpjNq5exPZ/\nm14FfDrJhenu0jdhtm3KKGIDFq2NnyquxWrjZ2Oy9n2xvhMnLPbnbFpjlkfA4nzOxjWPGNjIpoBb\naEk+A/y3SVa9pqo+Nov97AJ8GHh5Vf2wFR8P/DXdf8q/Bt5C1+AtWFyjMF1svU+qqpJMOQ1Kkj2B\nX6GbY3XCq+n+U9yTbqqXVwF/tYBx/WJVbUnyAOCzSb5KlwjO2ZCP1/uAw6vqjlY85+O1HCV5LrAO\n+K2e4rv9Tavqm5PvYSQ+Dnyoqm5N8hK6npEnLODrj9wg7VWS1wC3AR+YYje/2f5OvwCck+RrVfX5\n3goz/R8ZYWyL0sYPEtewDfE78Z7A0+jaqAlzPl5DjG3on7NxzSOGGduwjWseMSzLJkmuqifOdx9J\ndqL7YH+gqj7Ss+9reuq8G/jEAsY11e1ivw/smmTH9gvwbreRnU9sSa5JsmdVbW0f3mun2dWzgI9W\n1U979j3x6/HWJP8EvHIh46qqLe3fbyX5HN0v+g8zj2M2jLiS/DxwFl3Ddn7Pvud8vCYx4y2Ge+ps\nTrIjcB+6z9Qg244yLpI8ka6B/a2qunWifIq/6bCS5EFuy/z9nqfvoRtvN7HtAX3bfm5IcS2omdqr\nJC8Afgc4sKom/cLr+Ttdm+SjdKd5Pw/Mpk0ZSWyL0cYPENdI2vhhfCc2TwEu6j1G8zlew4ptFJ+z\ncc0jhhTbgn/OFjOPGBaHWzRtXM6JwBVV9da+dXv2PH06cOkChjbp7WJbY3se3RgegMOBYf6aPLPt\nc5B9P5u+UyQTx6wd10MZ3jGbMa4k9504LZNkd+AxwOUjPmaDxHVP4KPAKVV1Rt+6YR6vQW4x3Bvv\nM4DPtuNzJnBYutkv9gX2A740j1hmFVeShwP/CDytqq7tKZ/0bzqkuAaNrbcdeBrdeEPoej5+u8V4\nX7qLtXp7Q5aFJAcBf073t5l0zGCSn0ty74llumMx8VmeTZsyitgWvI0fJC4Wr40f1JTte7PQ34mL\n9jkbIK5xzSNgcT5n45pHDK4W8arBhXrQfSA3A7cC1wD/2srvD5zdln+T7jTIJcDF7XFwW/c+4Ktt\n3ZnAngsVV3t+MN1Vst+k64GcKH8AXQKzCfhnYOchHrP7AecC3wA+A+zWytcB7+mpt4buV+c9+rb/\nbDtmlwLvB3ZZqLiA/9Fe+yvt3yNGfcwGjOu5wE97Pl8XA2tHcbwm+8zQnaZ6Wlu+V3v/m9rxeEDP\ntq9p230deMqQ/y/OFNdn2v+FieNz5kx/0wWM7U3AZS2G84CH9Gz7onYsNwEvHHZs4/Bo7+2qnr/N\nxOwove3oA9rx+Uo7Vr3t1aT/RxYwtsVo42eMa6rPXs/xHEV7Neh3z8/R9TTep2/7kRyvQWNbjM/Z\ngHEt+Gdsln/Phf6cjWUeMZuHd9yTJEmS+jjcQpIkSepjkixJkiT1MUmWJEmS+pgkS5IkSX1MkiVJ\nkqQ+JsmSJElSH5NkSZIkqY9J8hKQ5Mp2q97FjGF1kpuS7LCYcSx1Sb7QblMraZmyzV4+krw/yTGL\nHYcWh0nyMpPkqS0Ruz7J1UneM3H7zgG3vzLJLa1xnXjcv6q+W1W7VNXtrd7nkrx4SDEnyZuTfL89\n3txuQznr95fkWUn+X5Kbk3xuFjG8q+f9/iTJT3uef3KO7+v1SU6ay7ZzeK17JTkpyQ+TbE3ysmnq\nvijJRa3u5iRv6v0iTfKyJBe24/CehYhfWqmSPD7JV1ub9v0kH02y1yy2X4w2+/FJzktyQ5IrB6j/\nrCRXJLkxyeVJDu1Z94Ikt/fFf8AA+zy6p/6P+/Zx2Rzf14tn870xH0kObH+THybZNED9Zyf5WjuG\nlyb53Snq/d8k3iVuSEySl5/7AK+nux3lfwf2Av52lvv43da4Tjz+a9hB9llPd1/2hwG/Cvwu8JIp\n6s70/n4A/H/AsbMJoKpeOvF+gTcCp/W8/6f010+y42z2vwD+mu7WnquBJwFHT9OTdS/gT4DdgUcD\nTwH+V8/6LXS3Yz5pRLFKusvlwJOrale6du0bwPGz3MdCt9k/At4L/O+ZKraE//3AnwE/37b5YJJf\n6Kn2xb74PzfTfqvqjT1t9kv79vHQSeIYtzb7R8B7gFfNVDHJauBk4E/pjuHRwGlJ7tdX73Bg0g4m\nzY1J8tLx6+0X+HVJ/inJvSarVFUfrKpPVdXNVXUd8G7gMfN98SRrklSSHZO8AXgs8I72q/0d89z9\n4cBbqmpzVW0B3gK8YLKKM72/qvpMVZ0ODPVLIsmD2vt/YZLvAp9O8sT+XpTWM3tAkt8B/hx4TjtG\nF/ZU27f1dt+Y5FNJdhtCiM8H/qqqrq+qS+m+wF4wWcWqemdV/XtV/aSqNgMfZPtjeEZVfYzuB4ek\nuRm0zb6mL6m9HXjQfF98lG12VX2pqt4HfGuA6nsD11fVJ6tzFl2C+MD5xDCT9r4ryR+3ntqvTbTj\nffW+0HqzfwV4B/DYdoy+11NttySfbG32F5PsO9/4qur8qno/8O0Bqu8DfK+qPt2O4ZnArcADet7H\nfYHXABvmG5vuYpK8dDwHeDJdw/Jg4P8MuN3jgDtPPSXZkOQT8wmkql4D/BtwVPvVflTb9yXtlOFk\nj3dOs8uHAl/pef6VVjaI7d7fAngc8BDgqdNVqqpPAH8DfKAdo1/rWf0HdD8M9gB+jq6HhSQ7THP8\nrk/yysleK8kq4BdYOsdQWgkGbrPTjR++HrgFeCVd2zGxbhzb7NnYCFyR5GmtjTuULsG7pKfOw5N8\nL8l/JnntkHt9nwb8OvAr01Wqqq8CRwH/1o7R7j2r/wB4LbAb8F26M3cAJLlsmmP49iG9hwuAb6Yb\nbrhDkt8DbgQu7alzLPD3wLVDek0B43b6QVN7R1VdBdB6Bf6eGRLlJE+iS8YeNVFWVYMMQ/iXJLe1\n5c9V1aHT1r5r3786SL1J7ALc0PP8BmCXJKmqKcdWTfb+FsDrqurm9vpz3ceJVfWNto9/Bn4boI0d\n3HUO+9ul/dt/DGcci57kD+mGuDx/Dq8raWoDt9lV9V1g13ZW6Q+Br/WsG8c2e2BVdXuSU+jOWN0L\n+AnwzKr6UavyeeCXge/Q/bA/DbgNeNOQQnhjO+s4nzb7jKra2PbxAboheQBMNrRj2KrqtnYMTwd2\npvuR8XtVdUuL6VF0PwT+GJh3L7fuYk/y0nFVz/J36MauTSnJo+kapWdU1X/O8rUOrapd22OgxnZQ\n2f5ii3e14pvoxllN+HngphkS5Pm8v/m4auYqM7q6Z/lm7kpyB5LuYsWJY/jndMcP7n4Mb5xhP79H\n1yPylKpyaIU0XLNqswHa/8OTgY/Nsjd1odvs2Wz/RLqe8QOAewK/BbwnyVqAqvpWVX27qu5ovbl/\nBTxjeO9gLNrs1/Ycw1kPdUlyEF1i/li6Y/gE4KQkv5LkHsA7gT+ZuEhTw2OSvHTs07O8mmnG3CZ5\nOHAm8KKqOndE8dwtgW2nnW6a4vEu2P5ii6p6adv0MrqL9iY8jGlO/y/Q+5tUX+L+I+Bne+LaEei9\nkGJWVxi302hTHb+JhJiqenHPMfybqtoGbGN2x/CpdBcHPbWqHGohDd/AbXafHemGT/38TBVnaZht\n9mysBT5fVRtbIvxluuEDU11YXAz34rP+NpskP9tT9t+mqDuQJF+f5hi+A6Cq/rrnGB41h/ewlu4M\nwUXtGF5AN4zlQLohIGuBDye5Gvhii+vqJP9jDq+lHg63WDqObOPSbqYbnH/aZJWS/DLwKbpflR8f\nYTzX0HPRAMzrtNMpwJ8lOZuukXoF3anJu5np/aWbymwnus/2PdJdLHN7Vf20rb8SOKaqTppjrL2+\nBtw7yZOBzwJ/0V57wjV0F4FMO2xkQusFmFUPRY9TgNcm+Q+6HqsXAc+drGIbpnIK8LSqunCS9TvS\nHb8dgB3aMfypvRTSrAzaZv9Puh+036D7kf1W4D9GcHZnaG126728J117l9ZG3FFVP5mk+peBDUnW\nVtXFrZPjsXS9nyR5CnBRVV2T5CF0Y3//uee1PkeXIB4zl1j7XN0ez01yInAE8Is9668B9k6y08R3\nxkyq6pfmEsgUx/D2KV73y3Tfkb9aVZckWUd3sfVbge/TzfI0YQ1dorwWL76eN3uSl44PAp+mu5r4\nm3TToE3mFcAq4MSeX7O9F+4dnTnO+9vnbcAz0l25Pd+LE/4R+DjwVboLEc5qZQC09/DY9nTa9wc8\nj+7il+PpGuJb6GbAIMk96b6Ezp9nvAC0cW5/Qnd6dAtdg9R7Wu40ukbwB0m+NIzXnMZr6U4rXkWX\nsL+pqj4DkOQB7ThNnO79C7qp9P615xj2/uA4hrsuIHpBW371iOOXlptB2+y96H7430jXBt4BPH1i\n5Zi22Y+jaxfOpuslv4XuvQJ39lA/B6Cq/i9dm3JGkhuBD9ONE56ofyBwSZIftf19hJ4xv3Q98v8+\nz3hpsRTdmO+jge/RzSJyQU+Vc+h+rFzTemVH6Ql0x+1Muh8vtwB3/p1bD/Xvt7jPpfv8fLQdw9OA\nv6yqz7bZLq6eeLT3RXs+2Y8WzUIG6OCSloUkvwkcWVXPXuxYJEnTS7I3cHpVOWxAi8IkWZIkSerj\ncAtJkiSpj0myJEmS1MckWZIkSepjkixJkiT1GYt5knffffdas2bNYochSXNy4YUXfq+qVi12HAvF\nNlvSUjZomz0WSfKaNWvYuHHjYochSXOS5DuLHcNCss2WtJQN2mY73EKSJEnqY5IsSZIk9TFJliRJ\nkvqYJEuSJEl9TJIlSZKkPibJkiRJUh+TZEmSJKmPSbIkSZLUZyxuJiItJWs2nHXn8pXHPnURI5Gk\n5c32VovJJFmSJC06E2KNG5NkSVpGktwL+DywM10bf0ZVvS7JvsCpwP2AC4HnVdVPkuwMnAL8GvB9\n4Per6spFCV6ahkm0FppjkiVpebkVeEJVPQxYCxyU5NHAm4HjqupBwHXAEa3+EcB1rfy4Vk+SVjx7\nkiVpGamqAm5qT3dqjwKeAPxBKz8ZOAY4HjikLQOcAbwjSdp+pKHr7RGWxpk9yZK0zCTZIcnFwLXA\nOcA3geur6rZWZTOwV1veC7gKoK2/gW5IhiStaPYkS9IyU1W3A2uT7Ap8FHjIfPeZZD2wHmD16tXz\n3Z00LXubNQ5MkrVieRGIlruquj7JecBvALsm2bH1Fu8NbGnVtgD7AJuT7Ajch+4Cvv59nQCcALBu\n3TqHYmjs2KZr2BxuIUnLSJJVrQeZJD8DPAm4AjgPeEardjjwsbZ8ZntOW/9ZxyNLkj3JkrTc7Amc\nnGQHuo6Q06vqE0kuB05N8nrgP4ATW/0Tgfcl2QT8ADhsMYKWpHFjkixx12k6T9FpqauqS4CHT1L+\nLeCRk5T/GHjmAoQmSUuKwy0kSZKkPvYkS5KkkfBiOi1l9iRLkiRJfWZMkpPcK8mXknwlyWVJ/rKV\n75vkgiSbkpyW5J6tfOf2fFNbv2a0b0GSJEkarkF6km8FnlBVDwPWAgcleTTwZuC4qnoQcB1wRKt/\nBHBdKz+u1ZMkSVpwazacdedDmo0Zk+Tq3NSe7tQeBTwBOKOVnwwc2pYPac9p6w9MkqFFLI0RG15J\nkpangS7ca/NtXgg8CPgH4JvA9e3OTQCbgb3a8l7AVQBVdVuSG4D7Ad8bYtzSyJn8StLSZPutYRjo\nwr2qur2q1tLdyvSRwEPm+8JJ1ifZmGTjtm3b5rs7SZIkaWhmNbtFVV1Pd2vT3wB2TTLRE703sKUt\nbwH2AWjr7wN8f5J9nVBV66pq3apVq+YYviRJkjR8g8xusSrJrm35Z4AnAVfQJcvPaNUOBz7Wls9s\nz2nrP1tVNcygpUE4XliSJM3VIGOS9wRObuOS7wGcXlWfSHI5cGqS1wP/AZzY6p8IvC/JJuAHwGEj\niFuSJK1Q3qREC2HGJLmqLgEePkn5t+jGJ/eX/xh45lCik5YgG29JkpY+b0utFWVUwy8c1iFJi8P2\nV6Nikqwlb6aeWxtQSRotz6BpOZrV7BaSJEnSSmBPstTDXmdJkgT2JEuSJEl3Y5IsSZIk9XG4hSRJ\nGjmHs2mpsSdZkiRJ6mOSLEmSJPUxSZYkSZL6mCRL0jKSZJ8k5yW5PMllSV7Wyo9JsiXJxe1xcM82\nr06yKcnXkzx58aKXpPHhhXuStLzcBryiqi5Kcm/gwiTntHXHVdXf9VZOsj9wGPBQ4P7AZ5I8uKpu\nX9CoteR4IZ6WO3uSJWkZqaqtVXVRW74RuALYa5pNDgFOrapbq+rbwCbgkaOPVJLGmz3JWlbs2ZDu\nkmQN8HDgAuAxwFFJng9spOttvo4ugT6/Z7PNTJJUJ1kPrAdYvXr1SOOWpHFgT7IkLUNJdgE+DLy8\nqn4IHA88EFgLbAXeMpv9VdUJVbWuqtatWrVq6PFq6Viz4Sw7JLQi2JMsSctMkp3oEuQPVNVHAKrq\nmp717wY+0Z5uAfbp2XzvVibNiQm0lguTZElaRpIEOBG4oqre2lO+Z1VtbU+fDlzals8EPpjkrXQX\n7u0HfGkBQ5YWTG8Cf+WxT13ESLQUmCRL0vLyGOB5wFeTXNzKjgaenWQtUMCVwEsAquqyJKcDl9PN\njHGkM1tIkkmyJC0rVfUFIJOsOnuabd4AvGFkQUnSEjTjhXtOTC9JkqSVZpCeZCem19jxwhBJkjRK\nM/YkOzG9JEmSVppZzZPcNzE9dBPTX5LkvUnu28r2Aq7q2WzSieklSZKkcTVwkjzsiemTrE+yMcnG\nbdu2zWZTSZIkaaQGSpKnmpi+qm6vqjuAd3PXkIqBJqb37k2SJEkaVzNeuOfE9NLcOXG9JElL0yCz\nWzgxvSRJklaUGZNkJ6aXJEnSSjOr2S0kSZKklcAkWZIkSepjkiwtkDUbzvJOgZIkLREmyZIkSVIf\nk2RJkqRpeCZwZRpkCjhJkqRlZSLp7Z3D3rnt1cueZEmSJKmPPckae/6ylyRJC80kWZIkrViONdZU\nHG4hjSEvEpEkaXHZk6yxMUhSaOIoSZIWgkmyJEmalh0UWokcbiFJy0iSfZKcl+TyJJcleVkr3y3J\nOUm+0f69bytPkrcn2ZTkkiSPWNx3IEnjwZ5kSVpebgNeUVUXJbk3cGGSc4AXAOdW1bFJNgAbgFcB\nTwH2a49HAce3f7UCOZuQdBeTZGlMeDpTw1BVW4GtbfnGJFcAewGHAAe0aicDn6NLkg8BTqmqAs5P\nsmuSPdt+JGnFcriFJC1TSdYADwcuAPboSXyvBvZoy3sBV/VstrmVSdKKZpIsSctQkl2ADwMvr6of\n9q5rvcY1y/2tT7IxycZt27YNMVJJGk8myZK0zCTZiS5B/kBVfaQVX5Nkz7Z+T+DaVr4F2Kdn871b\n2Xaq6oSqWldV61atWjW64CVpTJgkS9IykiTAicAVVfXWnlVnAoe35cOBj/WUP7/NcvFo4AbHI0uS\nF+5Ji8qL9TQCjwGeB3w1ycWt7GjgWOD0JEcA3wGe1dadDRwMbAJuBl64sOFK0niaMUlOsg9wCt1F\nHgWcUFVvS7IbcBqwBrgSeFZVXdd6Md5G1+jeDLygqi4aTfiSpF5V9QUgU6w+cJL6BRw50qAkaQka\nZLjFxJyb+wOPBo5Msj/dHJvnVtV+wLntOWw/5+Z6ujk3JUmSpCVjxp5k59yUJGnlcTiYVrpZjUme\n55yb2yXJSdbT9TSzevXqWYYtLV1+8UiSNP4Gnt1i2HNuOp2QJEmSxtVAPcnTzblZVVvnMuemJEnS\nuPKsn2bsSXbOTUmSJK00g/QkO+emJEmSVpRBZrdwzk1JkiStKN6WWpIkSerjbaklSVrhvEhtML3H\n6cpjn7qIkWgh2JMsSZIk9TFJliRJkvqYJEuSJEl9TJIlSZKkPibJkiRJUh9nt5AkSZolZ7pY/kyS\nJUlagZz2TZqeSbIWlY20JEkaR45JliRJkvrYk6xFYQ+yJEkaZ/YkS5IkSX1MkiVpmUny3iTXJrm0\np+yYJFuSXNweB/ese3WSTUm+nuTJixO1JI0Xk2RJWn5OAg6apPy4qlrbHmcDJNkfOAx4aNvmnUl2\nWLBIJWlMOSZZkpaZqvp8kjUDVj8EOLWqbgW+nWQT8EjgiyMKT4vI60GkwZkkS9LKcVSS5wMbgVdU\n1XXAXsD5PXU2tzJJA/LGIsuTwy0kaWU4HnggsBbYCrxlNhsnWZ9kY5KN27ZtG0V8kjRW7EnWgvE0\nn7R4quqaieUk7wY+0Z5uAfbpqbp3K+vf/gTgBIB169bV6CKVpPEwY0+yV0lL0tKXZM+ep08HJtr0\nM4HDkuycZF9gP+BLCx2fJI2bQXqSTwLeAZzSV35cVf1db0HfVdL3Bz6T5MFVdfsQYpUkDSDJh4AD\ngN2TbAZeBxyQZC1QwJXASwCq6rIkpwOXA7cBR9pmS9L/397dxlxSlgcc/19iwVStgKzb9QUWEmpr\n0hTNE9u0pL5gFWjD0tRSTExXpaGxmtQ0pq7xQ43GFNq0TZs26vpS8KW+FEvcFqwiQvhSrGgQQUUW\nxMh22aVofYkpFb36Ye4Hh9nzPM+cc2bmzHnO/5ecnDlz5sy5cs997rnOPffMtEiSPUtakpZLZr5s\nwuz3bLL824C39ReRJC2fecYke5a01LM247g9k1qSpO7NenWLuc6SBs+UliRJ0njNlCRn5pHM/FFm\n/hh4F9WQCmh5lnRZx/7MXMvMtR07dswShiRJktSLmZJkz5KWJEk61u5913jJ021iyzHJniUtSdLy\nMmGTZtPm6haeJS1JkjQjb1u9nLwttSRJktTgbanVKw/zSZKkZWRPsiRJktRgkixJkiQ1mCRLS87L\nDUmS1D2TZEmSJKnBJFmSJElq8OoWkiRJHXMY3PIzSVbnbBgkSdKyc7iFJEmS1GBPsiRJ24S3P5a6\nY0+yJEmS1GBPsiRJ25Dnh0jzMUmWtgkPs0qS1B2HW0iSJEkN9iRravZYSpKk7c6eZEnaZiLivRFx\nNCJur807OSKui4i7yvNJZX5ExN9FxMGIuC0inrO4yCVpPEySpW1o975rHnloJV0BnNuYtw+4PjPP\nBK4vrwHOA84sj0uBtw8Uo7TybKfHzeEWmotDL6TxycybImJ3Y/Ye4Pll+krgRuANZf77MjOBmyPi\nxIjYlZmHh4lWksZpy55kD9upLf8RS6O2s5b43g/sLNNPA75ZW+6+Mu9RIuLSiLglIm554IEH+o1U\nkkagzXCLK/CwnSRtG6XXOKf8zP7MXMvMtR07dvQUmWbh8CqpH1smyZl5E/Ctxuw9VIfrKM8X1ua/\nLys3AydGxK6ugpUkzezIentcno+W+YeAZ9SWe3qZJ0krbdYT9+Y6bCdpMexxWmkHgL1lei/w8dr8\n3y/D5X4F+I7jkSWpgxP3MjMjYqrDdlCNb6MaksGpp546bxiSNmBCvHoi4kNUJ+mdEhH3AX8GXAZ8\nNCIuAb4BXFQWvxY4HzgI/AB45eABS9IIzZokH1k/+3nWw3aZuR/YD7C2tjZ1ki1JmiwzX7bBW+dM\nWDaB1/QbkSQtn1mHW3jYTpIkSdvWlj3JHraTJEnqhkPglseWSbKH7SRJkrRqvC21JEmS1OBtqSVJ\nkhaoPgTj3st+c4GRqM6eZEmSJKnBnmS14okGkiRplZgka1Mmx5I0HrbJ0nBMkiVJGjETY2kxHJMs\nSZIkNZgkS5IkSQ0Ot88nS6MAAAx3SURBVJBW1PohXC83JEnj4eXgxsOeZEmSJKnBnmQdw5NEJEnS\nqrMnWZIkSWqwJ1mSpBFwLKqarBOLZZIsrTgbYUmSjuVwC0mSJKnBJFmSJElqcLiFJEkj41WGpMUz\nSZb0CMcnS5JUMUmWpBUSEfcC3wN+BDycmWsRcTLwEWA3cC9wUWZ+e1ExStIYOCZZklbPCzLzrMxc\nK6/3Addn5pnA9eW1JK20uXqS7ZGQti+HXqyUPcDzy/SVwI3AGxYVjCSNQRc9yfZISNLySOBTEfH5\niLi0zNuZmYfL9P3AzsWEJknj0ceYZHskloRnT0sr6ezMPBQRTwGui4iv1t/MzIyIbH6oJNSXApx6\n6qnDRCrpER7dG968SfJ6j0QC78zM/bTskbDBlaThZeah8nw0Iq4GngsciYhdmXk4InYBRyd8bj+w\nH2Btbe2YJFrScCZ1cpk4d2/e4RZnZ+ZzgPOA10TEr9ffzMykSqSPkZn7M3MtM9d27NgxZxiSpK1E\nxOMj4onr08CLgduBA8Desthe4OOLiXB17N53jUfzpJGbqyd51h4JLZYNs7rgob+ltBO4OiKgav//\nKTP/PSI+B3w0Ii4BvgFctMAYJWkUZk6SSy/EYzLze7Ueibfwkx6Jy7BHQpJGIzPvAX5pwvwHgXOG\nj2i1TOqgsNNCGq95epLtkZAE2KssSdp+Zk6S7ZGQVsd6EmwCLElaFd6WekV4SE9dsB5JklaFt6WW\nJEmSGkySJUmSpAaTZEmSJKnBJHmJeTF6jZH1UpK0HXji3jZmoiJJ0mrYaJ/vVYlmZ5IsSVJP7KyQ\nlpdJsiRJM/JGOho7r3M/O8ckS5IkSQ0myZIkSVKDwy22Gce/SdJieFhb2l5MkpeAY960nViftey2\n6oyws0LaHhxuIUmSJDXYk7xkJvVQ2GshSf2wfdV259G9jZkkS+qFyYWWiYmCVontczsmySMwqXG2\nAmsVmJhI0jDMK6ZnkixJUo3JhCTwxD1JkiRR/UH0T+JP2JMsaRS8xqwkjYND4Sq9JckRcS7wt8Bx\nwLsz87K+vmsZbfRPzX9wWnUbNc5b/TZWuSHvwnZss7c638M6I01no3Z4u/6WekmSI+I44B+A3wDu\nAz4XEQcy88t9fN/Y2UMm9c/kZ3a22ZI20qbzbqs8Z1nb5756kp8LHMzMewAi4sPAHmClG1x7iaXp\n+JsZzFK32dPsxKf9nKTV1VeS/DTgm7XX9wG/3PWXdPnPpE1j6eXZpOUw6290mXo4OjZ4m103qdxt\nZ6XlM0Sv85C90gs7cS8iLgUuLS+/HxEPAv898/ou7ySszZwSl88eX89OYY6y65mxzcbYZjNXbHO0\nI6fN/MklMaHNvrOzdT+63MdWv8YWD4wvprHFA+OLaWzxwIJjmtDeHhPPVm1y3212X0nyIeAZtddP\nL/MekZn7gf3rryPilsxc6ymeuY05PmObjbHNxti2panb7L6MbRuOLR4YX0xjiwfGF9PY4oHxxTS2\neKC/6yR/DjgzIk6PiOOBi4EDPX2XJGk+ttmS1NBLT3JmPhwRrwU+SXU5ofdm5h19fJckaT622ZJ0\nrN7GJGfmtcC1U3yk98N4cxpzfMY2G2ObjbFtQzO02X0Z2zYcWzwwvpjGFg+ML6axxQPji2ls8RCZ\nuegYJEmSpFHpa0yyJEmStLQGTZIj4ncj4o6I+HFETDyDMSKeERE3RMSXy7J/XHvvzRFxKCJuLY/z\nh4ytLHduRNwZEQcjYl9t/ukR8dky/yPl5JfORMTJEXFdRNxVnk+asMwLamVza0T8b0RcWN67IiK+\nXnvvrCFjK8v9qPb9B2rzeyu7luV2VkT8R9n+t0XE79Xe67zcNqpDtfdPKOVwsJTL7tp7byzz74yI\nl8wbywyx/Un5bd4WEddHxGm19yZu3wFje0VEPFCL4Q9q7+0tdeCuiNjbdWxqLyL+MiK+WurQ1RFx\n4oRlntloy74bEa8r73W+H2gTU1nu3oj4UvneW2rzW7WBXcYTA+8rpyijQfaR0S6fGLoetc0jhqpH\no8u5piijheRax8jMwR7ALwDPBG4E1jZYZhfwnDL9ROBrwLPK6zcDr19gbMcBdwNnAMcDX6zF9lHg\n4jL9DuDVHcf3F8C+Mr0PuHyL5U8GvgX8dHl9BfDSnsquVWzA9zeY31vZtYkN+DngzDL9VOAwcGIf\n5bZZHaot80fAO8r0xcBHyvSzyvInAKeX9Rw3cGwvqNWpV6/Httn2HTC2VwB/P+GzJwP3lOeTyvRJ\nfcXqY8tt+WLgsWX68hZt2XHA/cBp5fWb6Xg/0DYm4F7glAnzp2qfu4iHgfeVLWMabB9Ji332AupR\nq5gGrEejy7laxrSwXKv5GLQnOTO/kpmbXoA+Mw9n5hfK9PeAr1DdDWrhsVG7dWtm/h/wYWBPRATw\nQuCqstyVwIUdh7inrLft+l8KfCIzf9BxHJNMG9sjBii7LWPLzK9l5l1l+r+Ao8CODmOom1iHNon5\nKuCcUk57gA9n5kOZ+XXgYFnfYLFl5g21OnUz1fV0h9Cm3DbyEuC6zPxWZn4buA44t6c4tYXM/FRm\nPlxetqlD5wB3Z+Y3RhRT08xt4KzxDL2vbFlGg+0jW+6z64aoR9PG1NR1PRpdzrUEudajjHpMclSH\nmZ8NfLY2+7XlcM975z0UMYNJt259GvBk4H9qDcj6/C7tzMzDZfp+YOcWy18MfKgx722l7P4mIk5Y\nQGyPi4hbIuLmKMNA6L/spiq3iHgu1T/Xu2uzuyy3jerQxGVKuXyHqpzafLbv2OouAT5Rez1p+w4d\n2++UbXVVRKzfHKPvctPsXsWj69Akk9qyPvcDm8WUwKci4vNR3YFw3bTtc1fxAAvZV24U0yL3kVsZ\nuh5tZhH1aEsjyrlGU486vwRcRHwa+NkJb70pMz8+xXqeAHwMeF1mfrfMfjvwVqoK9lbgr6h+rIPG\n1pfN4qu/yMyMiA0vSxIRu4BfpLrm6bo3Uv3ojqe6zMobgLcMHNtpmXkoIs4APhMRX6JKAOfScbm9\nH9ibmT8us+cqt+0qIl4OrAHPq80+Zvtm5t2T19CLfwU+lJkPRcQfUvUyvHDA71fRpq2NiDcBDwMf\n3GQ9xwMXUP0O1820H+goprNLHX8KcF1EfDUzb6ovsFU703E8ne4ru4qpKx3mE4PWoxYGrUdtDF2P\nlkXnSXJmvmjedUTET1FtrA9m5r/U1n2ktsy7gH8bOLaNbt36IHBiRDy2/MM55pau88YXEUciYldm\nHi7J3NFNVnURcHVm/rC27vV/pw9FxD8Crx86tsw8VJ7viYgbqf6xfow5y66L2CLiZ4BrqH7EN9fW\nPVe5TbDl7X9ry9wXEY8FnkRVx9p8tu/YiIgXUf0BeV5mPrQ+f4Pt21WS3Oa2yQ/WXr6banzf+mef\n3/jsjR3FpQm2amsj4hXAbwHnZOZmicB5wBfqbf+s+4EuYqrV8aMRcTXVYeGbgGna587i6Xpf2UFM\nne4ju8gnisHqUct1DFaP2hi6HrXQa641jdENtyhjTt4DfCUz/7rx3q7ay98Gbh8yNja4dWtpLG6g\nGgcMsBfo+t/SgbLeNut/GY3DSutlV8r3Qrotuy1ji4iT1ocqRMQpwK8BXx6g7NrEdjxwNfC+zLyq\n8V7X5dbm9r/1mF8KfKaU0wHg4qiufnE6cCbwn3PGM1VsEfFs4J3ABZl5tDZ/4vYdOLZ6+3AB1dg6\nqI6ovLjEeBLVCUj1oywaUEScC/wpVR3a6pyJDduyopP9QJuYIuLxEfHE9WmqerT+3dO0z13FM+i+\nsuV2W+Q+cjOD1KM2hqxHLeMZY841nnqUPZ4V2HxQFfJ9wEPAEeCTZf5TgWvL9NlUXfu3AbeWx/nl\nvfcDXyrvHQB2DRlbeX0+1dmfd1P1Oq7PP4MqYTkI/DNwQsdl92TgeuAu4NPAyWX+GvDu2nK7qf5Z\nPabx+c+Usrsd+ADwhCFjA361fP8Xy/MlQ5Rdy9heDvywVt9uBc7qq9wm1SGqIRwXlOnHlXI4WMrl\njNpn31Q+dydwXpd1rGVsny6/j/VyOrDV9h0wtj8H7igx3AD8fO2zryrleRB4Zdex+ZhqOx6kGm+4\nXofWr+TSbGsfT9Vz9KTG5zvfD7SJqbRTXyyPO3h0+z+xnek5nkH3lVNst0H2kbTfZw9Zj9rkOEPW\no9HlXFNst4XkWs2Hd9yTJEmSGkY33EKSJElaNJNkSZIkqcEkWZIkSWowSZYkSZIaTJIlSZKkBpNk\nSZIkqcEkWZIkSWowSZYkSZIa/h8oNZIxvfbiUgAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "%matplotlib inline\n", "\n", "bmeans = df.mean()\n", "fig = plt.figure(figsize=(12,8))\n", "for i in range(K):\n", " fig.add_subplot(2,2,i+1)\n", " plt.hist(df['b'+str(i)], bins=100)\n", " plt.plot()\n", " plt.title('b {}: Fit={}, Truth={}'.format(i, np.round(bmeans[i],2), np.round(Beta[i],2)))\n", " " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python [py35]", "language": "python", "name": "Python [py35]" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.2" } }, "nbformat": 4, "nbformat_minor": 0 }