{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Week 5_2: Boosting a decision stump\n", "\n", "The goal of this notebook is to implement your own boosting module.\n", "\n", "**Brace yourselves**! This is going to be a fun and challenging assignment.\n", "\n", "\n", "* Use SFrames to do some feature engineering.\n", "* Modify the decision trees to incorporate weights.\n", "* Implement Adaboost ensembling.\n", "* Use your implementation of Adaboost to train a boosted decision stump ensemble.\n", "* Evaluate the effect of boosting (adding more decision stumps) on performance of the model.\n", "* Explore the robustness of Adaboost to overfitting.\n", "\n", "Let's get started!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Fire up GraphLab Create" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Make sure you have the latest version of GraphLab Create **(1.8.3 or newer)**. Upgrade by\n", "```\n", " pip install graphlab-create --upgrade\n", "```\n", "See [this page](https://dato.com/download/) for detailed instructions on upgrading." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "[INFO] SFrame v1.8.3 started. Logging /tmp/sframe_server_1458296777.log\n" ] } ], "source": [ "import sframe\n", "import matplotlib.pyplot as plt\n", "%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Getting the data ready" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We will be using the same [LendingClub](https://www.lendingclub.com/) dataset as in the previous assignment." ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false }, "outputs": [], "source": [ "loans = sframe.SFrame('lending-club-data.gl/')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Extracting the target and the feature columns\n", "\n", "We will now repeat some of the feature processing steps that we saw in the previous assignment:\n", "\n", "First, we re-assign the target to have +1 as a safe (good) loan, and -1 as a risky (bad) loan.\n", "\n", "Next, we select four categorical features: \n", "1. grade of the loan \n", "2. the length of the loan term\n", "3. the home ownership status: own, mortgage, rent\n", "4. number of years of employment." ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false }, "outputs": [], "source": [ "features = ['grade', # grade of the loan\n", " 'term', # the term of the loan\n", " 'home_ownership', # home ownership status: own, mortgage or rent\n", " 'emp_length', # number of years of employment\n", " ]\n", "loans['safe_loans'] = loans['bad_loans'].apply(lambda x : +1 if x==0 else -1)\n", "loans.remove_column('bad_loans')\n", "target = 'safe_loans'\n", "loans = loans[features + [target]]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Subsample dataset to make sure classes are balanced" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Just as we did in the previous assignment, we will undersample the larger class (safe loans) in order to balance out our dataset. This means we are throwing away many data points. We use `seed=1` so everyone gets the same results." 
] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Percentage of safe loans : 0.502236174422\n", "Percentage of risky loans : 0.497763825578\n", "Total number of loans in our new dataset : 46508\n" ] } ], "source": [ "safe_loans_raw = loans[loans[target] == 1]\n", "risky_loans_raw = loans[loans[target] == -1]\n", "\n", "# Undersample the safe loans.\n", "percentage = len(risky_loans_raw)/float(len(safe_loans_raw))\n", "risky_loans = risky_loans_raw\n", "safe_loans = safe_loans_raw.sample(percentage, seed=1)\n", "loans_data = risky_loans_raw.append(safe_loans)\n", "\n", "print \"Percentage of safe loans :\", len(safe_loans) / float(len(loans_data))\n", "print \"Percentage of risky loans :\", len(risky_loans) / float(len(loans_data))\n", "print \"Total number of loans in our new dataset :\", len(loans_data)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Note:** There are many approaches for dealing with imbalanced data, including some where we modify the learning algorithm. These approaches are beyond the scope of this course, but some of them are reviewed in this [paper](http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5128907&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F69%2F5173046%2F05128907.pdf%3Farnumber%3D5128907 ). For this assignment, we use the simplest possible approach, where we subsample the overly represented class to get a more balanced dataset. In general, and especially when the data is highly imbalanced, we recommend using more advanced methods." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Transform categorical data into binary features" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this assignment, we will work with **binary decision trees**. Since all of our features are currently categorical features, we want to turn them into binary features using 1-hot encoding. \n", "\n", "We can do so with the following code block (see the first assignments for more details):" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": true }, "outputs": [], "source": [ "loans_data = risky_loans.append(safe_loans)\n", "for feature in features:\n", " loans_data_one_hot_encoded = loans_data[feature].apply(lambda x: {x: 1}) \n", " loans_data_unpacked = loans_data_one_hot_encoded.unpack(column_name_prefix=feature)\n", " \n", " # Change None's to 0's\n", " for column in loans_data_unpacked.column_names():\n", " loans_data_unpacked[column] = loans_data_unpacked[column].fillna(0)\n", "\n", " loans_data.remove_column(feature)\n", " loans_data.add_columns(loans_data_unpacked)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's see what the feature columns look like now:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "['grade.A',\n", " 'grade.B',\n", " 'grade.C',\n", " 'grade.D',\n", " 'grade.E',\n", " 'grade.F',\n", " 'grade.G',\n", " 'term. 36 months',\n", " 'term. 
60 months',\n", " 'home_ownership.MORTGAGE',\n", " 'home_ownership.OTHER',\n", " 'home_ownership.OWN',\n", " 'home_ownership.RENT',\n", " 'emp_length.1 year',\n", " 'emp_length.10+ years',\n", " 'emp_length.2 years',\n", " 'emp_length.3 years',\n", " 'emp_length.4 years',\n", " 'emp_length.5 years',\n", " 'emp_length.6 years',\n", " 'emp_length.7 years',\n", " 'emp_length.8 years',\n", " 'emp_length.9 years',\n", " 'emp_length.< 1 year',\n", " 'emp_length.n/a']" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "features = loans_data.column_names()\n", "features.remove('safe_loans') # Remove the response variable\n", "features" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Train-test split\n", "\n", "We split the data into training and test sets with 80% of the data in the training set and 20% of the data in the test set. We use `seed=1` so that everyone gets the same result." ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": true }, "outputs": [], "source": [ "train_data, test_data = loans_data.random_split(0.8, seed=1)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Weighted decision trees" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's modify our decision tree code from Module 5 to support weighting of individual data points." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Weighted error definition\n", "\n", "Consider a model with $n$ data points with:\n", "* Predictions $\\hat{y}_1 ... \\hat{y}_n$ \n", "* Target $y_1 ... y_n$ \n", "* Data point weights $\\alpha_1 ... \\alpha_n$.\n", "\n", "Then the **weighted error** is defined by:\n", "$$\n", "\\mathrm{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}}) = \\frac{\\sum_{i=1}^{n} \\alpha_i \\times 1[y_i \\neq \\hat{y_i}]}{\\sum_{i=1}^{n} \\alpha_i}\n", "$$\n", "where $1[y_i \\neq \\hat{y_i}]$ is an indicator function that is set to $1$ if $y_i \\neq \\hat{y_i}$.\n", "\n", "\n", "### Write a function to compute weight of mistakes\n", "\n", "Write a function that calculates the weight of mistakes for making the \"weighted-majority\" predictions for a dataset. The function accepts two inputs:\n", "* `labels_in_node`: Targets $y_1 ... y_n$ \n", "* `data_weights`: Data point weights $\\alpha_1 ... \\alpha_n$\n", "\n", "We are interested in computing the (total) weight of mistakes, i.e.\n", "$$\n", "\\mathrm{WM}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}}) = \\sum_{i=1}^{n} \\alpha_i \\times 1[y_i \\neq \\hat{y_i}].\n", "$$\n", "This quantity is analogous to the number of mistakes, except that each mistake now carries a different weight. It is related to the weighted error in the following way:\n", "$$\n", "\\mathrm{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}}) = \\frac{\\mathrm{WM}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})}{\\sum_{i=1}^{n} \\alpha_i}\n", "$$\n", "\n", "The function **intermediate_node_weighted_mistakes** should first compute two weights: \n", " * $\\mathrm{WM}_{-1}$: weight of mistakes when all predictions are $\\hat{y}_i = -1$, i.e. $\\mathrm{WM}(\\mathbf{\\alpha}, \\mathbf{-1})$\n", " * $\\mathrm{WM}_{+1}$: weight of mistakes when all predictions are $\\hat{y}_i = +1$, i.e. $\\mathrm{WM}(\\mathbf{\\alpha}, \\mathbf{+1})$\n", " \n", " where $\\mathbf{-1}$ and $\\mathbf{+1}$ are vectors where all values are -1 and +1 respectively.\n", " \n", "After computing $\\mathrm{WM}_{-1}$ and $\\mathrm{WM}_{+1}$, the function **intermediate_node_weighted_mistakes** should return the lower of the two weights of mistakes, along with the class associated with that weight. We have provided a skeleton for you with `YOUR CODE HERE` to be filled in several places." ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def intermediate_node_weighted_mistakes(labels_in_node, data_weights):\n", " # Sum the weights of all entries with label +1\n", " total_weight_positive = sum(data_weights[labels_in_node == +1])\n", " \n", " # Weight of mistakes for predicting all -1's is equal to the sum above\n", " ### YOUR CODE HERE\n", " weighted_mistakes_all_negative = total_weight_positive\n", " \n", " # Sum the weights of all entries with label -1\n", " ### YOUR CODE HERE\n", " total_weight_negative = sum(data_weights[labels_in_node == -1])\n", " \n", " # Weight of mistakes for predicting all +1's is equal to the sum above\n", " ### YOUR CODE HERE\n", " weighted_mistakes_all_positive = total_weight_negative\n", " \n", " # Return the tuple (weight, class_label) representing the lower of the two weights\n", " # class_label should be an integer of value +1 or -1.\n", " # If the two weights are identical, return (weighted_mistakes_all_positive, +1)\n", " ### YOUR CODE HERE\n", " if weighted_mistakes_all_positive <= weighted_mistakes_all_negative:\n", " return (weighted_mistakes_all_positive, +1)\n", " else:\n", " return (weighted_mistakes_all_negative, -1)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Checkpoint:** To test your **intermediate_node_weighted_mistakes** function, run the following cell:" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Test passed!\n" ] } ], "source": [ "example_labels = sframe.SArray([-1, -1, 1, 1, 1])\n", "example_data_weights = sframe.SArray([1., 2., .5, 1., 1.])\n", "if intermediate_node_weighted_mistakes(example_labels, example_data_weights) == (2.5, -1):\n", " print 'Test passed!'\n", "else:\n", " print 'Test failed... 
try again!'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Recall that the **classification error** is defined as follows:\n", "$$\n", "\\mbox{classification error} = \\frac{\\mbox{# mistakes}}{\\mbox{# all data points}}\n", "$$\n", "\n", "**Quiz Question:** If we set the weights $\\mathbf{\\alpha} = 1$ for all data points, how is the weight of mistakes $\\mbox{WM}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})$ related to the `classification error`?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Function to pick best feature to split on" ] }, { "cell_type": "markdown", "metadata": { "collapsed": false }, "source": [ "We continue modifying our decision tree code from the earlier assignment to incorporate weighting of individual data points. The next step is to pick the best feature to split on.\n", "\n", "The **best_splitting_feature** function is similar to the one from the earlier assignment with two minor modifications:\n", " 1. The function **best_splitting_feature** should now accept an extra parameter `data_weights` to take account of weights of data points.\n", " 2. Instead of computing the number of mistakes in the left and right side of the split, we compute the weight of mistakes for both sides, add up the two weights, and divide it by the total weight of the data.\n", " \n", "Complete the following function. Comments starting with `DIFFERENT HERE` mark the sections where the weighted version differs from the original implementation." ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "'safe_loans'" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "target" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# If the data is identical in each feature, this function should return None\n", "\n", "def best_splitting_feature(data, features, target, data_weights):\n", " \n", " # These variables will keep track of the best feature and the corresponding error\n", " best_feature = None\n", " best_error = float('+inf') \n", " num_points = float(len(data))\n", "\n", " # Loop through each feature to consider splitting on that feature\n", " for feature in features:\n", " \n", " # The left split will have all data points where the feature value is 0\n", " # The right split will have all data points where the feature value is 1\n", " left_split = data[data[feature] == 0]\n", " right_split = data[data[feature] == 1]\n", " \n", " # Apply the same filtering to data_weights to create left_data_weights, right_data_weights\n", " ## YOUR CODE HERE\n", " left_data_weights = data_weights[data[feature] == 0]\n", " right_data_weights = data_weights[data[feature] == 1]\n", " \n", " # DIFFERENT HERE\n", " # Calculate the weight of mistakes for left and right sides\n", " ## YOUR CODE HERE\n", " left_weighted_mistakes, left_class = intermediate_node_weighted_mistakes(left_split[target], left_data_weights)\n", " right_weighted_mistakes, right_class = intermediate_node_weighted_mistakes(right_split[target], right_data_weights)\n", " \n", " # DIFFERENT HERE\n", " # Compute weighted error by computing\n", " # ( [weight of mistakes (left)] + [weight of mistakes (right)] ) / [total weight of all data points]\n", " ## YOUR CODE HERE\n", " error = (left_weighted_mistakes + right_weighted_mistakes) / data_weights.sum()\n", " #print '####', error, feature\n", " # If this is the best error we have found so far, 
store the feature and the error\n", " if error < best_error:\n", " best_feature = feature\n", " best_error = error\n", " #print 'best errror:', best_error\n", " \n", " # Return the best feature we found\n", " return best_feature" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Checkpoint:** Now, we have another checkpoint to make sure you are on the right track." ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Test passed!\n" ] } ], "source": [ "example_data_weights = sframe.SArray(len(train_data)* [1.5])\n", "if best_splitting_feature(train_data, features, target, example_data_weights) == 'term. 36 months':\n", " print 'Test passed!'\n", "else:\n", " print 'Test failed... try again!'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Note**. If you get an exception in the line of \"the logical filter has different size than the array\", try upgradting your GraphLab Create installation to 1.8.3 or newer." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Very Optional**. Relationship between weighted error and weight of mistakes\n", "\n", "By definition, the weighted error is the weight of mistakes divided by the weight of all data points, so\n", "$$\n", "\\mathrm{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}}) = \\frac{\\sum_{i=1}^{n} \\alpha_i \\times 1[y_i \\neq \\hat{y_i}]}{\\sum_{i=1}^{n} \\alpha_i} = \\frac{\\mathrm{WM}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})}{\\sum_{i=1}^{n} \\alpha_i}.\n", "$$\n", "\n", "In the code above, we obtain $\\mathrm{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})$ from the two weights of mistakes from both sides, $\\mathrm{WM}(\\mathbf{\\alpha}_{\\mathrm{left}}, \\mathbf{\\hat{y}}_{\\mathrm{left}})$ and $\\mathrm{WM}(\\mathbf{\\alpha}_{\\mathrm{right}}, \\mathbf{\\hat{y}}_{\\mathrm{right}})$. First, notice that the overall weight of mistakes $\\mathrm{WM}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})$ can be broken into two weights of mistakes over either side of the split:\n", "$$\n", "\\mathrm{WM}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})\n", "= \\sum_{i=1}^{n} \\alpha_i \\times 1[y_i \\neq \\hat{y_i}]\n", "= \\sum_{\\mathrm{left}} \\alpha_i \\times 1[y_i \\neq \\hat{y_i}]\n", " + \\sum_{\\mathrm{right}} \\alpha_i \\times 1[y_i \\neq \\hat{y_i}]\\\\\n", "= \\mathrm{WM}(\\mathbf{\\alpha}_{\\mathrm{left}}, \\mathbf{\\hat{y}}_{\\mathrm{left}}) + \\mathrm{WM}(\\mathbf{\\alpha}_{\\mathrm{right}}, \\mathbf{\\hat{y}}_{\\mathrm{right}})\n", "$$\n", "We then divide through by the total weight of all data points to obtain $\\mathrm{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})$:\n", "$$\n", "\\mathrm{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})\n", "= \\frac{\\mathrm{WM}(\\mathbf{\\alpha}_{\\mathrm{left}}, \\mathbf{\\hat{y}}_{\\mathrm{left}}) + \\mathrm{WM}(\\mathbf{\\alpha}_{\\mathrm{right}}, \\mathbf{\\hat{y}}_{\\mathrm{right}})}{\\sum_{i=1}^{n} \\alpha_i}\n", "$$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Building the tree\n", "\n", "With the above functions implemented correctly, we are now ready to build our decision tree. 
Recall from the previous assignments that each node in the decision tree is represented as a dictionary which contains the following keys:\n", "\n", " { \n", " 'is_leaf' : True/False.\n", " 'prediction' : Prediction at the leaf node.\n", " 'splitting_feature' : The feature that this node splits on (None at a leaf).\n", " 'left' : (dictionary corresponding to the left tree).\n", " 'right' : (dictionary corresponding to the right tree).\n", " }\n", " \n", "Let us start with a function that creates a leaf node given a set of target values:" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def create_leaf(target_values, data_weights):\n", " \n", " # Create a leaf node\n", " leaf = {'splitting_feature' : None,\n", " 'is_leaf': True}\n", " \n", " # Compute the weight of mistakes and the weighted-majority class.\n", " weight_of_mistakes, best_class = intermediate_node_weighted_mistakes(target_values, data_weights)\n", " # Store the predicted class (1 or -1) in leaf['prediction']\n", " leaf['prediction'] = best_class ## YOUR CODE HERE\n", " \n", " return leaf " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We provide a function that learns a weighted decision tree recursively and implements 3 stopping conditions:\n", "1. All data points in a node are from the same class.\n", "2. No more features to split on.\n", "3. Stop growing the tree when the tree depth reaches **max_depth**. " ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def weighted_decision_tree_create(data, features, target, data_weights, current_depth = 1, max_depth = 10):\n", " remaining_features = features[:] # Make a copy of the features.\n", " target_values = data[target]\n", " print \"--------------------------------------------------------------------\"\n", " print \"Subtree, depth = %s (%s data points).\" % (current_depth, len(target_values))\n", " \n", " # Stopping condition 1. Error is 0.\n", " if intermediate_node_weighted_mistakes(target_values, data_weights)[0] <= 1e-15:\n", " print \"Stopping condition 1 reached.\" \n", " return create_leaf(target_values, data_weights)\n", " \n", " # Stopping condition 2. No more features.\n", " if remaining_features == []:\n", " print \"Stopping condition 2 reached.\" \n", " return create_leaf(target_values, data_weights) \n", " \n", " # Additional stopping condition (limit tree depth)\n", " if current_depth > max_depth:\n", " print \"Reached maximum depth. Stopping for now.\"\n", " return create_leaf(target_values, data_weights)\n", " \n", " # If the data is identical in each feature, splitting_feature will be None. Create a leaf.\n", " splitting_feature = best_splitting_feature(data, features, target, data_weights)\n", " if splitting_feature is None:\n", " return create_leaf(target_values, data_weights)\n", " remaining_features.remove(splitting_feature)\n", "\n", " left_split = data[data[splitting_feature] == 0]\n", " right_split = data[data[splitting_feature] == 1]\n", "\n", " left_data_weights = data_weights[data[splitting_feature] == 0]\n", " right_data_weights = data_weights[data[splitting_feature] == 1]\n", "\n", " print \"Split on feature %s. 
(%s, %s)\" % (\\\n", " splitting_feature, len(left_split), len(right_split))\n", "\n", " # Create a leaf node if the split is \"perfect\"\n", " if len(left_split) == len(data):\n", " print \"Creating leaf node.\"\n", " return create_leaf(left_split[target], data_weights)\n", " if len(right_split) == len(data):\n", " print \"Creating leaf node.\"\n", " return create_leaf(right_split[target], data_weights)\n", "\n", " # Repeat (recurse) on left and right subtrees\n", " left_tree = weighted_decision_tree_create(\n", " left_split, remaining_features, target, left_data_weights, current_depth + 1, max_depth)\n", " right_tree = weighted_decision_tree_create(\n", " right_split, remaining_features, target, right_data_weights, current_depth + 1, max_depth)\n", "\n", " return {'is_leaf' : False, \n", " 'prediction' : None,\n", " 'splitting_feature': splitting_feature,\n", " 'left' : left_tree, \n", " 'right' : right_tree}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here is a recursive function to count the nodes in your tree:" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def count_nodes(tree):\n", " if tree['is_leaf']:\n", " return 1\n", " return 1 + count_nodes(tree['left']) + count_nodes(tree['right'])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Run the following test code to check your implementation. Make sure you get **'Test passed'** before proceeding." ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature term. 36 months. (9223, 28001)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (9223 data points).\n", "Split on feature grade.A. (9122, 101)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (9122 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (101 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (28001 data points).\n", "Split on feature grade.D. (23300, 4701)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (23300 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (4701 data points).\n", "Reached maximum depth. Stopping for now.\n", "Test passed!\n" ] } ], "source": [ "example_data_weights = sframe.SArray([1.0 for i in range(len(train_data))])\n", "small_data_decision_tree = weighted_decision_tree_create(train_data, features, target,\n", " example_data_weights, max_depth=2)\n", "if count_nodes(small_data_decision_tree) == 7:\n", " print 'Test passed!'\n", "else:\n", " print 'Test failed... try again!'\n", " print 'Number of nodes found:', count_nodes(small_data_decision_tree)\n", " print 'Number of nodes that should be there: 7' " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let us take a quick look at what the trained tree is like. 
You should get something that looks like the following:\n", "\n", "```\n", "{'is_leaf': False,\n", " 'left': {'is_leaf': False,\n", " 'left': {'is_leaf': True, 'prediction': -1, 'splitting_feature': None},\n", " 'prediction': None,\n", " 'right': {'is_leaf': True, 'prediction': 1, 'splitting_feature': None},\n", " 'splitting_feature': 'grade.A'\n", " },\n", " 'prediction': None,\n", " 'right': {'is_leaf': False,\n", " 'left': {'is_leaf': True, 'prediction': 1, 'splitting_feature': None},\n", " 'prediction': None,\n", " 'right': {'is_leaf': True, 'prediction': -1, 'splitting_feature': None},\n", " 'splitting_feature': 'grade.D'\n", " },\n", " 'splitting_feature': 'term. 36 months'\n", "}```" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "{'is_leaf': False,\n", " 'left': {'is_leaf': False,\n", " 'left': {'is_leaf': True, 'prediction': -1, 'splitting_feature': None},\n", " 'prediction': None,\n", " 'right': {'is_leaf': True, 'prediction': 1, 'splitting_feature': None},\n", " 'splitting_feature': 'grade.A'},\n", " 'prediction': None,\n", " 'right': {'is_leaf': False,\n", " 'left': {'is_leaf': True, 'prediction': 1, 'splitting_feature': None},\n", " 'prediction': None,\n", " 'right': {'is_leaf': True, 'prediction': -1, 'splitting_feature': None},\n", " 'splitting_feature': 'grade.D'},\n", " 'splitting_feature': 'term. 36 months'}" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "small_data_decision_tree" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Making predictions with a weighted decision tree" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We give you a function that classifies one data point by recursively following the splits down to a leaf. Pass `annotate=True` if you want it to print the decision path as it goes." ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def classify(tree, x, annotate = False): \n", " # If the node is a leaf node.\n", " if tree['is_leaf']:\n", " if annotate: \n", " print \"At leaf, predicting %s\" % tree['prediction']\n", " return tree['prediction'] \n", " else:\n", " # Split on feature.\n", " split_feature_value = x[tree['splitting_feature']]\n", " if annotate: \n", " print \"Split on %s = %s\" % (tree['splitting_feature'], split_feature_value)\n", " if split_feature_value == 0:\n", " return classify(tree['left'], x, annotate)\n", " else:\n", " return classify(tree['right'], x, annotate)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Evaluating the tree\n", "\n", "Now, we will write a function to evaluate a decision tree by computing the classification error of the tree on the given dataset.\n", "\n", "Again, recall that the **classification error** is defined as follows:\n", "$$\n", "\\mbox{classification error} = \\frac{\\mbox{# mistakes}}{\\mbox{# all data points}}\n", "$$\n", "\n", "The function called **evaluate_classification_error** takes in as input:\n", "1. `tree` (as described above)\n", "2. `data` (an SFrame)\n", "\n", "The function body does not change when we add data point weights, since classification error treats every data point equally."
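] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As an optional sanity check before writing the evaluator, you can trace a single prediction through the depth-2 tree trained above. This sketch assumes `small_data_decision_tree` and `test_data` from the earlier cells." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Optional sanity check: trace one prediction through the depth-2 tree.\n", "# Assumes small_data_decision_tree and test_data exist from the cells above.\n", "print 'Predicted class: %s' % classify(small_data_decision_tree, test_data[0], annotate=True)"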
] }, { "cell_type": "code", "execution_count": 19, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def evaluate_classification_error(tree, data):\n", " # Apply the classify(tree, x) to each row in your data\n", " prediction = data.apply(lambda x: classify(tree, x))\n", " \n", " # Once you've made the predictions, calculate the classification error\n", " return (prediction != data[target]).sum() / float(len(data))" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "0.3981042654028436" ] }, "execution_count": 20, "metadata": {}, "output_type": "execute_result" } ], "source": [ "evaluate_classification_error(small_data_decision_tree, test_data)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Example: Training a weighted decision tree\n", "\n", "To build intuition on how weighted data points affect the tree being built, consider the following:\n", "\n", "Suppose we only care about making good predictions for the **first 10 and last 10 items** in `train_data`, we assign weights:\n", "* 1 to the last 10 items \n", "* 1 to the first 10 items \n", "* and 0 to the rest. \n", "\n", "Let us fit a weighted decision tree with `max_depth = 2`." ] }, { "cell_type": "code", "execution_count": 21, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature home_ownership.RENT. (20514, 16710)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (20514 data points).\n", "Split on feature grade.F. (19613, 901)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (19613 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (901 data points).\n", "Stopping condition 1 reached.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (16710 data points).\n", "Split on feature grade.D. (13315, 3395)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (13315 data points).\n", "Stopping condition 1 reached.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 3 (3395 data points).\n", "Stopping condition 1 reached.\n" ] } ], "source": [ "# Assign weights\n", "example_data_weights = sframe.SArray([1.] * 10 + [0.]*(len(train_data) - 20) + [1.] * 10)\n", "\n", "# Train a weighted decision tree model.\n", "small_data_decision_tree_subset_20 = weighted_decision_tree_create(train_data, features, target,\n", " example_data_weights, max_depth=2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, we will compute the classification error on the `subset_20`, i.e. the subset of data points whose weight is 1 (namely the first and last 10 data points)." 
] }, { "cell_type": "code", "execution_count": 22, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "0.05" ] }, "execution_count": 22, "metadata": {}, "output_type": "execute_result" } ], "source": [ "subset_20 = train_data.head(10).append(train_data.tail(10))\n", "evaluate_classification_error(small_data_decision_tree_subset_20, subset_20)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, let us compare the classification error of the model `small_data_decision_tree_subset_20` on the entire test set `train_data`:" ] }, { "cell_type": "code", "execution_count": 23, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "0.48124865678057166" ] }, "execution_count": 23, "metadata": {}, "output_type": "execute_result" } ], "source": [ "evaluate_classification_error(small_data_decision_tree_subset_20, train_data)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The model `small_data_decision_tree_subset_20` performs **a lot** better on `subset_20` than on `train_data`.\n", "\n", "So, what does this mean?\n", "* The points with higher weights are the ones that are more important during the training process of the weighted decision tree.\n", "* The points with zero weights are basically ignored during training.\n", "\n", "**Quiz Question**: Will you get the same model as `small_data_decision_tree_subset_20` if you trained a decision tree with only the 20 data points with non-zero weights from the set of points in `subset_20`?\n", "**Answer**: no" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Implementing your own Adaboost (on decision stumps)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we have a weighted decision tree working, it takes only a bit of work to implement Adaboost. For the sake of simplicity, let us stick with **decision tree stumps** by training trees with **`max_depth=1`**." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Recall from the lecture the procedure for Adaboost:\n", "\n", "1\\. Start with unweighted data with $\\alpha_j = 1$\n", "\n", "2\\. For t = 1,...T:\n", " * Learn $f_t(x)$ with data weights $\\alpha_j$\n", " * Compute coefficient $\\hat{w}_t$:\n", " $$\\hat{w}_t = \\frac{1}{2}\\ln{\\left(\\frac{1- \\mbox{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})}{\\mbox{E}(\\mathbf{\\alpha}, \\mathbf{\\hat{y}})}\\right)}$$\n", " * Re-compute weights $\\alpha_j$:\n", " $$\\alpha_j \\gets \\begin{cases}\n", " \\alpha_j \\exp{(-\\hat{w}_t)} & \\text{ if }f_t(x_j) = y_j\\\\\n", " \\alpha_j \\exp{(\\hat{w}_t)} & \\text{ if }f_t(x_j) \\neq y_j\n", " \\end{cases}$$\n", " * Normalize weights $\\alpha_j$:\n", " $$\\alpha_j \\gets \\frac{\\alpha_j}{\\sum_{i=1}^{N}{\\alpha_i}} $$\n", " \n", "Complete the skeleton for the following code to implement **adaboost_with_tree_stumps**. Fill in the places with `YOUR CODE HERE`." 
] }, { "cell_type": "code", "execution_count": 24, "metadata": { "collapsed": false }, "outputs": [], "source": [ "from math import log\n", "from math import exp\n", "\n", "def adaboost_with_tree_stumps(data, features, target, num_tree_stumps):\n", " # start with unweighted data\n", " alpha = sframe.SArray([1.]*len(data))\n", " weights = []\n", " tree_stumps = []\n", " target_values = data[target]\n", " \n", " for t in xrange(num_tree_stumps):\n", " print '====================================================='\n", " print 'Adaboost Iteration %d' % t\n", " print '=====================================================' \n", " # Learn a weighted decision tree stump. Use max_depth=1\n", " tree_stump = weighted_decision_tree_create(data, features, target, data_weights=alpha, max_depth=1)\n", " tree_stumps.append(tree_stump)\n", " \n", " # Make predictions\n", " predictions = data.apply(lambda x: classify(tree_stump, x))\n", " \n", " # Produce a Boolean array indicating whether\n", " # each data point was correctly classified\n", " is_correct = predictions == target_values\n", " is_wrong = predictions != target_values\n", " \n", " # Compute weighted error\n", " # YOUR CODE HERE\n", " #weighted_mistakes, best_class = intermediate_node_weighted_mistakes(target_values, alpha)\n", " weighted_error = alpha[is_wrong].sum() / alpha.sum()\n", " print 'weighted_error:', weighted_error\n", " # Compute model coefficient using weighted error\n", " # YOUR CODE HERE\n", " weight = 0.5*np.log((1-weighted_error)/weighted_error)\n", " weights.append(weight)\n", " \n", " # Adjust weights on data point\n", " adjustment = is_correct.apply(lambda is_correct : np.exp(-weight) if is_correct else np.exp(weight))\n", " \n", " # Scale alpha by multiplying by adjustment \n", " # Then normalize data points weights\n", " ## YOUR CODE HERE \n", " alpha = alpha*adjustment\n", " \n", " return weights, tree_stumps" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Checking your Adaboost code\n", "\n", "Train an ensemble of **two** tree stumps and see which features those stumps split on. We will run the algorithm with the following parameters:\n", "* `train_data`\n", "* `features`\n", "* `target`\n", "* `num_tree_stumps = 2`" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "=====================================================\n", "Adaboost Iteration 0\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature term. 36 months. (9223, 28001)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (9223 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (28001 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.421636578551\n", "=====================================================\n", "Adaboost Iteration 1\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.412498248915\n" ] } ], "source": [ "stump_weights, tree_stumps = adaboost_with_tree_stumps(train_data, features, target, num_tree_stumps=2)" ] }, { "cell_type": "code", "execution_count": 26, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def print_stump(tree):\n", " split_name = tree['splitting_feature'] # split_name is something like 'term. 36 months'\n", " if split_name is None:\n", " print \"(leaf, label: %s)\" % tree['prediction']\n", " return None\n", " split_feature, split_value = split_name.split('.')\n", " print ' root'\n", " print ' |---------------|----------------|'\n", " print ' | |'\n", " print ' | |'\n", " print ' | |'\n", " print ' [{0} == 0]{1}[{0} == 1] '.format(split_name, ' '*(27-len(split_name)))\n", " print ' | |'\n", " print ' | |'\n", " print ' | |'\n", " print ' (%s) (%s)' \\\n", " % (('leaf, label: ' + str(tree['left']['prediction']) if tree['left']['is_leaf'] else 'subtree'),\n", " ('leaf, label: ' + str(tree['right']['prediction']) if tree['right']['is_leaf'] else 'subtree'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here is what the first stump looks like:" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " root\n", " |---------------|----------------|\n", " | |\n", " | |\n", " | |\n", " [term. 36 months == 0] [term. 36 months == 1] \n", " | |\n", " | |\n", " | |\n", " (leaf, label: -1) (leaf, label: 1)\n" ] } ], "source": [ "print_stump(tree_stumps[0])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here is what the next stump looks like:" ] }, { "cell_type": "code", "execution_count": 28, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " root\n", " |---------------|----------------|\n", " | |\n", " | |\n", " | |\n", " [grade.A == 0] [grade.A == 1] \n", " | |\n", " | |\n", " | |\n", " (leaf, label: -1) (leaf, label: 1)\n" ] } ], "source": [ "print_stump(tree_stumps[1])" ] }, { "cell_type": "code", "execution_count": 29, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[0.15802933659263743, 0.17682363293636497]\n" ] } ], "source": [ "print stump_weights" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If your Adaboost is correctly implemented, the following things should be true:\n", "\n", "* `tree_stumps[0]` should split on **term. 36 months** with the prediction -1 on the left and +1 on the right.\n", "* `tree_stumps[1]` should split on **grade.A** with the prediction -1 on the left and +1 on the right.\n", "* Weights should be approximately `[0.158, 0.177]` \n", "\n", "**Reminders**\n", "- Stump weights ($\\mathbf{\\hat{w}}$) and data point weights ($\\mathbf{\\alpha}$) are two different concepts.\n", "- Stump weights ($\\mathbf{\\hat{w}}$) tell you how important each stump is while making predictions with the entire boosted ensemble.\n", "- Data point weights ($\\mathbf{\\alpha}$) tell you how important each data point is while training a decision stump." 
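] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Before scaling up, here is an optional sketch of how the two checked stumps combine for a single data point -- a preview of the `predict_adaboost` function you will complete later. It assumes `stump_weights`, `tree_stumps`, and `test_data` from the cells above." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Optional preview: combine the two stumps by hand for one data point.\n", "# score = w_0 * f_0(x) + w_1 * f_1(x); the ensemble predicts sign(score).\n", "x = test_data[0]\n", "score = (stump_weights[0] * classify(tree_stumps[0], x)\n", "         + stump_weights[1] * classify(tree_stumps[1], x))\n", "print 'ensemble score = %.3f --> prediction = %d' % (score, +1 if score > 0 else -1)"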
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Training a boosted ensemble of 10 stumps" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let us train an ensemble of 10 decision tree stumps with Adaboost. We run the **adaboost_with_tree_stumps** function with the following parameters:\n", "* `train_data`\n", "* `features`\n", "* `target`\n", "* `num_tree_stumps = 10`" ] }, { "cell_type": "code", "execution_count": 30, "metadata": { "collapsed": false, "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "=====================================================\n", "Adaboost Iteration 0\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature term. 36 months. (9223, 28001)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (9223 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (28001 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.421636578551\n", "=====================================================\n", "Adaboost Iteration 1\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.412498248915\n", "=====================================================\n", "Adaboost Iteration 2\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.D. (30465, 6759)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (30465 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (6759 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.453574664308\n", "=====================================================\n", "Adaboost Iteration 3\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature home_ownership.MORTGAGE. (19846, 17378)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (19846 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (17378 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "weighted_error: 0.463619975981\n", "=====================================================\n", "Adaboost Iteration 4\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.B. (26858, 10366)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (26858 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (10366 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.466518644019\n", "=====================================================\n", "Adaboost Iteration 5\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.E. (33815, 3409)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (33815 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (3409 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.467760207218\n", "=====================================================\n", "Adaboost Iteration 6\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.472746758716\n", "=====================================================\n", "Adaboost Iteration 7\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.F. (35512, 1712)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35512 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1712 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.478258250407\n", "=====================================================\n", "Adaboost Iteration 8\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "weighted_error: 0.485509702974\n", "=====================================================\n", "Adaboost Iteration 9\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.n/a. (35781, 1443)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35781 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1443 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.487021661048\n" ] } ], "source": [ "stump_weights, tree_stumps = adaboost_with_tree_stumps(train_data, features, \n", " target, num_tree_stumps=10)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Making predictions\n", "\n", "Recall from the lecture that in order to make predictions, we use the following formula:\n", "$$\n", "\\hat{y} = \\mathrm{sign}\\left(\\sum_{t=1}^T \\hat{w}_t f_t(x)\\right)\n", "$$\n", "\n", "We need to do the following things:\n", "- Compute the predictions $f_t(x)$ using the $t$-th decision tree\n", "- Compute $\\hat{w}_t f_t(x)$ by multiplying the `stump_weights` with the predictions $f_t(x)$ from the decision trees\n", "- Sum the weighted predictions over each stump in the ensemble.\n", "\n", "Complete the following skeleton for making predictions:" ] }, { "cell_type": "code", "execution_count": 31, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def predict_adaboost(stump_weights, tree_stumps, data):\n", " scores = sframe.SArray([0.]*len(data))\n", " \n", " for i, tree_stump in enumerate(tree_stumps):\n", " predictions = data.apply(lambda x: classify(tree_stump, x))\n", " \n", " # Accumulate predictions on scores array\n", " # YOUR CODE HERE\n", " scores += float(stump_weights[i])*predictions\n", " \n", " return scores.apply(lambda score : +1 if score > 0 else -1)" ] }, { "cell_type": "code", "execution_count": 32, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def evaluate_accuracy(target_labels, predicted_labels):\n", " accuracy = (target_labels == predicted_labels).sum() / float(len(target_labels))\n", " return accuracy" ] }, { "cell_type": "code", "execution_count": 33, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Accuracy of 10-component ensemble = 0.620314519604\n" ] } ], "source": [ "predictions = predict_adaboost(stump_weights, tree_stumps, test_data)\n", "accuracy = evaluate_accuracy(test_data[target], predictions)\n", "print 'Accuracy of 10-component ensemble = %s' % accuracy " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, let us take a quick look at what the `stump_weights` look like at the end of each iteration of the 10-stump ensemble:" ] }, { "cell_type": "code", "execution_count": 34, "metadata": { "collapsed": false }, "outputs": [ { "data": { "text/plain": [ "[0.15802933659263743,\n", " 0.17682363293636497,\n", " 0.093118889711890876,\n", " 0.072888855258550919,\n", " 0.067063069141595502,\n", " 0.06456916961616728,\n", " 0.054560557791856513,\n", " 0.04351093673372116,\n", " 0.028988711500354303,\n", " 0.025962509691406271]" ] }, "execution_count": 34, "metadata": {}, "output_type": "execute_result" } ], "source": [ "stump_weights" ] }, { 
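"cell_type": "markdown", "metadata": {}, "source": [ "To help answer the quiz question below, you can plot the stump weights against the iteration number (matplotlib was already imported at the top of the notebook):" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Plot stump weight versus iteration to see the trend.\n", "plt.plot(range(len(stump_weights)), stump_weights, marker='o')\n", "plt.xlabel('Iteration')\n", "plt.ylabel('Stump weight')\n", "plt.title('Stump weights of the 10-stump ensemble')\n", "plt.show()" ] }, { 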
"cell_type": "markdown", "metadata": {}, "source": [ "**Quiz Question:** Are the weights monotonically decreasing, monotonically increasing, or neither?\n", "\n", "**Reminder**: Stump weights ($\\mathbf{\\hat{w}}$) tell you how important each stump is while making predictions with the entire boosted ensemble." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Performance plots\n", "\n", "In this section, we will try to reproduce some of the performance plots dicussed in the lecture.\n", "\n", "### How does accuracy change with adding stumps to the ensemble?\n", "\n", "We will now train an ensemble with:\n", "* `train_data`\n", "* `features`\n", "* `target`\n", "* `num_tree_stumps = 30`\n", "\n", "Once we are done with this, we will then do the following:\n", "* Compute the classification error at the end of each iteration.\n", "* Plot a curve of classification error vs iteration.\n", "\n", "First, lets train the model." ] }, { "cell_type": "code", "execution_count": 35, "metadata": { "collapsed": false, "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "=====================================================\n", "Adaboost Iteration 0\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature term. 36 months. (9223, 28001)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (9223 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (28001 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.421636578551\n", "=====================================================\n", "Adaboost Iteration 1\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.412498248915\n", "=====================================================\n", "Adaboost Iteration 2\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.D. (30465, 6759)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (30465 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (6759 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.453574664308\n", "=====================================================\n", "Adaboost Iteration 3\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature home_ownership.MORTGAGE. 
(19846, 17378)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (19846 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (17378 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.463619975981\n", "=====================================================\n", "Adaboost Iteration 4\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.B. (26858, 10366)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (26858 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (10366 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.466518644019\n", "=====================================================\n", "Adaboost Iteration 5\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.E. (33815, 3409)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (33815 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (3409 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.467760207218\n", "=====================================================\n", "Adaboost Iteration 6\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.472746758716\n", "=====================================================\n", "Adaboost Iteration 7\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.F. (35512, 1712)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35512 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1712 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.478258250407\n", "=====================================================\n", "Adaboost Iteration 8\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.485509702974\n", "=====================================================\n", "Adaboost Iteration 9\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.n/a. (35781, 1443)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35781 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1443 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.487021661048\n", "=====================================================\n", "Adaboost Iteration 10\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.D. (30465, 6759)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (30465 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (6759 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.484617644977\n", "=====================================================\n", "Adaboost Iteration 11\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.B. (26858, 10366)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (26858 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (10366 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.48778378868\n", "=====================================================\n", "Adaboost Iteration 12\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.n/a. (35781, 1443)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35781 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1443 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.486229749089\n", "=====================================================\n", "Adaboost Iteration 13\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.4 years. (34593, 2631)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (34593 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (2631 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.488914307103\n", "=====================================================\n", "Adaboost Iteration 14\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.n/a. (35781, 1443)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35781 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1443 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.491328105076\n", "=====================================================\n", "Adaboost Iteration 15\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.C. (27812, 9412)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (27812 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (9412 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.490334137226\n", "=====================================================\n", "Adaboost Iteration 16\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.492674444173\n", "=====================================================\n", "Adaboost Iteration 17\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.F. (35512, 1712)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35512 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1712 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.490189134704\n", "=====================================================\n", "Adaboost Iteration 18\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature term. 36 months. (9223, 28001)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (9223 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (28001 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "weighted_error: 0.489549406889\n", "=====================================================\n", "Adaboost Iteration 19\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.B. (26858, 10366)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (26858 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (10366 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.492808481162\n", "=====================================================\n", "Adaboost Iteration 20\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.n/a. (35781, 1443)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35781 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1443 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.491256963368\n", "=====================================================\n", "Adaboost Iteration 21\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.D. (30465, 6759)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (30465 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (6759 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.491340019779\n", "=====================================================\n", "Adaboost Iteration 22\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.F. (35512, 1712)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35512 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1712 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.494329547875\n", "=====================================================\n", "Adaboost Iteration 23\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.A. (32094, 5130)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (32094 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (5130 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "weighted_error: 0.491042934372\n", "=====================================================\n", "Adaboost Iteration 24\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.n/a. (35781, 1443)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35781 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1443 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.493457365522\n", "=====================================================\n", "Adaboost Iteration 25\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.2 years. (33652, 3572)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (33652 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (3572 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.492718310217\n", "=====================================================\n", "Adaboost Iteration 26\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.F. (35512, 1712)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35512 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1712 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.494148977125\n", "=====================================================\n", "Adaboost Iteration 27\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature home_ownership.OWN. (34149, 3075)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (34149 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (3075 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.493184682982\n", "=====================================================\n", "Adaboost Iteration 28\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature emp_length.n/a. (35781, 1443)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (35781 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (1443 data points).\n", "Reached maximum depth. 
Stopping for now.\n", "weighted_error: 0.494252681313\n", "=====================================================\n", "Adaboost Iteration 29\n", "=====================================================\n", "--------------------------------------------------------------------\n", "Subtree, depth = 1 (37224 data points).\n", "Split on feature grade.C. (27812, 9412)\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (27812 data points).\n", "Reached maximum depth. Stopping for now.\n", "--------------------------------------------------------------------\n", "Subtree, depth = 2 (9412 data points).\n", "Reached maximum depth. Stopping for now.\n", "weighted_error: 0.494348454041\n" ] } ], "source": [ "# this may take a while... \n", "stump_weights, tree_stumps = adaboost_with_tree_stumps(train_data, \n", " features, target, num_tree_stumps=30)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Computing training error at the end of each iteration\n", "\n", "Now, we will compute the classification error on the **train_data** and see how it is reduced as trees are added." ] }, { "cell_type": "code", "execution_count": 36, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Iteration 1, training error = 0.421636578551\n", "Iteration 2, training error = 0.433430045132\n", "Iteration 3, training error = 0.400037610144\n", "Iteration 4, training error = 0.400037610144\n", "Iteration 5, training error = 0.384724908661\n", "Iteration 6, training error = 0.384617451107\n", "Iteration 7, training error = 0.382763808296\n", "Iteration 8, training error = 0.384617451107\n", "Iteration 9, training error = 0.382763808296\n", "Iteration 10, training error = 0.384483129164\n", "Iteration 11, training error = 0.382736943907\n", "Iteration 12, training error = 0.381447453256\n", "Iteration 13, training error = 0.381528046422\n", "Iteration 14, training error = 0.380560928433\n", "Iteration 15, training error = 0.380507199656\n", "Iteration 16, training error = 0.378223726628\n", "Iteration 17, training error = 0.378277455405\n", "Iteration 18, training error = 0.378411777348\n", "Iteration 19, training error = 0.378062540297\n", "Iteration 20, training error = 0.378761014399\n", "Iteration 21, training error = 0.379566946056\n", "Iteration 22, training error = 0.378895336342\n", "Iteration 23, training error = 0.378895336342\n", "Iteration 24, training error = 0.378761014399\n", "Iteration 25, training error = 0.378895336342\n", "Iteration 26, training error = 0.378975929508\n", "Iteration 27, training error = 0.379110251451\n", "Iteration 28, training error = 0.378922200731\n", "Iteration 29, training error = 0.379029658285\n", "Iteration 30, training error = 0.378734150011\n" ] } ], "source": [ "error_all = []\n", "for n in xrange(1, 31):\n", " predictions = predict_adaboost(stump_weights[:n], tree_stumps[:n], train_data)\n", " #error = 1.0 - graphlab.evaluation.accuracy(train_data[target], predictions)\n", " error = 1.0 - evaluate_accuracy(train_data[target], predictions)\n", " error_all.append(error)\n", " print \"Iteration %s, training error = %s\" % (n, error_all[n-1])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Visualizing training error vs number of iterations\n", "\n", "We have provided you with a simple code snippet that plots classification error with the number of iterations. 
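\n", "\n", "For reference, the quantity computed in the loop above (and plotted below) is\n", "\n", "$$\\mbox{classification error} = \\frac{\\mbox{# mistakes}}{\\mbox{# data points}} = 1 - \\mbox{accuracy}.$$" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Note:** the error computations in this section rely on the helper `evaluate_accuracy`, which is assumed to be defined earlier in this notebook and is called as `evaluate_accuracy(true_labels, predictions)`. In case yours is named differently, here is a minimal sketch consistent with that call; the name `evaluate_accuracy_sketch` and the assumption that both inputs are SArrays of +1/-1 labels are ours, not part of the assignment:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def evaluate_accuracy_sketch(true_labels, predictions):\n", "    # Elementwise comparison yields an SArray of 0/1 indicators;\n", "    # summing them counts the correctly classified data points.\n", "    num_correct = (true_labels == predictions).sum()\n", "    return float(num_correct) / len(true_labels)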
" ] }, { "cell_type": "code", "execution_count": 37, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAdoAAAFgCAYAAAActbi8AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzs3XmcFMXdx/HPb5dbAUEQ8ABEQUWN8RaNZPGOtz6aaKIm\najyieWI84xEVjyQqPl5Ro/FGEzVe0Wi8ZcWQIChemKiogBoOkfs+dn/PH9XDzs727Pbszuzszn7f\nr9e8drq6urqmZ3Z+09VV1ebuiIiISGGUFbsCIiIipUyBVkREpIAUaEVERApIgVZERKSAFGhFREQK\nSIFWRESkgBRo2xgz+7GZvWdmS8ys2sx+Uew6Sf3M7CAzG29mC6P37IZm3v80M/u82GVI6xF9Tl/L\nIf/90Tb9C1mvYlGgbUZmNiD6MKU/VpjZVDO7x8wGFXj/ewD3AR2AW4CRwPhC7lOaJvpMPAFsDNxJ\neM9eyGH7g9M+a7s2shr5GGzf6gbsm9l3o+N2WbHr0gY4rfAzklS7YlegjfoE+HP0vBtQAZwIHG5m\nu7r7pwXa7/cIH+YT3H1igfYh+bUX0B44290fa8T2J1HzBXYS8Ga+KiYiySjQFscn7n5leoKZ3Qec\nAFxCCLqFsGH0d3aBypf8a/R7Zma9gYOAV4F+wA/M7Cx3X5HH+pUyK3YFpDSo6bjluJ3wj71TeqKZ\n9TGzW8zss6iZeZaZPWhmAzMLSF0XMbONzeyhKG+VmZ1lZtXAT6J9TIvyVmVsf4qZTYyu3y4ys7Fm\ndnjMfkZG2w83s5PMbJKZLTOzJ2PWn2xmk6P1H5nZcVGejmZ2nZl9YWbLzexNM9stZl8jzOxeM/s4\nrV7jzOz7MXlTTfP3mtlmZvaUmc2LtnvZzL4Vd+CjvPeY2fToGM8wsxfM7JCMfBYdo/Fmtjh6jDOz\nI+LKzcbMepvZrdF1y5XR/u5Pf09Tr4XQVGxAZeo9y+E61gmEH9MPRo+uwNH11GtE9HqWmtlsM7vL\nzHpkyTvYzEaZ2TvRMV4evc+XmFnWH/Bm1iM61rOjz8S/zOyALHkHmtkD0fFZGR2vW8ysV5b8R5jZ\nG9FnZEn0WT45Jp+Z2WnR+nnR651uZo+b2fZRnsuB1witAanPc+Ljb2YdzOx8M3s3Kn9B9BkcHpO3\nMiq3XfS/83n0OfzYzH4Wk7+jmV1gZu9Hr3Wxhe+HB81s05jXmugzazXXSQea2a/M7NPoPZqUeo/M\nbD0z+6OZzYzWvWpmW9RzHPqb2WNmNjd6T141s50bOn6NqX+L5u56NNMDGABUA8/ErNs5Wvd+Wtrm\nwH+B1cDTwLWEJucVwNfAphllVAPvAV8AbwE3AHdE5VwGvANURemXAZembXt7tP3nwPWEa7izorRz\nM/ZzeVTO34FFwJ+A3wIXZaz/KzAHuBv4fVTnKuBg4EXgI+Bm4IHoNc4DumXs6/ko3wPRPu4AZkT1\n+kWW4zsm2u8YYBTwZJT+DdA7Y5vhwOJo/38FfkO4Fvou8GRG3kejcj6IXs/vo+NVDfxvws9Ab2Bq\ndBxeiPb3eLT8NTAkytc9eo9ei9bdm3rPMo9RPfuaHL22LsBGwBpgTJa8+wGrgCXAH4HfRZ+lt6PP\n4OcZ+X8V1fcR4LrofXw3OhZPxpQ/FfgqKu/9qPw7gYVRvY7IyL9l9H6tAR6LjtOLUfmfAr0y8l8Q\nrZtF+OxeH+2zGvh9Rt5RUfo7hP+F3xF+iHwFnJH2ubg3yvdadOwTHX+gIzA2et/GR/u4M6rbauDw\njPxjoryPAdOAPwC3Rse3Gjg5I/9jUfrY6HVeF3025wAHNvYzS+i/kfq//QK4Lar3UsJ3zm5pn4n/\nI/QdqAamABbzXfQOMB34Z/T+PRCVswzYLcu+++f7f64lPIpegbb0oP5Am/qnvjst7V/AcmCPjLy7\nEr4Un8lIr44+rHdk2X+2D/N3o20nAZ3T0vsSgtpK0oI6IZBWAwuALWP2k1o/G9g4LX2HKH0h4cur\nY9q6c6K6nZNRVv+Y8jsTvtTnA51ijm8VcF7GNldG6RekpXUkBJFVwJ4x++mX9vy0qOxb0r9UorqM\nj96nvgk+A/dH9bgkI/0nUfmvxhzLKmB4jp+1XaLyRqelvUwIXJk/0MoIQWkVsENG+itROZmBti/Q\nLma/f4zqu3tGeurHxUtAWVr6ltGxmwV0SEuvjPL/KKOcK6L63JOWthkhgH1B2g8pYB3CF3RV+vsL\nzAUmZDlu3WP+Ly7L8dj/Ltrn+Rnp60fHYXbGZ39MtJ9/AuukpQ+J3pN/p6V1i8p+PGa/7TK2z+kz\nS/h+qAY+BNZLSz+Smv/bhzLKuiWqz5EZdUn9H96bkb5XtO7djPQ630251r8lP4pegbb0oCYQfET4\nAr2c8MtwYpQ+B9gsyrt9lHZrlrIei75cuqalVRN+LfbIsk22QJtKPzRmm3Ojdb9OS0sF0muz7CcV\nHC6OWTeF+C/ijaIy70t4LM8mIwClHd9P6zn2j6WlfT9KuzPB/t4nnHHHBZeDonLOaKCMDtH7M5O0\noJKxjypq/zhpbKC9M9puv7S046N6XpWRd88o/dGYcoYRE2jr2W/qx9RlGempQLtrPXU9LFruH5Ux\nKSZvJ0KgWpp6L9KO0S9i8v8PdX/AzgXGJngtOQdaQjP/POCDLOvPjOp6YFpa6oy2znuctm6daLlr\nVKeH8v2ZpeZ74Icxr2lFtG6jjHV7ROVcnpFeTfiRsFHMvl+MytouZt/9G1v/lvxQZ6jiGExohoIQ\nLGcQmld/4+7To/TUUIyNo+tFmfoRzjgGE85EU6a5+/wc65O6djk2Zl0l4R9tu4x0JzQh1ef9mLSZ\nwKCYdbOivxumJ5pZV0Iz5aHRdl0y6tAvZh/vxqR9Ff1dLy1t56iMl2Pyp9ehM7A14YzpErM6fWQ2\niP5uWV85wBaEQPGSu6+KWf96tJ/t0uqbMzPrBPyAcKzTX9uThEsEJxCaQFO2IxyHcTHFjSecBWfu\nw4CfAj+O6tyNms5D2d6X1e4e1+v5H8ApUT2epuazVufz6O4rzGwCcCDheH5Iw59fqP35fRQ4zcwm\nEZrtK4GJ7r46ZvtcbUH4jE3P8n87mHCctiRcekk3qW72Wp/bpe6+2MxeAI41s00IzbyvE84Qq1Mb\nNfEzW+t/093dzL4Gurj7fzPyxv7fRqbH5Ifwfu9DeE/ei1mfz/+5FkGBtjiec/dDG8jTM/p7SPSI\n44TmsXSN6VHcDVjj7gti1s1Ky5OpoX0tikmrAnD3JemJ7l4V/TO1T6WZWXvCl+e3CEH9PsIv3Crg\n28BhhObfBvebVn55WnL36O+MBl5HD8KXY39qfiDV2QW1fwTESR3DbMetvmOdi6OjMu726Oc/g
Lsv\nNbOngB+Z2f7u/mK0KnUc5mQWFH3JfhOzj1uBnxGuKT4R1X0VISD8kvj3ZW6W+qaOR6oeuR6nrPnd\nfa6ZraH2Mf1fwnW+E4GrCO/tYjMbDVzo7kuz7DeJ1P/tt6j5AVCnWtT9v63zPxFJ/chJ/9z+D/Br\n4FjCNVoD5prZ7YTWijUk/8zWqQfZ/2/j0lP1ax+z7uss+50d1a17lvWQv/+5FkGBtuVKfahPd/e7\nctjOG84Su692ZrZeTLDtk1Gfpu4rF4cRfvXe6e61el+a2QXR+qZIvda4X+PpUq/9TXffvQn7S5XT\nJ8v6+o51LlLDw84xs3Ni1jthTG0q0C4kfKn1zswYnbn2Iu0M28w2AE4ndHbZ3d1Xpq3bhRBo46yf\nJT31uhdGf3M9Tun5Z2bUvyfhe27tMXX3KkKAut7MNgRGEM6ozwTWpWnD61L7edTdf9iEcrJy9+WE\nYYCXWJjQZC9C3S8lNKdeQf4+s02xQZb0PoTP4MIs66Fl1D9vNLyn5ZoQ/R3WDPtKNbXWGXpAmEwj\nPU9z2ozwD/m3mHXfyUP5EwkBZr/6MkVnGh8BQ80s7gwgqY8J17p2ic7WM6WOf2xzWhLR8I7vEjp5\n3Z3lMQ841GqG7qT2F3dMh1H3B/mmhOP2anqQraeMlPYWM4SLcI3Y0+rxblp6LWbWkdDRawXheKby\nG/V/ft+Jq5C7z3D3PwH7Es7o01uaUsPfyutsmN1/CD29d7KY9s58c/fP3f1uajoZHRql5+sz2xQD\nzGyjmPTU+5r1c95C6p83CrQtlLtPIATb482szplbNOZujzztbjThi+pyM1vbFGNmfYHzCNeR/5xl\n20L6IqpXrddpZkcShgg11TOEZuOfmNl3M1dGZzspvyc0P94Zfdln5h1qYYKIrKLrso8SftGfm7H9\njwlNjWPcvdHXZwlnqhCGtJwa9yD0cO8AHBflHUdoAj7czHZMq1M5oWk10xfR31o/As1sCHAh9bd0\nXBWVm9pmKKGT1teEoVy4+5eE647fNrNjM7a/kHCm9HDURArhs1kFnJf+HpjZuoRxyE4YvpMa3xoX\n7LsRerOmT+YxL/q7ST2vp5bobDk1pO4aM6vzHWtmu0TX0XNmZr2iY5apD+H7fHlaWpM/s01UDlyd\nsc+9CddnP3D3hn5QFrv+eVOUpmMz2xi4iXDAjTCE4JfRP1gu5VxIGFv5D3eP+zWbyncM4Z/xK3dv\nTZNW/5AwDOYpM/sH4Vf5GkIP2j0J17zi/uly4u6vm9kfCM2BH0TX8ToQeuX2JgxTmNrU/TTC3whf\n6r8ys20Iv3C3BvYndOw5simFu/vK6Iv8OeBVM3uOMPa0J6Ez2rTUPtz9D2Y2DPgRsKeFCdNnETr9\nbEu4ZjyMmOucGS4gnHH+JgrukwgdOg6Ptj2jsa8nOoP6MeHM5sF6st4HnE9oIv29u1eb2emE4z3W\nzB4mjGE9kHDdtVZzrLvPjD4jh5vZRELP2I0IfQleAI7Kst+ZhGtv70bHuidwDOF76IyMDmI/A94A\nRkc/rD4hTOayL/AZIeCm6vOZmV0MXAO8b2apHvlHEP5XbnP3N6LsnYF/mtlHhGP/JeFa4aGE65Xp\n1wM/IvwQO8bMVhGazx24xd0XZzu4URk7En6kHmpmbxCC9sZR+hDC56YxM3RtBLxjZu8Qhi7NIPzw\nOJzwvq+94UQeP7ON9T4wwsz+Sehwtgmh/8BywtCderWA+udPc3dzJnzQpxDehFRHn/ejtM45lDOI\n0EQzk3q66hP+iWYSmtK+aO7Xm1GXAYRf3k/nsE0PwmDvyYQhDQsIPS3vAkZk5K0iYxxmxvr7CIG6\nztjUaP3JhObUJYRrJJVEQy4y8tU75KS+9YQv5TVZtqtTf0Iz5ROEDhQLCWc6+xACShVh3ubM43tP\n0vKj9MGE8a1fEb78/ksIvt+LyXssYUrDuYQvjGmEM7FTk35+Cdc8b4m2XUH4srwPGJDrsc7Iu1+U\n97kEef8VfRa2T0urIPQIXRod77sInZumAp9lbL8O4Ut9KmHI0geE64QD496DVBlReXcRvjCXEcaO\n7peljgOj92VGdJymESbG6JUl/+GEznOLos/wROCkjDztCD8yXiD8iFsevd8vkjHZQ5R/Z8KP3QXR\n66ozPC5LXcoIPxb+GW27lDDRxpOEwJE+lri+/4la/7OE77NLo23+G9V/elTusCxlJPrMZu4r7v2L\nSY/9n4vSXiUE178QfrgtIfSC36mh15nv/7liPyx6Ic3GzM4idEQY4tFZkoWp56YQzpxuSljOC4Q3\nf0ug3LOc0ZrZHwlv9ixgb29dZ7QiItLKFeMa7SHAeE9rinT3aYTrRIl6kZrZDwkTOlzUQL49CM2v\nZza2siIiIk1RjEC7NaEZNNOHJLjeaGbrEZqszvf4cZ+pfO0IM85c5+664bSIiBRFMQJtT8IctZnm\nEa5HNuR64GN3H91AvgsJHXquya16IiIi+dOqJqwwsz0JQxK2byDf5sDFhI48cVPdiYiINItiBNr5\nxJ+5ZjvTTXcHcA8ww8y6E4YGtQPKouXlUWC9hdBLbUJavg6EERDdgZUec/NrM2venmEiItLiuXuT\nJh8pRtPxh4TrtJmGAv9uYNutCGM950ePeYTJDIZFz09Py3dgRr5jCWPQ5hHG3sYqdjfwYj8uv/zy\noteh2A8dAx0DHQMdg9QjH4pxRvsMMMrMBnrobZwa3rMHYTB/fSpi0m4m/GD4OWGcHoQ7l2TOvHIR\n4RZeRxHGn4mIiBRcMQLtXYThNk+bWepWXVcSBl3/MZXJzPoT7rAx0t2vBnD3OrfBMrMFhHG0qZlf\n8DB9YWa+EwlNxm9krhMRESmUZm86dvdlhAmwPyHMsfsg4Ux072hdiqU9Giw26e5zqGqbVFFRUewq\nFJ2OgY4B6BiAjkG+NPvMUC2ZmbmOh4iIpJgZ3go7Q4mIiLQZrWocrYiUtoEDBzJ9+vRiV0PamAED\nBjBt2rSCla+m4zRqOhYprqiZrtjVkDamvs+dmo5FRERaOAVaERGRAlKgFRERKSAFWhERkQJSoBUR\nESkgBVoRkTwpKyur91FeXs7YsXVmks1Zv379uOyyy3LaZuXKlZSVlXHvvfc2ef+SG42jFRHJk/Hj\nx699vnz5ckaMGMFll13GgQceuDZ96NChTd7P888/zwYbbJDTNh07dmT8+PFsttlmTd6/5EbjaNNo\nHK1IcZXSONqlS5fStWtX7r//fk444YQG869cuZKOHTs2Q81avlWrVtGhQ4c66StWrKBTp8wbsyWz\nevVq2rVrh1ndIbEaRysiUmLuuOMOysrKeOeddxg+fDjrrLMOt956KwDnnnsu2267Leuuuy79+/fn\nJz/5Cd98802t7TObjo899lj23HNPnn/+ebbZZhu6du1KRUUFn3zyydo8cU3Hw4YN4/jjj2f06NFs\nttlmdO/enUMPPZSvv/661v6mTp3KvvvuS5cuXRg8
eDAPP/wwhxxySK0z9Wwef/xxdtxxRzp37sxG\nG23Er3/9a6qrq9euv/DCC9lkk02orKxkxx13pFOnTvztb3/jxRdfpKysjDFjxnDQQQex7rrrcv75\n5wPhR8wZZ5xBnz596Ny5M7vtthuVlZW19pt6bbfddhuDBg2iS5cuzJs3r8H6FoKajkWk1Yg5Gcm7\n5jihTp1VHXPMMZx55plcddVV9OzZk+rqaubNm8cll1zChhtuyNdff82oUaPYb7/9mDRpUr1lfvrp\np1x66aVceeWVtGvXjrPPPpsf/ehHTJw4sd7txo4dy5dffsnNN9/MokWLOOusszjjjDN4/PHHAXB3\nDjroINasWcPo0aMpLy9n5MiRzJs3j2233bbeskePHs1JJ53EL37xC6699lo+/vhjLrroIsrKyrjy\nyivXHouFCxdyyimncNFFFzFo0CD69+/PlClTADjxxBM5+eSTOf/88+nSpQsAJ5xwAq+99hrXXnst\n/fv35w9/+AP7778/48aNY6eddlq7/1dffZUpU6Zwww030KFDh7XbN7ti372+JT3C4RCRYmnofzCE\nwcI+8mXJkiVuZv7AAw/UWXfHHXd4WVmZ33XXXfWWUVVV5Z9++qmbmU+cOHFtet++ff3SSy9du3zM\nMcd4x44d/csvv1yb9sgjj3hZWZlPnz7d3d1XrFjhZub33HPP2jy77bab9+rVy5cuXbo27ZprrvH2\n7dt7VVWVu7s//vjjXlZW5pMnT16bZ+rUqV5eXu7f+9736q17v379/Mwzz6yVfvvtt3vXrl198eLF\n7u5+4YUXellZmb/88su18r3wwgtuZn7JJZfUSn/33XfdzPyxxx6rta/Bgwf74YcfXuu1de3a1efP\nn5+1jin1fe6idU2KLWo6FhEpkrim12eeeYZhw4ax3nrr0a5dOwYPHoyZ1WoGjjNkyBA23njjtctD\nhw7F3fnqq6/q3W7YsGG1zvSGDh1KVVUVs2bNAuCtt95i4MCBbL311mvzDBw4sMGz2cmTJzNr1iyO\nOuooqqqq1j5GjBjBkiVL+M9//rM2b/v27dlnn33qlGFmdY7RhAkTaNeuHUccccTatLKyMo466ij+\n8Y9/1Mq72267sd5669Vbz+agQCsiUiR9+vSptTxu3DiOPPJIBg8ezJ/+9CfGjx/PG2+8gbuzYsWK\nesvKDCipzkRN3W7WrFn07t27znZxaelS15X33ntv2rdvv/YxdOhQzIwvv/wyUVmZx2jmzJn06NGD\n8vLyOvnmz59f77bFomu0ItJqlEiH5LUye8A++eSTDBgwgNGjR69Na+hMttD69u0bO/Z3zpw59OvX\nL+t2PXv2BMJ12q222qrO+vRhRnE9gbOt69evH/Pnz6eqqqpWsJ09ezY9evSod9ti0RmtiEgLsXz5\n8jrDWh566KGiBoydd96ZadOmMXny5LVpU6dO5YMPPqh3u2233ZbevXszbdo0dthhhzqP7t27N6o+\nu+yyC2vWrOGpp55am1ZdXc0TTzzBnnvu2agyC01ntM3ommvguutgyBB45BEYOLDYNRKRlmTffffl\nzjvv5IILLuCAAw5g7NixPProo81eD09rOjjiiCPYYostOOKII/jtb39LeXk5V1xxBf369aOsLPu5\nWnl5OaNGjeKUU05h7ty57LfffrRr145PP/2Uv/71rzz//PMN/oDwmCaM7bbbjiOPPJLTTjuNuXPn\nMmDAAG6//XamT5/Oww8/3PgXXUA6o20mH3wAF10E8+fDm2/C1VcXu0YiUmi5nokeccQRXHXVVfz5\nz3/msMMOY+LEiTz99NONLjczX9xytgkc0p///e9/Z9NNN+XHP/4x5513Hueccw6DBg2iW7du9e7/\nhBNO4Mknn2TChAkcddRRHHXUUdx9993svvvuiV5DtjyjR4/mmGOO4bLLLuPII49kzpw5vPjii+yw\nww4NvrZi0MxQaQo5M9RvfgO//nXN8g47wNtvF2RXIq1WKc0MVcrmzZvHoEGDuPjii7nggguKXZ0m\nK/TMUGo6biYvvlh7eebM4tRDRCRXt912G506dWLzzTdn1qxZjBo1CjNLNLWkKNA2i0WL4F//qp02\nezZUVUFGD3URkRanQ4cOXH/99XzxxReUl5ez2267cdddd9G3b99iV61VUNNxmkI1Hf/1r5A2tnqt\nmTNBn1ORGmo6lmLQTQVKQGazcYqaj0VESp8CbYG5K9CKiLRlCrQF9umnMHVq/DoFWhGR0leUQGtm\nG5vZ42a2wMwWmtkTZrZJI8q50MyqzWxsRvq6ZvaomU0xsyVmNt/M3jSzH+XvVSST7WwWFGhFRNqC\nZu91bGadgTHAcuD4KPk3wGtm9i13X56wnEHAJcDsmNUdgNXAb4FpQEfgB8CDZra+u9/SpBeRg5de\nyr5OgVaktgEDBrSYSQak7RgwYEBBy2/2XsdmdhZwPTDE3adGaQOBKcD57n5TwnJeAKYCWwLl7j48\nwTb/BLq4+7ezrM9rr+NVq2D99WHJkvj1RxwBTz6Zt92JiEietdZex4cA41NBFsDdpwHjgMOSFGBm\nPwS2By7Kcd9zgeoct2m0f/4ze5AFndGKiLQFxQi0WwOTY9I/BIY2tLGZrQfcQDj7XZAgf7mZ9TSz\nU4H9gJtzrG+jZV6f3X332ssKtCIipa8YgbYnMD8mfR7QIyY90/XAx+4+uqGMZnYm4VrtN8CtwHnu\n/kAOdW2SzED7k5/UXp45s/TurykiIrW1quE9ZrYncBxwesJNHgF2Ag4A/gjcaGanFKh6tcyeDe+8\nU7NsFq7JrrtuTdqqVeFuPiIiUrqKMdfxfOLPXLOd6aa7A7gHmGFm3QEjvIayaHm5u69KZXb3uYTr\nsgAvmdk6wPVmdq+7V8XtYOTIkWufV1RUUFFRkeQ11fHyy7WXd9wRevWCfv1gypSa9JkzoWfPRu1C\nRETyrLKyksrKyryWWYxex68C7TN7CZvZGAB3H1HPttWAEwJsJgfOrm/oTtSUfAuwibvPiFmft17H\nxx8PDz1Us3zxxeFWed/9LoxNG/X78suwzz552aWIiORZa71N3jPAKDMbGPU2Tg3v2QNo6MaGFTFp\nNxOawH8OfJZg+yXA10kr2xjV1XXHz+6/f/jbr1/tdHWIEhEpbcUItHcBZwJPm9mlUdqVwHTCdVQA\nzKw/8Dkw0t2vBnD3sRllYWYLCONo30hLOxXYDXgF+ApYnzBhxZHAr9x9TQFe11rvvQdfp4Xyrl1h\n2LDwXIFWRKRtafZA6+7LzGwv4EZgNKEZ+BVCs++ytKyW9miw2IzlD4BDgVGEa7/fAP8BDnL3F5r2\nChqW2dt4r72gffvwXIFWRKRtKcqN3939K+DoBvJMBxq8LXrcNV13/xdwcKMr2ESZgTbVbAx1A+2M\nOleKRUSklLSq4T2twZIlMG5c7bT6Aq3OaEVESpsCbZ5VVsLq1TXLm28OgwbVLCvQioi0LQq0eVZf\nszEo0Iq
ItDUKtHnWUKDt0QM6dqxZXroUFi8ufL1ERKQ4FGjzaOrU2rM+tW8PIzK6aplB376103RW\nKyJSuhRo8yjzbHaPPWrPbZyi5mMRkbZDgTaPMgPtfvvF51OgFRFpOxRo82T1anj11dppmddnUxRo\nRUTaDgXaPBk/vnanpt694dvfjs+rQCsi0nYo0OZJXLNxWZajq0ArItJ2KNDmSUPDetIp0IqItB0K\ntHnwzTfw9tu107J1hAIFWhGRtkSBNg9eeQXS7xf/7W9Dnz7Z8yvQioi0HQq0eZBLszGEjlLp12/n\nz4cVK/JfLxERKT4F2iZyh5deqp3WUKAtL697xjtrVn7rJSIiLYMCbRNNnlz7nrLrrBNmhGrIhhvW\nXlbzsYhIaVKgbaLMZuMRI6BDh4a303VaEZG2QYG2iZJOu5hJgVZEpG1QoG2CZcvgjTdqpzV0fTZF\ngVZEpG1oMNCaWQczm2dmhzZHhVqT11+HlStrlgcOhMGDk22rQCsi0jY0GGjdfRWwBtAAlAxxw3rM\nkm2rQCsi0jYkbTr+K3BUISvSGuU6fjZdZqBN77ksIiKlo13CfM8Dt5jZ44SgOxPw9Azu/lqe69ai\nffEFfPRRzXJ5Oey1V/LtdUYrItI2JA20T0R/j4weKQ5Y9Lc8j/Vq8TInqRg2DLp3T7595oQVc+bA\nmjXQLuk7IiIirULSr/URBa1FK9SUZmMIY2179Qo3JIAww9Ts2bDRRvmpn4iItAyJAq27v17oirQm\na9aEGwm8We7sAAAgAElEQVSkyzXQQmg+TgVaCM3HCrQiIqUlp3G0ZtbTzA4ys+Ojvz0bs1Mz29jM\nHjezBWa20MyeMLNNGlHOhWZWbWZjM9IHm9nvzexDM1tsZjPM7Gkz+1Zj6ptp4kRYsKBmef31YYcd\nci9H12lFREpf4kBrZlcD/wWeAR4A/gb818yuymWHZtYZGAMMAY4HjgMGA69F65KWMwi4BJgds3o/\noAK4FzgE+BnQGxhvZtvnUt84mc3G++4bOkPlSoFWRKT0JWo6NrNfAhcD9wAPAbOAvoQgebGZzXH3\nWxLu81RgIDDE3adG5X8ATAFOA25KWM7tUV22pG5HrIfd/baM1zAGmAacBfwk4T5ibbkl7LNPmBVq\n5crk0y5mUqAVESl9STtDnQ7c7O5np6V9DLxuZkuAM4CkgfYQYHwqyAK4+zQzGwccRoJAa2Y/BLYH\njgGeylzv7vNi0haZ2SdAk6+CHnNMeCxbBmPHwk47Na4cBVoRkdKXtOl4IPBclnXPReuT2hqYHJP+\nITC0oY3NbD3gBuB8d1/QUP607XoA2wD/TrpNQ7p0gQMOCL2HG0OBVkSk9CUNtHMJQSrO1tH6pHoC\n82PS5wE9Emx/PfCxu4/OYZ8At0Z/b85xu4JRoBURKX1Jm46fAq4ys7mE659rzKwdcDRwJaFzVMGZ\n2Z6E68I5dWgys4sIzcwnufvnhahbYyjQioiUvqRntBcB7xIC6nIzmw0sB/4EvEfoKJXUfOLPXLOd\n6aa7g9Aha4aZdY+akdsB5dFynVuum9npwG+AS9y9WX4QJJUZaGfNgurq4tRFREQKI+mEFYvNbDhw\nELAnISjOA14Hnnd3r2/7DB8SmpszDaXh66dbEXoZ/yxm3TzgbNI6ZZnZ8cBtwCh3vyZJ5UaOHLn2\neUVFBRUVFUk2a5QuXaBbN1i0KCyvWQNz50Lv3gXbpYiI1KOyspLKysq8lmkNxcjoLPFnwKvuHteJ\nKbcdmp0FjCIM75kWpQ0EPgEucPesvY6jYJ/pZsKZ+c+Bz9x9RpT3COAvwN3uHheY48rP8TdD0225\nJXz8cc3ye+/Bt/IyrYaIiDSVmeHuCW+AmqWMJIHFzJYD+7v72AYzN1xWF0Iz9HLg0ij5SmAdYDt3\nXxbl6w98Dox096vrKW8MUO7uw9PShgMvEno3/wJIb5Bd6e7vZimr2QPtiBGQ/uPphRcaN52jiIjk\nXz4CbdLOUP8BBgFNDrTuvszM9gJuBEYT7v7zCnB2KshGLO3RYLEZyyOADsAOwD8y1k0nvJYWQR2i\nRERKW9JAexlws5m97e4fNHWn7v4VocdyfXmmk+DWe+5e585C7n4FcEWjK9iMFGhFREpb0kD7K2Bd\n4B0zm0bdG7+7u383z3VrEzbcsPayAq2ISGlJGmiryOOMSlJDZ7QiIqUt6fCeigLXo81SoBURKW0N\nTlhhZh3M7KksQ2ukiRRoRURKW4OB1t1XAfskySu5ywy0M2ZAM48wEhGRAkoaPMcBuxWyIm1Vt27Q\nOe129ytWwMKFxauPiIjkV9JAey5wspn93Mw2NrNyMytLfxSykqXMTM3HIiKlLGmA/ADYjDDd4XRg\nFbA67bGqILVrIxRoRURKV9LhPVdSd/YlyRMFWhGR0pV0eM/IAtejTVOgFREpXTlfWzWzdc1sgJm1\nL0SF2iIFWhGR0pU40JrZwWY2CVhIuKvOtlH63Wb2wwLVr01QoBURKV2JAq2ZHQ48DXxDmPc4/Y46\nU4Ef579qbYcCrYhI6Up6Rns5cJ+77wdk3ph9MrBNXmvVxijQioiUrqSBdivg0eh5Zu/j+cD6eatR\nG6RAKyJSupIG2kVAryzrBgJz8lKbNmr99aFdWv/vRYtg2bLi1UdERPInaaB9GbjIzNZLS3Mz6wj8\nHHg+7zVrQ8rKoG/f2mk6qxURKQ1JA+0lQF/gY+BuQvPxhcC7wMbAyEJUri1R87GISGlKFGjdfRqw\nA/AssC/hRvDDgfHAru4+o1AVbCsUaEVESlPSKRhx96+AkwtYlzZNgVZEpDTprjsthAKtiEhpUqBt\nIRRoRURKkwJtC6FAKyJSmhRoW4gNN6y9rEArIlIaFGhbCJ3RioiUJnPX/dxTzMyLdTzWrIEOHSB9\n9ytXhjQRESkOM8PdreGc2SUe3mNmg4DvA/2BThmr3d019KcJ2rWDDTaA2bNr0mbPhk02KV6dRESk\n6XK5Td5HwNXA4cCImEdiZraxmT1uZgvMbKGZPWFmOYcUM7vQzKrNbGzMunPM7BkzmxHluSzX8ptb\nZvPxDE0DIiLS6iW9RnsVUAn0c/cN3X3TjMegpDs0s87AGGAIcDxwHDAYeC1al7ScQYSpIWdnyfJT\noDfwFHXvONQi6TqtiEjpSdp0PAg4193zcZeeUwl3/Bni7lMBzOwDYApwGnXvd5vN7cBDwJZAeeZK\ndx8alV0O/KzJtW4GCrQiIqUn6RntR+TvnrOHAONTQRbWzqU8DjgsSQFm9kNge+CiPNWpRVCgFREp\nPUkD7QXAxVFzbVNtDUyOSf8QGNrQxtGt+m4Aznf3BXmoT4uhQCsiUnqSNh2PJJzR/sfMpgDzMta7\nu383YVk9gfkx6fOAHgm2vx742N1HJ9xfq6FAKyJSepIG2irCvWiLysz2JHSe2r7YdSkEBVoRkdKT\nKNC6e0Ue9zmf+DPXbGe66e4A7gFmmFl3wAivoSxaXu7uq5pSuZEj
R659XlFRQUVFRVOKy4kCrYhI\ncVVWVlJZWZnXMpt9ZigzexVo7+7DM9LHALh71jG5ZlZNGKoTN0uHA2e7+y0Z25QDq4GR7n5lA3Ur\n2sxQACtWQOe0AU5lZbBqFZTX6VMtIiLNoblnhuoHnAt8l3D2OY8wHvYGd5+Vwz6fAUaZ2cCotzFm\nNhDYg9Dpqj4VMWk3Ezp1/Rz4LId6tDidOkGPHjA/Oq+vroY5c6Bv3+LWS0REGi/RGa2ZDQHeIDT5\njgNmAX2B3QnNvXu6+5REOzTrArwLLAcujZKvBNYBtnP3ZVG+/sDnhDPRq+spbwxQHnOGvCNhvG45\n8AjwF+CxaPVz7r4ipqyintECbL01/PvfNcuTJsH2JXlFWkSk5cvHGW3S4T3XAosIk0yMcPdjoybe\nIcDCaH0iUSDdC/gEGA08SDgT3TsVZCOW9miw2Ji0nxOC68PR+qOj5b8AGyStb3PTdVoRkdKStOl4\nBHB6qqk3xd2nm9lIwixNibn7V4TAV1+e6cTM+BSTL/aarrufCJyYS71aAgVaEZHSkvSMtgOwOMu6\nxdF6yQMFWhGR0pI00L4L/K+Z1cpvZgacEa2XPFCgFREpLUmbjq8EniXMDPUoMJPQGepowp13DipM\n9doeBVoRkdKSdMKKF8zsYML9aC8hdFBy4G3gYHd/qXBVbFsUaEVESkvicbTu/gLwQjQ8pwcwP6OX\nsOSBAq2ISGlp9pmhWrKWMI528WLo1q1muUOHMGOUNWkUl4iINEY+xtFmDbRmdhlwt7vPiJ7Xx939\nqqZUpCVoCYEWYN11YenSmuW5c6Fnz+LVR0SkrSp0oK0GdnP3CdHz+ri7t/oZeVtKoB0yBKakzbM1\neXKYMUpERJpXQWeGcvcyd5+Q9ry+R6sPsi2JrtOKiJSORONozay/mbXPsq5dNC+x5ElmoJ0xozj1\nEBGRpks6YcVUst9sfbtoveSJzmhFREpH0kBbX/t0e6Cha7iSAwVaEZHSkXUcrZmtR7jvbMpGZjYo\nI1tn4MeE2+ZJnijQioiUjvomrDgLuJwwA5QDj2fJZ1E+yRMFWhGR0lFfoP0rMI0QSO8lTL/4WUae\nlcC/3f39gtSujVKgFREpHVkDrbu/B7wHYXwp8Ky7z22uirVlCrQiIqVDUzCmaSkTVrhD586wcmVN\n2qJF0LVr8eokItIW5WPCisQ3FTCzrYGfAlsAnTJWu7vv3ZSKSA0z6NsXpk+vSZs5U4FWRKQ1Sjph\nxa6EW+J9D9ifcPeeQUAFsDn1D/+RRlDzsYhIaUg6jva3wJPA1oSgerK7DwT2AcoJHaUkjxRoRURK\nQ9JA+y3gIcIwHwjBFXd/jRBkf5f/qrVtCrQiIqUhaaDtACx192pgHpAeBj4Gtsl3xdo6BVoRkdKQ\nNNB+CqRuHPA+cJKZlZlZGXAimhkq7xRoRURKQ9Jex38DhgMPEq7XPgcsAqqAdYFfFKR2bZgCrYhI\naUgUaN19ZNrzV8xsN+B/gC7AC+7+UmGq13Yp0IqIlAZNWJGmpUxYATBrVu1g26MHzJtXvPqIiLRF\n+ZiwIuk42t3M7PtZ1h0djbOVPOrdG8rS3p3582HFiuLVR0REGidpZ6jfEcbQxtmKHIf3mNnGZva4\nmS0ws4Vm9oSZbZJLGVE5F5pZtZmNjVlnZnaRmU01s+Vm9q6ZHZnrPoqlvBz69KmdNktdzkREWp2k\ngXY7YHyWdRMI42wTMbPOwBhgCHA8cBwwGHgtWpe0nEHAJcDsLFmuBi4DbgEOAP4FPGZmByTdR7Hp\nOq2ISOuXtNdxJ7IH5XJgnRz2eSowEBji7lMBzOwDYApwGnBTwnJuJ0yisWVUh7XMrDdwLvBbd78x\nSn7dzAYD1wAv5FDfolGgFRFp/ZKe0f4HODTLukMJk1YkdQgwPhVkAdx9GjAOOCxJAWb2Q2B74KIs\nWQ4A2gN/ykh/CNjWzAbkUN+i2XDD2ssKtCIirU/SQHsHcIqZjTKzIWbWxcwGm9ko4GTC2WVSWwOT\nY9I/BIY2tLGZrQfcAJzv7guyZBsKrHT3zBvVf0iYq7nB/bQEOqMVEWn9ko6jvcvMtgDOBs5JXwXc\n6O5/zGGfPYH5MenzCHcFasj1wMfuPrqBfcQF4Xlp61u8zEB7/fVwxx3Jti0rg+HD4bbb6naqEhGR\n5pP4frTufp6Z/YFwx571gW+AV9z980JVLpOZ7UnoPLV9c+2zmDID7cqVtW8G35AnnoDu3eGee/Jb\nLxERSS5xoAWImmIzm2NzNZ/4M9dsZ7rp7gDuAWaYWXdCM3A7oCxaXu7uq6Jy1suyD6g5s61j5MiR\na59XVFRQUVHRQJUKZ2geGrj/9jdwDzeTFxGR+lVWVlJZWZnXMrPODGVm/YGZ7r46el4vd/8i0Q7N\nXgXau/vwjPQxUTkj6tm2mtBcHRc2HDjb3W8xs+OB+4HB6WfcZvYTQqAe5O7TY8pvMTNDpVxwAdx0\nE6xe3fgypk6FgQPzViURkTYjHzND1Rdoq4Hd3H1CWoDLyt3L61ufVu5ZwCjC8J5pUdpA4BPgAnfP\nOrzHzIbHJN9M6NT1c+Azd58RDe/5Crja3a9K2/4VoLe7b5el/BYXaCHMCLVkSfL8hx8O48bVLD/y\nCPzgB/mvl4hIqctHoK2v6fhEapqJT6KBQJuDu4AzgafN7NIo7UpgOrC2U1V0Fv05MNLdrwZw97gZ\noBYA5e7+RirN3eeY2Q3ARWa2BJgEHANUEIYXtSqdOoVHUrvvXjvQvvmmAq2ISLHUF2i7UzMRxGtE\nzchN3aG7LzOzvYAbgdGEZuBXCM2+y9KyWtqjwWJj0i4GFhNu4deXMNb3aHd/vgnVbxV22aX28oQJ\nxamHiIjU33RcBQyLmo7XPm/W2jWzltp0nKsvv4T+aVfVO3WCRYugffvi1UlEpDUq9N17FhDOBCGc\nVbb+CNRGbLxx7aFBK1bA5LgpQkREpODqazoeBzxgZu9Fy38ws0VZ8rq7753fqkljmYXm46efrkl7\n803Yvk2MPhYRaVnqO6M9BXgYSPU4bkeYPzju0aGw1ZRc7Zpxh+A33yxOPURE2rqsZ7TuPhs4A9YO\n9Tm11K/RlpLMQKsOUSIixZG1M1StTOFuNzOjWZdKVql0hoLQ+Wm99cKsUBCakxcsgG7dilsvEZHW\npNCdodZy9+mlHmRLTbdusNVWNcvu8NZbxauPiEhblTXQmlmVme0SPa+OlrM91jRflSWpzPG0uk4r\nItL86ut1fCVhGsPU89JoU21Ddt0V7r+/ZlmBVkSk+SW6RttWlNI1WoB33oEddqhZ7tsXZszQnXxE\nRJJqtmu0WXbe08x2NLOOTamAFM4229SeI3nWLPjqq+z5RUQk/xIFWjP7tZn9Lm15ODANmABMMbPB\nhameNEX79rDjjrXT1HwsItK8kp7RHke4k07KtcB7wOHAbOCquI2k+DSeVkSkuOrrDJVuI2AKQHSv\n112Avd290sw6ALcUqH7
SROp5LCJSXEnPaKuomWZxOLCCMBcywBygZ57rJXmSeUb71luwRoOxRESa\nTdJA+yFwnJmtS7gJ/Otp96bdBPi6EJWTphswAHr3rlletgz+/e/i1UdEpK1JGmivBL4PLAT2Jlyj\nTTkQmJTnekmemOkGAyIixZR0CsYXga0IwXZrd389bfVYagdeaWHUIUpEpHiSdobC3acCU2PS78xr\njSTv1CFKRKR4ko6jPczMTkxbHmBm/zKzxWb2eHTtVlqonXeuvfzhh7BkSXHqIiLS1iS9RvtrIK1L\nDTcAGwN/JPRCHpnfakk+9egBQ4bULFdXw9tvF68+IiJtSdJAuxnwPoCZdSZ0gDrH3c8FLgaOKEz1\nJF/UIUpEpDiSBtpOwPLo+e6Ea7svRcsfAxvmuV6SZ+oQJSJSHEkD7TTgO9Hzw4C33X1htLwBYdiP\ntGDqECUiUhxJA+2dwEgzews4A7gnbd0wQFMgtHDbbQcd0+6z9NVX4ZZ5IiJSWEnH0d4M/AT4F3CS\nu9+VtrorcF/+qyb51KEDbL997TQ1H4uIFF7i+9G6+5/c/X/dfXRG+mnu/mD+qyb5puZjEZHm1+gb\nv0vrow5RIiLNL3GgNbNTzewdM1tmZlWZj1x2amYbRxNdLDCzhWb2hJltkmC7/mb2VzObFtVjjplV\nmtn3YvKub2b3mtnXUd7xZrZfLvUsNZlntBMnQlVO75yIiOQq6cxQJwC/ByYShvrcBzwELAI+I9x0\nIJFoHO4YYAhwPOGm8oOB16J19VmXcFu+S4DvEe4ktAh4zswOS9tHh2gf+wHnEcb5fgE8a2bDk9a1\n1Gy2Gay/fs3y4sXw0UfFq4+ISFtg7t5wJrNJwDPAVcBqYCd3n2RmPYBK4C53vzXRDs3OAq4HhkTz\nJ2NmAwk3lj/f3W/K6QWYlRPmYJ7k7odHaccBDwAV7v5GWt73gOXuvluWsjzJ8WjNDjwQnn++Zvne\ne+HEE7PnFxFpy8wMd7emlJG06Xgw4S491dGjA4C7zwd+A5yVwz4PAcangmxUzjTCjeQPy7ZRNu5e\nRRjHW52WvCshoL6Rkf0lYGcz65frfkqFOkSJiDSvpIF2OdAuOt2bBQxKW7eE3GaG2hqYHJP+ITA0\nSQEWlJtZHzO7jNAMnX5GXUU48860Mvq7TQ71LSmailFEpHklDbQfEIIZwBvAxWY2zMx2JtxQIJcr\nfT2B+THp84AeCcu4jhBIZwIXAD9y99fS1n8MdDOzLTK22z2tDm1S5p18PvgAli0rTl1ERNqCpIH2\nj0C36PmlhE5J/wDGEwLwufmvWr1uBHYCDgaeBR4yswPT1v8ZmAuMNrNtoh7IFwN7RuuraaN69Qqd\nolKqqmDSpOLVR0Sk1CW68bu7P5r2/FMz25ow9WIX4J/u/k0O+5xP/JlrtjPduPrMAFITCP7dzMYQ\nOlj9PVq/0MyOIHSIeg8w4FPgckKHrpnZyh45cuTa5xUVFVRUVCSpUquy667w2Wc1yxMmwHe+kz2/\niEhbUVlZSWVlZV7LTNTrOK87NHsVaO/uwzPSxwC4+4hGlDkKOMvdO8Ss2wwod/dPzOxXhDPy3u6+\nPCZvyfc6Brj5ZvjlL2uWv/99ePTR7PlFRNqqfPQ6znpGa2b9cynI3b9ImPUZYJSZDYx6G6eG9+xB\nuN6aEzMzQpPwZ3Hr3f2zKN+6wE+B0XFBti1RhygRkeaT9YzWzKqBxKd37l6eaIdmXYB3CT2ZL42S\nrwTWAbZz92VRvv7A58BId786Sruc0MQ8jtD7uS8heO4FHOvuj6Xt57fA28A3hOFJ5wFrgO+4+4Is\ndWsTZ7QrVkC3brA6rV/2rFnQp0/x6iQi0hIV9IyWMOtS3qOOuy8zs70IHZpGE66fvgKcnQqyEUt7\npEwijNn9AdCdEGzfIwTP8Rm76hPtYwPga+BJQtCODbJtSadO4bZ5b71VkzZhAhxySPHqJCJSqpr9\nGm1L1lbOaAF+/nO47baa5UsugauvLl59RERaooLODBVNCnGImWWd3MHMtjUznQe1QpkzROlOPiIi\nhVHfONrjgIcJMz9lsxh42MyOzWutpODibplX3WZHF4uIFE59gfZ44L5Uz+A40bp7gB/nt1pSaIMH\nw3rr1SwvXAhTphSvPiIipaq+QLsDYRL+hrxCmKVJWpGysrrTMWqYj4hI/tUXaLuSbKam+VFeaWU0\nnlZEpPDqC7TfAAMSlNE/yiutTNx1WhERya/6Jqx4FOjh7vvVW4DZS8B8d/9BAerXrNrS8B6Ar7+u\nPUlF+/awaFEYZysiIoW/8ftNwN5mdqOZxc0h3N7MbiLMynRjUyohxbHBBjBwYM3y6tXw7rtFq46I\nSEnKOjOUu//LzM4F/g/4UXTmOj1aPQDYF1gfODdmViZpJXbZBaZNq1l+803YbbeiVUdEpOTUe5s8\nd7/JzCYBvwKOADpHq5YDlcA17v5GQWsoBbXrrvCXv9Qsq0OUiEh+NXg/WncfC4w1szKgV5Q8192r\nClozaRbqECUiUlia6zhNW+sMBbBsWbiTT1Xaz6Y5c6BXr+zbiIi0FYXuDCVtQJcu8K1v1U7TWa2I\nSP402HQspW+XXeCdd2qWTz4Z+vWDzp3DUJ9OnWqex6Vtvz3sv3+YbaoQ5s+HZ5+F9dcP+ylPdOdj\nEZGWQU3Hadpi0zHAfffBSSc1rYwddoAbb4Thw/NTJwjDje64A0aOhHnzQtq228INN8A+++RvPwDL\nl4fbBj7wAHTtCmedBUcfXbgfDyLSOuSj6ViBNk1bDbSffx5uMpCPu/cceSRcdx1stlnjy3CHv/8d\nzjsPPvooPs/BB8P118MWWzR+PxCC+b33wpVXwowZtdftsANccw3su2/T9iEirZeu0UpeDBoEV1yR\nnxmhnnwShg6FCy4IdwTK1eTJcMABIZBmC7IQmpK32QZ++cuas91cVFfDww+Hup5+et0gCzBpEuy3\nXzh7njgx932IiIDOaGtpq2e0KUuWhGkZly+HFSvq/o1Le+89eOqp+PJ69w5nij/9KbRroDfAnDlw\n2WXwxz/Gn1l37hz2Gadnz9C8fPrpYRrJ+qTOli+5JNQ9F0cdBb/5DQwZktt2ItJ65eOMFnfXI3qE\nwyG5+sc/3Hfe2T2EsbqPbbZxf+ml+G1XrHAfNcq9W7f4bc3cTz7ZfeZM9zffdN999+z72XJL92ef\nda+ujt/X66+777FH9u07dnQ/+2z3o4/Onqe83P3UU93/+9/CHU8RaTmiuNC02NLUAkrpoUDbeFVV\n7g8+6L7RRtmD1EEHuf/nPyF/dbX7k0+6b7ZZ9vwVFe7vvFN7P9XV7o884j5gQPbt9t3X/YMParaZ\nNMn9gAMaDp5fflmzzcSJ7nvvnX2bzp3df/Ur93nzCn5oRaSI8hFo1XScpq03HefD0qWhk9J114XJ\nMDK1awenngoffgivvx5fxmabhTIOOwwsS4PN8uVw003w29+GJu9MZWVwyilhaFD6
FJOZjj02XJ8e\nPDh+/SuvwIUXwttvx6/v0SOs/9//Dc3bIlJa1Os4zxRo8+err+Dii+HBB5Nv0707XHop/Pzn0LFj\nsm1mzYJf/zr0HM7lrTvooHC9dbvtGs5bXQ2PPx6u6376aXyeDTcMPyBOOgk22SR5PUSkZVOgzTMF\n2vybOBHOPhvGjcuep7wcTjstdGjq3btx+3n3XTjnHBgzpv58w4eHs+A99sh9H6tXwz33hDPgWbPi\n85SVhV7Tp5wSgnlDnbNEpGVToM0zBdrCcIfHHgtDfqZPr71u//3h//4Ptt46P/t55pkw/jbzzHP7\n7UOA3X//7M3RSS1dCrfcAtdeW/8Qpr594cQTQ6/rQYOatk8RKQ4F2jxToC2sFSvg5pth9Ohw5vqr\nX8H3vpf//axaBbfeGpqT11svzPL0P/+T/1me5s4NE1rcdlv2oUcpe+8dznIPPzx5s7iIFJ8CbZ4p\n0EpjLFgAf/4z3HVXaMKuT69e8OMfh7PcLbeMz+Me7qa0cmX40bBqVc3zXr3CjwcRaR4KtHmmQCtN\n4R56J991Vwi8cb2h0w0aFLaJC6jZPoZm4SYQBx8cHttt1/SmcBHJrtUGWjPbGLgJ2Acw4BXgl+7+\nZQPb9QduAb4NbAAsBT4ErnX35zPy9gJGAgcCfYFZwHPAFe7+TZbyFWglL5YsgUcfDUH3zTcLt5+N\nNgqdrg4+ODRPd+lSuH2JtEWtMtCaWWfgfWA5cEmU/BugM/Atd896tcvMhgJnA5XAV0A34BTgYOAI\nd386Le94YFPgUuAjYChwFTDF3XfPUr4CreTd+++HgPvQQ6GZuVA6dYK99gpB96CDoH//wu1LpK1o\nrYH2LOB6YIi7T43SBgJTgPPd/aYcyysHpgKT3P3wKG0IIbie6u53p+U9Dbgd2NLdp8SUpUArBbN8\neRiPe9dd8MYb9ectL4cOHcKjY8fwt6wMvvgi+f623TYE3WHDQhBu1672o3377Mvrr9/w/NQibUFr\nDbSvAB3dfc+M9ErCVFcjGlHmB4Qz1SOj5a0ITco/cPfH0vIdA/wJGOruH8eUo0ArzeLrr8Ndh1LB\nND2gduiQ/eb2//1vuCnCc8/Byy/Hz76VD716wbnnwvnnZ6+LSFvQWgPtTOCv7v6zjPTbgKPcvU+C\nMoxwi79ewGmEJujvuftraXmeBzYCTiSc3W4N3Ad87u6HZClXgVZajRUroLIy3DLw2WfrjlHOh+98\nJwzH2nTT/Jct0hq01vvR9gTmx6TPA3okLOM6YDUwE7gA+FF6kI0cCUwHJgKLgfHAZ8BRjaizSIvT\nqRfLoMQAABLhSURBVFOYherWW2Hq1HAv32uuCcExX2OG//GP0LP5/vtzm+JSRGoU44x2JfB/7n5x\nRvpVwK/cvUOCMjYk9CTuC5wAHA4c6e5/j9Yb8CywHaHn8UfAVsCVwNvufnCWcnVGKyVh7lx48cXw\n+PprWLOm5rF6de3lzLR588LZcqYjj4Q77wzNylLbihUwaVJoyt92W+jTYLuctBattel4FvBUU5qO\nY8ocA/Rx96HR8mHAU8Be7l6Zlm8f4CXgMHf/W0w5fvnll69drqiooKKiItfqiLRqs2aFmyM8/3zd\ndX37hhm3CjGjV2uyfDmMHx+a7l9/PTxfubJm/cYbw847h8dOO4VHj6TtdVJUlZWVVFZWrl2+4oor\nWmWgfRVo7+7DM9LHADSyM9Qo4KzU2bCZXUgYMrSeuy9Oy9cNWABc6O7XxZSjM1oRQjPxHXeEDlFx\n00uecQaMGtV2xu0uXQr//GcIqq+/DhMmhIlFcrH55iHgpgLw9tvDuusWpr7FlJqEZenScIaf5O+a\nNdCtW7iDV/futZ+nlrt0ST45S3V1qMOKFeHzm/rrHqZ/7dUreSe/1npGexYwijC8Z1qUNhD4BLig\nEcN7DPgX0N3dt4rSjgfuB/Zx9zFpefcDXgCOd/c/xZSlQCuS5uOP4bjj4K236q7bYoswNninnZq/\nXoW2eHEIrKkz1okTQzDIp7Iy2Gqr8Mh1KFV5eQg8XbrAOusk+9uuXW7Bb9myEJzSZyzLnMEs7vnK\nlSHQ5Vu7drUDcKdO8cF0xYrarQtxyspCsO3TJ7TS9OkT/+jbF/r1a52BtgvwLmHCikuj5CuBdYDt\n3H1ZlK8/8Dkw0t2vjtIuJ3SmGkeY6akv8FNgL+DY1FAeM1uXMLynHLiammu0lwErgK1T+8momwKt\nSIbVq+Gqq8L9ezO/QNu1g8svhwsvbL3jbtesgQ8/DDN4TZgQ/v7737kHi002gQ02gA8+yP1sV1qy\nVhhoYe0UjDcC+1IzBePZ7v5FWp4B1ATaq6K0Q4CzgG2A7oRg+x5wjbuPz9hHP0JHqL2BflHelwlT\nMM7MUi8FWpEsxo8PZ7effVZ33bBh8OCDYf7muA5X2TpgNdTEGJfWvn2YejL12HjjmufdutX/Gtzh\nyy9rAuqbb4b5qRszHnnTTeG73615DBwYmjZXrQrBduLE0BIwcWII5FVVue9DWoJWGmhbKgVakfot\nWQJnnw13391w3mLo2jU+AC9YUBNcZ81qXNmbb147sOYyxeWyZeHOTunB9+M6U+aUjg4dkjdpr7NO\naMpdtCg8Fi4Mj/TnCxc23BycqWNH6Nw5NDGn/lZXh1748+MGmGalQJtXCrQiyTzzTLjV35w5xa5J\n4Wy5JQwfDhUV4e9GG+W3/IULw5CgxgT+NWtyu9aa6nC0zjr1B7zMtM6da2YsS5+5LNvz1HIhLiOs\nXFkTfBctCtdj04No6m+nTqEO9Y0lX7UqBNzZs2ses2bVXk495s5VoM0rBVqR5GbPDsH22WeLXZOm\n22AD2HXX8Nhll9ArWPf9FWilvY5bMgVakdy4h2bkUaNgSnSbjoZuXpCelmpizKWZcfnyMOfzf/8L\nX31V+3mSTkidOsGOO9YOrAMG6L6+Ek+BNs8UaEUaxz109ikvL17Acg8zYsUF4PJy2GGHEFi32SYE\nepEkFGjzTIFWRETStdabCoiIiLQZCrQiIiIFpEArIiJSQAq0IiIiBaRAKyIiUkAKtCIiIgWkQCsi\nIlJACrQiIiIFpEArIiJSQAq0IiIiBaRAKyIiUkAKtCIiIgWkQCsi/9/e/QfLWdV3HH9/ktAR+SGE\nCtHIDUUTLMUAAUpA+WHMUJAg8mvGTg0p0VrrWIFRLJaSYAewP1IFpUgLQwdFSkRLkQACQ0gIgQgS\nMMgPIRJAfoSEJIRCMAnJt3+cs8nmye69e+/u3ofsfl4zO3f37NnznOfcc+93n/Oc5zxm1kYOtGZm\nZm3kQGtmZtZGDrRmZmZt5EBrZmbWRg60ZmZmbeRAa2Zm1kYOtGZmZm3kQGtmZtZGDrRmZmZtVEqg\nlfQBST+R9Jqk1ZJ+KmnPBj7XI+l/JT0raY2k5ZLmSDqukG+KpI11Hhsk7d6+vTMzM9tMETG4G5S2\nBxYBbwHn5eSLgO2BsRHxVi+f3Rc4G5g
DvADsDPwVMAk4KSJuyvl2Az5Y/DgwC1gcEYfVKT8Guz3M\nzOydSxIRoabKKCHQngnMAMZExJKcthfwNHBORFzSz/KGAkuAhRHx6V7yHQHMBb4UEVfUyeNAa2Zm\nm7Qi0JYxdHwCsKASZAEi4llgPnBifwuLiA3AamBjH1mnAGuB6/u7jW4yZ86csqtQOreB2wDcBuA2\naJUyAu2fAL+ukf4YsG8jBSgZKmkPSdOAMcBlveR/F3AqcHNEvDaAOncN/2G5DcBtAG4DcBu0yrAS\ntjkcWFUjfSWwa4Nl/Avw1fx8DfAXETG7l/wnATsB1zRaSTMzs1bYVi/v+Q5wMGkS1CzgWkmf7CX/\nFGAZcNsg1M3MzGyTMiZDLQVujIi/KaT/O3BqROwxgDLvBvaIiK2GniWNAH4HXBoRX+ujHM+EMjOz\nLTQ7GaqMoePHSOdpi/YFHh9gmb8Ezqzz3mTSkfsP+iqk2cY0MzMrKmPo+GfA+HxJD7Dp8p6PAjf1\ntzBJAo4Aflsny2RgUUQs6m/ZZmZmzSpj6PjdwCOkBSvOz8n/COwA7B8Ra3K+HuAZ4IKIuDCnTSdN\nppoPLAVGAJ8HJgB/HhE3FLY1jnS0e3ZEXNrmXTMzM9vKoB/R5kA6AXiKNJz7Q9LR6CcqQTZT1aNi\nIWnY+bvA7cA/k2Ydf6wYZLPTgXXAdfXqM9DlIDuFpKPqLFW5suy6tYukkZK+J+k+SW/m/e2pkW8X\nSVflpT7fkHSnpP3KqHOrNdIGkkb1sozpzmXVvRUknSrpRknP5+Vcn5R0saQdC/k6uQ/02Qad3AcA\nJB0j6S5JL0v6vaTfSZop6Y8L+ZrqB4N+RPtO0sxykJ1C0lHAbOBvSUf/FW9HxMJyatVeeZ+vBx4C\nhgLHAH8UEc8X8t0L9ABfA14D/p70RW//iHhpUCvdYo20gaRRpFXXLgJuLhTx4La8jJqk+0nLuN6Y\nfx4AfBN4IiIOr8rXyX2gzzbo5D4AIOkzwIHAL4DlpN/1N4APAPtFxAs5X3P9ICK69kGaQLWe9A+m\nkrZXTjur7PoNUhscBWwAJpRdl5L2/3N5/3sK6Sfm9COr0nYGVgCXlF3vQWqDUaQV16aWXcc27PNu\nNdIm53Y4uhv6QINt0LF9oJd2GZP3+axW9YNt9TraVmnpcpDbMM+23toJwEsRcU8lISJeJ32r76a+\n0ZEiYkWN5AdJfwsj8+uO7gMNtkE3qpw225B/foom+0G3B9qml4PsID+S9LakVyX9qJvOU9fRW9/o\nyZP6usW3JK3P8xhu6pRzlDUcDQSbLzPsxj5wNKkNniikd3QfkDRE0naSRgP/QZpsOzO/vS9N9oMy\nrqN9J2nFcpDbutWkuynNBV4nna84D7hP0oER8WqZlSvRcNK5qaLKt91dSRPxOtla4ArgDtL5qw+T\n+sZ8SYdExFNlVq6VJI0knZ+8MyIezsld1QcKbVCZn9EtfeAXwEH5+bPAxIhYll833Q+6PdB2vYh4\nhHS5VcU8SfOAB0gTpKaXUjErXUQsBb5UlTRf0u2kb/LnkZY23eZJ2oF0Df86YGrJ1SlFvTbolj4A\nfJZ03nVv0oSn2yV9NAoTJAeq24eOV1H7yLXekW5XyN/onwL+tOy6lKi3vlF5v+tEmoV5Lx3SN5Tu\n7DWLNAnyz2LLGaRd0Qf6aIOtdFofAIiI30TEgxExE5gI7Aicm99uuh90e6Btx3KQ1hl66xvPx5bX\nfNs2SNIw4KfAOOC4iCj+zXd8H2igDbpORKwGFgMfyklN94NuD7QtXQ6yU0g6GNgHWFB2XUr0M2Ck\npCMqCfkC/RPo7r7RA3yMbbxvSBJpIZujgRMj4sEa2Tq6DzTYBrU+1xF9oB5Je5DORS/OSU33g25f\nsKKh5SA7maTKylwPkyZDjSMNmbwBHBQRHblClKRT8tOJwF+TzkMtB5ZHxD35n9C9pAvXv066SP0b\nwH6kvvHi4Ne6tRpogxmk6wkXkCZ+fJjUN3YCxkfE04Nf69aQ9H3SPl8I3FJ4+4WIeLHT+0CDbdCx\nfQBA0v+QVhxcRPr/tw9wFrA7cGhELG5JPyj74uCyH7nxbsiNt5o0jNJTdr0Gcf/PJX3ZWEWaYfgc\n8H3SbQdLr18b93sj6Tq54mN2VZ5dgKuAV0lfPO4grRZTev0How2AM0izMVfkvvESacnU0WXXvQX7\nvqTOvm8ApnVDH2ikDTq5D+T9O4d07fDK/Pt9Ari8GAOa7QddfURrZmbWbt1+jtbMzKytHGjNzMza\nyIHWzMysjRxozczM2siB1szMrI0caM3MzNrIgdbMzKyNHGjNmiRpsqTnql4/JumLLd7GeEkLJL0h\naYOksXXyTZe0oer1e3LaAa2sT39I2j/XYZca722UNK2MepkNFgdas+aNA34Jm243tg/wUIu3cTUw\nFDgeOIx0d6VarszvV+xCutXhuBbXpz8OyHUYXuO98aQVd8w6lu9Ha9a8g4Cf5+fjSEvY/apVhUsa\nAowBLoyIub3ljXSLs+rbnKlV9SjUabuIWN9odqDmEnQR8UDramX2zuQjWrMm5CB4AJuPYA8BHo+I\ndQ1+fidJl0l6UdLvJT0p6ayq96cAb5OC1bQ81PpML+VdIGljfj4KeIYU5K7Kn90g6fSq/CdLul/S\nm5JWSfqxpD0LZS6R9ENJZ0h6QtJa4JP5vW9KekjSaknLJd0l6dBC/a/OLxdX1aEnv7/V0LGkYyXd\nJ2mNpNck3ShpTCHPHEnzJH0ib/9NSY9K+nQh3+j8+VckvSXpOUkz8+/NbFC4s5kNQA4+G0lBcAfg\n1vx6BjC2GFDqlCHgVmAK8K/AJOA24NuSLszZZpFu2yjSEOt44KReqhZsPnp8GTg5f/ai/NnDyHdq\nyeeRfwL8GjgF+ALpjiRz8hB4tY8DZwMXAMeS7nYC8H7gEuBTeT9eAeZKqty/cxbp7jDkbVTq8HKd\nNjk2f+Z14DTgi7lO8yS9r7CfH8zbnpHb5GXgx5L2rsp3K/A+0l1qjgH+jrQ4vv/32eAp++4Jfvix\nLT5ItwsbC/wb8Cjwkfx6NfCV/HwsMKyXMiaR7qAzuZB+JenWjcPz66E537QG6jUd2FD1elT+7NRC\nvh1Id6y6spA+ihSIvlKVtoR0x5L39rHtIbmuTwLfqUqfQhpO37vGZ7bYL9K57t8AQ6rS9gLWATOq\n0u7O9dy7Ku29pC8+5+bXu+XyJ5XdX/zo7oe/1ZkNQEQ8GRGLgD2BORHxKLAG2BG4ISIW5cfbvRRz\nBCkA/Xch/VrgD9hyUlOrHUa6p+h1koZWHsCLpEB5ZCH/gohYXixE0kRJsyW9Sgpy64HRpAlh/ZLv\nD30gMDMiNlbSI+JZYD5wVOEjT0fEM1X5lgPLgJ78egVp6PyfJH1e0of6WyezVnCgNesnSUNyYBpG\nGta9PwepI0mBall+3ZfhwMoawXgpabi31izdVtk9b+MuUnCsPNaRhmp3K+TfaqhX0oGkYejXganA
\nocDBpGHldw2gTrvmOtUaVl7K1u2xska+tYVtTyQdJV8MPCXpt62+9MqsL551bNZ/d7H56CpIN8K+\ntur1eiAkfTwi7umlnJXAcEnDCsF2RNX77bIi/zwdeLzG+/9XeF1r1vAppH09ufoIVNKuwKoB1GlV\n3s6IGu+NYADtkY+G/zLXayzwZeBySUsi4vYB1NGs33xEa9Z/XyAduc0AFufnBwPLgfPy80Po+1ra\nuaRzmqcV0j9LOjK7vwV1XZt/bl9Iv48UTEdHxMIaj6cbKPvdpKHvTSRNIA/dNlCHLUTEGlKbnZYn\nilXKHAUcTjovO2B5qP+r+eV+zZRl1h8+ojXrp0oQypel3BIRD0vaB/hD4OqIWNZgUbcB9wJXSNod\neIy0IMVU4OKIaMUR7Suko9fPSHoUeBNYEhErJZ0DXJa3fRtpItdI0tH63RFxfR9l/xw4E7hG0n+R\nzsv+A/BCId/jpCHhL0u6hnQU/Ks656/PJ806vkXS5aTzyBeQjna/3Z8dl/QR4FJgJukL0VDgjLz9\n2f0py6wZPqI1GwBJ2wETSAEK0iUvC/sRZImIIF2Peg3wdVKAOQ44OyLOL2anzqIPtYoubONzpPOf\ndwIPkGY7ExH/SbosZwzwA9L51umkgPRIX9uOiDtIM6wPB24mDdFOJgW16josyuVOAublOry/Vtl5\nOPd44D2kAHk56QvIERGxtN5+1qnrUuA50mVJNwHXkYagj4+Ih2t81qwtlP4OzczMrB18RGtmZtZG\nDrRmZmZt5EBrZmbWRg60ZmZmbeRAa2Zm1kYOtGZmZm3kQGtmZtZGDrRmZmZt5EBrZmbWRv8Pcs4s\nmJaXbXcAAAAASUVORK5CYII=\n", "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "plt.rcParams['figure.figsize'] = 7, 5\n", "plt.plot(range(1,31), error_all, '-', linewidth=4.0, label='Training error')\n", "plt.title('Performance of Adaboost ensemble')\n", "plt.xlabel('# of iterations')\n", "plt.ylabel('Classification error')\n", "plt.legend(loc='best', prop={'size':15})\n", "\n", "plt.rcParams.update({'font.size': 16})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Quiz Question**: Which of the following best describes a **general trend in accuracy** as we add more and more components? Answer based on the 30 components learned so far.\n", "\n", "1. Training error goes down monotonically, i.e. the training error reduces with each iteration but never increases.\n", "2. Training error goes down in general, with some ups and downs in the middle.\n", "3. Training error goes up in general, with some ups and downs in the middle.\n", "4. Training error goes down in the beginning, achieves the best error, and then goes up sharply.\n", "5. None of the above\n", "\n", "\n", "### Evaluation on the test data\n", "\n", "Performing well on the training data is cheating, so lets make sure it works on the `test_data` as well. Here, we will compute the classification error on the `test_data` at the end of each iteration." 
] }, { "cell_type": "code", "execution_count": 38, "metadata": { "collapsed": false }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Iteration 1, test error = 0.42330891857\n", "Iteration 2, test error = 0.428479103835\n", "Iteration 3, test error = 0.398104265403\n", "Iteration 4, test error = 0.398104265403\n", "Iteration 5, test error = 0.379900904782\n", "Iteration 6, test error = 0.380008616975\n", "Iteration 7, test error = 0.379254631624\n", "Iteration 8, test error = 0.380008616975\n", "Iteration 9, test error = 0.379254631624\n", "Iteration 10, test error = 0.379685480396\n", "Iteration 11, test error = 0.379254631624\n", "Iteration 12, test error = 0.377962085308\n", "Iteration 13, test error = 0.379254631624\n", "Iteration 14, test error = 0.377854373115\n", "Iteration 15, test error = 0.378500646273\n", "Iteration 16, test error = 0.377854373115\n", "Iteration 17, test error = 0.377962085308\n", "Iteration 18, test error = 0.377854373115\n", "Iteration 19, test error = 0.378177509694\n", "Iteration 20, test error = 0.376884963378\n", "Iteration 21, test error = 0.377531236536\n", "Iteration 22, test error = 0.376777251185\n", "Iteration 23, test error = 0.376777251185\n", "Iteration 24, test error = 0.376884963378\n", "Iteration 25, test error = 0.376777251185\n", "Iteration 26, test error = 0.376561826799\n", "Iteration 27, test error = 0.376454114606\n", "Iteration 28, test error = 0.376992675571\n", "Iteration 29, test error = 0.376777251185\n", "Iteration 30, test error = 0.376777251185\n" ] } ], "source": [ "test_error_all = []\n", "for n in xrange(1, 31):\n", " predictions = predict_adaboost(stump_weights[:n], tree_stumps[:n], test_data)\n", " #error = 1.0 - graphlab.evaluation.accuracy(test_data[target], predictions)\n", " error = 1.0 - evaluate_accuracy(test_data[target], predictions)\n", " test_error_all.append(error)\n", " print \"Iteration %s, test error = %s\" % (n, test_error_all[n-1])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Visualize both the training and test errors\n", "\n", "Now, let us plot the training & test error with the number of iterations." 
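] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A minimal sketch of such an overlay plot, reusing the styling of the training-error plot above (the line styles and sizes are carried over from that plot, not prescribed by the assignment; the assignment's own plotting cell and its output follow below):" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "plt.rcParams['figure.figsize'] = 7, 5\n", "# Overlay training and test classification error of the first n stumps.\n", "plt.plot(range(1,31), error_all, '-', linewidth=4.0, label='Training error')\n", "plt.plot(range(1,31), test_error_all, '-', linewidth=4.0, label='Test error')\n", "plt.title('Performance of Adaboost ensemble')\n", "plt.xlabel('# of iterations')\n", "plt.ylabel('Classification error')\n", "plt.legend(loc='best', prop={'size':15})\n", "plt.rcParams.update({'font.size': 16})"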
] }, { "cell_type": "code", "execution_count": 39, "metadata": { "collapsed": false }, "outputs": [ { "data": { "image/png": "<base64-encoded PNG omitted: plot of training and test classification error vs. number of iterations>
+9xhFHHEGDBg1o27Ytf/vb3ygqKipePmzY\nMNq3b09ubi5HHHEE9evX55133mHKlCmkpaUxffp0Tj/9dBo3bszNN98MuB851157La1bt6ZBgwb0\n6tWL3NzcUvsNvbZHH32Uzp0707BhQ9avX19heRPBmqaNMbWGT2Um7qqiQh6qlQ0ePJghQ4YwevRo\nWrRoQVFREevXr2fEiBHst99+rFmzhjFjxtCvXz/mzp1b7jYXL17MbbfdxqhRo8jIyOD666/n/PPP\nZ86cOeWuN2PGDJYvX86DDz7I5s2bue6667j22mt57bXXAFBVTj/9dPbs2cPEiRNJT08nJyeH9evX\nc9hhh5W77YkTJ3LZZZfx17/+lXvuuYeFCxcyfPhw0tLSGDVqVPGx2LRpE1deeSXDhw+nc+fOdOjQ\ngUWLFgFw6aWXcvnll3PzzTfTsGFDAC666CKmTZvGPffcQ4cOHXjsscc45ZRTmDVrFr/5zW+K9z91\n6lQWLVrE2LFjqVu3bvH6VU5V7eE93OEwxlRXFX1GXZhM7CNetmzZoiKiEyZMKLPs8ccf17S0NB0/\nfny52ygsLNTFixeriOicOXOK09u0aaO33XZb8fPBgwdrvXr1dPny5cVpL730kqalpenSpUtVVXXH\njh0qIvr0008X5+nVq5e2bNlSt27dWpx29913a506dbSwsFBVVV977TVNS0vTr7/+ujjPkiVLND09\nXU899dRyy56VlaVDhgwplT5u3Dht0qSJ/vLLL6qqOmzYME1LS9MPPvigVL7JkyeriOiIESNKpc+f\nP19FRF999dVS++rSpYueddZZpV5bkyZNdMOGDVHLGFLeeectq1TssaZpY4yppvyadt9++22OPvpo\nmjVrRkZGBl26dEFESjUz++natSvt2rUrft6tWzdUlRUrVpS73tFHH12qptitWzcKCwtZvXo1AJ9/\n/jmdOnXikEMOKc7TqVOnCmvDX3/9NatXr2bQoEEUFhYWP/r27cuWLVv47rvvivPWqVOHk046qcw2\nRKTMMfrss8/IyMhg4MCBxWlpaWkMGjSI//3vf6Xy9urVi2bNmpVbzqpggdgYY6qp1q1bl3o+a9Ys\nzj77bLp06cK///1vZs+ezcyZM1FVduzYUe62IgNOqLNTZddbvXo1rVq1KrOeX1q40HXtE088kTp1\n6hQ/unXrhoiwfPnyQNuKPEarVq2iefPmpKenl8m3YcOGctdNFrtGbIypNWpJh+pikT14J02aRMeO\nHZk4cWJxWkU14URr06aN79jntWvXkpWVFXW9Fi1aAO468cEHH1xmefgwKr+ezNGWZWVlsWHDBgoL\nC0sF459//pnmzZuXu26yWI24ChUUwEknQVYWjB2b7NIYY2qa7du3lxm28/zzzyc1oBx55JHk5eXx\n9ddfF6ctWbKEr776qtz1DjvsMFq1akVeXh49e/Ys88jMzNyr8hx11FHs2bOHN954ozitqKiI119/\nnd69e+/VNhPNasRV6KabYOpU9//NN8NZZ0HnzsktkzGm5jj55JN54oknGDp0KL/73e+YMWMGL7/8\ncpWXQ8OaHgYOHMiBBx7IwIEDufPOO0lPT+eOO+4gKyuLtLTodb309HTGjBnDlVdeSUFBAf369SMj\nI4PFixfz5ptv8v7771f4A0N9mkC6d+/O2WefzdVXX01BQQEdO3Zk3LhxLF26lBdffHHvX3QCWY24\niuzaBZMmlTwvKoIKRg0YY2q5WGuyAwcOZPTo0bzwwguceeaZzJkzh7feemuvtxuZz+95tAkuwv9/\n77332H///bn44ou56aabuOGGG+jcuTNNmzYtd/8XXXQRkyZN4rPPPmPQoEEMGjSIp556imOOOSbQ\na4iWZ+LEiQwePJiRI0dy9tlns3btWqZMmULPnj0rfG3JYDNrhUnkzFrTpsGJJ5ZO++c/4YYbErI7\nY2ql2jSzVm22fv16OnfuzK233srQoUOTXZxKS/TMWtY0XUX+85+yaStXVn05jDEm3h599FHq16/P\nr371K1avXs2YMWMQkUBTdxoLxFVCFd55p2y6BWJjTG1Qt25d7rvvPpYtW0Z6ejq9evVi/PjxtGnT\nJtlFqxGsaTpMopqmFy6Egw4qm37ccTBzZtx3Z0ytZU3TJhnspg+1gF+zNFiN2BhjTJICsYi0E5HX\nRGSjiGwSkddFpP1ebGeYiBSJyIyI9MYi8rKILBKRLSKyQUQ+FZHz4/cqgvNrlgbIz699ExAYY4yJ\nTZVfIxaRBsB0YDtwoZf8D2CaiByuqtsDbqczMAL42WdxXWA3cCeQB9QD/gg8JyL7qOpDlXoRMdiw\nASKmNy22c6eb5KNly6oqjTHGmOomGZ21rgI6AV1VdQmAiHwFLAKuBh4IuJ1xwPPAQUCpSUVVdT1w\nQUT+ySJyIHAZUGWBeMoUKCyMvnzlSgvExhiTypLRNH0GMDsUhAFUNQ+YBZwZZAMich7QAxge474L\ngKIKc8VRtOvDIXad2BhjUlsyAvEhwNc+6d8A3SpaWUSaAWOBm1V1Y4D86SLSQkSuAvoBD8ZY3r22\nZw+8917ptLB5zAELxMYYk+qSEYhbABt80tcDzX3SI90HLFTViRVlFJEhuGvF64BHgJtUdUIMZa2U\nTz5x14hDWrSAs88unccCsTGpIy0trdxHenq6752M9sZ3333HHXfcwbZt2+KyPZM4NWpCDxHpjbv2\n2yPgKi8BnwAtgQHA/SKyQ1XHJ6iIpUQ2S592GnToUDrNArExqWP27NnF/2/fvp2+ffsycuTIUje3\n79atwobBQL799lvuuOMOrrnmGho2bBiXbZrESEYg3oB/zTdaTTnc48DTQL6IZAKCew1p3vPtqror\nlFlVC3DXhQH+KyKNgPtE5BlV9e1ClZOTU/x/dnY22dnZQV6Tr8hhS/37Q8QdzMjP3+vNG2NqmKOO\nOqr4/61btwLQuXPnUunxoqoJv6nBzp07qVevXpn0HTt2UL9+/b3aZlFREUVFRWRkVM96Ym5uLrm5\nufHdqKpW6QOYCszwSZ8OTK9g3SKg0Psb+SgE/lrB+kO8fPtFWa7xsnixqhsl7B4ZGaobNqh++mnp\n9O7d47ZLY2q9eH5Gk23Lli0qIjphwgTf5UuWLNFBgwZps2bNtFGjRnr66afrjz/+WCrPHXfcoZ07\nd9b69etrmzZt9PTTT9f169fr5MmTVUQ0LS1NRURFRA8++OByyzNt2jQ97rjjtEGDBtqyZUu95ppr\ndNu2bcXLH3vsMRURnTt3rvbu3VsbNmyo9913n37//fcqIvrKK6/oeeedp5mZmXrGGWeoquqePXv0\n1ltv1fbt22u9evX0sMMO01dffbXUfgcPHqzHHXecvvLKK3rwwQdrnTp19PPPP9+bQ5ow5Z133rJK\nxcVkXCN+G+glIp1CCd7/xwJl7+dVWjbQ1/sbeiwAvvL+fy3A+luANTGUd6+8+27p5717Q7Nm0LZt\n6XRrmjbGR
Fq7di3HHHMMy5Yt45lnnuGll15i3bp1nHLKKezZsweAJ598kvvvv5/hw4fzwQcfMG7c\nODp27Mj27ds55phjuPPOOwF4//33mT17drn3LZ42bRqnnHIKnTt35o033uCf//wnb7zxBldffXVx\nnlDtevDgwQwaNIj33nuPfv36FS+//vrrad26NZMmTeKmm24CYOjQoYwdO5a//OUvvPPOOxx55JH8\n4Q9/KHPrxh9++IHbb7+dkSNH8t5779G+fczzO9Voyaj7j8fVTN8Skdu8tFHAUuDJUCYR6QD8BOSo\n6t8BVLVMLwYR2Qikq+rMsLSrgF7Ah8AKYB/chB5nA7eo6p4EvK5S/JqlAWj0M7JfPrqqO2ga69a5\niT18WneMMTGSOxJ/f1m9PfHT4d17772AC5CNGjUCoFevXuy///4899xzXHrppcyZM4f+/ftzxRVX\nFK83cODA4v+7dOkCQI8ePdh3333L3d+wYcPo168fEyaU9GVt1aoVAwYM4Pbbb+cAb7iHiHDzzTeX\n2ufChQsBdylv7Nixxelr1qxh3LhxjB49ujgwn3zyyeTl5ZGTk8OZZ5aMVi0oKGDmzJl07do1hqNU\ne1R5jVhVtwEnAD8AE4HngB+BE71lIRL2qHCzEc+/AvYFxgBTcBN4tABOV9X7KvUCAti8GT76qHRa\n//7wwY8fcNBjv0Kv6gkXnFJcbLtObIwJN3XqVH73u99Rv359CgsLKSwspFmzZnTv3p3PP/8cgF//\n+te88cZv0ylIAAAgAElEQVQbjB49mi+++GKvb4axadMmvvjiC84555zifRUWFtKnTx9Ulblz55bK\nH96xrLz0BQsWsGvXLgYNGlQq/Y9//CNffvklW7ZsKU7r3LlzygZhSNJc06q6QlXPUdVmqpqpqr9X\n1WUReZaqarqqjq5gW31V9fiItE9Utb+qtlXVBqraXlX7qerkRLyeSB98ALt3lzzv2hWatV3D+ZPO\nZ8su7+Q74ENo/zFgzdPGmNLWrVvHhAkTqFOnTvGjbt26fPLJJyxfvhyAa665hpycHF544QWOOuoo\n2rRpw6hRo2LeV0FBAarKZZddVmp/TZo0QVWL9xfSunVr3+1Epq9atco3PfR8Q9jYzmjbTBXVs1ta\nDRfZLH16f+Wad69h7ba1pRe0XAjLj7VAbIwppUWLFhx99NEMGzasTE03MzMTcGOSb7rpJm666SaW\nL1/OxIkTGTlyJJ06deKiiy4KvK/mzd0glrvuuouTTjqpzPJ27dqVeh6tJ3ZkelZWFuCaqDt27Fic\nvnr16lL7LW+bqcICcZwVFpadTavBkS8w6btJZTM3db80LRAbEx9Vcf22Kpx44olMmTKFww47LNAw\nnvbt2zNixAjGjx/Pt99+C0Bdb6zkjh07yl23efPm9OjRg0WLFjF06NDKF97TvXt36taty6uvvlp8\njRjglVde4fDDD6dx48Zx21dNZ4E4zubMgbVhFd8m+61k3NI/+2fOdK3xFoiNMeGGDh3Kyy+/zAkn\nnMCQIUPIyspi9erV5ObmcvLJJzNw4EAuu+wy2rZty1FHHUXTpk2ZMmUKK1as4MQTTwTgoIMOQlV5\n9NFH+f3vf0/jxo2jThYyZswYTjvtNAoLCzn77LNp1KgRS5Ys4d133+WBBx7Yq17M++67L0OGDGHk\nyJGoKt27d+ell14iNzeXN954o1LHp7axQBxnpZullUbnXcHqHVGmxLZAbExKi9Yk27p1az799FNG\njBjBddddx+bNm8nKyqJPnz4ceuihABxzzDE8++yzjBs3jl27dtGlSxcmTJjAySefDLhe03fddReP\nPfYYY8eOpUuXLsW15UgnnHAC06dPJycnhwsuuICioiI6duzIqaeeyj777LPXr+Pee++lQYMGPPzw\nw6xZs4YDDzyQV155hQEDBgQ5PClD9ranXW0kIlrZ49G9O3z5pfek53gYcFX0zOsOhEe+57jjYObM\n6NmMMY6I7HXvYGP2VnnnnbesUhe5k9JrurZatiwsCDdbAqfcUGr5EVlHlF4hcxmgViM2xpgUZoE4\njopv8iBFcNalUK9knFyjOo145ZxXaFSnUckKdbZDwwLy892El8YYY1KPBeI4Kg7ERz0MnUrP6PHP\nfv+kc/POdMiMuP1S0+Xs3AkFBRhjjElBFojjZOtWmDYN2GchnDSs1LJ+B/TjqiPcteIygdg6bBlj\nTEqzQBwnU6fCzt17YODFUKdk3F5mvUyeHvB0ca/C9k0jhgFk2lhiY4xJZRaI4+Sdd4BjxkC7T0ul\nP3zqw7RrWjIzjdWIjTHGhLNAHAdFRfDmJ19C39tLpZ910FlccPgFpdLaZ0bWiC0QG2NMKrNAHAef\nfbGLdb0vgvSSOz20bNCSx09/vMxAd7/OWmCB2BhjUpXNrBUHQ98dDW0WlEp7rP9jtG5c9o4i1jRt\nzN7r2LFjyt8gwFS98JtWJEKFgVhE6gKrgUtU9e2ElqYG+mzlZ8zUu0rdNblXw/MY1G2Qb/7w68UA\nNMmHtD3k59tvImMqkpeXl+wiGBN3FTZNq+ouYA9Q/i08UtD23ds5/7WLIa2wJPGXLJ79w8NR16mf\nUZ99G+1bkpBWBE3yrUZsjDEpKug14jcB/ypeCvvbtL+xeOP3pdIOXPgUB3VsUe56fs3T69bBzp3x\nLqExxpjqLmh76PvAQyLyGi4orwJKTcqoqtPiXLZqbcbSGdw/+/7SiV9cwQW/Pa3Cdds3bc/n+Z+X\nJHgdtvLzYf/941lKY4wx1V3QQPy69/ds7xGiuKujCqTHsVzV2pZdW7jkzUvQ8N8iGzrBlLH0H1Xx\n+uV12LJAbIwxqSVoIO6b0FLUMDm5OSzZuKR04lvP0m7fJnTvXvH6ZWfXsp7TxhiTqgIFYlX9qOJc\nqeOWY28hb2Mer3/nNRTMvg7ysun/JwgysqJsjdjGEhtjTKqKacyMiLQAjgZaAOuBT1R1fSIKVp21\natSKVwa9SqsTXmL9rx6GqXcC0L9/sPVtLLExxpiQwIFYRP4O3AjUpWTU7E4RuU9Vb0tE4aqzr74S\n1n90Lnw0GBAaNIATTgi2rk1zaYwxJiTQ8CUR+T/gVuB54ATgYNx14+eBW0XkrwkrYTVVfO9h7zfJ\nSSdBgwbB1m3TuA110uqUJDTYAHW3WCA2xpgUFHQc8Z+AB1X1SlX9SFUXen+vBB4Cro1lpyLSTkRe\nE5GNIrJJRF4XkfYVr1lmO8NEpEhEZkSkdxGRh0XkGxH5RUTyReQtETk81n1EUxKInaDN0gBpklZ2\nhq2myy0QG2NMCgoaiDsB70ZZ9q63PBARaQBMB7oCFwIXAF2Aad6yoNvpDIwAfvZZ3A/IBp4BzgCu\nAVoBs0WkR9B9RLNmDXxa+m6HnH56bNso2zy9nPx8UPXPb4wxpnYKeo24ADgU+NBn2SHe8qCuwgXu\nrqq6BEBEvgIWAVcDDwTczjhc0/hBlB3D/KKqPhqeICLTgTzgOuCSGM
pbxuTJpQNmz57Qtm1s2/Dr\nsLXzRygogJYtK1M6Y4wxNUnQGvEbwGgRuVBEMgBEJENEzgVGUTLhRxBnALNDQRhAVfOAWcCZQTYg\nIucBPYDhfsv9enKr6mbgByDGkFnW+efDrFkwfDgcdlhszdIhNpbYGGMMBA/Ew4H5wARgu4j8DGwH\n/g0swHXkCuoQ4Guf9G+AbhWtLCLNgLHAzaq6MehORaQ5rlb/bdB1oklPh2OOgTvvhC+/hJEjY9+G\n3ZfYGGMMBJ/Q4xcR6QOcDvSmZBzxR8D7qjFd2WwBbPBJXw80D7D+fcBCVZ0Ywz4BHvH+PhjjehVK\n34vJPW0ssTHGGAh+P+JrgKmq+h/gPxWskjAi0hvXuSumDlciMhwYDFymqj8lomyxsqZpY4wxEPx+\nxHfjarLxsAH/mm+0mnK4x4GngXwRyfSaqTOAdO953cgVRORPwD+AEao6oXJFjx//aS7VArExxqSY\noL2mvwM6AzMqyhjAN7jrxJG6UfH124NxvaSv8Vm2HrgeN64ZABG5EHgUGKOqdwcpXE5OTvH/2dnZ\nZGdnB1ktZpn1M2larymbd252CRk7odFaVq7cNyH7M8YYU3m5ubnk5ubGdZsS5PKuiPTHXVs9S1W/\nqtQORa4DxuCGL+V5aZ1wPZqHqmrU4UvedepID+Jq9n8GflTVfC/vQOAV4ClV9QvcftuP8XJ35Rw6\n7lC+WftNScKTczi85W9YsKDKimCMMaYSRARVDXC7n+iC1ohvARoD80QkD1gF4TfjRVX1+IDbGg8M\nAd4SkdAc1aOApcCToUwi0gH4CchR1b97OylTIxeRjUC6qs4MS+sDvIDr6T1RRH4btspOVZ0fsKwJ\n1SGzQ+lA3HQ5+fm/SV6BjDHGVLmggbiQOAz7AVDVbSJyAnA/MBE3WfOHwPWqui0sq4Q9KtxsxPO+\nuJtT9AT+F7FsKa6ZPen8Omyt+x527oR69ZJTJmOMMVUr6PCl7HjuVFVXAOdUkGcpZWfM8svX1yft\nDuCOvS5gFYl2X+L8fNh//yQUyBhjTJWrsNe0iNQVkTeiXJ81lWBjiY0xxgQdvnRSkLwmNnZfYmOM\nMUGD6yygVyILkopsmktjjDFBO2vdCLwpIluANynbaxpVLYpz2Wq9tk3aIggaOpRNVkH6LlauLDMv\niTHGmFoqaI34K+AA3JjdpcAuYHfYY1dCSlfL1cuoR+vGrUsSRKHJSqsRG2NMCglaIx5F2SFCJg46\nZHZg9ZbVJQmZy1m50rpMG2NMqgg6fCknweVIWe2btuezlZ+VJGQusxqxMcakkJh7QotIYxHpKCJ1\nElGgVOPXYSs/H6pwpk1jjDFJFDgQi0h/EZkLbMJNPXmYl/6UiJyXoPLVen5jiXfuhIKC5JTHGGNM\n1QoUiEXkLOAtYB1u3unwaSeXABfHv2ipwe5LbIwxqS1ojfh24FlV7QdE3h3pa+DQuJYqhUSb5tIC\nsTHGpIaggfhg4GXv/8irlxuAfeJWohRj01waY0xqCxqINwMtoyzrBKyNS2lSUKtGraibHjaBR/1N\nUG+zBWJjjEkRQQPxB8BwEWkWlqYiUg/4M/B+3EuWItIkrex14qbLLRAbY0yKCBqIRwBtgIXAU7jm\n6WHAfKAdkJOIwqUKv+ZpC8TGGJMaAgViVc0DegL/AU4GCoE+wGzgt6qan6gCpoKyd2GyGrExxqSK\noFNcoqorgMsTWJaU1aGpT414QXLKYowxpmrZPYarAb/7EhcUwM6dySmPMcaYqmOBuBqIdl/ifGvw\nN8aYWs8CcTVgY4mNMSZ1WSCuBspOc7kcpMgCsTHGpAALxNVAk3pNaFY/bIh2+m5otMYCsTHGpAAL\nxNWEjSU2xpjUFHj4koh0Bv4AdADqRyxWVbWhTZXQvml7vvz5y5KEpstZufKo5BXIGGNMlQgUiL3b\nIL6Cq0GvASIH1tht7CvJasTGGJOagjZNjwZygSxV3U9V9494dI5lpyLSTkReE5GNIrJJRF4XkfYV\nr1lmO8NEpEhEZvgsu0FE3haRfC/PyFi3X5X87ktsgdgYY2q/oIG4M3Cfqlb6Lksi0gCYDnQFLgQu\nALoA07xlQbfTGTcH9s9RslwBtALeoAbU2P3uS5yfD1rtS26MMaYygl4j/p743XP4KtytE7uq6hIA\nEfkKWARcDTwQcDvjgOeBg4D0yIWq2s3bdjpwTaVLnWB+TdM7d0JBAbSMdgNKY4wxNV7QGvFQ4Fav\nFlpZZwCzQ0EYim8qMQs4M8gGROQ8oAcwPA7lqRb8prkEm9TDGGNqu6A14hxcjfg7EVkErI9Yrqp6\nfMBtHQK86ZP+DTCoopW9eyKPBW5W1Y0iEnC31VvbJm0RBA21ojf+GdJ3snJlPbp3T27ZjDHGJE7Q\nGnEh7l7EHwNrvefhj6IY9tkC2OCTvh5oHmD9+4CFqjoxhn1We3XS67Bfk/1KJzZdYTViY4yp5QLV\niFU1O8HlCEREeuM6d/VIdlkSoX1me1b+EhZ5M5excuUBySuQMcaYhEvGzFob8K/5Rqsph3sceBrI\nF5FMr5k6A0j3nteNb1Grll/PaasRG2NM7RbLzFpZwI3A8biguR43DGmsqq6OYZ/f4K4TR+oGfFvB\nugfjekn79YJeD1wPPBRDWcrIyckp/j87O5vs7OzKbC4mNpbYGGOqt9zcXHJzc+O6TdEAA1VFpCsw\nE1eTnQWsBtoAx+Bqsb1VdVGgHYpcB4zBDV/K89I6AT8AQ1U16vAlEenjk/wgrmb/Z+BHVc2PWCcd\n2A3kqOqoCsqmQY5Hojz06UNcN/m6koTPr+LwZU+wYEHSimSMMaYcIoKqVqrXcNAa8T3AZuC3oeDp\nFaAj8F9v+dkBtzUeGAK8JSK3eWmjgKXAk2Hb7gD8hAugfwdQVb8ZtDYC6ao6MyL9CNx45dAY424i\n8nvv/3dVdUfA8lYZm+bSGGNST9BA3Bf4U3gQBlDVpSKSg5tcIxBV3SYiJwD3AxMBAT4ErlfVbWFZ\nJexR4WZ90v4MXBS2/BzvAbA/sCxomauKX9N0QQHs2AH1I2+zYYwxplYIGojrAr9EWfaLtzwwVV1B\nSVCMlmcpPjNm+eTrGyX9UuDSWMqVbH41YlBWrRL23z8pRTLGGJNgQXtNzwf+IiKl8oubTeNab7mp\npJYNW1I/I6zqW28L1N9kzdPGGFOLBa0RjwL+g5tZ62VgFa6z1jm4GzacnpjipRYRoX3T9ixaH9bv\nLXMZK1c2S16hjDHGJFSgGrGqTgb645qhRwCPAn8DtgD9VfW/CSthiinTPN3UxhIbY0xtFngcsReM\nJ4tIQ9wwpg0RnatMHPjd/MECsTHG1F6BA3GIF3wtACdIh6Y2u5YxxqSSqIFYREYCT6lqvvd/eVRV\nR8e3aKnJxhIbY0xqKa9GnANMBvK9/8ujgAXiOPBtmp6XnLIYY4xJvKiBWFXT/P43ieXXWSs/H1Sh\nltx62RhjTJhAAVZEOohIn
SjLMrzpKE0clJldq+kKdu4qpKAgOeUxxhiTWEFrukuIfg/g7t5yEweN\n6jaiRYMWJQnpe6DxartObIwxtVTQQFxeo2gdoCgOZTEeuy+xMcakjvJ6TTfD3Xc4pK2IdI7I1gC4\nGHdbRBMn7Zu2Z/7qsFlDM5excmWv5BXIGGNMwpTXa/o64HZcj2gFXouST7x8Jk5sdi1jjEkd5QXi\nN4E8XKB9Bvg78GNEnp3At6r6ZUJKl6JsLLExxqSO8oYvLQAWAIiIAv9RVeu7WwX87ktsgdgYY2qn\nQFNcquqERBfElPDtrPVdcspijDEmsQLPNS0ihwBXAAcC9SMWq6qeGM+CpTJrmjbGmNQRKBCLyG+B\nj3DXjLsAX+LuwNQBWAEsTlD5UlJWkyzSJI0i9UaFNVpLwabt7NjRgPqRP4GMMcbUaEHHEd8JTAIO\nwXXeulxVOwEnAem4jlwmTjLSMmjbpG3pxKYryM9PTnmMMcYkTtBAfDjwPG4YE7jgi6pOwwXhu+Jf\ntNTmd/MHC8TGGFP7BA3EdYGtqloErAeywpYtBA6Nd8FSnc2uZYwxqSFoIF6Mux4M7vrwZSKSJiJp\nwKXYzFpx16GpddgyxphUELTX9DtAH+A53PXid4HNQCHQGPhrQkqXwnzvS2yB2Bhjap2g44hzwv7/\nUER6Ab8HGgKTVfW/iSle6rJpLo0xJjUEHkccTlXnAfPiXBYTxncs8YLklMUYY0ziBLpGLCK9ROQP\nUZad440zDkxE2onIayKyUUQ2icjrItK+4jXLbGeYiBSJyAyfZSIiw0VkiYhsF5H5InJ2rPtIFr9p\nLlesVP/MxhhjaqygnbXuwo0h9nMwMQxfEpEGwHSgK3AhcAFukpBp3rKg2+kMjAB+jpLl78BI4CHg\nd8AnwKsi8rug+0imFg1a0DCjYUlC3W3kr9+AWiw2xphaJWgg7g7MjrLsM9w446CuAjoBZ6rqO6r6\nDjDAS7s6hu2Mw41t/j5ygYi0Am4E7lLV+1X1I1W9BvcD4O4Y9pE0IlKmw9auBssosNtuGGNMrRI0\nENcvJ2860CiGfZ4BzFbVJaEEVc0DZgFnBtmAiJwH9ACGR8nyO6AO8O+I9OeBw0SkYwzlTRqbc9oY\nY2q/oIH4O1yt1c8A3KQeQR0CfO2T/g3QraKVRaQZMBa4WVU3RsnWDdipqpH3T/4GN0VnhfupDqzn\ntDHG1H5Be00/DjwhIpuB8bgbPbTFNTNfDlwbwz5bABt80tfjbiRRkfuAhao6sYJ9+AXp9WHLqz2/\nDls33AB3x9C43qsXDBsGLWrEKzbGmNQTdBzxeBE5ELgeuCF8EXC/qj6ZiMJFEpHeuM5dPapif8nm\nN83lwg9hYQztDzNnQkEBPP10fMtmjDEmPgKPI1bVm0TkMdwdl/YB1gEfqupPMe5zA/4132g15XCP\nA08D+SKSiWtmzgDSvOfbVXWXt51mUfYBJTXjMnJycor/z87OJjs7u4IiJY7fNeK98fLL8MQTkLFX\no8aNMcaE5ObmkpubG9dtilbxeBgRmQrUUdU+EenTAVS1bznrFuFq4eKzWIHrVfUhEbkQ+BfQJfyH\ngohcggvknVV1qc/2taqPR3l+KPiBAx85sCRhU3u4f++C8RdfQM+ecSqYMcYYwI1wUVW/mBRY1DqS\niHQAVqnqbu//cqlq0AjxNjBGRDp5vaURkU7AscDQCtbN9kl7ENfp7M9AqHPWZGAPcD4wOizvBcDX\nfkG4Ooq8RpzeLJ//Tt9DRlrFVdtRo2Dq1JLns2ZZIDbGmOqovG/0PKAXbpxwHiX3Io4mPeA+xwND\ngLdE5DYvbRSwFCi+1uwF/5+AHFX9O4Cq+s2gtRFIV9WZoTRVXSsiY4HhIrIFmAsMxgXyMwKWM+ka\n1GlAy4YtWbdtHQCFWkiXHqvK3hDCx6mnlg3Ef/lLokpqjDFmb5UXiC+lpIZ5GRUH4kBUdZuInADc\nD0zENTN/iGtW3haWVcIeFW7WJ+1W4BfcnaHa4IZYnaOq71ei+FWuQ2aH4kAMsGzTskCB+NhjSz+f\nNSveJTPGGBMP5QXiTEpqudPwmqnjsVNVXQGcU0GepQSoZUe7puxd7L3Te9RYHTI7MHfV3OLn4+eO\nZ/aKaJOcldhTCOm9oXDVIbD4FFasEJYvh/Yxz+htjDEmkcoLxPfjprVcAywBjsY1U5sqFHmdeMKC\nCcFXPtH7O/UfMPNWZs2CwYPjVzZjjDGVV97MWhtxTbrgmoerT3fiFNIxMw6zcR7xBGDN08YYUx1F\nHb4kIm8DvYEFQB9ch6fNUbajqnpilGU1RnUbvgRuCNPhjx3OzsKdldvQvWvpcWBL5s6tOKsxxphg\n4jF8qbxA3Bq4HTgIOB74Ctf5yZeq9q5MQaqD6hiIAeatmsek7yaxbfe2ijN7Xv7mZVb+EjYx9cT/\nkpZ3Mps2QePGCSikMcakoISOI1bVn/HmkPYm0rhKVe0acRL0yOpBj6zYZvXcuGMjz8x/piQhax5F\nP53Mp5/CiTW+7cIYY2qPoHdf2h+Yn8iCmPjqmRUxe0eWa5O268TGGFO9BL3pQ42YicqUKFODbjMP\nsEBsjDHVTdQasYgUishR3v9F3vNojz1VV2QTxOGtD0fC50Jp+QPU/YXZs6GwMHnlMsYYU1p5NeJR\nuPsOh/6vfr2YTFSN6zbmwJYH8v2670sS2yxg87Lj+OYbOPzw5JXNGGNMifI6a90R9n9OlZTGxFWP\nNj0iAvE8WHYcs2ZZIDbGmOoiaGetMkSkhYgcISL14lkgEz/WYcsYY6q/QIFYRP4mIneFPe+DuyPT\nZ8AiEemSmOKZyujRJqLDVpZ12DLGmOomaI34AtwtCUPuwc24dRbwM6Xv+WuqiTI9p1t9A+k7ycuD\n/PykFMkYY0yEoIG4LbAIQERaAUcBt6nqO8DduKkwTTXTokGL0nNVp++Bfb8G4OOPk1QoY4wxpQQN\nxIVAXe//PsAOINTAuRZoEedymTgpUyu268TGGFOtBA3E3wAXiEhj4DLgo7B7E7fH3SrRVEM920R2\n2LLrxMYYU50EmlkLN474LeB8YDdwStiy03B3ZjLVULQZtubNg23boGHDJBTKGGNMsUA1YlWdAhwM\n/AE4RFU/Cls8A9d5y1RDZYYwtVkAUsiePTBnTnLKZIwxpkTgccSqukRVX1fVHyPSn1DV2fEvmomH\nrMZZ7Nto35KEOtuh5ULAmqeNMaY6CDqO+EwRuTTseUcR+UREfhGR17xrx6YaEhGb2MMYY6qxoDXi\nvwGtwp6PBdoBT+J6UefEt1gmnspM7OFdJ/74YygqSkKBjDHGFAsaiA8AvgQQkQa4Dlo3qOqNwK3A\nwMQUz8RDtBrxxo3w/fc+KxhjjKkyQQNxfWC79/8xuN7W//WeLwT2i3O5TByVrRHPJ3QzLWueNsaY\n5AoaiPOA47z/zwS+UNVN3vN9gU1+K5nqYf/m+9O0XtOShAYboVkeYIHYGGOSLWggfg
LIEZHPgWuB\np8OWHQ18G++CmfhJkzSfG0BYhy1jjKkOgo4jfhC4BPgEuExVx4ctbgI8G8tORaSd19t6o4hsEpHX\nRaR9gPU6iMibIpInIttEZK2I5IrIqT559xGRZ0RkjZd3toj0i6WctUm0OzEtXgxrbF40Y4xJmljG\nEf9bVf+iqhMj0q9W1eeCbsfr7DUd6ApciLuzUxdgmresPI1xc1uPAE7FTbe5GXhXRM4M20ddbx/9\ngJtwncmWAf/xbuGYcqJ12AK7AYQxxiRT0Cku4+kqoBPQVVWXAIjIV7i7O10NPBBtRVX9FrgyPE1E\n3gOWAJfipuEEbwYwIFtVZ3ppU0RkAXAv0CteL6amiDbVJbjm6bPOquICGWOMAWKoEYvIVSIyz2vm\nLYx8xLDPM4DZoSAMoKp5uLs5nRltpWhUtRDXWSx8ROxvge1hQTjkv8CRIpIV635quoNaHkT9jPol\nCU1WQ+NVgF0nNsaYZAo6s9ZFwMPAHNxQpmeB53HNwj/ibgoR1CHA1z7p3wDdApZHRCRdRFqLyEhc\nM/cjYVkKcTeniLTT+3toDOWtFTLSMji89eGlE73rxF98ATt2JKFQxhhjAteI/w+4C7jGez5OVS8G\nOuPGFxfEsM8WwAaf9PVA84DbuBcXaFcBQ4HzVXVa2PKFQFMROTBivWPCypByos2wtWuXC8bGGGOq\nXtBA3AV3l6Ui71EXQFU3AP8ArktI6aK7H/gN0B/4D/C8iJwWtvwF3I+DiSJyqNeD+lagt7c8JSd2\nLK/DljVPG2NMcgTtrLUdyFBVFZHVuJpw6I5LW4htZq0N+Nd8o9WUy1DVfCDfe/qeiEwH7gPe85Zv\nEpGBwARgASDAYuB2YDSuJu0rJyen+P/s7Gyys7ODFKlGiDaECSwQG2NMELm5ueTm5sZ1m6KqFWcS\nmQq8oaqPiMiLwGG43st7gEeBdFU9ItAO3bbqqGqfiPTpAKraN7aXACIyBrhOVev6LDvAK98PInIL\ncBvQSlW3++TVIMejptqxZweN72xMoYb1rbt7PexoTsuWbjyxSPLKZ4wxNY2IoKqV+uYM2jT9JBCa\nI/E23Hje/+FqxV2BG2PY59tALxHpFErw/j+WkuFHgYmI4Jqcf/Rbrqo/ekG4MXAFMNEvCKeC+hn1\n6dYqoj9cm/kArFsHixYloVDGGJPiAjVNq+rLYf8vFpFDcFNbNgQ+VtV1MexzPDAEeEtEbvPSRgFL\ncQEfcLNoAT8BOar6dy/tdlwT9ixgNdAGF1x/A5wbvhMRuRP4AliHu8Z9E67X9K0xlLXW6ZnVk6/W\nfHtYArQAACAASURBVFWSkDUX8lwjxKxZ0LVrkgpmjDEpKvA44nCqulVVP1TVt2MMwqjqNuAE4Adg\nIvAcrjZ7orcsRMIeIXNxw58eAqYA9wDbgONU9dWIXbXGdeqaAowEJnv5NsZS3trGrhMbY0z1ErVG\n7NVIA1PVZTHkXQGcU0GepUB6RNo7wDsB93F50PKkEus5bYwx1Ut5TdN5hG5aG0x6xVlMsnVv0710\nwj4Loc422N2Q77+HggLYZ5/klM0YY1JReYH4MmILxKYGaFqvKb9q8SsWr1/sEtKKoPWXsMJNv/3J\nJ9C/fxILaIwxKSZqIFbVf1VhOUwV6pnVsyQQg2ue9gLxrFkWiI0xpipF7azlzed8hohEnZdZRA4T\nkTMSUzSTKNGmugS7TmyMMVWtvF7TFwAv4mbOiuYX4EURObecPKaaKa/D1pw5bu5pY4wxVaO8QHwh\n8Kx3i0Jf3rKngYvjWyyTSGVqxK2/hnQXfXfsgHnzfFYyxhiTEOUF4p64+/dW5EPchBqmhmjVqBXt\nmrYrSUjfBa2+LX5qzdPGGFN1ygvETQh2E4YNXl5Tg9h1YmOMqR7KC8TrgI4BttHBy2tqkIpm2KrF\n974wxphqpbxA/D+CXfu9xMtrapDIDltpbUs6bP38MyxZUtUlMsaY1FReIH4AOFFE7hcRv9sL1hGR\nB3DzRt+fqAKaxOiRVbpGLFnzQYqKn1vztDHGVI2ogVhVP8Hd3vCvwAoReV5E/uE9ngdWAH8GblTV\n2VVTXBMv7Zu2Z58GJXNZFqZvhRYl90G0QGyMMVVDtIKLgSLSB7gFyAYaeMnbgVzgblWdmcDyVSkR\n0YqOR21y8nMn8+FPH5YkvPYifD0YgEaN4MADoX790o969cqm1a8PmZlwyilw0EGJK+/27fDOO1BU\nBAMGQMOGiduXMcYEISKoqlScM7oK70esqjOAGSKSBrT0kgtUtbAyOzbJ17NNz9KBOGtucSDeuhXm\nzo2yYhQicP75cMcd0Llz/Mq5ezc88wyMGgX5+S6tTRu47Ta44gqoW+bCSeV88w3cfTdMmwa//jXc\ncgv06RPffRhjTEjg+xGrapGqrvEeFoRrgcjrxE27Vm4mD1V4/nlXKx4yBFatqtTmKCqCF1+Ebt3g\nT38qCcIAq1e7fRx8sNtnYRzOyPnzYdAgOPRQt838fHjvPTj+eOjbF3JzK78PY4yJFDgQm9oncgiT\n7DePNlmVb5rfvRvGjYMDDoDhw2FDkNHoYVTh3XehZ0847zxYvDh63p9+ggsvdDXXt9/eu2FXc+a4\npu4ePeD11/3z5Oa6YJydDdOn2/AuY0z8VHiNOJWk2jXiIi0i8+5MtuwqmU78xz8vpej/2zv3+Kiq\nc+9/n1xIAgFCuCTcQTFQQMKlCireELVWULRq27cSL6fW81qt+vbU+rZH0aq1rfbU9q0eq7Z9DWIP\ntdWqKIpHwSqIIGC4g3K/BcIl4RJyf84fa89kMplJJskkQzLP9/NZn5m99tprr72yMr+91nrWs44M\noqzMubssL8f/PVQoL3dzt6+95oZ0Q5GRAffdBz/4gZt7boiPPoKf/AQ+DrMgLjXVDYGfPBn6/Dnn\nwM9/7gSzMRYvhkcegXffbTxtMOefD7NmwZQprjyGYcQn0ZgjRlUteMFVR3xx3h/PUx7CH/6x4R/N\nyqeqSjU/X3XIEFXXX6wfsrJUf/971fLy+tevWqV6xRXhr01MVL39dtXdu1X37lW94w7VpKTw6S+7\nTPWzz+rfp6ZGdeFC1SlTwl8LquPGqT77rOqllzac7rzzVN97z+VrGEb84elGy7SnpRl0pBCPQnzX\n23fVEeIHP3iwRfmVlzuxzcoKL15DhzrRrqpS3bxZ9ZvfbFjsvv1tly6YLVtUb7xRVST8tdddp7ph\ngxPKd99VnTy54XtNnKg6b15dYV282Al7Q9ede67L3wTZMOKLaAixDU0HEG9D0wB/XvVnbn3jVv/x\n9JzpvPHtN1qc74kT8Lvfwa9+BcXFodMMHQo7d4Y3tLrySnjsMcjNbfhea9bAv/+7myMORUIC5OTA\nxo3h85g8GR58EKZODT/U/MknznL7nXfC5zNpksvn8svdfQ3D6NhEY2jahDiAeBTigsICxv5hrP+4\nf9f+7P4/u6OW/5EjTox/+9vw87rBTJ4Mj
z/uPpvCJ5+4+eWmWDdPmeKWQV14YeRzvZ9+6pZozZ8f\nPs3AgXDjjZCX17prqw3DiC0mxFEmHoW4srqS9MfTqaiu8Mcd+LcD9O7SO6r32bcPHn0UnnsOqqpC\np8nNdYZWV1zRfAMoVXjvPSfIK1aET3f55U6AzzuvefcBWLbM9ZDfeqvhdGed5QT5W9+CXr0aTmsY\nRvsiGkJsg2dxTnJiMmf2ObNO3KrClq0nDkXfvvD007Bpk+spBgrtsGFuvfDKlfD1r7fMClkELrvM\nLUl65RXnHSyQadNcj/add1omwgBnnw3z5rl7TZ8ePt3y5XDXXa4OZsyAV1911uaGYRhgPeI6xGOP\nGOC2N27jhVUv+I8fv+Rx7p98f6vec906N6d72mlw7bWQnNw696mqckurNm92c85jxzZ+TXNZsQKe\nesoJbWlpw2l79IBvftP1lCdNsiVQhtFesaHpKBOvQvzM8mf4/tvf9x/fMOoG5l43N4Ylat8cO+bE\nePZs5yazsSY1bJh7GUlJgYqKhkNlZe13cC8Wl1/u1k2np7f6oxmGEUS7FWIRGYDbZnEqIMB/A/eo\n6q5GrhsE/A4YC/QBTgDrgF+q6vygtL2Ah4CvA9lAIfAW8LCqHgyTf1wK8dLdSznnj+f4j4dlDuOL\nu75o4AojUnbtgjlz4MUXG7babinJyW6o/bLLXBg3zqy2DaMtaJdCLCJpwGrcDk4/9aIfw+3sNEZV\nw9rWishI4F7czk+7gW7AbcA04BpVfT0g7VJgKPAAsBEYCTwCfKGq54bJPy6FuLSylK6Pd6VGa/cj\nLrm/hG4p3WJYqo6Fqhu6zs938+EHQ74KRo9eveDSS11v+dJLoV+/1r2fYcQr7VWI7waeBHJUdZsX\nNwT4AviRqj7VxPwSgW3ASlWd4cXl4MT3e6r6QkDa24FngBGqWq/LF69CDDDqmVGsL1rvP/7w5g+5\nYLBtOdQaVFQ4Y7H8fLetY0VF49e0lDPPdD3lyy93BmxJSeGD9aQNI3LaZBvEVmA6sNQnwgCqul1E\nFgNX44asI0ZVq0WkBKgJiE70PkuCkvuO7acmiHHZ4+oI8ap9q0yIW4lOndwmE1ddBYcPw+uvu80r\nkpPduXAh+Pzhw26p1oIF8EUjMwlr1rjw6183Xj4RSEysK87DhsHdd7ttLs2wzDCiSyyEeBTwjxDx\n64DrIslARAQnpr2A24Ec4G7feVXdICLvAg+IyFZc73gUbpj6bVXd1KIn6ICM7zueOWvm+I+bsoSp\nqqaKGq2hU2KUNwaOAzIz4ZZbmn/91Ve7z23bnCAvWADvvw8lwa+gTUDVWZsHrvf+7DO3y9XLL8Oz\nz8KgQc3P3zCMusRiaLoc+LWq/iQo/hHgx6ra6K+5iDwB/NA7LAVuVtW/BaVJA/4KXBkQPQ+4XlVD\nruKM56HphdsWMiV/iv84My2TSQMmUVZVVieUV5XXi6v2tqc+vcfpTBwwkYn9XRibPZaUpJRml0lV\n2XV0F5/u/pRP97iwct9KVJXxfcdzdv+z3b0GTGRw98GIddUAJ6DLlrldpRYscN9rahq/LlLS0+GX\nv3R7RNswthHvtNc54mgIcT+cJXQ2kAfMAK5V1be984IT3Vyc5fRG4CvAz4AVqjotTL5xK8RHTh4h\n81eZUc2zU2InxmaPZWL/iX7RHJY5LKxgHi0/yvI9y/2i++nuT9l/Yn9E9+rTpY//BWDigImc1e8s\nuqd2b/Q6VeXAiQPsKNnBjuIdbC/e7r6X7KDweCGZaZkM7j6YIRlDaj8zBtM3vS+JCYmN5n8qcOSI\n6yUvWOC2lzx2zPn39vV6g0Ok/wLnnw8vvOD8eBv1UYXt26GgwL28jB1rntU6Iu1ViAuB11T1fwfF\nPw1cp6pZzchzIZClqiO946uB14ApqrooIN1UYAFwtaq+GSIfnTVrlv/4oosu4qJINrbtIAz73TC2\nHNnSqvfITMv0i/KEvhPYe2yvX3g3FG1AiV57HNFrhF+ch/caTuHxQie0xTvYXuI+d5bs5GRVhE6w\nA0hOSGZg94EM7j6YwRmDGdLdCfSQjCGM6j0q6i5CgymvKmdnyU6GZAwhOTG63lBqauoK9Y4dcMcd\nbq/oYFJSnN/tH/7QzSXHMxUVsGoVLFni9rpevBgKC+umGTDALS0LDIMG2bx7e2LRokUsCnBo//DD\nD7dLIX4fSFbVC4LiFwKo6sXNyPMJ4G5fb1pE7scticpQ1WMB6boBxcD9qvqrEPnEbY8Y4OllT3Pn\n/DubfJ3g2mA0RbQ9IwhTT5tKXm4e14y4hi6dukQlX1Vl8a7F5Bfk89d1f6WkvIQ+Xfpwy9hb+O74\n7zIsc1hU7hOKmhr4wx/gvvvg+PH658ePhz/9qfGdsjoShw/Xiu6SJW4KoKys6fn06FErymPHuk+f\nZXt7paamrvMZXygvd17nSkvdJjC+76GOfXFlZdC5M3TrVhu6dg39vVs3N/qQGDBYVVXl8ggM5eWh\n4xISoHdv6NPHfXbv3vhLUnvtEd8NPIFbvrTdixsCbAbua8byJQE+Abqr6le8uJnA/wemqurCgLSX\nAe8AM1V1Toi84lqIAVbsXcHGgxtJTUolJSmF1KTUOiElsX5cUkISJ6tOsnLfyjrzuTtLdra4PGlJ\naUzoN6HOsLMg/qHrT/d8yop9KyitbMSnZIxI75TOdSOvI29MHhcOuZAEafqk6pbDW5i9ejazV89m\n65GtYdNdMvQSvjfhe8wYMaPVDOd27nRzw6F2nkpKgh//2G1JmZraKrePGdXVziHLsmW1vd3WdNCS\nlgZDhjR9Dl7EXdu5swuB38Mdp6U5kYxUIH3H5eXhvb+F29ilrejc2X2WlbXMPiI5uVaUQ3326QNX\nXdU+hbgz8DnOoccDXvTPgC5ArqqWeukGAVuBh1T1US9uFpAJLMZ5ysoGvgtMAb6tqq946dJxVtiJ\nwKPUzhE/CJQBo3z3CSpb3AtxNNl3bB/L9izzC/PyPcs5VnEsbHpB3HBygMHX6D6jGx16raqpYt2B\ndXXEeX3R+oh76F07dfXP/QbOB/fr2o+DpQfrzBv7hrYPnTzUpLoAGNhtIDPHzGRm7kxG9Gp4b8Ti\nsmJeWfcKLxa8yOJdi5t0n96de3Pz2Ju5bfxtnNHzjCaXM5DC44WsPbCWGq1hVO9R9OvaDxDmzHHL\nmQ4frn/NiBHwxz/CuSHd5pz6VFXBhg3OAcuKFW4zks8/b9x/eCjS0tzuW0ePOv/qlZXRL68Ra9qh\nEIPfxeVvgEupdXF5r6ruDEgzmFohfsSLm45bpjQa6I4T4wLgF6q6NOgefXGGWpcAfb207+FcXO4L\nUy4T4lakuqaajQc3+gVz/cH1bs6439lNMrCKhKPlR/ls72d+Yd5/Yj990/vWMbjyCW5GakaTLa6P\nVxxnR3Fdcd5RsoN1RetYe2Bto9ef3f9s8sbk8a3R36Jn556A25JywZYF5K/O5/WNr1Ne3fItmi4e\n
cjG3T7idGSNmNGjBXlldyaZDmygoLODzws8p2F9Awf4CDpw4UCddz7Se5GbnkpuVy9C0XN58IZf3\nXh4J1XV74CJw551uv2ffXHNDBmK+85WVtT2ucEOVoeJSU533sL59G/7s1g1AKSkvISkhidSEdNav\nrxXdFSuccVWke2cH07+/czV67rnuMze3dkOTigonxp9/7uaSV61y30MN9RvtiXYqxKcqJsRGNFh7\nYC35Bfm8tPol9h0P+c7nJzkhmStzrmRgt4HMXTe3nvAFk5KYwtUjriZvTB6TB03m1Q2v8tzK51i6\ne2mD1/Xq3Iubc2/mtgm30atzLwoKC/xiW1BYwLqidXX2pG4KiSQhh0ZStTsXCnNhv/dZGmiwppBY\nCUlltSGxvO5xUhkklUN1MlSl1obqlLrHValO+DWhbv4px6DLfkgvhHTvs8v+Ot+lqzunie4lR3af\ni67Kg3U3QFmPJj13QgKMGeME1ye+TTW8qqmBLVtqhdkXDjTcDNoF4RzTBA6NRzJ83qmTe+E6erQ2\nHDsW/vuxoEG3hASXZ0qKe2ELFXznKiqgqMjVf1ERnDgRyZOaEEcVE2IjmlTXVPP+tvfJL8jn1Q2v\nNss628fkQZPJG5PH9aOuJyM1o9751ftX8/yK55m9ejYl5S3w5hFNSjNBapy4JpWBRPl/q6pTrVB3\nOg7Jza9fqlJg01VQkAdfXg419adD+vRxRmmTJjnhnTjRGQpFG1VnbX2o6bMfVFe7edGmzvempIQW\nwYbEMjU1vNgmJsbOErymxo0yJCS4MrbE6K20tK4wh/qcP9+EOKqYEButxbHyY/x9w9/JL8hn4faF\njV8ADM0YSl5uHjPHzOT0zNMjuqa0spRX1r3CcyufY8muJS0pMgCpSamM7jOaRElkzYE1p6xRXFQ5\n0ZvOW/4X4xLyuPgr4/jqBGHCBDfsHIm4qCp7ju3xjzpsPrSZjNQM/7K903qcZs5nOhDt0mr6VMaE\n2GgLdhTvYM6aObxY8CKbD22uc657SnduGHUDebl5nDfwvBb9YK89sJbnVzxP/up8isuKG03fN72v\nf/43NyuX3OxccnrmkJTguhTVNdVsObKl3rD2rqMN7l56alKZFlEPelTvUeTl5vGdM79D/279650v\nrypnfdF6f1346uXwyRBWbB69Oveq9QrnObvpkda0YXHj1MGEOMqYEBttiaqyfO9y/rLmLxw6eYhp\nOdOYnjOdtOS0qN7nZOVJ/rb+bzy38jk+3vkxSQlJjOw9so7g5mblNtsJyeGThyOac05OSA67LM4X\nOiV2orK6kvLq+q5UA92shjJkS0lMITs9m6z0LLLTs8nuUvs9q0sWWV2y6UIW1SXZlBSls/vEVtYm\nvMRrW/MbdWTjWxt+w6gbKC4r9hu0bTy4kaqalq/VyemZU8cDXW52blSXoFXXVIevz+py/9+mzlJF\n7++UnJB8yvfgK6or2HJ4C5sObWLTwU3+v2dGaoY/dE/pXvc41R13Se7SouczIY4yJsRGR8cnjq29\nQUdldSWHTh6iU2In//rzaLoErdEaKqor/GKSlpRGt5RuzfpBVVWW7FpCfkE+c9fNPSXm2FMSUzg9\n83QSpWl1Vq3V9fzBl1eXt+hlQZCQAp3eKb2ewIUTO19I75TerLX04P5ORaVFbDq4iU2HNrHx4Ea/\n8G49stXv876pJEqiv5ydkzv7HRRFypo71pgQRxMTYsOIb8qqynhz05vkr85n/hfzm/3j3jm5M2f2\nOZPcrFxG9xntd+W6fO9yjlfE93qllMSUiJ0FpSSloKp8efhLNh3aFNEUS5vzECbE0cSE2DAMH/uP\n7+cva/9CfkF+g9uCDuw2sN7c+uk9Tg85AlBdU82GgxvqeKDzOUwx2ikPmRBHFRNiwzBCsWb/Gl5a\n/RJri9aS1SXLL7hjssaQmdayXcuOVxxnxd4VdbzQ7T66O0oldwQPLwf2OFMSU6iqqQo5d1xWVRaV\nOfC2oH/X/gzvNZwRPUeQ0zOHlKQUisuK/aGkvKTucZk7bsmyQsCEONqYEBuGcSqw79g+ikqLmnxd\nOMFNSkhqtkFSVU2Vf97ZJ84nK09yvOJ4eIErry94xWXFnKiMyENGWDondyanZw7Dew53oddwRvRy\nwpveKb1ZeZZXlVNSXkJJWUmzRDk3O9eEOJqYEBuGYbQewUZ2gZbw4XrkldWVDOo+iOG9hjOg24Bm\nG3u1FmY1HWVMiA3DMIymEA0hPrVeLQzDMAwjzjAhNgzDMIwYYkJsGIZhGDHEhNgwDMMwYogJsWEY\nhmHEEBNiwzAMw4ghJsSGYRiGEUNMiA3DMAwjhpgQG4ZhGEYMMSE2DMMwjBhiQmwYhmEYMcSE2DAM\nwzBiiAmxYRiGYcSQmAixiAwQkb+JSLGIlIjI30VkYATXDRKRf4jIdhEpFZEiEVkkIlcEpbtJRGrC\nhGoR6dN6T2cYhmEYkdPm2yCKSBqwGjgJ/NSLfgxIA8aoatidmUVkJHAvsAjYDXQDbgOmAdeo6ute\nup7A6cGXA/OAL1X1nDD52zaIhmEYRsS0y/2IReRu4EkgR1W3eXFDgC+AH6nqU03MLxHYBqxU1RkN\npDsf+BC4Q1WfDZPGhNgwDMOImPa6H/F0YKlPhAFUdTuwGLi6qZmpajVQAtQ0kvQmoBz4r6bewzAM\nwzBai1gI8ShgbYj4dcDISDIQR6KIZInIg0AO8PsG0qcC1wFvqmpxM8ocVyxatCjWRThlsLpwWD04\nrB5qsbqIHrEQ4kzgSIj4w0CPCPP4FVAJ7APuA76jqh80kP4aoCvwYhPKGbfYP1gtVhcOqweH1UMt\nVhfRo70uX/oN8FWckdY84CUR+XoD6W8CDgDz26BshmEYhhExSTG45xFC93zD9ZTroap7gb3e4dsi\nshBnAPZ2cFoRyQYuAX6rqo3NIxuGYRhGmxILq+n3gWRVvSAofiGAql7cjDyfAO5W1U4hzv0I+AUw\nTlVXN5KPmUwbhmEYTaKlVtOx6BG/ATwhIkM8a2nf8qXzcPO9TUJEBDgf2BImyUxgdWMiDC2vTMMw\nDMNoKrHoEXcGPsc59HjAi/4Z0AXIVdVSL90gYCvwkKo+6sXNwg1hLwYKgWzgu8AU4Nuq+krQvcYD\nnwH3qupvW/nRDMMwDKPJtHmPWFVLRWQKzuAqH+fx6r9xYlkakFQCgo+VwN3AN4HuODEuACar6tIQ\nt8sDKoCXo/0chmEYhhENYmI1raq7VfV6Vc1Q1e6q+g1V3RmUZoeqJqrqIwFxb6rqVFXNVtU0VR2q\nqjPCiDCqeo+qpqpqUbiyNNfvdUdCRC4M45f7cKzL1pqISH8R+X8iskRETnjPPChEugwRecHzbX5c\nRN4TkdGxKHNrEUldiMjgBvy3d4tV2aOFiFwnIq+JyE7Pl/1GEfm5iKQHpYuH9tBoXXT09gAgIpeJ\nyPsisk9EykRkl4jMFZGvBKVrUZuIxRzxKYPn93ohbph8p
hf9GPCBiDTo97oDosBduKF8H1UxKktb\nMQzn6GUF8E/gsjDp5gGDgO8DxcBPgIUikutZ8HcEIq0LcP8jbwbFHWulcrUlP8T5sL/f+xwLPAxc\nBJwbkC4e2kOkdQEdtz2Amwr9DHgaKML93f8v8ImIjFbV3V66lrUJVY3bgBvmrgSGBsQN8eLuiXX5\n2rAeLgSqgSmxLksM6+BfvDoYFBR/tRd/QUBcN+AQ8FSsy93GdTEY50r21liXsZWeu2eIuJleXVwU\nT+0hwrro0O2hgbrJ8Z77nmi1ifbq0CNaRNXvdTvHLMZDMx3Yq6r/9EWo6lFcDyDe2kiHRlUPhYhe\njvvf6O8dx0V7iLAu4hXflF2193kVLWwT8S7ELfZ73cGYIyJVInJQRObE21x5GBpqI4O8VQDxxuMi\nUunZVbze0eZHg7gIN22z3juO5/ZwEa4uNgTFd/j2ICIJIpIsImcAf8AZCs/1To+khW0irueIiY7f\n645ACc4z2YfAUWAcbq/oJSIyTlUPxrJwMSYTt81mML634h5AaYjzHZFy4FlgAW6+bASunSwWkbNU\ndXMsCxdtRKQ/bl70PVVd5UXHZXsIqouVXnQ8tYdPgQne9+3AVFU94B23uE3EuxAbgKp+jlvb7eMj\nEfkIWIYz4JoVk4IZpxSqWgjcERC1WETexb35/xTn071DICJdgNdxyx9vjXFxYkq4uoin9gDciJv3\nPQ34N+BdETlPg1b7NJd4H5pusd/rjorXA9gMnB3rssSYhtqI73zcos5q9GM6UDsRt23qPJzh5uVa\n1+o1rtpDI3VRj47YHgBUdZOqLlfVucBUIB1nUQ5RaBPxLsTrcHM+wYykdk7IiG8aaiM7ta4TGqOd\nIyJJwN+B8cAVqhr8OxA37SGCuohLVLUE+BK35A+i0CbiXYjfACZ5vq6BOn6vX49JiU4RROSrwHAg\npLOUOOINoL+InO+L8JwVTCfO2wj4XdFOpgO0ExERnBe+i4CrVXV5iGRx0R4irItQ13WY9hAOEcnC\nzYd/6UW1uE20ua/pUwmJ0O91R0dEZuM2zViFM9Yajxt2OQ5MUNUO62FLRL7hfZ0K3I6b8yoCilT1\nn94P0sfAANymJMW4Bf2jcW1kT9uXunWIoC6exK2fXIozRBmBayddgUmq+kXblzp6iMh/4p77UeCt\noNO7VXVPvLSHCOuiQ7cHABF5FedaeTXut3E4cA/QB5ioql9GpU3EenF0rINXea94lVeCG4oZFOty\ntXEd3I97ITmCs4TcAfwnkBXrsrXBs9fg1gMGhw8C0mQALwAHcS8nC4DRsS57W9cFcAvOevSQ1072\nArOBM2Jd9ig9/7Ywz18NPBhP7SGSuujo7cF7xh/h1k8f9v7WG4BngjWipW0irnvEhmEYhhFr4n2O\n2DAMwzBiigmxYRiGYcQQE2LDMAzDiCEmxIZhGIYRQ0yIDcMwDCOGmBAbhmEYRgwxITYMwzCMGGJC\nbBhRRkRmisiOgON1IvKvUb7HJBFZKiLHRaRaRMaESTdLRKoDjrt7cWOjWZ6mICK5XhkyQpyrEZEH\nY1Euw4gVJsSGEX3GA5+Bfwu54cCKKN/jT0AicCVwDm6nrFA87533kYHb1nJ8lMvTFMZ6ZcgMcW4S\nzkORYcQNth+xYUSfCcA73vfxOLeABdHKXEQSgBzgUVX9sKG06ratC9y6TqJVjqAyJatqZaTJgZAu\n/VR1WfRKZRjtA+sRG0YU8URyLLU94LOA9apaEeH1XUXk9yKyR0TKRGSjiNwTcP4moAonZg96Q7lb\nG8jvIRGp8b4PBrbiRPAF79pqEckLSH+tiHwiIidE5IiI/FVEBgbluU1EZovILSKyQUTKga97Mql7\nLAAABOZJREFU5x4WkRUiUiIiRSLyvohMDCr/n7zDLwPKMMg7X29oWkS+JiJLRKRURIpF5DURyQlK\ns0hEPhKRS7z7nxCRNSIyIyjdGd71+0XkpIjsEJG53t/NMGKCNT7DiAKeONXgRLIL8LZ3/CQwJlhw\nwuQhwNvATcATwDRgPvAfIvKol2webptOwQ3hTgKuaaBoSm3vcx9wrXftY9615+DtruPNY/8NWAt8\nA/gebgeZRd4QeyAXA/cCDwFfw+1OA9APeAq4ynuO/cCHIuLbr3UebkcfvHv4yrAvTJ18zbvmKHA9\n8K9emT4Skb5Bz3m6d+8nvTrZB/xVRE4LSPc20Be3s9BlwI9xGxbYb6ERO2K9u4UFCx0h4LaAGwP8\nGlgDnOkdlwA/8L6PAZIayGMabgekmUHxz+O26sz0jhO9dA9GUK5ZQHXA8WDv2luD0nXB7UD2fFD8\nYJxQ/SAgbhtuh5nejdw7wSvrRuA3AfE34YbrTwtxTZ3nws21bwISAuKGABXAkwFxC71ynhYQ1xv3\nYnS/d9zTy39arNuLBQuBwd4CDSMKqOpGVV0NDAQWqeoaoBRIB15R1dVeqGogm/NxAvWXoPiXgE7U\nNbqKNufg9pF9WUQSfQHYgxPSC4LSL1XVouBMRGSqiHwgIgdxIlgJnIEzWGsS3n7h44C5qlrji1fV\n7cBi4MKgS75Q1a0B6YqAA8Ag7/gQbmj+FyLyXREZ1tQyGUZrYEJsGC1ERBI84UrCDRt/4onYBTgh\nO+AdN0YmcDiEWBfihpNDWRlHiz7ePd7HiacvVOCGgnsGpa83lCwi43DD3EeBW4GJwFdxw9apzShT\nD69MoYatC6lfH4dDpCsPuvdUXC/758BmEdkS7aVlhtFUzGraMFrO+9T2zhS3OfpLAceVgIrIxar6\nzwbyOQxkikhSkBhnB5xvLQ55n3nA+hDnjwUdh7J6/gbuWa8N7MGKSA/gSDPKdMS7T3aIc9k0oz68\n3vTNXrnGAHcCz4jINlV9txllNIwWYz1iw2g538P1/J4EvvS+fxUoAn7qfT+LxtcSf4ibU70+KP5G\nXM/ukyiUtdz7TAuKX4IT2zNUdWWI8EUEeXfGDa37EZEpeEPDEZShDqpaiquz6z1DNl+eg4FzcfPC\nzcabSvihdzi6JXkZRkuwHrFhtBCfSHnLbt5S1VUiMhzoBfxJVQ9EmNV84GPgWRHpA6zDOey4Ffi5\nqkajR7wf1/v9loisAU4A21T1sIj8CPi9d+/5OEOz/rje/kJV/a9G8n4HuBt4UUT+jJsX/ndgd1C6\n9bgh5ztF5EVcL7ogzPz5Azir6bdE5BncPPZDuN7yfzTlwUXkTOC3wFzcC1MicIt3/w+akpdhRBPr\nERtGFBCRZGAKTsDALelZ2QQRRlUVtx73ReA+nABdAdyrqg8EJyeMU4xQWQfd419w86/vActw1tqo\n6nO4ZUc5QD5uvncWTrA+b+zeqroAZyF+LvAmbgh4Jk70Asuw2st3GvCRV4Z+ofL2houvBLrjBPQZ\n3AvK+apaGO45w5S1ENiBW3b1OvAyboj7SlVdFeJaw2gTxP1fGoZhGIYRC6xHbBiGYRgxxITYMAzD\nMGKICbFhGIZh
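{ "cell_type": "markdown", "metadata": {}, "source": [ "For reference, the next cell is a minimal, self-contained sketch of the two update rules behind the curves above, run on hypothetical toy labels and predictions rather than the LendingClub data. It computes a stump's weighted error, its ensemble coefficient `w_hat = 1/2 * ln((1 - weighted_error) / weighted_error)`, and the data-weight update, renormalizing the weights to sum to 1 (a common convention). The variable names are illustrative and need not match the ones used earlier in this notebook." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "from math import log, exp\n", "\n", "# Toy labels, stump predictions, and uniform starting weights (hypothetical values).\n", "y = [+1, +1, -1, -1, +1]\n", "y_hat = [+1, -1, -1, +1, +1]\n", "alpha = [0.2] * 5\n", "\n", "# Weighted error = total weight of the misclassified points.\n", "weighted_error = sum(a for a, t, p in zip(alpha, y, y_hat) if t != p)\n", "\n", "# Coefficient of this stump in the ensemble.\n", "stump_weight = 0.5 * log((1.0 - weighted_error) / weighted_error)\n", "\n", "# Shrink the weight of correctly classified points, grow the weight of\n", "# mistakes, then normalize so the weights sum to 1.\n", "alpha = [a * (exp(-stump_weight) if t == p else exp(stump_weight))\n", "         for a, t, p in zip(alpha, y, y_hat)]\n", "total = sum(alpha)\n", "alpha = [a / total for a in alpha]\n", "\n", "print 'weighted error :', weighted_error\n", "print 'stump weight   :', stump_weight\n", "print 'updated weights:', alpha" ] },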
{ "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 2", "language": "python", "name": "python2" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 2 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython2", "version": "2.7.11" } }, "nbformat": 4, "nbformat_minor": 0 }