{ "metadata": { "name": "" }, "nbformat": 3, "nbformat_minor": 0, "worksheets": [ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# CS 109 Final Project\n", "# By: William Chen, Sebastian Chiu, Salena Cui, and Carl Gao\n", "# Buffalo Capital Management\n", "# TF: Lehman Garrison" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Website: https://sites.google.com/site/predictingstockmovement" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "
" ] }, { "cell_type": "heading", "level": 1, "metadata": {}, "source": [ "Overview and Motivation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As long as capital markets have existed, investors and aspiring arbitrageurs alike have strived to gain edges in predicting stock prices. In particular, use of machine-learning techniques and quantitative analysis to make stock price predictions has become increasingly popular with time. Success, however, can be hard to come by; predicting financial price movements is an extremely difficult task. Nevertheless, the motivation for doing so is constantly present, with lucrative opportunities and insightful outcomes awaiting victors in an ongoing information battle.\n", "\n", "Our team wishes to use the Data Science framework we learned in CS 109 to take our chances at predicting stock price movements. Through the Boston Data Festival, we participated in a competition organized by Hack/Reduce and sponsored by DataRobot on November 10. The task of the competition was to take data from the past 9 days of a stock's pricing data and the opening price on the 10th day to predict whether the closing price on the 10th day is greater than or less than the opening price on the 10th day. After experimenting with various singular and stacked models, we eventually were able to devise a model that obtained an AUC of 0.941, which won us 1st place in the competition.\n", "\n", "Through this engaging and rewarding experience with predictive stock price modeling, we became very excited about the real-world applications of this kind of model and wanted to go more in depth into our predictive analysis after the competition. Our CS109 final project gave us a structured opportunity to extend our work. We aim to test a variety of models on S&P 500 stocks encompassing a different industries and quantify the success of our models. Such analysis may give Buffalo Capital Management an upper hand in any future investing endeavors!\n", "\n" ] }, { "cell_type": "heading", "level": 1, "metadata": {}, "source": [ "Related Work" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We were inspired by the successes we've seen in algorithmic trading and computer-assisted stock analysis. Several well-documented effects we've read about online are the momentum effect and regression to the mean. There has been a lot of research conducted about the significance of the momentum effect in stock price prediction. For example, Lee and Swaminathan (2002) studied the relationship between momentum and value trading strategies, while Grinblatt and Moskowitz (2004) examined the effect of consistent positive past returns on the link between past and expected returns. The wealth of research into price momentum made us interested in examining this effect further and seeing how it could be applied to stock price prediction models. " ] }, { "cell_type": "heading", "level": 1, "metadata": {}, "source": [ "Initial Questions" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There are a number of questions that we wished to initially address in our project.\n", "\n", "1. Given historical data and a stock's pricing data for 9 days and its opening price on the 10th day, how much success can we have in predicting whether the closing price on the 10th day is greater than or less than the opening price on the 10th day?\n", "2. Which classification models will perform the best at assisting us in making predictions used to answer Question 1?\n", "3. 
Do stacked/blended models (models that use some committee to combine the predictions made by two or more separate classification models) perform better or worse than individual classification models in making predictions used to answer Question 1?\n", "4. Among the S&P 500 stocks, are there any industries for which the predictions used to answer Question 1 tend to be more or less accurate than the predictions made for other industries?\n", "5. What are some of the differences between the behaviors of our classification models when making predictions used to answer Question 1?\n", "\n", "Question 4 was suggested in principle by our TF, Lehman. Question 5 arose while we were running our models.\n" ] }, { "cell_type": "heading", "level": 1, "metadata": {}, "source": [ "Data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The initial dataset we used was the one provided by Hack/Reduce at the Boston Data Festival Predictive Modeling Hackathon. The training data consisted of the opening, closing, maximum and minimum prices, along with trading volumes, for 94 stocks over 500 days. The hackathon dataset used stock data from 20 years ago, and we wanted to see how our models would perform on more recent data. We therefore wrote a function that reads in a file with stock tickers, one per row, and creates a dataset of stock price data. We assume that we have a list of the S&P 500 stocks, stored in sp500.csv, that gives each stock ticker along with the associated company name and industry area. We decide to analyze the S&P 500 stocks in 5 different industry areas.\n", "\n", "1. Consumer Discretionary\n", "2. Health Care\n", "3. Information Technology\n", "4. Financials\n", "5. Industrials\n", "\n", "Thus, including the historical data set provided to us and these 5 additional data sets, we have 6 total data sets that we choose to analyze.\n", "\n", "(The stock tickers for the historical data set are: AA ABT ABX ADI ADM AET AMD AMR APC AVP AXP BA BAC BAX BBY BK BMC BMY BNI BP CA CAT CI CL COP CVX DD DE DIS DOW EK EMC EMR FNM FRE FRX GE GLW GPS GSK HAL HD HON HPQ HRB IBM IGT JNJ JPM JWN KO KR LLY LOW LTD LUV MCD MDT MMM MO MOT MRK MRO MU MYL NKE NSM NWS OXY PEP PFE PG RSH SLB SLE SLM STJ SYK SYY TGT TJX TMX TXN UN UNH UTX VOD VZ WAG WFC WMB WMT XOM XRX. The tickers for the 5 industry data sets can be generated by running the code below, which uses the categories in sp500.csv to separate the tickers into their respective industry classifications.)\n", "\n", "What follows is the data scraper we used to generate the datasets we analyzed for this project. " ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Load in packages\n", "from pandas.io.data import DataReader\n", "from datetime import datetime\n", "import numpy as np\n", "import pandas as p\n", "from itertools import chain\n", "import matplotlib.pyplot as plt\n", "import operator\n", "import sys\n", "\n", "%matplotlib inline" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 49 }, { "cell_type": "markdown", "metadata": {}, "source": [ "As noted above, sp500.csv lists each stock ticker along with the associated company name and industry area. We first want to take sp500.csv and create 5 text files containing the stock tickers of the S&P 500 companies in the 5 aforementioned industry areas."
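] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Before splitting the list by industry, it is worth checking the file layout. The cell below is a minimal sketch (left commented out, like the other optional cells in this notebook); it simply assumes, as the splitting code that follows does, that the ticker symbol sits in the first column and the industry area in the third column of sp500.csv." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Peek at the first few rows to confirm the assumed column order:\n", "# ticker in column 0, company name in column 1, industry area in column 2.\n", "#print p.read_table('./sp500.csv', sep = ',').head()" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If the columns were ordered differently, the indices used in the next cell (i[0] for the ticker and i[2] for the industry area) would need to be adjusted accordingly."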
] }, { "cell_type": "code", "collapsed": false, "input": [ "sp500 = [list(i) for i in np.array(p.read_table('./sp500.csv', sep = \",\"))]\n", "cons_disc = '\\n'.join(i[0] for i in sp500 if i[2] == 'Consumer Discretionary')\n", "healthcare = '\\n'.join(i[0] for i in sp500 if i[2] == 'Health Care')\n", "info_tech = '\\n'.join(i[0] for i in sp500 if i[2] == 'Information Technology')\n", "financials = '\\n'.join(i[0] for i in sp500 if i[2] == 'Financials')\n", "industrials = '\\n'.join(i[0] for i in sp500 if i[2] == 'Industrials')\n", "file_names = {'cons_disc.txt':cons_disc, 'healthcare.txt':healthcare, 'info_tech.txt':info_tech, 'financials.txt':financials, 'industrials.txt':industrials}\n", "for file_name in file_names.keys():\n", " # Uncomment below lines to run code. As we include the data files in the submission, we do not wish to run the code at this time.\n", " #f = open(file_name, 'w')\n", " #f.write(file_names[file_name])\n", " #f.close()" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, we want a function that can take in a text file with stock tickers (one per row), and produce our desired dataset. To do this, we assume that we will be using 9 days of data to predict whether the closing price on the 10th day will be greater than or less than the opening price on the 10th day. Our dataset will be composed of approximately 2 years of stock price data. Since we can use a rolling time window to test our data, for any stock ticker, stock price data for $n$ days will produce $n-9$ rows of data.\n", "\n", "Our function is given below." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Function that reads in file with stock tickers, one per row,\n", "# to create a file of data with stock price data.\n", "# Data includes open, max, min, close and volume.\n", "def get_stock_prices(fname, start, end):\n", " \n", " # Function to check if date is valid\n", " def check_date(date):\n", " return len(date) == 3 and len(date[0]) == 4 and len(date[1]) == 2 and len(date[2]) == 2\n", " \n", " # Load in stock tickers\n", " tickers = [line.strip() for line in open(fname)]\n", " \n", " # Format start and end days and check validity of dates\n", " start = start.split('-')\n", " end = end.split('-')\n", " errors = not (check_date(start) and check_date(end))\n", " if errors:\n", " print 'Your dates need to be of the form \\'YYYY-MM-DD\\'!'\n", " return\n", " start = datetime(int(start[0]), int(start[1]), int(start[2]))\n", " end = datetime(int(end[0]), int(end[1]), int(end[2]))\n", " timelapse = end-start\n", " if timelapse.days < 14:\n", " print 'You need a date range that exceeds 10 business days!'\n", " return\n", " \n", " # Create data file and return results\n", " results = []\n", " for ticker in tickers:\n", " try: \n", " data = np.array(DataReader(ticker, \"yahoo\", start, end))[::-1]\n", " data = np.delete(data, np.s_[-1:], 1)\n", " result = []\n", " for i in range(10):\n", " result.append(data[i:(i+data.shape[0]-9),:])\n", " except:\n", " pass\n", " results.append(np.hstack(result))\n", " return np.vstack(results)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We now produce the datasets for each of our industry areas using the generated stock tickers in from July 1, 2012 to December 5, 2013." 
] }, { "cell_type": "code", "collapsed": false, "input": [ "data_file_names = {'cons_disc.csv':'cons_disc.txt', 'healthcare.csv':'healthcare.txt', 'info_tech.csv':'info_tech.txt', 'financials.csv':'financials.txt', 'industrials.csv':'industrials.txt'}\n", "for data_file_name in data_file_names:\n", " # Again, as we include data files in our submission, we do not run the below code, but you can uncomment it to run it it you want.\n", " #data = get_stock_prices(data_file_names[data_file_name], '2012-07-01', '2013-12-05')\n", " #np.savetxt(data_file_name, data, delimiter=\",\", header=','.join([str(i) for i in range(1,51)]))" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we can load in the data and process it into the format that we like. This block of code takes in a csv file, and processes it.\n", "\n", "In the case that we are loading in the provided Kaggle data, we break up the window of 500 days of data in a stack of rolling 10 days worth of data. In the case that we are loading in a dataset that we made ourselves, we simply take in the columns that we want, or the 10 opening days and 9 closing days worth of data. In both cases, we normalize each row of the dataset so that all stock prices are expressed as a ratio with the opening on day 1.\n", "\n", "We get a X matrix, where the matrix contains rows of stock open and close prices, and a y matrix, where the matrix contains indicators of whether or not the stock went up that day." ] }, { "cell_type": "code", "collapsed": false, "input": [ "################################################################################\n", "# CHOOSE WHICH TRAIN DATASET TO USE\n", "################################################################################\n", "\n", "train = np.array(p.read_table('./training.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./cons_disc.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./healthcare.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./info_tech.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./financials.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./industrials.csv', sep = \",\"))\n", "\n", "################################################################################\n", "# READ IN THE KAGGLE TRAIN DATA\n", "################################################################################\n", "\n", " \n", "\"\"\"\n", "Function\n", "--------\n", "normalize10day\n", "\n", "Replaces all min/max/vol data with 0, and divides all stock data\\\n", "by the opening price on the first day\n", "\n", "Inputs\n", "-------\n", "stocks : (Nexample, Nfeature) array\n", " data for stocks, with 10 days of OPEN, MIN, MAX, CLOSE, VOL respectively\n", "\"\"\" \n", "def normalize10day(stocks):\n", " def process_column(i):\n", " if operator.mod(i, 5) == 1:\n", " return stocks[:,i] * 0\n", " if operator.mod(i, 5) == 2:\n", " return stocks[:,i] * 0\n", " if operator.mod(i, 5) == 4:\n", " return stocks[:,i] * 0\n", " #return np.log(stocks[:,i] + 1)\n", " else:\n", " return stocks[:,i] / stocks[:,0]\n", " n = stocks.shape[0]\n", " stocks_dat = np.array([ process_column(i) for i in range(46)]).transpose()\n", " return stocks_dat\n", "\n", "if train.shape[0] == 94: # if we're using the kaggle training data\n", " n_windows = 490\n", " windows = range(n_windows)\n", " \n", " # we convert the 500 day data into a stack of 10 day data.\n", " X_windows = [train[:,range(1 + 5*w, 47 + 5*w)] for w in windows]\n", " 
    X_windows_normalized = [normalize10day(w) for w in X_windows]\n", "    \n", "    X = np.vstack(X_windows_normalized)\n", "    \n", "    # read in the response variable and convert it to indicators\n", "    y_stockdata = np.vstack([train[:, [46 + 5*w, 49 + 5*w]] for w in windows])\n", "    y = (y_stockdata[:,1] > y_stockdata[:,0]) + 0\n", "    \n", "################################################################################\n", "# READ IN THE MODERN TRAIN DATA\n", "################################################################################\n", "\n", "else:\n", "    # chain.from_iterable is basically a \"flatten\" function that takes a list of lists and\n", "    # converts it to one list\n", "    # columns we want are just the opening and closing prices\n", "    columns_we_want = list(chain.from_iterable([[5 * x, 5 * x + 3] for x in range(10)]))[:-1]\n", "    \n", "    # we get our matrix of open and close prices, and normalize the data such that all data\n", "    # is divided by the opening price on the first day\n", "    X = np.array([l/l[0] for l in train[:, columns_we_want]])\n", "    \n", "    # we make indicators of whether the day-10 close exceeded the day-10 open\n", "    y = (train[:, 48] > train[:, 45]) + 0" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 25 }, { "cell_type": "heading", "level": 1, "metadata": {}, "source": [ "Exploratory Analysis" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "With the datasets that we are given, we wish to explore several common statistical classifiers in order to investigate the predictive power they may yield on our data sets. The models explored were Lasso, Ridge Regression, Random Forest, and Gradient Boosted Trees. In addition, we were interested in the performance of stacked models, or models that operate by combining the predictions from two or more different models. As such, we additionally test two stacked models, one that combines Ridge Regression and Random Forest and another that combines all of the individual models. We combine models by taking the predictive output from each model on our training data and running that output through Logistic Regression in order to obtain our final output.\n", "\n", "The general philosophy we use in testing our models is to split our dataset into training and test data. For each classifier, we use cross validation techniques to select optimal tuning parameters when necessary and to evaluate the resulting predictive accuracy. In this way, we determined which classifier and associated parameters would be best to use. We will observe that our stacked models performed better than our individual models, but not by a significant amount."
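] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To make the stacking scheme concrete, the cell below gives a minimal sketch of it. It assumes scikit-learn is available; the particular base models, the 70/30 split, and the variable names are purely illustrative rather than our final pipeline, so the cell is left commented out." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# A minimal sketch of stacking two base models with a Logistic Regression combiner.\n", "# Uncomment to run; assumes scikit-learn and the X, y arrays built above.\n", "#from sklearn.cross_validation import train_test_split\n", "#from sklearn.linear_model import LogisticRegression, RidgeClassifier\n", "#from sklearn.ensemble import RandomForestClassifier\n", "#\n", "#X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)\n", "#base_models = [RidgeClassifier(), RandomForestClassifier(n_estimators=100)]\n", "#\n", "## level-one features: each base model's predictions on the training data\n", "#level_one_train = np.column_stack([m.fit(X_train, y_train).predict(X_train) for m in base_models])\n", "#combiner = LogisticRegression().fit(level_one_train, y_train)\n", "#\n", "## evaluate the committee on held-out data\n", "#level_one_test = np.column_stack([m.predict(X_test) for m in base_models])\n", "#print combiner.score(level_one_test, y_test)" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A more careful version would feed the combiner out-of-fold predictions (or predicted probabilities) rather than in-sample fits, so that the second stage does not simply reward whichever base model overfits the training data the most."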
] }, { "cell_type": "heading", "level": 2, "metadata": {}, "source": [ "Understanding market correction effects" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For the purposes of exploratory analysis, we focus first on understanding what could drive an increase in stock price. Although we have the previous 9 days of stock data for any stock that we are learning about, we suspect that not all of this information is necessarily useful for making a predictive model, because of our understanding of efficient markets and momentum effects. In particular, in running our models on different subsets of the data, we find that including maximum and minimum prices as well as volumes is not useful in predicting prices. As such, we will only utilize the opening and closing prices.\n", "\n", "To understand market correction effects, we also examine the relationship between interday change (the opening price on day 10 divided by the closing price on day 9) and intraday change (the closing price on day 10 divided by the opening price on day 10). There is a clear negative correlation between the two: when a stock opens well above the previous day's close, it tends to fall back over the course of the day, and vice versa.\n", "\n", "One explanation is that markets may overreact to events that happen outside trading hours and then correct themselves once trading resumes, which would produce exactly this kind of negative correlation. The strength of the correlation gives us a hint that stock market movement up or down is in fact predictable.\n", "\n", "The code below produces a scatterplot that demonstrates the negative correlation referred to above." ] }, { "cell_type": "code", "collapsed": false, "input": [ "################################################################################\n", "# Please uncomment the data set you want to use for this plot\n", "################################################################################\n", "\n", "train = np.array(p.read_table('./cons_disc.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./healthcare.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./info_tech.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./financials.csv', sep = \",\"))\n", "#train = np.array(p.read_table('./industrials.csv', sep = \",\"))\n", "\n", "# x-axis: interday change (day-10 open / day-9 close); y-axis: intraday change (day-10 close / day-10 open)\n", "plt.scatter(train[:, 45] / train[:, 43], train[:, 48] / train[:, 45])\n", "plt.xlim((.8,1.2)); plt.ylim((.8,1.2))\n", "plt.xlabel(\"Opening 10 / Closing 9\"); plt.ylabel(\"Closing 10 / Opening 10\")\n", "plt.title(\"Correlation between interday and intraday stock movement\")\n", "plt.show()" ], "language": "python", "metadata": {}, "outputs": [ { "metadata": {}, "output_type": "display_data", "png": 
"iVBORw0KGgoAAAANSUhEUgAAAZIAAAEZCAYAAAC99aPhAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsnXd4FMX/x993yV2up/dCEqogTYoECAm9Se+9SFOQZqUJ\nUX4oCqLIF0SpgoAIKEpXIIB0BOmCVCGAJEBCGkku9/79sXuXhCQkpFGc1/Pkye3O7Mxn5m4/7526\nCpKEQCAQCAQFRPmkDRAIBALBs40QEoFAIBAUCiEkAoFAICgUQkgEAoFAUCiEkAgEAoGgUAghEQgE\nAkGhEEKSiSVLliA0NLTA17dq1QrLli0rQosk+vfvj0mTJhV5uk8LxVVv+eVZqN9H/Tb/+ecfGI1G\nPMmZ/M9CHeZGYGAgtm/f/qTNeKZ56oRkxYoVqFmzJoxGI3x8fNCqVSvs3bv3SZuVjSlTpqBPnz5Z\nzm3atCnbuaJAoVBAoVAU6Npn4QZ/nHoLDw/HwoULizT/wtTv00BAQADi4+PzVYYrV65AqVTCYrEU\nqQ0lXYeFfejLzLP+/Rcn+b3fnioh+eyzzzBmzBhMnDgRt2/fxrVr1zB8+HD8/PPPj52W2WzOdi49\nPb0ozHwiiHWjEoW94XNzoP+1+n1UeQt6n/zX6vC/QL7vNz4lxMbG0mAwcM2aNbnGefDgAUeNGkUf\nHx/6+Phw9OjRTElJIUnu3LmTvr6+nD59Or28vNinTx9OmTKFnTp1Yu/evWkymbhw4ULGxsZy4MCB\n9Pb2pq+vLydOnMj09HSS5OLFi1m/fn1bfiNHjqS/vz9NJhNr1KjBPXv2kCQ3b95MtVpNlUpFg8HA\natWqkSTDwsK4YMECkqTFYuGHH37IUqVK0cPDg3379mVcXBxJ8vLly1QoFFy6dCkDAgLo5ubG//u/\n/8u13P379+ewYcPYtGlTGo1GhoWF8erVq7bws2fPskmTJnRxcWH58uW5evVqkuT8+fOpUqmoVqtp\nMBjYpk0bLl68mG3atLFdW6ZMGXbp0sV27Ofnx+PHjz8yXet38eabbzIgIICenp4cNmwYk5OTs3wX\nM2fOpIeHB729vbl48eJcy5e53hYvXsx69erxrbfeorOzM4OCgrh582aS5Pjx42lnZ0eNRkODwcA3\n3ngjTzv79evHYcOGsWXLltTr9dy+fTuPHj3K6tWr02g0slu3buzevTsnTpxIkrx79y5bt25Nd3d3\nOjs785VXXuH169dJkqtXr2aNGjWy2D5z5ky2a9cux3ItWrSIL7zwAo1GI4ODgzl//nxbWF51FBMT\nwzZt2tBkMrF27dqcOHFilt9mZqy/J+vvOCwsjJMmTWK9evVoNBrZrFkzxsTEkCT9/f2pUChoMBho\nNBq5f/9+Ll68mHXr1uWYMWPo6urKSZMm8eLFi2zYsCFdXV3p5ubGXr16MTY21pZnSdXh4sWLGRwc\nTKPRyKCgIH733Xc8e/YsHRwcaGdnR4PBQGdnZ5KSD+nTpw/d3d1ZqlQpTp06lRaLxZbW119/bfs+\nKlasyGPHjpEkAwMDuX37dpLkmTNnGBQUxFWrVuVoj0Kh4Ny5c1mmTBkajUZOmjSJFy5cYJ06dejo\n6Mhu3boxNTU1S55lypShi4sL27Ztyxs3bpAkhw0bxrfeeitL2m3btuVnn31GkoyKimLHjh3p7u7O\noKAgzp492xZv8uTJ7Ny5M3v37k2j0cjKlSvz/PnznDZtGj08PBgQEMBt27bZ4ufl8x73fsuJp0ZI\nNm/eTHt7e1sBc2LSpEkMCQlhdHQ0o6OjWbduXU6aNImkdGPa29vzvffeY2pqKpOTkzl58mSqVCqu\nX7+eJJmcnMz27dtz2LBhTEpK4u3bt1m7dm3bDf6wkCxfvpx3795leno6Z86cSS8vL5twTZkyhX36\n9MliX3h4OBcuXEiSXLhwIcuUKcPLly8zISGBHTt2tMW33vhDhgzhgwcPePz4cTo4OPDs2bM5lrtf\nv340Go3cs2cPU1JSOGrUKJudCQkJ9PPz45IlS5iens5jx47Rzc2NZ86cISmJkLWOSPLSpUt0cnIi\nKf1YS5UqRX9/f5LkxYsXbTdlXumOHj2a7dq147179xgfH882bdpw3LhxWb6LyZMn02w2c9OmTdTp\ndFkcUW71tnjxYqpUKi5YsIAWi4Xz5s2jj49PjnHzY2e/fv3o6OjIffv2kSTj4uIYEBDAzz//nGaz\nmWvWrKFKpbLV0Z07d7hu3TomJyczPj6eXbp0Yfv27UlK4uni4pLle6pWrRrXrVuXY7k2btzIS5cu\nkSR37dpFnU7Ho0eP5quOunXrxm7dujEpKYmnTp2ir68vQ0NDc8wnJyEpU6YM//77byYnJzM8PJzv\nvfceSfLKlStZ4lrr3N7ennPmzGF6ejqTk5N54cIF/vbbb0xNTWV0dDQbNGjA0aNHkyRTUlJKpA4T\nEhJoMpl4/vx5kuStW7d4+vRpkuSSJUuyCWufPn3Yvn17JiQk8MqVKyxXrpztt7J69Wr6+vryyJEj\nJMkLFy7YHsasQvLHH38wICCAGzduzLGeSUlI2rdvz/j4eJ4+fZpqtZoNGzbk5cuXGRcXx4oVK3Lp\n0qUkye3bt9PNzY3Hjh1jSkoK33jjDTZo0IAkuXv3btt9R0riq9VqefPmTaanp/Oll17ihx9+yLS0\nNF66dInBwcHcunUrSUlINBoNt23bRrPZzL59+7JUqVKcNm0azWYzv/nmGwYFBdnSzsvnPc79lhtP\njZAsX76cXl5ej4xTunRpm1qS5NatWxkYGEhSujHVarXN0ZNShYeFhdmOb926RQcHB9uTM0muWLGC\nDRs2JJldSB7G2dmZJ06csKXdu3fvLOGZK71Ro0acN2+eLezcuXNUqVRMT0+33fhRUVG28Nq1a+f6\nFNSvXz/26NHDdpyQkEA7Ozteu3aNq1atyuZghgwZwoiICNu11idFK/7+/jx69ChXrlzJIUOG8OWX\nX+Zff/3FRYsW2Z4MH5WuxWKhXq/nxYsXbWH79u2z/Xh37txJrVabxVl5eHjw4MGDOZbvYSEpU6aM\nLSwxMZEKhYL//vuvLa619ZKXndby9+vXzxa2a9euLDcKySwPJA9z7Ngxm7iS0pPkhAkTSJKnTp2i\ns7NzlifQR9G+fXt+8cUXJB9dR2azmSqViufOnbOFjR8/Pt8tkvDw8Cwt3Llz57JFixY5xiWlOg8I\nCHik7T/++COrV69OsuTqMCEhgU5OTly7di2TkpKyhD18r5rNZqrV6iwCNX/+fIaHh5MkmzVrluWp\nPjOBgYF8//336efnx127duVaB6QkJNaHEpKsUaMGP/nkE9vxm2++aRPcgQMH8t13381SHpVKxatX\nr9JisTAgIIC7d+8mKbVcGjduTJI8cOBAtu9j2rRpHDBgAEnJ9zRr1swW9vPPP9NgMNhaX/fv36dC\noWBcXFy+fN7j3G+5YV+8PWz5x9XVFTExMbBYLFAqcx66
uXHjBkqVKmU7DggIwI0bN2zH7u7uUKvV\nWa7x8/Ozfb569SrS0tLg7e1tO2exWBAQEJBjfjNmzMCiRYtw48YNKBQK3L9/HzExMfkqz82bN7PZ\najab8e+//9rOeXl52T7rdDokJibmmJZCochSDr1eDxcXF9y4cQNXr17FwYMH4ezsbAs3m83o27ev\n7dqHCQsLQ2RkJC5cuICwsDA4OTlh165d2L9/P8LCwgDgkenGxMQgKSkJNWrUsIWRzDL+4OrqmuV7\n1Ol0SEhIeESNZfBwvQBAQkICPDw8spUpP+X39fW1hd24cSPLMQCUKlXK1r+flJSEMWPGYOvWrbh3\n754tb5JQKBTo168fevbsialTp2LZsmXo1q0bVCpVjuXYvHkzIiIi8Pfff8NisSApKQlVqlTJs46i\no6NhNpvh7+9vC8vtN5obmetQq9XmWfeZ8wKAf//9F6NGjcLvv/+O+Ph4WCwWuLi4ACi5OtTr9fj+\n++8xY8YMvPrqq6hXrx5mzpyJ8uXLZ4sbExODtLS0bPdcVFQUAOD69esoXbp0jmUnifnz5yM8PBwN\nGjR4ZD0BgKenp+2zVqvNcqzRaHD79m0Akg+oWbNmlvK4uroiKioKAQEB6N69O1auXInQ0FCsWLHC\n9pu9evUqbty4keU3nZ6ensU2671gtcHNzc12X2i1WgBSnV+/fj1Pn/c491tuPDWD7SEhIXBwcMCP\nP/6YaxwfHx9cuXLFdvzPP//Ax8fHdvxwgR+ejeHv7w8HBwfcuXMH9+7dw7179xAXF4eTJ09my2vP\nnj349NNP8cMPPyA2Nhb37t2Do6Oj7WbJq3JzstXe3j7Lj+5xuHbtmu1zQkIC7t69C19fXwQEBCAs\nLMxWnnv37iE+Ph7/+9//crUzLCwMO3fuxJ49exAeHm4Tll27dtmE5FHpurq6QqvV4syZM7aw2NhY\n3L9/v0BlexweLk9e5X/4Gm9vb5tzsXL16lVbnJkzZ+L8+fM4dOgQ4uLisGvXLlBquQMA6tSpA7Va\njd27d2PlypW5zjZLSUlBp06d8M477+D27du4d+8eWrVqla8BaXd3d9jb2+Off/6xncv8uTDk9rt9\n+Pz48eNhZ2eHU6dOIS4uDsuWLbM9KJRUHQJAs2bNsG3bNty6dQsVKlTA4MGDc7TXzc0NKpUq2z1n\nfQDz9/fHhQsXci37/PnzcfXqVYwdOzZXW/JDZrse9gGJiYm4c+eOTYR79OiBNWvW4OrVqzh06BA6\ndeoEQPpNBwUFZflN379/Hxs2bMix7I/icXxeXuV5FE+NkDg6OuKDDz7A8OHDsX79eiQlJSEtLQ2b\nN2/Gu+++C0Cq+KlTpyImJgYxMTH44IMPHvkjfPim9fb2RrNmzTB27FjbU9bFixexe/fubNfGx8fD\n3t4ebm5uSE1NxQcffJDFUXp5eeHKlSu5OoYePXpg1qxZuHLlChISEjB+/Hh0794919ZWTvZmPr9p\n0ybs3bsXqampmDRpEkJCQuDr64vWrVvj/PnzWL58OdLS0pCWlobDhw/jr7/+AiA9PV26dClLelYh\nefDgAXx8fFC/fn1s2bIFd+/eRfXq1QEAr7zySq7pKpVKDB48GKNHj0Z0dDQAICoqCtu2bcu1bEWF\np6cnLl68aDt+lJ3WustM3bp1YW9vj9mzZyMtLQ3r1q3D4cOHbeEJCQnQarVwdHTE3bt3ERERkc2G\nPn36YMSIEVCr1ahbt26OdqampiI1NRVubm5QKpXYvHlzvuvHzs4OHTt2xJQpU5CcnIwzZ85g6dKl\nj+VAcvstubu7Q6lUZqnDnEhISIBer4fJZEJUVBQ+/fRTW1hISEiJ1OHt27exfv16JCYmQqVSQa/X\nw87ODoD0O7A+bQNSnXXt2hUTJkxAQkICrl69ilmzZqF3794AgEGDBmHGjBk4evQoSOLChQtZxNlo\nNGLLli3YvXs3xo0b98i6eZjMdZ1ZMHv06IHFixfj+PHjSElJwfjx41GnTh1ba6BatWpwc3PDoEGD\n0KJFC5hMJgBA7dq1YTQa8cknnyA5ORnp6ek4deoUjhw5ki2/vHgcn5cTD99vufHUCAkAjB07Fp99\n9hmmTp0KDw8PBAQEYO7cuejQoQMAYOLEiahZsyaqVKmCKlWqoGbNmpg4caLt+rxaJADw7bffIjU1\nFRUrVoSLiwu6dOmCW7duZYvfokULtGjRAuXKlUNgYCC0Wm2W5mCXLl0ASN0TmZuvVgYOHIg+ffqg\nQYMGCA4Ohk6nw5dffpmrrbmds57v1asXIiIi4OrqimPHjmH58uUApBtg27ZtWLVqFXx9feHt7Y1x\n48YhNTUVAPDqq6/izJkzcHZ2RseOHQEAZcuWhdFotM3DN5lMKF26NOrVq2ezwWAwPDLd6dOno0yZ\nMqhTpw4cHR3RtGlTnD9/Ps+y5EVO31nm41GjRmHNmjVwcXHB6NGj87Tz4fRUKhXWrVuHJUuWwNXV\nFatXr7Y9CQLA6NGjkZycDDc3N9StWxctW7bMZk+fPn1w+vRpm5PKCaPRiNmzZ6Nr165wcXHBypUr\n0a5du1zL9TBz5sxBQkICvLy8MHDgQAwcOPARtZbzbz/zZ+uxTqfDhAkTUK9ePbi4uODgwYM51vnk\nyZNx9OhRODo6ok2bNujUqZMtjlqtLpE6tFgsmDVrFnx9feHq6oo9e/Zg3rx5AIDGjRujUqVK8PLy\nsnXBfPnll9Dr9QgODkZoaCh69eqFAQMGAAA6d+6MCRMmoGfPnjCZTOjYsaOt282Ko6Mjfv31V2ze\nvBmTJ0/OVz0/qq4bN26MDz/8EJ06dYKPjw8uX76MVatWZbm2Z8+e2LFjB3r27Gk7p1QqsWHDBvz5\n558IDg6Gu7s7hgwZYnuQzeseefg4vz4vp2sfvt9yQ8HHkbfHZODAgdi4cSM8PDxybEp99913+OST\nT0ASRqMR8+bNs/Uhb9myBaNHj0Z6ejoGDRpka5UIBE+a5ORkeHp64tixY7n2uwsejajD54tibZEM\nGDAAW7ZsyTU8ODgYu3fvxokTJzBp0iQMGTIEgDSwNGLECGzZsgVnzpzBypUrcfbs2eI0VSDIN/Pm\nzUPt2rWFAywEog6fL4p11lZoaGiWwaaHCQkJsX1++eWXcf36dQDAoUOHUKZMGQQGBgIAunfvjvXr\n1+OFF14oTnMFgjwJDAyEQqHATz/99KRNeWYRdfj88dRM/124cCFatWoFQBq4zTwd0c/PDwcPHnxS\npgkENh71YCTIH6IOnz+eCiHZuXMnFi1aZNucUWygJhAIBM8OT1xITpw4gcGDB2PLli22BTi+vr5Z\n1k1cu3Yty4I8K9WqVcPx48dLzFaBQCB4HrCuHSsy8lz7XkguX77MF198Mcewq1evsnTp0ty/f3+W\n82lpaQwODubly5eZkpLCqlWr2vZOykwJmF8kTJ48+UmbkC+EnUXLs2Dns2AjKewsaoradxZri6RH\njx7YtWsXYmJ
i4O/vj4iICNsCoqFDh+KDDz7AvXv38NprrwGQ5vgfOnQI9vb2mDNnDpo3b4709HS8\n+uqrYqBdIBAInlKKVUhWrlz5yPAFCxZgwYIFOYa1bNkSLVu2LA6zBAKBQFCEPFUr259XwsPDn7QJ\n+ULYWbQ8C3Y+CzYCws6nnWJd2V7cKBQK8VY2gUAgeEyK2neKFolAIBAICoUQEoFAIBAUCiEkAoFA\nICgUQkgEAoFAUCiEkAgEAoGgUAghEQgEAkGhEEIiEAgEgkIhhEQgEAgEhUIIiUAgEAgKhRASgUAg\nEBQKISQCgUAgKBRCSAQCgUBQKISQCAQCgaBQCCERCAQCQaEQQiIQCASCQiGEpISIjo5Gw4ZtoNM5\nw9//BezYseNJmyQQCARFgnixVQlRu3Yj/PlnVaSlTQBwGDpdX5w8eRDBwcFP2jSBQPAf45l6sdXA\ngQPh6emJypUr5xj+119/ISQkBBqNBjNnzswSFhgYiCpVqqB69eqoXbt2cZpZ7Dx48AB//PE70tJm\nAHAD0BJKZXPs2bPnSZsmEAgEhaZYhWTAgAHYsmVLruGurq748ssv8dZbb2ULUygUiIyMxLFjx3Do\n0KHiNLPYUavVUKkcAFyWz1gAXICTk9MTtEogEAiKhmIVktDQUDg7O+ca7u7ujpo1a0KlUuUY/qx0\nW+WFUqnEzJmfQqdrBDu7d6DTNcWLLxrQunXrJ22aQCAQFBr7J21AbigUCjRp0gR2dnYYOnQoBg8e\n/KRNKhTDhw/Diy++gL1798Lbuzd69eoFe/untvoFAoEg3zy1nkxyuN6Ijo5G06ZNUaFCBYSGhmaL\nN2XKFNvn8PBwhIeHl5yRj0lYWBjCwsKetBkCgeA/RmRkJCIjI4st/adWSLy9vQFI3V8dOnTAoUOH\n8hQSgUAgEGTn4YfsiIiIIk3/qVhH8vBYSFJSEuLj4wEAiYmJ2LZtW64zvwQCgUDwZCnWdSQ9evTA\nrl27EBMTA09PT0RERCAtLQ0AMHToUNy6dQu1atXC/fv3oVQqYTQacebMGdy+fRsdO3YEAJjNZvTq\n1Qvjxo3LbvwztI5EIBAInhaK2neKBYkCgUDwH+OZWpAoEAgEgucfISQCgUAgKBRCSAQCgUBQKISQ\nCAQCgaBQCCERCAQCQaEQQiIQCASCQiGERCAQCASFQgiJQCAQCAqFEBKBQCAQFIqndtNGgaC4SElJ\nwdy583DhwlXUrVsTPXv2hEKheNJmCQTPLGKLFMF/CrPZjPr1m+PECR2Sk8Og13+HgQObYPbsT5+0\naQJBiSH22sqEEBLB47Jz5060bTsGCQl/ALADcBcqlT9iYm7CZDI9afMEghJB7LUlEBSCxMREKJWe\nkEQEAJygVDogOTn5SZolEDzTCCH5DxEbG4uEhIQnbcYTJSQkBPb2p6FQzANwFirVaFSsWAkeHh5P\n2jSB4JlFCMl/gISEBDRq1AYeHv5wdvbAgAGvwWKxFFn6aWlpuH//fpGlV5y4urpiz55tqFVrLby8\n2qFlyxj8+utPYrBdICgEQkj+A4wZMx779xuRlnYXZvMtrF59EnPmzCuStD/+eCb0eke4ufmgevX6\n+Pfff4sk3eKkYsWKOHjwN9y8eR7r16+Aq6vrkzZJIHimEUJSjKSlpT0VXUm//34IDx4MB6ACYEJS\n0kDs2nWw0On++uuv+PDD/yEt7RzS0u7j1Km66Nbt1UKnKxAIni2EkBQTn3zyGfR6Rzg7e6BmzTBE\nR0c/MVuCgwOgVO6WjwiF4jccPHgQ27dvL1S6Bw4cQHJydwD+AJQwm9/EkSMHCmuuQCB4xhBCUgxs\n27YNERFzkJZ2DmZzAk6cqIEePQY/MXvmzJkOV9f50GjCAVQF+Qeiot5CmzY9sG/fvgKn6+fnB632\nAIB0+cw+eHr6FoHFAoHgWaJYhWTgwIHw9PRE5cqVcwz/66+/EBISAo1Gg5kzZ2YJ27JlCypUqICy\nZcti+vTpxWlmkbNv334kJ/eE9Uk9Le0tHDq0/4nZExQUhPPn/4S7+78AegA4DmAwkpPfw/z53xY4\n3d69e+Oll9QwGGrBaOwIg2Eoli0rmrEXgUDw7FCsQjJgwABs2bIl13BXV1d8+eWXeOutt7KcT09P\nx4gRI7BlyxacOXMGK1euxNmzZ4vT1CLF19cHWu1BZDyp73/iT+pOTk5wdnYHUAWARj6bApWq4Lvk\nqFQqREZuxNq1H2PBgu44e/Yo6tatWxTmCgSCZ4hiFZLQ0FA4OzvnGu7u7o6aNWtCpVJlOX/o0CGU\nKVMGgYGBUKlU6N69O9avX1+cphYp/fr1Q9WqgMHwMozGztDrh2Hp0jnFmuetW7dw9uxZpKam5hpn\n4sSR0OmGAlgIYCb0+s/QsGEIJk58H9OnT0dMTMxj52tnZ4dmzZqha9eu8PPzK3gBBALBM8tTuWlj\nVFQU/P39bcd+fn44eLDws4xKCrVajd27N2Pbtm2Ii4tD/fqzspSnqHnzzfH43//mQaVyg8Fgwa5d\nm1GuXLls8bp06Qy9Xoevv14BjUaNkJCJGDJkLJKTh0ClOodZs2rh5MlDcHd3LzZbBQLB88dTKSSP\nszhsypQpts/h4eEIDw8veoMKgL29PVq1alXs+WzatAnz569DSspFpKS4IDHxS3Tu3B8nTuQ8iN6q\nVSubXWXL1kBS0lIALZCaCty50x9vv/0O+vfvh7p160KtVhe7/QKBoPiJjIxEZGRksaWfq5DEx8fj\n008/xdq1a3Ht2jWo1WqULl0ar732Gvr3719sBgGAr68vrl27Zju+du1art0mmYXkv8jJkyeRkvIK\nABcAANkH586Ny9e18fH3AQTIR1Ewm7di5UpfrFs3FqVKqbF37zaxkaFA8Bzw8EN2REREkaaf6xhJ\nr169EBQUhC1btmDKlCkYOXIkli1bhh07dmD8+PFFasTDu1DWrFkTf//9N65cuYLU1FR8//33aNu2\nbZHm+bxQtmxZODhsB5Aon/kFpUpl79bKiU6d2kKrHQPgAoBBAPoiNfUI4uP/wN9/l8PUqYWbLRcV\nFYWIiA/x9tvjcPjw4UKlJRAInmKYC5UrV85yXKNGDZJkeno6y5Url9tlWejevTu9vb2pUqno5+fH\nhQsX8quvvuJXX31Fkrx58yb9/PxoMpno5OREf39/xsfHkyQ3bdrEcuXKsXTp0pw2bVqO6T/C/Gea\n1NRUXr58mffv388zrsViYe/eg6nT+dLRsQ6dnX147NixfOWTkpLCYcNG09U1gPb2HgQiCVD++5av\nvNKjwGW4du0aXVx8aW8/nMBk6nQe3Lx5c4HTKwjp6em8ePEir1+/XqL5CgRPO0XtO3NNrU6dOty9\nezdJ8qeffmKzZs1sYfkVkuLmeRSSkydP0sMjkDqdHx0cjPzii//leY3FYuGpU6e4e/duxsbGFijf\nwYPfoINDbwJpBBKp0zXmRx99WqC0SPLtt8fRzm50JmH6iS++WLfA6T0u
d+/eZbVq9ajT+VCjcWWH\nDj2ZlpZWYvkLBE8zJSYkf/75J2vWrElHR0fWrVuXf/31F0ny9u3b/OKLL4rUiILyPAqJv38FAotk\n53uJOp0P//jjj2LP9/79+6xXrxk1Gleq1SZ27Ni7UI536NCRBD7NJCSHGRhYpcDpXb58mdu2beOl\nS5fyFb9794FUq4cRSCeQRJ2uEWfMmFXg/AWC54kSE5JngedNSJKTk6lUqghYbA5Yr+/LhQsXlkj+\nFouFhw4d4ty5c7lhw4ZHCklqaip37NjBDRs28O7du9nCt2/fTq3Wh8AOAiep09XnuHGTC2TX118v\npFbrRkfHhtRq3Th37td5XlOmTA0CBzIJ2dfs0qV/gfIXCJ43hJBk4nkTEovFQicnLwI7ZecXR72+\nLHfu3Fki+e/atYs6nQv1+s40GGoyJKQJU1JSssVLSkpizZphNBiq0WRqTFdXP547dy5bvFWrvmdg\nYBV6eZXlW29NoNlstoUdOnSIFSrUosnkxaZNO/D27ds52vTvv/9So3EmcF6ukwvUaJx548aNR5al\ndeuutLObJF9jpkbTkRERUx+zRgSC5xMhJJl43oSEJH/99Vfq9W50dGxCnc6fQ4eOosViKfZ8t27d\nSoVCR0BdzOoXAAAgAElEQVRHIJDAIep0jblgwYJscadP/4QaTXu524hUKD5n/fot853XjRs3aDR6\nEFhB4B8Cw+nhEcz09PRscQ8fPkyTqVqmlgVpMtXggQMHHpnHtWvX6OdXjiZTLRoMFVi7dkMmJSXl\n20aB4HmmqH3nU7kg8b9MkyZN8PffJ/Dnn3/Cx8cHVatWLfY8b926hY4de4HcDKABgNUAOiIpqQeu\nX4/KFv/8+at48CAc1tnjZENcufJ1vvPbu3cvUlNrQNpAEgBm4/ZtPaZNm46JE7OugSldujTS068B\n2A8gBMBBmM1XUKZMmUfm4efnh3PnjuHIkSNwcHBAzZo1YWdn98hrBAJBwchTSNauXZttpbmjoyMq\nV64s3nNdTHh7e8Pb27vE8jt16hTs7V+EJCIA0BXAu9BoVqBevSXZ4tevXxMrV/4PSUn9AJigVs/D\nyy/XyHd+JpMJZvNlSJta2gH4FwCxceOubELi7OyM1au/Rdeur0ChcITFEodVq5bm662GOp0ODRo0\nyDOeQCAoJHk1WVq1akVnZ2d27NiRHTt2pIuLC5s0acLSpUtz6dKlRdo8elzyYb4gH5w5c4ZarReB\nO7bZYoADP/hAWr+zadMmli1bg15eZfn662P54MEDDh06iiqVng4OLqxZM4x3797lmTNnOHjwCPbu\nPZjbt2/PNb+0tDQ6OfkQCCBQnoAngRZs375XrtckJiby/PnzTExMLPLyCwT/NYrad+aZWtOmTXnr\n1i3b8a1bt9i0aVPGxMSwYsWKRWrM4yKEpOh4883x1OlK0WjsRq3Wi59/PockeeTIEWq17gQ2EjhN\nrbYZhwwZSZKMjY3lrVu3aLFYeObMGRoM7lQoIgh8Qa3Wi+vXr8+SR2JiIocOHUqt1k0ei/mEwK8E\nQmhnZ+TZs2dLvNwCwX+REheSChUqZDm2WCy2c9WqVStSYx4XISRFy/79+7l8+XIeP37cdu7996dQ\noRiXabD7Ip2d/bJdO3ToSCoUU7IsQKxSJdQWHhUVRU/PQAImAqMJ9MoUN4aAip069c5zYkFqaiqv\nXbuW42wykvzll18YHt6WDRu246ZNmwpYEwLB801R+848x0gaNmyI1q1bo2vXriCJtWvXIjw8HImJ\niXBycirefjdBiVKnTh3UqVMnyzmjUQ+V6jIyXnNyAzqdIdu1yckpIDNv8OhoezfK0aNHUb9+IyQn\nOwLoBeBFAJlfePYAgB1++SUSv/zyS5Z91dLT0/Hhhx/jxx+3ws4uHefPn4fFYg87uzT88MNytGjR\nwhZ3w4YN6NZtKJKSPgVAHDgwED/+uATNmzcveKUIBIK8yUtp0tPT+cMPP3DUqFEcPXo0f/jhhxKZ\njpof8mG+4BEkJCTw3LlzTEhIyDVOdHQ0PT2DqFINIvB/1Ol8uGLFymzxIiMjqdV6ElhLYAd1ukqc\nNWs2SbJcuZcIDCRQi0AzAtEEShF4m8AyAi8RKEN7+1acOXMmSWka9MCBr7NKldrUaGoR+JmAE4Hf\n5FbM79Tr3RgTE2OzISSkKYFOBCYSOE5gMZs371zEtSYQPPsUte9UyIk+kygUimw7Bwvyx88//4Ie\nPfpDqXQCGYvVq5fl+v6U6OhofPXVfNy7dx/t2rVGWFhYjvE2btyI99+fgQcPUjBkSE+MHDkcCoUC\nOp0zkpO3AmgEwASgEoCyAFbIV5YFEA+dLhU///wNYmLuYMCAMUhOfgvAdQDfAVgM4B0AJ235mUy1\nsWzZRCiVSqSnp6NTp75IT+8r5/E1gD6oX/80fvzxO7i5uRW+0gSC54Qi9515Kc2aNWtYpkwZGo1G\nGgwGGgwGGo3GIlWzgpIP8wU5EB0dTZ3OJdMWIvuo17vmuNVJQbh79y5Pnz7NxMREHj9+nIGBFahQ\nDCSwkICewBQCUwnsJTCSSqWWKpWREyZEMD09nY6OAZlaHiQwksBYuUVyUT53jWq1M7VaZzo6tqCd\nnQuBzGM5EwloqdOVo4ODI2fPnmuzLyEhgbNnz+akSe8zMjKySMosEDxLFLXvzDO14OBgnjlzpkgz\nLSqEkBSMAwcO0GSq8dBq8ao8fPhwgdL77bffOGDAaxw58k1+9NF0ajSONBrLUaNxplrtTJ2uFwF/\nKpUmAgYCR235qlRD+OabbzI6OpokOWPG5wRcCRzLZF8EgVZUKFoSMNJobEmVypWAkUAjAt8Q6Ehg\nvhw/gYCzPNOMBC5Tp/PiqVOnmJiYyPLlX6JW254KxUTqdL5cuHBxEdauQPD0U9S+M8+urXr16mHv\n3r1F1wQqQkTXVsG4efMmgoMr4cGDwwBKA/gbGs3LuHLlLDw9PR8rrTVr1qJfv5FISnobCsVfIL8D\n8AeAcgC2AugDYAeA1gAcYGcXDcCI9PQpUCovw2RaiFOnDsPX1xcAEBr6Cn7/3RHARQCfA7gJoB/c\n3Fzh5+eFkJDqWL9+E27f1sNsTgbgBcAVwD5IK+1XAHgDUpdYgs1Ok6kjFizogaSkJAwfvgqJiZsA\nKACcgMnUDHFxtwpUlwLBs0hR+848Z23VrFkT3bp1Q/v27W3v8FYoFOjYsWORGSEoWby9vTFr1nSM\nHVsHanVlpKaexBdfzMiXiOzZswdt2vRGXNx92NsroVSakZrqDuBHkM0A1IYkIgDQHNJP7FUAbwMY\ngfT0JKjVVVG9+jJUr/4ixo/fZxMRANBq7QGUB1ARwOuQxMABMTHvISbmR/z55w8A4iGtiu8A4HtI\ngvA1FIp34eDQBQ8eKAGYAeyGtFr/Fszmgyhb9n3s2bMHZnOgfA0ABCE+/i6OHTuG6tWrF7xSc2H/\n/v1YunQVHBxUGD58CMqVy9/
bKwWCZ4k8hSQuLg5arRbbtm3Lcl4IybPNsGGD0bx5E1y4cAFly5ZF\nYGBgntdERUWhYcPWSE//FEAlmM09AdwDEA3AF8CnkH5SNwF4AzgISQiuQHL6AKBDampPNG9uwdSp\nH2LevK8REfEJ0tJSMHjwAHlvr92QRMgRwGUAdSEJS0UA7QGcBjATQAVkCEJtmExGJCbGQRqcJ4C2\nAPwAXMWAAYMRGBiISZM+QmrqbwD+BvARgC9A1kCDBs1x5swf8Pf3L2CNZufXX39F+/a9kZQ0FgpF\nPBYtqo/Dh3ejQoUKeV774MEDfPzxDBw7dhbVq7+A9957CxqNpshsEwiKlCLtKCthnnHznzm+/vpr\nAm0yDYBb36iYJE/rDadCoaNa7UJHx/rUaFzo6VlKHsuYJl93n3r9S1y1ahXXrFkrv+JXT8CbCoUj\nHRycCKySr/GQpwd/TcCNQH8CQ+R0VhCoROAGgQdUqbrSzS2ACkXdTGMr0QSMVCp9uW7dOlavXp9q\n9RB5kP9tOY/uBOKo0/XmN998U6j6SU1N5TfffMOJEydxw4YNrFWrMYHVNnsUiggOHjwiz3TS09MZ\nFtaSWm07Akup1bZnaGiLHHdHFggKQlH7zlxbJNOnT8e7776LN954I1uYQqHA7Nmzi1HeBE8jer0e\nwB1IT/tHAXwIqQViD6AvgKno378bJkyYgCNHjuDjj+fg7Nnj8PX1QWrqXCQlLUJKyk1UqlQNtWrV\nwsCBw2E22wE4BuAoyJ+RkvIzgN8BqAB8BaAGgJUA6gBYD2k85A6A7pAWNfoA0CItTY2YGAukFtI8\nAG1kO1OhVMZjypRPcPLkeZC7II2l1AWwHVL3mQkKxZ1CPfGnp6ejceO2+OOPNCQl1YNePxY6HWR7\nJUg3JCT8k2da586dw+HDp5GcfAGACsnJPXH0aBmcPXsWlSpVKrCNAkFxkauQVKxYEQBQo0b2XV0f\n3g04NwYOHIiNGzfCw8MDJ0+ezDHOyJEjsXnzZuh0OixZssTWTx0YGAiTyQQ7OzuoVCocOnQoX3kK\nio927drB0fEdxMX1AmABsBFAOCSHvRHAdezZcxKTJ0/Fzp17cetWI1gsZkRFXYWdXTK02gews+uI\nP//UoUqVl6HVaiB1fb0EqSvsZUhjH9sApAHYDGAogE4A/CGtgE8AEAzAGcANAEZI4zIxAK5BGhuZ\nDmC0bLU9zOYknDjxBwAtgFQAGjmfGAA/Qa1eAnf3q2jfvn2B6yYyMhLHjt1AUtJRAHZITByG5ORg\n6HSjkZQ0D0A8tNr/Q//+i/NMKy0tDUqlAzJuTzsoFBqYzeYC2ycQFCtF2r55iN27d/Po0aN88cUX\ncwzfuHEjW7aUXoh04MABvvzyy7awwMBA3rlz55HpF7P5ghy4d+8ea9SoTUBLwIVAdXmVuoGAI6XN\nGP3k7ioPAt8SuEBgEAH/TN1OnahQVCVwmdLbDysTWEBgJQFvAu6UdgX+INM1X8jTej8i0IMA5Hz9\nCfxPzuctucuqPIGacrp35e63IAJhBBYRaEMHBzeWK1edOp0H/f0rcfXqH0hK+8k97u4Na9eupcn0\nSiZbLVSpTIyImMrSpV/iCy/UsaWfF2lpaXzhhZpUqUYQ2E21+g1WqFCDqampj/19CQQ5UdS+M8/U\n/vrrLw4aNIhNmjRheHg4w8PD2bBhw3xncPny5VyFZOjQoVy1apXtuHz58radhgMDA7Nsf5Gj8UJI\nnhjr1q1juXI16O4eSLXahdJOvtKmjpIAKAi0y+RYzQQcCMTLx7UJbMoUvoLS9iZ1ZEEyyKKxMlOc\nNZS2nneR4/Qm4Esg85oYi3xdRTmNHgRi5bjessgYCWgImKhUehHYQmAHtVovdu3amw4ORqrVeg4d\nOirL64Efxc2bN7O89dHObjgrV66TRZAeR5xiYmLYs+cgVqwYwh49Xs3zXhAIHocSF5LKlStz7ty5\nPHDgAA8fPszDhw/zyJEj+c7gUULyyiuvcO/evbbjxo0b848//iBJBgUFsVq1aqxRowa//vrrnI0X\nQvJESUtLY9Om7WTRSMvkzPvJTrsSra/jBW4SsCdQlUBdubXyWaZrJhOoJgtEV0or2Y0EShM4SWC9\nLAzB8vn+smjsptQCSpXTuSfHu03gOqXFjQ0o7fV1gdJAvk5u8bSQbXUjsI9AW9rbVyEQReBf6nSh\n/OCDj/JdH4cOHWJgYCUqFHoqFHpqtY5cu3YdFyxYRKPRnXZ2ajZr1oGxsbHF+K0IBHlT1L4zz+m/\nKpUKr732WnF2reV4/vfff4ePjw+io6PRtGlTVKhQAaGhodniTZkyxfY5PDwc4eHhxWSp4GEWLlyI\nvXvvQ5piuxvSXloPAOyFNBYRC6CZfH4+AAcAYyCNVQwFMAHS3lkPAPwEIADSlN59ANwhjXMYAbwC\naRD9MwCDAcRBGizfCKAVpPGaBpCm+y6T83kAaVzlJQC/AfgVgBrSAsw1kN7KuFkuyQr5WgeYzZ9B\nGsAHkpLGYcOGWZg06b0s5f7mm4X48MNZMJvTMGxYf0ya9B4UCgWqV6+OxMQEkF8D6Ink5D/Qs2cz\nKBQqPHiwHUAwIiNHoG/f17B+/QoIBCVFZGQkIiMjiy+DvJRm8uTJnDNnDm/cuME7d+7Y/vJLXl1b\nK1dm7CSbuWsrM1OmTOGMGTOync+H+YJiZPTotyiNV/wqP9W/Inc16eXWQDKBmcwYO5mTqQXyE6Xx\nixlyN1cnuYVhbZ34EdiZKb6awH35c5TckmhJqftMQ6ALpSm9K+T/Iwhck/O2pzQWkyrn8bJ8vTXt\nP+WWS1UC79rOK5Ufs127nrx16xbPnj3LlJQUrlmzljpdEKUpxMeo01XnsGHD+d1333H37t3yDsgZ\nW884ODQi0DnTues0mTwfWa/x8fGcOnUaBw8ewZUrV9JisfDu3bts1qwDHRwMdHcvxbVr15bQt1xy\nXL58mTt27OD169eftCnPPUXtO/NMrVSpUgwMDMz2l18eJSSZB9v3799vG2xPTEzk/fv3SUob7NWt\nW5dbt27NbrwQkifKt99+S52uBoE4AlepUDSns7OnLBoWm/NUqcJkR91DdtqktN28G61jFdJ6EVIa\nGO9OaRzkRVmQIuU0R8uC4yL/ORN4n1JXWaNMeS6l1HXmkEnEHAjYEfCitPmjL4GeBFpTGk/pLIuD\nK6W1Mh0IaOjs7EV7ewP1+tL08gpmkybtKU0KsArDr1QoPGg0dqVG40qVSk/gNK3dbGq1Nx0cGmSy\nbQtLlaqUa50mJyfzhRdq0sGhO4FZ1Ole5Hvvvc+mTdvLa2DuEdhLnc6Dx44dK8Fvu3iZPXsutVpX\nOjqGUqt15XffZX9VgaDoKHEhKQzdu3ent7c3VSoV/fz8uHDhQn711Vf86quvbHGGDx/O0qVLs0qV\nKrbxkYsXL7Jq1aqsWrUqK1WqxGnTpuVsvBCSxyIuLo4jR77Nxo07cPz4yUxOTi5UehaL
ha++Opz2\n9iZZDPRy68IotxZ6UJopZaQ0s6qTLB6DZCEApdbMJPm6fym1UgYQWC6f08vX6OQ/vdxqSCMwWM53\nsiwOeynNAAtmxqD9IEpjJj9QGvCfJJ83Uno/SncCKlnQjLKQ9KQ0XqOVxceJ0oD+LBqNPlQqIzIJ\nyWICTeXPh6lWG6jVutNk6kCtNoDNmrVlYGAl6vVNqdEMpU7nluNDkZUff/yRBkNoJuG5RXt7B6pU\nOkqCLeWrVo/gZ599Vqjv72nhypUr1GpdKbUaSeAEtVonMZZUjJS4kCQkJPCDDz7goEGDSJLnz5/n\nL7/8UqRGFBQhJBKxsbE8ceIE7927l2uc1NRUBgSUpVJZiUBHajQt2bhxmwK/pMxisTAlJYVRUVHU\naBxlh35edgRXZKffhcASSjOtEuSwc5S6qarJjruqLDKVZKddVXaiaQRek528lkAVOf2LsgDMInBY\nFh5XOT8TpZZHqUyfHSlN+fWkNO3XLAuYkdIkAT2B1yl1hTnJfxGyDZUJtKI0ccCJQCABBRUKPYE3\nKG1bb5DTvkDAQjs7NTdv3syOHTtSpTLQZOpAnS6I9es34uzZs/PcSfu7776jwdAxk1Cl0s7OgS4u\nfszY9t9Cvb4Jly5dWqDv7mkjMjKSjo71MpWZNBrL89SpU0/atOeWEheSLl268OOPP2bFihVJSsJS\npUqVIjWioAghIX/++RfqdC40Gl+gVuvMVatW5xjv9ddHyg77SwLDCJShRuPOK1euPHaey5Z9R53O\niUqlPUuXrkSdriqlbihm+qtAYBelLqy2D4XpZVti5eNvKHVTlaPURUVK60fCCXwvC9HqTNf/QqC5\n7PA7U9pC/nVKU5DdKLVIasufK8j/G8lOf40sOtsptUZmZkr3UzldZ0pbr4TKIjaVwFfM6Iqz2jqe\n0psYP6HUHTaTOp2r/HRdm1L32gcE4qnXl8vXu09u3rxJk8mT0rYwx+ng0IeNGrXh99+vplbrSZVq\nFHW6pqxSJaTQLcqnhaioKOp0rnJdksBu6vWujI+Pf9KmPbeUuJC89NJLJMlq1arZzv1XhCQ+Pp5L\nly7lvHnzeOnSpWLNqyDExsY+9IKqP6nVuuQ4YUGrdabUGshYEKhSOT52uY4ePSoPKB8nYKZSOVx2\n0K6ycJDANvncS7KTd6XU7WShNOBuJDA8ky1xlFoeMyl1hb1PaaHjdgL/UGoNZF6Y+AmlVkJZAodk\nx24VDxdKg+tulFoQ42XhqSfnq5YFipQmB6zLlO4aOczalaal1BVmDd9EadpwHUrjNdMIvEdgOqUW\nkC+VSi2BP+T4twn4EFhDe/sKbNSoZbZ3vty5c4eff/45p02bxv3793Pjxo2cN28ea9VqRD+/iuzV\na7BtvPDIkSP89NNPuWjRoudGRKysXPk9tVonGgxlqde7csuWLU/apOeaEheSkJAQJiUl2YTkwoUL\nrFWrVpEaUVCKU0ju3bvHoKBK1OtbU6sdQIPBnQcOHCi2/ArCn3/+SZOpEjM/7Ts6vszff/89W1y1\n2kBpE8OMtR6BgeUeu2trzpw51GiGZkonhYCKSqXV8XrJIjKAwB7Z4VrHT1SyI7aTnby1RTJfvqYx\npa6rxrIgfCGHz5LT7k1ptpVeFoTq8v8QSi/CSiXwf7JgBFJaJ2K1M17O31527smUWhkvUlqncoLA\nC5RaSp/JgmdixkJLUhJKT0qtEkcCSvmzm5yng3xNEzn/6rKAOVBqPX1EBwcX7tmzhyR548YNGgwe\nVCg6ExhFQE+lsgLV6pdYtmy1R3ZVPo/Exsby9OnToiVSApS4kGzdupUNGjSgm5sbe/TowYCAAO7Y\nsaNIjSgoxSkkEREfUq3OPEV0OatXb1Bs+RWEO3fuUKNxkh0hCZynVuuS4/TJXr0GUaNpSekJfjHt\n7U08ceLEY+e5Zs0a6vUvM2MB4j46OXnT3l5LaQzjV0pjFNaFiJZMTr0hpXGHubJwuFEa+7A6Yi9K\nrYuy8rGOQC9KU3zdKLViPqLUEnCSxcU6QF5WFoXtlBYxOsiCZP3+7ssi4ij/lZaFySpyBvkaB0oD\n9KeZMaNsBYGtch5a+Zx1wP+YfNxMtslAaWuW6ZRaYwYCYzLZsZA1aoTz8OHDrFSpOqWFldawdZTG\nfyxUKPpy9Oh3iuJnIhBk44nM2oqOjuaGDRv4yy+/2F6J+jRQnEIyZMgbzNp/foK+vi8UW34FZfny\nFdRqXejoGEKNxoXz5y8gSV6/fp3z58/nwoULeffuXT548IBvvPE2g4KqsXbtxty5cydHjHiL9eq1\n4ogRb9q6T/LCbDazYcNXaDDUoF7fjxqNM19//XVKrYxEZmyRkiLXm5lSN5QdpfGZo7JTrk7gEqWV\n5k6y0HSiNB7iKse3ozQWoZZF4rzsuK2D81pK4xAnZMGaIDt7D2bM8hojO+i6cj5tKQ2sl5fTBaWW\nRRClAfMY2ZaXZZGybtViHdD/WY7/INNvoz+lrjUTpVaPDyVRdJQ/N6E0uyuSwI8EXGg0Vpbzn57l\nNyaV/xiBFvTxKfdYu0gIBPmlxIXEYrFwzZo1HD16NMeMGcN169YVqQGFoTiF5Mcff6ROV0Z2Lvep\n0XTigAGvF1t+hSEqKoq7du3itWvXSJKnTp2iyeRJrbYPdbqO9PQM4s2bN23xzWYza9RoQAeHPgTW\n08GhL196KTRf+0qZzWbGxcVx/fr1nDJlCg0Gd2q1nZmx/uIopTUaTSjN2OokO20DM7Y8sTr6IwT6\nyOfflf+s60O8ZBFpzoxWh0E+7yaL0k1K3V7BlLqqTsvO/GNK6y2Wyte4y6LjKAtGVwKvyuccZFus\nIldBFgqDLAomSlusvMKMRZU6AvuZIZRV5PhqOc21mYRBL/91l/N2levnczkfP0qtxGuUJgTUlMs3\nlcBH1OncsmwjJBAUBSUuJMOGDWPTpk25aNEiLly4kM2bN+drr71WpEYUlOIebJ8+fSa1Wkfa2zuw\nQ4deTExMLNb8ioomTdpTobCOL5D29mP42mujbeEnT56kXh/EjO6ndOr1wdm6um7cuMG+fYcyPLwt\np06dzhkzZlGt1tHeXsMaNcJYq1YjKhRfyU7QWXbOlSi1GHrJfxGUWgQusihbu3AcZEdtHTvxItCN\nUneU1YlbN1+8T6nF0UR21naywNyR06tIaaHj+7JzXi6Ljw+l7iknOR8Tsw7yL5Lz9ZTFwJfAd7I4\nrJXtqC3b6kOpZaVnRkulm1zeF2SRUsj/rQJDSjO/9JTGQObK9eCfKV1/OV0TM8Q28w4A89m8eWem\npKTw6tWrfPDgQbH+diwWC+fMmcdWrbpxyJA3eOPGjWLN71mnpL6XoqbEhaR8+fJZ3syWnp7O8uXL\nF6kRBaUkpv8WZEvxJ82LL9Zj1u1FlrBdu1628BMnTlCvD35ISErz+PHjtjixsbH08gqmvf3bBNbS\nwaEq7ey8KS0aS6e9/VhqNF6UWhXHZWduze8
L2THfotRVVIdSq8QavoLSk7gnpZlSFgIbZYc/mhmt\nhC7M6Pr5lVK3z0VKg+pD5fA42fkGU2rBuMnO/IR83V7ZSVvHRuZmsuOAfF2ILFABmcKOyGEDZFsX\nE5jCjMWMSkprXprIYuVBaQqySq4LpSxMrnJ9uFISP1dKY0haOZ6eUjdfVVlIqlMSM6sdP9DVNZhK\npZ52dkbqdE7cuHFjsf12xo59jzpdTQLLaW//Jj09g3j37t1iy+9Z5rfffqPR6E6dzod6vSs3bdr0\npE3KNyUuJK1bt+bly5dtx5cvX2br1q2L1IiCItaR5Mw770yiTteU0hP7der11fjNNwtt4RldW30J\n/EwHh36sXr0+zWYzU1NTOWjQCKpUBtnJTZcd/URKYwsfys5+Oe3sHGlnVy+TQ5xHaXbUEkqtCXsq\nFCqaTD7y8W3ZOXan9KrerDPOpKf0XbJzLic73LqycEymNAZijXtddrxW8ZhJ6Z0jM5h9TUt5Sq2Y\nspS6l85RWkXfiNLCSHdZIJzk81NlEWgunz+QKa3RlHYSttaPQhaWAErdYnpZOMwEfpOvvyoLS6qc\nph2llkl9WVj0ct5rKK3AL0VpCvVvlFpMJrmcHgSCqFYb+fLL4dTp3GkyeXDQoBFF8kRssVjkFfS3\nbOXV69tzyZIlhU77eSM2NpYGgzuBHbQ+sOj1bs/Mdv8lLiShoaHUaDRs0KABw8LCqNVq2aBBA77y\nyits06ZNkRrzuAghyZnU1FT26zeUKpWWDg4Gvvfe+9laVffv3+fw4WMZEtKCLVu25bhx4/jtt9/y\nnXcmUqdrRGn84bzsHL+THbWB0njBaAIBNJncqFQaKa2v+E12yPYEXKhSOdHJKYBKpZaOjv5UqYyy\nQ6wgx2kgO88b8o0YLTvXlyjN8tJSGk/xZEa3Ujgztg75SXa41mnHFzKd11FquVjHKUyUWhwaOT2t\nXJZhlCYFdKU06yqCGYPkVmcazIz9wUhpNfsoOY1tlGZufUZJzDYzo2vL+teK0ir40pQE0k+2Y2+m\ncrvKYjRaLt+3snA4URJB68SFiczYK+xz+diNDg6hfO21MVm+3/Xr19PHpxwNBnd26tSH8fHx/Oef\nf7hv375cnV16ejrt7R0ojS9J9ut03blw4cIc4/+XOXr0KE2mylm+a0fHWty3b9+TNi1flLiQ7Ny5\nk3ksR0gAACAASURBVDt37mRkZGS2z/lZqVucCCF5NPnplnvzzfHU61+gQjGBen1dGgz+lN7xYb1B\nviHwEpVK61sLu8qOfRQVCgOlp/xBsnN9jYCS9eqF0cnJRz53iNJYhI5KpYPsMO2ZMWbhRqA9paft\nlyhNyf2M0rjDYEpP9kcoTQF2p9SC6C5ft4vSVGAXSl1KHrKTfkEWCusguJbSmxMPMOOlWZcylfE9\nSq2hEZS61TI7iGmyM99KabNGV2a0JAZRmol1Xc43VLbjsnxtouz4HeQyz6TURWd4SGzaUuo2Cybw\njpyXs5xW5qnD85gxeO9L4HdKCy7709OzDElp08fOnXtRoTBSelqOooNDT1asWJNqtTM1mip0cHDi\nhg0bcvw99OkzhFptE0oPBp9Qo3Fky5ZdOHbsu4+16/fzzr///itPvbc+sFylRuNim/DytFPiQnLx\n4kX+/PPP/OWXX3jx4sUizbywCCEpHDExMVSrjZTGMUjgAZVKd9lhZe7K0VKpNDHjKf0apaf+rnJc\no+yMPyZgZI0a1qmzlWRHOkD+bKA0jnCf0hO5QXawL1AaR/Gh9EbEdpQGoTNvi7JFdtQGSi+9st7A\nn8t59WbGHltaOZ8IZryKN4QZrZeqlHb9/Vt2mEZK3VVvynGcKa1HIaXpvjpKrQBn+Xq97PTLM2M8\nZwClbj/rQLy1NWWS87ZurxLIjCnCv8o2eFKacbaVGetaApghjD9TEmw7SkI4TS6fjpJYt2XZsjVI\nkl269KNS6UtpUsF9St2bu+TrrtG6sFKp1HH79u1cunQpjx49avtNpKamcty4yaxWLYwBARWp1dYi\nsIhq9RAGBVViXFwc09LSSEpvyQwIqEQXF38OHDg8x+61tLQ0Llu2jJ988oltIebzwv/+N59arTsd\nHVtQq/XgzJmzn7RJ+abEhCQuLo5dunRhUFAQO3TowA4dOjAoKIidO3dmXFxckRpRUISQFI5Lly5R\np/Nl5i3ftdpqsoMaTOnJvxS12tLU6aoz61N0KUrrHUZQmi1lobRGorvsDP+PGU/lL8vOUUlpmq41\njQHMWB3uTWmKbUVKYxfulGZwJVMaW+hIaapwAIEylLY9WSHnNVh20FNkYfGXy6Ch1CqxrvlYKTvU\nqbKjDWDGQsd/KImdLzN2GdbIwuEqi4GTHGakNJYxSj4uLYvMjwQ2yMKwWg63vtrXm1LLw4VSt551\nhbyDXO7Gst3WnZI9mNHy0VEaKwqitGDR2tU1lIA97e2N3Lx5My0WC+3srDPhymSytaycZ+bvz0SV\nyoUGQ0/qdN787LOsTjAlJUXu5rpL68JSlaoulUp72tmp2bRpW2o0HpQE9yK12tYcOHB4ljTMZjND\nQ1tQrw+lSjWGOp0v586dbwu3WCy8d+9elsk8xUFqair379/P33//vci3ljl//jyXLFnCkSNH8r33\nJth2MH/aKTEh6du3LydPnpxtxlZExP+zd93hUVRf+2zfnW3phQQSEgiBBELvmEiRXqUoIEiNImAB\npUgTkWJBbIgUFZCioKKCIEUQCyAo8ANFaUGQKkIIoSWbfb8/zpnMRhJqKPrlPE8eyO7MnTt3Jue9\np73nOTz00EOFOokblZtdjHnz5iM0NAZ2eyA6derxr0nvLSzxeDwoWTIRBsNYcEzkPbhcoRJoHwrO\ncPoBZrMbdnsQeAftu0vPAKf8vgl2wSSCXTMu5HUdjROF6MtDlQMGmOKiUO2i8LrL/yNlPDWoHS/g\n4RKwaAAtY2oyONuphIzpENDoCmb+7Q22AALl+GAwfcsuUexLwbT3Y8CAmC7jTQdnk0WJEh8g9xcC\nTihwQaNICQTHYdrK9f3AFlqU/K5afQflvE1yvBtsHbnAlpLKkjwaWm/6KdBSihtDo44ZJ2tjgcFg\nRUJCTeh0NlmryuCMtiwwzb0LmsttlTyLoVDdMhaLK0921vnz52EwWJAX+OuDaBaIMmA01ofGWQYQ\n7UNAQPE879cXX3wBh6MKtFTo3TCb7cjJycGOHTtQrFgpmEwOKIo/XnvtdSQnN0epUlWQmvo4zp8/\nXyjveEZGBpKSasPhSIDTWQmxsRVw4sSJQhkbAA4ePAh//2IwGvuBaCQUJRirVq0qtPFvldw2IImN\njS3wpCt9dzvlZhbj22+/haKEgwvLjsBqbYcuXfoU4uz+HXLw4EHUqtUIDkcQoqMT4XAEwGxWA9Ll\nQOSEyeTCxIkT4XYzyDCNegwYREZDc/uk+yic8dAskiRovFhusFumjii39dDo21PBNSb7fBS+HWwN\nBI
qCLA/e3d8vCt0B3q0Hg3mxDoHdaQPl+EfBANMHTDsfAQapEJmPXUAiDEx/ryrGsaJoL4Ddb/PB\nVhFEmTvAzMRNwR0ZveDAeTQYPOyyfsXkntX1ygAD5jYZV63KbwN279UWxf8EtD4pKgUOwAzFFlnT\nAHCcpDPYKnofbF3ZwQWW6jnfQ3O5xcl5DcGuPD7G4YjFb7/9lufdaN68A6zWdvKMJsqaqYC4Up6D\nSpXzNYoXL5fn/NmzZ8PheNBnHjkwGMzIzMxEWFgMOKUa4DiaHeyK3AirtR1atXqgUN7vp54aKoW3\nf4HoLEymJ9C5c+9CGRsAnnzyGRgMg3zucTGSkuoV2viFJenp6Vi2bBlWrVqFS5cu4bYBSalSpQo8\n6b8AJM8+OxJ5mV3T4O8fWYizyysnTpzIQy/j9Xpx+PDhuyZHPysrC253KJhA8U9wMaECthgqQ1FC\nsGvXLpw6dQqzZ8+FzRYGojrQ6UpBp1ML67TdKYNENBhgOoMtkJ9FMaoup7WihP3AbqtXRcm+LNcu\nAd59J4BdZhBFNlCUopqF5S+//whW/GozrDNyzWRR5qFy7vdgMFP5tUoiL0nkRbBbaQaY2ysAbDV0\nl+9VUkaPjNtL1i0LbNWUk/NUIC0LBogEaDEcHZieJQVMhR8qa/gl2C3mknWrAXZTZYOzqZLAQf+x\n0JpzmeReSoFjVjXA1pyq5J+XeTYEZ6pFy5r0ku8/REBARB63j9frxZo1a9C0aRuULl0VkZHlYDD0\n8HnGz4MoBDpdGeh0I2CxBGLChAl5CBf37dsHRQkCx37SYTQ+g0qV6uLIkSOwWoN9xlI3H5/K/8/D\nYDAjKyvriu/spUuXsHbtWqxcubJAosd7720N3sQ4Zd3bIykpuVD+ZgCgZ89+8t6q97EJMTGVCm38\nwpC0tDSEhETD5aoPp7MqEhNr4LYByUMPPYTnnnsuT9aP1+vF2LFj0bVr10KdxI3KzSzGK6+8AovF\nd7e0ElFR+bcEvhm5cOECmjRpB7PZDbPZhebNO+DIkSOoVKkurNYgmEwO9OzZ77r9xH/++Sd27dp1\n1T+2q8lnn32Grl37omPHztBYcx3gHfROsA88GETd4HQGY/fu3QCAb775BkOGDMfgwYMlEG+XP6i/\nwKy6NvDOX+32tx9spRhFebcFZ1H9k6V3r1w/Hpq1YgZTh3wNthyqypwUsBXyuSjpJFHuvcCAdAi8\ni54Jjue0Ahcxqm4if1Hkb0Pj4qoq56jZUaVk/BhR7MPBIBUkij8SHCOqC96hh4Otpx2iyBuBySrH\ngK0Bl6xzJXBdTAzYaqkLTihQ1+ENuY9ssLVhAwOnmiLslc+CwS4rL1i5q024bNAsPZesm0eO2w2O\nrdhgNNoQGloyl9MrJycHY8a8AKs1FDpdAKzWSrDbg/DSSy/BYHCBAai5rFEkiEwoX746LJZguFyV\nERAQmaewddWqVQgPLwWz2Y46dRrj6NGjuHjxIiwWJ4h+lXs9AwbShvKMpsFotF5G2XP06FFMnToV\ngwYNwptvvomyZavA6awMl6sOIiJK50tWykqzKzQgroB77mlwU38zvvLVV19BUSLBCQ07oSh18Oyz\nz13TuZ988imaN38AHTs+jG3bthXanP4pjRvfD4NB9RB4YbF0wW0DkvT0dNx///2XBdvbtWt319Bb\n38xipKenIzq6HGy29jAaB8FmCy4wJfJm5OmnR8Bmaw3e5V6AzdYCcXGVYDb3A+/Sz0BRamDGjJnX\nNJ7H40HDhs1gNCqwWoshOjoBBw8evKG5TZ36DhSlJDjGEQstQH4KvKtWd4gTwJlSQ9CzZ9/c871e\nL8qVqwYGhTmiFC2iFIaD6DdRdJ+K4nkM7GqygesrqsjnXX0U6Bkw2KhZWaOhBZ6dYPoTgAslO4Jd\nQlFg95cNDC5GObYXNEuGd7q8e98LjtUEg60I1QoJgVYZb5BxaoALCheLwg+R+3TLWEehAVOCrFUk\nGBxHyDUSRMkflnEjoSUA/C3zjoCWdPCBjKPS9TeW+wqDFmjfA7ZqfClfzsmc/5AxJ4BZB7rIPMrL\nNfRQa2kGDnwSZ86cwaBBw9C0aUc0bNgMZnMFsFtPTatOlONfAFt04eD4zFwQ2WGxlAQTV/YG0XCU\nLn31Hfl7782GzRYCp7MDrNZoeV7TwO7CcDRpkrdGbdWqVbBa/aFuInS6UjKPsyACjMZn0bbt5Rvc\nqKjy0OJyANGbeOCBHjf096KKx+NBWlpabj3OnDkfoESJBISGxmLw4GevibNuzpwPoCglwO7IV2C3\nB2HHjh03Na+CpHTpqtC44QC2tG9z+u+ePXvw2Wef4fPPP8eePXsK9eI3Kze7GGfOnMGbb76J8ePH\n50mBLEypVasJOEisPsRPYDaHQesGBxC9hh49rs5f5vV6UblyXfBuuyeIwqDTtcE99zS7obmFhJQE\n12gAvIs+5jOn4eBdNKBlZo2H2ezClClTsH//fpw8eRJGo8NHGQO8wy8Bbbe5EGxhTPA55hXwLvlB\naNlFc8BWQ2PwDl49tpsotdpgkDrpM0YxMNHjXnBigB1sZTjBQBgIjjuoWWl/Qmt6FQhW6Op3Xmjc\nXE3Bil51JRnBVpFKtRIOtlKc0GhmAN6pvwrm6fIHWwMrRZHVAMcAjGAA9b2uW9bijBwbBFbcC8CK\nNRhcaBkIdmX1hsaQXAUaKC2Xe4Csq6pAs6BR6NcAZ8u5oVotZcpUkTjCPOh094LdjtHglOTTYCuh\nChg054HdkXYQVRfLwi3PYwqIgmEwmLF06VKkpg7EqFFjcPLkyXyV6y+//IL58+ejRYt2Pu9HFoja\nQ693IjQ0FrNmvQev1ws/vzBoVeTp4I1PDWi8ZOuQkFDnsmukpLSAXj85d62t1k547rlxN/T3AjBB\naunSFaEoxWA2O9G//6AbolCKj68h7wa/OzrdGDz22JNXP/EGpHv3R4TFIhtEGVCUurgjNPI3Kj16\n9EBISAgSEwt2GQ0YMAClSpVChQoV8ijz5cuXo0yZMihVqhQmTpyY77n/hvTf7t0fgcn0RO6LbDL1\nR1hYKej1KkW9B1Zra0yc+OJVx1qzZg1MplI+imMXiOzw84u45vl4vV5MmfIGSpZMgl4fAI3LqiY4\nSwngOEMi2NLoBd4JTxPFWQUWy8Ow24OwevVqmEx2aJlGXrACd4myywHvuAORtyaEqdTZVbQcnEXk\n76Pc7KKs2oGV/d9goPBNK/5Kjvf6jFsVnOaqNplyg3f6ncHxj5Jgd5IXnJrrDy5U3CK/q5Txa8GZ\nXO+Lgusoa9JUFNhrMkZtcLbaQXAwXo3rlAYrf982BB/I/FU6lhrg4P5oueejPscOgpbJFgYOqO+X\n+bUFx1ZUapZwMGg0kut/Aa6NUaBVqJ+Q42tAy6BaL8cEy7qogHhRrusL/L/JugSDYxnloDI5u93F\nkZe/7F3odIHQ60NA9BCMxl7iEtOhWLFS+TaH69Onv8/1hoP
dZ/tAtAmKUhyffvopDAbzP551V7DV\nVxZEF2CxdEXv3gMuG/v3339HYGAkXK7GcDqro3z5msjMzLzmv5d/SnJycxgMI2Qup2C3V8BHH+Vt\nb717924kJNSA0WhBiRJl873nEiXKg4tZ1fsZj9TUgTc8rytJRkYG6tS5DxaLH0wmOx56qC/+VUCy\nfv16/PzzzwUCybJly9C0aVMAwMaNG1GjRg0AbDrGxsYiLS0NWVlZSEpKwq+//nrZ+f8GIDlx4gSi\no8vB6awNp7MmYmLK48cff0RQUAm4XClQlHhERsZg1KjR+d6jr8yfPx9Waxufl4/95NfTcGvatBlQ\nlLLggPNaUVQvyB+yHQ5HTShKFFJSmqJ371TodFZwKmlZUVbqteeicuVkVKuWDAahueCdbjFwfKM0\n1NRULtyLAWco7RTFpzaTKitKKQi8870kijNWxg0AA4oJDBIx0DKufOscssCgkQQtRuCGRo8SBQYU\nVRntls86QauAnwoGvQpyz53lfDWusl6U6TPy+3FohY6hIPpBxn9Uxmstx22XOT0i318Ex45UsHOD\nlb/6TO8D0VtgV1xdsNUVIWM2kf+HgOMsUfJjlnVUgcoixzSStTWBqK/P88sEWzQlZdw2YAs0FQw6\n1aGBy2IZc5b8ngOOZ6jWyByfcT+StZsv85oLBr+XQNQDZrMDq1evzn0fV69ejd69+8iGZIo89y0+\n401Gnz79ERERBy3LK03WOwhE5aDTWVGzZn1MnDgRo0aNxooVKzBmzBg0bNgIffumYv369fjwww/R\nr18/DB/+7E015mPGhj985jcWQ4YMy/0+KysLERGlhX07E0SL4HKF5qGl8Xq9cDqD5V4/AzM/2DF3\n7twbntfVxOv14sSJE7lhiX8VkACcMVAQkKSmpmLhwoW5v5cpUwZHjx7FDz/8gMaNG+d+PmHCBEyY\nMOGy8/8NQAIA586dw1dffYWVK1fm5senp6dj9uzZcDpDYDL1gcEwCHZ7EDZs2FDgOHv37pUsmG/B\nO8vx0Ovd+P333695LtWrN0JeV9tsKEokqlVrgBUrVmDdunXYunVrLr1K5869YLdXB6fr+u6wtyMy\nshxateoMdreUBfvr/aDRvh8A72TN0Ha0blEwf0BjxlXZc7/yGX8hOI6iFgU+IUouDFrL29pgwBov\n86sIdgFFgmMSr4GVfCswcNWFlsmk1qIADG4K2G1mAQOJltbK8wUYbJ1yL4+Bs/7sYLCb4TP3DTI/\nCxg8yoLB05f8cToYFFWa+2AZ8x6wNXgOmj9bzS5T04DPyrlRYKWvFnueBbugmoEtrxD53R/cK8YB\nBuXW0NoZq+nCfcB1MgrYylETCh6Wz9zgFOD5YAslEOx2bC3jfAZ+r4pBsz5XyDPqJWN2BtEQWCzB\nWLJkCd54YyoUpQR0upGwWJrB5YqA3R4JLvbkdTIa+2PIkGexfft2oeQJkrUIALvZXoZe70R4eCys\n1gfAFqLKRNAdRINgNvujZMmysNlaQacbCUUpnqcw8nqkYsV60OnekfldgqKkYMaMGXn+Ru32KGjP\nGXC7k/PUlpw9e1ZqdGaBNw2tYLM1vK3kmLcNSCpXroyBAwdi+fLlN1UNeiUgadGiRZ6mPQ0aNMCW\nLVuwePFi9O6t5XrPnTsX/fv3v3zy/xIgKUhSUwdCrx/q89K9jzp1mlzxnKVLl8LtDoNOZ0B0dOJ1\nB+jq128NbWcJ6HQvolOngoOPOTk5mDt3Ltq0aQezORocj9gEvb4UQkPLoF69BtDpmoPB5BfwDtcL\nTsWNgBaz6ArNQlEDxGvAwKAq5xRomUV9oFkf7/qs0XDwjt8FrgjvDnaDhYPrROaALaI35Tg1xTsL\nDDRRotwUsAvlJWiV8EtEMfbxuV4mOL5wryjiJaJAy4N3+WrleRMw+HjBrqlwaJXpLlFsambSDlEg\nHcEuQIcoxpJy3CAZxyP3aMHl/FxtZXyDnGMBK+pk5LW6zOCdrxrQHgOOu5SWdesLLRYGsFXmL/dY\nUc4ZAM3CagIGvQE+5zwhzzoWeZMbVoLdYIrMTf18NYoXLwer1SVzZEvMbq+HUaNGQVGCoNcPhtn8\nMIKDS+Q2ZVu1apVwiKnrcw5s/SgwmVrIOBPBG4HHfa73IfT6Yj7rsguK4n9DsY0dO3YgICACbncy\n7PbSuO++trmUMQC3v2baITXeeA6KUgJbt27NPcbr9UqqvUrD8zfs9pL47rvvrns+NyqFrTuNVIBs\n3LiRvvvuO1qxYgWNHj2aAgICqEmTJtS0aVOKi4sr6LTrFr6nG5cxY8bk/j8lJYVSUlJubkK3UU6d\nyiCvN8Hnk5K0ceMWstsDqHHjZjRnzjRyOBx5zmnevDmlpx8lj8dDRmOBj69Aef75Z2jjxtZ0/vyf\npNNlkaK8Q8OHrynweL1eT127dqWuXbvSSy+9SiNHVqJLlzzk9Y6l48dL0dmzY0hRDtOFC3byejsR\n0Tki6kBEq4hIR0RNiWggEbUmoggiOk5EHxBRKBGFE1EGEb1ORDFENIiIYonISUQHiChRvi+ZZ42I\nZhORkYh+J6JXiKglEZ0ioteIaDQRHSGisXJ+dTnPREQTiWgoET1BRD2I6GcishPR30RUg4geIaIs\nIvIQ0QYiGk9E3xNRWSKKJKLNRHQ/EQXJnDvLsfcS0Z9EFCfjXSIiBxHtJqLJRDSJiFKIaAARrSCi\nmkRUmYg+IyIrEf1KRI2JaBMRfUtEb8uPQkTxMic/IppORH2JaCsRrSWibLmvTBn/EyK6h4hWE1EU\nERUnIhBREhEtI6KOsj5ERBWJKFnOq+uzvjFyH7OJyEtEVYhoARG9QPwcSda1tM85DxDReiJ6h/h5\nT5f5Pk5OJ9G5c3ryevM+w7Nnz1B29gWZJxGRjs6dC6Wff95O/v5+dPz4dMrJyaKWLXtQaGgoERE1\nbNiQ1q//kurVa0pEC+X+7yOiSpSd/b2MkyHrlvd6gI74fSQiiqZLlzIJAOl0OroeSUxMpL17d9BP\nP/1ETqeTqlWrRnq9Pvf7gIAAGjLkGZo8uQ5lZzcnk+kbat26ESUlJeUeo9Pp6JNP5lPr1g+QwVCW\nsrJ2U2pqD6pTp851zeV6ZN26dbRu3bpbNv41w9Kff/6JmTNnokOHDqhUqdI1d0m8mmtrwYIFub+X\nKVMGx44dw4YNG/K4tsaPH59vwP06pn/HZMaMWXC7w2A229GmTec8gb7Fiz+Goqg+4d9lB/gIiI7B\nYnkQ7dpdf73OuXPnsHnzZvz++++5Oy61612NGvehUaO2mDNnDgYOHIQnn3wau3btuuJ4Z86cwSOP\nPIHq1Ruhb9+BGDVqlNBBqLu9X+HnF4EZM2agW7du8PNTKdHXya6sGzgO0RRETuj1DrCbpAR4R++7\ns90ju91HZRfbHJzNlCTr8xPYdVJKdthusPXxCzg+obrR9MjLkvsz2B1TEWxZlIXWkbCCjNlcrlsf\nnMSw3Of7I+CYxh
CwZfOjXLuu7LgnyXzmgV1mKnOvShL5z+yuytBSqwG2OsbIva8HB9bVWEg3sEW0\nFZorTG3p6wDHn4LAlluUXEulzA8CWyvzwRaLb0HhPmjZdDEy/i5Zmxd9jntYxvjB57PHZK6b5JlV\nBTMNHAKnGkdCr4/AqFGjAQBTp06FyRQk70QarNZWeOABZizma/cGuzIDodO5YTCMhm8we9GiRXne\nyS5duoL/Tr6W9feT9X4K7BZzgN2bG0G0B3p9LRiNbrDr7RDM5p5o2LB17t+G2odn1Kjncc89LdGr\n12M3TaOyZs0avPLKK1iyZEmBls/x48exZs2aq8ZGb4UUtu68odE8Hs81m2FXAhLfYPuGDRtyg+3Z\n2dmIiYlBWloaLl269K8Ntq9Zs0aKlbaB6BQslk548MFeeY55442pCA2Nhc0WBPbzq1k1R+FwBF3X\n9Xbv3o3Q0JJwuZJgs4Wjc+deyMnJwcSJL0NREuUP6R3Y7UHYuXPnVcfzeDyoVKkuLJaHQTQBen1F\nOJ1+MBrVimiA6Bf4+0eiatVkWCz+0OuN4Kpx9fu/RWkHgMgBnU79fykwCPT2OXabKMKqoqhVhWiB\n1uLWJoo6HgwKKjGh+l0xUahdfD5zQXMPtQYr/iho7LqqwveD1tcEYOAYDHZZGaDFTQB24fQCu+wi\nwCnSYTLPd0QRfgfN9aVmUHnALqzqogifB7vW+oPTg9Xxf4PWYTEKnPJ6HgyoOhlTTbH+Wu7RCA1g\nK8j9qzEmlczxJXkPyvmsreqic8n6DQAnBGyGVmHfGuzmOwp2i7WARruitgTwB6dvTwGRgooV6yA9\nPR0AsGjRYkRGxsPfPwIPPtgDISElYTA8Co6LtQeD3lrkZSkGiEYjNTXvpnX69OlQlIbynqhxuz3Q\n6fzhcATBbldjcZyubTS6MX/+fJQsWQEuVxhatXoQ6enpeP75ibBYHDAYzAgLKw2rtSGIPoHJ9Dii\nosreVHbX3S53BZBcqzzwwAMIDw+HyWRCZGQkZs2ahWnTpmHatGm5xzz22GOIjY1FhQoV8jBnfvnl\nl4iLi0NsbCzGjx+f/+TvciAZMmT4P5TD5cR2qkydOhU2m29G1jcICyuYpiY/qVIlBTqdSvCXCbu9\nGubNm4ewsNLwzYTR6YZjyJDhVx1v586d0tt9JniHNxpEbcRPPRZEi6EoFZCQUA0GQ2MwQEQgr59+\noyiaaGgNoEqJwukgimuYXKMUiF6X8waJwiwJtggugne9DnAwX12n/uCAcgloNSavgnfMKjHh26JY\nfGtK0kQZO8AA/rsc49uLpQsYXIbLsT/L5+ehZU09AA3MHKK8fGMZzcCB7kpgJd4AGkVJPWigEoq8\nhZnLoVkYauaZSdbXT+7Z9zqh0GJHalbRT2CgUHvAJIKVdjFZr0ryflYAg8AH8p0fGDgVeSYXZC3M\nMk4g2BJ6EJyJNxUMSipgVwLRT7BYeuGhh7iAdenSpXA4uMDRYFAZidVnobIMzIJOFwidbpp8fhFE\n1WAyOTF+/Eu572VWVhbq1LlP3g8N3G223pg8eTKMRgXahgxwOtviww8/zPNuL1q0CIoSL2t1Tp5R\nX59z6t3SlsZ3Wv5VQHKr5W4HEqZh6eTzx/4FSpXKv+r37NmzKF06CTZba6m0D8Enn3xyXddzucLw\nz91c69ZtodO54VvZqtc/jeHDR1x1vF9++QV2e7Qol62551ssTVG5ck2kpLTC1KnvICIiXpTPtErC\nQQAAIABJREFUFvmjTBSF+RQYXMqCAaQ8tGwatcHTq6KQ/MFAUVMUURwYDHx5jLaDlf1aHwWkck35\nNoBKF0Xo6zYbCo10EaIcTTKvZfJZNZnvc2Bro5goU1UZu0WhFgMrf1VZLRVFqqboqgHkDDBobAKD\npp/cm3qMB5ylpbrBAsBuwGeg7fQJGsXHSbACt8qxB6EBhgImvPRl5IVc/0UZayzYDVhc1vYs1IAw\nr381MCBdACv543Lv0+SZBsi9NwODynFwxpz6rMrAN5GD6HvExVXDb7/9JutXF5yOXAFc7a+6+y5B\nBaFGjVrA37+YHFNS1u0gbLaQPKSSHo8HTmeIz7twAQ5HBXz66acwGq3QOm/mwOGohvnz52P37t25\nlEK9ej2Gy9+tsrm/O50pt4Tp4m6RIiDxkbsdSDIyMhAbWx6K0gIWSz8oSlCeHPp/ytmzZ/H2229j\nwoQJ2Lx583Vfr3r1+tDrX5Y/hrNQlMqSnz8CvLOcC6IJsNuDcjmzriQ5OTmoUaO+KK6/RZEdhdn8\nKCZPnpx7XJUq9USZqH+UamfAMLDlYQMr7PZgV8YA+a61KL5y0FKEvwTXuJQWpdMJ2s51uoxjl7HK\ni8I1IC8F+2w5ZoTPnFaIsp0HjgV0hFZ9rvaC3yjHhMj9qJQrUcgbfzGAQVId+y/wTr032LIKgwY4\nappyiChTNT1VBacEsAJ/Sdb3ZbDLRu3jroB3/eq1XgVbBK/LvzXkmFfB6dZuMFhAlGww2IKyyz1t\nlc/ikRdwYuS+qsv5rcAAp3KjPQfevbeXebWRZ1RRxtwgn7vlWrWh0z2EJk3uR3JyMjSa/BAwSNQA\nx38+BGewlYbRmIQPPvgAW7dulV4nm3OfvdvdEMuXL8/zfq5YsQKKEgSXqzXs9ji0b98NXq8Xo0eP\ng91eBkTPw2ZrhrCwWJjNLjgcJVGsWCns3r0bY8aMhdnczef+34VOFwmipTAan0ZkZFyBRJD/Bbmt\nQLJ8+XKkpqaiRYsWaNGiBVJTUy97mHdS7nYgARgcZs2ahSlTpuCXX3657Ps9e/agbduuqFWrCcaP\nf+mKPD3Z2dlYsGABJk+enG+9yd69e1GsWCk4nQmw2ULQsmUHOBzqLmsBiDrAaCyO2bNnX/P8MzMz\nERubBAYiJ4j8oNe7sXHjRmzatAnLli3DihUrROmqtC+7RJlMB+9iS4uSUXfwXvlMTes9Ct7x+lKf\nrwXvcIuJYuV6BZOpNqpXv0d6wM8Ap5iqXRDDoaaDstIPALtr1onC7gFOUfUDu7OehsbV1Ri8Y1ap\n5R1gsFkj45jlRwFbNoHggHUOOP3VLkqyLzhwX0/WS6W/t4Mr5WuLglZTnj8U5f1tHqWmuRIjoNGA\neOXaVrBFtwMMtipNfXto7rUIMOiuBlss6vUcMid/cKD9DzB4hfusmxtsNVaXexknz/9FMGha5Zmp\nzxhga9QNjeGYrR67PUzWTw3gnwMDyRBZt3jwRiIOFkspfPrpp7h48SJcrhB5vm+A6AVYLP7o1q0P\nbDY3FMUfQ4eOgtfrxYEDB7Bo0SKsX78+T1B7yZIlGDx4CFJSUmAwhIDdp5nQ6aYgMbEm0tPTEROT\nCIejMRSlKxyOYHTv3hs1atyHrl375KYc/1fltgHJwIED0bRpUyxYsADr16/H+vXrMX/+fDRt2hQD\nBlxORXAn5G4DEq/Xi+eeG4+AgOLw94/E6NHjrpirfuTIEfj5hUOvHw+iL6A
otdG//6B8j/V4PLj3\n3haw22vDbB4ARSmGadNmXHbchQsXsHXrVuzfvx9nzpyBogRAc2tth6IE4siRI9d1X+PHT4LBUAu8\n48+GydQV0dGJsNtj4HY3gtMZgmHDhkNrY+uCVvD2kyirYGiEg/tEmbnBVkWgKB5fWv9F8PMriUce\nGYDJkyejbt17ERkZj8jIRLRq9SCefnoYrNZA2GxVwO4fqyjTUFGQH0JrmxsIthKeBO+2y0PrSTIc\nXB2/EBy4VcAgFydjqhlYaq2GypNVFRq4qP3sHeD6hdkyxqNyLSMYrILl54CMMVK+fxocvM4E05kk\ninLvC96120WhVwdX0leDRqHfEZxRNQYayJ2W81T2ZTWDzQzOnguBVmToB47X7Jb16QgN9OpCswYP\nyvPdCI3BOACaNTRKxnsbvJF4RMaN97ln9dk+Dy3eEiLjlACRE0ajFc8/PxGvvPKKzKM7iOrC6QyD\nzVYTTHx5AIpSCW+9Ne2K7+0jjzwBk6kimCWgE9ht+hdMJhsA3iTNmzcPM2fORFpaGt5991088cRg\nzJw585qIF//NctuApKB+JF6v9z/Rj+RWCLPpVgBn0+yColTEG29MLfD4t99+Gzabb4D1KCwWR77g\ns2zZMjgclaAFF7Vuc1eSzz//AooSAKezLGw2f8yfv/CKx+cnbdp0hUZPARB9B50uBOwyAYg+RWRk\nGbz77ruwWPxhNDaDtjtVd9KtwLv+eaJYxoHjOcFgq+J3+f8QEE2CooRgxYoVuXMYNGgYFKUGiD6H\nTjcJbncoNmzYgPbtu4gS6icKuAY4AO4Bxyj8wVlEh6H1fFeZe+Pl/+r1G8sxMWBwCpDfj8ozVcCK\n3yvPYYKPQh4OjvGoa/SHfB4GrvruBa1nvXpMhijTweBduco4nAIGERsYGALArrQe4L4tFmjJBL6Z\nZFXAoKk2GHsdDDifgIP3KhmkmqUWCE5lVs/vBrYC7eDYhG9xYZasWbisz1vQACYUbOn5xmc8YLB5\nGBx7ULm0MkGUCEVxoV+/x9C+fXuEhkZAp4sHg9FqKEqsJIiosSsvdLpmYHBSx1+A++67v8B3NjMz\nE0ajDVq2nBcMxENRsmT5PMd6vV507NgddnttEE2AotRF27Zdbqhg8U7K0qVLMWLESEyfPv2q7SVu\nG5AkJiZi06ZNl32+cePGK5Iw3k6524AkObklmJdIfdk/Rd26zQs8ftq0abDZfCt+/4TF4sz3BeZu\nc75/2DkwGCzX1B74zJkz2L59+w3T/w8ZMkJ6GPDuVK8fBZ0uwWcumTAYzACArVu34tVXX0XXrt1E\nianurq0wGFQlHumjnPTQgq67YTQmoV69hnkYDwDAbg+AL8eR1doDY8eOhdUaAI7fqIo5DOw6CRYF\nFwEO2qq1JWoVepgoqspgt01JsDXxPtjtFQEOjG/yuc/7wKD1sBx3DxgcQsDKvaPPsX+Bd90OGbun\n3HtlaGzJy+TcJ+TYZWCXXhcZLxJaOq/aI0a1KBrK/agBcy+0lF9/cOzin50SJ4OByQm22N6Ra6it\nku3g9OcN0NiOF8hnXcFW1jpZS4/PuN3BIFPG51mqLjA1NhQFrdFZaxCVhL9/JBITq4JBrRc4ySAI\nRF1hseR93hzHui/3d4NhJLp1Sy3wndUqzH2Bti4Uxe+y+OOePXukUZtKS3MeNluxyzpG3s0yZswL\nsNtLg9v9NkDduo2vaFXdNiDZsmULqlWrhvj4eDRs2BANGzZEfHw8qlevntsE507L3QYk7do9BJ1u\nUu6Lq9O9jNatOxd4/LFjxxAQEAGDYQw4lbY6nnpqaL7Hajxbq8GK+1lUqFDrVt1KHsnIyEBiYg1p\nIlQPgYHFYbVGgnf5gF7/CsqXv3wuCxZ8CJstAG53VdhsAZg9ey7eekulez8j65QAzgriXbyiROTL\nlmq3B8LXPWK1dsfQoUPhdJb1URQQZaaI4rZA47IqC+bq8ofGp/UkGBDiwDGKdeCYjAWcVpwItgyO\nga2vsmDroBKYoqSijPcyOIHABnbzfA12o6gsxBawJfEjOKYRBS58VK2CQ2CwcYGtpEC5fj1wPYlN\nlOwScO1MJ7mvNtDSrevJuEPAAKo2vfItfBwG3tWrdTc6UfStwO618tDcoBawhRImyj0BbLHtlTVa\nDjWpg8FIpeAvDrYsYuU5vAS2QtWWyJ/J/Q2ScVOQN+NuPvT6YqhSpZ5Q258F0f9gtUZAUfxhtfaA\nzdYFAQER+OOPP6743jLjbQ8QbYFePxkuVyj27t172XHbtm277D1yuSrcUMLLnZALFy7AZLJBy1Tz\nwOFIumLv+NuetXXkyBFs3rwZmzdvvm7f+q2Wuw1Idu3aBaczBEbjozCZHoPTGZJvgN1X0tLS0Llz\nL9x7b2tMnvz6FV1VX331FUJDS8JotKBGjQY4fPhwYd9CgXLp0iV8/fXX+Oqrr5CRkYFx4ybBZLJD\nUcJRokR8vn+gAFfvbtiwAceOHcv9rHTpSmBXziiozLtmcxjMZgdeeeW1fMd55pkRUJQqIFoMvf55\n+PmFSyvXYHAdQwbY/RYOlZLdbE4ShfqlKNWx4N23HzRuq0qi/NSMI6MoPQfY4mgsCj4SnPa6GJyu\nbPf5qexzbhzYckgFWzouaP1P1KJIPzAI2cDuKtXd1BwcUzogSn0aONMu2UfJnYGWPRYk4DBNwKMl\nOK7hArv4GgtgTAQDUTDYGmsp81XAsR415TkUHN+YI+t4Ghy7eRucEvyYAIZT7ruBnKOmHh8DA2Cg\nPN+SYMvkK+j1FphMZcGuvQFy7kdgoPa1mjbAZgvHqVOn0LRpexiNVjidwZgxYxYOHTqEN954A1On\nTsXx48ev+s6eOXMGnTv3RnR0BaSktCiQ3PTixYsoXrwMDIZx4Er4iYiIKH1THIO3UzTrS2NPcLla\nYfHixQWec1ek/16NVuN2yd0GJABw4MABTJo0CRMnTkRaWtqdns4tlYyMDBw8eBAejweZmZnYtGnT\nNTERnzlzBmXLVoJOZ4Zeb0LPnv2wb98+nDlzpsBzvF4vXn/9LaSktEanTj2wb98+rF27FjZbpICB\nURSo2izoJEymQPTv/6R87hvI/0KUpwUa1cg4sB9fVeq+qaHDRfmrLikveMddG5w5tVqUtlonMg1s\nLQSLwlVrLIqBg9VbBXAsYCunJfIG9o1gN84OcHD9n0CiNtp61Ofz70V5X4TWodJfrt8EWnKDG1qW\n2R5oGVdqK18XtKyvcJlnCDigvwMcZ2gp97ccHJdp7DOPHBnrLNj6eUDW5QkYDPcLg29zAZnHwIkJ\nUbIuu2Gx1MawYaPzPPfbIQcOHMA99zRDUFA06tZtgv3799+W6xaGeL1eVKhQC0bjYHDztgVwOkOu\nuPG/K4AkMjKyUCdxo3I3AsndLKdOncLAgYPRrFknvPji5ELLTPn111+lv0ol2GxheOihvtekAE6f\nPn1TNBRvvvkmrFa1He0QsA8+XJ
RXKPr1ewoAEBurZkGpyu4bsBXSQJRcHDQr5UlwjGS6z/GLRCn7\npi8ngi0Wlc/KX8ZTYwIlRGmHiNKsgLx9O5YJwBig9RkZIQqf/qHQFbm/JQIsbrBbypc5+n9gSwgC\nKAvBsYh/1osUB4NVRXA8RnUXlgJbdEHggHisHOcGu9OcYICqJ4CTAga4YmCQVNdGbfObDa0Z2bLc\n6xuNreT7bmDLLhhslbmg17sQGloKX3311Q2/E/9f5dixY6hfvxVcrlCUKVMFP/744xWPL2zdWSB9\n7IABAwokekxPT79BisgiuVNy4cIFqlYthQ4dqklZWS1p3boZtGPHLpozZ/pNjZuWlkYtWnSkkycH\nENFgIsqkTz5JppYtF1OHDh2ueK6fn99NXTs+Pp4MhteI6DQxO+2nxEy4m8hstpHH4yEioosXs4lo\nHDFjrpWINpNO9zcBDYgogIhWEtGbRPQsEfUmZuF9g4jaELP4ziVmju1ERL2IWXZPErMYbyCiJ+X6\nB4hoIzGDbyUi+ouYefcvYqbagz6z/5OIcoioFjHL7pdyjQeJGXcnELPwdieii8TMuu8R0VkiakBE\nLpljAjHL7wBiFuMnZewVMt4fRLSEiFoRUVsiOkZEZiLaR8x2fJKI+hCzJ7cnoqeI6DwxK+9IYnbh\np4gZj0OJ2YWbELMaO+T3ZURUj5hNeLbcUwUiOkHMhFxK7tlLHs8OYmbmwfLZKCJ6jYxGP/J4ZtPx\n48epbduutHbtUqpeXWVuLpKrSWhoKK1Z89kdu76+oC/ef/99SkxMpCpVqlDVqlVzf6pUqUJms/l2\nzrFICkHWrl1LJ064KCtrGhF1ofPnl9KCBR9QZmbmDY/53HMTqFy56rR/v5eYKv1rInLQhQuN6bff\nfiukmRcsDRo0oF692pLVWoYcjgrEyvAgEe2nrKxtNHv2XDp8+DCZTCYiArGCq0hEpyklpQ7VqLGc\ndLoVxDTxk4noDDEFfgYx5X1xYjr0dCIKIaJviCiVWEE3IKL6xFTzNiIaQ/znZCMGESKiYGK6eAcx\nWIwnoseJqewHEQNEKDHluUpnHkVMTW8hoobEdPLVZP5PE1EJIvqCGCjiiZX8UCJ6iIjeJ6KZxBTr\ny4iBcyExuBiI6EdisDtLTKV/nIiWElPiTyai74hBZjURbScG5Wdkro2JaA8R1SGiecSAqieij2nE\niMcpIeGsrI8i6zdX5mKRuR0mom7EIFXG5ykmEJGZPJ7FxJT2nej8+YE0b95HVCT/IinIVElJSSmQ\n4TcqKqpQzaIblStMv0j+IZ999hmczoY+Lo6LMJnsueys1ys//fQTFCUCWgOfr8VNcRp2exI+/vjj\nQr6DgmXv3r2YNm0aXK5q0OIIi2G1hmHDhg2oXPkecMBYvfe3ULFiHfj7F4NO1wMcM1BdSSOhZTbN\nAge7g8St85q4sSqLG6kdOG3VBg7kZ4srSs2U2gl2l5nA1epR4hqqIK6lZGitej8XN1M3WUc1K2o2\nOCaxAFr85gTYZXYfOBVZva8ssMtrvdzLWp/vknE5rYsNnEVVFpydZQPHbVzg4HxZ5CWx7C5urSmy\nFpzKbTDEwGwOgNbqV2UlVptkqdlzis/PHHAGWDlorr1kEA2GTvcEnn562NUffJEUKGqH04KksHVn\ngaP9/fff11SjcCflvw4kR48exZw5c/Dhhx8iMzMT58+fx8KFCzFr1iwcOHDgusZKT09HaGhJSTX+\nGlbr/WjcuO0Nz23hwoVwOn0L8AAiBRZLAPr0GXDbi7kyMjIQFFQCHDAvCaIU6HT1EBpaEjVrNgbX\nTajzXCAK/mmfzxqCK67V31sLaESBA/Gn5BiV/+oecOwgFkyzPlXO2yhKXlWgkeA4yNPI24nwD3B8\nYQs0EHODQcshc3kJGr16mihjtW6lstxDMLS0z2lyvWDk7SypxnSSoCUMLJY5viv/LgenFTeQ+7HJ\n+L/7zHkYtKQAu9yXCiCBMjdF5vsFOGb1P7lXF7QWvN/kAopOZ5deIc+AkxZaw2DwKzALsLDkt99+\nQ58+/dGpU8+7ivbpZiUnJwcDBz4Nk0mByWRD797983RwVOW2Acm/Qf7LQPLLL7/A7Q6Dw9EeDkdD\nREWVRWxseTgcDWC3MzdQfgWjV5I//vgDrVt3RoUK9dC//+Dc/vE3Itu3b5ciLrVo7Au4XKF3NNvl\n119/RUBAFHzbrBoMI1CzZgoUJUYU1WoBmnrggjxVSTb3AYO14B36GrDFEASDwR8NGrTC1KlTwVlT\nJEBhhhZs/xK8g48DZ2+p1d9h4J13I2hB6fngFN/K4MD1QPlXDb6rNCRqmq0LHMCvByagHCJKvY58\nVxJaCnIw2FIKBe/0E+U6FaDVsPj2dxkGBsofBBjaQyverAnOMlviAxRNwWm7weD6mznyXQNwMN8N\nBlq1V8jvAiq+m44asFrdSEioBo0Y8hmwpezA33//fcvek927d8PhCIZO9xyI3oKiRGDhwg+vfiKY\nf7BcuZooUSIRgwc/m6+SvpMyefLrwgBxDEQnoSjJGDPm8jYcRUDiI/9lIElObg6dTsuvNxgehl6f\nCI37aB6Skure0TlOnvw6LBY3nM6ycLvDLqtGvxOSktIaedkFlqJmzcaYOfNdWK3h4MK690H0sSg3\ntTtlGfCueil4Rz4WWjrrSgQERMLr9eKLL76A0egnirShKOM+olz9BBxeh1YJfhQMXnZRsHFgS8Yu\nSvtBuYZXgEblrtoBLqp0IC+vVU0wUIaDwa+1KO/14CyucuBaDrd8bge71jLlOi8LwKhAEScAEAlO\nJw6QtQgFW0wVZKwq4JqT0jJOKrTeMfFg1xzAfGrlBRi6y2en5T72Q03NNhgC0KZNR5jNzcAV5X/J\n+G9ck8v177//xtatW3Hq1KnrfkeeeOJp6HTDfN6RrxAfX/2q5/34449QlBC515+hKMl48sn8C4jv\nlDRo0Baa5QcQfYlq1RpedlwRkPjIfxlISpWqAnaTqC/ENPDuWP19F0JDr6/x1a2QEydO4H//+991\npfF6vV689dY0NGnSAb16PVaohZXjxk2CotwLLlDMhM3WGM8++xwA7kFhNvcVpZ0Dg6EmbLYQ2O2h\niImpgHvuaYi4uGoIDIyEwdAlDxiVKlUZAFCsWBw4HqS6jOqDOy5ClLQaDygOjbBRpblvLP8fBwat\nDmBroiXYYrCBLQ71uh5R9i+A03M7gN1yMeA+IOpxg8GxkuJgULOCCxYVeWf8wMD2M9TuhazkW0Er\nYpsMtjCWCHCo9CZq7UkOOK5TU44fJNeFjKcyFQBEA+B2h0jcROXhUl1yHUEUjEceeQLlytVC3hjM\nu9Dro9Gp08NXfMbMmOAPlysRihKATz759LrekX79nvjH+v2AkiUrFnj8l19+iSpV6ovr1Lce6VdY\nLEF48cVX7hrLpFu3VBgMw3PnqNePz5ddowhIfOS/DCSpqY/Dam0HpuY4DoulPEymUHBA9gI
sli6X\nte39t8jgwcOlSn0ejMZnEBIShZMnTxbK2NnZ2XjooT4wGMwwGCzo2LF7LoHdqVOnUK5cNTgcZeFw\nxKFixTrIyMi4bIxTp04hMjIOVmsnGAyDoCjBuQSSTNVyzEeZPAWNrPGwAMlIUZiDwQFrtXDwPHjn\nr3YYTBKgUd1RDZCXr+qUgMtq5G0ElQLu6+Ib83FBa/XbVMZxQysqrAGOt7jBhZT9kbex0//k/FZg\nC8ggin8BGMQeAVthqsW0QsbqA7ZSRoGB9SCIQmCzBcJi6SjgNU2uvQrcr8WC+fPno1GjttDpJufO\nQafri9q1619RKR87dgw2WwA0DrctUJSA67JMNm7cCJstGFxrswaKUgGTJr2S77Hr1q2DzRYKriV6\nGOwyVNdsLYiioSj10b79Q9d8/Vsphw4dQnBwCShKeyjKA/D3L5ZvvOm2A0mLFi3QsmXL3J4kLVu2\nRJcuXTBlypQ7TiHwXwaS8+fPo2XLTjAYzDAaLRg8eDheeOFFmM12GAwmNG7c7l/ZeMfr9cJicUAL\nEAOKcj9mzZpVqNe5ePFivu9nVlYWNm/ejJ9++umKCuv06dN4/fXXMX78eGzbti3385YtH4DF0h3M\nsLsZ2o69ro8ibwDml+oFjp2oQf0eYOvjVzCliANstTQCx1oCRYnfB7YQyoKtkO/AMRfVrTkGbGmc\nBncpTADTt2yCFseIAlsuvwgA+IOLLF+XY54Fu5JOC0D1BgNbHWh9S5ZCI3qMgFZgqdLJqFxiandI\nP2j09gYwrYpv8sLDYIurOgYPHoLffvsNfn7hcDjuh8PRGMWLl7nqhuKHH36A213NZ1zA5SqPn3/+\n+brej1WrVqF69YZITKyDV155rcDkkG7dUqEB7hFoLY1fBLv+FoLoHMxmF/7666/rmsOtkpMnT2LW\nrFmYOXNmHmoiXyls3amTQQuUgQMH0smTJ+nBBx8kAPThhx+Sy+UivV5PGRkZNHfu3FuWmnw10el0\ndJXp/+slKyuLDAYDGQwGIiICQDk5OWQ0FlhLelcLALJY7JSdfYiIAomISFG60pQpydSnT587O7lr\nkIyMDHrggV60cuVSysmxEFE/4vqL3ykqyo/++OMgcWGgQkQgrpM4RkSLiAsC9xHXlxAR9SWivcRF\njTrimpFnichOXGNyhriO5Q3iuo8mMsYkOSdLrhFIRPcS0RoiqktERuKajhwiipbjRhBRR7nu60T0\nMnFtzA7iosNgInIT18d0IaKPiYsgyxEXLK4mLtzcSFzfYiMuBD1BXAezi7i40SHnHCYuhgyXOdYj\nrilpR1brSpo0qQENHDiQTpw4QStXriSTyUTNmjUjp9NJACgtLY0yMzOpTJkyZLFYctf/6NGjFBub\nSBcufE9cR7OTbLZ76NChPRQYGHiNT/HapW/fATRzZggBI+WT98jPbxidO2em7OzZxOvuIas1nPbt\n207FihW76pg//PADff311xQUFETdunUjRVEKfd5Xk0LXnVdDmipVqhT4Wbly5a547vLly1GmTBmU\nKlUKEydOvOz7U6dOoU2bNqhQoQKqV6+OnTt35n4XFRWF8uXLo2LFiqhWrVq+41/D9IvkLpSePftJ\nHGMldLqX4XaH3VFC0FOnTmHjxo04dOjQNZ9z/vx51KzZAHZ7LJzOcoiLq4Rdu3bBZHJBS7EFOEj9\nJJijygGibT7fNRXLwQKzuQQ48P6Cz/ePyo7fXywBNWbhALuUkqE1jkoFu9Ic4K6QgWAixhliLfT3\nGfd1aE2/vgBnjD0CTj7w+hwXC3arqfGgqmKFREB1gel0FrDb6qIclwkiJ+rXbwq7PRFEr8Nk6gqd\nzgm7vQMcjrqoUKFWgRmDOTk56NTpYdhsoXA641GiRPxlLL/vvjtbWKWrw2YLwLx5C27q+V9Jdu7c\nCbs9CDrdOBC9AZstDAsXLkRERGkYDKNA9C0slu6oVavhNaW8z5nzARQlHHr9M7DZWqBcuWo3lT15\no1LYuvOqo8XHx+epWThw4ADi4+MBABUrFhyg8ng8iI2NRVpaGrKyspCUlIRff/01zzGDBw/G2LFj\nAXBed4MGDXK/i46OvmoKYBGQ/DslOzsbI0aMReXK96JZs47XRPR4q2TNmjVwOILhclWB1RqASZMm\nX/0kEY/Hg23btmHLli24dOkSvF4v7r23BazWjiBaDb1+GHS6ADCBYboo/DBwoLcTiBQYjW588cUX\nsFrd4ED2Kh9FPg8c+0gB07sHgbOqvoOWUVZLPguVY21ynEHcTE5wFloA2J32moBNCTl6cHoKAAAg\nAElEQVTuITDjsL8cozYry5K5qv1YvGD3mhtaYH0X2CXXFuwSmwSDoTKSk5vA6/Vi/vz56NHjUbRo\n0RqRkfEIDi6B3r37XtEl/t5778FurwW1N4jBMA733NPssuOOHDmC77//vkDXTWHKjh070KPHo3jw\nwV5YvXo1AI5FtGr1IMqWrYkePfrlG2vLT/z91dohXlO7vTHef//9Wzn9fOW2A8myZctQvHhxJCcn\nIzk5GcWLF8cXX3yBzMxMvPrqqwWe98MPP6Bx48a5v0+YMAETJkzIc0zz5s3x7bff5v4eGxuLEydO\nAGAguZq/tAhIiuRmJDs7G05nMLheBCA6BJstDDt27LiucU6ePIl9+/YhOzsb586dQ79+T6FixWS0\naNERLleIpHFPEguiNThFdxyI0qAo1fD22+9g1qz3YTC4wfGVcyA6DaOxKvT6ktBShl8AB+Z9M6T6\nibXRHRyjKCGWSA44IB0C9uUHC1i4oRVGngcH3meA4zlucFD+LbC15BbLQw2Wq4261vhc3w/cpvdt\nGI1BeOGFF/K0Qpg3b4HU8HwDom+hKHF4992CFecTTwyG1k0RINqLoKCoG33Ed51w3xDt+Vks/TBl\nypTbPo/bDiSA1gd827Zt1xxgX7RoEXr37p37+9y5c9G/f/88xwwfPhxPPvkkAGDTpk0wGo25QbOS\nJUuiYsWKqFKlCqZPn57/5IuApEhuQo4dOwarNdBHaQEuVxssWrToms73er0YNGg4zGYnFCUS0dEJ\nlzEO/Prrr6hXrxn0egc40FwVWvMogGg6OnbskXts7doNoNebYTCYJbNoBzgDLABc01EJTL9+Dhzs\nD5B/1da2/+xe+LCAgxEa2/Fun+8nQitO7AKmWOkJTiX2TSBQa3PWgN1mJ8B9RNS2xQ7odFbMmDEj\nt7Pg0aNHUadOU3DxpXq9j6/YNXTmzJlQlLpQA/U63QRERyfeUau1MKVJk/thNj8Mri9aA0UJxv/+\n97/bPo/C1p3XFLH9+eefKS0tjTweD23fvp2IiLp163bFc3Q63RW/JyIaOnQoPf7441SpUiUqX748\nVapUKTeo/N1331GxYsXor7/+okaNGlF8fDzVq1fvsjHGjBmT+/+UlBRKSUm5llsqkiKhoKAgMpsN\ndPHiauIA8iHyeDZSfPzz13T+0qVLadq0Tykraz
9lZQXSoUMTqFOnXrRx4+rcY8qWLUvr1y+jqlVT\n6KeffidmG/6OiGoSEchs/oFiYiJzj/3++9V06dIl0uv1lJhYi3bvTiMif2KCxMNEtJg4UO5PHNiu\nQkRVickQPcRB8J+JiR6ziGgrcdA7iZiFdwsxkeNIIrpARJ8TE0o2ImYgTiRmFj5MHPCfREw2eb/c\nUX3iAHox4oSCDsRJAgcJqEV9+rxIJtNQio2NprS0/ZSVRUR0n8+qnSCHw0Z79uyhgwcPUkJCAoWF\nhdHZs2epV68BtHr1GtLpcshsLknZ2QoBZ+jw4aZUuXIdWrnyM6pdu/Y1PZu7VRYunEVdu6bS2rWJ\n5HYH0vTp71H58uVv+XXXrVtH69atu3UXuBrSdOnSBbVq1cKjjz6K/v375/5cTTZs2JDHtTV+/Ph8\nA+6+Eh0dnW9K65gxY/Dyyy9f9vk1TL9IiuSKsnbtWjidIXC5kmC1+uPll/Pv0JifPPfcc9Dphvvs\nto9DUQLyPbZcuaqykw+XHf49sNlqoXTpigVWcS9evBhGox0c9D4LTuN9AFw9fgxcv1EdRO9Cp1PJ\nH80yfhs5zwmuXVETAPrLcYni8uoCrpC3ifXxO7hi3ybnJoFjLSoVzl/gWMwS+f6wz/0PB9FomVsA\nOL6yRY4bAaLRUJQg9OiRCpstBG53Muz2ICxbtgxNm7aHxdIVRPtA9CmMRgcMhkbQYjbzULly8o08\n4iLJRwpbd15TsP1GCPiys7MRExODtLQ0XLp0Kd9ge3p6Oi5dugQAmD59Orp37w4AOHfuXG7wKjMz\nE7Vr18632U0RkBRJYUh6ejq2bNmCo0ePXtd5H3zwAez2mtAylmajbNn8MwwNBhO45mMtiFbCaGyM\nXr16FZixs3r1ajgcQTCb48FxibcETFpALWI0mdyIjU1ETEySgIcLXCCpHmOCVgCZAM6ocoHjJlMF\nNLZDK2JsCM4i+1RcW3/JffUBx1dagGMwTnFtRUOjRsmRMabJ78ng+Ms8aFlkFuh0RklAOA61qtxu\nD4Ber9K4MCgZDN2gFXoCRD8hKqr8dT/bIslfbjuQtG/f/oYpLL788kvExcUhNjYW48czcdi0adMw\nbdo0AByQj4uLQ5kyZXD//ffn7sz279+PpKQkJCUlISEhIffcyyZfBCRFcgclJycHrVo9ALs9Bm53\nMvz8wrF169Z8jw0NjYHWBvgS7PZqBfbUvnTpkiQBrJXjD4jy3wXuOe+A0ejCkiVLkJmZCaPRBs74\n8uVYUivna4NJGaPBmVUGcEpvMDiA74e8tB+DBVja+3zmBcdOGgnQ3C9AZJVj28q1yoNjG7sEeF4B\nW00p4FjKVnCFeFOfsQGrNRBWq0vO4+uZTCkwm9UY0THYbE3x2GODbuXj/H8ltx1IkpOT4Xa70ahR\nozzV7XeDFAFJkdxp8Xq92Lx5M1atWnXFdPXVq1fDbg+C09kODkcCmjVrnye7yVcOHTokzMrw+akr\nyj8G3Lo2CMHBUTh69CgslkBR4lt8ji8mx16S30+A3V5OMN/WETD7sT+4gl09b7GATwg0KpjFcDpD\nUbFiDQGOGLDrawfYFTVSxrbJHBVwHxWAEwvc4Ip+NWU4BFq/+M/g719MqGciwGSZ7aDTOTFs2HD4\n+YXDZvNDt26p2LVrF1JTB6JDh4fx8cef3KpH+v9CClt3XrWyvaAAzd0Q1P7/UNl+OwQAeb3e3ESH\nIrk18scff9CmTZsoKCiIUlJSSK/Pv0FpVlYWBQZGUGbmx8StbA8SUXkiWkCcFGAmojZksayh8+fP\nUJkylWnvXj/igPxM4grzBnLODzIqyGQKJa/3LOXkRBEH03WkVd9/RRx0byGfnSCio2QwhJPB8De9\n+eaL9PjjI+nChR+J6BBxp8ctuXN2OBLo9dcH09q162nu3AAiekW+mUbcVvciEVWXe3iXiCaRw1Gc\nDIZMmjt3OnXs2IsuXpxH3NUxmJzOlTRnTiq1adOGiIj+/PNPKl++OmVk9CSvtwQpykSaPHkYpabe\n/WwId6Pc9sr2u1n+5dO/K+TNN9+GzeaGwWBC/fotcfr06Ts9pSIBsHLlStjtQXC5KsNq9YfR6ADR\nj7KL/xtEoUhO5mSWgwcPomrVFOj1Tuh0Dvj7F8eECRNhMDjFMjgOnW40YmLKY8KEl2GzRcBqfRAm\nUzSqVq2L5s3vh06nxlScYqVUBScGJMFgeBSBgcXhdN4n1z8MDqarrqhtsNn8cerUKfzyyy+w24PE\n9TVTLJivxOU1FkQh0OlcmDXrXezatQvnz5/HxYsXxbW1U8ZLh6KUwObNm3PXY9y4F2A09vOxnDYh\nPDzuTj2ef70Utu4scLTatWsDAOx2OxwOR54fp9NZqJO4USkCkpuT1atXQ1FKgOg3EF2A2dwTrVo9\neKenVSQip06dwqZNm3D48GEsWfIZzGZ/6HQ1QeSP2NgKV62m/umnnxAbWxGK4o+aNRvi4MGDAJj9\n9p133sGrr74KP78wARyL/JQWAIC4xeqB6D3Y7fVhNrvAPUX2QKNqKQciC0JDSyMioiyefHIoNmzY\ngCZN2iMuriL0+kY+yt8LIgcCA6NRunQVRESUxdCho+DxeDB37jzYbMFwudrCbo9G//6D89zLyJGj\nodcP8RnrVwQFRd+ytf+vy20Dkn+DFAHJzcmIESPB9N/qH+dBuN3hd3paRVKAHDx4EMuWLcP27dtv\neqyzZ8/C5QoBd3UEuMeKao34pvSOAtFIOJ1N0LNnH1itfjAaw8FV9kcFdPwkzrIdinIvHn/8GQCc\nWm02l4IWp9kPIjP0en8wDf1WKEodDB06CgDTJH300UfYuHHjZfPduXMnFCUI3JRsLRSlBoYMGXnT\n6/D/VW47kOzduze3mv3rr7/Ga6+9dte4P4qA5ObkjTfegM3WChpR3+eIiUm609Mqktsg27Ztg8uV\n4AMYAFfNq1lcXhCdBFE89PoOCAmJxunTp3Hs2DGUKVMdGifYcPyz2VNISCwATkRo3rwDjMby4HoV\nJ4zGQHDditpb5WcUL55wTXP+/vvvUbt2EyQk1MHzz08sMFmhSK4uha0784/2+Ui7du3IaDTS3r17\nKTU1lQ4dOkSdO3cuvCBNkdwx6dmzJ8XGHieHowHZbD1JUXrSrFlT7vS0iuQ2SHh4OGVlHSYO5BMR\nHZX/VyOd7jXS6YJIpytO/v7nqGbNv2j58o/J4XDQN998Q8WLB5HFMomIMogD84d9Rj6WS4uu0+no\n888X0uzZQ8nffx0ZjZ3J43mVmFa/OxFlE9Eq8ngu0N69e68438zMTKpWrRp9//1y2rnzOxoxYkiB\nyQpFcvvlqhQper2ejEYjffLJJzRgwAAaMGAAVapU6XbMrUhusSiKQps3r6MlS5bQmTNnqH79YVS6\ndOk7Pa0iuQ0SEhJC48ePpeHDq9PFixWI6GcyGKLI4dhBGzZsJIvFQn37PkGbNu2hnTttlJLSlGJj\nY2jPH
h15PInk8SwivT6IdDodmUwO8ngs5PGUIJvtdZo4UduM6PV6stls5PGUIY9nmnzakoiCiGgF\nEVkoPb0KJSXVonnzZlKbNq3zzPPUqVPUokUn2rz5OyIiGjZsOHk8ObR8+ToqViyEXn11HMXFxd2O\nJSuSK8nVTJbq1atj3rx5SEhIwP79+wEACQnXZorearmG6RdJkRTJFWT69Okwm+2wWErAZHLhkUcG\nAgDmz58vdO5qfGOxVKSrhJDboCj+yM7OxtGjRzFy5Gj07/8U1q1bd9k1PvroIzidzX3cXxckQywG\nKl080Y+w2wPg9XqRlZUFj+f/2rv3qKjL/IHjb+6XwAumxEVFBxiQywCC5M8wvKFmxxTbfmxba67H\nLK3st+bmyTQtD6m1v6zYytWMzcjKzX5WlrqZrAkWiqCJd1IQvIAXQAUCZp7fH8isyN1hYMjP65w5\nh/l+n+c7n+/j+P3M9/I8T41SqnZGSnv7J69/bqGysemj7O1HKPiXsrZ+TXXvflebRyQQnXCP5ODB\ng+qpp55SH3/8sVJKqdzc3BbHzOookkiEuHUGg+H6Dfe6HvQXlbNzf7V7926VmJiobGzm3XDwv6hq\nOxzWva9W1ta2zU5XXOfixYvqzjv7Xp8caoeqHQdsiKqdyOs/T3TZ2NirSZN+b5xe+pln5qlevfqp\n2k6PdcOw2Knauexr6zk7J6g1a9Z0QGv9trT3sbPFi4xBQUG8/vrrBAcHc/DgQfr27cvzzz9v7hMl\nIYSZXblyhfLyq0Ds9SVuWFsP5cSJE0RFReHg8DlwBlBYW7+NtbUDtVP8VmJr+yIREfe0aspnNzc3\nfvopFa32/7CyehTQAO9Qe2nrEABWVkm4uPRm69Yr6PWXqKkpYM2aHdjbO13/TKi9H2MF/GrctpVV\npXSktQQtZZodO3aofv36qZiYGBUTE6P69+/f6OlrZ2hF+EKIJhgMBtWnj4/6zxhdp5Szs4fKzs5W\nSin1yivLlJ2ds3J07KX8/MLU6tWrVc+ensra2lYNGTKyzZeUKisrVUzMuOvTE4cpNzdP5eDgquzt\nXVX//oOUr2+4qp0Aq+4sJVkNHz5Oubr2Ua6uU5SLS7Tq02eAcnaOVPCRsrV9Trm7+6hLly6Zo3l+\n09r72NniECkRERGsX78erVYLwLFjx0hISGDfvn0dkOaaJ0OkCGGazMxM4uIeoKrKkerqYpYvT2TO\nnNnG9eXl5ZSVleHu7m6cY0gp1ar5hhpjMBjYv38/v/76KzqdDjs7O65cuUKPHj0YP/53bNs2FKXm\nAmBn9zRPPOHA88//Dz/88AMuLi6MGTOGNWs+YPPmHXh792HJkhfw8PAwvSFuM+197GwxkYSGhnLg\nwIEWl3UGSSRCmK6yspK8vDz69OlDz549Oy2OY8eOER0dS3X1f2FldY0ePX4hKyuNO++8s9Ni+q3q\n8EQybdo0bGxseOSRR1BKkZKSgsFgYO3ate0WxK2SRCLEb0tRURFbt27Fzs6OCRMm4Orq2mKdsrIy\ntm7dilKKuLg4evToYdYY9+3bx+HDh9FqtURGRjZZzmAwkJubi52dHf3797/lszhz6PBEUllZyd/+\n9jfS0tIAiImJYdasWTg4OLRbELdKEokQt4+8vDzmzl1Ifv5ZRo8expIlC7hw4QKDB8dw5YofYI2z\n8yEyM3/A29vbLDG8+urrLF36BjY2Mej1aTz//GwWLZrfoFxJSQkjRtzPsWP5KFXF8OH/xZdffoK9\nvb1Z4mqrDk8klkwSiRC3h3PnzhEUFEVp6XT0+rtxclrJxIkeODo6kpLiSk3NCgBsbBby4IOFfPJJ\n+18xOXv2LAMHBlFZeZDaOevP4egYzNGj++jXr1+9slOnPsEnnyiqqt4FqnFymsyCBfeyYEHLT7wa\nDAasrKzMegbT3sfOJh//DQkJafIVGhrabgEIIURztm/fzoAB/ly65ItevxgYR0XFP/nssxT27Mmm\npuZuY1m9Ppq8vDNmiePs2bPY2/elNokA3IW9vQ9nz55tUHbv3gNUVT1C7SHWgYqK/yYjo/n7yuXl\n5Uya9DD29k44O3fn1Vdfa+9dMJsmHwL/6quvOjIOIYRooLS0lEmTEqis/B8g44Y1epSCY8dOYWv7\nBjU1YwArnJzeZvToe8wSS+3wQeeBzcAEYCtKnTY+0XqjQYP8OH78S6qr7wEMODpuJjh4ULPbf/rp\nv7B1azV6/UX0+gssXToWrVZDfHy8GfamfTV5RlJdXU1BQQE+Pj71XgUFBej1+o6MUQhxm8rNzcXa\n2hOYC5ygdrbFDdQeyGdQU7MXvT4LG5s7sbHpxcSJno3es2gPrq6ufPPN57i5PY6dnSs9ejzG119v\naPTmflLSCry9t+DqGo6LyyBCQopZsOAvzW5/27bvqaxcCLgAPpSXP8mWLTvMsi/trclE8uyzz9Kt\nW7cGy7t168azzz7bqo1v2bKFgIAA/Pz8WL58eYP1ly9fZvLkyeh0OqKjo8nJyWl1XSHEb5+XlxdV\nVfnAReAHaqcRfhwYCbwFeGBnZ01BQR7XrpXxyScfYGdnZ7Z4hg0bxoULBRQVnebSpTMMHz680XLu\n7u4cOrSHbdveY8eOFHbv/s44KnJT3N3dgWzje3v7bLy8+rRn+ObTVE/FwYMHN9mLsTWDNtbU1CiN\nRqNOnjypqqqqlE6nU4cOHapX5rnnnlMvv/yyUqp2UptRo0a1uu71hwRajEMI0bW9/fa7ysmpj+rW\nbbxydOytHBx6qtophH9RdnbPqIiIGGUwGDo7TJNlZGQoF5feysnpj+qOO8apAQOCVElJiVk+q72P\nnU3eIykpKWky+VRWVraYoDIyMvD19cXHxweAhIQENm3aRGBgoLHM4cOHmT+/9jRUq9Vy6tQpioqK\nyM3NbbGuEOL28NRTTxAXN5IjR47g7+9PeXk5U6c+xZkzLxIdHc2HH260qD4atyoqKoqDB/ewbds2\nnJycmDRpEi4uLp0dVqs0mUgiIyP5+9//zuOPP15v+erVqxk8eHCLGy4sLKRv377G997e3vz000/1\nyuh0OjZu3Mg999xDRkYGeXl5FBQUtKquEOL24e/vX2/ekZ9/Tu/EaMynf//+zJgxo7PDaLMmE8nK\nlSuZPHkyKSkpxsSRmZnJr7/+yhdffNHihlvzC2H+/PnMmTOH8PBwQkJCCA8Px8bGpk2/LhYvXmz8\nOzY2ltjY2FbXFUKI20Fqaiqpqalm236TieSuu+4iPT2dHTt2cPDgQaysrLj//vsZOXJkqzbs5eXF\n6dOnje9Pnz7doLepq6trvaFWBgwYgEajoaKiosW6dW5MJEKIrmHv3r088cQ8zp8vYsyYWJKSXmvx\nZrS4dTf/yF6yZEm7bt9sPdtramrQarVs374dT09PhgwZwvr16+vd5ygtLcXJyQl7e3tWr15NWloa\nycnJraoL0rNdiK4oLy+P4OAorl59DQjH0XEpcXE2bNq0vrNDu2
2097Gz5VlpbnXDtrYkJSUxduxY\n9Ho906dPJzAwkFWrVgEwc+ZMDh06xGOPPYaVlRXBwcG8//77zdYVQnR927Ztw2AYB0wFoLLyAzZv\ndkOv18skVV2UjLUlhOhQKSkpzJz5EdeufXt9SR4ODsFUVJS1+emrwsJCTp06hUaj4a677mr/YH+j\nOmysLSGEMIdJkybh7l6Avf1U4A2cnceycOHCNieRNWs+wM8vlAkT5jJwYBCffPKZeQIWLZIzEiFE\nhystLeXNN9+moOA8Y8fGMmXKlDbVLywsxM8vlIqK3YA/cAAnp1gKC3M7dXKurqLL3CMRQoimdO/e\nnUWLXrzl+idPnsTe3p+Kirq+JaHY2nqQn58viaQTyKUtIUSXo9FoqKo6Bvx8fUkGev05+vfv35lh\n3bYkkQghuhwPDw/WrPkbTk734uoajLPzeD7++AOzT7MrGif3SIQQXdalS5fIz8/Hx8dHkkgbyFS7\nN5BEIoQQbSeP/wohhLAokkiEEF3G1atXKS0t7ewwxE0kkQghLJ5er+eRR2bQs2cfevf2Yty4eCoq\nKjo7LHGdJBIhhMX73/99iy++OE5NTRHV1Zf497+t+ctfFnV2WOI6SSRCCIuXmvoT5eUzABfAnsrK\nWezcKZPdWQpJJEIIi+fn1w97+38DtU8a2djsZODAfp0blDCSx3+FEBbv8uXLREXFUlTkCjji7PwL\ne/b8u96U3KL1pB/JDSSRCHH7qKioYMeOHej1eu699166devW2SF1WZJIbiCJRAgh2k46JAohhLAo\nkkiEEEKYxKyJZMuWLQQEBODn58fy5csbrL9w4QLjxo0jLCyM4OBgkpOTjet8fHwIDQ0lPDycIUOG\nmDNMIYQQJjDbPRK9Xo9Wq+W7777Dy8uLqKgo1q9fT2BgoLHM4sWL+fXXX3n11Ve5cOECWq2W8+fP\nY2try4ABA8jMzMTNza3p4OUeiRBCtFmXuUeSkZGBr68vPj4+2NnZkZCQwKZNm+qV8fDwoKysDICy\nsjJ69eqFre1/Jm2UJCGEEJbPbImksLCw3jPe3t7eFBYW1iszY8YMcnJy8PT0RKfT8eabbxrXWVlZ\nMXr0aCIjI1m9erW5whRCCGEis83ZbmVl1WKZxMREwsLCSE1NJTc3lzFjxrB//35cXV1JS0vDw8OD\n4uJixowZQ0BAADExMeYKVwghxC0yWyLx8vLi9OnTxvenT5/G29u7Xpn09HQWLFgA1M7BPGDAAI4e\nPUpkZCQeHh4A9O7dm8mTJ5ORkdFoIlm8eLHx79jYWGJjY9t/Z4QQogtLTU0lNTXVbNs32832mpoa\ntFot27dvx9PTkyFDhjS42f7nP/+Z7t2789JLL3H+/HkGDx7MgQMHcHR0RK/X4+rqyrVr14iLi+Ol\nl14iLi6ufvBys10IIdqsvY+dZjsjsbW1JSkpibFjx6LX65k+fTqBgYGsWrUKgJkzZ/LCCy8wbdo0\ndDodBoOBFStW4Obmxi+//EJ8fDxQm5D+8Ic/NEgiQgghLIMMkSKEELeZLvP4rxBCiNuDJBIhhBAm\nkUQihBDCJJJIhBBCmEQSiRBCCJNIIhFCCGESSSRCCCFMIolECCGESSSRCCGEMIkkEiGEECaRRCKE\nEMIkkkiEEEKYRBKJEEIIk0giEUIIYRJJJEIIIUwiiUQIIYRJJJEIIYQwiSQSIYQQJpFEIoQQwiRm\nTSRbtmwhICAAPz8/li9f3mD9hQsXGDduHGFhYQQHB5OcnNzqukIIISyD2RKJXq/nqaeeYsuWLRw6\ndIj169dz+PDhemWSkpIIDw8nOzub1NRU5s6dS01NTavqdiWpqamdHUKrSJztqyvE2RViBInT0pkt\nkWRkZODr64uPjw92dnYkJCSwadOmemU8PDwoKysDoKysjF69emFra9uqul1JV/lySZztqyvE2RVi\nBInT0pktkRQWFtK3b1/je29vbwoLC+uVmTFjBjk5OXh6eqLT6XjzzTdbXVcIIYRlMFsisbKyarFM\nYmIiYWFhnDlzhuzsbGbPns2VK1fMFZIQQghzUGaye/duNXbsWOP7xMREtWzZsnplxo8fr3bt2mV8\nP3LkSLVnz55W1VVKKY1GowB5yUte8pJXG146na5dj/e2mElkZCTHjx/n1KlTeHp68umnn7J+/fp6\nZQICAvjuu+8YNmwY58+f5+jRowwcOJBu3bq1WBfgxIkT5gpfCCFEK5ktkdja2pKUlMTYsWPR6/VM\nnz6dwMBAVq1aBcDMmTN54YUXmDZtGjqdDoPBwIoVK3BzcwNotK4QQgjLY6WUUp0dhBBCiK7LYnu2\nd5XOjKbE6ePjQ2hoKOHh4QwZMqTTYrx8+TKTJ09Gp9MRHR1NTk5Oq+taSpwd1ZZ/+tOfcHd3JyQk\npMkyzzzzDH5+fuh0OrKysozLO7ItTYmzo9qyNXEeOXKEoUOH4ujoyF//+td66yypPZuL05LaMyUl\nBZ1OR2hoKMOGDePAgQPGdSa1Z7vecWknNTU1SqPRqJMnT6qqqiql0+nUoUOH6pV56aWX1Pz585VS\nShUXFys3NzdVXV3dqrqWEKdSSvn4+KiLFy+aJba2xPjcc8+pl19+WSml1JEjR9SoUaNaXdcS4lSq\nY9pSKaV27typ9u3bp4KDgxtdv3nzZjV+/HillFI//vijio6OVkp1bFuaEqdSHdeWrYmzqKhI7dmz\nRy1YsEC9/vrrxuWW1p5NxamUZbVnenq6KikpUUop9e2337bb99Miz0i6SmdGU+Kso8x8ZbE1MR4+\nfJgRI0YAoNVqOXXqFEVFRRbXlo3FWVxcbFxv7rYEiImJoWfPnk2u//LLL5k6dUEHU3gAAAnxSURB\nVCoA0dHRlJSUcO7cuQ7vZHsrcZ4/f964viPaElqOs3fv3kRGRmJnZ1dvuaW1Z1Nx1rGU9hw6dCjd\nu3cHav/dCwoKANPb0yITSVfpzGhKnFDb12b06NFERkayevXqTotRp9OxceNGoPYLlZeXR0FBgcW1\nZVNxQse0ZWs0tR9nzpyxqE62zbW3pbRlc7pSp2VLbc/333+f++67DzC9Pc321JYp2tKZMTU1ldzc\nXMaMGcP+/fs7ILr/MCVOV1dX0tLS8PDwoLi4mDFjxhAQEEBMTEyHxzh//nzmzJlDeHg4ISEhhIeH\nY2Nj06q67cWUOAF27dqFp6enWduytTrq16epmorTktqyKR353TRVR/w/b6sdO3awdu1a0tLSANPb\n0yLPSLy8vDh9+rTx/enTp/H29q5XJj09nd/97ncAaDQaBgwYwNGjR/H29m6xriXECbWXvaD2tHjy\n5MlkZGR0Soyurq6sXbuWrKwsPvzwQ4qLi9FoNK2q29lxDhw4EABPT0/AvG3ZGjfvR0FBAd7e3h3a\nlq3RWJxeXl6A5bRlcyytPZvTEf/P2+LAgQPMmDGDL7/80ngZzNT2tMhEcmNnxqqqKj799FMmTpxY\nr0xdZ0agXmfG1tS1h
DjLy8uNw8Fcu3aNbdu2NfuEjTljLC0tpaqqCoDVq1dz77334uLiYnFt2VSc\nHdWWrTFx4kQ+/PBDAH788Ud69OiBu7t7h7alKXFaUlve6OazJ0trzzo3x2lp7Zmfn098fDwfffQR\nvr6+xuUmt+etPx9gXt98843y9/dXGo1GJSYmKqWUeu+999R7772nlKp9Aur+++9XoaGhKjg4WKWk\npDRb19LizM3NVTqdTul0OhUUFGTWOFuKMT09Xfn7+yutVqumTJlifKqjqbqWFucvv/zSYW2ZkJCg\nPDw8lJ2dnfL29lbvv/9+vRiVUmr27NlKo9Go0NBQlZmZ2ez+WVqcHfm9bE2cZ8+eVd7e3qpbt26q\nR48eqm/fvurKlStKKctqz6bitLT2nD59unJzc1NhYWEqLCxMRUVFGeua0p7SIVEIIYRJLPLSlhBC\niK5DEokQQgiTSCIRQghhEkkkQgghTCKJRAghhEkkkQghhDCJJBLRaQoKCnjggQfw9/fH19eXZ599\nlurq6nb/nFWrVrFu3bp22VZSUhK+vr5YW1tz6dKleuuaGpb9ZsuWLePjjz9usPzbb78lKiqKoKAg\nIiIieO655wBYvHhxg6HJW+Orr75qt+HV9+/fz9ChQwkNDWXixInGTnZCAJbbIVH8thkMBhUVFaWS\nk5OVUkrp9Xo1ffp0NW/evE6OrHlZWVnq1KlTDYYGb25Y9puNGDFCXbhwod6yn3/+WWk0GnX06FGl\nVG171HUiW7x4cYOhyTtaZGSk2rlzp1JKqbVr16qFCxd2ajzCssgZiegU33//PU5OTsahzK2trXnj\njTdYu3YtFRUVJCcn88ADDzBixAj8/f15+eWXjXU/+ugjoqOjCQ8P54knnsBgMADg4uLCiy++SFhY\nGEOHDqWoqAio/4s+NjaW+fPnEx0djVarZdeuXUDtUBYPPfQQQUFBxMfHc/fdd5OZmdkg7rCwMPr3\n799geUvDstcpKyujqqqKXr161Vu+YsUKXnzxRfz9/Y3tMXPmzAb1s7Ozufvuu9HpdMTHx1NSUgLA\nW2+9RVBQEDqdjocffhiA5ORknn76aQAee+wx5syZw7Bhw9BoNHz++ecAGAwGZs2aRWBgIHFxcUyY\nMMG47kbHjx83DjQ4evToRsuI25ckEtEpcnJyGDx4cL1lrq6u9OvXjxMnTgCwZ88eNm7cyIEDB9iw\nYQOZmZkcPnyYzz77jPT0dLKysrC2tiYlJQWoTQZDhw4lOzub4cOHG4fstrKyMo5uamVlhV6v56ef\nfmLlypUsWbIEgHfeeYdevXqRk5PDK6+8QmZmZptGRG1sGO66Ie5v9N133zF69OhWtceN6mL54x//\nyGuvvcb+/fsJCQkxxr98+XKys7PZv38/7733Xr06dc6dO0daWhpff/018+fPB2Djxo3k5eVx+PBh\n1q1bx+7duxvd76CgIOP8FBs2bKg3wJ8QkkhEp2juIF134I+Li6Nnz544OjoSHx/Prl272L59O5mZ\nmURGRhIeHs7333/PyZMnAbC3t2fChAkADB48mFOnTjW6/fj4eAAiIiKMZdLS0khISABqD5qhoaFt\n3id102hDje3j1q1bGT9+fJu3DbVnM6WlpcYzg6lTp7Jz504AQkNDefjhh0lJSTEOrX9zLJMmTQIg\nMDDQeLa0a9cuHnroIQDc3d2NE4fdbO3atbzzzjtERkZy9epV7O3tb2kfxG+TJBLRKQYNGtTg0lFZ\nWRn5+fn4+vo2OCgrpYwH5qlTp5KVlUVWVhZHjhxh0aJFAPVmp7O2tqampqbRz3ZwcADAxsamXpmb\nP7MtmhuW/UYZGRmNztsdFBTE3r172/SZN8a7efNmZs+ezb59+4iKikKv1zfYnxsP/nXrrKysWrXf\nWq2WrVu3snfvXhISEtBoNG2KVfy2SSIRnWLUqFGUl5cbn6bS6/XMnTuXadOm4ejoCMC//vUvLl++\nTEVFBZs2beKee+5h1KhR/POf/zROsXvp0iXy8/Ob/SylVIsHy2HDhvHZZ58BcOjQIX7++ecW9+HG\nbTY1LPuNcnJyCAgIaPRMZd68eSQmJnL8+HGg9t7FqlWr6sXfrVs3evbsabyvs27dOmJjY1FKkZ+f\nT2xsLMuWLaO0tJSrV6+2GH/dfn/++ecopTh//jypqamNlqtrb4PBwNKlS3nyySdbtX1xe5BEIjrN\nF198wYYNG/D390er1eLs7ExiYiJQ+0t5yJAhTJkyBZ1Ox4MPPkhERASBgYEsXbqUuLg4dDodcXFx\nnDt3zlinzs33RZq6lFa3fNasWRQXFxMUFMTChQsJCgoyzm19o7feeou+fftSWFhIaGgojz/+OAD3\n3XcfAwcOxNfXl5kzZ/LOO+80qPvtt982eVkrJCSElStX8vvf/55BgwYREhJivGR3Y/z/+Mc/mDdv\nHjqdjgMHDrBo0SJqamp49NFHCQ0NJSIigjlz5tC9e/cG+93Y31OmTMHb25tBgwbx6KOPEhER0eh+\nr1+/Hq1WS2BgIN7e3jz22GON7oe4Pckw8sIiJScnk5mZydtvv90hn2cwGKiursbBwcE4JfKxY8ew\ntW2/2ajj4uJYt25dgzOVznbt2jXuuOMOLl68SHR0NOnp6fTp06ezwxJdiEXO2S5Ec2cR5nDt2jVG\njhxJdXU1Sinefffddk0iANu2bWvX7bWX+++/n5KSEqqqqli0aJEkEdFmckYihBDCJHKPRAghhEkk\nkQghhDCJJBIhhBAmkUQihBDCJJJIhBBCmEQSiRBCCJP8Py+g205UbSbvAAAAAElFTkSuQmCC\n", "text": [ "" ] } ], "prompt_number": 15 }, { "cell_type": "markdown", "metadata": {}, "source": [ "Please see the graphs below for a sample of the same correlations (these graphs are created in ggplot2 for presentation for our website and our video).\n", "\n", "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As a result of the above findings, one can certainly make an argument that there are strong market correction effects at play. Hence, it may not make sense that we are using 9 days of opening and closing prices to make predictions if only the last 3 days, for instance, are most pertinent to making a prediction. 
However, when running our model, we see that including information from all the days of stock prices that we are given does tend to improve the model. As such, we do include the previous 9 days of data in creating our model." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Selecting the tuning parameters and evaluating our models" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We now run our 4 individual models on our 6 different datasets in order to evaluate the performance of the models and select their relevant tuning parameters.\n", "\n", "Below, we include two helper functions by Ben Hamner of Kaggle. We use AUC as our cross-validation metric since it has a neat probabilitistic interpration in binary predictions without being sensitive to the relative sizes of the classes." ] }, { "cell_type": "code", "collapsed": false, "input": [ "def tied_rank(x):\n", " \"\"\"\n", " This function is by Ben Hamner and taken from https://github.com/benhamner/Metrics/blob/master/Python/ml_metrics/auc.py\n", "\n", " Computes the tied rank of elements in x.\n", "\n", " This function computes the tied rank of elements in x.\n", "\n", " Parameters\n", " ----------\n", " x : list of numbers, numpy array\n", "\n", " Returns\n", " -------\n", " score : list of numbers\n", " The tied rank f each element in x\n", "\n", " \"\"\"\n", " sorted_x = sorted(zip(x,range(len(x))))\n", " r = [0 for k in x]\n", " cur_val = sorted_x[0][0]\n", " last_rank = 0\n", " for i in range(len(sorted_x)):\n", " if cur_val != sorted_x[i][0]:\n", " cur_val = sorted_x[i][0]\n", " for j in range(last_rank, i): \n", " r[sorted_x[j][1]] = float(last_rank+1+i)/2.0\n", " last_rank = i\n", " if i==len(sorted_x)-1:\n", " for j in range(last_rank, i+1): \n", " r[sorted_x[j][1]] = float(last_rank+i+2)/2.0\n", " return r\n", "\n", "def auc(actual, posterior):\n", " \"\"\"\n", " This function is by Ben Hamner and taken from https://github.com/benhamner/Metrics/blob/master/Python/ml_metrics/auc.py\n", " \n", " Computes the area under the receiver-operater characteristic (AUC)\n", "\n", " This function computes the AUC error metric for binary classification.\n", "\n", " Parameters\n", " ----------\n", " actual : list of binary numbers, numpy array\n", " The ground truth value\n", " posterior : same type as actual\n", " Defines a ranking on the binary numbers, from most likely to\n", " be positive to least likely to be positive.\n", "\n", " Returns\n", " -------\n", " score : double\n", " The mean squared error between actual and posterior\n", "\n", " \"\"\"\n", " r = tied_rank(posterior)\n", " num_positive = len([0 for x in actual if x==1])\n", " num_negative = len(actual)-num_positive\n", " sum_positive = sum([r[i] for i in range(len(r)) if actual[i]==1])\n", " auc = ((sum_positive - num_positive*(num_positive+1)/2.0) /\n", " (num_negative*num_positive))\n", " sys.stdout.write('.')\n", " return auc" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 47 }, { "cell_type": "markdown", "metadata": {}, "source": [ "After processing the data and gathering an X and a y, we want to find the cross validation scores of the best model. We do this by calculating the cross-validation scores for a number of tuning parameters, and then selecting the model/tuning parameter that leads to the best cross-validation score." 
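, "\n", "As a side note, the AUC computed by the helper above has a handy probabilistic interpretation: it is the probability that a randomly chosen positive example (a day on which the stock closed above its open) receives a higher predicted score than a randomly chosen negative example. A quick sanity check on made-up scores (not from our data; this sketch assumes numpy is imported as np, as it is elsewhere in this notebook, and that there are no tied scores):\n", "\n", "```python\n", "actual = np.array([1, 0, 1, 1, 0, 0])\n", "scores = np.array([0.9, 0.2, 0.3, 0.8, 0.4, 0.1])\n", "print auc(list(actual), list(scores))  # AUC from the helper above: 8/9\n", "pairs = [(p, n) for p in scores[actual == 1] for n in scores[actual == 0]]\n", "print np.mean([p > n for p, n in pairs])  # fraction of correctly ranked (positive, negative) pairs: also 8/9\n", "```"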
] }, { "cell_type": "code", "collapsed": false, "input": [ "import sklearn.linear_model as lm\n", "from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier\n", "from sklearn import metrics,preprocessing,cross_validation\n", "\n", "################################################################################\n", "# Choose model here\n", "modelname = \"ridge\"\n", "################################################################################\n", "\n", "################################################################################\n", "# The following defines the list of models, all of which we want to test via\n", "# cross validation. C here is the list of tuning parameters that we want to test\n", "# with a list comprehension we make a list of all of the tuning parameters\n", "################################################################################\n", "\n", "if modelname == \"ridge\": \n", " C = np.linspace(500, 5000, num = 10)\n", " models = [lm.LogisticRegression(penalty = \"l2\", C = c) for c in C]\n", "\n", "if modelname == \"lasso\": \n", " C = np.linspace(500, 5000, num = 10)\n", " models = [lm.LogisticRegression(penalty = \"l1\", C = c) for c in C]\n", "\n", "if modelname == \"randomforest\":\n", " C = np.linspace(10, 200, num = 20)\n", " models = [RandomForestClassifier(n_estimators = int(c)) for c in C]\n", "\n", "if modelname == \"gbt\":\n", " C = np.linspace(10, 200, num = 20)\n", " models = [GradientBoostingClassifier(n_estimators = int(c)) for c in C]\n", "\n", "# auc_scorer function is a helper function that takes the given input and uses it to call the auc\n", "# function to calculate the AUC. It does this by calculating the predicted values and passing in the\n", "# predictions to the auc function along with the actual y values.\n", "def auc_scorer(estimator, X, y):\n", " predicted = estimator.predict_proba(X)[:,1]\n", " return auc(y, predicted)\n", "\n", "################################################################################\n", "# This loop here calculates the CV scores for all of the models.\n", "# we choose AUC as our CV metric since it has a neat probabilistic interpretation\n", "# without being sensitive to the size of the groups.\n", "################################################################################\n", "\n", "print \"calculating cv scores\"\n", "cv_scores = [0] * len(models)\n", "for i, model in enumerate(models):\n", " cv_scores[i] = np.mean(cross_validation.cross_val_score(model, X, y, cv=5, scoring = auc_scorer))\n", " print \" (%d/%d) C = %f: CV = %f\" % (i + 1, len(C), C[i], cv_scores[i])\n", "\n", "################################################################################\n", "# find which model and C is the best\n", "# we set the best model to best_model and now can use it for our predictions.\n", "################################################################################\n", "\n", "best = cv_scores.index(max(cv_scores))\n", "best_model = models[best]\n", "best_cv = cv_scores[best]\n", "best_C = C[best]\n", "print \"BEST %f: %f\" % (best_C, best_cv)" ], "language": "python", "metadata": {}, "outputs": [ { "output_type": "stream", "stream": "stdout", "text": [ "calculating cv scores\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (1/10) C = 500.000000: CV = 0.946531\n", "." 
] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (2/10) C = 1000.000000: CV = 0.946524\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (3/10) C = 1500.000000: CV = 0.946520\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (4/10) C = 2000.000000: CV = 0.946525\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (5/10) C = 2500.000000: CV = 0.946526\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (6/10) C = 3000.000000: CV = 0.946525\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (7/10) C = 3500.000000: CV = 0.946524\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (8/10) C = 4000.000000: CV = 0.946525\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (9/10) C = 4500.000000: CV = 0.946526\n", "." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ ".." ] }, { "output_type": "stream", "stream": "stdout", "text": [ " (10/10) C = 5000.000000: CV = 0.946524\n", "BEST 500.000000: 0.946531\n" ] } ], "prompt_number": 51 }, { "cell_type": "markdown", "metadata": {}, "source": [ "##Blending models together via Platt Scaling\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Testing out our four different individual models (LASSO, Ridge, Random Forest, Gradient Boosted Trees), we had the hypothesis that combining the models into some sort of ensemble would let us have better predictive performance in terms of AUC. We came to this conclusion because of our understanding of ensemble methods from CS 181 and CS 109. Ensemble methods (like random forests from week 10), help average out the bias and reduce the variance of our models to achieve better predictive performance.\n", "\n", "We decided to use Platt Scaling, a method demonstrated by John Platt in 1999. (paper at http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.1639). In this paper, he created binary probability estimates via logistic regression on the inputs from classification models (Platt uses SVM). We will do the same, and place the probability estimates of each of our 4 models into a logistic regression model, to form a blended model that takes in contributions from all of the input models.\n", "\n", "To do this, we needed to create out-of-sample estimates for each of our four models (e.g. 
the probability estimates for a stock cannot have been trained on the actual increase indicator for that stock), else our input would be subject to overfitting, and our model would reach inflated CV scores due to the actual answer being included in the training of the predictors. We have an intuition for that due to our exposure in CS 109. Below is the code that we use to generate those out-of-sample estimates." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# Code that determines which models we want to blend\n", "# The tuning parameters are seleted\n", "# with cross-validation (we show the code in a previous cell\n", "\n", "# The below model is the blend of all four of our models\n", "models = [lm.LogisticRegression(penalty='l2', C = 5000),\n", " lm.LogisticRegression(penalty='l1', C = 500),\n", " RandomForestClassifier(n_estimators = 100),\n", " GradientBoostingClassifier(n_estimators = 200),\n", " ]\n", "# The below model is the blend of Ridge Regression and Random Forest\n", "#models = [lm.LogisticRegression(penalty='l2', C = 5000),\n", "# RandomForestClassifier(n_estimators = 100)\n", "# ]\n", "\n", "\"\"\"\n", "Function\n", "--------\n", "\n", "This function gives a matrix of predictors \n", "(predicted probability that the stock will go up from each of the models)\n", "and a matrix of responses (did the stock go up or down) that will be used\n", "as input in the logistic regression blender.\n", "\n", "Parameters\n", "----------\n", "models : list\n", " list of models that we want to stack\n", "X : (N_stocks, N_predictors) matrix\n", " matrix of predictors that are input into the above list of models\n", "y : (N_stocks, 1) matrix\n", " matrix of responses that the list of models are trained on\n", "folds : int\n", " folds used in the same way as cross-validation. 
We break up the \n", " data into this many equally-sized chunks, and then for each chunk,\n", " we use the rest of the chunks to make predictions about it\n", "\n", "Returns\n", "-------\n", "new_X : (N_stocks, N_models)\n", " New predictor matrix, where the predictors for each stock are the \n", " out of sample predicted probabilities of stock inrease from \n", " each of the models in the list of models\n", "new_Y : same contents as y, but will be reordered due to the process for how\n", " the new_X is calculated\n", "\n", "\"\"\"\n", "\n", "def get_oos_predictions(models, X, y, folds = 10):\n", " \n", " # this is simply so we know how far the model has progressed\n", " sys.stdout.write('.')\n", " predictions = [[] for model in models]\n", " new_Y = []\n", " \n", " # for every fold of the data...\n", " for i in range(folds):\n", " \n", " # find the indices that we want to train and predict\n", " indxs = np.arange(i, X.shape[0], folds)\n", " indxs_to_fit = list(set(range(X.shape[0])) - set(np.arange(i, X.shape[0], folds)))\n", " \n", " # put together the predictions for each model\n", " for i, model in enumerate(models):\n", " predictions[i].extend(list(model.fit(X[indxs_to_fit,:], y[indxs_to_fit,:]).predict_proba(X[indxs,:])[:,1]))\n", " \n", " # put together the reordered new_Y\n", " new_Y = new_Y + list(y[indxs,:])\n", " \n", " # format everything for return\n", " new_X = np.hstack([np.array(prediction).reshape(len(prediction), 1) for prediction in predictions])\n", " new_Y = np.array(new_Y).reshape(len(new_Y), 1)\n", " return new_X, new_Y\n", "\n", "# run the code and get the new_X and new_Y estimates.\n", "new_X, new_Y = get_oos_predictions(models, X, y)" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": "*" }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now with the matrix of predicted probabilities from all of our models (new_X), and a reordered responses (new_Y), we can make our blended model, assess it via cross validation, fit a model for prediction, and examine the coefficients that our blended model gives to each of the input models." ] }, { "cell_type": "code", "collapsed": false, "input": [ "# define our Platt Scaling model that uses logistic regression to combine the predicted probabilities for each moel\n", "model_stacker = lm.LogisticRegression()\n", "\n", "# calculate the cross-validation score for our blended models. This is what we use in the 6x6 visualization below\n", "print mean(cross_validation.cross_val_score(model_stacker, new_X, new_Y.reshape(new_Y.shape[0]), cv=5, scoring = auc_scorer))\n", "\n", "# we fit the model so that we are able to make predictions using our new blended model\n", "model_stacker.fit(new_X, new_Y.reshape(new_Y.shape[0]))\n", "\n", "# we see what weights the blended model assigns to the probability predictions of each model.\n", "print model_stacker.coef_" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For exampale, for the industrials dataset we got that the weights in logistic regression assigned to Ridge, LASSO, RF, and GBT were 4.96007394, -0.35052333, 2.10430668, 0.59764035 respectively. If we rescale these to become proportions instead of slopes, we get that each model gets assigned a 68%, -5%, 29%, 8% weight respectively.\n", "\n", "We were curious ourselves about how the LASSO model managed to get a negative weight. 
We did research by asking on Quora (http://qr.ae/GO95p) and received a helpful answer from another student who explained it to us. We learned that this was because **the bias of the LASSO model was positively correlated with the biases from at least one of the other models**. Thus a blended model that assigned *negative* weight to LASSO and positive weight to another model whose biases were positively correlated with those of LASSO helped us correct the biases and led to a better model.\n", "\n", "We also ran the blended model for a blend of Ridge and RF. Now with our 6 datasets, 4 individual models, and 2 blended models, we can visualize our cross-validation scores for each model in the heatmap below." ] }, { "cell_type": "heading", "level": 2, "metadata": {}, "source": [ "Visualizing the cross-validation scores" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We wanted a fun way to present our cross-validation scores such that they would be as interesting as our analysis and our research problem. We wanted a way to avoid just showing a matrix of numbers, so we decided to go with a heatmap made in d3 via the plotly API. After calculating the AUC scores for all 6 models on all 6 datasets, we plugged them into the plotly API to visualize our results. **Please view our heatmap of cross-validation scores at https://plot.ly/~william/15**.\n", "\n", "Other than being pretty to look at, we see that the blended models perform the best, and that the historical data is the easiest to predict. The modern data all still has around 90% AUC, which is fairly good. The differences between our abilities to predict the various sectors are minute, and random forest performed the worst out of all of the individual models. Ridge regression performed better than the other individual models, probably because the features were fairly linear with respect to the logit of the probability. There was no sparseness in the feature set, so LASSO did not do as well as Ridge." ] }, { "cell_type": "code", "collapsed": false, "input": [ "import plotly\n", "\n", "# I am using the plotly python API to make my plots\n", "py = plotly.plotly(username=\"william\", key=\"99tin0h4pk\")\n", "\n", "# create the x and y axes for our plot\n", "x = [\"Logistic Ridge Regression\",\n", " \"Logistic LASSO Regression\",\n", " \"Random Forest\",\n", " \"Gradient Boosted Trees\",\n", " \"Ridge-RF Ensemble\",\n", " \"Ridge-RF-LASSO-GBT\"]\n", "y = [\"Consumer Disc. (12 - '13)\",\n", " \"Financials (12-'13)\",\n", " \"Healthcare (12 - '13)\",\n", " \"Industrials (12-'13)\",\n", " \"Info. 
Tech (12 - '13)\",\n", " \"Historical (92 - '93)\"]\n", "y.reverse()\n", "\n", "# the data from our runs of the 6 models on the 6 datasets\n", "z = [[0.946526, 0.944749, 0.922878, 0.940487, 0.947485435127, 0.947502046881],\\\n", " [0.912866, 0.904732, 0.891061, 0.905027, 0.918906923113, 0.919287994738],\\\n", " [0.909196, 0.908183, 0.88349, 0.90314, 0.912065493042, 0.912012745436], \\\n", " [0.918751, 0.913991, 0.895197, 0.912037, 0.925478662886, 0.925984227975], \\\n", " [0.912278, 0.906195, 0.895253, 0.906869, 0.9161297007, 0.916070264898], \\\n", " [0.915872, 0.913978, 0.891309, 0.908664, 0.917582810704, 0.917916579234]]\n", "\n", "# prepare what I am sending to plotly\n", "data = {'x': x,'y': y,'z': z, 'type': 'heatmap'}\n", "\n", "# send plot to plotly and return URL\n", "print \"Created a new plot at \" + str(py.plot([data])['url'])\n" ], "language": "python", "metadata": {}, "outputs": [ { "output_type": "stream", "stream": "stdout", "text": [ "\n", "\n", "\n", "Created a new plot at https://plot.ly/~william/26\n" ] } ], "prompt_number": 54 }, { "cell_type": "markdown", "metadata": {}, "source": [ "For the same plot with tuned formatting, please see\n", "\n", "" ] }, { "cell_type": "heading", "level": 2, "metadata": {}, "source": [ "Understanding our models with calibration plots" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "After creating our 6 models (Ridge, LASSO, RF, Gradient Boosted Trees, Ridge-RF Blend, All blend) and calculating the out of sample testing error on all 6 of our datasets, we have found that the blended models do better than the individual models. Interestingly, the blend of the Ridge and Random Forest models perform just as well as the blend between all four individual models.\n", "\n", "To better understand tbe performance of each of our models and to understand why the blends performed better, we produce calibration plots for each of our models. We trained the models on the training set and produced the calibration model for the test set to understand out of sample prediction. The following code is not ours, but instead taken from the official solutions to CS 109 HW3.\n" ] }, { "cell_type": "code", "collapsed": false, "input": [ "################################################################################\n", "# The following code is from CS 109 HW 3 official solutions\n", "################################################################################\n", "\n", "# set some nicer defaults for matplotlib\n", "from matplotlib import rcParams\n", "\n", "#these colors come from colorbrewer2.org. 
Each is an RGB triplet\n", "dark2_colors = [(0.10588235294117647, 0.6196078431372549, 0.4666666666666667),\n", " (0.8509803921568627, 0.37254901960784315, 0.00784313725490196),\n", " (0.4588235294117647, 0.4392156862745098, 0.7019607843137254),\n", " (0.9058823529411765, 0.1607843137254902, 0.5411764705882353),\n", " (0.4, 0.6509803921568628, 0.11764705882352941),\n", " (0.9019607843137255, 0.6705882352941176, 0.00784313725490196),\n", " (0.6509803921568628, 0.4627450980392157, 0.11372549019607843),\n", " (0.4, 0.4, 0.4)]\n", "\n", "rcParams['figure.figsize'] = (10, 6)\n", "rcParams['figure.dpi'] = 150\n", "rcParams['axes.color_cycle'] = dark2_colors\n", "rcParams['lines.linewidth'] = 2\n", "rcParams['axes.grid'] = False\n", "rcParams['axes.facecolor'] = 'white'\n", "rcParams['font.size'] = 14\n", "rcParams['patch.edgecolor'] = 'none'\n", "\n", "\n", "def remove_border(axes=None, top=False, right=False, left=True, bottom=True):\n", " \"\"\"\n", " Minimize chartjunk by stripping out unnecesary plot borders and axis ticks\n", " \n", " The top/right/left/bottom keywords toggle whether the corresponding plot border is drawn\n", " \"\"\"\n", " ax = axes or plt.gca()\n", " ax.spines['top'].set_visible(top)\n", " ax.spines['right'].set_visible(right)\n", " ax.spines['left'].set_visible(left)\n", " ax.spines['bottom'].set_visible(bottom)\n", " \n", " #turn off all ticks\n", " ax.yaxis.set_ticks_position('none')\n", " ax.xaxis.set_ticks_position('none')\n", " \n", " #now re-enable visibles\n", " if top:\n", " ax.xaxis.tick_top()\n", " if bottom:\n", " ax.xaxis.tick_bottom()\n", " if left:\n", " ax.yaxis.tick_left()\n", " if right:\n", " ax.yaxis.tick_right()\n", " \n", "\"\"\"\n", "Function\n", "--------\n", "calibration_plot\n", "\n", "Builds a plot like the one above, from a classifier and review data\n", "\n", "Inputs\n", "-------\n", "clf : Classifier object\n", " A MultinomialNB classifier\n", "X : (Nexample, Nfeature) array\n", " The bag-of-words data\n", "Y : (Nexample) integer array\n", " 1 if a review is Fresh\n", "\"\"\" \n", "\n", "def calibration_plot(clf, xtest, ytest, title = \"\"):\n", " prob = clf.predict_proba(xtest)[:, 1]\n", " outcome = ytest\n", " data = pd.DataFrame(dict(prob=prob, outcome=outcome))\n", "\n", " #group outcomes into bins of similar probability\n", " bins = np.linspace(0, 1, 20)\n", " cuts = pd.cut(prob, bins)\n", " binwidth = bins[1] - bins[0]\n", " \n", " #freshness ratio and number of examples in each bin\n", " cal = data.groupby(cuts).outcome.agg(['mean', 'count'])\n", " cal['pmid'] = (bins[:-1] + bins[1:]) / 2\n", " cal['sig'] = np.sqrt(cal.pmid * (1 - cal.pmid) / cal['count'])\n", " \n", " #the calibration plot\n", " ax = plt.subplot2grid((3, 1), (0, 0), rowspan=2)\n", " p = plt.errorbar(cal.pmid, cal['mean'], cal['sig'])\n", " plt.plot(cal.pmid, cal.pmid, linestyle='--', lw=1, color='k')\n", " plt.ylabel(\"Empirical P(Stock increase)\")\n", " plt.ylim((0, 1))\n", " remove_border(ax)\n", " plt.title(title)\n", " \n", " #the distribution of P(fresh)\n", " ax = plt.subplot2grid((3, 1), (2, 0), sharex=ax)\n", " \n", " plt.bar(left=cal.pmid - binwidth / 2, height=cal['count'],\n", " width=.95 * (bins[1] - bins[0]),\n", " fc=p[0].get_color())\n", " \n", " plt.xlabel(\"Predicted P(Stock increase)\")\n", " remove_border()\n", " plt.ylabel(\"Number\")\n", "\n", "################################################################################\n", "# The above code is from CS 109 HW 3 official solutions\n", 
"################################################################################\n" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We choose to use the first 80% of the dataset as our training set and the last 20% as our test set. We fit each of our models and produce calibration plots" ] }, { "cell_type": "code", "collapsed": false, "input": [ "train_size = 36848\n", "\n", "# Ridge Logistic Regression\n", "ridge_model = lm.LogisticRegression(penalty = \"l2\", C = 5000) # define model\n", "ridge_model.fit(X[:train_size,:], y[:train_size]) # train model\n", "calibration_plot(ridge_model, X[train_size:,:], y[train_size:], \"Logistic Ridge\") # make calibration plot\n", "\n", "# LASSO Logistic Regression\n", "lasso_model = lm.LogisticRegression(penalty = \"l1\", C = 500) # define model\n", "lasso_model.fit(X[:train_size,:], y[:train_size]) # train model\n", "calibration_plot(lasso_model, X[train_size:,:], y[train_size:], \"Logistic LASSO\")\n", "\n", "# Random Forest\n", "rf_model = RandomForestClassifier(n_estimators = 100) # define model\n", "rf_model.fit(X[:train_size,:], y[:train_size]) # train model\n", "calibration_plot(rf_model, X[train_size:,:], y[train_size:], \"Random Forest\")\n", "\n", "# Gradient Boosted Trees\n", "gbt_model = sklearn.ensemble.GradientBoostingClassifier(n_estimators = 200) # define model\n", "gbt_model.fit(X[:train_size,:], y[:train_size]) # train model\n", "calibration_plot(gbt_model, X[train_size:,:], y[train_size:], \"Gradient Boosted Trees\")\n", "\n", "# Ridge-RF Blended model and Ridge-LASSO-RF-GBT Blended model\n", "# Please run code that creates the new_X, new_Y (above the get_oos_predictions function)\n", "# as well as the model_stacker code before running this\n", "model_stacker.fit(new_X[:train_size,:], new_Y.reshape(new_Y.shape[0])[:train_size]) # train model\n", "calibration_plot(model_stacker, new_X[train_size:,:], new_Y.reshape(new_Y.shape[0])[train_size:], \"Blended model\")\n", "\n", "################################################################################\n", "# The above code produces the calibration plots that you can see at\n", "# https://sites.google.com/site/predictingstockmovement/model-calibration-plots\n", "################################################################################\n" ], "language": "python", "metadata": {}, "outputs": [] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Please see https://sites.google.com/site/predictingstockmovement/model-calibration-plots for our calibration plots.** We note that the individual models are biased as the calibration lines are not at the ideal 45 degrees. The blended models have averaged away a lot of this bias, as the calibration line becomes closer to the optimal 45-degrees. The calibration plots help us understand why the blended models perform better. We present our calibration plots below.\n", "\n", "### Calibration Plots of Individal Models\n", "\n", "\n", "\n", "Above, we see that regularization models (Lasso and Ridge) under the Elastic-Net Regularization spectrum appear to yield predictions that mostly include extreme values. The calibration plot tends to indicate that predictions tend to be confident, meaning that many of the predictions are of probabilities near 0 and 1. 
The calibration plot also indicates that for the predictions that are not in the extreme ranges, our model is biased towards predicting probabilities closer to 50%, as the predicted probabilities are closer to 50% than the empirical probabilities.\n", "\n", "\n", "\n", "\n", "Above, we see that our ensemble methods seem to yield predictions that include many moderate values. The calibration plot indicates that the predictions tend to be fairly moderate and are not made with particular confidence (meaning that many predictions are not near 0 and 1 and instead fall in the middle ranges). As with the Ridge and LASSO models, for many of the moderate probabilities the model is biased towards predicting near 50%, as the predicted probabilities are closer to 50% than the empirical probabilities. The Gradient Boosted Trees model seems to have the least of this bias towards 50%.\n", "\n", "### Calibration plots of blended models\n", "\n", "\n", "\n", "We see that, in general, the blended models have better calibration than the individual models. The individual models exhibit bias in that they are generally overconfident or underconfident in their predictions. Our conclusion, therefore, is that the blended models help average out these biases and produce better-calibrated models. We now understand why the blended models perform better than our individual models with respect to AUC." ] }, { "cell_type": "heading", "level": 1, "metadata": {}, "source": [ "Final Analysis" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "From our analysis, there are a number of conclusions that our group reached, which we summarize below." ] }, { "cell_type": "heading", "level": 2, "metadata": {}, "source": [ "Directional Movements of Stocks are Highly Predictable (With 90%+ AUC)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "From the heatmap that we show above, we see that it is possible to achieve 90%+ AUC when trying to predict whether the closing price of a stock is greater than or less than its opening price. Note that this does not directly correspond to accuracy, and we do not test the direct accuracy of our predictions. The simple layman interpretation of AUC, however, is that if we were given a random data point where the closing price is higher than the opening price and another where the opposite is true, we would be able to correctly differentiate the two with over 90% accuracy. This may come as a surprise to many who are familiar with the stock market. The usual response to hearing this result is to inquire as to the viability of a trading strategy focused on these patterns. Unfortunately, the predictability of directional movement does not translate to the certainty of returns, for the following three reasons:\n", "\n", "1. Magnitude matters - Trading strategies focus on the magnitude of the movement of a stock, not solely the direction.\n", "2. Things can go wrong fast - The strategy may yield high accuracy, but when it is wrong, the losses can be large.\n", "3. Transaction fees - Trading strategies that focus on daily movements accrue transaction fees that can essentially wipe away any returns. " ] }, { "cell_type": "heading", "level": 2, "metadata": {}, "source": [ "Market Corrections and Momentum Effects Drive the Predictability of Directional Movement" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is a continuation of the previous point. 
There is a negative correlation between the intraday change (the ratio between today's closing and opening prices) and the overnight, or interday, change (the ratio between today's opening price and yesterday's closing price).\n", "\n", "This is because markets may overreact to events that happen while they are closed, and then correct themselves during the trading day. This would explain the negative correlation. This point was demonstrated in the correlation analysis in our Exploratory Analysis. We see that this result is already strong if you just look at the previous day's data. We were able to achieve very high predictive power by including the past 9 days' worth of data." ] }, { "cell_type": "heading", "level": 2, "metadata": {}, "source": [ "Blended Models Improve Calibration and Predictive Accuracy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "One of our original questions was to address whether or not blended models perform better than our individual models. Our AUC results and our calibration plots above suggest that this is in fact the case.\n", "\n", "Below, we show an illustration of the calibration plot of a blended model, as a function of the calibration plots of the component models." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "\n", "We mention above that the blended models help average out those biases and make more well-calibrated models. Intuitively, this makes sense because we are essentially applying the familiar idea that committee classifiers can often be used to reduce variance while maintaining relatively low bias, which can improve predictive accuracy.\n", "\n", "Overall, this confirms some of the work that we have done. Although blending different models makes sense intuitively, there is no mainstream Python function that performs this task, though we have provided one over the course of our project. This may, in fact, be the single greatest contribution of our project.\n", "\n", "Another reason why blending may help is that it can compensate for models that tend to be inaccurate on different parts of the data. For instance, when we blend LASSO with Ridge Regression, or blend Random Forest with Gradient Boosted Trees, we see that the AUC actually decreases. Based on our understanding of the behavior of these models, this is likely because the errors of these respective pairs of models are positively correlated with each other. (In fact, when we blend LASSO with Ridge Regression using Platt Scaling, LASSO actually receives a negative weight!) Blending, as we now understand, makes the most sense between models whose errors are negatively correlated, or at least weakly correlated. Juxtaposing Ridge Regression and Random Forest makes sense for precisely this reason; the negative correlation between their errors allows the blend to, in part, compensate for some of their individual insufficiencies in the pursuit of the best possible model." ] }, { "cell_type": "heading", "level": 2, "metadata": {}, "source": [ "Modern Data is Less Predictable Than Historical Data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We generally see that there is not much of a difference in AUC between the different industries that we looked at, although Healthcare yields the highest AUCs. We see that there is a minor dropoff in the AUC values between our historical dataset from the early 1990s and our modern data from 2012-13. This may be an indication that modern data is harder to predict than historical data, perhaps because markets have become more efficient over time, leaving fewer inefficiencies to exploit.\n" ] } ], "metadata": {} } ] }