{ "cells": [ { "cell_type": "markdown", "metadata": { "toc": true }, "source": [ "

Table of Contents

\n", "
" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "data": { "text/html": [ "\n", "\n", "\n", "" ], "text/plain": [ "" ] }, "execution_count": 1, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# code for loading the format for the notebook\n", "import os\n", "\n", "# path : store the current path to convert back to it later\n", "path = os.getcwd()\n", "os.chdir(os.path.join('..', '..', 'notebook_format'))\n", "\n", "from formats import load_style\n", "load_style(css_style='custom2.css', plot_style=False)" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Ethen 2019-08-11 12:40:02 \n", "\n", "CPython 3.6.4\n", "IPython 7.7.0\n", "\n", "numpy 1.17.0\n", "pandas 0.25.0\n", "sklearn 0.21.2\n", "matplotlib 3.1.1\n" ] } ], "source": [ "os.chdir(path)\n", "\n", "# 1. magic for inline plot\n", "# 2. magic to print version\n", "# 3. magic so that the notebook will reload external python modules\n", "# 4. magic to enable retina (high resolution) plots\n", "# https://gist.github.com/minrk/3301035\n", "%matplotlib inline\n", "%load_ext watermark\n", "%load_ext autoreload\n", "%autoreload 2\n", "%config InlineBackend.figure_format='retina'\n", "\n", "import os\n", "import warnings\n", "import numpy as np\n", "import pandas as pd\n", "import matplotlib.pyplot as plt\n", "from subprocess import call\n", "from sklearn.pipeline import Pipeline\n", "from sklearn.preprocessing import StandardScaler\n", "from sklearn.linear_model import LinearRegression\n", "from sklearn.model_selection import cross_val_score\n", "\n", "%watermark -a 'Ethen' -d -t -v -p numpy,pandas,sklearn,matplotlib" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Technical Skills Tips and Tricks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In machine learning, there are many ways to build a solution and each way assumes something different. Many times, it's not obvious how to navigate and identify which assumptions are reasonable. Hopefully, you will learn something from these tips and create a more robust solution." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Taking the default loss function for granted\n", "\n", "When evaluating binary classification algorithm, area under the curve is a great default to start with, but when it comes to real world applications this off-the-shelf loss function is rarely optimum for the business problem we're trying to solve for.\n", "\n", "Take for example, fraud detection. In order to align business objectives what we might want is to penalize false negatives in proportion to the dollar amount lost due to fraud. Using area under the curve might give us a decent result, but it probably won't give us the best result." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Using the same algorithm/method for all problems\n", "\n", "Many will use the algorithm/method that they are the most comfortable/familiar with on every use case they can imagine. This might lead to poor results due to bad assumptions.\n", "\n", "Ideally, once we have preprocessed our data, we should feed it into many different models and see what the results are. After that, we will have a good idea of what models work best and what models don't work so well. Of course based on experience, we've might realized some methods just works well or just don't work well with some types of problems. e.g. 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Ignoring outliers\n", "\n", "Outliers can be important or can be completely ignored, depending on the context. Take, for example, pollution forecasting. Large spikes in air pollution can occur and it's a good idea to look at them and understand why they occurred. In the case of outliers caused by some type of sensor error, it's safe to ignore them and remove them from the dataset.\n", "\n", "From a model perspective, some models are more sensitive to outliers than others. e.g. gradient boosted trees might try to mitigate the errors made on those outliers and put an unnecessary amount of focus on them, whereas a vanilla decision tree might just treat the outlier as a misclassification." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Not properly dealing with cyclical features\n", "\n", "Hours of the day, days of the week, months in a year and wind directions are all examples of features that are cyclical. Many won't think of converting these types of features into a representation that preserves information such as the fact that hour 23 and hour 0 should be close to each other.\n", "\n", "Sticking to the hour example, the best way to handle this is to calculate the sin and cos components so that we represent our cyclical feature as (x, y) coordinates of a circle. In this representation, hour 23 and hour 0 are right next to each other numerically, just as they should be.\n", "\n", "In the following section, we will explore this feature engineering approach on the bike sharing dataset to see if it actually improves the performance of the model. The goal is to predict how many bikes are being used at any given hour." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "data_dir = 'bike_sharing'\n", "if not os.path.isdir(data_dir):\n", "    url = 'https://archive.ics.uci.edu/ml/machine-learning-databases/00275/Bike-Sharing-Dataset.zip'\n", "    call('curl -O ' + url, shell=True)\n", "    call('mkdir ' + data_dir, shell=True)\n", "    call('unzip Bike-Sharing-Dataset.zip -d ' + data_dir, shell=True)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "dimension: (17379, 17)\n" ] }, { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
instantdtedayseasonyrmnthhrholidayweekdayworkingdayweathersittempatemphumwindspeedcasualregisteredcnt
012011-01-01101006010.240.28790.810.031316
122011-01-01101106010.220.27270.800.083240
232011-01-01101206010.220.27270.800.052732
342011-01-01101306010.240.28790.750.031013
452011-01-01101406010.240.28790.750.0011
\n", "
" ], "text/plain": [ " instant dteday season yr mnth hr holiday weekday workingday \\\n", "0 1 2011-01-01 1 0 1 0 0 6 0 \n", "1 2 2011-01-01 1 0 1 1 0 6 0 \n", "2 3 2011-01-01 1 0 1 2 0 6 0 \n", "3 4 2011-01-01 1 0 1 3 0 6 0 \n", "4 5 2011-01-01 1 0 1 4 0 6 0 \n", "\n", " weathersit temp atemp hum windspeed casual registered cnt \n", "0 1 0.24 0.2879 0.81 0.0 3 13 16 \n", "1 1 0.22 0.2727 0.80 0.0 8 32 40 \n", "2 1 0.22 0.2727 0.80 0.0 5 27 32 \n", "3 1 0.24 0.2879 0.75 0.0 3 10 13 \n", "4 1 0.24 0.2879 0.75 0.0 0 1 1 " ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "data_path = os.path.join(data_dir, 'hour.csv')\n", "df = pd.read_csv(data_path)\n", "print('dimension:', df.shape)\n", "df.head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There are a bunch of features in here that are likely valuable to predict `cnt`, the count of users riding bikes (the sum of \"casual\" riders and \"registered\" riders). Here we'll just focus on the cyclical features: `mnth` (month) and `hr` (hour).\n", "\n", "The next section is where the magic happens. We map each cyclical feature onto a circle such that the lowest value for that feature appears right next to the largest value. We compute the x and y component of that point using `sin` and `cos` trigonometric functions. Here's what it looks like for the \"hours\" variable. Zero (midnight) is on the right, and the hours increase counterclockwise around the circle. In the end, 23:59 is very close to 00:00, as it should be.\n", "\n", "\n", "\n", "We can apply similar transformation for the \"month\" variable, one extra step is that we also shift the values down by one such that it ranges from 0 to 11, this step is only for convenience." ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "data": { "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAwUAAAHwCAYAAAD3pcP6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAWJQAAFiUBSVIk8AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO3de7QkdX3v/fcXUG4HhqtHI8YB4jAkj+iZQUVY4ZqHgIhyBJ74RD0KauIF7yTOo0bBFeOYpYLiHYNjRAUznoMahGDCRQXRMHO8ZmRwYBMRPModuYp+nz+qWpqmu3d37967q/fv/VqrV82uqt/vV1VTXV2fukZmIkmSJKlcm016AiRJkiRNlqFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSrcFpOegBJExHXA9sDMhCdFkiRJi9tS4M7M3H2YQoaChbH91ltvvdPee++906QnRJIkSYvXhg0buPfee4cuZyhYGDN77733TuvWrZv0dEiSJGkRW7lyJevXr58Ztpz3FEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmF22LSEyBJUqelq85/RL+Z1UdNYEokqQyNOFMQEcdFxBkR8Y2IuDMiMiLOHrGu3SLirIi4MSLuj4iZiDg9InbsU+YPI+ILEfGLiLgvIq6OiFMjYuvR50qSNKylq87vGghmGyZJmptGhALgbcBJwFOBn41aSUTsCawDTgC+A5wGXAu8DvhWROzcpcwzgH8HjgH+FfgAcCfwduBrEbHlqNMjSRrcoDv8BgNJGr+mhII3AMuA7YFXzqGejwCPAV6bmcdk5qrMPJQqHOwFvKt95IjYHPgUsA1wXGb+eWa+GXgG8EXggHraJEnzaNgdfYOBJI1XI0JBZl6SmddkZo5aR32W4HBgBvhwx+B3AHcDL4qIbdv6HwTsDXw9M7/cNj2/Bf66/vMVERGjTpckSZLUdI0IBWNySN29qN6p/53MvAu4nOqMwH5tgw6tuxd2VpaZ1wIbgScCe4x9aiVJwOhH/T1bIEnjs5iePrRX3d3YY/g1VGcSlgH/NkSZZfVn02wTEBHregxaPltZSZIkaVIWUyhYUnfv6DG81X+HOZaRpIH5aE0tJNc3SaNaTKFg4jJzZbf+9RmEFQs8OZImqN+lLa1h7qxpXFzfJM3VYrqnoHVUf0mP4a3+t8+xjCT15aM1tZBc3ySNw2IKBVfX3WU9hj+p7rbfPzBKGUnqyUdrDm/UI9ge+XZ9kzQ+iykUXFJ3D4+Ih81XRGxH9c6Be4Ar2wZdXHeP6KwsIvagCgvXU70ATZIkSVqUpi4URMSjImJ5/V6C38nMTcBFwFLg1R3FTgW2BT6TmXe39b8M2AAcGBHPaWtjM+A99Z8fm8v7EySVw0drjm7Yo/6eJXB9kzRejbjROCKOAY6p/3xs3X1mRKyp/31zZp5c//vxVDvy11MFgHavAq4APhgRh9XjPYPqHQYbgbe2j5yZv4mIE6jOGKyNiLXAfwKHAftSvdvgtDHMoiRpFjOrjxpoh9VAIEnj14hQADwVeHFHvz146KVh1wMnM4vM3BQR+wLvpLok6FnATcAHgFMz87YuZb4dEU+jOptwOLBd3d47gdWZef9IcyRJGlprh99Ha0rSwmpEKMjMU4BTBhx3Bog+w38KnDBk+/8BHD9MGUnS/DEASNLCmrp7CiRJkiSNl6FAksbER2tqIbm+SRonQ4EkSZJUOEOBJI2Rj9bUQnJ9kzQuhgJJGrNBd7zcQdM4uL5JGodGPH1IkhYbH62pheT6JmmuDAWSNI/cIdNCcn2TNCovH5IkSZIKZyiQJEmSCmcokCRJkgrnPQWSRuZNjZKmjdstqTtDgaShdftR7Rzmj6ykJnG7JfXn5UOShtLvh3WU8SRpvrndkmZnKJA0sGF/MP2BlTRpbrekwRgKJEmSpMIZCiQNZNSjZx51kzQpbrekwRkKJEmSpMIZCiRJkqTCGQokSZKkwhkKJEmSpMIZCiQNZNSX+vgyIEmT4nZLGpyhQJIkSSqcoUDSwIY9eubRNkmT5nZLGoyhQNJQBv3B9IdVUlO43ZJmt8WkJ0DS9Gn9cHZ7wY8/qpKayO2W1J+hQNLI/CGVNG3cbkndefmQJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklS4RoWCiNgtIs6KiBsj4v6ImImI0yNixwHLHxwROcDnCR3l+o175fzMrSRJktQMjXmjcUTsCVwBPAb4EvBj4OnA64AjIuKAzLxllmpmgFN7DHsy8Dzgh5n50y7DrwfWdOl/w6wTL0mSJE2xxoQC4CNUgeC1mXlGq2dEvB94A/Au4BX9KsjMGeCUbsMi4vP1P8/sUXwmM7uWlSRJkhazRlw+VJ8lOJzqSP+HOwa/A7gbeFFEbDti/bsA/x24F/jH0adUkiRJWnyacqbgkLp7UWb+tn1AZt4VEZdThYb9gH8bof4XA1sC/5iZt/cYZ4eIOBF4LHAHsC4zvZ9AkiRJi15TQsFedXdjj+HXUIWCZYwWCl5edz/eZ5ynAP/Q3iMivge8KDN/MEgjEbGux6Dlg5SXJEmSJqERlw8BS+ruHT2Gt/rvMGzFEXEQVej4YWZe0WO09wMHALsC2wFPA9ZSBYWLI+Lxw7YrSZIkTYumnCmYT39Rdz/Ra4TMfFNHr6uA4yNiLXAscDLVzc59ZebKbv3rMwgrBppaTZ2lq85/RL+Z1UdNYEokSerO3yrNpimhoHUmYEmP4a3+ve4H6CoidqLaqb8X+MwI0/WxuvyBI5TVItdtA9s5zA2uJGmS/K3SoJpy+dDVdXdZj+FPqru97jnopXWD8Rf63GDczy/r7khPPdLi1W8jO8p4kiSNm79VGkZTQsEldffwiHjYNEXEdlTX+98DDPs0oNYNxj0vHZrFfnX32hHLaxEaduPpxlaStND8rdKwGhEKMnMTcBGwFHh1x+BTqY7UfyYz7271jIjlEdHzqT4R8cfA3vS/wZiI2CciHtWtP9UL0wDOHnBWJEmSpKnTiFBQexXwC+CDEXFeRLw7Ii6musF3I/DWjvE31J9eZr3BuPZG4Od1m2dExHsj4p+B9cDOVG9A/nzfGlSMUY+keARGkrRQ/K3SKJpyozGZuSki9gXeCRwBPAu4CfgAcGpm3jZoXRGxI3Acg91gfB6wPbAPcCiwFXALcAFwZmZ+echZkSRJkqZKY0IBQGb+FDhhwHGjz7DbgK0HrOc8qmAgSZIkFalJlw9JkiRJmgBDgSRJklQ4Q4E0hFFf8OKLYSRJC8XfKo3CUCBJkiQVzlAgDWnYIykeeZEkLTR/qzQsQ4E0gkE3nm5kJUmT4m+VhtGoR5JK06S1Ee32shc3sJ
KkJvC3SoMyFEhz5EZVktR0/lZpNl4+JEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBVui0lPgLR01fmP6Dez+qgJTIkkSWoi9xXmX6NCQUTsBrwTOALYGbgJOA84NTNvG7COS4GD+oyydWbe16XcHwKnAAcD2wPXA+cAqzPz3oFnQgPr9gXvHOYXXpKkcrmvsHAaEwoiYk/gCuAxwJeAHwNPB14HHBERB2TmLUNUeWqP/g92afsZwMXAo4C1wE+BQ4G3A4dFxGGZef8QbWsW/b7kneP5ZZckqTzuKyysxoQC4CNUgeC1mXlGq2dEvB94A/Au4BWDVpaZpwwyXkRsDnwK2AZ4bmZ+ue6/GfAF4Ni6/dWDtq3+Bv2St4/vl12SpHK4r7DwGnGjcX2W4HBgBvhwx+B3AHcDL4qIbeeh+YOAvYGvtwIBQGb+Fvjr+s9XRETMQ9uSJEnSxDUiFACH1N2L6p3x38nMu4DLqY7k7zdohRHxZxGxKiLeGBFHRsSWPUY9tO5e2DkgM68FNgJPBPYYtG31Nmzyn2s5SZI0XdxXmIymXD60V93d2GP4NVRnEpYB/zZgned0/P2LiHh1Zq4doe1l9WdTvwYjYl2PQcv7lZMkSZImqSlnCpbU3Tt6DG/132GAur4EHA3sBmxNtUP+7rrsuRFxxDy2LUmSJE2dppwpGJvMPK2j19XAWyLiRuAMqoDwiEuFxtT2ym796zMIK+ajTUmSJGmumnKmoHU0fkmP4a3+t8+hjU9SPY70qRGx3QK3LUmSJDVWU0LB1XV3WY/hT6q7va77n1X9wrK76j/bn2I0723rIaM+LszHjEmSVAb3FSajKaHgkrp7eP1+gN+pj+ofANwDXDlqAxGxF7AjVTC4uW3QxXW3814DImIPqrBwPXDtqG1LkiRJTdaIUJCZm4CLgKXAqzsGn0p1ZP8zmXl3q2dELI+Ihz3VJyJ2j4idOuuPiF2pXlAGcE5mtr/V+DJgA3BgRDynrcxmwHvqPz+WmTnKvOmRhk3yJn9JksrivsLCa0QoqL0K+AXwwYg4LyLeHREXU71NeCPw1o7xN9SfdgcBN0bEv0bEJyJidUR8juqxos8EruKhF5IBkJm/AU6gOhOxNiI+FxGrgW8Dx1G9I6Hz5mXN0aBfXr/kkiSVyX2FhdWYpw9l5qaI2Bd4J9WlPM8CbgI+AJyambcNUM06qvcTrAT+G7A91eVCPwC+AHw8Mx/o0va3I+JpVGclDge2o7pk6J3A6sy8f46zpy5aX+JuLxvxCy5JktxXWDjhVTHzLyLWrVixYsW6db3ebSZJkiTN3cqVK1m/fv36Xo/K76VJlw9JkiRJmgBDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVLgtJj0Bao6lq85/RL+Z1UdNYEokSZImr6R9I0OBuq7wncMW6xdAkiSpU4n7Rl4+VLh+K/0o40mSJE2zUveNGhUKImK3iDgrIm6MiPsjYiYiTo+IHQcsv21EvCAiPhcRP46IuyPiroi4KiLeFBGP7lEu+3yuHO9cNsewK/NiW/klSZLalbxv1JjLhyJiT+AK4DHAl4AfA08HXgccEREHZOYts1Tzx8DZwK3AJcB5wI7Ac4D3As+LiMMy874uZa8H1nTpf8PwcyNJkiRNj8aEAuAjVIHgtZl5RqtnRLwfeAPwLuAVs9Txc+CFwD9l5gNtdZwMXArsD7waeF+XsjOZecocpn+qjJpsl646f9FdQydJklT6vlEjLh+qzxIcDswAH+4Y/A7gbuBFEbFtv3oy87uZ+dn2QFD3v4uHgsDB45hmSZIkabFoypmCQ+ruRZn52/YBmXlXRFxOFRr2A/5txDZ+XXcf7DF8h4g4EXgscAewLjOHup8gItb1GLR8mHokSZKkhdSUULBX3d3YY/g1VKFgGaOHghPr7oU9hj8F+If2HhHxPeBFmfmDEduUJEmSGq8poWBJ3b2jx/BW/x1GqTwiTgKOAL4LnNVllPcDX6QKJfdRHdl/M3AccHFEPDUzfzZbO5m5skf764AVo0y7JEmSNN8acU/BfIqI5wGnU92EfGxm/rpznMx8U2ZekZk3Z+avMvOqzDyeKijsApy8sFM9/0a9IWYx3EgjSZLUqfR9o6aEgtaZgCU9hrf63z5MpRFxDHAO8Avg4My8dsjp+ljdPXDIcpIkSdLUaEoouLruLusx/El1t9c9B48QEccD/wT8H+CgzLx6liLd/LLu9n3q0bQaNtkuliQsSZLUTcn7Rk0JBZfU3cMj4mHTFBHbAQcA9wADPQ0oIl4AfB64kSoQXDPidO1Xd4c9wzA1Bl2ZF9NKL0mS1Eup+0aNuNE4MzdFxEVUTxh6NXBG2+BTqY7Ufzwz7271jIjlddkft9cVES+mupn4euCQzLy+X9sRsQ+wofNeg7r/u+o/zx5lvqZFa6Xu9tKOxbbCS5IkzabEfaNGhILaq4ArgA9GxGHABuAZVO8w2Ai8tWP8DXU3Wj0i4hCqQLAZ1dmHEyKioxi3Z+bpbX+/ETg6Ir4B/BS4n+rpQ0cAmwNnUp11WPQW60ouSZI0ipL2jRoTCuqzBfsC76TaIX8WcBPwAeDUzLxtgGqeyEOXRJ3YY5zrqZ5G1HIesD2wD3AosBVwC3ABcGZmfnnIWZEkSZKmSmNCAUBm/hQ4YcBxH3EKIDPXAGuGbPM8qmAgSZIkFakpNxpLkiRJmhBDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklS4LSY9AZp/S1ed/4h+M6uPmsCUSJIkLT6LYV+rMWcKImK3iDgrIm6MiPsjYiYiTo+IHYesZ6e63Exdz411vbvNd9tNs3TV+V1X0tmGSZIkaXaLaV+rEaEgIvYE1
gEnAN8BTgOuBV4HfCsidh6wnp2Bb9XlNtX1fKeud11E7DFfbTfNoCvhNK2skiRJTbHY9rUaEQqAjwCPAV6bmcdk5qrMPJRqB30v4F0D1vN3wDLg/Zl5WF3PMVQ7+I+p25mvthtj2JVvWlZWSZKkJliM+1oTDwX1kfrDgRngwx2D3wHcDbwoIradpZ7/AryoHv+UjsEfAq4H/rT9bMG42pYkSZKm2cRDAXBI3b0oM3/bPiAz7wIuB7YB9pulnv2ArYHL63Lt9fwW+JeO9sbZdmOMmkSnIcFKkiRN2mLd12pCKNir7m7sMfyaurtsHuoZV9sARMS6bh9g+SDlJUmSpEloQihYUnfv6DG81X+HeahnXG1LkiRJU8v3FIxRZq7s1r8+W7BigSdHkiRJGkgTzhS0jsYv6TG81f/2eahnXG1LkiRJU6sJoeDqutvruv0n1d1e1/3PpZ5xtd0Yo749b9reuidJkjQJi3Vfqwmh4JK6e3hEPGx6ImI74ADgHuDKWeq5ErgXOKAu117PZlSPHm1vb5xtS5IkSVNr4qEgMzcBFwFLgVd3DD4V2Bb4TGbe3eoZEcsj4mFP9MnMXwGfqcc/paOek+r6/yUzr51L29Ng2CTa9OQqSZLUJItxXysyc9LT0HqJ2BVUbxb+ErABeAbVewQ2Avtn5i1t4ydAZkZHPTvX9SwDLga+A+wNPBf4RV3Pprm0PeL8rVuxYsWKdevWzaWaoQ3yPNxpWEklSZKaqIn7WitXrmT9+vXrez0Ap5dGhAKAiHgC8E7gCGBn4CbgfwGnZuZtHeN2DQX1sJ2o3kZ8DPA44BbgAuDtmXnDXNsecd4mEgpauq2whgFJkqTxaNK+1tSHgsVs0qFAkiRJZRg1FEz8ngJJkiRJk2UokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCmcokCRJkgpnKJAkSZIKZyiQJEmSCteYUBAR+0fEVyPi1oi4NyK+HxGvj4jNh6jj8RHxmoi4ICJmIuL+iLglIr4WEc/rUebgiMg+n9Xjm0tJkiSpebaY9AQARMRzgS8C9wHnArcCRwOnAQcAxw9Y1WuANwPXAZcAPweeCDwP+JOIOC0z39ij7GXApV36f3PAtqfe0lXnP6LfzOqjJjAlkiRJk1fSvtHEQ0FEbA+cCfwGODgzr6r7/w1wMXBcRDw/M88ZoLrv1HVc1tHG3sCVwBsi4rOZua5L2Usz85Q5zMrU6rbCdw5brF8ASZKkTiXuGzXh8qHjgF2Bc1qBACAz7wPeVv/5ykEqysz/2RkI6v4bqM5AABw8p6ldZPqt9KOMJ0mSNM1K3TdqQig4tO5e2GXY14F7gP0jYss5tvPruvtgj+F/EBEnRcRbIuLEiHjSHNtrvGFX5sW28kuSJLUred9o4pcPAXvV3Y2dAzLzwYi4DvgjYA9gwygN1JcoHQskcFGP0V5Qf9rLfRF4eWbeNmA73S5LAlg+4KRKkiRJC64JZwqW1N07egxv9d9hlMojIoBPAv8V+Gh9KVG7XwKrgCcD21FdynQk8L+pgsRXIqIJy2msRk22iykRS5IktZS+bzSWMwURMUP1lJ9BfTYzXziOtgfwPqqnF30DeMSThzLzR8CP2nr9CrgwIq4Avkv19KOjgS/N1lBmruzWvz6DsGLoKZckSZIWwLguH9pE9TjRQd3Y9u/WmYAl3UZs63/7sBMVEX8PvIHq3oSjMvP+Qctm5p0R8TngrcCBDBAKJEmSpGk0llCQmYfNofjVwL7AMuBh1+RHxBbA7lQ3B187TKURcRrweqr3FTw7M+8ZYdp+WXe3HaGsJEmSNBWacK38xXX3iC7DDgS2Aa4Y9Ch/VD5MFQi+RnWGYJRAALBf3R0qkEiSJEnTpAmhYC1wM/D8iNi31TMitgL+tv7zo+0FImKbiFgeEb/f0T+ATwCvAi4AnpOZ9/ZrvL3Njv4vBP4MeAD4wlBzNAVGfeHGYntRhyRJErhvNPFHktbX7r+cKhxcGhHnALcCz6F6XOlaHnrxWMvTqS4LuoyHv4zs7cDLgHupbhJeVeWEh/luZp7X9vfaiHgQuAq4AdgKeFrdxoPAX2bmzNzmUpIkSWquiYcCgMw8LyIOorqp91iqHfOfUD0t6IOZmQNWtXvd3Rr4/3qM82mgPRR8FPgTqqcM7QIE8DNgDXB6Zn5v8DmZLjOrjxrqMVqLJQlLkiR1U/K+UQy+v61RRcS6FStWrFi3rte7zSZrkJV/Ma30kiRJ/UzzvtHKlStZv379+l6Pyu+lEWcKNFmtlbrbF6CpK7wkSdJ8KXHfyFCg31msK7kkSdIoSto3asLThyRJkiRNkKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkqnKFAkiRJKpyhQJIkSSqcoUCSJEkq3BaTngBp6arzH9FvZvVRE5gSSZLURO4rzL/GhIKI2B94G7AfsDVwDXAWcEZm/maIerLP4G9n5n49yj0bOBn4b8DmwI+Aj2TmpwdtW8Pp9gXvHOYXXpKkcrmvsHAaEQoi4rnAF4H7gHOBW4GjgdOAA4Djh6zyemBNl/439Gj/JOAM4BbgbOAB4DhgTUQ8OTNPHrJ9zaLfl7xzPL/skiSVx32FhRWZ/Q6sL8AERGwP/ARYAhyQmVfV/bcCLgaeCfy/mXnOgPUlcFlmHjzg+EuBHwN3Ayszc6buvyPw78CewP6Z+a2BZ+qRbaxbsWLFinXr1o1axaIy6Je8nV92SZLK4b7C6FauXMn69evXZ+bKYco14Ubj44BdgXNagQAgM++jupwI4JXz2P6JwJbAh1qBoG7/NuDv6j9fMY/tS5IkSRPVhFBwaN29sMuwrwP3APtHxJZD1LlDRJwYEW+JiFdHRNf7CAZo/4KOcTRHoyT/uZSTJEnTxX2FyWjCPQV71d2NnQMy88GIuA74I2APYMOAdT4F+If2HhHxPeBFmfmDIdq/KSLuBnaLiG0y855+jUZEr+uDlg822ZIkSdLCa8KZgiV1944ew1v9dxiwvvdT3Zy8K7Ad8DRgLVVQuDgiHj9i+0t6DJckSZKm2ljOFETEDPDEIYp8NjNfOI62O2Xmmzp6XQUcHxFrgWOpHjv6hnlq
u+sNHfUZhBXz0aYkSZI0V+O6fGgT1eNEB3Vj279nOxLf6n/7sBPV4WNUoeDAjv53ALvU7dzSp/1eZxIkSZKkqTaWy4cy87DMXD7E56/bil9dd5d11hsRWwC7Aw8C185xMn9Zd7ft6N+v/cfV498w2/0EGsyojwvzMWOSJJXBfYXJaMI9BRfX3SO6DDsQ2Aa4IjPvn2M7rScQdYaLfu0f2TGOJEmStOg0IRSsBW4Gnh8R+7Z61i8v+9v6z4+2F4iIbSJieUT8fkf/fSLiUZ0NRMQ+wLvqP8/uGPwp4H7gpPpFZq0yOwJvqf/82JDzpD6GTfImf0mSyuK+wsKbeCjIzDuBlwObA5dGxCcj4u+B71K9zXgtcG5HsadTPZ70Hzv6vxH4eUScFxFnRMR7I+KfgfXAzsCZwOc72r8O+CtgJ+CqiPhwRJwGfJ/qbcbvm8vbjNXdoEzUNqwAABeMSURBVF9ev+SSJJXJfYWF1YT3FJCZ50XEQcBbqW4G3gr4CdVO/gczMwes6jxge2AfqheObUV18/AFwJmZ+eUe7Z9RP0HpZOB/UIWl/wDelpmfHnW+1F/rS9ztZSN+wSVJkvsKCycG39/WqCJi3YoVK1asW9fr3WaSJEnS3K1cuZL169ev7/Wo/F4mfvmQJEmSpMkyFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhdti0hMgTbulq85/RL+Z1UdNYEokSerO3yrNxlAgjajbBrZzmBtcSdIk+VulQXn5kDSCfhvZUcaTJGnc/K3SMAwF0pCG3Xi6sZUkLTR/qzQsQ4EkSZJUOEOBNIRRj6R4BEaStFD8rdIoDAWSJElS4QwFkiRJUuEMBZIkSVLhDAWSJElS4RoTCiJi/4j4akTcGhH3RsT3I+L1EbH5EHWcEhE5y2dTR5mDZxl/9fjnVtNq1Be8+GIYSdJC8bdKo2jEG40j4rnAF4H7gHOBW4GjgdOAA4DjB6zq0j7DjgZWABf0GH5Zj/LfHLBtSZIkaSpNPBRExPbAmcBvgIMz86q6/98AFwPHRcTzM/Oc2erKzEvpsmNfn214af3nJ3oUvzQzTxl2+lWemdVHDfXYNo+8SJIWmr9VGlYTLh86DtgVOKcVCAAy8z7gbfWfr5xjG88CdgOuzMzvz7EuaeCNpxtZSdKk+FulYUz8TAFwaN29sMuwrwP3APtHxJaZef+IbfxF3e11lgDgDyLiJGB74OfANzLzmmEaiYh1PQYtH6YeTYfWRrTbkRg3sJKkJvC3SoNqQijYq+5u7ByQmQ9GxHXAHwF7ABuGrTwidgOOBO6gul+hlxfUn/ayXwRenpm3DduuyuFGVZLUdP5WaTZNCAVL6u4dPYa3+u8wYv0vBTYHzs7Me7oM/yWwCjgfmAG2AvYF/g44FnhsRByYmb+draHMXNmtf30GYcVIUy9JkiTNs7HcUxARMwM8CrT9c/Y42h1gujbjoRuMP95tnMz8UWa+JzN/mJm/ysybM/NC4GDgOqqnHx29ENMrSZIkTcK4zhRsonqc6KBubPt360zAkm4jtvW/fdiJorps6AlUNxj/YJiCmXlnRHwOeCtwIPClEdqXJEmSGm8soSAzD5tD8aupLtdZBjzsRt2I2ALYHXgQuHaEuls3GHc9SzCAX9bdbUcsL0mSJDVeEx5JenHdPaLLsAOBbYArhn3yUET8HnAUs99g3M9+dXeUQCJJkiRNhSaEgrXAzcDzI2LfVs+I2Ar42/rPj7YXiIhtImJ5RPx+n3pbNxh/JjPv7TVSe5sd/V8I/BnwAPCFQWZEkiRJmkYTf/pQfe3+y6nCwaURcQ5wK/AcqseVruWRR/qfDlwCXEZ1Q/DDdNxg3O/dBABrI+JB4CrgBqqnDz2tbuNB4C8zc2boGZMkSZKmxMRDAUBmnhcRB1Hd1Hss1Y75T4A3Ah/MzByyyj8FnshgNxh/FPgTqqcM7QIE8DNgDXB6Zn5vyLYlSZKkqdKIUACQmZcDzxpw3Eupdt57Db+g3/COcd8DvGeQcSVJkqTFqAn3FEiSJEmaIEOBJEmSVDhDgSRJklQ4Q4EkSZJUOEOBJEmSVDhDgSRJklS4xjySVNL0Wbrq/Ef0m1l91ASmRJIG43ZL6s5QIGlo3X5UO4f5IyupSdxuSf15+ZCkofT7YR1lPEmab263pNkZCiQNbNgfTH9gJU2a2y1pMIYCSZIkqXCGAkkDGfXomUfdJE2K2y1pcIYCSZIkqXCGAkmSJKlwhgJJkiSpcIYCSZIkqXCGAkkDGfWlPr4MSNKkuN2SBmcokCRJkgpnKJA0sGGPnnm0TdKkud2SBmMokDSUQX8w/WGV1BRut6TZbTHpCZA0fVo/nN1e8OOPqqQmcrsl9WcokDQyf0glTRu3W1J3Xj4kSZIkFc5QIEmSJBXOUCBJkiQVznsKJGkeeVOjFpLrm6RRGQokaR502znrHObOmsbF9U3SXHn5kCSNWb8dtFHGk/pxfZM0DoYCSRqjYXe83FHTXLi+SRoXQ4EkSZJUOEOBJI3JqEdhPXqrUbi+SRonQ4EkSZJUOJ8+JElqHB+tKUkLa+JnCiLiURHxuoj4VER8NyIeiIiMiJfNoc79I+KrEXFrRNwbEd+PiNdHxOZ9yjw7Ii6NiDsi4lcR8e2IePGo0yBJGt7SVef3vLyl3zBJ0txMPBQA2wKnAy8BHgv8fC6VRcRzga8DBwL/C/gQ8GjgNOCcHmVOAr4C/F/A2cCZwO8BayLivXOZHknSYHy0piRNThNCwT3As4Dfy8zHAmeNWlFEbE+1Q/8b4ODMfGlm/hXwVOBbwHER8fyOMkuB9wK3Avtm5qsz8w3APsAm4E0R8cxRp0lSOUa9vMXLYny05ihc3ySN08RDQWY+kJkXZOZNY6juOGBX4JzMvKqtjfuAt9V/vrKjzInAlsCHMnOmrcxtwN/Vf75iDNMmSZIkNdLEQ8GYHVp3L+wy7OtUZyX2j4gtByxzQcc4ktTXsEdhPWrrozXnwvVN0rgstlCwV93d2DkgMx8ErqN64tIeA5a5Cbgb2C0itpmt8YhY1+0DLB9yPiRNsUF3vNxB0zi4vkkah8X2SNIldfeOHsNb/XcYssy29Xj3zGnqJBWjtQPmozW1EFzfJM3VWEJBRMwATxyiyGcz84XjaLtJMnNlt/712YIVCzw5khrAHTItJNc3SaMa15mCTcB9Q4x/45ja7dQ62r+kx/BW/9s7yuxSD7ulT5leZxIkSZKkqTaWUJCZh42jnjG4GtgXWAasax8QEVsAuwMPAtd2lNmlLvOtjjKPo7p06IbM9NIhSZoHM6uPGummYY+KS9L4LLYbjS+uu0d0GXYgsA1wRWbeP2CZIzvGkSRJkhadqQwFEbEkIpbXR/LbrQV
uBp4fEfu2jb8V8Lf1nx/tKPMp4H7gpPpFZq0yOwJvqf/82PimXpLUyUdrStJkNSIURMSqiFgTEWuAY+reJ7T6RcTLOor8d2AD8O72npl5J/ByYHPg0oj4ZET8PfBd4JlUoeHcjjLXAX8F7ARcFREfjojTgO8DewLvy8yHXVYkSRo/H60pSZPTlEeSHgEc1NFv//rT8slBKsrM8yLiIOCtwLHAVsBPgDcCH8zM7FLmjPoJSicD/4MqLP0H8LbM/PRwsyJJGpWP1pSkyWhEKMjMg4ccfw2wps/wy4FnDVnnV4CvDFNGkjQ/DACStLAacfmQJEmSpMkxFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYWLzJz0NCx6EXHL1ltvvdPee+896UmRJEnSIrZhwwbuvffeWzNz52HKGQoWQERcB2wPzExwMpbX3R9PcBqmhctqOC6vwbmshuPyGpzLajgur8G5rAbXlGW1FLgzM3cfppChoBARsQ4gM1dOelqazmU1HJfX4FxWw3F5Dc5lNRyX1+BcVoOb9mXlPQWSJElS4QwFkiRJUuEMBZIkSVLhDAWSJElS4QwFkiRJUuF8+pAkSZJUOM8USJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEiSJEmFMxRIkiRJhTMUSJIkSYUzFEyhiHhURLwuIj4VEd+NiAciIiPiZXOoc/+I+GpE3BoR90bE9yPi9RGxeZ8yz46ISyPijoj4VUR8OyJePOo0zLdR5rFLHafUy7rfZ1NHmYNnGX/1+Od2bsaxrOp6+s33lX3KlbhuPT4iXhMRF0TETETcHxG3RMTXIuJ5Pco0ct2KiN0i4qyIuLGej5mIOD0idhyynp3qcq3lcWNd727z3fZCmev0RsS2EfGCiPhcRPw4Iu6OiLsi4qqIeFNEPLpHuZG+m5M2jv/fetvSb/636lHuDyPiCxHxi4i4LyKujohTI2Lr8c3h+Ixh3Zpt+9L6PKGj3FStWxFxXEScERHfiIg76+k8e8S6hl7mTVqvtljoBjUW2wKn1//+P8DPgSf0Hr2/iHgu8EXgPuBc4FbgaOA04ADg+C5lTgLOAG4BzgYeAI4D1kTEkzPz5FGnZz6MMo89XNpn2NHACuCCHsMv61H+mwO2vSDGuKxargfWdOl/Q4/2S123XgO8GbgOuITqe/1E4HnAn0TEaZn5xh5lG7NuRcSewBXAY4AvAT8Gng68DjgiIg7IzFsGqGfnup5lwMXAOcBy4ATgqIh4ZmZeOx9tL5QxTe8fU31PbqVab84DdgSeA7wXeF5EHJaZ93UpO9R3c9Lm4f/31B79H+zS9jOo1sNHAWuBnwKHAm8HDquX8f1DtD2vxrSsZui9jJ5MtW36YWb+tMvwaVq33gY8BfgV1fQtH6WSUZZ549arzPQzZR/g0cCRwOPqv08BEnjZCHVtD/wCuB/Yt63/VlQrdwLP7yizlGoH6BZgaVv/HYGf1GWeOenlNJd5HKGNzam+zAns0zHs4Lr/KZNeFgu9rOrxLx1i/GLXLaof2IO69N8buKOua2XT1y3gX+ppek1H//fX/T82YD0fr8d/X0f/19b9L5yvtqdpWQFPBV4APLqj/3bAurqeN3UpN9R3swmfMa5blwI5RLubA/9Rt/Gctv6bUe3IJbBq0stnPpZVn/o/X9fz2mlft4BDgCcB0bZNPXu+l3kT16uJ/2f4GcN/4txCwYl12U93GXZoPeyyjv7vrPufOkx9E1w+Q8/jCG0cXdfzrS7DWhuZUya9LBZ6WQ374+C61bOdT9Bl565p6xawZz091wGbdQzbjupI3N3AtrPU81+Ae+rxt+sYthnVEcwE9hh329O2rGZp48/rNr7SZdi07biNbXkxfCjo+V0G9qiHzQAx6eW0EOsWsAvVwZt7gB2mfd3qmPbWNnWoUDDKMm/ieuU9BTq07l7YZdjXqb70+0fElgOWuaBjnCYYZR6H9Rd19xN9xvmDiDgpIt4SESdGxJPm0N58mY9ltUM9v2+JiFdHxH4jtl/qugXw67r7iMsaak1Ztw6puxdl5m/bB2TmXcDlwDZAv3WAevjWwOV1ufZ6fkt1RK69vXG2vVAWYnpnW2+G+W5O2tiXV0T8WUSsiog3RsSRfb6nPb/nWV3CtpHqUr89Bm17ns33uvViYEvgnzLz9h7jTNO6NQ6jLPPGrVeGAu1Vdzd2DsjMB6lS7xY8fKXsV+YmqjS8W0RsM95JHdko8ziwqG56PJLqEo9z+4z6Aqpr5d8F/AOwMSLWDnOD3AKYj2X1FKr5fRfwIeBbUd0g/+Qh2y9u3QKIiO2BY6mOGl3UY7SmrFs9l0ftmrq7bB7qGVfbC2UhpvfEutsttMJw381Jm4/ldQ7wbuB9wFeB/4yI4xao7fk039P78rr78T7jTNO6NQ6LYptlKNCSuntHj+Gt/juMUGZJj+ELbZR5HMZLqa4NPDsz7+ky/JfAKqobs7YDdqUKEf+bamfvKxHRlO/iuJfV+6lutt2Vat6fRnWt5FOAiyPi8SO2X8S6FREBfBL4r8BHM3NDxyhNW7fGtTzmc7s06vd83OZ73TkJOAL4LnBWl1GG/W5O2jiX15eoLvncjeqM1HKqcLADcG5EHDGPbS+EeZveiDiIamf2h5l5RY/Rpm3dGodFsc1qyo5IcerHVA3yqK/WZ6THYy0WTV1e9Q7XS+s/ux41ycwfZeZ7MvOHmfmrzLw5My+kunbxOqqN59FjnKbGLKvMfFNmXlHP868y86rMPJ7qaT27ABN/klCTllcX76N6etE3gEc8eWih1y1Nh6geYXs61ROsjs3MX3eOMw3fzfmSmadl5j9n5s8y877MvDoz3wK8iWq/6N0TnsQmm/VS2ZLXrWnnI0knZxPVjTqDunGepmO2I6+t/u3XDd5B9cVeQvWUmF5leqXfUcxleY0yj4M6kupxsFdm5g+GKZiZd0bE54C3AgdSHb0ah6Yuq3YfozqSfWBHf9etWkT8PfAGqnsTjsohHks3j+vWbMa1PEbdLo2j7YUyL9MbEcdQXRbzC+CQ7Hhs6wB6fTcnbSH+fz9J9Sjhp0bEdm33s7huUb03hGrduBf4zAjT1dR1axwWxTbLUDAhmXnYpKehdjWwL9U1a+vaB0TEFsDuVDepXdtRZpe6zLc6yjyO6j0KN/S4lGYkc1xeo8zjoFpHTfpdW9nPL+vutiOWf4QGL6t2vebbdasqexrweqrnzj97xPkd+7o1gKvrbq9rYFs3QPe6hnYu9Yyr7YUy9umNiOOBz1GdITg0M6+ZpUg3k1hvBjHv/7+ZeV9E3EX1CORtgVYoKH7dqrVuMP50nxuM+2nqujUOi2Kb5eVDurjudl5DCVWa3wa4ouMoZb8yR3aM0wSjzOOsIuL3gKOY/QbjflpPIpjrTva4zMuy6qLXfBe9bkXlw1SB4GtUZwhGDUCTWLcuqbuHd97LEBHbUV3OdA8w21tNr6Q6GnlAXa
69ns2AwzvaG2fbC2Ws0xsRL6B6dvyNVO+7GCUQQPO2SS3z/v8bEXtRBYK7gJvbBvX8nkfEHlQ7ddfTnGU2X8uqdYNxv6fs9dPUdWscRlnmzVuvFurZp37m78MA7ymgOg21nPqFZ239t6dK78O8vGx3pu8FU8PO4zb18vr9PvX+TV32jFna37dH/xcCv62na+kg8zJNywrYB3hUlzb2ofrBTeDPXbd+1z+AM+syXwW2GqD9xq1bDP8Cn+XA8i71+PKywZfVi4HfUO08PHGAdof+bjbhM47lVW9jdupS965t39lPdAzr95Kpf2IRvLys17rVNvyP63I/WIzrVtt0Hkyf9xRQvXl4ObDnGJZ549arqCdAUyYiVvHQq7ifSnVX/xU89Airb2bmJ9vGfwnwKarTfi/pqOsYqicD3Ed1LeqtwHOonjCwFvh/smNFiYjXAB+k2nk7F3gAOI7qaQ7vy8xG3Ug07DxGxMFUyf+yzDy4S32bUf8AU73BuOf9BBExQ3UJyVVUr1DfiuppDE+v+788M9fMcRbHZlzLKiLWUN3k+g2qtz3fT7XOHkG1MTwT+EvXrd/1fwdVwL+X6ibRB7o0993MPK+tzAwNW7ciYk+qbdFjqO5l2AA8g+o53huB/TPzlrbxEyAzo6Oenet6llEdUfsO1dudn0t1vfz+mblpLm1P2jiWVUQcAvwr1Y7EWVTftU63Z+bpbWXWMMJ3c9LGtLxeQnVt+zeptuG3Ar8PPIvq4NlVwP+dHZfHRMQzqNbDR1F9r/8TOIzq8sHLgcNy7mdQx2Zc38O24Z+hOtjw2sw8o0+7a5iydavehh9T//lY4E+p1o1v1P1ubv3uRMRSqoc4XJ+ZSzvqGXr707j1atKpzM9oH+o3Mvb5rOkY/yXd+rcNP4Dq6ORtVDslP6C6yXHzPtNwNHAZ1anWu4F/B1486WXTZ3oHnkceOlpwaY+6jqTHG4y7jPtmqktBflq3ex/Vza2fAp4y6eUyX8uKaiP7P6mO8N9JtZN7E/AV2o6KuG79rv+aWb7T3b7XjVy3qG6+/1T9//0A1Snw04Edu4yb9Hi7LLAT8IG6fGv9OQvYbRxtN+Ez12XFQ9v2fp+ZjjIjfzcn/RnD8npy/V37AdWBh19TBYNvAK8BHt2n7T+kOoJ7M9XO7kbgVGDrSS+X+VhWbcN2rLcvXd9gPO3rFg9dbTHr9wdY2u07Ncoyb+J65ZkCSZIkqXDeaCxJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFc5QIEmSJBXOUCBJkiQVzlAgSZIkFe7/B1DFjnbrQnn2AAAAAElFTkSuQmCC\n", "text/plain": [ "
" ] }, "metadata": { "image/png": { "height": 248, "width": 386 } }, "output_type": "display_data" } ], "source": [ "# Now instead of hours ranging from 0 to 23,\n", "# we have two new features \"hr_sin\" and \"hr_cos\" which\n", "# each ranges from 0 to 1, and we'll use these two features\n", "# as inputs to our model as oppose to the raw form\n", "hour_col = 'hr'\n", "df['hr_sin'] = np.sin(df[hour_col] * (2. * np.pi / 24))\n", "df['hr_cos'] = np.cos(df[hour_col] * (2. * np.pi / 24))\n", "\n", "month_col = 'mnth'\n", "df['mnth_sin'] = np.sin((df[month_col] - 1) * (2. * np.pi / 12))\n", "df['mnth_cos'] = np.cos((df[month_col] - 1) * (2. * np.pi / 12))\n", "\n", "plt.scatter(df['hr_sin'], df['hr_cos'])\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The claim is that the transformed features should improve the performance of our model, let's give it a shot. We'll use a model pipeline that consists of standardizing the numeric features and feeding those features into a Linear Regression. We'll use a 10-fold cross validation and report the negative mean squared error." ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "CV Scoring Result: mean = -29904.664260591177 std = 14327.111793459853\n" ] } ], "source": [ "# model pipeline\n", "standardize = StandardScaler()\n", "linear = LinearRegression()\n", "pipeline = Pipeline([\n", " ('standardize', standardize),\n", " ('linear', linear)])\n", "\n", "# input features and labels\n", "label_col = 'cnt'\n", "input_cols = ['mnth', 'hr']\n", "y = df[label_col].values\n", "X = df[input_cols].astype(np.float64)\n", "\n", "# sklearn's LinearRegression may give harmless errors\n", "# https://github.com/scipy/scipy/issues/5998\n", "warnings.filterwarnings(\n", " action='ignore', module='scipy', message='^internal gelsd')\n", "\n", "cv = 10\n", "scoring = 'neg_mean_squared_error'\n", "results = cross_val_score(pipeline, X, y, cv=cv, scoring=scoring)\n", "print('CV Scoring Result: mean =', np.mean(results), 'std =', np.std(results))" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "CV Scoring Result: mean = -21968.675852049608 std = 9965.803743174423\n" ] } ], "source": [ "input_cols = ['mnth_sin', 'mnth_cos', 'hr_sin', 'hr_cos']\n", "X = df[input_cols].values\n", "\n", "# negative mean squared error, the closer the value is to 0, the better\n", "results = cross_val_score(pipeline, X, y, cv=cv, scoring=scoring)\n", "print('CV Scoring Result: mean =', np.mean(results), 'std =', np.std(results))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By taking this extra feature engineering step of transforming the month and hour feature into a more representative form, our model had an easier time learning the underlying relationship, which resulted in an increase in model performance." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## L1/L2 regularization without standardizing features\n", "\n", "Using regularization methods such L1 and L2 regularizations is a common way to prevent models from overfitting, however, keep in mind that it is important to standardize the features and put them on equal footing before applying this technique." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Misinterpreting feature importance\n", "\n", "Tree-based models or linear models are commonly used algorithms as they have the capability of giving us feature importance or coefficients without having to rely on other external methods. When intrepreting these result, there are a couple of caveats to keep in mind.\n", "\n", "- If the features are co-linear, the importance can shift from one feature to another. The more features the data set has the more likely the features are co-linear and the less reliable simple interpretations of feature importance are. Thus it is recommended to remove multicollinearity before feeding the data into our model.\n", "- Linear model such as linear and logistic regression outputs coefficients. Many times these coefficients will cause people to believe that the bigger the value of the coefficient, the more important the feature is. Not that this is wrong, but we need to make sure that we standardized our dataset beforehand as the scale of the variable will change the absolute value of the coefficient." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Take Away:**\n", "\n", "- Always build a custom loss function that closely matches your solution objectives.\n", "- If we use the same algorithm over and over again we might not be getting the best results that it can.\n", "- Always look at the data closely before we start our work and determine if outliers should be ignored or looked at more closely.\n", "- If we have cyclical features, we should convert them to take advantage of the cyclical pattern.\n", "- Regularization is great but can cause headaches if we don't have standardized features.\n", "- Understanding what features are most essential to a result is important, but do make sure to not misinterpret them.\n", "\n", "Just like with any projects/industries, the devil is in the details, and even fancy plots can hide bias and error. In order to achieve good result, it's important to double-check our process to ensure we're not making some common errors." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Soft Skills Tips and Tricks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Important questions to ask" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### What are the Key Performance Indicators (KPI) in this domain?\n", "\n", "As a data scientist in an organization, the sooner we understand how people on our team's work is measured against, the better. This helps us understand project prioritization and we need to be able to show.\n", "\n", "### What are the relevant/classic case studies in this domain?\n", "\n", "The industry, the organization, and our team has faced issues in the past that have been resolved or studied. There is a set of working knowledge that people in the group, company, and other companies will have about what worked and what didn't. Yes, we may be applying new techniques or using new frameworks, but it will all be flavored by what has been done in the past. 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Take Away:**\n", "\n", "- Always build a custom loss function that closely matches your solution objectives.\n", "- If we use the same algorithm over and over again, we might not be getting the best results that we could.\n", "- Always look at the data closely before we start our work and determine if outliers should be ignored or looked at more closely.\n", "- If we have cyclical features, we should convert them to take advantage of the cyclical pattern.\n", "- Regularization is great but can cause headaches if we don't have standardized features.\n", "- Understanding what features are most essential to a result is important, but do make sure to not misinterpret them.\n", "\n", "Just like with any project/industry, the devil is in the details, and even fancy plots can hide bias and error. In order to achieve good results, it's important to double-check our process to ensure we're not making some common errors." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Soft Skills Tips and Tricks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Important questions to ask" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### What are the Key Performance Indicators (KPI) in this domain?\n", "\n", "As a data scientist in an organization, the sooner we understand what our team's work is measured against, the better. This helps us understand project prioritization and what we need to be able to show.\n", "\n", "### What are the relevant/classic case studies in this domain?\n", "\n", "The industry, the organization, and our team have faced issues in the past that have been resolved or studied. There is a set of working knowledge that people in the group, company, and other companies will have about what worked and what didn't. Yes, we may be applying new techniques or using new frameworks, but it will all be flavored by what has been done in the past. This means that we should try to get the historical context, which not only helps us understand the decisions managers and senior members of the team will make, but also hones our personal approach to data science problems.\n", "\n", "### Who are the industry thought leaders (internal & external)?\n", "\n", "Within industries and organizations there are thought leaders who are driving the agenda, experiments, knowledge, and how to think about what is going on within the small world they inhabit and lead. Every group and organization will have different people they inherently trust and listen to. We should know who they are, try to familiarize ourselves with their work, and become conversant in their viewpoints, world views, and recommendations. This will help us better understand how the team and organization learns about cutting edge techniques and applications, as well as give us a natural topic to talk about. This will also help us stay on top of industry news, gossip, firings, hirings, and the softer side of the industry." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Mentality to have" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Help the business first\n", "\n", "Stacking outputs as features to go for the best models is not exactly what we are paid for. Instead, we should aim for generating an actionable solution or getting a minimum viable product out first. This does not mean we shouldn't work on improving the model; it's simply saying that premature optimization is the root of all evil. When starting a project, aim for the low-hanging fruit first; there will come a time to improve our models or add complexity as needed.\n", "\n", "As a side note, from time to time, our business partner might be asking for a very small data pull. Don't ignore these requests or think that they are trivial, since they might provide the necessary insights to aid the business. When given such a data request, make sure it supports a decision and that the decision will improve the business once it has the data – and when it does, swallow our pride and run those 30 lines of SQL.\n", "\n", "### Always be learning but don't fall for the hype\n", "\n", "Deep learning ..., distributed systems ... Sometimes we will not need all the fancy stuff that we've learned and it will be a distraction. Use familiar tools, and if it ain't broken don't fix it. Consider that using any project younger than a few years may result in unavoidable pains (breaking APIs, bugs). Allow ourselves only a couple of new pieces of technology per work project. This is not saying that we can't work with or keep up with the new and cool stuff. After all, we should constantly be open to learning new knowledge. This is just reminding us: don't use the state-of-the-art tool because it's state-of-the-art, use it because it's required to solve the problem at hand. Another way to put it is: be boring until you can't.\n", "\n", "Regarding this topic, the blog post at the following link might be worth a read. [Blog: No, you don't need ML/AI. You need SQL](https://cyberomin.github.io/startup/2018/07/01/sql-ml-ai.html)\n", "\n", "### Talk & Learn from colleagues\n", "\n", "Already mentioned in the important questions to ask section. Learn from software engineers. Chances are they know more about best practices, tooling and devops than we do as data scientists. Learn from non-technical peers.
They know how we can help them from a business perspective, may understand the data much better, and will recognize data quality issues faster than we will. Our feature engineering work will benefit a lot from their insight. Talk to our peers. Learn about the history of projects started by our colleagues. Ask about issues they encountered along the way. Try to take different paths or revisit them. When working on a project, it is also nice to look for peer reviews. Peer reviewers can provide feedback from a different standpoint, suggest better approaches, and offer better sanity checks than the consumers of our project can.\n", "\n", "### Educate our consumers\n", "\n", "Communication is key as we will often be presenting our analysis and results to people who are not data experts. Part of our job is to educate them on how to interpret and draw conclusions from our data and analysis. This is especially important when the analysis has a high risk of being misinterpreted. We are responsible for providing the context and a full picture of the analysis and not just the number a consumer asked for.\n", "\n", "### Be skeptical and check for reproducibility\n", "\n", "As we work with the data, we must be skeptical of the insights that we are gaining. When we have an interesting phenomenon we should ask both \"What other data could I gather to validate how awesome this is?\" and \"What could I find that would invalidate this phenomenon?\". If we are building models of the data, we want those models to be stable across small perturbations in the underlying data. Using different time ranges or random sub-samples of our data will tell us how reliable/reproducible this model is. And speaking of reproducibility, leaving \"paper trails\" and making it easy for other people (including the future you) to reproduce the work are extremely important.\n", "\n", "### Expect and accept ignorance and mistakes\n", "\n", "There are many limits to what we can learn from data and mistakes can still occur. Admitting our mistakes is a strength, but it is not usually immediately rewarded. Proactively owning up to our mistakes will translate into credibility and will ultimately earn us respect from colleagues and leaders who are data-wise." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Notes Worth Reading" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "[Paper: P. Domingos A few useful things to know about machine learning (2012)](https://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf): Like any discipline, machine learning has a lot of \"folk wisdom\" that can be hard to come by, but is crucial for success. This article summarizes some of the most salient items.\n", "\n", "e.g. It's generalization that counts: The most common mistake among machine learning beginners is to test on the training data and have the illusion of success. Remember, if we are hired to build a predictive model, be sure to set some data aside from the beginning, and only use it to test the model at the very end, followed by learning the model on the full dataset. The following links also talk about this topic: [[Blog: How (and why) to create a good validation set](http://www.fast.ai/2017/11/13/validation-sets/)][[Blog: Do You Re-train on the Whole Dataset After Validating the Model?](http://www.chioka.in/do-you-re-train-on-the-whole-dataset-after-validating-the-model/)]"
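, "\n", "As a small illustration of this hold-out workflow, here is a minimal sketch, assuming `X` and `y` are placeholders for a full feature matrix and label vector, and `LinearRegression` simply stands in for whichever model we end up selecting:\n", "\n", "```python\n", "from sklearn.linear_model import LinearRegression\n", "from sklearn.model_selection import train_test_split\n", "\n", "# set aside a held-out test set right at the start\n", "X_train, X_test, y_train, y_test = train_test_split(\n", "    X, y, test_size=0.2, random_state=42)\n", "\n", "# do all model selection and tuning on the training split only,\n", "# e.g. with cross_val_score(model, X_train, y_train, cv=10)\n", "final_model = LinearRegression()\n", "final_model.fit(X_train, y_train)\n", "\n", "# touch the held-out test set exactly once, at the very end\n", "print('held-out score:', final_model.score(X_test, y_test))\n", "```"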
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "[Blog: The Ten Fallacies of Data Science](https://towardsdatascience.com/the-ten-fallacies-of-data-science-9b2af78a1862): includes some gotchas to look out for when starting a data science project in the real world. e.g. double-check to make sure relevant data is accessible and relatively consistent before agreeing to specific deadlines for the completion of the analysis." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "[Slideshare: Feature Engineering](https://www.slideshare.net/HJvanVeen/feature-engineering-72376750): Contains lots of feature engineering ideas that are worth knowing/trying. Applied machine learning is basically feature engineering, as we have all probably heard the saying \"more data beats clever algorithms, but better data beats more data\"." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "[Notes: Rules of Machine Learning: Best Practices for ML Engineering](http://martin.zinkevich.org/rules_of_ml/rules_of_ml.pdf): Extracted from the notes' overview section.\n", "\n", "> To make great products, do machine learning like the great engineer you are, not like the great machine learning expert you aren't\n", "\n", "In the applied machine learning field, most of the problems that we will face are, in fact, engineering problems. Even with all of the open source machine learning toolkits at our disposal, oftentimes great gains come from incorporating great features, not great machine learning algorithms. So the basic approach is:\n", "\n", "1. make sure our pipeline is solid end to end\n", "2. start with a reasonable objective\n", "3. add common-sense features\n", "4. make sure our pipeline stays up to date\n", "\n", "This approach will make a lot of people happy for a long period of time. Diverge from this approach and start looking into cutting-edge machine learning only when there are no more simple tricks to get us further." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "[Blog: Lessons on How to Lie with Statistics](https://towardsdatascience.com/lessons-from-how-to-lie-with-statistics-57060c0d2f19): Advice on how to be data literate & a responsible data producer and consumer.\n", "\n", "- View Correlations with Skepticism. Always check for confounding variables.\n", "- Relationships Don’t Last Forever. When we identify a relationship, don't assume it lasts forever, be it in the positive or negative direction.\n", "- Always look at the axes on the chart. The trend might not be as big as we think if we examine the chart closely.\n", "- Small Samples Produce Shocking Statistics. The smaller the sample size, the more extreme the values. Always look out for sample size when reading summary statistics.\n", "- Look at all the Numbers that Describe a Dataset. Looking at the whole distribution will lead to a clearer picture, as a single number such as the average should be complemented by ranges/standard errors/quantiles to provide a holistic view.\n", "- Use Comparisons to a Common Baseline. It's usually the comparison that matters; don't let a single/isolated number sway our rational thinking.\n", "- Look for Bias in Sample Selection.
Is the sample representative of the population that we're interested in?\n", "- Be Wary of \"Big Names\" on Studies and Scrutinize Authority. A fun alternative way of phrasing this is: \"In God we trust, all others must bring data\"; a study with a big name attached should get as much scrutiny as any other." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Reference" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- [Blog: 10 pieces of advice to beginner data scientists](https://shapescience.xyz/blog/10-pieces-of-advice-to-beginner-data-scientists/)\n", "- [Blog: Feature Engineering - Handling Cyclical Features](http://blog.davidkaleko.com/feature-engineering-cyclical-features.html)\n", "- [Blog: Get Up To Speed Fast As A Junior Data Scientist](https://www.datascienceweekly.org/articles/get-up-to-speed-fast-as-a-junior-data-scientist)\n", "- [Blog: Three Mistakes That Set Data Scientist Up For Failure](http://dataconomy.com/2017/05/three-mistakes-data-scientist-failure/)\n", "- [Blog: Top 6 errors novice machine learning engineers make](https://medium.com/ai%C2%B3-theory-practice-business/top-6-errors-novice-machine-learning-engineers-make-e82273d394db)" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.4" }, "notify_time": "30", "toc": { "nav_menu": { "height": "171px", "width": "252px" }, "number_sections": true, "sideBar": true, "skip_h1_title": false, "title_cell": "Table of Contents", "title_sidebar": "Contents", "toc_cell": true, "toc_position": {}, "toc_section_display": "block", "toc_window_display": true }, "varInspector": { "cols": { "lenName": 16, "lenType": 16, "lenVar": 40 }, "kernels_config": { "python": { "delete_cmd_postfix": "", "delete_cmd_prefix": "del ", "library": "var_list.py", "varRefreshCmd": "print(var_dic_list())" }, "r": { "delete_cmd_postfix": ") ", "delete_cmd_prefix": "rm(", "library": "var_list.r", "varRefreshCmd": "cat(var_dic_list()) " } }, "types_to_exclude": [ "module", "function", "builtin_function_or_method", "instance", "_Feature" ], "window_display": false } }, "nbformat": 4, "nbformat_minor": 2 }