{ "metadata": { "name": "" }, "nbformat": 3, "nbformat_minor": 0, "worksheets": [ { "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "#Final Lab: Classification and Clustering Grab Bag\n", "\n", "In this final lab, we will introduce two methods of classification called Neural Networks and Support Vector Machines, and an additional method of clustering called Hierarchical Agglomerative Clustering. We hope that introducing these methods will broaden your toolkit for tackling problems both for the final project and in the future. " ] }, { "cell_type": "code", "collapsed": false, "input": [ "%matplotlib inline\n", "import numpy\n", "import math\n", "import scipy\n", "import random\n", "import brewer2mpl\n", "import matplotlib.pyplot as plt" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 145 }, { "cell_type": "markdown", "metadata": {}, "source": [ "#Neural Networks\n", "\n", "A neural network \"learns\" by solving an optimization problem to choose a set of parameters that minimizes an error function, which is typically a squared error loss. This definition of learning isn't unique to a neural network model. Consider, in the simplest case, a linear model of the form $y = XB$. Given a vector of data $y \\in \\mathbb{R}^m$, we choose $B \\in \\mathbb{R}^n$ that minimizes $||y-XB||_2$, where $X$ is an $m$ by $n$ matrix with \"training data\" on the rows. Solving this least squares minimization problem has a nice well known closed form analytical result: $\\hat{B} = (X^TX)^{-1}X^Ty$. There is, however, a tradeoff between computational ease of finding optimal model parameters and model complexity. This is evident in the linear case since the model is extremely easy to fit but has a very simple form. The neural network attempts to find a \"sweet spot\": while the model is highly non-linear, its particular functional form allows for a computationally slick fitting procedure called \"backpropogation\". \n", "\n", "A \"neural network\" is a function from $f: \\mathbb{R}^m \\rightarrow \\{-1,1\\}$. The input to each neuron is a linear combination of the outputs of each of the neurons in the lower layer. The output of the neuron is a nonlinear threshold applied to its input: it's something that maps the real line to (-1,1) in a 1-1 fashion such that \"most\" of the positive line is mapped pretty close to 1 and and \"most\" of the negative line is mapped pretty close to -1. A common choice is the function $g(x) = \\frac{e^x-e^{-x}}{e^x+e^{-x}}$.The final output of the signal is the sign of the top neuron. Note that the top layer is constrained to have a single neuron.\n", "\n", "The goal of fitting the model is to find a suitable set of \"weights\", which we can compactly refer to as $w$, for the inputs of each of the neurons. One way to do this is to choose a set of weights that minimize the sum of squared errors for training examples we already have. The intuition here is to choose a model that is \"close\" to the true model, just like we would in a linear regression. However, unlike linear regression an analytical solution is not feasible because of the ugly threshold functions, so we need to resort to a computational approach like gradient descent. Gradient descent takes steps in the direction of greatest error decrease in the parameter space, hoping to find a (global) minimum. 
Recall from multivariable calculus that the gradient of a scalar field points in the direction of greatest increase of the function, so walking in the opposite direction (equivalently, applying the same argument to the negative of the function) moves in the direction of greatest decrease. \n", "\n", "We start by initializing the model with some arbitrary set of weights. Feeding forward, given an input $x \\in \\mathbb{R}^m$, we may compute the output as delineated above. Hence, we may calculate the error $E = (y-f(x))^2$. Now, we compute the gradient of this error function with respect to the weights. \n", "\n", "The beauty of the method is that computing the gradient of the error is computationally slick and can be done recursively using the chain rule. To see this, we'll introduce some notation. Let $k = 1,\\dots,L$ index the layers, let $s_{jk}$ be the output of the $j$th neuron in layer $k$, let $x_{jk}$ be the input into neuron $j$ in layer $k$, and let $w_{ijk}$ be the weight on the signal entering the $j$th neuron in layer $k$ from the $i$th neuron in layer $k-1$, so that $x_{jk} = \\sum_{i=1}^{d_{k-1}}w_{ijk}s_{i,k-1}$ and $s_{jk} = g(x_{jk})$. Here $d_{k-1}$ stands for the number of neurons in layer $k-1$. By the chain rule, $$\\frac{\\partial E}{\\partial w_{ijk}} = \\frac{\\partial E}{\\partial x_{jk}}\\frac{\\partial x_{jk}}{\\partial w_{ijk}} = \\frac{\\partial E}{\\partial x_{jk}}\\,s_{i,k-1}.$$ Since $x_{jk}$ is linear in the weights, the second factor is immediate; the tricky part is computing $\\delta_{jk} = \\frac{\\partial E}{\\partial x_{jk}}$. We may compute this recursively as \n", "\n", "$$ \\delta_{jk} = \\frac{\\partial E}{\\partial s_{jk}}\\,g'(x_{jk}) = (1-s_{jk}^2)\\sum_{i=1}^{d_{k+1}} w_{j,i,k+1}\\,\\delta_{i,k+1}$$\n", "\n", "using the fact that $g'(x) = 1-g(x)^2$ for the soft threshold above. The recursion terminates at the top layer, where $\\delta_{1L} = \\frac{\\partial E}{\\partial x_{1L}} = 2(s_{1L}-y)(1-s_{1L}^2)$. We start by computing the deltas at the top layer and store them as we work down through the layers, so that we need not recompute them. This is what makes the procedure computationally efficient. \n", "\n", "Thus \"learning\" in a neural network is nothing but alternating between computing the error on the training data and updating the weights using the gradient of the error function. In the code block that follows, we write a Neural_Net class and apply it to a classification task."
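, "\n", "\n", "One identity worth keeping in mind when reading the class: the soft threshold $g(x) = \\tanh(x)$ satisfies $g'(x) = 1-g(x)^2$, which is exactly where the repeated factor $(1-s^2)$ in tune_weights and compute_delta comes from. A quick numerical sanity check (an aside, not part of the lab's pipeline) compares this formula with a finite-difference estimate of the derivative:\n", "\n", "```python\n", "import numpy\n", "x = numpy.linspace(-3, 3, 13)\n", "g = numpy.tanh(x)\n", "h = 1e-6\n", "finite_diff = (numpy.tanh(x + h) - numpy.tanh(x - h))/(2*h)   #centered difference\n", "print(numpy.max(numpy.abs(finite_diff - (1 - g**2))))         #should be tiny, roughly 1e-10 or smaller\n", "```"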
] }, { "cell_type": "code", "collapsed": false, "input": [ "#Neural Network Class \n", "class Neural_Net:\n", "\n", "\t#constructor initializes a new neural network with randomly selected weights and pre-specified height, and number of neurons per layer\n", "\tdef __init__(self,non,height):\n", "\t\t#list to store the number of neurons in each layer of the network\n", "\t\tself.num_of_neurons = non\n", "\t\t#height of the network\n", "\t\tself.L = height\n", "\t\t#list to store number of weights in each layer of the network, indexed by layer, output neuron, input neuron\n", "\t\tself.weights = numpy.zeros(shape=(10,10,10))\n", "\t\t#delta_matrix: stores the gradient that is used in backpropagation\n", "\t\tself.deltas = numpy.zeros(shape=(10,10))\n", "\t\t#matrix that stores thresholded signals\n", "\t\tself.signals = numpy.zeros(shape=(10,10))\n", "\t\t#(tunable) learning_rate used in backpropagation\n", "\t\tself.learning_rate = .001\n", "\t\t#initialize weights to be between -2 and 2\n", "\t\tfor i in range(1,self.L+1):\n", "\t\t\tfor j in range(1,self.num_of_neurons[i]+1):\n", "\t\t\t\tfor k in range(self.num_of_neurons[i-1]+1):\n", "\t\t\t\t\tself.weights[i][j][k] = random.random()*4-2\t\t\n", "\t\t\t\t\t\t\t\t\t\t\t\n", "\t#forward_pass computes the output of the neural network given an input\n", "\tdef forward_pass(self,x):\n", "\t\t#(for convenience, we index neurons starting at 1 instead of zero)\n", "\t\tself.signals[0][0] = -1\n", "\t\tfor i in range(1,self.num_of_neurons[0]+1):\n", "\t\t\tself.signals[0][i] = x[i-1]\n", "\t\tfor i in range(1,self.L+1):\n", "\t\t\tself.signals[i][0] = -1\n", "\t\t\tfor j in range(1,self.num_of_neurons[i]+1):\n", "\t\t\t\tself.signals[i][j] = self.compute_signal(i,j)\n", "\t\treturn self.signals[self.L][1]\n", "\t\t\t\t\t\n", "\t#tune_weights performs the backpropagation algorithm given a training example as input\n", "\tdef tune_weights(self,y):\n", "\t\tself.deltas[self.L][1] = 2*(self.signals[self.L][1]-y)*(1-math.pow(self.signals[self.L][1],2))\n", "\t\tfor i in range(self.L-1,0,-1):\n", "\t\t\tfor j in range(1,self.num_of_neurons[i]+1):\n", "\t\t\t\tself.deltas[i][j] = self.compute_delta(i,j)\n", "\t\tfor i in range(1,self.L+1):\n", "\t\t\tfor j in range(1,self.num_of_neurons[i]+1):\n", "\t\t\t\tfor k in range(self.num_of_neurons[i-1]+1):\n", "\t\t\t\t\tself.weights[i][j][k] = self.weights[i][j][k]-self.learning_rate*self.signals[i-1][k]*self.deltas[i][j]\n", "\t\n", "\t#compute_signal: computes the delta for a given neuron at a given level\n", "\tdef compute_signal(self,level,neuron):\n", "\t\ts = 0\n", "\t\tfor i in range(self.num_of_neurons[level-1]+1):\n", "\t\t\ts += self.weights[level][neuron][i]*self.signals[level-1][i]\n", "\t\treturn self.g(s)\n", "\t\n", "\t#compute_delta: computes the signal s for a given neuron at a given level\n", "\tdef compute_delta(self,level,neuron):\n", "\t\ts = 0\n", "\t\tfor j in range(1,self.num_of_neurons[level+1]+1):\n", "\t\t\ts += self.weights[level+1][j][neuron]*self.deltas[level+1][j]\n", "\t\treturn (1-math.pow(self.signals[level][neuron],2))*s\n", "\t\n", "\t#soft threshold function\n", "\tdef g(self,s):\n", "\t\treturn (math.exp(s)-math.exp(-s))/(math.exp(s)+math.exp(-s))\n", "\n" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 146 }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's train a neural network and see how well it performs on the test and training sets epoch by epoch. We will use a mock training and test set with two covariates. 
We instantiate a neural network with one hidden layer of four neurons, and a learning rate of .001. The learning rate is how much we scale the gradient when \"walking\" through the parameter space. \n", "\n", "To gain some intuition, try tweaking some of these knobs.\n", "\n", "Note that this will take about a minute to run!" ] }, { "cell_type": "code", "collapsed": false, "input": [ "#read in the training and test data, assuming csv format\n", "training = numpy.genfromtxt('train.csv',delimiter = ',')\n", "testing = numpy.genfromtxt('test.csv',delimiter = ',')\n", "\n", "#specify the number of neurons in each layer\n", "num_of_neurons = [2,4,1]\n", "\n", "#initialize a new neural network\n", "network = Neural_Net(num_of_neurons,2)\n", "\n", "#accumulators for the training error and test error within each epoch\n", "training_error = 0\n", "test_error = 0\n", "\n", "#store the training and test error for all epochs\n", "train = numpy.zeros(shape = (1000))\n", "test = numpy.zeros(shape = (1000))\n", "\n", "for epoch in range(1000):\n", "\ttraining_error = 0\n", "\ttest_error = 0\n", "\t#compute the test errors\n", "\tfor j in range(250):\n", "\t\ttest_error = test_error+math.pow(network.forward_pass(testing[j]) - testing[j][2], 2)\n", "\t#compute the training errors SEQUENTIALLY: we perform a backpropagation update after *every* example\n", "\t#rather than once per epoch. \n", "\tfor i in range(25):\n", "\t\ttraining_error = training_error+math.pow(network.forward_pass(training[i])- training[i][2], 2)\n", "\t\tnetwork.tune_weights(training[i][2])\n", "\ttraining_error = training_error/25\n", "\ttest_error = test_error/250\n", "\ttrain[epoch] = training_error\n", "\ttest[epoch] = test_error" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 147 }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's compare the training and test error, epoch by epoch. 
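\n", "\n", "If you would also like a quick numeric summary (a small addition of ours, reusing the train and test arrays from the cell above), you can report where the test error bottoms out and compare it with the final epoch:\n", "\n", "```python\n", "best = numpy.argmin(test)\n", "print(best, train[best], test[best])   #epoch with the lowest test MSE, and the errors there\n", "print(train[-1], test[-1])             #errors after the final epoch\n", "```\n", "\n", "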
Do we see signs of overfitting?\n" ] }, { "cell_type": "code", "collapsed": false, "input": [ "fig, ax = plt.subplots()\n", "ax.plot(numpy.arange(1000), test, lw=2, label = 'test')\n", "ax.plot(numpy.arange(1000), train, lw=2, label = 'train')\n", "ax.legend(loc=0)\n", "ax.set_xlabel('Epoch')\n", "ax.set_ylabel('MSE')" ], "language": "python", "metadata": {}, "outputs": [ { "metadata": {}, "output_type": "pyout", "prompt_number": 151, "text": [ "" ] }, { "metadata": {}, "output_type": "display_data", "png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEOCAYAAABmVAtTAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xd8FHX+x/HX9k3vhITQe0dAuhAFQTyRImKD87gT1Dvg\nPP2pnN5hwMJZ7049EaygSFPgFDwRxEgHQ42SAIFAKKmkJ9t3fn9s2IABhCSb3SSf5+Oxj9nMzM58\n5ivue6d8Z1SKoigIIYQQgNrbBQghhPAdEgpCCCHcJBSEEEK4SSgIIYRwk1AQQgjhJqEghBDCzeOh\nkJycjNPp9PRqhBBC1AKtJxe+a9cuRowYQX5+Pnq93j3+gw8+ICwsjLS0NLp3787o0aM9WYYQQohr\npPJ057XWrVtz5MiRS0Jh6NChbNmyheLiYiZPnsyXX37pyRKEEEJcI4/uKVxJZGQkr732GkFBQfzl\nL3/xRglCCCEuwyuh8O677zJixAiCg4NZu3btZedp164dx48fr+PKhBCifmvbti1paWnV/rxXrj56\n7LHH2LNnD5MnT+aRRx657DzHjx9HURR5KQrPPfec12vwlZe0hbSFtMXVXzX9MV0noXCh2NzcXAAy\nMzMxGo088sgj5OXl1UUJQgghroFHDx8lJSWRl5fHxo0bad68OS+99BIrVqzgnnvuYdGiRRgMBh5/\n/HFPliCEEOI6eDQU+vbtS0lJifvvFStWAFzxkJG4vPj4eG+X4DOkLSpJW1SStqg9Hr8ktbpUKhU+\nWpoQQvismn53euXqIyGEqInw8HAKCgq8XYZXhYWFkZ+fX+vLlT0FIUS9I98PV26DmraN3BBPCCGE\nm4SCEEIINwkFIYQQbhIKQggh3CQUhBBCuEkoCCFELTp58mSdfMZTJBSEEKKW5OTkMGHChOv6zBtv\nvMEXX3zhoYqun3ReE0I0KHEfza61ZZ2Z+o/rmn/jxo0cPnyYd955hxtvvJHExESSk5MZMmQI06dP\nZ9euXezZswe1Ws2pU6d45plnWLlyJbGxsXTr1o1Ro0bVWu3VJaEghBC1ZPDgweh0Oh5++GEmTpzI\nmjVryMvLIzY2lgkTJrB06VLGjBnDyJEj2b17N2FhYXTu3Jnu3bv7RCCAhIIQooG53l/3nnDkyBEy\nMzNZuHAhAGPGjCEnJ4eJEycyYcIEnnvuOWbOnOme35d6Z0soCCFELbNYLDRt2pSHH34YwD1UFIVv\nvvmG6dOn8+OPP/LJJ58ArltT+AoJBSGEqCU6nQ6r1UrHjh3ZsmULW7ZsYejQoWzfvh2n08nJkyeZ\nMmUK27dvp1OnTu7PmEwmHA4HGo3Gy1sgVx8JIUStiY2NpUePHjzxxBO8/fbbTJ48mQ4dOrBt2zZu\nuukmvv/+e1544QXWrFnD7NmuE+KjRo1i6dKlrFmzxsvVu3j8LqnJycl07doVtfrS/DGbzaxcuZLm\nzZtz8803Vy1M7oIohLgC+X6op3dJ3bVrFwMHDsRut18yvqioiEmTJjF06NDLBoIQQgjv8GgoDBgw\ngKioqCrjZ8+ezb333kurVq2u+vnG/ktACCHqWp2fU7Db7SxdupTMzEwefPBBEhISrjjv5jNH6q4w\nIYQQdX/1UW5uLs2bN+eJJ54AoFu3bkybNo1mzZpVmXfm7CeZ1L4P4HowtzycWwghLpWYmEhiYmKt\nLa/OQyE4OPiSk84dOnQgMzPzsqFQPKIr48c8RM/IuLosUQgh6o1f/mCeO3dujZZXJ4ePFEVBURRy\nc3MJCAggMjKS0tJSAEwmE+3bt7/iZz88vL0uShRCCIGHQyEpKYm8vDw2btzIoUOHmDFjBgCvvPIK\nzz33HMuWLeO3v/0tISEhV1zG+pM/UWw1e7JMIYQQFTzeT6G6VCoVd3+9kB1ZJ5g/cBxTOg3wdklC\nCB/RUPopZGdnEx8fT0pKynV/tl72U6ip+zr0A2D5sSQvVyKEENfmeh6YEx0dzZIlSzxXTDX4dCjc\n1rIrIXojB/POkJKf5e1yhBDiqqrzkJ0bb7zRQ9VUj0/fEM9Pq+M3rXrw2dE9rDt5iM7hTb1dkhDC\nxx39Xe3dVK7Dx47rmv/CQ3bmzZvHyZMniY2N5bvvvmPKlCm0bt2apKQkjEYjFouFv/3tb6xYsYLn\nnnuOHTt2UFhYyPTp04mPj+enn34iJSWF7du3ExgYWGvbcy18ek8B4I5W3QFYdzK5QRxDFEI0XBce\nsjNnzhxsNhsFBQVs2LCBu+66i+XLl3P77bfz+OOP8+KLLwJw1113cfToUQDatGmD1WolLCyM5cuX\nExAQwKZNm+p8G3x6TwFgUEwbwgz+HC/K5UhhNp3CZG9BCHFl1/vr3lO0Wi1t27YlODiY4OBgFi9e\nTHJyMuvWrcNisbjnuZhGoyEmJgaAiIgIiouL67xun99T0Ko13NayK+DaWxBCiProX//6FwcOHGDs\n2LHeLuWqfD4UoPIQ0tcSCkIIH3bhITuKomCz2S455D1v3jxuvPFGcnJyAKpMh8qOvgBOp9Mrh8x9\nOhRMabsAGBTTliCdgaOFOWSU5Hu5KiGEuLwLD9mZNm0aO3bsYMeOHWRlua6cHDFiBA8++CDr169n\n4MCBzJs3jy+//BKVSsW6des4deoUJ06cYMuWLZw9e5bU1FR27NhR5dEDnubTndfOvXMfMY9+BsDD\n3y9l/clkXhwwlgc7D/RydUIIb2oonddqolF2XitJ+gJbwVkAbonrCMjttIUQwpN8OhRw2Cn6bgEA\n8c06ALA98zgmu82bVQkhRIPl26EAFCYuwmk1Ee0fTPeIZpgdNnZlnfB2WUII0SD5dCgYWvXBWXqe\n0qTVQOUhpO/lEJIQQniET4dCyNDfA1C8bTEAN8W2A1yHkIQQQtQ+nw6FoP73otIZKU/ZjC3vFDdE\ntcCo0XGkMJtcU4m3yxNCiAbH46GQnJyM0+ms1mc1AaEE9hkHikLx9iUYNFr6R7cCYEemnFcQorEK\
nCwtDpVI16ldYWJhH2tajobBr1y4GDhx4xc4XEydO5NSpU1ddRvCQBwEo3v4JiqIwKKYtIIeQhGjM\n8vPz3b1/G+srP98zHXk9GgoDBgwgKirqstPWrFmD1Wr91WX4dxmOJjgaW85xLBkHGSyhIIQQHuOV\ncwoHDhygRYsWREREoFKprjqvSq1xHUICSpO+oHtEM4L1Rk6VnOdMaUFdlCuEEI1GnYdCQUEBaWlp\n9OnTB+CaumMH9nU9yajkxy9Qq1QMbNoGkL0FIYSobXX+PIWvv/6aVatWsXTpUvbt20dmZiYffvgh\nsbGxVeZNSEgAQHE66FAUxI0cwXruMINi2rIh4zDbM49zT/u+dbwFQgjhOxITE0lMTKy15Xn8hnit\nW7cmNTUVvV5Pbm4uTZo0cU+bOnUqc+fOpUWLFlUL+8VNnbI+eIjirR8RMT6B3KHTGLH2X0T7B5M0\n6a+/eghKCCEaC5++IV5SUhJ5eXls3LiRQ4cOMXPmzCrzXGvxgb1dD6YoO/g/OoZGE2EMILu8mPTi\nvFqtWQghGjOfvnX2xaU5LWUc/1MkisNG2zez+FPSBtadTOYfA8czuVN/L1YqhBC+w6f3FGqT2hCA\nX8ebQFEo++lbd3+FHVlyslkIIWpLvQkFgIDuowEoS97AoIorkHZmnWj0D9sQQojaUq9Cwb/7KADK\nkzfQJiiCJn5B5JpKOVaU4+XKhBCiYahXoaCP7Yw2ogWOklyspw8wMMa1tyD3QRJCiNpRr0JBpVLh\n33UEAOWHv2dQ04rzCtKJTQghakW9CgUA/07xAJSnfs+gmMrzCk6lendiFUIIUan+hULnmwEwHd1G\nS79gYvxDKLCUk1qQ7eXKhBCi/qt3oaANi0XXtAOKuRTLqX3uvQU5hCSEEDVX70IBKg8hmVIT3f0V\ndmbJyWYhhKip+hkKneMBKE/53t1fYVfWCRzVfMKbEEIIl3oZCn4X9hSO7SDOL5DmgWEUWc0czs/0\nbmFCCFHP1ctQ0IZEo4/tgmItx3xiT+V5BbnlhRBC1Ei9DAWovAqpPOV7Brr7K8h5BSGEqIl6Gwp+\nF84rpP7gPtm8Ozsdu9PhxaqEEKJ+q7eh4N9xKADmYztoqjPQKiiCUpuF5PPnvFyZEELUX/U2FDRB\nkeib90CxWzCf2F15K23pryCEENXm8VBITk7G6aFLRf07DQOgPCWRIRWhsOXcMY+sSwghGgOPhsKu\nXbsYOHAgdrv9kvHLly9nyJAhtG/fnp07d1Z7+ZWd2H7gpmbtUatU7Mk+SYnVXJOyhRCi0fJoKAwY\nMICoqKhLxpnNZrRaLdu2bWPevHk8//zz1V6+X8ehoFJhPr6TEJWKPlEtsDkdbMtMq2npQgjRKNX5\nOQWdTsddd90FQK9evYiMjKz2sjSB4Ria90SxWzEf38UtcZ0A2HzmSK3UKoQQjU2dh4JGo0GlUgGw\ndetWnnrqqRotz++i8wq3xHUEXKEgj+gUQojrp/XWitPT02nZsiXdunW74jwJCQnu9/Hx8cTHx1eZ\nx79zPIXf/htTaiJdxicQ7RdEdnkxKQWZdAmP9UDlQgjhOxITE0lMTKy15akUD/+kbt26Nampqej1\nenJzc2nSpAk5OTns3buX0aNHYzabKSkpqXLuQaVSXdOvfUdZAcdnRIFaS7t38nnyx69ZcSyJ2X1u\nY0aPeA9tlRBC+KZr/e68Eo8ePkpKSiIvL4+NGzdy6NAhZs6ciclkYty4cTz11FN0796d/v37ExER\nUe11aALCMLS4ARw2TGk7LjqElFpbmyGEEI2GRw8f9e3bl5KSEvffK1asAGDHjh21uh7/TsOwnNrn\nujR1zLPo1BqSck5x3lxKhDGwVtclhBANWb3t0Xwx932QUhIJ1hsZHNMWp6KwIeOwdwsTQoh6pmGE\nQoebQKXGnL4Hp7mU21u5Tl5/ffInL1cmhBD1S4MIBY1/CIZWvcFhx3R0K6NadEGtUrE98zhFFpO3\nyxNCiHqjQYQCQEDXWwEoS95AhDGQ/tGtsTkdbJITzkIIcc0aTij0HA1A2aFvALi95YVDSMleq0kI\nIeqbBhMKxjb9UfuHYss+hjU7jdtadgUg8exRymwWL1cnhBD1Q4MJBZVGi3+3ikNIh74hJiCEPlEt\nsDjsbDoth5CEEOJaNJhQAAjoUXEIKfl/AIxt0xOANSf2e60mIYSoTxpWKHS/DQBTSiJOq4k7W/dE\no1KTeOYo582lXq5OCCF8X4MKBW1INIaWvVFsZsoPf0ekXyDDmrXHrjj5Kl1OOAshxK9pUKEAENh7\nLACle9cAMKHtDQCsPi6HkIQQ4tc0vFDoOwGA0n3/RbHbGNWiCwFaPftyM0gvzvNydUII4dsaXCgY\nmnVBH9sZZ1kB5amJ+Gn1jK7osyB7C0IIcXUNLhQAAvtU7C0kfQHAXe1ch5BWHtuLw+n0Wl1CCOHr\nGmQoBN3oegZ06d61KE4Hg2Pa0iIwnLNlhWw5d8zL1QkhhO9qkKGgb94DXZO2OEpyMaX+gFql5v6O\nNwKw9MgeL1cnhBC+y+OhkJycjLOOD9moVCqC+t8DQPH2TwCY1K4vWpWajadTyC4vrtN6hBCivvBo\nKOzatYuBAwdit9svGb9582beeust3nzzTfbs8cwv9+DBUwAoSfoCp7mUJv5BjGjeGYfiZFXaXo+s\nUwgh6juPhsKAAQOIioq6ZJzD4eDpp59m5syZzJo1i2eeecYj69Y37YCx3SAUSxmle1cD8EDHfgB8\nduRHnIqccBZCiF+q83MKGRkZREZGuv/WarWkp6d7ZF3BQ1x7C0XbXIeQhsa2Jy4wlIzSfLacS/PI\nOoUQoj6r81DIysoiKCjI/XdQUBDZ2dkeWVfQjZNQaQ2YUr/HlncKjVrN5I79Afjg5+0eWacQQtRn\n2rpeYWRkJKWllTenKy0trXKI6YKEhAT3+/j4eOLj469rXZqAUAL7jqdk13KKfnifyLue54EO/fjX\ngc18f/YIxwpzaB/apDqbIYQQPiExMZHExMRaW55KURSl1pZ2Ga1btyY1NRW9Xk9eXh5RUVHcdNNN\nbN26FUVRGDp0KFu3bq1amEpFbZRmOrqN0y8NQxPchNavn0StMzB7xxo+PbKbKR37M3/Q+BqvQwgh\nfEVNvzs9evgoKSmJvLw8Nm7cyKFDh5gxYwYA8+fP5/XXX+eNN97gH//4hydLwNh+MIbmPXEU51D6\n4+cA/KHLYABWpe2jwFLu0fULIUR94vE9heqqrT0FgMLE98j5+BGM7QbS4m/bAJj87Ycknj3K7D63\nMaNHfK2sRwghvK3O9xQ2b97MiRMnqr1CbwgeeD9q/1DMaTsxpycB8FDXIQB8nLIDq8N+tY8LIUSj\ncdVQuPnmm0lJSWHt2rUsXryYlJQU+vbty3333VdX9dUKtSGAkKF/ACD/61cBGBbbno6h0WSVF/P5\n8X3eLE8IIXzGVUNh/PjxdO7cmTvuuIOtW7fSuXNngoOD
mTRpUl3VV2vCRj2GSqunNOkLrFlHUalU\nzOhxMwD/OfQDdqfDyxUKIYT3XTUUDAYD4Opg1r17d/f4gIAAz1blAdqwWIIGTQFFIf/r1wAY07o7\nLYMiOFVyXh7XKYQQ/MqJ5lGjRjF48GAURXHfx0hRFDZt2nTZy0hrtbBaPNF8gTXrKCf/2gXUWlq/\ndhxdWDM+O7qHp7avpmNoNBvH/Rm1qkHeOFYI0Uh49ESzVqt1L7x///4oioKiKISEhFR7hd6kb9qB\nwL53gcNGwf9eB2Bi297E+IdwpDCbbzNSvFyhEEJ411X3FHJzcy/b2zg7O5vo6GjPFuaBPQUA86kD\nZDzXB5XWQKtXjqILj+ODw9t5bvdXdAmP4Zs7Z8reghCi3vLonsKiRYt47bXXyM3NJTc3lzFjxjBg\nwACOHz9e7RV6m7FlLwL73Y1it5D/5QsA3N+hH039gzmcn8k6ObcghGjErhoKn3/+OY8++ihRUVHc\ne++9hISE8P7777N06dK6qs8jIsfPBbWGoi0fYs06hp9Wx2O9hgPw6v6N2ORKJCFEI3XVUJgwYQIB\nAQEsWbKEs2fP8tFHH9GtWzc6depUV/V5hD6mI8FDHgSng/NrEwC4p31fWgVFkF6cx8pj8hAeIUTj\ndNVQSEtL49FHH2XWrFm8++676HQ6TCYTn3zySV3V5zERY/+OSqunZNdyzKcOoFNreLL3SAD+eWAT\nJrvNyxUKIUTdu2oo/PWvf2XUqFFs376d+Ph4rFYr77//PpMnT66r+jxGF9GCkOF/BCD3s8dRFIUx\nrbvTJTyGrPJilqTu9HKFQghR96569VHLli1p27ZtlTPZaWlpnD592rOFeejqo4s5ygo4ObsTjpI8\nYv64nKB+d/Pd6VQe3PQxwXojW+/6PyKMgR6tQQghalNNvzuv+pCdu+++m6NHjzJx4kRatmwJgKIo\nfPPNN9VeoS/RBIQRMWEeOYv/SO6KpwjodQe3xHVkWGx7fjh3jNf2bZTnLQghGpVfvXW22Wxm9erV\nZGRk0Lt3b0aOHInFYnHfAsNjhdXBngKA4nRw6rm+WE8fImJ8AhFj/87RwmxuXftvFBS+uXMWXcJj\nPF6HEELUBo/fOttoNHL//fcze/ZsioqK6NmzJwsXLqz2Cn2NSq2hyQP/AiB//cvYzmfQITSaBzsN\nwKkoJOz+qk7CSQghfME1dd399ttvueOOO/jTn/7E7bffzvjxDeuQin+nYQTeOBHFaiLn01koisJf\nbhhBqMGfHVkn+N+pn71dohBC1ImrhsI777xDly5dePrpp5kwYQIZGRnMnz8fq9X6qwu22+3MmTOH\ntWvXMn/+/Et+bX/wwQesXr2aV155hf/9738134paEHX/G6iNQZTt/4rSvWsIM/jz5A23AjB3zzrK\nbBYvVyiEEJ531XMK/v7+DB48mCFDhrjH2e121q1bx/79+6+64AULFqDRaJg+fToLFy4kLCzM/RyG\noUOHsmXLFoqLi5k8eTJffvll1cLq6JzCxQo3vUPOpzPRhMbS6qWfwBjEmHX/4dD5s0zrOoTn+t1R\np/UIIcT18ujVRxs3bmTw4MFVxl8cEleye/du/vhHVz+Anj17smDBAncoREZG8tprrxEUFMRf/vKX\n6tTtESG3PEzxjk8xn9hN3hd/I3rKW7w8aAK/Wfc2Hxzezvg2vegRGeftMoUQwmOuGgqXCwRwPWfh\n12RlZREUFARAYGAg2dnZ7mnvvvsuI0aMIDg4mLVr115xGQkJCe738fHxxMfH/+p6a0Kl1hA99V1O\nJdxI0eYFBA+8n+7tBvJQl8Es+nkbT+9Yw1d3/BGtWuPROoQQ4lolJiaSmJhYa8v71UtSq+uBBx7g\nz3/+M/369WPXrl28/fbbfPrppwDcf//9fPjhh3z88cds2rSJzz//vGphXjh8dEHuqmcoWP8y+tjO\ntEhIwqRSccuaf3K2rJA5N/6G6d1u8kpdQgjxazx+SWp1jRo1ioMHDwJw6NAhRo4cSW5uLgCZmZkY\njUYeeeQR8vLyPFVCtUWM/Tu6ph2xnkvh/Oo5BOgMvDRwHAAv79tAWmGOlysUQgjP8FgoTJkyhYyM\nDFatWsWZM2fo1q0bM2bMAOCee+5h0aJFLF68mMcff9xTJVSbWu9H02kfgUpNwYY3MB3dxvDmnbi7\nXW8sDjuPbV2FXW6vLYRogDx2+KimvHn46IK8z/9G/rr56Jq0peXz+ylVaRix9p+cKyviyRtu5c8V\nz2AQQghf4bOHjxqC8LF/R9+8B7ac4+SunE2w3sgbQ+4G4J8HvuOn82e9XKEQQtQuCYWrUOsMxEz7\nGDQ6ir57h7LkDQyJbcfvOg/ErjiZ8cNy6dQmhGhQJBR+haFFTyLGPQdA1ntTsRdl82zf0XQIbUJa\nUS7P7vyvlysUQojaI6FwDcJ/8xR+neJxFGeT9d7vMKq1LIh/AKNGx+fH97HyWJK3SxRCiFohoXAN\nVGoNTR9egjowgvKfvqVgwz/pGBbNiwPHAvDsrv9ytDD7V5YihBC+T0LhGunCmtH0Dx8AkPf5M5hP\n/Mikdn24q+0NmOw2Hvl+qZxfEELUexIK1yHwhjGE3joTHHYyF9yP01zCSwPH0S4kiqOFOfx5y0qc\nitPbZQohRLVJKFynyEkvY2jRC1vuCbI/+AP+Wj0fDv8twXoj32T8zL8ObPZ2iUIIUW0SCtdJrTMQ\n88flqP2CKU1aTcH/XqNNSBT/GXYfapWKNw5s4uuTP3m7TCGEqBYJhWrQN21P0+lLAMhb9Qzlh7/j\n5riO/LXPbQA8tnUlh/PPebNEIYSoFgmFagq8YQzhY54FxUnmgvuxnc/gkW5DmdD2BsrtVqZ8+xFn\nSgu8XaYQQlwXCYUaiBj/HP7dRuIoyePc23ej2Cy8MmgCA5q2JttUwpRvP6LAUu7tMoUQ4ppJKNSA\nSq0h5pGlaCNbYUlPIvuj6Rg0Wj645bd0DI3mWFEOv9+0GJPd5u1ShRDimkgo1JAmMJxms1ajMgRQ\nsnMp+V+9RIjBj09G/p4Y/xB+zDnFjB+WYZNbbQsh6gEJhVpgaNGTmEeWgkrF+dVzKNmzktiAED4Z\nOZUQvZENGYeZ9cMKeQaDEMLnSSjUksAbxhB1z6uA68Z5puO76RTWlKUj/0CgzsBXJw/xxLbPcTil\nc5sQwnd5LRTMZjNLlizh+++/91YJtS501GOEDHsIxWbm3L/HYc05Tq+o5nx66+/x1+r54vh+nt6x\nWno9CyF8lsdCwW63M2fOHNauXcv8+fMveRJQUVERkyZNYujQodx8882eKqHOqVQqmkx5G/+uw3EU\n53D21duwF2bRN7oli2/9HUaNjuXHkuRxnkIIn+WxUHjvvfeIi4tj3LhxhIeHs2rVKve02bNnc++9\n99KqVStPrd5rVFodsTO+wNCqD7bcE5x9/XYc5UUMbNqGxbf+jgCtntXH9/Pw90sxy1VJQggf47FQ\n2L17N7169QK
gZ8+erF+/HnDtQSxdupTMzEwefPBBEhISPFWC16j9gmj2+Hp0TTtgOX2Qc/8eh9Nq\nZnBMW5bd9hAhej82ZBxm6neLKbdZvV2uEEK4aT214KysLIKCggAIDAwkO9v1vIHc3FyaN2/OE088\nAUC3bt2YNm0azZo1q7KMiwMjPj6e+Ph4T5Vb67TBUcT93zdkvDAE05EtZP5nErEzP6d3VAtWjZ7O\nA99+wNZzaUz65j0+GvFbovyCvF2yEKIeSkxMJDExsdaW57FQiIiIoKSkBIDS0lIiIyMBCA4ORq2u\n3EHp0KEDmZmZvxoK9ZEusiVxT37D6ZfiKTu4nnP/uYfYP62gS3gMX4x+hPu/fZ8Deae5c907LLl1\nKu1Dm3i7ZCFEPfPLH8xz586t0fI8dvho1KhRHDx4EIBDhw4xcuRIcnNzCQgIIDIyktLSUgBMJhPt\n27f3VBleZ2jWlbinvkUdEEbZ/i859869KHYbbUIi+fI3f6JnZBynSwsYt/4dtmce93a5QohGTqVc\nfFlQLVIUhTlz5tCjRw+Sk5MZN24cL7/8MitWrODHH39k+fLl9O3bF4D77ruvamEqFR4qzSvMJ/dy\n5pWROMsLCewznphHl6HS6jDZrcz8YQXfZPyMTq3h+QF3Mrljf2+XK4Sop2r63emxUKiphhYKAOb0\nJFcwmIoI6HWH67kMej8cTicvJf2PhT9vBeDe9n15YcBYjFqdlysWQtQ3Egr1jPnEj5x5/XacZfn4\ndRxK7J/XovEPAXB1btu+GrPDRs/IOBbe/ABxgWFerlgIUZ9IKNRDlrM/c+bV23AUnsPQ8gaaPfE1\n2mDXSeafz59j2uZPySjNJ8zgz7+H3sMtcR29XLEQor6QUKinbLknOfPaKGzZaeiadqDZ4+vQN2kL\nQIGlnJk/LCfx7FEA/tBlMH/tc5scThJC/CoJhXrMXpTN2ddGYzl9EHVgBM1mrcavwxAAnIqTBclb\neHXft9gVJ53DmvL2sPvoGBbt5aqFEL5MQqGec5iKyVxwH+WHvkGl1RP9+/cIHjTZPf1A7mlm/LCc\nkyXnMWi0zO4zit93HoxGLTe4FUJUJaHQACgOO7nLnqBw09sAhN/5NyLGPYeq4ou/zGZhzu4vWXFs\nLwC9o1o2Uq+VAAAaeklEQVTw+pCJ0tlNCFGFhEIDUrDpP+QufQwUJwE9RtN0+hI0geHu6ZtOpzB7\nxxqyyovRqzU81ms4j3Yfhk6t8WLVQghfIqHQwJQlbyDz3ck4y/LRRrYidsZKjK36uKcXW8288OPX\nfHZ0DwAdQ6N5YeBYBjZt462ShRA+REKhAbLlneLcfyZhSU9CpTUQNfnfhAx7CJVK5Z5n+7k0ntqx\nmlMl+QCMbdOTv/W9nZiAEG+VLYTwARIKDZTTaib3s79QlLgIgMC+dxH9u3cvOZxksttY+NMW3jr0\nPRaHHX+tnj/3vIWHug7BoPHYvQ6FED5MQqGBK97+CdmfzEAxl6INa0bTaR/j3+WWS+Y5XZLPvB/X\n879TPwPQPDCM/+s9kvFteqJWyVVKQjQmEgqNgDXnBFmLfos5bScAYbc9TsSEeaj1fpfMt+XsMebu\nWceRQtezK7qExzC7z23c3KzDJYeehBANl4RCI6E47OSvm8/5/z4PTge66PZET12If6dhl8zncDpZ\nfXw/r+7/lnNlRQAMaNqax3uNYGDTNhIOQjRwEgqNjOnEHrI/eAjrWdehopD4aUTe/Q80AaGXzGe2\n21icupM3D35PkdUEQL/oVjzWczg3xbaTcBCigZJQaIQUu5X89S9z/ssXwWFDExpDk/vfIPDGu6t8\n2RdZTHyYsp33f97uDocboprz5563cEtcRznnIEQDI6HQiFnOHib7o+nucw1+HYfRZPK/MDTvUWXe\nEquZxam7WPTTVvItZQC0C4nioS5DuKvdDfhp9XVauxDCM3w2FOx2O/PmzaN3796kpKQwe/bsKr9i\nJ06cyOuvv07Lli2rFiahcE0Up4OiLR9w/ou/4yjJA5WakJsfJnLCXDSBEVXmL7dZ+fTIbt77eRuZ\n5a5zDmEGf6Z07M+DnQcS7R9c15sghKhFPhsKCxYsQKPRMH36dBYuXEhYWBiTJk1yT1+zZg0fffQR\nb731loRCLXCUFXB+7VwKv3sHnA7UfiGE/+YpQm+didoQUGV+m9PB+pPJvPfzNg7mnQFAp9Zwe8tu\nPNCxn5yUFqKequl3p8cOKO/evZtevXoB0LNnT9avX++eduDAAVq0aEFERIR88dQSTUAYTR74Fy3n\n7cO/6wicpiLyPn+W9Kc6ULh5AYrddsn8OrWGcW16se6OP7Hm9ke4vWU3HIqT/6YfZNI37xG/5g0W\n/bSVAnOZl7ZICOENHguFrKwsgoKCAAgMDCQ723XtfEFBAWlpafTp47qfj+wN1C5DXDfintxAsyc3\nYGjdF0dRFjlLZnDyma4Ubf24SjioVCpujG7Folsms3Pi0/y55y1E+wdzvCiXeT+up8+Kl5jxw3J+\nOHsUh9Pppa0SQtQVj90LISIigpKSEgBKS0uJjIwE4Ouvv2bVqlUsXbqUffv2kZmZyYcffkhsbGyV\nZSQkJLjfx8fHEx8f76lyG5yAriPw7zKc0qTV5H3xd2xZR8j+4A+cXzuP8N88SfCQqaj1xks+0yww\nlCd7j+QvvYbz3elUPj2yh8SzR1l74gBrTxwg2i+IsW16clfb3nQJj5G9PCF8QGJiIomJibW2PI+d\nU1iyZAkWi4Vp06axaNEijEYjo0ePJioqyj3P1KlTmTt3Li1atKhamJxTqDWKw07Jrs/IX/cy1sxU\nADShMYSNepzQ+Gmo/YKu+NnTJfl8fnwfX6Tt52TJeff4jqHRTGh7A2Pb9CQuMMzj2yCEuDY+e6JZ\nURTmzJlDjx49SE5OZty4cbz88susWLHCPc/UqVNJSEiQE811RHE6KE1aTf5X87GcPgiA2i+Y4Jum\nEjpiBvomV779tqIo7Ms9zerj+/ky/SAFlnL3tJ6Rcdzeshu3t+pG6+BIj2+HEOLKfDYUakpCwXMU\nRaHs4NcUfP0qpqNbXSNVKgJ63UHYrbPw63zzVQ8NWR12Es8eZc3xA3x3JpVyu9U9rWt4TEVAdKdd\nSJQcYhKijkkoiBoxn9pP4bdvUrJ7OUrFl7s+tgshw/5A8KDJaIKu/svfZLfxw9mjrD+ZzMbTKZTa\nLO5prYIiGN68EyOad6J/dGv0cjtvITxOQkHUCntxDkXfL6Jw8wIcRVmukRodgb3HEjLsD/h3GeF+\nZvSVWBx2tp47xtcnf+Lb0ykUXnSIKUCrZ2iz9gyP68QtcZ1o4n/l8xhCiOqTUBC1SrHbKDu4nqIt\nH1B26BtQXJehaiNaEDxoMkED7sPQrMuvLsfhdLIvN4NNp1P57kwqqQVZl0zvGh7DkNj23BTbjv7R\nreQ2G0LUEgkF4TG2grMUb11M8dYPseWmu8frm/cgeMB9BPW/B11k1YsE
LudsaSHfnUnlu9OpbMtM\nw+KwVy5PraFPk5bcFNuOIbHt6BHRDK1aU+vbI0RjIKEgPE5xOjEd+YGSXcsp+fFznOWF7ml+HYYQ\n2PcuAnuPveaAMNlt7M05xdZzaWzLTONQ3lkUKv9bB+uN9ItuRb/o1vSPbkX3iGZyPkKIayShIOqU\n02ah/KcNlOxcRumBr1AqbscNYGh5A4G9xxHYeyz6uG7XfOVRgaWcHZnH2XYuja3n0i7pDwFg1Oi4\nIao5/aJb0T+6NX2atCBAZ6jV7RKioZBQEF7jNJVQeuArSveupSz5GxRL5X2SdE3aEnjDnQT0HI2x\n/RDU1/Elfqa0gF1Z6ezJPsme7HTSinIvma5RqekaHsMNUc3pFdmcXlHNaRsSKc+GEAIJBeEjnFYz\n5Yc3Ubrvv5Tt/wpHSeUXucoQgH/nmwnofhv+3UddtZPc5Zw3l/Jj9in2ZKezO/skP50/h0O59D5M\nQToDPSObVwRFHL2imsttwEWjJKEgfI7idGA6toOy/V9R9tO3WM8kXzJdF92egO6j8O86HL8OQ6s8\nSvTXlNksHMw7w4G8M+zPzeBA7hn3syEuFu0fTNfwGLqGx9I1PIYu4bG0Cg6XPQrRoEkoCJ9nKzhL\nefIGypI3UP7zpktOVKNSY2h5A/6d4vHrNAy/jjeh8bv+X/hZ5cUcyD3NgbzTHMg9w8G805Rc1JHu\nAn+tns5hTekaEUuX8Bg6hzWlXUgTQgx+NdlEIXyGhIKoVxSHHfOJPZQlb8CUmojp+G5wXHQ7b7UG\nY6s++HWKx7/TUIxtB173ngSAU3FyqiSfn/MzOXz+HD/nZ/Jz/jmyyosvO3+0fzDtQ5rQPrQJHUKb\n0K5iGGEMrO6mCuEVEgqiXnNayjAd24EpNZHylETM6T+C01E5g0qFPrYLfu0HYWw3EL/2g9E1aVvt\neyqdN5dyOD+Tn8+7QuJoYTZpRbmX9Ju4WLghgLYhkbQMiqBlcDitgiJpGRxO66AIQg3+cm8n4XMk\nFESD4jSVYDq2nfLURExHt2M5meS+J9MFmqAod0D4tR+IoWVv1PrqH/5xOJ2cKSvgWGFO5avINSy9\nzCGoC4L1RloGRdAqKIIWQeE0CwwlNiCE2IAQYgJCCdX7SWiIOiehIBo0p9WM5dQ+195E2g7Mx3Zc\ncmUTAGoNhmbdMLTui7F1H4ytb8QQ1x2VVlejdSuKQmZ5MenFeZwqOc/J4vOcKsnnVPF5Tpacv2pg\ngOv8hSskQolxh0UIUcZAovyCaOIXRKRfoHTME7VKQkE0KoqiYMs5junYdsxpOzGl7cJ69mf3PZou\nUGkNGFr0wti6L8Y2fTG06os+piOqWrp9hqIo5FvKOFl8npMl+WSUnOdcWRHnyorILCvkbGkhZb/Y\nw7mSEL0fTfyCiPILJMo/iChjIJF+QYQb/QnV+xNm9CfMUPmSEBFXI6EgGj2npQzLqf2Y05PcL1v2\nsSrzqQwBGJr3xNCiB4YWPTG06IWhWTfUBv9ar0lRFIqt5oqgKKx4FZFdXkyOqYQ8Uym5phLyzGVV\n+lz8Gn+tvjIkKgIjRO9HkN5IkM7oGuqNBOkMBOmNBOuMBOoNFUMjOrmvVIMmoSDEZTjKCrCc2of5\nRBLm9B8xpydhzz9ddUaVGn1MRwzNe7hCoiIstCHRdVKnU3FSYCknp7yUPHMJOaZScstLyDWXUmgp\no9BiosBSToG5nAJLOYWWcuzXGSK/ZNToCK4IjkCdAX+tvvKlcw0DdFXH+Wv1BGj1+Okq31+YZtTo\n5PyJj/DZULDb7cybN4/evXuTkpLC7Nmz3f9oli9fzttvv012djZLlixh4MCBVQuTUBC1zFGSh+X0\nQcynDmLJOIDl9EGs51IuvdqpgiY4GkNcN/TNuqBv1gVDbBf0zbqiCfDu86gVRaHEZqHQ4gqJC4FR\nZDVRajNTbDVTYjVTYrNQYjVXjrOZKbFaKLGZcXrg/ysVKgwaLUatDqNGi1Gjq3ivw6it+Ns9Tuue\nZtBo8bsw38XzXmWcQaPFoNGiU2skiC7DZ0NhwYIFaDQapk+fzsKFCwkLC2PSpEmYzWbWrVvHxIkT\nWbZsGZ988glff/111cIkFEQdcFrNWM8ddoVExkHX6/RBnKbL92fQhMZgaNYFfWxFWDTrij62a7X6\nUniDoiiU263u0CixmTHZrJTbrZTbbZTZLZTbrJgueu+aVvGyWSmzWzFV/F1WMf1Kl/R6mkGjxajR\noq8ICoNGh16twaDRYdBcGFZOvzDt0s9c9F59mXEXL/vCMtWVy9b8ysOn6prPhsLvfvc7/vjHP9Kv\nXz927drFggULWLx4MQ6HA7VajUqlIiUlhfnz57NkyZKqhUkoCC9RFAV73kksZ3/GevawKzQqhhff\nFfZimtAY9NEd0Ddtj66pa6iP7oCuSRtUjeABQg6nE4vDjtlhw2y3uYYOGyb7L8fZK9+7h3b3/Fce\nV/k5q8OOxWGv8WG02qJRqS8JEqNGi/4q4aJXu8ZdHDD6ivlcoXVxiF1Yhsb9uQthp7/ocwa1ax5t\nxd5TTb47PXYZQ1ZWFkFBrkcuBgYGkp2dDYBGU3mSa+vWrTz11FOeKkGIalGpVOiiWqOLag297nCP\nV5xObHknsZ79Geu5lMrQyEzBUZiJqTAT05EffrEwNbrIVuiatkfftIMrKJq2Rx/dDm1YXI0vm/UV\nGrUaf7XrHENdcTidWJ2ugLjwuhAYFudlxrlfNqwOh2vodGD+5biKocVZMbxoGRemmR1297oditO9\nJ+Vtmlq4r5fHQiEiIoKSkhIASktLiYy89AHw6enptGzZkm7dul1xGQkJCe738fHxxMfHe6JUIa6J\nSq1G36SN6y6vN4xxj1ecDuznM7BmH8OadQxb1lGsWUexZadhyzuJLfcEttwTlCdvuHSBag3asLiK\nAGqFLvLCsBXaqNZoQ2J+9bnYjZlGrcZPrff6o1ztTscvQqciMOyXBpT1MsFlrfjbetEyrBWfdwVQ\nZfhcPJ/1ovEFyWkUHz6BQ3FSGwdXPHb4aMmSJVgsFqZNm8aiRYswGo2MHj2aqKgocnJy2Lt3L6NH\nj8ZsNlNSUkJUVNSlhcnhI9EAOG0WVyhkHcWadawiLI5hyz2BveAsV/u/WKU1oI1s6drTiGiBNjwO\nbVgz1zC8ObqwONR+QXW4NcLX2Z0OdBqtb55TUBSFOXPm0KNHD5KTkxk3bhwvv/wyH3/8McOHD3fv\nRajVavbv34/6F7+IJBREQ+e0WbCfz8CWl44tNx1b3insuekVexfpVXtuX4baLxhtWFxFUMS59jzC\nm6ENa4YmpCnakKZogqJQSYe3RsNnTzTXlISCaOyc5lJs509hy03Hnn8ae/5Z7AVnsOWfwZ5/Bnv+\naRSb+dcXpFKhCYqqCIloV1BcCIzgJpf8rQ4Ik8s86zkJBSEaKUVRcJblY8+vCIqCC2FxBnvhWexF\nOTiKsnCU5l31MNUlNFo0ARF
ogiLRBEaiCYqoGF7p70hUhgAJEh8ioSCEuCrFbsNRkou9KAtHcbZr\nWOQaXvzeUZR1xf4ZV6PSGtAERaIOjEDjH4Y6IBSNfyjqgDDX3/6haAJCUfuHXTR0ja/J3W3F5Uko\nCCFqjWK34ijJw1Gad9Hw/C/+zsNRet41LMm9tkNYV6DSGirCI7Ry6BfsehmDL3ofWHW8Mcg9rbZu\ndNgQSCgIIbzKaSl3h4WzvBBHWYFrWF6Is6zANSyvGJZVTneWF1R5VkZ1qYyBqI3BaPyCURmD0PgF\no/YLQm0Mdk0zBKI2BLgCxODv/ltlCPjFtADXeH39fYCShIIQol5SFAXFaqoIkAJXgJQV4DSX4DQV\nVw4vfm8uxmlyvXeYilHMJTjNJbVfnEqFSv+L8DAGojb4o7oQIhdP0/uh0vmhMvih1vmh0vu5Pn/x\nUOfnCqQL03VGjwSPz/ZoFkKIq1GpVBW/2v3RhsVWezmK04nTUloZIBeHx4WhpQzFUobTUobTUnrR\n+4vGm0tRrBVDmxnFUobDUkbV2yXWHlVFWKgN/q7Q0Puh1vu7hjqja49F71c1XPQV4aMzuMKl4qXW\nG2tck4SCEKJeU6nVaPxch45qi+J0oFjKXWHjDo7SyiAxX3hf7goSqwnFWu7a87kwtJlQLCYUm8k9\n3Wl1/a1YylHsFhSryTVvWX6t1V5TEgpCCPELKrUGlV+QR3uMK06na4/korBwWspdoXFRuFSON1eG\nz4X3NnPFy4TTZkaxmoHNNapLzikIIUQDUtPvTrnblhBCCDcJBSGEEG4SCkIIIdwkFIQQQrhJKAgh\nhHCTUBBCCOEmoSCEEMJNQkEIIYSbx0LBbrczZ84c1q5dy/z58y/pTLF582beeust3nzzTfbs2eOp\nEhqMxMREb5fgM6QtKklbVJK2qD0eC4X33nuPuLg4xo0bR3h4OKtWrQLA4XDw9NNPM3PmTGbNmsUz\nzzzjqRIaDPkHX0naopK0RSVpi9rjsVDYvXs3vXr1AqBnz56sX78egIyMDCIjI93zabVa0tPTPVWG\nEEKI6+CxUMjKyiIoyHUzqcDAQLKzs6uMBwgKCnJPE0II4V0eu0tqREQEJSWuh1+Ulpa69w4iIiIo\nLS11z1daWkpUVFSVz7dt27bePvnIE+bOnevtEnyGtEUlaYtK0hYubdu2rdHnPRYKo0aN4uDBg/Tr\n149Dhw4xcuRIcnNz6dChgzssFEWhtLT0shuRlpbmqdKEEEJcgcduna0oCnPmzKFHjx4kJyczbtw4\nXn75ZVasWMG2bdvYvXs3AAMGDGDw4MGeKEEIIcR18tnnKQghhKh70nlNiHogKSmJZcuWcfr0aW+X\nIrwoOTkZp9Pp0XVoEhISEjy6hutkt9tJSEigqKiINWvWMGTIkEZxwrm4uJipU6fy1FNPceDAAe64\n444q7eBwOBpN2+zdu5fnn3+e0aNHN+p2AFi8eDE7d+5k1qxZBAQENNr2yMzM5N133yUnJ4f33nuP\nW265hblz5zaatti1axfDhg3jySefRFGUa/p3UJ328Lk9hSt1emvoNm7cyEcffURqaip79+7lpZde\nqtIO77//fqNom8LCQhITEzGbzZf999BY2gFc/Xr++c9/8vzzzwOX//+jsbTHZ599RseOHRk/fjwq\nlYpXX321UbXFgAEDiIqKQlGUa/53UJ328LlQuFKnt4buzjvvxGAwoNPp6NKlC0ePHq3SDrt27WoU\nbfPFF19w1113AVx2mxtLOwAsW7aMiIgIXnrpJUaNGsXOnTsbbXsMGzaMefPmsWXLFnQ6HYcPH260\nbXG578na+n/FY5ekVteVOr01dDqdDgCz2UxcXBw//fQTgYGBwKUd/C6Ma6ht8/nnnzN+/HiKi4tR\nFIWsrKxG2Q4XpKen8+ijjzJx4kTatWvH+++/7/7/o7G1R9++fRkzZgz3338/L7zwAgcOHGi0bXHx\n9+Tltr0m7eFzoXClTm+NxcqVK5k7dy7Tp093d/IrKSkhMjLS3a8DGm7bfPzxxyxdupTy8nJSU1PJ\ny8ujrKwMaFztcEFISIj7fceOHdHpdO7/PxpbeyQmJhIcHMy+ffsYOXIkbdu2bbRtcfH35OW2vSbt\n4XOHjy50egM4dOgQo0aN8nJFdWf9+vXcfvvtBAQEVGmHkSNHNoq2WbduHWvWrGHRokXccsstvPvu\nuxw4cABoXO1wwfDhw9m/fz8ABQUFpKSkNMp/FwD79u2jffv2NGnShKlTpzJ27NhG2xbX8v1Q3fbw\nuVCYMmUKGRkZrFq1ijNnzjB58mRvl1Qnli1bxqOPPsrNN99Mly5dKCsru6QdpkyZ0qjaRlEUVCoV\nkydPbtTtMHLkSCwWC5988gk7d+5k7969jbY9Jk+ezObNm1m7di2FhYVMnDixUbVFUlISubm5bNy4\nscp2Xm7bq9se0nlNCCGEm8/tKQghhPAeCQUhhBBuEgpCCCHcJBSEEEK4SSgIIYRwk1AQQgjhJqEg\nRC0qKytj0aJFHD582NulCFEtPnebCyHq2rp167jzzjv597//jU6n48SJEzRv3pyZM2de97ICAgLY\ntGkTrVq1okuXLh6oVgjPklAQjd4dd9wBcEkIpKSkVHt5AQEBSJ9QUV/J4SMhfmHlypUcPXqU4cOH\n8+KLL9KjRw9++OEHLBYLr7zyCl9++SXTpk0jJycHp9PJggULWLlyJePHj8fhcADw3XffMWXKFJ59\n9lkvb40Q10f2FISo8Oabb2K1Wlm/fj0ff/wxhYWFPPvsswwaNIiEhAQmTJhAixYtuPPOO7FYLMyd\nO5d+/frRs2dPBg0ahMPhcIfCkCFDiI+Pp1+/frz44ote3jIhrp2EghAVZs2aBcBvfvMbFEVBr9cD\n0Lt3b/Ly8khNTaVp06YA9OjRg4ULF6JWq+nbty8A9913n3tZRqMRvV6PxWKp460Qombk8JEQv9C5\nc2fWrFnj/js7O5v+/fvTvXt3kpKSADh//jy9evUiLi6OTz/9FIDjx49TUFAAIOcURL0lewqi0fvy\nyy9RqVS89NJLhIaGcvz4cQ4ePEhWVhYfffQRGRkZvPjii4SHhzNr1iyWLVvG6dOnefbZZ1GpVIwf\nP55+/frx0EMPcf/993Ps2DEOHTpEWFgYhYWFnDt3jtjYWG9vphDXRG6dLcRlnDp1invvvZedO3d6\nuxQh6pQcPhLiMpKSksjOziY9Pd3bpQhRp2RPQQghhJvsKQghhHCTUBBCCOEmoSCEEMJNQkEIIYSb\nhIIQQgi3/weTlF3S8CpwJQAAAABJRU5ErkJggg==\n", "text": [ "" ] } ], "prompt_number": 151 }, { "cell_type": "markdown", "metadata": {}, "source": [ "#Support Vector Machines: Background and Visual Intuition\n", "\n", "Assume 
your data set is $D = \\{(x_1,y_1),\\dots,(x_n,y_n)\\}$, where $x_i \\in \\mathbb{R}^k$ and $y_i \\in \\{1,-1\\}$. A support vector machine attempts to find a plane in $\\mathbb{R}^k$, that is, a \"hyperplane\", that separates the data of the two classes. While there may exist an infinite number of planes that achieve this goal (or perhaps none), intuitively we want the one which maximizes the distance between the separating hyperplane and the closest vectors, termed the \"support vectors\". The hope is that this will allow the model to generalize well. With the hyperplane written as below, this margin works out to be $2/||w||$, so maximizing it is equivalent to minimizing $w^Tw$, and the whole condition can be codified as a quadratic programming (optimization) problem. \n", "\n", "Denote an arbitrary hyperplane in $\\mathbb{R}^k$ by $w^Tx+b = 0$. The support vector machine is the solution to the constrained optimization problem:\n", "\n", "$$\\text{minimize} \\quad w^Tw$$\n", "\n", "subject to the constraints\n", "\n", "$$w^Tx_i+b \\geq 1 \\quad \\text{if } y_i = 1,$$\n", "$$w^Tx_i+b \\leq -1 \\quad \\text{if } y_i = -1.$$\n", "\n", "Note that this method is useful only in the case that the data are linearly separable to begin with. In the case they are not, we apply a \"kernel\" function to the data which warps the points with the hope that they become linearly separable. There are a variety of kernels that one could utilize to achieve this goal, and scikit-learn supports a number of them. \n", "\n", "In the following example, we visualize a small mock data set to gain some intuition before applying the SVM method of classification. " ] }, { "cell_type": "code", "collapsed": false, "input": [ "#read in svm data\n", "svm_dat = numpy.genfromtxt('svm.csv',delimiter = ',')\n", "class1 = svm_dat[0:5]\n", "class2 = svm_dat[5:10]\n", "plt.figure()\n", "plt.scatter(class1[:,0],class1[:,1], c = 'blue', marker = 'o')\n", "plt.scatter(class2[:,0],class2[:,1], c = 'red', marker = 'x')" ], "language": "python", "metadata": {}, "outputs": [ { "metadata": {}, "output_type": "pyout", "prompt_number": 152, "text": [ "" ] }, { "metadata": {}, "output_type": "display_data", "png": 
"iVBORw0KGgoAAAANSUhEUgAAAXUAAAEACAYAAABMEua6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAD6NJREFUeJzt3X9s1PUdx/FXaWn50eKktMG0JJjShVkGJeNHUcRbCJw4\nR5qtiBk08heYzC1LDH9gSEXEwEIi/8x/ILNbAlHsktXNUpDozoxEEBiUgrmQdgknkZaOaOi13sH1\nvvvjhOOXtNd+e9/r+56P5OL3yPV7by9fn375wPe+OY7jOAIAmDDO6wEAAO4h6gBgCFEHAEOIOgAY\nQtQBwBCiDgCGjDjq7e3tisfjbswCABihEUX9+PHjWrJkiWKxmGKxmBoaGtTc3KydO3eKv/4OAOk3\noqjX1NSopKREjuNo3759Ki8vV21traZOnaqmpia3ZgQADJFra+onTpxQdXW1JGnevHlqaWlxa9cA\ngCFyLepdXV0qKiqSJBUWFqq7u9utXQMAhijPrR0VFxert7dXkhQOhzVt2rT7XjNr1ix1dna69ZYA\nkBUqKirU0dExpNe6dqbu9/vV1tYmSTp37pz8fv99r+ns7JTjODwcR6+//rrnM2TKg8+Cz4LP4uGP\nVE6GRxT1U6dOqaenR0ePHlV9fb1CoZCampp0+fJlrV+/fiS7BgAMw4iWXxYsWKBwOHz7+ZtvvilJ\nWrNmzcimAgAMC1eUesTn83k9Qsbgs0jis0jisxieHMdx0naVUE5OjtL4dgBgQirt5EwdAAwh6gAy\n11dfSb//vXTzZuL5kSPSO+94O1OGI+oAMldJidTZKa1bJ7W0SPX10vz5Xk+V0VhTB5DZIhGpsFAa\nGJA++0xatszridKONXUAdnz2mTR5svSjH0l/+lNyKQYPRNQBZK6OjsSSS2urdOWK1Ncnvfaa11Nl\nNJZfAGS2r76SZsxIbEci0vXrUmmptzOlWSrtJOoAkOFYUweySTSa3I7FEo9Mcu8aOGvio4qoA2PZ\n559LixZJPT2JmNfXS3v2eD1VUk+PVFUlBYOJ52+/LW3Y4OlI1rn2feoAPFBTI/3yl9Izz0hlZVJu\nrvS733k9VVJJibR1q7R8ufSLX0iffir9619eT2Uaa+rAWBeLSePHJ7ZDoeQfKmaS6mqprU36+GNp\nxQqvpxlzWFMHssWtJZeVK6VXX02cDff0eD3V3d5+WwqHpR07Eksvt5ZiMCpYfgHGsv/8J/F3tz/8\nUCookCZMkA4elF55xevJEnp6pPffTyy5zJiReOzeLf35z15PZhbLL8BY5zhSTk5yW0o+zwR3zveg\n5xhUKu3kTB0Y6+4MZCbG8t6ZMnFGQ1hTBwBDiDoAGELUAcAQog4AhhB1ADCEqAOAIUQdAAwh6gBg\nCFEHAEOIOgAYQtQBwBCiDgCGEHUAMISoA4AhRB0ADCHqAGAIUQcAQ4g6ABhC1AHAEKIOAIYQdQAw\nhKgDgCFEHQAMIeoAYAhRBwBDiDoAGELUAcAQog4AhhB1ADBkVKMeDodHc/cAgHu4HvXt27ersrJS\nVVVVRB0A0izPzZ319fUpEono/PnzKigocHPXAIAhcPVM/eLFizpz5ozKysrU2Njo5q4BAEOQ4ziO\n4/ZOg8Ggli9frtOnT2v69OnJN8vJ0Si8HQCYlko7XV1+uWX27Nmqq6vTpUuX7oq6JG3btu32ts/n\nk8/nG40RAGDMCgQCCgQCw/pZV8/Uo9Ho7bX0l19+Wbt371ZRUVHyzThTB4CUpdJOV6O+efNmXbp0\nSatXr9bMmTO1dOnSYQ8GAEjwLOqDvhlRB4CUpdJOrigFAEOIOgAYQtQBwBCiDgCGEHUAMISoA4Ah\nRB0ADCHqAGAIUQcAQ4g6ABhC1AHAEKIOAIYQdQAwhKgDgCFEHUD6vPuu9Ne/JrYHBqRXXpG+/NLb\nmYwZldvZAcADPfmktHy5FI9Ln34qXbkizZzp9VSmcJMMAOl14YI0Z05iu69PmjTJ23nGAG6SASAz\nDQxIu3ZJZWXSo49Kf/ub1xOZw/ILgPT54x8TSy4XL0qhUGIppqpK+tnPvJ7MDJZfAKRPb6+Um5tc\ncunulkpLpZwcb+fKcNx4GgAMYU0dALIUUQcAQ4g6ABhC1AHAEKIOAIYQdQAwhKgDY93AwIO3kZWI\nOjCWvfeeVFcn3bghRSLS6tXS3//u9VTwEBcfAWPZjRvSmjXSzZuJx6OPSgcOSOPHez0ZXMQVpUA2\n6e2VpkxJbPOthyYRdSBbRCLSr3+dODO/eVPKz5cOHkz8E2bwNQFAtvjwQ2nyZKmpKbGW7jjSoUNe\nTwUPcaYOjHXxuDRu3P3bMIMzdSCb3Blxgp71OAIAwBCiDgCGEHUAMISoA4AhRB0ADCHqAGAIUQcA\nQ4g6ABhC1AHAEKIOAIYQdQAwhKgDgCGuRj0Wi6mhoUHNzc3auXMn38gIAGnmatT37dun8vJy1dbW\naurUqWpqanJz9wCAQbga9RMnTqi6ulqSNG/ePLW0tLi5ewDAIFyNeldXl4qKiiRJhYWF6u7udnP3\nAIBBuBr14uJi9fb2SpLC4bCmTZvm5u4BAIPIc3Nnfr9fbW1tWrRokc6dOye/33/fa7Zt23Z72+fz\nyefzuTkCAIx5gUBAgUBgWD/r6j1KHcdRQ0OD5s6dq/b2dr3xxhvKyclJvhn3KAWAlKXSTm48DQAZ\njhtPA0CWIuoAYAhRBwBDiDoAGELUAcAQog4AhhB1ADCEqAMu6OuTBgakaDTxT8ArRB0Yob4+6Z//\nlAoLpTlzpP/9z+uJkM2IugF9fdKpU9Lhw1J/v9fTZB/HkV57TYpEpI4O6Z13OFuHd4j6GDcwILW0\nSAsXSqtWSa++Kn3/RZlIE8eRvr+NgCSppkbKzfVuHmQ3V7+lEel344b0738nn3/+uRSPezdPNioq\nkvbvl/7xD2nmTOmnP/V6ImQzvtDLgM5O6cknpW++kd57L3HGPmmS11MBcAvf0phlbtyQxn2/kBaN\nSpMnezsPAHcRdQAwhK/eBYAsRdQBwBCiDgCGEHUAMISoA4AhRB0ADCHqAGAIUQcAQ4g6ABhC1AHA\nEKIOAIYQdQAwhKgDgCFEHQAMIeoAYAhRBwBDiDoAGELUPTAwIEUiXk8BwCKinmZ9fdKBA9KWLdLX\nX0uxmNcTAbCEe5Sm2ccfS35/YvvHP5ba2qQJE7ydCUBm4x6lGayrK7l99aqUm+vdLADsIeppVlcn\n/eY30ty50sGD0s2bXk8EwBKWXzxw/brkOFJ+vjRxotfTAMh0qbSTqANAhmNNHQCyFFEHAEOIOgAY\nQtQBwBCiDgCGEHUAMISoA4AhRB0ADBm1qMfjcfX394/W7gEAD+B61Ddu3KjKykotXrxYeXl5bu8e\nAPAQrkY9FAqpoqJCwWBQJ0+eVH5+vpu7BwAMwtWoB4NBHT58WOXl5WptbXVz1wCAIRiVL/Q6duyY\n1q5dq1AopNw7vjCcL/QCgNSl0s5hLXofOXJEu3btuu/X
9+7dq8rKSi1dulQLFizQtWvXVFpaetdr\ntm3bdnvb5/PJ5/MNZwRkme++k6LRxF2iuFMUrAsEAgoEAsP6WVfP1KPRqAoKCiRJ69at04EDB+5+\nM87UMQx9fVJjo9TSIv32t9LPfy5Nnuz1VED6ePZ96i+++KIeeeQR1dTUaOHChZozZ86wBwNu+fJL\nqaoqsZ2XJ33zjVRY6O1MQDpxkwyYEgxKP/lJYjs/PxH1SZO8nQlIJ26SAVNmzJDefVf61a+kQ4e8\nngbIbJypY0yIRBJ/UDpxYuJsHcgmLL8AgCEsvwBAliLqAGAIUQcAQ4g6ABhC1AHAEKIOAIYQdQAw\nhKgDgCFEHQAMIeoAYAhRBwBDiDoAGELUAcAQog4AhhB1ADCEqAOAIUQdAAwh6gBgCFEHAEOIOgAY\nQtQBwBCiDgCGEHUAMCTP6wGATDEwIPX3J7YnTJDGj/d2HmA4OFMHJMXj0tdfS888Iy1ZIv33v1Is\n5vVUQOqIOiApHJa2bJHOnJEuXJD+8IfErwFjDVEHJOXlSY8/nnz++ONSbq538wDDleM4jpO2N8vJ\nURrfDkjJd99Jf/mLFIlImzZJkyZ5PRGQkEo7iTpwB8dJPMbxe1hkkFTayd9+Ae6Qk5N4AGMV5yMA\nYAhRBwBDiDoAGELUAcAQog4AhhB1ADCEqAOAIUQdAAwh6gBgCFEHAEOIOgAYQtQBwBCiDgCGEHUA\nMGREUT979qxbcwAAXDDsqH/00Ud67rnnbj+/fv26tm7dqubmZu3Zs8eV4SwLBAJej5Ax+CyS+CyS\n+CyGZ9hRf/755++6E8dbb72lZcuWqba2VlevXtUXX3zhyoBWccAm8Vkk8Vkk8VkMj2tr6idOnFB1\ndbUkad68eWppaXFr1wCAIXIt6l1dXSoqKpIkFRUVqbu7261dAwCG6KE3nj5y5Ih27dp136/v3btX\nlZWVeuyxx3TlyhVJ0lNPPaXm5maVlJTo/fff1/nz57Vjx467fm7WrFnq7Ox0+V8BAGyrqKhQR0fH\nkF770BtP+/1++f3+Ie3I7/fr7NmzWrFihdrb2/Xss8/e95qhDgUAGJ5hL78cOnRI3377rU6fPi1J\n2rx5sz755BN98MEHmjJlip5++mnXhgQADM1Dl18AAGNLWq8o/aGLleLxuPr7+9M5iue4cOuHZePx\ngB8WDoe9HsET7e3tisfjKf9c2qJ+78VKkrRx40ZVVlZq8eLFyst76PK+KVy49WDZejxIUiwWU0ND\ng5qbm7Vz505l+2+gt2/frsrKSlVVVWVl1I8fP64lS5YoFoulfGykLer3XqwUCoVUUVGhYDCokydP\nKj8/P12jeI4Lt+6XzceDJO3bt0/l5eWqra3V1KlT1dTU5PVInunr61MkEtH58+d14cIFTZ8+3euR\n0q6mpkYlJSVyHCflY8OzL/QKBoM6fPiwysvL1dra6tUYGYELtzgeOAaSLl68qDNnzqisrEyNjY1e\nj+O5VI8Nz36Pu3LlSq1cuVLHjh3T2rVrFQqFlJub69U4nuLCLY6HO4+BwsLCrDwGbpk/f75aW1sV\nDAa1fPlyrVq1KivP1m9J9dhwNeqDXaz0IEuXLtWCBQt07do1lZaWujmOp1L5LIqLixUOhzVx4kT1\n9vZq2rRp6Roz7Qb7XKweD4MpLi5Wb2+vpMQfDFo+BoZq9uzZqqur06VLl7I66qkeG65GPZWLlSKR\niCZMmCAp8X8fa/8Bu33hlhU/9LlEo9Hb2xaPh8H4/X61tbVp0aJFOnfu3JCPHYui0agKCgpubz/x\nxBMeT+StVI+NtK2p33ux0oYNG7Rp0yY1NjZqy5Yt6RojI3Dh1v1eeumlrD0eJKm+vl6hUEhNTU26\nfPmy1q9f7/VIntm6dateeOEF7d+/X+vXr7+99JBNTp06pZ6eHh09ejTlY4OLjwDAEG5nBwCGEHUA\nMISoA4AhRB0ADCHqAGAIUQcAQ4g6ABhC1AHAkP8DBF5k8LSpvS4AAAAASUVORK5CYII=\n", "text": [ "" ] } ], "prompt_number": 152 }, { "cell_type": "markdown", "metadata": {}, "source": [ "Can you guess what the support vectors are in this data set? (Hopefully it is easy to see.) Let's actually fit a SVM to check our suspicions." 
] }, { "cell_type": "code", "collapsed": false, "input": [ "from sklearn import svm\n", "X = np.column_stack((svm_dat[:,0],svm_dat[:,1]))\n", "y = svm_dat[:,2]\n", "clf = svm.SVC(kernel='linear')\n", "clf.fit(X, y)\n", "clf.support_vectors_\n", "plt.scatter(class1[:,0],class1[:,1], c = 'blue', marker = 'o',s=20)\n", "plt.scatter(class2[:,0],class2[:,1], c = 'red', marker = 'x',s=20)\n", "plt.scatter(clf.support_vectors_[0,0],clf.support_vectors_[0,1],c='blue',marker = '*',s=200)\n", "plt.scatter(clf.support_vectors_[1,0],clf.support_vectors_[1,1],c='red',marker = '<', s = 100)\n", "#Get the separating hyperplane\n", "w = clf.coef_[0]\n", "a = -w[0] / w[1]\n", "xx = np.linspace(-15, 10)\n", "yy = a * xx - (clf.intercept_[0]) / w[1]\n", "t = numpy.linspace(-15,10,100) # 100 linearly spaced numbers\n", "plt.plot(xx,yy)" ], "language": "python", "metadata": {}, "outputs": [ { "metadata": {}, "output_type": "pyout", "prompt_number": 155, "text": [ "[]" ] }, { "metadata": {}, "output_type": "display_data", "png": "iVBORw0KGgoAAAANSUhEUgAAAXUAAAEACAYAAABMEua6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAHUtJREFUeJzt3XlwFHX+//Hn5CBAEo4AghIViEEMLuEmyBUFCS4CUUHw\n4Fv79aD0tyi7fhfFo4BQnqsLq+BaKypV67EBFKOIHOsRAgjhkCMcMQqaAGvAGI5MyEyOmd8fI1e4\ncvRMp3tejyrKmZhMv+nRF81nuvvl8Hq9XkRExBZCzB5ARESMo1AXEbERhbqIiI0o1EVEbEShLiJi\nIwp1EREbqXeo5+Tk4PF4zvm60+ms70uLiEgt1SvUN2zYQP/+/amsrARg1qxZxMfH07VrV4W6iIgJ\nwurzw0lJSbRp0waA0tJSXC4XO3fuJCIiwpDhRESkdgxbU8/Ly2Pr1q20b9+eBQsWGPWyIiJSC476\n3iagY8eOfPfddzRq1AiA3Nxchg4dypYtW2jXrp0hQ4qISM3Ua/nlfLp06cLYsWPJz88/J9SvueYa\n9u7da/QmRURsLS4ujh9++KFG32vI8ovX68Xlcp167na7SUhIOOf79u7di9frteyvGTNmmD6D5jd/\njmCc38qz22H+2hwM1+tIffPmzRQVFbFq1SqysrLIz89n9OjR3HvvvURHR9fnpUVEpA7qFeq9e/em\npKQEgFGjRhkykIiI1J2uKK2F5ORks0eoF81vLivPb+XZwfrz10a9z36p1cYcDgK4ORERW6hNdupI\nXUTERhTqIiI2olAXEbERhbqIiI0o1EVEbEShLiJiIwp1EREbUaiLiNiIQl1ExEYU6iIiNqJQFxGx\nEYW6iIiNKNRFRGxEoS4iYiMKdRERG1Goi4jYiEJdRMRGFOoiIjaiUBcRsRGFuoiIjSjURURsRKEu\nImIjCvUaenbT57y2/SuqPB6zRxERuaAwswewgn3HivjnzjV48ZJ5MI9XB93JldExZo8lInIOh9fr\n9QZsYw4HAdycobIOfs+f1yziUFkJ0eERPNs/lds7dcfhcJg9mojYXG2yU6FeC8WuUh5ft4QVBbsA\nGNMxkef7p9I8oonJk4mInSnU/cjr9ZL+/WamZ39KWWUFV0Q257XB40lq18ns0UTEphTqAbDvWBGP\nZKWzvegADhz8v98N4f96DKNRqD6mEBFjKdQDpMJTxd+3fcncHV/j8Xrp1qo9c4dMIK55G7NHExEb\nUagH2MZDP/FoVjoHnEdpEhbOjL63ck/nvvoQVUQMoVA3wfFyF89s+IQle7cCkHJVAn8dcDutGkeZ\nPJmIWJ1C3UQZ+7bx1PoMjpe7uKxJNH8bOJYbY681eywRsTCFuskOOI8wJWsR2Yd+BOC+627gqd63\n0Dgs3OTJRMSKAhrqOTk5dO3alZCQS99xIFhCHaDK4+GNnVm88u0qKr0erm3RlrlDJpAQc7nZo4mI\nxQQs1Dds2MCwYcMoLi4mJCSEWbNm0bNnT/bs2cO0adPO+aAwmEL9pO1FB3hkdTr7jhfRKCSUJ3uP\n4P6EAYQ4dNsdEamZ2mRnvZIlKSmJNm3a4PV6mT9/PrGxsaSmphITE8PixYvr89K2kdg6lhWjH+Xe\na/tR7qkibeMy7l21gMITx80eTURsyLDDxezsbLp37w5AYmIiy5YtM+qlLa9peCNevOE23r5pIjER\nkWT993uGZfyd5fk7zR5NRGzGsFAvLCwkOjoagKioKA4dOmTUS9tGytVd+U/qFIa078xR9wke/Oo9\n/rL2Q0or3GaPJiI2YViot2rVipKSEgCcTietW7c26qVtpW3TZrx38/8yq98oIkLDSP9+MymfvMbW\nX/abPZqI2IBhNypJSUlh+/bt9O3blx07dpCSknLe75s5c+apx8nJySQnJxs1gmU4HA7uSxjADZfH\nMXl1OrlHCkld9gaPdR/K5G43ElqDM4lExL4yMzPJzMys08/W6+yXzZs3k5ycTHp6OiNHjmT69Ol0\n69aNnJwc0tLSdPZLDbgqK3jp25XM37UWgL5tO6iEQ0TOoouPLKh6Ccdz/VO5Pa6H2WOJSAOgULeo\nc0o4OiXyfJJKOESCnULdwqqXcLSPbMGrg+9UCYeYz+2GiIgLPxe/CdjFR2I8h8PBXZ37sHL0FBJb\nx3Kw9Cjjls/npS0rqfBUmT2eBKviYkhIgO3bfc9feQXuvdfcmeS8dKTegFUv4UhsHcvcwePppBIO\nMcOHH8LkyXDzzbB+PWRmQmys2VMFBS2/2IxKOKTBSEqC7Gz47DMYOdLsaYKGll9spm/bDqwa8ydu\nj+tBWWUF0775mAe+epdiV6nZo0kweeUVKCqCOXPg/vtPL8VIg6IjdYupXsIxZ9A4hrTvbPZYYnfF\nxTB6NKSn+5ZcPvwQPv8c3nnH7MmCgpZfbK56Ccf9CQN4stcIlXBYndcLZy6pVX9utoY+n41p+cXm\nYqNasmjEg0zrNYIwRwhv717HyKXz2FNcaPZ
oUlf790Nysm95A+Dtt+Gxx/y7zUWL4NFHfeFcVQV/\n+AOsWnXh768e4Ar0BklH6hZ3ZglHRGgYT/YawX0JN6iEw2q8XnjmGVi6FO65B+bNg6++gvh4/23z\n2DEYPhx69/Y9LiyETz+Fpk39t02pEy2/BJkTFeXM2rSM977LBmDwFfHMHjSOdk2bmTyZ1IrXC5df\nDocOwTffQP/+/t9mcTG0auV77HRCZKT/tym1plAPUivzdzF13RKK3aW0jGjKywPuYMTVXc0eS2rq\n7bdh5kxfmOfm+o7U/XkL65NLLrm5vqtDhwyB117TskoDpDX1IHVmCccR9wke+Opdpq77SCUcVrB/\nPzz/vC/IFy6EUaPg6af9u81334Wff4bVq2HNGti40bf8I5amI3Ub8ng9LNiznuc3L8ddVUnHZq2Z\nO3g83dtcafZocjFn3kvF64Xycv/eW8Xj8W2jcWPf87Iy32MdqTc4Wn4RAHKPFJ4q4QhzhPBYj2H8\n8XfJKuEQsRiFupxSvYSjX9sOvDp4PLFRLU2eTERqSqEu51AJh4h1KdTlvFTCIWJNCnW5IK/Xy7+/\n38SM7KUq4RCxCIW6XNK+Y7/wSNZCthcdwIGDyd2SeazHMMJDQs0eTUSqUahLjVR4qpiz9Qvm5WSq\nhEOkAVOoS61kF/7IlDULT5VwzOw7irs791EJh0gDoVCXWjte7uLp9Rl8vG8bAClXJfDygDuIaax7\ngYiYTaEudfbx3m08tf5jSircXNYkmtmDxpGsEg4RUynUpV58JRwLyT70EwD3XXcDT/W+RSUcIiZR\nqEu9VXk8/CNnNX/b+h8qvR6ubdGWeUPu4rqYdmaPJhJ0FOpimDNLOBqFhPJU71tUwiESYAp1MdSJ\ninLSNn7G+3kbAZVwiASaQl38QiUcIuZQqIvfHDpxnMfWfsjqg3kA3NW5DzP73kpkuB/v+y0S5BTq\n4lcq4RAJLIW6BMSe4kIeyVIJh4i/KdQlYFyVFby4ZQVv7V4HqIRDxB8U6hJwqw/m8ec1izmsEg4R\nwynUxRTFrlKmrvuIlQW7AUjt1J3nksaohEOknhTqYhqVcIgYr8GEutPpJCoqqk6DibWphEPEOLXJ\nTsNPU5g1axbx8fF07doVp9Np9MuLRXRq3oaMkQ/zaLcbcThg7o6vSV32BvuO/WL2aCK2Zmiol5aW\n4nK52LlzJ7t27aJdO938KZiFh4TyeK8UFo+YRGxUC7YXHSDl09d4/7uN+hubiJ8YGup5eXls3bqV\n9u3bs2DBAiNfWiysX7uOrBrzJ27r1J2yygqe+GYJD3z1LsWuUrNHE7Edv6yp5+bmMnToULZs2XLW\n0brW1KV6CcecQeMYohIOkYuqTXaG+WOALl26MHbsWPLz889Zgpk5c+apx8nJySQnJ/tjBGmgbovr\nTp+2V58q4bhn1TvcnzCAJ3uNUAmHyG8yMzPJzMys088aeqTudruJiPDd2Omhhx7i5ZdfJjo6+vTG\ndKQuv1EJh0jNmXZK49SpU8nPz2f06NF06NCBgQMH1nkwCQ7biw4weXU6Px4vIiI0jCd7jVAJh0g1\nDeY89XM2plCX81AJh8jFKdTFklbm7+Iv6z7iiPuESjhEzqBQF8s6dOI4j61ZzOr/fg+ohEMEFOpi\ncR6vh3d2f8MLW1aohEMEhbrYhEo4RHwU6mIb1Us4+rbtwKuD7uTK6BiTJxMJHIW62I5KOCSYKdTF\nllTCIcFKoS62pRIOCUYKdbE9lXBIMFGoS1Co8FQxZ+sXzN2RiRcvia1jmTt4PJ2atzF7NBFDmdp8\nJBIoJ0s4PrxlEu0jVcJRL4sXQ3m573F5ue+5WJKO1MUWjrnLeHrDJ2Ts2wZAylUJvDzgDmIaR5o8\nmQVUVcEdd/gev/ce3H03hIXBhx+CrgloEHSkLkGneUQT5g2ZwNzBE4gOj2BlwW6GZfyd1QfzzB6t\n4QsNhUWLfEfo0dHg8UB6ugLdovSuia3cFted/6T+iX5tO3C4rIR7Vr3DjOyluCorzB6t4Tt5JKi/\nTVuaQl1sJzaqJYtGTOKJnimEOUJ4e/c6Ri6dx57iQrNHa5iqqmDcOGjSBJxOCA+HO+/0HbGL5WhN\nXWxNJRw19NFHMGoUNGrkW4ZZuvT0OruYTqc0ipxBJRxidQp1kfNQCYdYlUJd5AJUwiFWpFAXuQhL\nlXA4nRARAS6X73RDCUoKdZEaaNAlHE4nuN0wfTqUlMC8edBMnwEEK118JFID18W047Nb/8gDCQOo\n9Hr467eruHPFmxxwHgn8MKWlsH69L8x//RWeeAKuuAL+8Q9fuIvUkEJdglrjsHBm9hvF+8Pv47Im\n0WQf+ombM/7Okr1bAztIaSlceSU8/vjpMD95LxaRWlCoiwBD2nfmi9Q/kXJVAiUVbh7NWsjk1ekc\nc5f5d8Mnj8zT0iAuDt54Q2Eu9aJQF/lNTONI3rppIn8dcDtNwsLJ2LeN4Z+8yobCff7bqNfrW14p\nKfFd2SlSTwp1kTM4HA7u7tyXlaMfJbF1LAdLjzJu+Xxe2rKSCo8fQjc62rfcMns25ObCxIm+G2yJ\n1JFCXeQ8OjVvQ8bIh3m02404HDB3x9ekLnuDfcd+8c8Gw8Phv/+F11+HggKFu9SZTmkUuYTswh+Z\nsmYhB5xHaRIWzsy+o7i7cx8cDof/NlpS4vs1bZpvWeaNN3RKYxDTeeoiBjte7uLp9Rl8HOgSjpIS\n31F8eblCPYgp1EX85OO923hq/ceUVLi5rEk0cwaNY0j7zmaPJTanUBfxowPOI0zJWkj2oZ8AuD9h\nAE/2GkHjsHBzBxPbUqiL+FmVx8M/clbzt63/odLr4doWbZk35C6ui2ln9mhiQwp1kQDZ9st+Hsla\nqBIO8SuFukgAVS/hGPJbCUdblXCIQRTqIiZQCYf4i0JdxCQq4RB/UKiLmMhSJRxiCaaFemVlJbNm\nzaJnz57s2bOHadOmnXXVnUJdgkmDLuEQSzGtJGP+/PnExsaSmppKTEwMixcvNvLlRSylQZVwSNAw\nNNSzs7Pp3r07AImJiSxbtszIlxexnAuVcHy8d5vZo4lNGRrqhYWFRP9WjhsVFcWhQ4eMfHkRy6pe\nwvFIVjqTV6dzvNxl9mhiM2FGvlirVq0oKSkBwOl00rp163O+Z+bMmaceJycnk5ycbOQIIg3WyRKO\nf3+/iRnZS8nYt43Nh3/i1UHj6deuo9njSQOSmZlJZmZmnX7W0A9K//Wvf+F2u3nwwQd58803adKk\nCRMnTjy9MX1QKgLAvmO/8EjWQrYXHSDE4eCPv0vmsR7DCA/RPdTlXKad/eL1epk+fTrdunUjJyeH\ntLQ0nf0icgEVnirmbP2CuTsy8eIlsXUscwdPoFPzc/+GK8FN56mL+FlpKTRubEw5UfUSjrR+o7gr\n3s8lHGIpCnURP6uogLIy43orTCvhEEtQqIv4WXY2XHed8WVEZ5ZwtG0SzWyVcAgKdRG/cjrhz3+G\ngQ
N9/dBGXyBavYTjgYQBTFMJR1BTqIv4kcsFV14JPXrARx/Bb5dmGKp6CUeXlu2YN2QCXVqqhCMY\nKdRFDFJZCR4PbNni+yfAunXwxBMQFgYffABXXOH7euPG0KuXb7093KCDapVwCCjUxUAnz/KorPSd\n6RFm6OVq1uB0wrffwrhxcPjw+b+na1dYuhTatoWmTY3d/jklHO07M3vgWJVwBBHTbugl9lJa6guq\nyEhISICiIrMnMkdUFCQlQW4u3Hjjuf/+oYdg40a4+mrjAx2gaXgjXhpwO2/fNJGWEU1ZfTCPmzNe\nZWX+LuM3JpanI3W5IKcTEhNh3z7f86eeglmzjDk326q++QYGDDj7ayUlvuAPhOolHHd37svMvrfS\nNLxRYAYQU+hIXQzh8fg+DDzphhuCO9DLynwfjILvD7sOHXyPv/wycDO0bdqMd4f/LzP73kpEaBgf\n5G1kxKevsb3oQOCGkAZNR+pyUSdOwLJlcNVVcP31vqWYYFVW5tsHDzwAU6b4/tB7+GHfPlqwwPhz\n1i9lT3Ehk1f/m++OHiLMEcJfet7Mw9cPUQmHDemDUhE/KCqC/fuhc+fTf7iVlsLXX8OIEeZ8iOyq\nrODFLSt4a/c6APq17cirg+8kNqpl4IcRv1GoixjM5fKdpuj1nhveZWXQpIk5c520+mAef16zmMNl\nJTRr1Jjn+6eS2qm7uUOJYRTqIgbzeuFi99e61L8PhGJXKVPXfcTKgt0A3NapO8/1T6VZo8bmDib1\nplAXCVJer5cP8jYxc+NSyioriI1qoRIOG1CoiwQ5lXDYi0JdRFTCYSMKdRE5RSUc1qdQF5GzqITD\n2hTqInJeS/Zu5en1GSrhsBiFuohc0P6SYqasWcRGlXBYhkJdRC6qyuPh9ZxMZm/9QiUcFqBQF5Ea\nUQmHNSjURaTGSivczNz4Gf/O2wSohKMhUqiLSK0tz9/J4+uWcMR9gpiISF4ecDspV3c1eyxBoS7V\nuFy+xnuPx3c/dKP6M8V+Cn8r4chSCUeDolCXU8rLYfNmSE313S72yy995Q665bZciMfr4Z3d3/D8\n5uWUe6ro2Kw1cwePp3ubK80eLWgp1OWUo0dhzBjIyvI9/8MfYO7cwNWviXXtLv6ZR1anq4SjAVCd\nnZwSGupr6zmpRw8tv0jNJMRczrJRk7k/YQCVXg8vblnJnSvmc8B5xOzR5CJ0pB4EysogI8N3dD50\nqH8a78XeMg/m8dgZJRzPJaVyW5xKOAJFyy8iYrhfXU4eX7dEJRwmUKiLiF94vV7ez9tI2sbPVMIR\nQAp1EfGrfcd+YfLqdHb8elAlHAGgUBcRvyuvqmT2ti94fcdqlXD4mUJdRAJmQ+E+pmQt4mCpSjj8\nRaEuIgF1zF3G0xs+IUMlHH7RIELd4/Hgcrloesb5cwp1EXtTCYd/mBrqkyZN4uuvv6ZFixasW7eO\nRo1O3zNCoS5ifyrhMJ5pV5QWFBQQFxdHbm4umzZtOivQRSQ4XBkdw+IRk3i853DCHCG8tXsdt372\nOrlHCs0eLSgYGuq5ubmsWLGC2NhYli9fbuRLi4iFhIaE8GjiTWSMfJiOzVqTe6SQkUvn8dautXi8\nHrPHszW/rKmvXbuW8ePHU1BQQGjo6fNWtfwiEnxUwlF/fl9TX7lyJS+++OI5X3/zzTeJj48HYMyY\nMcyfP5/LLrusToOJ1MSJExAW5rtXfFWV7/bC0jCphKPuTPug1O12ExERAcA999zD+++/f85gM2bM\nOPU8OTmZ5ORkozYvQaayEjZtgmHDoKICPv7Yd8OyxroVSYNVvYTjns59maESjnNkZmaSmZl56nla\nWpo5oT5hwgSaN29OUlISffr04foz7/mKjtTFWMePw6RJsHCh7/lNN8FHH0GLFubOJRdXvYSjU7PW\nzB0ygcTWsWaP1mA1iPPUz7sxhboYyO2G996DBx7wPX/uOZgyRUswVqESjppTqEvQKC2FH37w1fYl\nJCjQrcZVWcELW1bw9u51APRr25FXB99JbFRLkydrWBTqImIp1Us4nu+fSmonlXCcpFAXEctRCceF\nKdRFxJJUwnF+CnURsTSVcJxNoS4ilqcSjtMU6iJiGyrhUKiLiM0EewmHQl1EbClYSzgU6iJiW8FY\nwqFQFxFbq/J4eD0nk9lbv6DS66FLy3bMGzKBLi3bmT2aXyjURSQobPtlP49kLeTH40VEhIbxZK8R\n3JdwAyEOe90/RqEuIkEjGEo4FOoiEnTsXMKhUBeRoFR44jj/t2Yxq21WwqFQF5GgZccSDoW6iAS9\n6iUcbw/9H4Ze2cXssepEoS4iwukSji/357JizKNEhUeYPVKdKNRFRM5QWuEm0qKBDgp1ERFbqU12\n2usMfZE6crvB6YTvvoMTJ8yeRqTuFOoiwPHj0LEjdOkCt98OZWVmTyRSNwp1EeCbb6CoyPd45UoI\nCzN3HpG6UqiLAIMGweWX+x6PGQMVFebOI1JX+qBUBF+IV1TAL79AmzbQtKnZE4mcprNfRERsRGe/\niIgEKYW6iIiNKNRFRGxEoS4iYiMKdRERG1Goi4jYiEJdRMRGFOoiIjaiUBcRsRGFuoiIjdQr1Ldt\n22bUHCIiYoA6h/pnn33G73//+1PPjx8/zjPPPENGRgZz5swxZLiGJjMz0+wR6kXzm8vK81t5drD+\n/LVR51C/9dZbz7rBzHPPPcfgwYNJTU3l8OHDbNy40ZABGxKr/4eh+c1l5fmtPDtYf/7aMGxNPTs7\nm+7duwOQmJjIsmXLjHppERGpIcNCvbCwkOjoaACio6M5dOiQUS8tIiI1dNH7qa9cuZIXX3zxnK+/\n+eabxMfHc/nll/Pzzz8DMGDAADIyMmjTpg3p6ens3LmTZ5999qyfu+aaa9i7d6/BvwUREXuLi4vj\nhx9+qNH3XrSJMSUlhZSUlBq9UEpKCtu2bePmm28mJyeHESNGnPM9NR1KRETqps7LL59//jlHjx5l\ny5YtAEydOpUvv/ySRYsW0axZMwYNGmTYkCIiUjMBrbMTERH/0hWltXChi608Hg8nTpwI8DS1Z6eL\nxayyz+3G6XSaPUKN5eTk4PF4zB6jzi40/6Xeg4CE+v79+7ntttu4+uqreeaZZwDrXaxU/WIrgEmT\nJhEfH0+/fv0IC7voxxOms8vFYlba5wCVlZVMnz6djIwMXnjhBUsWr8+aNYv4+Hi6du1qmVDfsGED\n/fv3p7Ky0pLvwZnzQ+3eg4CE+po1a1iyZAk7duxg/vz5HD582HIXK1W/2KqgoIC4uDhyc3PZtGkT\njRo1MnG6S7PDxWJW2+cA8+fPJzY2ltTUVGJiYli8eLHZI9VKaWkpLpeLnTt3smvXLtq1a2f2SDWS\nlJREmzZt8Hq9lnwPTs4PtX8PAhLq48aNw+Fw0Lx5c6677joiIyMtf7FSbm4uK1asIDY2luXLl5s9\nTq1Zcf9bcZ9bcT+fKS8vj61bt9K+fXsWLFhg9jh1EmzvQUD
+/hoeHg7A4cOHGTp0KJGRkZa/WGn4\n8OEMHz6ctWvXMn78eAoKCggNDTV7rBqz4v634j4/cz9HRUVZYj+fqUePHixfvpzc3FyGDh3KLbfc\nYpmj9ZOC7T0wNNQvdrHSNddcwyeffHJqTb1Vq1Y4nU6aNGlCSUkJrVu3NnKUOrnUxVbnM3DgQHr3\n7s2vv/7KZZdd5u8RL6o28zfE/X/SpX4fDWmfX0qrVq0oKSkBfB9wNaT9XBtdunRh7Nix5OfnWy7U\ng+09MDTUL3ax0qJFi5g4cSIOh4OCgoIaXawUaLW52MrlctG4cWPA96d/QwgXoy8WM8uFfh9ut/vU\n44ayzy8lJSWF7du307dvX3bs2FHj96ehcLvdREREnHqckJBg8kS1Z+X3wOv1npU1NXkPAnKe+ksv\nvcQ///lPIiMjKS8vZ/bs2dx0002kpaXRs2dPfvzxR5544gl/j1Evn3/+OXfccQdr166lV69eTJgw\ngebNm5OUlESfPn24/vrrzR7xoqrPX1ZWZqn9D1hun4Pvf8rp06fTrVs3cnJySEtLw+FwmD1WjU2d\nOpX8/HxGjx5Nhw4dGDhwoNkj1cjmzZtJTk4mPT2dkSNHWu492Lx5MzfeeCMffPABWVlZtXoPdPGR\niIiN6OIjEREbUaiLiNiIQl1ExEYU6iIiNqJQFxGxEYW6iIiNKNRFRGxEoS4iYiP/H3IVimi2i8zT\nAAAAAElFTkSuQmCC\n", "text": [ "" ] } ], "prompt_number": 155 }, { "cell_type": "markdown", "metadata": {}, "source": [ "As you can see, there are precisely two support vectors from the data set of 10 vectors with one from each class. To play with this example, you may want to consider other kernels for data sets that are not inherently linearly seperable like this example. Sci-kit learn's documentation expounds on various choices of kernel. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#Hierarchical Agglomerative Clustering: Background and An Example From Molecular Biology\n", "\n", "Switching gears from classifiation to clustering, we finish by covering hierarchical agglomerative clustering. This is a nice contrast to K-Means which was covered in class becuase one does not need to specify K, the number of clusters in advance. Given a data set $D = {x_1,...x_m}$, hierarchical clustering proceeds by computing the matrix of pearson similiarities (analgously to homework 4). One then builds a binary tree utilizing this matrix with an iterative procedure. Specifically, the leaves of the tree are the initial $m$ data points. At each step of the iteration, we link the two closest nodes, where the distance between nodes $A$ and $B$ is given by\n", "\n", "$$D(A,B) = min(s(a,b): a \\in A, b \\in B)$$.\n", "\n", "where s(.,.) is the familiar pearson similarity from homework 4. Finally, to extract the clusters from this tree, we want to select nodes from which the data is \"highly correlated\". One way to determine this is to check if these pairwise distances within the node are normally distributed, which can be checked with the Anderson-Darling test of normality. \n", "\n", "An area of biology called functional genomics attempts to determine the function(s) of genes, as the name would suggest. One approach to solving this problem is to cluster microarray data, which measures the expression levels of all of the genes in a cell under different time points or experimental conditions. If an unannotated gene ends up in a cluster with many genes of a known function, this lends evidence that the gene may be associated with said function and could prompt additional experiments. In the next example, we will cluster public yeast microarray data, available here: http://rana.lbl.gov/data/yeast/yeastall_public.txt.gz. It contains 6221 genes under 81 experimental conditions. We will utilize scipy's implementation of Hierarchical Clustering for this example. Note that this will take several minutes to run!\n", "\n", "\n", "NOTE: Sadly, the implementation of hierarchical clustering in scipy or sci-kit learn does not utilize the stopping criterion used above. 
To complete the example, we will use their implementation nonetheless, but please consult reference [3] for more if you would like to implement this stopping criterion. " ] }, { "cell_type": "code", "collapsed": false, "input": [ "from scipy import cluster\n", "from scipy.cluster import hierarchy\n", "#Read in the gene expression data set\n", "gene_expression = numpy.genfromtxt('yeastall_public.txt',delimiter = '\\t',skip_header = 2, usecols = range(2,83))\n", "#Perform hierarchical clustering\n", "C=hierarchy.fclusterdata(gene_expression,1,metric='correlation')" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 170 }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, try exploring the clusters and in particular compare this to the clusters found in reference [3]. Do you find any clusters are enirched for a particular gene function?" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#References\n", "\n", "1. \"Learning From Data\" by Yaser S. Abu-Mostafa et al.\n", "2. \"Cluster Analysis and Display of Genome-Wide Expression Patterns\" by Eisen at al.\n", "3. \"De-Trending Time Series for Astronomical Variability Surveys\" by Kim et al." ] } ], "metadata": {} } ] }