{ "metadata": { "name": "" }, "nbformat": 3, "nbformat_minor": 0, "worksheets": [ { "cells": [ { "cell_type": "code", "collapsed": false, "input": [ "%matplotlib inline \n", "from numpy import exp, log, linspace\n", "import matplotlib\n", "import matplotlib.pyplot as plt" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 1 }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Let's Write: The Logistical Regression Classifier\n", "\n", "## Introduction\n", "\n", "Logistical regression classifiers are a well-known machine learning technique for classifying data. In this notebook we'll go through the basics of how they work and write a simple implementation. Finally we'll test our implementation by using it to detect spam in a corpus of email\n", "\n", "Throughout this notebook we'll use spam classification as a convenient example to help explain how the classifier works.\n", "\n", "### Resources\n", "\n", "This notebook and the code it uses can be found on [github](https://github.com/mfournierca/logistic-regression).\n", "\n", "### Prerequisites\n", "\n", "To get the most out of this notebook, you'll need the following prerequisites:\n", "\n", "- Python, numpy\n", "- Basic calculus, sums and series\n", "- Basic linear algebra, vectors, matrices, the dot product\n", "- Basics of numerical optimization, finding the minimum point of functions\n", "- Machine learning basics, test and training sets, cost functions, supervised learning, precision and recall\n", "\n", "## The Logistical Regression Classifier\n", "\n", "A logistical regression classifier gets its name from the function at its core, the logistical, or sigmoid, function:\n", "\n", "$$ h(x) = 1 / (1 + e^{a - bx}) \\ \\ a, b \\in \\mathbb{R} $$ \n", "\n", "This function has an interesting shape:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "x = linspace(-15,15,100)\n", "a = 0\n", "b = 1\n", "plt.plot(x, 1 / (1 + exp(a + -b*x)))\n", "plt.show()" ], "language": "python", "metadata": {}, "outputs": [ { "metadata": {}, "output_type": "display_data", "png": 
"iVBORw0KGgoAAAANSUhEUgAAAXUAAAEACAYAAABMEua6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAGsVJREFUeJzt3XtwVGWexvEnkCAILuAAkaTDRJKQBAIxbLiNg8ZlIgwj\nccZLGax1EaOVwqWs2bKmqKkd1uiuCFODtRbRreiK4i0TcdCoBS0SjXghhJtgAbIBgyYNRsJFUcRA\nc/aPM8aEJJ3upDtv9+nvp6qr08npzu9Upx9efu8574mxLMsSAMAR+pkuAAAQPIQ6ADgIoQ4ADkKo\nA4CDEOoA4CCEOgA4SLehftdddyk+Pl4TJ07scpv77rtPaWlpys7O1q5du4JaIADAf92G+sKFC+V2\nu7v8+fr163Xw4EHV1dXpySef1KJFi4JaIADAf92G+syZMzV8+PAuf/76669rwYIFkqRp06bp1KlT\nampqCl6FAAC/9bqn7vF4lJSU1PrY5XKpsbGxty8LAOiBoEyUXrzSQExMTDBeFgAQoNjevkBiYqIa\nGhpaHzc2NioxMbHDdqmpqTp06FBvfx0ARJWUlBQdPHjQ7+17PVIvKCjQc889J0mqqanRsGHDFB8f\n32G7Q4cOybIsx94eeOAB4zWwf9G5b+fOWdq61dKqVZbuuMNSZqalSy6xlJpq6YYbLE2b9oAefdTS\n2rWWamos1ddb+vZb83Xz/vl3C3Qw3O1Iff78+XrvvffU3NyspKQkPfjggzp37pwkqbi4WHPnztX6\n9euVmpqqwYMH65lnngmoAACBa2qS3nxTcrulTZukpCRp+nTpmmuk+++XMjKkSy6xty0pkf7t34yW\niz7UbaiXl5d3+yKlpaVBKQZA1374wQ7yZ5+VPvhAmjNHmjdPWrVKuuIK09UhXPS6pw5bXl6e6RJC\nysn7F+779t130hNPSCtXSpmZ0sKFUnm5NGSIf88P9/3rLafvX6BiLMvqk4tkxMTEqI9+FeAIZ89K\npaXSX/4iXXuttHSplJVluir0tUCzk5E6EIa2bpXuvFNKT7d75oQ5/EWoA2Hk7FnpgQekNWvsXvmt\nt5quCJGGUAfCxJEj0g03SGPHSnv2SKNGma4IkYild4EwsG+f9ItfSLfcIq1dS6Cj5xipA4Zt3my3\nWf7yF+mOO0xXg0hHqAMGvf++PTovL5dmzTJdDZyAQxoBQw4csA9VfP55KT/fdDUIV4FmJz11wIBj\nx6Tf/EZ6+GECHcHFSB3oY99/b7darrvODnXAl0Czk1AH+ti990rHj9t99H78Xxnd4IxSIIy98469\nKNeePQQ6QoM/K6CPnD4tFRVJZWXSsGGmq4FT0X4B+siiRVJLi/T006YrQSSh/QKEoaoqu+3yySem\nK4HT0X4BQuzcOam4mLYL+gahDoTY6tX2Il1z55quBNGAnjoQQt9/L6WlSa++Kk2ZYroaRCLOKAXC\nyP/8jx3mBDr6CiN1IES++cYepVdVceUi9BwjdSBM/Pd/S9dfT6CjbzFSB0LgxAlp3Dj7WqMpKaar\nQSRjpA6EgdWrpV//mkBH32OkDgTZhQt2L/3FF6Xp001Xg0jHSB0w7K237JOMpk0zXQmiEaEOBNkT\nT9jL68bEmK4E0Yj2CxBE9fX2MelffCFdeqnpauAEtF8Ag8rKpH/5FwId5jBSB4Lk7FlpzBjpww/t\niVIgGBipA4asXSvl5BDoMItQB4Lkueeku+82XQWiHe0XIAi++so+g/TIEfrpCC7aL4AB69bZZ5AS\n6DCNUAeCoKJCuu0201UAtF+AXjt6VBo/3r4fONB0NXCaoLdf3G63MjIylJaWphUrVnT4+ddff615\n8+bpqquuUlZWlp599tmACgYi3d/+Jt1wA4GO8OBzpO71epWenq5NmzYpMTFRU6ZMUXl5uTIzM1u3\nWbZsmU6fPq1HHnlEzc3NSk9PV1NTk2JjY9v/IkbqcKiZM6UlS+xgB4ItqCP12tpapaamKjk5WXFx\ncSosLFRlZWX7F+jXT998840k6ZtvvtHPfvazDoEOOJXHI+3dK+Xnm64EsPkMdY/Ho6SkpNbHLpdL\nHo+n3TaLFy/Wvn37lJCQoOzsbD322GOhqRQIQ2vXSjfeKF1yielKAJvPIXWMH8vMud1uTZ48We++\n+64OHTqk/Px87d69W5dddlmHbUtKSlq/zsvLU15eXsAFA+Hk5ZelpUtNVwEnqa6uVnV1dY+f7zPU\nExMT1dDQ0Pq4oaFBLper3TbPPvus/vjHP0qSUlJSdOWVV+rAgQPKzc3t8HptQx2IdE1N0r590qxZ\npiuBk1w84H3wwQcDer7P9ktubq7q6up0+PBhtbS0qKKiQgUFBe22GTNmjDZt2iRJampq0oEDBzR2\n7NiAigAi0caN0j/9kzRggOlKgJ/4HKnHxsaqtLRUs2fPltfrVVFRkTIzM1VWViZJKi4u1tKlS3Xn\nnXdq0qRJsixLf/7zn3X55Zf3SfGASRs22GeRAuGEk4+AHvB6pfh4aedOe7ldIFRY+wXoAzt22KFO\noCPcEOpAD9B6Qbgi1IEecLulOXNMVwF0RE8dCNDx49KVV0rHjnHSEUKPnjoQYm+/LV17LYGO8ESo\nAwFyu+mnI3zRfgECcOGClJAgffSRxDl26Au0X4AQ2r1b+od/INARvgh1IABVVSyzi/BGqAMB2LzZ\nniQFwhU9dcBPXq80YoS0f790xRWmq0G0oKcOhMgnn0ijRhHoCG+EOuAnWi+IBIQ64Kf33pOuucZ0\nFYBv9NQBP1iW3XrZuVNqc9leIOToqQMhsH+/NGQIgY7wR6gDfnjvPfrpiAyEOuCHzZvppyMyEOpA\nNyyLkToiB6EOdOPQIalfP9Z7QWQg1IFu/HgoY0yM6UqA7hHqQDfopyOSEOpANz76SPrlL01XAfiH\nUAd8aG6WvvpKysw0XQngH0Id8GHrVmnKFKl/f9OVAP4h1AEfamqkGTNMVwH4j1AHfKipkaZPN10F\n4D8W9AK64PVKl19uH6c+YoTpahCtWNALCJL9++2VGQl0RBJCHegCrRdEIkId6AKTpIhEhDrQBUbq\niERMlAKd+PprKTFROnlSioszXQ2iGROlQBBs2yZNnkygI/IQ6kAnaL0gUnUb6m63WxkZGUpLS9OK\nFSs63aa6ulo5OTnKyspSXl5esGsE+hyTpIhUPnvqXq9X6enp2rRpkxITEzVlyhSVl5crs83qRqdO\nndLVV1+tt956Sy6XS83NzRrRyYG99NQRKSxLGjlS2rNHSkgwXQ2iXVB76rW1tUpNTVVycrLi4uJU\nWFioysrKdtu89NJLuvnmm+VyuSSp00AHIsmhQ9KllxLoiEw+Q93j8SgpKan1scvlksfjabdNXV2d\nTpw4oeuuu065ubl6/vnnQ1Mp0Ee2b7dXZgQiUayvH8b4cf2uc+fOaefOnaqqqtKZM2c0Y8YMTZ8+\nXWlpaUErEuhL27ZJubmmqwB6xmeoJyYmqqGhofVxQ0ND
a5vlR0lJSRoxYoQGDRqkQYMG6ZprrtHu\n3bs7DfWSkpLWr/Py8phURVjavl1autR0FYhW1dXVqq6u7vHzfU6Unj9/Xunp6aqqqlJCQoKmTp3a\nYaL0008/1eLFi/XWW2/phx9+0LRp01RRUaHx48e3/0VMlCICeL3S8OHS55/b94BpgWanz5F6bGys\nSktLNXv2bHm9XhUVFSkzM1NlZWWSpOLiYmVkZGjOnDmaNGmS+vXrp3vuuadDoAOR4sABe2VGAh2R\nimUCgDaee07asEEqLzddCWBjmQCgF5gkRaQj1IE2OJwRkY72C/B3585Jw4ZJX34pXXaZ6WoAG+0X\noIf27pWSkwl0RDZCHfg7+ulwAkId+Dv66XACQh34O0bqcAImSgFJZ89Kl18uHT8uDRpkuhrgJ0yU\nAj2we7eUnk6gI/IR6oDsfjqtFzgBoQ6IUIdzEOqApB07pH/8R9NVAL3HRCmi3pkz0ogR0smT0iWX\nmK4GaI+JUiBAu3dLmZkEOpyBUEfUo/UCJyHUEfV27GCSFM5BqCPqbd/OSB3OwUQpohqTpAh3TJQC\nAWCSFE5DqCOqcdIRnIZQR1TjyBc4DaGOqEaow2mYKEXU+u47aeRIJkkR3pgoBfy0e7c0fjyBDmch\n1BG1aL3AiQh1RC2OfIETEeqIWozU4URMlCIqffutFB9vT5IOGGC6GqBrTJQCfti1S8rKItDhPIQ6\nohL9dDgVoY6otH27NGWK6SqA4CPUEZW2bWOkDmdiohRR59QpKSnJvu/f33Q1gG9MlALd2LFDuuoq\nAh3ORKgj6tBPh5MR6og69NPhZN2GutvtVkZGhtLS0rRixYout9u2bZtiY2O1bt26oBYIBBsjdTiZ\nz1D3er1avHix3G639u3bp/Lycu3fv7/T7ZYsWaI5c+YwGYqwduyYPUGakmK6EiA0fIZ6bW2tUlNT\nlZycrLi4OBUWFqqysrLDdqtWrdItt9yikSNHhqxQIBh+POmoH41HOJTPP22Px6OkpKTWxy6XSx6P\np8M2lZWVWrRokST78BsgXNFPh9P5DHV/Avr3v/+9li9f3nosJe0XhDP66XC6WF8/TExMVENDQ+vj\nhoYGuVyudtvs2LFDhYWFkqTm5mZt2LBBcXFxKigo6PB6JSUlrV/n5eUpLy+vF6UDgdu+XVq1ynQV\nQNeqq6tVXV3d4+f7PKP0/PnzSk9PV1VVlRISEjR16lSVl5crMzOz0+0XLlyoefPm6aabbur4izij\nFIZ5PFJOjtTUJNElRKQINDt9jtRjY2NVWlqq2bNny+v1qqioSJmZmSorK5MkFRcX965aoA/V1tr9\ndAIdTsbaL4gaS5ZIQ4ZIS5eargTwH2u/AF2oqZGmTzddBRBajNQRFc6fl4YPlxobpaFDTVcD+I+R\nOtCJTz6Rfv5zAh3OR6gjKtB6QbQg1BEVCHVEC0IdUWHLFkId0YGJUjje8ePS2LHSiRNc7QiRh4lS\n4CJbt9rrvRDoiAaEOhyPfjqiCaEOxyPUEU3oqcPRLlyQLr9cqquTuIYLIhE9daCNTz+1w5xAR7Qg\n1OFotF4QbQh1ONpHHxHqiC6EOhxt82Zp5kzTVQB9h1CHYx09ap94lJVluhKg7xDqcKzNm6Vf/lLq\nx185ogh/7nCszZula681XQXQtwh1ONZ770nXXGO6CqBvcfIRHKm5WUpNte9jfV5eHQhvnHwESHr/\nfekXvyDQEX0IdTgSrRdEK0IdjsQkKaIVPXU4zqlTUlKSfYz6gAGmqwF6h546ot4HH0jTphHoiE6E\nOhyH1guiGaEOx2GSFNGMnjoc5euvJZdLOnZMGjjQdDVA79FTR1SrqpKuvppAR/Qi1OEoGzZIc+aY\nrgIwh/YLHMOypDFjpE2bpPR009UAwUH7BVFr7157WYBx40xXAphDqMMx3G7p17+WYmJMVwKYQ6jD\nMeinA/TU4RDffiuNHm1fwm7IENPVAMFDTx1R6Z13pKlTCXTAr1B3u93KyMhQWlqaVqxY0eHnL774\norKzszVp0iRdffXV2rNnT9ALBXz5sZ8ORLtu2y9er1fp6enatGmTEhMTNWXKFJWXlyszM7N1my1b\ntmj8+PEaOnSo3G63SkpKVFNT0/4X0X5BiFiWNHas9MYbUlaW6WqA4Ap6+6W2tlapqalKTk5WXFyc\nCgsLVVlZ2W6bGTNmaOjQoZKkadOmqbGxMcCygZ77v/+Tzp2TJkwwXQlgXreh7vF4lJSU1PrY5XLJ\n4/F0uf3TTz+tuXPnBqc6wA9vvCHNncuhjIAkdXsFx5gAPinvvvuuVq9erQ8//LDTn5eUlLR+nZeX\np7y8PL9fG+hKRYW0bJnpKoDgqK6uVnV1dY+f322oJyYmqqGhofVxQ0ODXC5Xh+327Nmje+65R263\nW8OHD+/0tdqGOhAMn30mff65dN11pisBguPiAe+DDz4Y0PO7bb/k5uaqrq5Ohw8fVktLiyoqKlRQ\nUNBumy+++EI33XSTXnjhBaWmpgZUANAbL78s3XyzvTwAAD9G6rGxsSotLdXs2bPl9XpVVFSkzMxM\nlZWVSZKKi4v10EMP6eTJk1q0aJEkKS4uTrW1taGtHJAd6itXmq4CCB+cUYqIVVcnzZwpeTxS//6m\nqwFCgzNKETUqKqRbbiHQgbYIdUSsl1+WbrvNdBVAeCHUEZH275eOH7cvXQfgJ4Q6ItLLL0u33ir1\n4y8YaIePBCLOhQvSCy9I8+ebrgQIP4Q6Is7bb9tL7E6daroSIPwQ6og4Tzwh/eu/stYL0BmOU0dE\n+fxzafJk6YsvpMGDTVcDhB7HqcPRysqkO+4g0IGuMFJHxPjhB2nMGGnzZik93XQ1QN9gpA7HeuUV\nadIkAh3whVBHxHj8cXuCFEDXCHVEhJoae+GuG24wXQkQ3gh1RIQ//cm+sW464BuhjrBXVSUdPizd\neafpSoDwR6gjrFmW9O//Lj30kBQXZ7oaIPwR6ghrb74pnTkjFRaargSIDIQ6wtaFC/Yo/T//k9UY\nAX/xUUHYKi+XBg2SLrrOOQAfOKMUYam5WZo40T7hiAthIJoFmp2EOsLS/PnS6NHSo4+argQwK9Ds\n5KhfhJ1166QdO6SPPzZdCRB5GKkjrNB2Adqj/YKIZVnSbbdJSUnSypWmqwHCA+0XRKxly6SDB6U1\na0xXAkQuQh1h4aWXpCeflLZssQ9jBNAztF9g3PvvSzffbK/xMnGi6WqA8MJFMhBRPvlEuvVW6cUX\nCXQgGAh1GPPuu9KsWdJjj0n5+aarAZyBUIcRL71kH+lSUWHfAwgOJkrRp86flx5+WFq9WnrnHSkr\ny3RFgLMQ6ugze/dKCxdKQ4dKH30kJSaarghwHtovCLmzZ+1j0PPypLvvljZuJNCBUGGkjpA5e1b6\n3/+Vli+XJk+Wtm+Xfv5z01UBzkaoI+iOHJGef15atUrKyZFee03KzTVdFRAdum2/uN1uZWRkKC0t\nTStWrOh0m/v
uu09paWnKzs7Wrl27gl4kwt+JE/YRLXPnShMm2Kf7v/aa9MYbBDrQl3yGutfr1eLF\ni+V2u7Vv3z6Vl5dr//797bZZv369Dh48qLq6Oj355JNatGhRSAsOV9XV1aZLCKmL9+/kSentt6WS\nEmnGDCk52Q7122+XGhulp56KnDCPtvfOaZy+f4HyGeq1tbVKTU1VcnKy4uLiVFhYqMrKynbbvP76\n61qwYIEkadq0aTp16pSamppCV3GYcuIflmVJx47ZvfBVq6r1H/9hH1M+bpw0Zox97dAzZ6T/+i97\nuzfflP75n6XBg01XHhgnvndtsX/RxWdP3ePxKCkpqfWxy+XS1q1bu92msbFR8fHxQS4VgbpwQWpp\nsScsv//evv/uO+nbb+3b6dPS119Lp07ZI+/mZumrr+zb0aP2iHvgQDvAz52z2yo33ij96U/S+PFS\n//6m9xDAxXyGekxMjF8vcvFiM109b948P6sKomCsIebrNX78WV2d1Pbfux+/3/a+q+91drtw4af7\ntjev96d7r9c+mcfrtUO37e2HH+yfDRhgr3o4cKB9f+ml0mWX2aPpIUOk4cOlYcPsY8czM6Vrr5VG\njZLi4+11zYcMsWstKbFvAMKc5cOWLVus2bNntz5etmyZtXz58nbbFBcXW+Xl5a2P09PTrS+//LLD\na6WkpFiSuHHjxo1bALeUlBRfMd2Bz5F6bm6u6urqdPjwYSUkJKiiokLl5eXttikoKFBpaakKCwtV\nU1OjYcOGddp6OXjwoK9fBQAIAp+hHhsbq9LSUs2ePVter1dFRUXKzMxUWVmZJKm4uFhz587V+vXr\nlZqaqsGDB+uZZ57pk8IBAB312UUyAAChF9K1X9auXasJEyaof//+2rlzZ+v3Dx8+rEGDBiknJ0c5\nOTm69957Q1lGyHS1f5L0yCOPKC0tTRkZGdq4caOhCoOnpKRELper9T1zu92mSwoKf06ui2TJycma\nNGmScnJyNHXqVNPl9Npdd92l+Ph4TWxzRZUTJ04oPz9f48aN0/XXX69Tp04ZrLDnOtu3Hn3uAurA\nB2j//v3WgQMHrLy8PGvHjh2t36+vr7eysrJC+av7RFf7t3fvXis7O9tqaWmx6uvrrZSUFMvr9Rqs\ntPdKSkqslStXmi4jqM6fP2+lpKRY9fX1VktLi5WdnW3t27fPdFlBlZycbB0/ftx0GUGzefNma+fO\nne3y4w9/+IO1YsUKy7Isa/ny5daSJUtMldcrne1bTz53IR2pZ2RkaNy4caH8FUZ1tX+VlZWaP3++\n4uLilJycrNTUVNXW1hqoMLgsh3Xq/Dm5zgmc9L7NnDlTw4cPb/e9tidALliwQK+99pqJ0nqts32T\nAn//jC29W19fr5ycHOXl5emDDz4wVUZIHDlyRC6Xq/Wxy+WSx+MxWFFwrFq1StnZ2SoqKorY/+K2\n1dmJc054n9qKiYnRr371K+Xm5uqpp54yXU5INDU1tR5xFx8f77gz2gP93PU61PPz8zVx4sQOtzfe\neKPL5yQkJKihoUG7du3So48+qttvv12nT5/ubSkh0ZP964y/J3KZ1NW+vv7661q0aJHq6+v18ccf\na/To0br//vtNl9trkfCe9NaHH36oXbt2acOGDXr88cf1/vvvmy4ppGJiYhz1vvbkc9frpXfffvvt\ngJ8zYMAADRgwQJI0efJkpaSkqK6uTpMnT+5tOUHXk/1LTExUQ0ND6+PGxkYlRsBVIfzd17vvvlvz\nTJweHGQXv08NDQ3t/oflBKNHj5YkjRw5Ur/73e9UW1urmTNnGq4quOLj4/Xll1/qiiuu0NGjRzVq\n1CjTJQVN233x93PXZ+2Xtn2h5uZmeb1eSdJnn32muro6jR07tq9KCYm2+1dQUKC//vWvamlpUX19\nverq6iL+yIOjR4+2fv3qq6+2m6GPVG1PrmtpaVFFRYUKCgpMlxU0Z86caf0f8HfffaeNGzc64n27\nWEFBgdasWSNJWrNmjX77298arih4evS5C+LkbQfr1q2zXC6XNXDgQCs+Pt6aM2eOZVmW9corr1gT\nJkywrrrqKmvy5MnWm2++GcoyQqar/bMsy3r44YetlJQUKz093XK73QarDI477rjDmjhxojVp0iTr\nxhtv7HQpiEi0fv16a9y4cVZKSoq1bNky0+UE1WeffWZlZ2db2dnZ1oQJExyxf4WFhdbo0aOtuLg4\ny+VyWatXr7aOHz9uzZo1y0pLS7Py8/OtkydPmi6zRy7et6effrpHnztOPgIAB+HC0wDgIIQ6ADgI\noQ4ADkKoA4CDEOoA4CCEOgA4CKEOAA5CqAOAg/w/pjKAO2CguGwAAAAASUVORK5CYII=\n", "text": [ "" ] } ], "prompt_number": 2 }, { "cell_type": "markdown", "metadata": {}, "source": [ "The function varies between 0 and 1 and is close to one of either extreme over most of its domain. This suggests a way of using this function as a classifier - if we can find some way of representing our problem as points on the x axis, then we can use this function to put each point into one of two classes, either 0 or 1. If for some point ```x```, ```h(x) < 0.5```, we say that ```x``` is in class 0. If ```h(x) > 0.5```, we say that ```x``` is in class 1. We call ```0.5``` the *decision boundary*. \n", "\n", "For example, consider classifying spam. We'd like to use the sigmoid function ```h(x)``` to classify a message as spam or not-spam. Do do this, we need to find a way of mapping each message onto the X-axis. After that the sigmoid function will predict which class the method falls into. \n", "\n", "One such mapping could be the length of the longest capitalized word in the message. Spammers will sometimes use capitalized words to get our attention, so we expect spam messages to have more and longer capitalized words. 
We can take the length of the longest capitalized word and use that as the message's position on X-axis.\n", "\n", "This is a good start but it won't quite work. The decision boundary is set at ```0.5```, which occurs at ```x = 0```. Since we can't have a capitalized word of negative length, this strategy would classify almost all messages as spam. To solve this we modify the parameters of the sigmoid funtion, ```a``` and ```b```. We can shift the function left or right by setting the ```a``` parameter:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "a = 5\n", "b = 1\n", "plt.plot(x, 1 / (1 + exp(a + -b*x)))\n", "plt.show()" ], "language": "python", "metadata": {}, "outputs": [ { "metadata": {}, "output_type": "display_data", "png": "iVBORw0KGgoAAAANSUhEUgAAAXUAAAEACAYAAABMEua6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAGslJREFUeJzt3XtwVGWC/vFvMI1kGEdAMJJ0MJg0SQQSggFkWJi4DgOV\nGsLoTO0Ea10UtLK4rOuu5bjWzpShpgTC7rilxKmCWa7DGFOMl8QpaJVIRLnYgPHKpQIG6TQQTDDA\nAGOgc35/nB+RkGsn3Tndp59P1SnS5KT7OdXm8eV9zzkdYxiGgYiI2MIAqwOIiEjwqNRFRGxEpS4i\nYiMqdRERG1Gpi4jYiEpdRMRGui31BQsWEB8fz/jx4zvd5/HHH8flcpGVlUV1dXVQA4qISM91W+oP\nP/wwbre70+9v2bKFI0eOUFNTw+rVq1m0aFFQA4qISM91W+rTp09n6NChnX6/oqKC+fPnAzBlyhSa\nmpqor68PXkIREemxPs+p+3w+kpKSWh87nU7q6ur6+rQiItILQVkovf5OAzExMcF4WhERCVBsX58g\nMTERr9fb+riuro7ExMR2+6WmpnL06NG+vpyISFRJSUnhyJEjPd6/zyP1/Px8Nm7cCMCePXsYMmQI\n8fHx7fY7evQohmHYdnv22Wctz6Dj07HZ8fguXTJ47z2D3/7W4L77DFJSDOLiDLKyDO6/32Dq1Gf5\n/e8NKioM9u418PkMmputzx2sLdDBcLcj9Xnz5vHee+/R0NBAUlISS5Ys4fLlywAUFhaSl5fHli1b\nSE1NZfDgwaxbty6gACIi1/vqK3jtNSgvh3374M47Yfp0+OUv4bnnwOWC2P/fXkVFoJPuvtNtqZeW\nlnb7JCUlJUEJIyLR6/x52LDB3I4dg7lz4amnYMYMuOkmq9NFjj7PqYspNzfX6gghZefjs/OxQfgf\n37Fj8OKLZpnfcw8sWwa5ud+NxLsT7sfX32IMw+iXD8mIiYmhn15KRCLAuXPmVMqaNbBgAfzLv8Dt\nt1udKvwE2p2694uI9CvDgHXrID0dvv4aPv8cVqxQoQeLpl9EpN80NcH8+XDihLkIOmmS1YnsRyN1\nEekX1dVw113miHznThV6qGikLiIh9/LL8G//BitXQkGB1WnsTaUuIiG1aRM8/TRs3w7jxlmdxv5U\n6iISMi+/DL/6FWzbZl5AJKGnUheRkHjlFXjySXjnHRV6f9J56iISdHv2QH4+VFZCFx+aJj2g89RF\nxFKNjeY9Wv7v/1ToVtBIXUSCpqUFfvpTGDsW/vu/rU5jDxqpi4hliovh7FlYutTqJNFLC6UiEhS7\nd5s35tq7FxwOq9NEL43URaTP/H547DF4/nlwOq1OE91U6iLSZ6tXww9+oKtFw4EWSkWkTxobISPD\nPB89K8vqNPYTaHeq1EWkTxYtghtuAH0AWmgE2p1aKBWRXquuNj9L9OBBq5PIVZpTF5Fe+9WvYMkS\nGDbM6iRylaZfRKRX9u+H++6DI0dg4ECr09iXLj4SkX5RXAz//u8q9HCjkbqIBOzIEZg6FWpr4fvf\ntzqNvWmkLiIh9z//A//8zyr0cKSRuogE5NQp8/7ohw7Brbdancb+NFIXkZB68UWYN0+FHq40UheR\nHrtwAW6/HTweuOMOq9NEB43URSRkXn3VXCBVoYcvlbqI9Nj69TB/vtUppCuafhGRHvnqK7jrLvD5\n4MYbrU4TPTT9IiIhsXGjeWtdFXp400hdRLplGJCaCmVlkJNjdZroopG6iATdBx/AoEHm9IuEN5W6\niHRrwwZ46CGIibE6iXRH0y8i0qULF8zPHT1wAEaOtDpN9NH0i4gE1RtvmOemq9AjQ7el7na7SU9P\nx+VyUVxc3O77Z8+eZc6cOUyYMIFx48axfv36UOQUEYu8+ir88pdWp5Ce6nL6xe/3k5aWxrZt20hM\nTGTSpEmUlpaSkZHRus/SpUs5f/48y5Yto6GhgbS0NOrr64mNbftJeZp+EYk8ly7BbbfB0aMwfLjV\naaJTUKdfPB4PqampJCcn43A4KCgooLy8vO0TDBjAuXPnADh37hy33HJLu0IXkchUWQkTJqjQI0mX\npe7z+UhKSmp97HQ68fl8bfZZvHgxBw4cICEhgaysLF544YXQJBWRfldeDnPnWp1CAtHlkDqmB+cv\nud1uJk6cyPbt2zl69CgzZ87kk08+4aabbmq3b1FRUevXubm55ObmBhxYRPpHSwu8+Sb8539anSS6\nVFVVUVVV1euf77LUExMT8Xq9rY+9Xi9Op7PNPuvXr+eZZ54BICUlhdGjR3P48GFyOrjs7NpSF5Hw\n9uGH5rRLSorVSaLL9QPeJUuWBPTzXU6/5OTkUFNTw7Fjx2hubqasrIz8/Pw2+4waNYpt27YBUF9f\nz+HDh7lD9+UUiXjl5XDdr7tEgC5H6rGxsZSUlDBr1iz8fj8LFy4kIyODVatWAVBYWMhvfvMbHnro\nITIzMzEMgxUrVjBs2LB+CS8ioVNRAevWWZ1CAqUrSkWknZoa+NGPoK4OBugSRUvpilIR6bPycpgz\nR4UeifSWiUg7FRU6lTFSafpFRNo4f968z8vp0/C971mdRjT9IiJ9smMHTJqkQo9UKnURaaOyEu69\n1+oU0lsqdRFpQ6Ue2TSnLiKtTp8GlwsaG0H35QsPmlMXkV7bvh1mzFChRzKVuoi00tRL5FOpi0gr\nlXrkU6mLCADHjsFf/wrjxlmdRPpCpS4igDlK//u/hx58jIKEMZW6iACaerELndIoIhiG
eWuA3bth\n9Gir08i1dEqjiATswAHztgAq9MinUhcR3n/fPD9dIp9KXUTYuROmTbM6hQSDSl1EVOo2olIXiXIn\nT0JTE6SnW51EgkGlLhLldu6EH/5QH11nF3obRaKcpl7sRaUuEuVU6vaii49EotjFizBiBDQ0QFyc\n1WmkI7r4SER6zOOB8eNV6HaiUheJYpp6sR+VukgUU6nbj+bURaJUSwvccgscOgTx8Vankc5oTl1E\neuTgQbPUVej2olIXiVKaerEnlbpIlNq1y7ySVOxFpS4SpTwemDLF6hQSbFooFYlCZ89CYiJ88w04\nHFanka5ooVREurV/P0yYoEK3I5W6SBTyeGDyZKtTSCio1EWikErdvrotdbfbTXp6Oi6Xi+Li4g73\nqaqqIjs7m3HjxpGbmxvsjCISZCp1++pyodTv95OWlsa2bdtITExk0qRJlJaWkpGR0bpPU1MT06ZN\n46233sLpdNLQ0MDw4cPbv5AWSkXCgs8HWVnw9dcQE2N1GulOUBdKPR4PqampJCcn43A4KCgooLy8\nvM0+L7/8Mj//+c9xOp0AHRa6iISPvXvNUboK3Z66LHWfz0dSUlLrY6fTic/na7NPTU0NZ86c4Z57\n7iEnJ4c//vGPoUkqIkGhqRd7i+3qmzE9+F/55cuX+eijj6isrOTixYtMnTqVu+++G5fLFbSQIhI8\nHg/8x39YnUJCpctST0xMxOv1tj72er2t0yxXJSUlMXz4cOLi4oiLi2PGjBl88sknHZZ6UVFR69e5\nublaVBXpZy0tsG8fTJpkdRLpTFVVFVVVVb3++S4XSq9cuUJaWhqVlZUkJCQwefLkdgulhw4dYvHi\nxbz11lt8++23TJkyhbKyMu688862L6SFUhHLHToEeXnw5ZdWJ5GeCrQ7uxypx8bGUlJSwqxZs/D7\n/SxcuJCMjAxWrVoFQGFhIenp6cyePZvMzEwGDBjAo48+2q7QRSQ8XF0kFfvSvV9Eosi//iuMHq05\n9Uiie7+ISKc8Hs2n251G6iJRorkZhgwxLzoaPNjqNNJTGqmLSIc++wxSUlTodqdSF4kS+/dDTo7V\nKSTUVOoiUWLfPrjrLqtTSKip1EWixL59GqlHAy2UikSBv/0Nhg2DxkaIi7M6jQRCC6Ui0s5nn4HL\npUKPBip1kSigRdLooVIXiQJaJI0eKnWRKKBF0uihhVIRm7t0CW65Bc6cgUGDrE4jgdJCqYi08emn\nkJamQo8WKnURm9MiaXRRqYvYnBZJo4tKXcTmNFKPLlooFbGxixdh+HD45hu48Uar00hvaKFURFp9\n8glkZKjQo4lKXcTG9u/XfHq0UamL2JguOoo+KnURG9MiafTRQqmITV24ACNGQFMTDBxodRrpLS2U\niggAH38MY8eq0KONSl3EpjT1Ep1U6iI2pStJo5NKXcSmdOZLdNJCqYgN/fWvEB9vLpI6HFankb7Q\nQqmIUF0N48ap0KORSl3EhrRIGr1U6iI2pEXS6KVSF7EhLZJGLy2UitjMuXMwciScPQuxsVankb7S\nQqlIlKuuhsxMFXq0UqmL2IymXqKbSl3EZnQP9eimUhexmb17NVKPZt2WutvtJj09HZfLRXFxcaf7\n7d27l9jYWF577bWgBhSRnjtzBurrzY+wk+jUZan7/X4WL16M2+3mwIEDlJaWcvDgwQ73e/rpp5k9\ne7bOcBGx0N695tTLDTdYnUSs0mWpezweUlNTSU5OxuFwUFBQQHl5ebv9Vq5cyS9+8QtGjBgRsqAi\n0j2PByZPtjqFWKnLUvf5fCQlJbU+djqd+Hy+dvuUl5ezaNEiwDynUkSsoVKXLku9JwX9xBNPsHz5\n8tYT5DX9ImINw1CpC3R5eUJiYiJer7f1sdfrxel0ttln//79FBQUANDQ0MDWrVtxOBzk5+e3e76i\noqLWr3Nzc8nNze1DdBG51vHjMGAAXPcrKhGmqqqKqqqqXv98l7cJuHLlCmlpaVRWVpKQkMDkyZMp\nLS0lo5Ol9Ycffpg5c+Zw//33t38h3SZAJKQ2b4ZNm6CDZS+JYIF2Z5cj9djYWEpKSpg1axZ+v5+F\nCxeSkZHBqlWrACgsLOxbWhEJGk29COiGXiK28aMfwa9/DTNnWp1EginQ7lSpi9jAlSswdKg5rz50\nqNVpJJh0l0aRKHTwICQkqNBFpS5iC5pPl6tU6iI2sHevSl1MKnURG9BIXa7SQqlIhLt0CYYPh8ZG\nGDTI6jQSbFooFYkye/fC2LEqdDGp1EUi3M6dMG2a1SkkXKjURSKcSl2upTl1kQjW0mLOp3/xBYwc\naXUaCQXNqYtEkUOHYMgQFbp8R6UuEsE09SLXU6mLRLBdu1Tq0pZKXSSCaaQu11Opi0So06fh66/N\nc9RFrlKpi0SoXbvg7rvNj7ATuUr/OYhEKE29SEdU6iIRSqUuHdHFRyIR6G9/My86qq+HwYOtTiOh\npIuPRKLAvn2QkaFCl/ZU6iIR6IMPNPUiHVOpi0Sgd9+Fe+6xOoWEI82pi0SYb78159O9XvO+L2Jv\nmlMXsbndu835dBW6dESlLhJhKivh3nutTiHhSqUuEmFU6tIVzamLRJBz5yAhwbznS1yc1WmkP2hO\nXcTGduyAyZNV6NI5lbpIBNHUi3RHpS4SQVTq0h3NqYtEiNOnYcwYaGiA2Fir00h/0Zy6iE29+y7M\nmKFCl66p1EUihKZepCdU6iIRwDDgnXdU6tI9lbpIBPjsM/NPfR6pdEelLhIBysth7lyIibE6iYS7\nHpW62+0mPT0dl8tFcXFxu+//6U9/Iisri8zMTKZNm8ann34a9KAi0exqqYt0p9tTGv1+P2lpaWzb\nto3ExEQmTZpEaWkpGRkZrfvs3r2bO++8k5tvvhm3201RURF79uxp+0I6pVGkV+rqIDPT/Og6h8Pq\nNNLfgn5Ko8fjITU1leTkZBwOBwUFBZSXl7fZZ+rUqdx8880ATJkyhbq6ugBji0hnKiogL0+FLj3T\nban7fD6SkpJaHzudTnw+X6f7r1mzhry8vOCkExEqKjT1Ij3X7WUMMQGszGzfvp21a9eyc+fODr9f\nVFTU+nVubi65ubk9fm6RaHTuHOzaBZs3W51E+ktVVRVVVVW9/vluSz0xMRGv19v62Ov14nQ62+33\n6aef8uijj+J2uxk6dGiHz3VtqYtI99xu8wOmb7rJ6iTSX64f8C5ZsiSgn+92+iUnJ4eamhqOHTtG\nc3MzZWVl5Ofnt9nn+PHj3H///WzatInU1NSAAohI53TWiwSqRzf02rp1K0888QR+v5+FCxfyzDPP\nsGrVKgAKCwt55JFHeP311xk1ahQADocDj8fT9oV09otIQC5fhvh4+Pxz84MxJDoF2p26S6NImHr7\nbfj1r+G68ZFEGd2lUcQmNm6Ef/xHq1NIpNFIXSQMnTsHo0ZBTQ2MGGF1GrGSRuoiNrB5M9xzjwpd\nAqdSFwlD69fDQw9ZnUIikaZfRMLMkSPwwx+
a93wZONDqNGI1Tb+IRLiNG+GBB1To0jsaqYuEkZYW\nuOMOeOMNmDDB6jQSDjRSF4lg770HQ4ao0KX3VOoiYWTNGpg/3+oUEsk0/SISJo4fh+xsOHrUHK2L\ngKZfRCLW88/DggUqdOkbjdRFwkBjI7hc8NlnkJhodRoJJxqpi0Sgl16C++5ToUvfaaQuYrELF2D0\naNixA9LTrU4j4UYjdZEIs3Yt/N3fqdAlODRSF7FQczOMGQNlZTBlitVpJBxppC4SQV54AcaOVaFL\n8GikLmKREycgMxN27zbPfBHpiD7OTiRC/NM/mWe7LFtmdRIJZ4F2Z2wIs4hIJ3btgnffhUOHrE4i\ndqM5dZF+5vfD4sWwYgV8//tWpxG7UamL9LOVK80ynzfP6iRiR5pTF+lHHg/89KewZ49533SR7uiU\nRpEwdeYM/MM/wKpVKnQJHY3URfpBSwvMnWueuvj881ankUiis19EwtCKFdDQAK++anUSsTuVukiI\nrV4Nv/89fPCBPkxaQk+lLhJCa9bAb38L27fDqFFWp5FooFIXCZF16+DZZ81CT021Oo1EC5W6SJC1\ntMDy5eaUS2Wl7usi/UulLhJEZ87Agw/C2bPw4Yf6JCPpfzpPXSRIdu+Gu+4yP+xi+3YVulhDI3WR\nPjpxAp55BrZtg5IS87NGRayikbpILzU1wXPPwfjxkJBg3nFRhS5W00hdJECHD8OLL0JpKeTlmXPn\nOrtFwkW3I3W32016ejoul4vi4uIO93n88cdxuVxkZWVRXV0d9JAiVvvqK/jf/4Vp02DGDBg2DD7/\nHDZtUqFLeOmy1P1+P4sXL8btdnPgwAFKS0s5ePBgm322bNnCkSNHqKmpYfXq1SxatCikgcNVVVWV\n1RFCys7H19GxnT5tXtL/xBOQnQ05OWaJ/9d/wfHj5gVFCQn9n7U37Pzegf2PL1BdlrrH4yE1NZXk\n5GQcDgcFBQWUl5e32aeiooL58+cDMGXKFJqamqivrw9d4jBl9/+w7Hh8LS3mIueGDVVs2GAudubl\ngdMJY8bA2rUwciS89BKcPGleHZqXBzfeaHXywNjxvbuW3Y8vUF3Oqft8PpKSklofO51OPvzww273\nqaurIz4+PshRRdry++Hbb83t0iW4eNHcLlyAc+e+2775xjx/vLERvv7aLOiTJ+HUKRgyBBwOuHzZ\nLPLCQvPDoG+/HQboNAKJQF2WekxMTI+e5PrbQnb2c3Pm9DBVBDp8GPbv77/XC8ZdjLt6jmu/ZxhQ\nU2MuCF7/M1cfG0bbr6/9u+u3lpb2f17d/P6225Ur322XL5tbc7P5p98PgwaZI+dBg2DwYPje98zt\nBz8wt5tugqFD4ZZbzHuvDB9ujr6vbnFxUFRkbiK2YHRh9+7dxqxZs1ofL1261Fi+fHmbfQoLC43S\n0tLWx2lpacapU6faPVdKSooBaNOmTZu2ALaUlJSuarqdLkfqOTk51NTUcOzYMRISEigrK6O0tLTN\nPvn5+ZSUlFBQUMCePXsYMmRIh1MvR44c6eqlREQkCLos9djYWEpKSpg1axZ+v5+FCxeSkZHBqlWr\nACgsLCQvL48tW7aQmprK4MGDWbduXb8EFxGR9vrt4+xERCT0Qrq+v3nzZsaOHcsNN9zARx991Pr3\nx44dIy4ujuzsbLKzs3nsscdCGSNkOjs+gGXLluFyuUhPT+ftt9+2KGHwFBUV4XQ6W98zt9ttdaSg\n6MnFdZEsOTmZzMxMsrOzmTx5stVx+mzBggXEx8czfvz41r87c+YMM2fOZMyYMfzkJz+hqanJwoS9\n19Gx9er3LqAZ+AAdPHjQOHz4sJGbm2vs37+/9e9ra2uNcePGhfKl+0Vnx/fFF18YWVlZRnNzs1Fb\nW2ukpKQYfr/fwqR9V1RUZPzud7+zOkZQXblyxUhJSTFqa2uN5uZmIysryzhw4IDVsYIqOTnZaGxs\ntDpG0OzYscP46KOP2vTHU089ZRQXFxuGYRjLly83nn76aavi9UlHx9ab37uQjtTT09MZM2ZMKF/C\nUp0dX3l5OfPmzcPhcJCcnExqaioej8eChMFl2GymricX19mBnd636dOnM3To0DZ/d+0FkPPnz+eN\nN96wIlqfdXRsEPj7Z9nlFbW1tWRnZ5Obm8sHH3xgVYyQOHHiBE6ns/Wx0+nE5/NZmCg4Vq5cSVZW\nFgsXLozYf+Jeq6ML5+zwPl0rJiaGH//4x+Tk5PCHP/zB6jghUV9f33rGXXx8vO2uaA/0967PpT5z\n5kzGjx/fbnvzzTc7/ZmEhAS8Xi/V1dU8//zzPPDAA5w/f76vUUKiN8fXkZ5eyGWlzo61oqKCRYsW\nUVtby8cff8zIkSN58sknrY7bZ5HwnvTVzp07qa6uZuvWrbz00ku8//77VkcKqZiYGFu9r735vevz\nrXffeeedgH9m4MCBDBw4EICJEyeSkpJCTU0NEydO7GucoOvN8SUmJuL1elsf19XVkRgBH4PT02N9\n5JFHmGODy4Ovf5+8Xm+bf2HZwciRIwEYMWIE9913Hx6Ph+nTp1ucKrji4+M5deoUt912GydPnuTW\nW2+1OlLQXHssPf2967fpl2vnhRoaGvD7/QB8+eWX1NTUcMcdd/RXlJC49vjy8/N55ZVXaG5upra2\nlpqamog/8+DkyZOtX7/++uttVugj1bUX1zU3N1NWVkZ+fr7VsYLm4sWLrf8CvnDhAm+//bYt3rfr\n5efns2HDBgA2bNjAz372M4sTBU+vfu+CuHjbzmuvvWY4nU5j0KBBRnx8vDF79mzDMAzjz3/+szF2\n7FhjwoQJxsSJE42//OUvoYwRMp0dn2EYxnPPPWekpKQYaWlphtvttjBlcDz44IPG+PHjjczMTGPu\n3Lkd3goiEm3ZssUYM2aMkZKSYixdutTqOEH15ZdfGllZWUZWVpYxduxYWxxfQUGBMXLkSMPhcBhO\np9NYu3at0djYaNx7772Gy+UyZs6caXzzzTdWx+yV649tzZo1vfq908VHIiI2opuLiojYiEpdRMRG\nVOoiIjaiUhcRsRGVuoiIjajURURsRKUuImIjKnURERv5fzcNoRu1Nj8NAAAAAElFTkSuQmCC\n", "text": [ "" ] } ], "prompt_number": 3 }, { "cell_type": "markdown", "metadata": {}, "source": [ "The decision boundary is now above 5 instead of above 0, so any message with a capitalized word of length greater than 5 will be classified as spam. By changing the parameters of the sigmoid function we've changed how we classify emails. 
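\n", "\n", "To make this concrete, here is a minimal sketch (using the same ```a``` and ```b``` as the plot above; the helper ```h``` is just for illustration) showing which side of the decision boundary a couple of messages land on:\n", "\n", "```python\n", "from numpy import exp\n", "\n", "a, b = 5, 1  # the shifted sigmoid parameters from the plot above\n", "h = lambda x: 1 / (1 + exp(a - b*x))\n", "\n", "# longest capitalized word of length 3: h(3) is about 0.12, class 0 (not spam)\n", "print(h(3) > 0.5)\n", "\n", "# longest capitalized word of length 8: h(8) is about 0.95, class 1 (spam)\n", "print(h(8) > 0.5)\n", "```\n", "\n", "A message whose longest capitalized word is exactly 5 characters long sits right on the decision boundary, where $ h(x) = 0.5 $.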
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Generalizing to Multiple Dimensions\n", "\n", "In the discussion above we chose the length of the longest capitalized word to determine if a message was spam. However there are many more features of email messages that can be used, for example:\n", "\n", "- the email address they originate from\n", "- the presence of the word \"free\" or \"money\"\n", "- the presence of some symbols\n", "- the length of the message\n", "- etc. \n", "\n", "Later on in this notebook we'll use the logistical regression classifier to classify messages with 57 features. We need to find some way of generalizing our approach to many dimensions.\n", "\n", "Assume we can process any email message to extract a vector of real-valued features. Call this vector $ \\vec{x} $. To use this multidimensional feature vector with the sigmoid function we need to find a way of projecting it onto one dimension. Linear algebra provides us with a way of performing this projection operation: the dot product. We need another vector, call it $ \\vec{\\theta} $, in order to do this.\n", "\n", "If we let\n", "\n", "$$ \\vec{x} = [x_0, x_1, x_2 ... x_m] $$\n", "\n", "and \n", "\n", "$$ \\vec{\\theta} = [p_0, p_1, p_2 ... p_m] $$\n", "\n", "where $ m \\in \\mathbb{N} $ is the number of features we've extracted, then the dot product is\n", "\n", "$$ \\vec{x} \\cdot \\vec{\\theta} = x_0p_0 + x_1p_1 + x_2p_2 ... x_mp_m $$ \n", "\n", "If we fix $ x_0 = 1 $, then this looks similar to the exponential term in the sigmoid function above. We can rewrite it as\n", "\n", "$$ h(\\vec{x}, \\vec{\\theta}) = 1 / (1 + e^{\\vec{x} \\cdot \\vec{\\theta}}) $$\n", "$$ h(\\vec{x}, \\vec{\\theta}) = 1 / (1 + e^{p_0 + x_1p_1 + x_2p_2 ... x_mp_m}) $$\n", " \n", "Set $ m = 1 $ and this becomes\n", "\n", "$$ h(\\vec{x}, \\vec{\\theta}) = 1 / (1 + e^{p_0 + x_1p_1}) $$\n", "\n", "Which is identical to the original sigmoid function with the parameters $ a = p_0 $ and $ b = -p_1$. \n", "\n", "So if we are given a parameter vector $ \\vec{\\theta} $ and an email message, we can classify the email by:\n", "\n", "- extracting a feature vector $ \\vec{x} $ from the message\n", "- performing the dot product operation with $ \\vec{\\theta} $\n", "- passing the result into the sigmoid function\n", "- checking which side of the decision boundary the output is on\n", "\n", "It's clear that if we are given a new feature vector $ \\vec{x} $ the way we classify it depends *entirely* on the values in $ \\vec{\\theta} $. Our classification model is completely defined by the values in the parameter vector $ \\vec{\\theta} $. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Choosing the Parameters\n", "\n", "Following the conventions we set above, we call $ \\vec{\\theta} $ the *parameter vector*. The problem now becomes choosing the values in $\\vec{\\theta} $. \n", "\n", "In our previous discussion we chose the parameters $ a $ and $ b $ by guesswork. However with multiple dimensions and large training sets we need an algorithmic way of choose the values of $ \\vec{\\theta} $. Choosing the parameters is called *training* the classifier. \n", "\n", "The logistical regression classifier is a *supervised algorithm*. This means that in order to train the algorithm we must provide it with a *training set* of feature vectors $ \\{ \\vec{x_i} \\mid i = 1 .. n \\} $ and a set of *labels* $ \\{ \\vec{y_i} \\mid y_i \\in \\{ 0, 1 \\} \\forall i = 1 .. 
n \\} $ which indicate the correct class in which to place each of the $ \\vec{x_i} $.\n", "\n", "Training the algorithm depends on the concept of a *cost function* which does the following:\n", "\n", "- Given a parameter vector $ \\vec{\\theta} $,\n", "- For each feature vector $ \\vec{x_i} $ in the training set \n", " - Make a prediction from our sigmoid model, $ h( \\vec{x_i} , \\vec{\\theta} $)\n", " - Compare our prediction against the correct label for $ \\vec{x_i} $\n", " - If our prediction was incorrect, return a high value for the cost\n", " - If the prediction was correct, return a low value for the cost. \n", "\n", "To train the classifier we can use numerical methods to iteratively find the values of $ \\vec{\\theta} $ which minimize this cost function. \n", "\n", "The cost function for the logistical regression classifier is\n", "\n", "$$ J(\\vec{\\theta}, X, \\vec{y}) = (-1 / n)[ \\sum_{i=1}^{n} y_i log(h(\\vec{x_i}, \\vec{\\theta})) + (1 - y_i) log(1 - h(\\vec{x_i}, \\vec{\\theta}))] + \\lambda\\sum_{j=1}^{m} \\theta_j^2$$\n", "\n", "Where:\n", "\n", "- $ n \\in \\mathbb{N} $ is the number of training vectors in our training set\n", "- $ X $ is an $ n $ x $ m $ matrix containing all $ n $ vectors in training set, each with $ m $ features\n", "- $ \\vec{x_i} $ is the $ i^{th} $ vector in the training set\n", "- $ y_i $ is is the class label for $ x_i $\n", "- $ h(\\vec{x}, \\vec{\\theta}) $ is the sigmoid function used by our model, as above\n", "- $ \\vec{\\theta} $ is the parameter vector, as above\n", "- $ \\lambda $ is the *regularization parameter*\n", "\n", "The way we derive this function and the properties which make is useful as a cost function are outside the scope of this discussion. However we can see that it does what we need it to do:\n", "\n", "- if $ y_i $ is 1 and $ h(\\vec{x_i}, \\vec{\\theta}) $ is close to 1 then the first and second terms in the sum are close to 0 and the cost $ J $ is low\n", "- if $ y_i $ is 1 and $ h(\\vec{x_i}, \\vec{\\theta}) $ is close to 0 then the first term becomes large and the cost is high\n", "- if $ y_i $ is 0 and $ h(\\vec{x_i}, \\vec{\\theta}) $ is close to 1 then the second term becomes large and the cost is high\n", "- if $ y_i $ is 0 and $ h(\\vec{x_i}, \\vec{\\theta}) $ is close to 0 then both terms approach 0 and the cost is low\n", "\n", "By finding the cost of each training example $ \\vec{x_i} $ and summing them we get a total cost of $ \\vec{\\theta} $ over the entire training set. \n", "\n", "The final term in the cost function, $ \\lambda\\sum_{j=1}^{m} \\theta_j^2 $, is used to *regularize* the cost. Regularization prevents the parameters of $ \\vec{\\theta} $ from becoming too large by penalizing large values. This strategy tends to improve the accuracy of the classifier on the test set because it prevents the classifier from being overfitted to the training set. The parameter $ \\lambda \\in \\mathbb{R} $ can be used to control the degree of regularization and is called the *regularization parameter*.\n", "\n", "We now need to find the values of $ \\vec{\\theta} $ that minimize the cost function $ J $. The problem of minimizing functions is a well-studied branch of mathematics and there are many libraries that can do this for us. Having chosen the cost function $ J $ we can have the ```scipy.optimize``` library automatically find the best value of $ \\vec{\\theta} $, thereby training our classifier. 
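\n", "\n", "To make the definition concrete, here is a minimal, illustrative sketch of this cost function in plain ```numpy```. This is not the notebook's ```src/logistic.py```, and the names are hypothetical; it simply translates the sum in $ J $ directly, one training example at a time:\n", "\n", "```python\n", "import numpy as np\n", "\n", "def h(x, theta):\n", "    # sigmoid model, using the e^(+x.theta) sign convention from above\n", "    return 1.0 / (1.0 + np.exp(np.dot(x, theta)))\n", "\n", "def cost(theta, X, y, lam=1.0):\n", "    # total cost of the parameter vector theta over the training set {X, y}\n", "    n = len(y)\n", "    total = 0.0\n", "    for i in range(n):\n", "        p = h(X[i], theta)\n", "        total += y[i] * np.log(p) + (1 - y[i]) * np.log(1 - p)\n", "    # the regularization term skips theta_0, matching the sum from j = 1\n", "    return -total / n + lam * np.sum(theta[1:] ** 2)\n", "```\n", "\n", "The next section shows how to replace this explicit Python loop with vectorized matrix operations.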
\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Vectorizing the Cost Function\n", "\n", "There's one important thing we need to do before training the classifier - we need to vectorize the function $ J $. \n", "\n", "The function $ J $ as we wrote it performs an operation and sums over the entire training set. There are various optimizations in the ```numpy``` and ```scipy``` libraries that depend on using matrices and vectors. We can express our cost function in terms of matrix and vector operations to take advantage of these optimizations. \n", "\n", "We can write our cost function as\n", "\n", "$$ J(\\vec{\\theta}, X, \\vec{y}) = (-1 / n) [\\vec{y}' \\cdot log(h(X, \\vec{\\theta})) + (\\vec{I}' - \\vec{y}') \\cdot log(\\vec{I} - h(X, \\vec{\\theta}))] + \\lambda \\vec{\\theta}\\cdot\\vec{\\theta}'$$\n", "\n", "Where:\n", "\n", "- $\\vec{I}$ is the unit vector, a vector of ones\n", "- the $ ' $ symbol represents the transpose of a vector\n", "- the $ log $ and $ h() $ functions are applied to each element of the vectors they operate on\n", "- all the other parameters are as above\n", "\n", "The code included with this notebook implements the algorithm in this way. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Putting it all Together\n", "\n", "Now we're ready to look at some code. We have 3 components of our classifier algorithm:\n", "\n", "- the prediction function\n", "- the cost function\n", "- the optimization function\n", "\n", "Implementations of each of these components can be found in ```src/logistic.py```:\n", "\n", "- The ```predict()``` function takes a matrix of feature vectors $ X $ and the parameter vector $ \\vec{\\theta} $. It mulitples the feature matrix by the parameter vector and then applies the sigmoid function to each element of the resulting vector. This creates a vector of $ n $ predictions for each of the $ n $ feature vectors in $ X $. \n", "- The ```cost()``` function computes the cost of a parameter vector $ \\vec{\\theta} $ over a training set $ \\{ X $ , $ \\vec{y} \\} $.\n", "- The ```train()``` function finds the parameter vector which minimizes the cost function. \n", "\n", "To train our classifier, we pass a training set $ \\{ X , \\vec{y} \\} $ and an initial guess at the model parameters $ \\vec{\\theta} $ into the ```train()``` function. The optimization routines in ```scipy.optimize``` will find the parameter vector which minimizes the ```cost()``` function over the training set $ \\{ X , \\vec{y} \\} $." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Classifying Spam\n", "\n", "To test the classifier, we'll use the well-known [UCI Machine Learning Repository Spambase Data Set](https://archive.ics.uci.edu/ml/datasets/Spambase). This data set is constructed by extracting feature vectors with 57 dimension from over 4600 emails. Each email has been manually labelled as spam and not-spam, making this perfect for our purposes. \n", "\n", "To download the data set, run the ```download()``` method in the ```src/spambase.py``` file:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "from src import spambase\n", "spambase.download()" ], "language": "python", "metadata": {}, "outputs": [ { "output_type": "stream", "stream": "stdout", "text": [ "downloading spam data set ...\n", "download complete, unzipping data ..." 
] }, { "output_type": "stream", "stream": "stdout", "text": [ "\n", "complete\n" ] } ], "prompt_number": 4 }, { "cell_type": "markdown", "metadata": {}, "source": [ "We use the ```read_data()``` function in ```src/spambase.py``` to read $ X $ and $ \\vec{y} $, split them into train and test sets, and generate the initial value for $ \\vec{\\theta} $:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "theta, Xtrain, Xtest, ytrain, ytest = spambase.read_data()" ], "language": "python", "metadata": {}, "outputs": [], "prompt_number": 5 }, { "cell_type": "markdown", "metadata": {}, "source": [ "Then we train the classifier using the ```train()``` function in ```src/logistic.py```:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "from src import logistic\n", "trained_theta = logistic.train(theta, Xtrain, ytrain)" ], "language": "python", "metadata": {}, "outputs": [ { "output_type": "stream", "stream": "stdout", "text": [ "Optimization terminated successfully.\n", " Current function value: 0.615512\n", " Iterations: 5\n", " Function evaluations: 660\n", " Gradient evaluations: 11\n" ] } ], "prompt_number": 6 }, { "cell_type": "markdown", "metadata": {}, "source": [ "```trained_theta``` now defines our model and we can use it to make predictions. To test our classifier we generate a set of predictions for ```Xtest``` and then check the precision and recall against ```ytest```." ] }, { "cell_type": "code", "collapsed": false, "input": [ "from numpy import matrix, array\n", "predictions = logistic.predict(matrix(trained_theta).transpose(), Xtest)\n", "\n", "# recall that 0.5 is our decision boundary\n", "predictions = [1 if p > 0.5 else 0 for p in predictions]\n", "\n", "# make a list of tuples, each tuple contains the predicted class and the labeled class\n", "predictions = zip(predictions, [i[0] for i in ytest.tolist()])\n", "print(predictions[:10])" ], "language": "python", "metadata": {}, "outputs": [ { "output_type": "stream", "stream": "stdout", "text": [ "[(0, 1), (0, 1), (1, 1), (0, 0), (0, 0), (1, 0), (1, 0), (0, 0), (1, 1), (1, 1)]\n" ] } ], "prompt_number": 7 }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can now calculate the precision and recall of our classifier:" ] }, { "cell_type": "code", "collapsed": false, "input": [ "# true positives: marked as spam and actually spam; false positives: marked as spam but not spam\n", "tp = len([1 for x in predictions if x[0] == 1 and x[0] == x[1]])\n", "fp = len([1 for x in predictions if x[0] == 1 and x[0] != x[1]])\n", "precision = float(tp) / (float(tp) + float(fp))\n", "\n", "# true negatives: correctly marked as not-spam; false negatives: spam we failed to catch\n", "tn = len([1 for x in predictions if x[0] == 0 and x[0] == x[1]])\n", "fn = len([1 for x in predictions if x[0] == 0 and x[0] != x[1]])\n", "recall = float(tp) / (float(tp) + float(fn))\n", "\n", "print(\"Precision: {0}\".format(precision))\n", "print(\"Recall: {0}\".format(recall))" ], "language": "python", "metadata": {}, "outputs": [ { "output_type": "stream", "stream": "stdout", "text": [ "Precision: 0.916955017301\n", "Recall: 0.864600326264\n" ] } ], "prompt_number": 8 }, { "cell_type": "markdown", "metadata": {}, "source": [ "Of all the messages we marked as spam, about 92% were actually spam. Of all the spam messages in the test set, we found about 86% of them. Not bad for our first try. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Next Steps\n", "\n", "There are many improvements that we could make to the algorithm, and many details that were omitted from our discussion for the sake of brevity. 
\n", "\n", "### Choosing the Initial Values of $ \\vec{\\theta} $\n", "\n", "In ```src/spambase.py``` the initial values of $ \\vec{\\theta} $ are chosen to be normally distributed about 0. This was done to prevent the ```cost()``` function from finding an initial cost of ```Inf``` or ```-Inf```, which is possible because of the way ```cost()``` uses the logarithm. By initializing $ \\vec{\\theta} $ around 0, we ensure that the values passed into the logarithm are manageable and the function does not run away. \n", "\n", "There are other ways of dealing with this problem but we did not implement them here to avoid the complexity. The purpose of this exercise was to learn how to implement the basics of logistical regression classifier, not deal with all possible edge cases. \n", "\n", "### The Gradient\n", "\n", "The optimization routine we used to train the classifier was ```scipy.optimize.fmin_cg()```. This is an implementation of gradient descent which seeks the optimal solution by following the gradient (the derivative in multiple dimensions) down to the minimal point. The algorithm depends on being able to calculate the gradient at each point and its performance can be improved by finding the gradient function and passing it in to the ```fprime``` parameter. Since we did not do this, the ```fmin_cg()``` function was forced to numerically estimate the gradient at each point which hurt performance. \n", "\n", "### Better Implementations\n", "\n", "It's important to note that the code included with this notebook is meant as a basic implementation of the logistical regression classifier and you should not use this code for anything other than instruction. If you're looking for a classifier to use in production you should use one of the standard implementations such as ```sklearn.linear_model.LogisticRegression```.\n", "\n", "### Comparisons to Other Classifiers\n", "\n", "In this discussion we did not say anything about how the logistical regression classifier compares to other classifiers such as the naive bayes or the support vector machine. Different algorithms perform better on different problems and you should evaluate several before choosing one solution for the problem you are working on. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Conclusion\n", "\n", "In this notebook we went through the basics of how a logistical regression classifier works and wrote a simple implementation. We then evaluated its performance on a spam classification problem and found that it performed fairly well. \n", "\n", "There are many improvements that could be made but I hope this was useful, if only for a basic introduction to some of the concepts in machine learning and classification. \n", "\n", "Thanks for reading!" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Appendix\n", "\n", "### More Resources:\n", "\n", "- Machine Learning on Coursera: https://www.coursera.org/learn/machine-learning\n", "- An Introduction to Statistical Learning: http://www-bcf.usc.edu/~gareth/ISL/getbook.html\n", "- ```Scikit-learn```'s implementation of the logistic regression classifier: http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html\n", "- Wikipedia: http://en.wikipedia.org/wiki/Logistic_regression\n", "\n", "### Meta\n", "\n", "- Author: Matthew Fournier\n", "- Website: [http://mfournier.ca](http://mfournier.ca)\n", "- First Published: May 18, 2015" ] } ], "metadata": {} } ] }