{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "## Homework II: Logistic Regression, SVM and some computational drug design\n", "\n", "__Given date:__ Saturday March 7 \n", "\n", "__Due date:__ Friday March 20\n", "\n", "__Total__ : 30 pts + Bonus (One of the bonus question is on 3pts. For the other, it depends on what you can do)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise I.1. Logistic regression (10pts)\n", "\n", "Use the lines below to load the variables __HW2_ExI_X__ and __HW2_ExI_Y__. In this first exercise, you will learn a logistic regression classifier on this dataset. Recall that the logistig regression model takes the form\n", "\n", "$$p(t=1|\\mathbf{x}) = \\sigma(\\mathbf{\\beta}^T\\tilde{\\mathbf{x}})$$\n", "\n", "where $\\tilde{\\mathbf{x}} = [1, \\mathbf{x}] = [1, x_1, x_2, \\ldots, x_D]$. Consequently, we thus have \n", "\n", "$$p(t = 0|\\mathbf{x}) = 1 - \\sigma(\\mathbf{\\beta}^T\\tilde{\\mathbf{x}})$$\n", "\n", "we can then write the total probability that a point $\\mathbf{x}$ will be from class $c = \\left\\{0,1\\right\\}$ as \n", "\n", "$$p(t = c|\\mathbf{x}) = p(t = 1|\\mathbf{x})^{c} p(t = 0|\\mathbf{x})^{1-c} $$\n", "\n", "or equivalently\n", "\n", "$$p(t = c|\\mathbf{x}) = (\\sigma(\\mathbf{\\beta}^T\\tilde{\\mathbf{x}}))^{c} (1-\\sigma(\\mathbf{\\beta}^T\\tilde{\\mathbf{x}}))^{1-c} = p^c (1-p)^{1-c}$$\n", "\n", "which is a binomial distribution with probability of success $\\sigma(\\mathbf{\\beta}^T\\tilde{\\mathbf{x}})$. If we assume that all the samples are independent, the probability of observing the dataset can read as the product\n", "\n", "$$p(\\left\\{\\mathbf{x}_i, t_i\\right\\}) = \\prod_{i=1}^N p(t = t(\\mathbf{x}_i)|\\mathbf{x}_i)\\quad (*)$$\n", "\n", "We can then try to learn the parameters $\\mathbf{\\beta}$ that maximize this probability (i.e such that the probability $p(t = t_i |\\mathbf{x}_i)$ is high for every sample pair in the dataset. To do this, we take the negative logarithm of $(*)$ which gives \n", "\n", "$$-\\log \\prod_{i=1}^N p(t = t(\\mathbf{x}_i)|\\mathbf{x}_i) = -\\sum_{i=1}^N c_i \\log(\\sigma(\\mathbf{\\beta}^T\\tilde{\\mathbf{x}}_i)) - \\sum_{i=1}^N (1-c_i)\\log(1-\\sigma(\\mathbf{\\beta}^T\\tilde{\\mathbf{x}_i}))\\quad (**)$$\n", "\n", "and find the $\\mathbf{\\beta}$ that minimizes this expression." 
] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAD8CAYAAAB0IB+mAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAF5pJREFUeJzt3W+MHddZx/Hfk252vUX8aWWnLfUG\nx2mK7BaI8DaCF4BKS5sismmQgtxXRUW4VA4CBEV2XSEQqmS1iAiBkYggoiCo0xbCXlFoafqCConQ\nrqskjTFp45Skdilxy4tWwo7j9uHF3A3r9cy9c+/MmTlnzvcjXa139vreM3Pv/u7Zc545Y+4uAMDw\nXdd3AwAA3SDwASATBD4AZILAB4BMEPgAkAkCHwAyQeADQCYIfADIBIEPAJlY6LsBW+3cudP37NnT\ndzMAICmnTp36mrvvmna/qAJ/z5492tjY6LsZAJAUM3u6zv0Y0gGATBD4AJAJAh8AMkHgA0AmCHwA\nyASBjziMRtL+/dKOHcXX0ajvFgGDQ+Cjf6ORdPCgdOaM9NxzxdeDBwl9oGUEPuoL1Qs/ckS6ePHq\nbRcvSkePtvP4ACQR+KgrZC/8qafKt5892/yxAbyAwEc9IXvhe/eWb7/55uaPDeAFBD7qCdkLP35c\nWl6+etvycrEdQGsIfNQTshe+tiadPFnMCywtFV8feEC6447mjw3gBVEtnoaIHT9ejNlvHdZpsxe+\ntlbcAARDDx/10AsHkkcPH/XRCweSRg8fADJB4ANAJgh8AMgEgQ8AmWgl8M3sfjN71swe37LtpWb2\nSTP74vjrS9p4LgDAfNrq4f+5pNu3bTsi6VPufoukT42/R+xYphgYrFYC390/Lel/tm2+U9IHx//+\noKS3tvFcCIhlioFBCzmG/zJ3/6/xv78q6WUBnwttYJliYNA6mbR1d5fkZT8zs0NmtmFmGxcuXOii\nOajCMsXdYegMPQgZ+P9tZq+QpPHXZ8vu5O73ufuqu6/u2rUrYHMwFcsUd4OhM/QkZOCPJL19/O+3\nS1oP+FzD00cPkGWKu8HQGXrSVlnmhyT9q6TvN7NzZvYLko5L+ikz+6KkN46/Rx199QBZIK0bDJ2h\nJ1YMr8dhdXXVNzY2+m5G//bvL0K+bPvp0923B+3i9UXLzOyUu69Oux9n2saIHuCwMXSGnhD4MWLy\ndNgYOkNPWA8/RqGvLoX+cW0B9IAefozoAQIIgMCP1dpaMYF36VLx9Y474jhZJ1QbRiNpZUW67rri\ntrJCXTrQMgI/FTGcrBOqDaORdPfd0rlzkntxO3eu2EboA62hLDMVMZTyhWpD1eO28dhABijLHJoY\nSjUntaHJUE/V424+9qxiGPoCIkTgpyKGUs2qNtxwQ7OhnqrHlWbfvxiGvoBIEfipiOFknao2uM+2\nNsz2Hvhdd0mLi9feb2Fh9v1jnRqgEoGfihhKNavaULWsddlwTFkP/N57q2vSZ51jimHoC4gUk7Zo\nbpbJ3Kr7LixIV65cu31lRXrmmTBtAQaCSVt0Z5bhpqoeeFnYS0V5Zqi2AJkh8NHcLMNNkyZou24L\nkBmGdNCtzTH87esEvfjF0te/fu39Zx3SATLEkA7iVNUDf+c7r73v4qJ04kT3bayDWn8kiMBH97av\nE+ReVOpsZSa9+91xDsVQ649EEfgpGlrvsqx23l168MF+2jMNtf5IFIGfmiH2LkPVzof6YMyx1n9o\nnYxMEfipGWLvMsSyESE/GGNY5qJLQ+xkZIrAT80Qe5chaudDfjDmVus/xE5GpvIO/BT/TB1i7zJE\n7XzID8bcav2H2MnIlbtHcztw4IAHsb7uvm+f+9JS8XV9vbgtL29ebqO4LS8X22NW1e7RKMxzbT9u\nqdi37+pjtHnbv7+/NqV6PGM8lriKpA2vkbG9h/zWW5DArwrI3bvTfROvrxftXFoqvjYN+yF9IG6a\n9MFYFbwhAznl49llJwNzIfA3VfVOzMq3Ly2134aYDfEDcVPZB+N73nPta7+8XGwPGcgx9pJn+YBr\nu5OBVtUN/OEvrbBjR1FZsJ1Z+dK7ua2qWLW6ZNXxWVoqTphKzWgkHT5cvRjb4qJ0+fK129t6P1S9\nD/s6nlVLXJw8Wb1UNaLF0gqbqiY5d+/Oq9KiyqTLC5ZJcXJ4M9wmrbxZFvZSexOTsU22U3mTpeEH\nflUJ3YkTeVVaVBniB+L26qvDh68Nt+3KrrgltRfIsZVyUnmTpzrjPl3dglbpbI4/rqwU49OpVUqE\nMm1yM/TkcNvK9mfazcz92LHwE5MxjYPHOKeAuYlJ2xIpV0qE1EUQdXXsq4JsUti/973/38ZYAjk0\nKm8GhcAv01evJsb6667b1NWxX1qqH/YrK3kHXE4fcANH4JepCoN5SjHrBmaMf1Wsr7svLl7dpsXF\nsG1q89hPUvXBsrJCuGGwCPwybfUyZwnxGMdKq2rsV1bCPWdXx4GhCmSobuAPv0pnq7YqJWYpaYux\nGuL8+fLts14wfBZdVankts4NMIO8Ar+tMJglxGOrv+5Ll0G8/YpauYV9iosCoht1/gzo6hZ8SKct\nswxPxDjEUDWks3Nnf21KXSwT8zHOGSE4MaQT0CzDEzEOMZw4IS0sXLv9G98Ybm8wZK83pguEcAYt\nJgi+lo6Z/aekb0r6lqQrPmG9hyBr6YQyGhW/RGfPFsMzx4+nNXSwslI+Zj/EtYRCrxtTtR5RH8cy\ntjV70InY1tJ5vbvfWqdByUh9nPjChfLtQzy1PnSvt82J+aZ/iTBnlJ4O51wY0slVVTDccMPwJvxC\nV0q1FbJtDA3FtmYPJut6OLDOQH+Tm6QvSfqcpFOSDpX8/JCkDUkbN954Y6g5DWxXNrm3uHjtCVlD\nmPALfQ5AWxPzbZ4nwklmaWjpNVcsJ15JeuX46w2SHpX041X3TaZKZyi2B8MQLnpSpotKqTZCtquz\nkRGPll7zuoEffEjH3c+Pvz4r6UFJt4V+TtS0fR5iqOP6XVRKtTGnw/h7fjp+zYMGvpl9h5l95+a/\nJb1J0uMhnxMNDDlwUphkb2v8nROv0tHxnEvoHv7LJP2LmT0q6TOSPubuHw/8nJhXShN+Qwy1Nv4S\niemcAEzX8Xk6w7+mLWaTwvkFXI+1WkznBKAzdevwCXykh1CrxolXWYrtxCugPTGuQBqLIc/DoDEC\nH+kh1KqlNA+DzhH4SA+hVi3GxfoQDQIf6SHUJkuhBBW9IPCRpmmhNsSyTaAhAh9xahLY1KIDpQh8\nxKdpYHMREKAUgY/4NA1syjanY8grSwQ+4tM0sCnbnIwhr2wR+IhP08CmbHMyhryyReAjPk0Dm7LN\nyRjyyhaBj/i0EdjUo
ldjyCtbC303ACi1tsbKl6EcP16+2ihDXoNHDz9lVFpgHgx5ZYvlkVPFmvAA\nxlgeeeiotAAwIwI/VVRaAJgRgZ8qKi0AzIjATxUnFwGYEYGfKiotAMyIOvyUUasOYAb08AEgEwQ+\nAGSCwAeATBD4AJAJAh8AMkHgA0AmCHwAyASBDwCZIPABIBMEPgBkgsAHgEwQ+ACQCQIfADJB4ANA\nJoIHvpndbmZPmNmTZnYk9PMBAMoFDXwze5GkE5LeImm/pLeZ2f6Qz4mMjUbFhWB27Ci+jkZ9twiI\nSuge/m2SnnT3p9z9sqSTku4M/JzI0WgkHTwonTkjPfdc8fXgQUIf2CJ04L9S0pe3fH9uvA1o15Ej\n0sWLV2+7eFE6erSf9gAR6n3S1swOmdmGmW1cuHCh7+YgVU89Vb797Nlu2wFELHTgn5e0suX73eNt\nL3D3+9x91d1Xd+3aFbg5GKy9e8u333xzt+0AIhY68D8r6RYzu8nMFiUdlMSgKtp3/Li0vHz1tuXl\nYjsASYED392vSLpH0icknZH0YXc/HfI5kam1NenkyaI6Z2mp+PrAA9Idd/TdMiAa5u59t+EFq6ur\nvrGx0XczACApZnbK3Ven3a/3SVsAQDcIfADIBIEPAJkg8AEgEwQ+AGSCwAeATBD4AJAJAh8AMkHg\nA0AmCHwAyASBDwCZIPABIBMEPgBkgsAH0AjXjk8HgQ9gblw7Pi0EPoC5ce34tBD4AObGtePTQuAD\nmBvXjk8LgQ9gblw7Pi0EPoC5ce34tCz03QAAaVtbK26IHz18AMgEgQ8AmSDwW8ZZhwBiReC3iLMO\nAcSMwG8RZx0CiBmB3yLOOgQQMwK/RZx1CCBmBH6LUjnrMLeJ5dz2F6gyyMDv6xc8hbMOc5lY3nwP\nXH+99Na3Dn9/gTrM3ftuwwtWV1d9Y2Oj0WNsBtrWydPl5SKIORuwCMEzZ8q3nz7dfXtCKHsPbDek\n/QXM7JS7r0673+B6+FTKTNbmxHKsQyVl74HtmEhHjgYX+FTKTNbWxHLMQ0NV74GtmEhHjgYX+FTK\nTNbWxHLMf0lVvQc2xTiRDnRhcIGfSqVMX9qaWI75L6my94BZMYEb40Q60JXBBX4KlTJ9W1srJiwv\nXSq+znNsYv5Lquw9sL4uXb48//4CQzC4Kh10o6oaig9XoHu9V+mY2W+b2Xkze2R8++lQz4Xu5f6X\nVKwVSsAkoYd07nX3W8e3fwj8XOhYG0NDKZpUoTS0D4Kh7U/uBjWGz5sTXaiqULrnnnhLVecRc+kt\n5hM68O8xs8fM7H4ze0nIJ+LNia5UVSidOxdvqeo8Yi69xXwaTdqa2UOSXl7yo2OSHpb0NUku6Xcl\nvcLd31HyGIckHZKkG2+88cDTTz89V1tyWDIAcah6r1VZWJCefz5ce0LZsaPoPG23tFQM4yEedSdt\nO6nSMbM9kv7e3V876X5NqnR4c6IrZRVKZlLVr9LiYvl7M3Z0otIRQ5XOK7Z8e5ekx0M9lxR3Xfg8\nupyPYO5jNlsrlBYWJoe9NPlns+j6deIkxgFy9yA3SX8p6fOSHpM0UjGkM/H/HDhwwOe1vu6+vOxe\n/HoVt+Vl99Fo7ofsTdW+rK+n/VxDtG/f1ceu7LZ/f/Pn6et1Wl8v2r+0VHxN8fcpB5I2vE4u17lT\nV7cmge8+nDdnVYi0ERx9PteQrK/XC/u2Oh28TpikbuBzpm2EupyPYO5jdqORdPfdxVINVcykffuK\n4Y82zk/gdcIkvY/hY35dzkcMbe6jC4cPTw775eVi7Z42T0bjdUIbCPwIdTlZxsTc7M6fr/5ZqCUm\neJ3QBgI/Ql2uU5P7mjhtMgu3xASvE9rAGD4wo5WV4qzasu3PPNN9ewDG8Adusyb7+uuLHt/iIjX0\nXTlxojjeWy0uFtuBmBH4PZvnZJqt6wZduVJMID7/POsHdWVtTfrIR64eXvnoRxleQfwY0ulR1UVE\nTp4sQmXr/Y4cKRbt2rtX+uY3y4cUNnHqO5CXqNbSqSu3wK+zVknZh8I01GYDeWEMPwF1LgRetkTt\nNHVrs1lDB8gLgd+jOifTVH0oVJlUm7014FdWirNF275+AB8iaeJ1y0Sd9Re6ujVdSyc1dRZ8q1pD\nZWWlWEdlYcF9cdH9+usnrx9U9lxtr80ScoGvzbVrlpaKryzu1h4W0Eufclw8LUXTFnxraxXQOgt9\nSUU75hVqga/UAyn2D6s+F2aL/dikgsAfkDZWAV1aqhf4TX7Jq55j3g+RzTAw6y+Qmkrhw6rt162u\nFI5NKuoGPmP4CVhbK6p2Ll2a/9T9qvmCrZquzdLmAl9bzzXwikKyrZPbsUrhurB9LcyWwrEZGgI/\nMfNOrpUtvrW4WEzetrU2S5sLfNWpTkphpcg6lVh962ththSOzeDU+TOgqxtDOpM1/RO46dBQnfHW\nti5CU2cIavfu+f/8n3fseNb/l8qFS/q4eFAqxyYFYgx/ePqeXOtyvLXuJPM8bZh3X+b5f0O69Gbb\nODbtIfAHqK/JNffuP2zKwmDS5O0sPe9592Xe/zeUS2+GwLFpR93AZ2mFhNRZiiGUPi6xNxoVE3hn\nzxbj9V/4QrFY3HYLC8WqodPWJNo0775wmUHEiqUVBqjPqx71UcmxvTrpllvK73fddbNVe8y7L1xm\nEKkj8BPS51WPYrjEXlUbqv5Irar2mHdfYjgGIbG8QgbqjPt0dWMMP25djLdOG4sva8M8Y+vbH+fY\nsXpzAEMdc+YkqLSJSVukpu3qmbphTNhRIpm6uoHPkA6iMe+Zl02Hujjjk5OgcrHQdwOATU1CZ22t\nvCIn9PMOxd695RVgTEgPCz38gUpxAq6vKhiqb4Y/IY0CgT9AWxcea/PiJqH1FTp9PW9MH8p9VoCh\nQ3UG+ru6MWnbjpQn4Pqqgun6eZkoRpvEpG1+NnuMZWOxUhpj0ttPtnLvphfcxhLUswgxURzTXwyp\nGvwxrPOp0NWNHv786lzCMIUe/lZD7gVXrYu0sDD/Kp5DPVZdSfkYijr8vExbXTK2VQjrLHaW8tDU\nNFX7tn2BuLqBM+Rj1ZWUj2HdwGdIZyCqSgul+Cbg6k4qD7lcsmyi2OzaZSLqDvMM+Vh1JYdjSOAP\nRFVp4eZKmrGEvVR//HrI5ZJlVTELFWfF1AmcIR+rruRwDAn8gUipjrpuTyqlfZrH9oniV72q/H51\nAmfox6oLORxDAn8gmtRRd12ZULcnlVtteJ3AqXqtcjtWIWRxDOsM9Hd1Y9K2fXVWn+y6MoFL21Wb\ndD5AylUkCEtdVOlIulvSaUnflrS67WdHJT0p6QlJb67zeAR+u+oERF+VCUNdZjiklKtIEFbdwG90\niUMz2zcO+z+R9BvuvjHevl/ShyTdJul7JT0k6dXu/q1Jj8clDttV55KIXLYvHbxWqNLJJQ7d/Yy7\nP1HyozslnXT359z9Syp6+rc1eS7Mrs7kaA6VCUPBa4WmQk3avlLSl7d8f2687R
pmdsjMNsxs48KF\nC4Gak6c6AZFDZcJQ8FqhqamBb2YPmdnjJbc722iAu9/n7qvuvrpr1642HhJjdQIii8qEgeC1QlNT\nL4Di7m+c43HPS1rZ8v3u8TZ0aDMgjh4thnFuvrkI++0B0eTiIegWrxWaCHXFq5Gkvzaz31cxaXuL\npM8Eei5MQEAA2NRoDN/M7jKzc5J+VNLHzOwTkuTupyV9WNK/S/q4pMPTKnQAAGE16uG7+4OSHqz4\n2fskva/J4wMA2sPSCgCQCQIfADJB4ANAJgh8AMgEgQ8AmWi0eFrbzOyCpKf7bsecdkr6Wt+NCGCo\n+yUNd9/Yr7S0sV/f5+5TlyqIKvBTZmYbdVarS81Q90sa7r6xX2npcr8Y0gGATBD4AJAJAr899/Xd\ngECGul/ScPeN/UpLZ/vFGD4AZIIePgBkgsBvyMzuNrPTZvZtM1vd9rOjZvakmT1hZm/uq41Nmdmt\nZvawmT0yvjrZYC5XaWa/bGb/MX4N3993e9pmZr9uZm5mO/tuSxvM7APj1+sxM3vQzL6n7zY1YWa3\nj/PhSTM7Evr5CPzmHpf0s5I+vXXj+ELuByW9RtLtkv7YzF7UffNa8X5Jv+Put0r6rfH3yTOz16u4\n/vIPuftrJP1ez01qlZmtSHqTpGf6bkuLPinpte7+g5K+IOloz+2Z2zgPTkh6i6T9kt42zo1gCPyG\nMrmQu0v6rvG/v1vSV3psS5veJem4uz8nSe7+bM/tadu9kn5Txes3CO7+T+5+Zfztwyquppeq2yQ9\n6e5PuftlSSdV5EYwBH44tS/knoBflfQBM/uyil5wsr2qbV4t6cfM7N/M7J/N7HV9N6gt42tOn3f3\nR/tuS0DvkPSPfTeigc4zItQlDgfFzB6S9PKSHx1z9/Wu2xPCpH2U9AZJv+buf2NmPyfpzyTNc63j\nzk3ZrwVJL5X0I5JeJ+nDZrbXEyldm7Jv71ExnJOcOr9vZnZM0hVJf9Vl21JH4NeQw4XcJ+2jmf2F\npF8Zf/sRSX/aSaNaMGW/3iXpb8cB/xkz+7aKdU0udNW+Jqr2zcx+QNJNkh41M6l4733OzG5z9692\n2MS5TPt9M7Ofl/Qzkt6Qyodzhc4zgiGdcEaSDprZkpndpLQv5P4VST8x/vdPSvpij21p099Jer0k\nmdmrJS1qAItzufvn3f0Gd9/j7ntUDBX8cAphP42Z3a5iXmLN3f+37/Y09FlJt5jZTWa2qKLIYxTy\nCenhN2Rmd0n6Q0m7VFzI/RF3f7O7nzazzQu5X1HaF3L/RUl/YGYLki5JOtRze9pyv6T7zexxSZcl\nvT3xHmMO/kjSkqRPjv96edjdf6nfJs3H3a+Y2T2SPiHpRZLud/fTIZ+TM20BIBMM6QBAJgh8AMgE\ngQ8AmSDwASATBD4AZILAB4BMEPgAkAkCHwAy8X+Y9usk9ukdSgAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from sklearn.datasets import make_blobs\n", "import matplotlib.pyplot as plt\n", "import numpy as np\n", "\n", "\n", "np.load('Ex1_HW2_X.npy') \n", "np.load('Ex1_HW2_t.npy') \n", "\n", "X_class1 = X1[Y1==0,:]\n", "X_class2 = X1[Y1==1,:]\n", "\n", "plt.scatter(X_class1[:, 0], X_class1[:, 1], marker='o', c='r',\n", " s=35)\n", "plt.scatter(X_class2[:, 0], X_class2[:, 1], marker='o', c='b',\n", " s=35)\n", "\n", "plt.show()\n" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "__Question I.1.1 Logistic from scratch (7pts)__ Write a function that takes as inputs a set of training pairs $\\left\\{\\mathbf{x}_i, t_i\\right\\}$ such as those stored in the variables $\\texttt{HW2_ExI_X}$ and $\\texttt{HW2_ExI_Y}$, and return the logistic regression classifier by learning it through gradient descent from the minimization of the negative log likelihood function $(**)$. Apply your function to the dataset given above and plot the discriminant function on top of this dataset.\n" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# start by coding the sigmoid function. \n", "\n", "\n", "def sigmoid(x):\n", " \n", " '''The function should return sigma(x) = 1/(1+e^-x) at the point x'''\n", " \n", " return sigma\n", " \n", " \n", "\n", "\n", "\n", "def LogisticRegression(X, y):\n", " \n", " '''The function should learn a logistic classifier '''\n", " \n", " \n", " \n", " \n", " return beta\n", " " ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "__Question I.1.2 Logistic vs OLS (3pts)__ Consider the dataset given below. On this dataset, __using the corresponding modules from scikit__, learn a linear regression classifier. Then learn a logistic classifier. How do the two approaches compare with each other? 
Display each each of the classifiers using 'meshgrid + contourf'" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD8CAYAAABn919SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAE8dJREFUeJzt3W2MXGd5xvHrIobdsUuBJksaCNRx\nWqUeRSXEoxQKRLQmiECyINRKRoKiCtWtGtEEVCGnX6p+aRcJVfQTUpQUkCBxIBDtKkJREAVEpZJq\nnLiNjXmp3RASkniQAhSwE6B3P5wxXq9nduflvD7n/5NGY49nZ+4d77n2Oc/9nHMcEQIANN/zqi4A\nAJAPAh0AEkGgA0AiCHQASASBDgCJINABIBEEOgAkYstAt/0vtk/aPrLusd+w/UXb3xnev6TYMgEA\nW5lkhP4JSW/Z8NgBSV+KiN+R9KXh3wEAFfIkR4ra3inpvoi4cvj3b0l6Y0Q8afsSSV+JiCu2ep2L\nLroodu7cOVfBANA2hw4d+kFELG31vG0zvv7FEfHk8M9PSbp4ki/auXOn+v3+jG8JAO1k+7uTPG/u\npmhkQ/yxw3zb+233bfcHg8G8bwcAGGPWQH96ONWi4f3JcU+MiNsiohcRvaWlLfcYAAAzmjXQ1yS9\nd/jn90pazaccAMCsJlm2eJekf5d0he3Hbb9P0oqk62x/R9Kbhn8HAFRoy6ZoRLxrzD/tzbkWAMAc\nOFIUABJBoKdubU3qdqXFxex+ba3qigAUhEBP2dqatG+fdOyY9Oyz2f2+fYQ6kCgCPWUHDkinTp37\n2KlT0q23VlMPgEKlGehMM2ROnBj9+PHj5dYBVKlFeZBeoDPNcNauXaMfv/zycuvIS4s2TOSkZXmQ\nXqAzzXDWyorU6Zz7WKeTPd40LdswkZOW5UF6gc40w1nLy9LBg9lodmEhu7/7bunGG6uubHot2zCR\nk5blwaxnW6yvXbuy0dtGTZ1mmNfycnZrupZtmMhJy/IgvRF6StMMOKup/QDm/avVsjxIL9BTmmbA\nWU3cMFOY92/6L6S25UFElHbbs2dPJG11NWL37oiFhex+dbXqitKyuhrR7Wafb7cbsbZWdUWb2707\nQjr/1u1WXdlkVlcjOp1za+90+LmugKR+TJCxE12CLi+9Xi+SvWLRmdHY+sZdp5ONDlKYw8b0Fhez\nkflGCwvS6dPl1zOtbnf0/HO3Kx09Wn49LWb7UET0tnpeelMuVWEVBjZq6rz/GTSiG4dAzws//Nio\nifP+6zX9F1ILEeh54YcfGxXdkCu6Ydn0X0gtRKDnpS0//E1f9VC25eVsvvn06ew+zzAvegVN21aI\nFKHs7WWSzmlet1ascmnSKoxpseqhPpq+gqYNctxexCoX5I5VD/XR9BU0bZDj9sIqF+SPxm990LOp\nvwq2FwIdkyNE6qMtPZsmq2B7IdAxOUKkPmhY1l8F2wuBjskRIvVS1Aoa5KOC7YWmKADUHE1RIE+s\nv0cDzBXotm+2fcT2Udu35FXUOdiQULUUToOLVpg50G1fKenPJV0j6VWSbrD923kVJokNCfXAidfQ\nEPOM0HdLejAifhYRv5D0VUnvzKesITYk1AHr79EQ8wT6EUlvsH2h7e2S3irpFRufZHu/7b7t/mAw\nmO4d2JBQB6y/R0PMHOgRcUzShyU9IOl+SYcl/XLE826LiF5E9JaWlqZ7EzYk1AHr79EQczVFI+KO\niNgTEddKekbSt/Mpa6hJGxLN23Sx/h4NMe8ql5cO71+pbP78zjyK+pWmbEg0b9PXpoN4GJw01lwH\nFtn+mqQLJf1c0gcj4kubPT/ZA4s4CyFSwbVxa2nSA4s4UjQPnMoUqWBwUkscKVommrdIBSvLGo1A\nz0OTmrfAZhicNBqBnoemNG+BrTA4aTQCPS9tWgWBdDE4abRtVRcAoGaWl1nR0lCM0AEgEQQ6ACSC\nQAeARBDoAJAIAh0AEkGgA0AiCHQA6WrZmSMJdABpauFprQl0AGlq4TWJCXQAaWrhmSMJdABpauGZ\nIwl0AGlq4ZkjCXQAaWrhmSM52yKAdLXszJGM0AEgEQQ6ACSCQAeARBDoQOpadvh7mxHoQMpaePh7\nm80V6LY/YPuo7SO277K9mFdhAHLQwsPf22zmQLf9ckl/LakXEVdKukDSvrwKA5CDFh7+3mbzTrls\nk9SxvU3Sdknfn78kALlp4eHvbTZzoEfEE5I+IukxSU9K+lFEPJBXYag5Gm3N0MLD39tsnimXl0h6\nu6TLJL1M0g7b7x7xvP22+7b7g8Fg9kpRHzTamqOFh7+3mSNiti+0/0TSWyLifcO//6mk10TEX437\nml6vF/1+f6b3Q410u1mIj3r86NHy6wESZ/tQRPS2et48c+iPSXqN7e22LWmvpBFbOZJDow2opXnm\n0B+UdI+khyQ9Mnyt23KqC3VGow2opblWuUTE30XE70bElRHxnoh4Nq/CUGM02oBaataRoqysqAca\nbUAtzdwUncVcTdEzKyvWH/XW6WTB0qLzHQNonzKaouXiEGYA2FRzAp2VFQCwqeYEOisrAGBTzQl0\nVlYAwKaaE+isrACATW2ruoCptOwK3gAwjeaM0AEAmyLQASARBDoAJIJAB4BEEOgAkAgCHQASQaAD\nQCIIdABIBIEOAIkg0AEgEQQ6ACSCQAeARBDoAJAIAh0AEkGgA0AiCHQASASBDgCJmDnQbV9h+/C6\n249t35JncQCAyc18CbqI+JakqyTJ9gWSnpB0b051AQCmlNeUy15JxyPiuzm9HgBgSnkF+j5Jd436\nB9v7bfdt9weDQU5vBwDYaO5At/0CScuSPjvq3yPitojoRURvaWlp3rcDAIyRxwj9ekkPRcTTObwW\nAGBGeQT6uzRmugUAUJ65At32DknXSfp8PuUAAGY187JFSYqIn0q6MKdaAABz4EhRAEgEgQ4AiSDQ\nASARBDoAJIJAB4BEEOgAkAgCHQASQaADQCIIdABIBIGOZlhbk7pdaXExu19bq7oioHYIdNTf2pq0\nb5907Jj07LPZ/b59hDqwAYGO+jtwQDp16tzHTp2Sbr21mnqAmiLQUX8nTox+/PjxcusAao5AR/3t\n2jX68csvL7cOoOYIdNTfyorU6Zz7WKeTPQ7gVwh01N/ysnTwYLa6ZWEhu7/7bunGG6uuDKiVuS5w\nAZRmeTm7ARiLEToAJIJAB4BEEOgAkAgCHQASQaADQCIIdKDpOHEZhgh0oMk4cRnWmSvQbb/Y9j22\nv2n7mO3X5lUYgAlw4jKsM++BRf8s6
f6I+GPbL5C0PYeaAEyKE5dhnZlH6LZfJOlaSXdIUkQ8FxE/\nzKswABPgxGVYZ54pl8skDSR93PbDtm+3vSOnuoD6qWPzkROXYZ15An2bpKslfSwiXi3pp5IObHyS\n7f22+7b7g8FgjrcDKlTX5iMnLsM6jojZvtD+TUlfj4idw7+/QdKBiHjbuK/p9XrR7/dnej+gUt1u\nFuKjHj96tPx60Cq2D0VEb6vnzTxCj4inJH3P9hXDh/ZK+sasrwfUGs1HNMC869DfL+nTtv9L0lWS\n/mH+khJVx/nXWaTyfUyL5iMaYK5Aj4jDEdGLiN+LiHdExDN5FZaUus6/TiuV72MWNB/RABwpWoYy\nDv4oY+Tc5oNYaD6iAWZuis6itU3RxcVsRLvRwoJ0+vT8r39m5Lw+bDudLIDyvMpP0d8HgJEKb4pi\nCkXPv5Y1cmYeGag1Ar0MRc+/lrUCg3lkoNYI9DIUPf9a1si57vPIbV2BAwwxh56CcXPodQrb9dbW\nsmmiEyeyX0YrK/PP9ZfVRwAqwBx6m9R95LxeUUsf27wCBxgi0FOxvJwdgn76dHY/SZhXMUVRVPBy\nJCdAoLdWVQcJFRW8rMABCPTWqmqKoqjgZQUOQKC3VlVTFEUFb5P6CEBBCPS2qmqKosjgnaWPUAcs\nt0ROWLbYVk1b6pgqlltiAixbxOaYoqgHllsiR9uqLgAVWl5mFFg1llsiR4zQgSqx3BI5ItCLRLML\nW6lquSU/m0ki0ItS1oE7bJjNVkUvo81Xnkocq1yKUsZV4lkhgVmU8bOJXLHKpWplNLtYIVGelPaE\naMQmi0AvShnNLjbMcqQ2RUEjNlkEelHKaHaxYZYjtT0hznuTLAK9KGU0u9gwy5HanhAHlSWLpmjT\nra1lI8Xjx7OR+coKG2beaCKiYpM2RTlStOk42rN4KyujVxOxJ4SamWvKxfajth+xfdg2Q2+kiSkK\nNEQeI/Q/jIgf5PA6QH2xJ4QGoCkKAImYN9BD0gO2D9neP+oJtvfb7tvuDwaDOd8OADDOvIH++oi4\nWtL1km6yfe3GJ0TEbRHRi4je0tLSnG8HABhnrkCPiCeG9ycl3SvpmjyKAgBMb+ZAt73D9gvP/FnS\nmyUdyaswAMB05lnlcrGke22feZ07I+L+XKoCAExt5kCPiBOSXpVjLQCAObBsEQASQaADQCIIdABI\nBIGOmaV0EZ824/8xHQR6CVLcYMq4iE+Kn1vdpHYxptaLiNJue/bsibZZXY3odCKks7dOJ3u8yXbv\nPvd7OnPrdvN5/VQ/t3FWV7PPdGEhuy/r+yz6/xH5kNSPCTKWEXrBUrt62RlFX8Snqs+tir2CKkfJ\nqV2Mqe0I9IKlusEUfTnTKj63qoK1yl/6XJY2LQR6wVLdYIq+nGkVn1tVwVrlL30uS5sWAr1gZWww\nVUwTFH0RnyqCpqpgrfKXPhdjSswkE+153drYFI3IGlzdbtbw6nYj1tbyfe1Um4dFfm6jVNUgHPd/\nWPT3i+bQhE1RZ88tR6/Xi36fS4/miQvS5+fMHPrGa0GXMWJdW8umdo4fz0bmKyuMknGW7UMR0dvy\neQR6sy0uZg28jRYWpNOny6+n6QhW1NGkgZ7HRaJRoV27Ro/Qm950rQrXgkaT0RRtOFYpADiDQG+4\nolYpcNg90DzMoeM845qDBw8yHQFUYdI5dEboOE+qpysAUkeg4zypnq4gT0xJoY4IdJwn1dMV5IVT\nzqKuCHSch5Uzm6vTlFRRewrsgTQTgY7zcH6PzdVlSqqoPQX2QJqLVS7AlOpyuoWi6qjL94ezWOXS\nQOzmNkNdpqSK2lOoyx4Ipjd3oNu+wPbDtu/Lo6C2Yje3OeoyJVVU85qmeHPlMUK/WdKIHTRMo06N\nNmxteTmbfjh9Oruvor9Q1J5CXfZAML25At32pZLeJun2fMppL3ZzMa2i9hTqsgeC6c17tsWPSvqQ\npBfmUEurcdZEzKKos0Ny1slmmnmEbvsGSScj4tAWz9tvu2+7PxgMZn272iiqcclu7vloEgNTmuSy\nRqNukv5R0uOSHpX0lKSfSfrUZl/T9EvQFX25t7IvuVZnKV9aD5iWyrwEne03SvqbiLhhs+c1fR06\n63PLw2cNnMU69ALQuCwPnzUwvVwCPSK+stXoPAWszy0PnzUwPUboU6BxWR4+a2B6BPoUWJ9bHj5r\nYHqcnAsAao6mKAC0DIEOAIkg0AEgEQQ6ACSCQAeARBDoAJAIAh0AElHqOnTbA0nfLe0NN3eRpB9U\nXcQEqDNf1Jkv6szfqFp/KyKWtvrCUgO9Tmz3J1moXzXqzBd15os68zdPrUy5AEAiCHQASESbA/22\nqguYEHXmizrzRZ35m7nW1s6hA0Bq2jxCB4CktCrQbb/C9pdtf8P2Uds3V13TOLYXbf+H7f8c1vr3\nVdc0ju0LbD9s+76qa9mM7UdtP2L7sO3ansfZ9ott32P7m7aP2X5t1TVtZPuK4ed45vZj27dUXdco\ntj8w3IaO2L7L9mLVNY1i++ZhjUdn/SxbNeVi+xJJl0TEQ7ZfKOmQpHdExDcqLu08ti1pR0T8xPbz\nJf2bpJsj4usVl3Ye2x+U1JP063W+FKHtRyX1IqLW65Ftf1LS1yLidtsvkLQ9In5YdV3j2L5A0hOS\nfj8i6nKciSTJ9suVbTvdiDhl+zOSvhARn6i2snPZvlLSQUnXSHpO0v2S/jIi/nua12nVCD0inoyI\nh4Z//l9JxyS9vNqqRovMT4Z/ff7wVrvfvrYvlfQ2SbdXXUsKbL9I0rWS7pCkiHiuzmE+tFfS8bqF\n+TrbJHVsb5O0XdL3K65nlN2SHoyIn0XELyR9VdI7p32RVgX6erZ3Snq1pAerrWS84VTGYUknJX0x\nIupY60clfUjS/1VdyARC0gO2D9neX3UxY1wmaSDp48NprNtt76i6qC3sk3RX1UWMEhFPSPqIpMck\nPSnpRxHxQLVVjXRE0htsX2h7u6S3SnrFtC/SykC3/WuSPifploj4cdX1jBMRv4yIqyRdKuma4W5Z\nbdi+QdLJiDhUdS0Ten1EXC3pekk32b626oJG2Cbpakkfi4hXS/qppAPVljTecEpoWdJnq65lFNsv\nkfR2Zb8oXyZph+13V1vV+SLimKQPS3pA2XTLYUm/nPZ1Whfow/noz0n6dER8vup6JjHc5f6ypLdU\nXcsGr5O0PJybPijpj2x/qtqSxhuO1hQRJyXdq2y+sm4el/T4ur2xe5QFfF1dL+mhiHi66kLGeJOk\n/4mIQUT8XNLnJf1BxTWNFBF3RMSeiLhW0jOSvj3ta7Qq0IeNxjskHYuIf6q6ns3YXrL94uGfO5Ku\nk/TNaqs6V0TcGhGXRsROZbvd/xoRtRv9SJLtHcNGuIZTGG9WtptbKxHxlKTv2b5i+NBeSbVr2q/z\nLtV0umXoMUmvsb19uP3vVdY7qx3bLx3ev1LZ/Pmd077GtryLqrnXSXqPpEeGc9OS9LcR8YUKax
rn\nEkmfHK4geJ6kz0RErZcF1tzFku7Ntmltk3RnRNxfbUljvV/Sp4fTGSck/VnF9Yw0/MV4naS/qLqW\ncSLiQdv3SHpI0i8kPaz6HjX6OdsXSvq5pJtmaYa3atkiAKSsVVMuAJAyAh0AEkGgA0AiCHQASASB\nDgCJINABIBEEOgAkgkAHgET8P1RqBDdseBX+AAAAAElFTkSuQmCC\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "import numpy as np\n", "import scipy.io as sio\n", "\n", "from sklearn.datasets import make_classification\n", "\n", "data_class1 = sio.loadmat('XHW2_EX2_Class1.mat')['XHW2_EX2_Class1']\n", "data_class2 = sio.loadmat('XHW2_EX2_Class2.mat')['XHW2_EX2_Class2']\n", "\n", "\n", "plt.scatter(data_class1[:, 0], data_class1[:, 1], marker='o', c='r',\n", " s=35)\n", "plt.scatter(data_class2[:, 0], data_class2[:, 1], marker='o', c='b',\n", " s=35)\n", "\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise II: Maximal Margin classifier and non linearly separable data (10pts)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "So far we have studided Maximal Margin classifier when the data was linearly separable. In this case, the plane will naturally position itself in between the two classes. The formulation in the linearly separable case is of the form\n", "\n", "$$\\max_{\\mathbf{w}, b_0}\\min_{i} \\frac{y(\\mathbf{x}^{(i)})t^{(i)}}{\\|\\mathbf{w}\\|} = \\max_{\\mathbf{w}, b_0}\\min_i \\frac{(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)t^{(i)}}{\\|\\mathbf{w}\\|}$$\n", "\n", "This formulation is not very nice because the optimization variable appears at the denominator. Note that the formulation $(*)$ can be written as \n", "\n", "\\begin{align}\n", "\\max_{\\gamma, \\mathbf{w}, b_0} & \\quad \\frac{\\gamma}{\\|\\mathbf{w}\\|} \\\\\n", "\\text{subject to} &\\quad t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0) \\geq \\gamma \n", "\\end{align}\n", "\n", "Here we simply introduce the parameter $\\gamma$ to represent the margin (smallest distance between any point from the dataset and the plane). An important thing to notice when looking at $(**)$ is that any solution for $\\mathbf{w}, b_0, \\gamma$ can generate an other solution with the same objective by simply scaling it with a positive weight $\\alpha$. Indeed if $\\mathbf{w},b_0, \\gamma$ is an optimal solution to $(**)$, it is easy to check that $\\alpha \\mathbf{w}, \\alpha \\gamma, \\alpha b_0$ will be a valid solution as well. Since we only need one solution, we can choose to optimize over the set of solution such that $\\gamma = 1$. Concretely this means that for each 'line' of solution $(\\alpha \\gamma, \\alpha b_0, \\alpha \\mathbf{w})$, we only retain the $\\alpha$ corresponding to $\\alpha\\gamma = 1$. The problem then becomes\n", "\n", "\\begin{align}\n", "\\max_{\\mathbf{w}, b_0} & \\quad \\frac{1}{\\|\\mathbf{w}\\|} \\\\\n", "\\text{subject to} &\\quad t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0) \\geq 1.\n", "\\end{align}\n", "\n", "To get the final formulation, note that the maximum value for $1/\\|\\mathbf{w}\\|$ is achieved when $\\|\\mathbf{w}\\|$ is the smallest. We an thus solve the problem\n", "\n", "\\begin{align}\n", "\\min_{\\mathbf{w}, b_0} &\\quad \\|\\mathbf{w}\\| \\\\\n", "\\text{subject to} &\\quad t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0) \\geq 1., \\quad\\quad (***) \n", "\\end{align}\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "So far we have assumed that the dataset was linearly separable. In this case, all the points will satisfy $t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)\\geq 1$. Sometimes, however, we could be in a situation where the dataset is not linearly separable and there are points which will be misclassified so that $t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)$ might be negative. 
To account for such situations, we will consider slack variables $\\xi_i \\geq 0$ and use the more general formulation\n", "\n", "\\begin{align}\n", "\\min_{\\mathbf{w}, b_0} &\\quad \\|\\mathbf{w}\\| + C\\sum_{i} \\xi_i \\\\\n", "\\text{subject to} &\\quad t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0) \\geq 1 - \\xi_i, \\quad \\xi_i \\geq 0 \\quad\\quad (****) \n", "\\end{align}\n", "\n", "It is in fact possible to write formulation $(****)$ as an unconstrained optimization problem. Recall that we decided to set the minimum margin to $1$. This in particular means that all the points that are correctly classified should satisfy $t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)\\geq 1$. The approach encoded in $(****)$ corresponds to penalizing those points for which $t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)<1$. As soon as one of the slack variables $\\xi_i$ takes a positive value, the corresponding point lies inside the margin or on the wrong side of the plane, and it contributes a penalty of $C\\xi_i$ to the objective. \n", "\n", "\n", "#### Question II.1 Hinge Loss (3pts)\n", "\n", "We consider the hinge loss $\\ell(x, y) = \\max(0, 1-x\\cdot y)$ for $x = t^{(i)}$ and $y = (\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)$. Note that if $t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)\\geq 1$ (meaning the pair $(\\mathbf{x}^{(i)}, t^{(i)})$ is correctly classified with a margin of at least $1$), the output of the hinge function is $0$. On the other hand, if $t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)<1$ (which corresponds to a pair that is misclassified or falls inside the margin, as can occur when the data is not linearly separable), then the cost is $1 - t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)$ (i.e. we can think of this cost as measuring by how much we violate the constraint $t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0)\\geq 1$). From this, we can now write the objective as\n", "\n", "\\begin{align}\n", "\\min_{\\mathbf{w}, b_0} &\\quad \\|\\mathbf{w}\\|^2 + C\\sum_{i} \\max(0, 1 - t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0))\n", "\\end{align}\n", "\n", "Instead of weighting the constraints, we can weight the first term. This gives a similar formulation of the form\n", "\n", "\\begin{align}\n", "\\min_{\\mathbf{w}, b_0} &\\quad \\frac{1}{N}\\sum_{i=1}^{N} \\left\\{\\frac{\\lambda}{2} \\|\\mathbf{w}\\|^2 + \\max(0, 1 - t^{(i)}(\\mathbf{w}^T\\mathbf{x}^{(i)} + b_0))\\right\\}\\quad (*****)\n", "\\end{align}\n", "\n", "The PEGASOS algorithm finds the corresponding Soft Margin classifier by applying batch (sub)gradient descent to this last objective. \n", "\n", "What is the gradient (more precisely, a subgradient) of the hinge loss with respect to $\\mathbf{w}$ and $b_0$?" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def hingeLoss(X, t):\n", "    \n", "    '''Answer the question above by returning the (sub)gradient of the hinge loss \n", "    with respect to w and b0, evaluated on the sample pairs (X, t)'''\n", "    \n", "    \n", "    return \n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question II.2 Maximum Margin Classifier from scratch (7pts)\n", "\n", "Using your answer to Question II.1 above, implement the PEGASOS algorithm. Use a simple batch gradient descent approach with a sufficiently small learning rate (let us say $\\eta = 10^{-4}$). Update $\\lambda$ as $1/t$, where $t$ is the step counter. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def approximatePEGASOS(X, t, lambda0, eta):\n", "    \n", "    '''The function takes as input a set of sample pairs {x^{(i)}, t^{(i)}} as well as an initial value lambda0 for lambda \n", "    and a value for the learning rate eta. 
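One possible (batch subgradient) update, given here for reference: at step s, set lambda_s = 1/s and take\n    w <- w - eta * (lambda_s * w - (1/N) * sum over the violating samples of t_i * x_i),\n    b0 <- b0 + eta * (1/N) * sum over the violating samples of t_i,\n    where a sample i is violating whenever t_i * (w . x_i + b0) < 1. 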
It should return the weight vector w and the bias b0 \n", " for the Max Margin Classifier'''\n", " \n", " \n", " \n", "\n", "\n", "\n", " \n", " return w, b0\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Apply the algorithm to the dataset below" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXYAAAD8CAYAAABjAo9vAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAGoZJREFUeJzt3V+IJVedB/Dvb53bMw0KPmREyMzs\nGP8xrci6uYgi7IP6kJW1TWSF2QdBFLILLqiRlcnM64INLslLAhJW2ZfguGK0L4jECC6+qGyPZCXt\nmCHTEIwIGXcfFJwYx/z2ofqm73TXn1NV58/vnPp+4NIz3fdP3VNV3zp1zqlToqogIqJy/EXqBSAi\nIr8Y7EREhWGwExEVhsFORFQYBjsRUWEY7EREhWGwExEVhsFORFQYBjsRUWGOpfjQO+64Q8+ePZvi\no4mIsnXlypXfqurJruclCfazZ89iZ2cnxUcTEWVLRJ53eR6bYoiICsNgJyIqDIOdiKgwDHYiosIw\n2ImICsNgJyIqDIOdyILFAtjYAE6cqH4uFqmXiDI2OthF5LSI/FBEfiEiuyLyWR8LRjQZiwVw/jxw\n9Srwxz9WP8+fZ7jTYD5q7LcAfEFVNwC8F8BnRGTDw/sSTcOFC8DNm7f/7uZN4MEH0ywPZW90sKvq\nb1T1Z/v//j2AqwDuHPu+RJOxt1f/++vX4y4HFcNrG7uInAXwbgA/9fm+REW7667637/5zXGXg4rh\nLdhF5LUAvgXgc6r6u5q/3y8iOyKyc+PGDV8fS5S/rS1gff32362vV78nGsBLsIvIDFWoP66qT9Q9\nR1UfU9W5qs5PnuycnIxoOjY3gcuXq9Ewx49XP7/xDeAjH0m9ZJQpH6NiBMBXAVxV1YfGLxJljcP2\nhtncBHZ3gZdeqn4y1GkEHzX29wP4BIAPiMjT+48Pe3hfGiJlsHLYXjce+CgCUdXoHzqfz5XzsQew\nDNbVoXPr69Vp/uZm+M/f2KjCvO73u7vhP9+61OuHsiciV1R13vk8BntBUgfriRNVTf2w48erJoap\nS71+KHuuwc4pBUqSejw0h+21S71+aDIY7CVJHawcttcu9fqhyWCwlyR1sHLYXrvU64cmg8FeEgvB\nymF7zSysH5oEdp4SEXmyWFRzuu3tVS1vW1t+Bzyx85SIKCJLl3Ew2ImmouHiKF4z5Yel2ZcZ7OSO\nCZCvhurk4tJPzdQyc2dpNCuDndxYOs+k/hqqkxf+7Q4ztczcWRrNymAnN5bOM6m/hurk3sv198Th\nNVP9WRrNymAnN5bOM6m/hurkXWu/rv09r5nqz9JoVgY7ubF0nkn9NVQnt/7lf83UMktg5TIOBju5\nsXSeSf01VCc3//U9ZmqZ5A8vUCJ3i0XVpn79elVT39piAhBFxAuUyD8r55lERlgdAcxgJyKzrAYn\nYHsEMIOdiEyyHJyA7RHADHbqb2g1ynL1i8yxHJyA7RHADHbqZ2g1ynr1i8yxHJyA7RHADHbqZ2g1\nynr1i8zpG5xjTgiHvNb0CGBVjf64++67lTJ1/LgqcPRx/HiY19FkbW+rrq/fvrmIqB47pnruXPX3\ntueur9/+nMPvfe5ctfmdOqW6tub+2sPvs7FRvc/Ghupi4ee7NwGwow4Zy2Cnfs6dqw/ojY0wryvc\nasAcDis6CM7ZrAr1pvDts3nVHQRcXmthXTHYKYymqlFXVWXo6wrWt5Y5ZV3B3eeEsOm92l5rZV25\nBjvb2EsUcvTJ0JmOLM2QZAS7Hdx1daT2aY9veq+212a3rlzS3/eDNfaArFQtqBO7Hdx11dgvXqxv\nqqk7IXSpsR9+rZV1BdbYJyq7qkVGPJ8JWR4uZ03bCJTFAnj44Spql0SABx6oPyGse6+1NeD06eaT\nyezWlUv6+36wxh6QlapFaQKcCbHboZ+mEShD+uX7jmaxsq7AztNILHSVr+LokzAClWvs4XKl2d4+\n2gQTqi5jYV0x2GOw2J5tpWpRGp4JjRKi/tM1bLHEuoxrsLONfQyL7dkcfRJGdo2sfvjoVgg1m0Td\n7rdk5grQVFzS3/ejmBo7a3HTMcEzIV8npKFaB5t2P5H+q8Vai2oTsMYeQa61OM6y2N8Ez4R8nZCG\nmsyrafc7d67failyfjqX9Pf9KKbGnmMtzmK/ALmLWLX0dUIaqsbua/fLabwBYnaeAvgagBcBPOPy\n/GKCXdVGV3kfOW3FdLvIB2Vfm0rI+k/b7ud6DMypRTV2sP8NgL+eZLDnJqetmG43JmkH1PR9BnLs\n+k/bMfBwUZw6lU9dJ2qwV5+Hswz2DLDG3s5yL9rQg/KImn5uJ6RLTZv56dNHi2JtrX7aXovf1Vyw\nA7gfwA6AnTNnzgQvAGqQY79ALNb7HzhlsrO2ETNNgZ/DAcw12KONilHVx1R1rqrzkydPxvpYOmyC\nozucWbwuYdXQW/ZYv8dcAE0jZpq8+CKwuwu89FL1M/fdgcMdp2hzs6yt2BfrATj0oJzrsNwRmo6B\nd95Z//zSioLBTrSUQwAOOSibvjlnGE3HwEcfnUZReAl2Efk6gB8DeLuIvCAin/bxvkRRlRqAE21+\nqzsGTqUopGqPj2s+n+vOzk70zyXqtFhUberXr1c19a2t8vb6TC0WVTfI3l51crW1VQX1lIjIFVWd\ndz2PTTFEq9j/cISFGSiKvOw/IAY7ER2xDPPZDLj33vSBan3AkjUMdiK6zWrt+NataqT3qhSBan3A\nkjUMdiILbQ2GtM1zvhQ7UHMYsGQJg52mjY23RzTVjlfFDtStreqG06vW1vIfsBQKg53SsVBTZuPt\nEV1XbfYdARpyNScY1JcHl3kHfD8GTwJmeYKmnKUoVyvzshQ42+XY1Vm3akRUZ7P+86i4ruauZZ7g\ndDe1UNzNrK0EQWl8lOuQJGnbU2MeaApLDF+7ia9ZHV2K12WZCzz+DlJesBe2A5oxtlyHJknTnjqb\nxT2AFzbbpbXdxCWQXZbZ9/fK9eS/vGDnITuMseXqeyrZwxNjx0imXCcdr2FtN3HZPNqm2F0G78WL\nfm/6kevJv2uw59N5yvFOYYwt16EDjJvmZXnllWHvN5bqwc/lvzPUtDrf8IY
0/dQu0+80LbPqwUCl\nhx8GPv95P3O8TKK/3CX9fT+8trFnXLsyYWy5jr1d2+Gacuy2hJyrbzXqvk7THYJifcWuE6K6ZQ6x\nCSybX5reP4eTfxTXFKNa1CmzKWPK1fcBN/YB3Ocdm4002h5enTnc03N1mZvucjQmeF0OHpbKo0mZ\nwU42+T7g1r1fqOD00ShtvNZvrd29y9Bjbdsm0lZTz+nkn8FO5QgZnD5q7NaGohxifPGOGHLS1rWJ\nNB3cluWQQ6irMtjHM3RqPXltt5wfy0fTj/EqcY7dU31PArsOXm2bUE67OYN9DOOn1pPTVt3ysU7G\nNiVlUCUuvXuq69hqsVN5CAb7GBnsqJPS1kBqYZ3kWCVeUcLJqesVrrl1Kh/GYB/D+Kn15GxvNwe7\nlXWSaZW4lJPTrmNr3cErx92cwT4Ga+z25Fi9ykBJm3rTsbUp9HPcpFyDPZ8rT2Mq9W71OXv0Ua6T\nAEq6M1HT7WqbrjQVKXeTYrDX2dwELl/2c/0y+ZFynViYNz6QKczU0XTwevHFgndzl2q974f5phhy\nU0KvW5dSGqEbZN7ve0TdJllScxPYxk5BFR54ryopFRqE7PeNeexv2iR9zgyZmmuwS/XcuObzue7s\n7ET/XPJoY6Oadq/u97u78ZcnlBMnqikGDzt+vGrMpUbL28mutm+vr1fNH5ub/j+vbZP80peq2Ruv\nX6+amba28mxyEZErqjrvfB6DnQaZSuBN5QAWQOyim8Im6Rrs7DylYabQ6wb0HyFVcEdrX7FH3Exl\nk3TBYKdmbSFldUio72DtMxpn2fZw9erBHSLOn59MuB8u+pMn65+3GrQ+V5fVTTIJl4Z43w92nmbA\npXPU2tWWqTt0M+po9d2p2Wculq4Lh8Ysi7VN0jdwVAyNklFIvSr1MmdyjXqIQG2bPbEpaFOvrhy5\nBjs7T6lejj1RqZc5k47WEIs5pOhTr64csfOUxknREzW2wTV175mFRl6HMuzbqemyWtpuSN30utSr\nq2gu1XrfDzbFZCD2JYk+2gcsXEaZ8mofxzLs0wTiulpc7ila10VT996XLh18zVOnqkfJFzf3gZht\n7ADuAfAsgOcAXOh6PoM9EzF7onzeVLrE3jOXhHUsQ9fj3/b20c7ProPAsuiPHWt/3fIYdexY9Rmz\nWfW3S5faDxC5Xdzsu5M6WrADeA2A6wDuArAG4H8AbLS9hsFOR2TS8ZiMS2j3KMOu419XDbxttXRN\nn992jOq66XROnashOqljBvv7ADy58v8HATzY9hoGewS5TdDFIRLtXELbYxl2BWzbW3bd8KptMdvu\ngpjbsT7EJu0a7D46T+8E8KuV/7+w/ztKJccLZWJ0POZ8VahLT6PHMmzqYHV5y7bXbm01//3ateav\nuSqXztWkc927pH/bA8DfA/j3lf9/AsAjNc+7H8AOgJ0zZ84MP2RRt1xrv6E7HnOejbJPw7iHMmza\nhNbWut+ybUx7299F6mdi7PrKVqWssfsIdjbFWMP26qNyPdititgxPGaAkcv9R0WaV8fq1zx9unrk\n2BceYpBWzGA/BmAPwJtw0Hn6jrbXMNgDsxhiqdv8ebDrbcxxpOu1s5nf1ZF682ri+1gcLdirz8KH\nAVxDNTrmUtfzGeyBWRjP7bI8Mfc+iwe7CfO5OixsXrFEDfa+DwZ7BJbGc1sIVWsHu4nzuTosbF6x\nuAY754qh8KxMCrJYlHEbnUL4Wh1WNq8YeAclsqNp1qm1NeCb3wxznzSajEzmXvOCk4CRHXXjqwHg\n5ZeBe+8FZrP8xpWTGRbmXrOGwW5VzhfTHLa8C9Ha2tG/qQK3buVxEVVKJW0PnvW5ydVkuDTE+36w\n87RDym7+kOPGXK4XL7HHa6wpDfugVog4pQD5duECcPPm7b+7ebPqaQop9FQELteLh7re2rXGa7Fm\nHHF7sPj1u/he5hzL4AiX9Pf9YI29Q6qLaUKPG3OZtDtEjX3MpOIWasaRtgerX7+N72V2vdVvqouh\nwHHsGUs1MDdGgCzH189mR68rDzWu3LU8h5R7jL080vaQ43hw38vc9X6pD34M9pylupgm9p4d6yIq\n1wNW3wNbrL080vaQ46wLrsvsevzter/UBz8Ge+5SXDla6tWZoWrsMffyCNtD6tAawmWZ+xx/u94v\n9cGPwU7DNAWI1VmWXPSZ8rbPgS31Xu5Zjsd1l2X2cY/X5fulPvgx2Mmfuq1dpLphpY+Qj3HQcK3x\n9qkZ99zLczg2ppxiaGj5dC1z0/1X21rYmt4v9cGPwU7+dN0nLfQwBKt67OU5f80YQpVP19zvQ98z\n1cGPwU7+hLywKPW57ViOe3nuXzO0UOXTdrcmy01MTVyDnZOAUbemWZZWDZ1KbyJT803kaw42m1Uz\nSxw2tnyayn02q6Yqyg0nASN/mibxWjX0DsMuN2kuwES+5iCLBfDnP9f/bWz5NJX7W9867n2tY7BT\nt9VZlmYzQOT2v4+ZSq/v1HyZXu/NGQibXbhQNZAcJtJePi6bwmTL3aW9xveDbeyZ296u7jAsUj1O\nnRrXy9VnxErGPZCWbmoVi8tIl6YunNms/X1dN4WSyh3sPKVgUgUseyCz4rqZDFmtU90UXIOdTTHU\nX6rZJ/f26n8fakZIGqVrM1k2pTz3XH3r3n33NTe1cFNox2Cn/lLtVeyBzErbZrI6Q/Sf/lTVt0WA\nY8eqEH/gAeChh5pnkOam0I7BTv2l2qsm2xOWp7bNpK42rwq87W3VfUqfeKK9ts9NoR2DnfpLtVfF\nvAdapqNvLGnbTJpq89euVT+7Tgp9bgpFrmqXhnjfD3aeBmZp7pUcZT76xpKmzaTtitDl5ju2c9Rl\nN8htVYOjYiYqty3VokKGXFiedKxrDpexk22FHJGTEoN9qnLbUi0qYDreHI7vs1l7MY85KXTdDXJb\n1a7Bzjb20kx1HJjPhtKYncOBGnhTjUjt4y1vqf/9spg3N6uO1Jdeqn72aT933Q2KHV3jkv6+H6yx\nB1RajT1FQ2moSbcPf5eLF4NVq3OoiYac29x1N0g9v3pfYFPMROW2pbZJ2VDqu3O46WYlfZa7R6N5\nLsd3X8Xsesyse/+cxgEw2Kcspy21TSkNpdvbqmtr9cvoutw9z0pKOr53afquFy+WsRusYrBT/lwD\n23L1tC51uh51yz3gO5ZyfO9iefX75hrs7Dyl21m6WsO1Z8vyZYh1vZirXKdAHtApPqbzMSdTHS/Q\nhsFOB1Yn8KiboCM218COeUVqX02pA1Tf5eJFt+UudvjGeEOKxlL9JQiXan3TA8DHAewCeAXA3PV1\nbIoxKsY5bd+rZnJvT2gq07W1ft9lSo3mPfUtmhzG+DdBjDZ2AOcAvB3AfzHYCxC6EzLnPWqoIanT\ndODL/SAXUJ+iyblNPkqwv/omDPYytNUufVyXnvMeNcZE7hCVC+uDqNq4Bjvb2OlA002rX37ZT5v7\nVHu5XHsxc7hcNDN1belT6K7oDH
YR+YGIPFPz+GifDxKR+0VkR0R2bty4MXyJKZzDnZBra0efMyZo\nprBHjTHVA18gTWMB7rvP7iAqXzqDXVU/pKrvrHls9/kgVX1MVeeqOj958uTwJaawVmuXh4fiLQ0N\nGivDEq0OieCBz6umE6DvfMfuICpf2BRDzXwHjYVhidaGdK6ycuArRNsJUOlj/EcFu4jcJyIvAHgf\ngO+KyJN+FosG8V0TDRE0qfcoy+3YFg58BZn0CZBLD6vvB0fFBBBqREVpQ+xyHhJRoJA3Aylx6D84\nV8zETHUoYV9TLCejt1KKMbqztHqJa7Czjb0UHFHhZmrt2Ib7FGK0ih1u+VO12W/uG4O9FJNuUOxh\nau3YhvsUYtdFDB/jvGOwl2JqNdExUnfgxmT4TC52XcTwMc47Brsrq2Ofl6ZWE12yvl5SM3wmF7su\nYvgY559LQ7zvR3adp5zDwyaul27Gh4bE7Nwsod8cjp2nUj03rvl8rjs7O9E/d7CNjapBru73u7vx\nl4cqXC9uFouqveH69aqmvrVV/plcjWUb+2pzzPp6Xie2InJFVeedz2OwOzhxouptOez48aqdltLg\neqGecj/GuQb7sRgLk7277qqvGRpop5w0rhfqaXOzepSOnacuOOLEJq4XoloMdhdTHXFiHdcLUS22\nsRMRNVgsqvHve3tVy9/WVtqmHNc2dtbY6QDHhBO9KucrVRnsVMl5KyYKIOcrVRnsVMl5K6as5HJi\nmPOVqgx2quS8FVM2cjoxNDwbQycGO1Vy3oopGzmdGOY8mpbBTpXQW3Eu598UVE4nhjmPpmWwUyXk\nVpzT+TcFlduJYa4zPHMcO4XHybpoXwkTcaXEcexkR07n3xRUzs0bOeEkYBQeJ+uiFVOZiCsl1tgp\nvJyHFxBliMFO4fH8mygqNsVQHDz/JoqGNXYiosIw2GkaeIFUdCzydBjsVD5eIBUdizwtBjuVL6cJ\nSgoRo8h5RtCMwU7pxNozeYFUdKGLnGcE7RjsJbNcpYm5Z+Y2QUkBQhc5T8LaMdhLZb1KE3PP5AVS\n0YUucp6EtWOwl8p6lSbmnskLpKILXeQ8CWs3KthF5Msi8ksR+bmIfFtEXu9rwWgk61Wa2HtmrvOv\nZmxokbu0IPIkrN3YGvtTAN6pqu8CcA2Akeogma/ScM+kGq4tiDwJazcq2FX1+6p6a/+/PwFwavwi\nkRfWg5N7JtXo04LIk7BmPtvYPwXgex7fj8bIITgt7ZmWRxBNwLL462Z3Buy0IOai8w5KIvIDAG+s\n+dMlVd3ef84lAHMAH9OGNxSR+wHcDwBnzpy5+/nnnx+z3ET+NN3W5/JlTlwWQV3xH8abbVVc76A0\n+tZ4IvJJAP8I4IOq+geX1/DWeGQKb92XVFtNHeCt81ZFuTWeiNwD4IsANl1Dncgc6yOICtdU/IDN\nFsQcjG1jfwTA6wA8JSJPi8hXPCwTUVzWRxAVrqn4lydMDPX+xo6KeYuqnlbVv9p//JOvBSOKxvoI\nosKx+P3jladEOYwgKhiL37/RnadDsPOUJmmxqAZq7+1V7Q9bWxx1Q71E6TwlIkfWJ2WjojDYiWKw\nPikbFYXBThQDh1RSRAx2ohg4pJIiYrATxcAxfRQRg50oBo7po4iOpV4AosnY3OTwRoqCNXYiosIw\n2ImICsNgJyIqDIOdiKgwDHYiosIw2ImICsNgJyIqDIOdiKgwDPaUFovqCsQTJ6qfnMKViDxgsKfC\n+bmJKBAGeyqcn5uIAmGwp8L5uYkoEAZ7Kpyfm4gCYbCnwvm5iSgQBnsqnJ+biALhfOwpcX5uIgqA\nNXYiosIw2ImICsNgJyIqDIOdiKgwDHYiosIw2ImICiOqGv9DRW4AeD76B9tzB4Dfpl4II1gWB1gW\nFZbDgWVZ/KWqnux6cpJgp4qI7KjqPPVyWMCyOMCyqLAcDvQtCzbFEBEVhsFORFQYBntaj6VeAENY\nFgdYFhWWw4FeZcE2diKiwrDGTkRUGAZ7YiLyZRH5pYj8XES+LSKvT71MqYjIx0VkV0ReEZHJjYYQ\nkXtE5FkReU5ELqRenlRE5Gsi8qKIPJN6WVITkdMi8kMR+cX+vvFZl9cx2NN7CsA7VfVdAK4BmPJN\nT58B8DEAP0q9ILGJyGsAPArgbwFsAPgHEdlIu1TJ/AeAe1IvhBG3AHxBVTcAvBfAZ1y2CwZ7Yqr6\nfVW9tf/fnwA4lXJ5UlLVq6r6bOrlSOQ9AJ5T1T1VfRnAZQAfTbxMSajqjwD8X+rlsEBVf6OqP9v/\n9+8BXAVwZ9frGOy2fArA91IvBCVxJ4Bfrfz/BTjswDQdInIWwLsB/LTrubyDUgQi8gMAb6z50yVV\n3d5/ziVUp12Px1y22FzKgohuJyKvBfAtAJ9T1d91PZ/BHoGqfqjt7yLySQB/B+CDWvj4066ymLBf\nAzi98v9T+7+jiRORGapQf1xVn3B5DZtiEhORewB8EcCmqv4h9fJQMv8N4K0i8iYRWQNwHsAi8TJR\nYiIiAL4K4KqqPuT6OgZ7eo8AeB2Ap0TkaRH5SuoFSkVE7hORFwC8D8B3ReTJ1MsUy34H+j8DeBJV\nB9l/qupu2qVKQ0S+DuDHAN4uIi+IyKdTL1NC7wfwCQAf2M+Hp0Xkw10v4pWnRESFYY2diKgwDHYi\nosIw2ImICsNgJyIqDIOdiKgwDHYiosIw2ImICsNgJyIqzP8DJ6pRYriMXxkAAAAASUVORK5CYII=\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from sklearn.datasets import make_blobs\n", "import matplotlib.pyplot as plt\n", "import numpy as np\n", "\n", "X = np.load('Ex2_HW2_X.npy') \n", "Y = np.load('Ex2_HW2_t.npy') \n", "\n", "XEx2_Class1 = X[Y==0,:]\n", "XEx2_Class2 = X[Y==1,:]\n", "\n", "plt.scatter(XEx2_Class1[:, 0], XEx2_Class1[:, 1], marker='o', c='r',\n", " s=35)\n", "plt.scatter(XEx2_Class2[:, 0], XEx2_Class2[:, 1], marker='o', c='b',\n", " s=35)\n", "\n", "plt.show()\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Bonus II.3. dynamic learning rate. (3pts) \n", "The exact PEGASOS algorithm updates the learning rate dynamically as $\\eta_t = \\lambda \\cdot t$ for some particular choice of $\\lambda$ and where $t$ denotes the step counter. Modify the approximate version of the PEGASOS algorithm that you derived in question II.2.2 to make it work with such a dynamical learning rate. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "def PEGASOS(X, t, lambdaPG, eta):\n", " \n", " '''The function takes as input a set of sample pairs {x^{(i)}, t_^{(i)}} as well as an initial value for b0 \n", " and a value for the learning rate eta. It should return the weight vector w and the bias b0 \n", " for the Max Margin Classifier'''\n", " \n", " \n", " \n", "\n", "\n", "\n", " \n", " return w, b0\n" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "### Exercise III: Computational drug design: Part I, Training on Thrombin (10pts)\n", "\n", "With the recent spreading of the COVID-19 \"Corona\" Virus. Many initiatives relying on Machine Learning, have been taken to find new drugs that would be able to prevent the virus from infecting host cells. The Human Angiotensin-Converting Enzyme 2 (ACE2) has been proved by many studies to be the specific receptor for the Spike RBD of SARS-CoV. A common approach (followed for example for researchers at Oak Ridge National Laboratory) is to design compounds (such as the one shown in gray in the figure above) that would be able to bind to the SARS-CoV-2 spike protein (shown in cyan), thus making the virus unable to dock to the human ACE2 receptors (shown in purple in the figure below). \n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\"Drawing\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Source: [Physics.org](https://phys.org/news/2020-03-early-drug-compounds-supercomputing-combat.html)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "We do not have enough data on the COVID-19 yet so we are going to train on an older computational drug design dataset from NIPS 2003. The general idea is the same. A Drug is a small organic molecule that can achieve its desired activity by binding to a target site on a receptor. The first step in the discovery of a new drug is usually to identify and isolate the receptor to which it should bind, followed by testing many small molecules for their ability to bind to the target site (see the [UCI ML Repo](https://archive.ics.uci.edu/ml/datasets/Dorothea) for more details). For this particular dataset, we are interested in checking which compounds can or cannot bind to thrombin. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question III.1 (4pts)\n", "Start by downloading the DOROTHEA dataset on the [UCI ML website ](https://archive.ics.uci.edu/ml/datasets/Dorothea). 
Once you have downloaded the data, as we did for face recognition, use the PCA module from scikit-learn to reduce the dimension of your feature vectors from their initial size (100000) to about 100. Split the data into a training and a test part using the train_test_split module from scikit-learn. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import numpy as np\n", "from sklearn.model_selection import train_test_split\n", "\n", "\n", "# Start by splitting the dataset into a training part and a validation part (take about 80 to 90% training data)\n", "\n", "\n", "\n", "from sklearn.decomposition import PCA\n", "\n", "# Use the PCA module to compress the feature vectors\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "#### Question III.2 (6pts)\n", "\n", "Once you have learned the compressed representation for your samples, learn the following two classifiers:\n", "\n", "- SVC with RBF kernel (combine the SVC with GridSearchCV to determine the optimal values for $C$ (how much you penalize misclassification) and $\\gamma$ (width of the kernel)). To start, you can take your grid to be defined as below:\n", "\n", "```python\n", "param_grid = {'C': [1e2, 5e2, 1e3, 1e4, 5e4],\n", "              'gamma': [0.0005, 0.001, 0.005, 0.01, 0.1], }\n", "```\n", "\n", "Keep in mind that a larger $\\gamma$ means a narrower kernel, i.e. $K(x, y) = \\exp(-\\gamma \\|x-y\\|^2)$. $C$ is defined as in Exercise II. \n", "\n", "- Logistic regression classifier " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "from sklearn.model_selection import GridSearchCV\n", "from sklearn.decomposition import PCA\n", "from sklearn.svm import SVC\n", "\n", "\n", "# put your code here\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Bonus: Computational drug design. Part II: Tackling COVID-19\n", "\n", "Now that you understand the basic idea behind computational drug design, we are ready to tackle the real challenge. A couple of days ago, Sage Health started a competition aimed at developing new drugs to fight the coronavirus. The competition, which was [advertised through YouTube](https://www.youtube.com/watch?v=1LJgkovowgA), is now closed. However, it would be interesting to see whether anybody could make additional discoveries from the data. Check the competition video and try to implement some of the steps, using the data from the GitHub pages of the winners (see the main [Sage Health webpage](https://www.sage-health.org/) and [here](https://www.sage-health.org/coronavirus/) for the links to each of the winners' GitHub pages). \n", "\n", "Points for this question depend on how far you can go. " ] } ], "metadata": { "kernelspec": { "display_name": "Python 2", "language": "python", "name": "python2" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 2 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython2", "version": "2.7.13" } }, "nbformat": 4, "nbformat_minor": 2 }