{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "## CSCI-UA 9473 Final Assignment\n", "\n", "Total: 55pts" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this last assignment, you will get to work on each of the learning frameworks that we introduced throughout the course. The assignement is organized in Three main parts.\n", "\n", "- Supervised Learning\n", "- Unsupervised Learning\n", "- Reinforcement Learning \n", "\n", "The Total (final) grade will be computed according to the following rule \n", "\n", "\n", "\n", "__if__ Mean(Assignment1,Assignment2,Assignment3)< Assignment4 \n", "\n", " \n", " Total grade = 0.25*Mean(Assignment1+Assignment2+Assignment3) + 0.75*Assignment4\n", " \n", "__else if__ Mean(Assignment1,Assignment2,Assignment3)> Assignment4 \n", "\n", " \n", " Total grade = 0.75*Mean(Assignment1+Assignment2+Assignment3) + 0.25*Assignment4\n", " \n", " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Part I. Supervised Learning (25pts)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Exercise I.1.1 Regression tree (5pts)\n", "\n", "Tree based methods are simple methods that can be used in both regression and classification. The idea is to split the space into regions and then approximate the data by taking the mean of the targets from each region. The approach starts by splitting the space into two regions $R_1$ and $R_2$. Each of the two regions is then itself split into two more regions, $R_{11}$, $R_{12}$ such that $R_{11}\\cup R_{12} = R_{1}$ and $R_{21}$, $R_{22}$ such that $R_{21}\\cup R_{22} = R_{2}$. By proceeding like this, we end up with a model of the form\n", "\n", "$$f(x) = \\sum_{m=1}^M c_m I(x\\in R_m)$$\n", "\n", "where $I(x\\in R_m)$ is a function that takes the value $1$ when $x$ belongs to the subregion $R_m$ and $0$ otherwise. \n", "\n", "if we minimize the sum of squares, one can show that the optimal value for the coefficients $c_m$ is the average of the targets from region $R_m$, i.e. \n", "\n", "$$\\hat{c}_m = \\frac{1}{N_m}\\sum_{x_i\\in R_m} t_i$$\n", "\n", "Now finding the optimal region separation is more tricky. For this reason, the algorithm usually relies on some greedy procedure. \n", "\n", "For a splitting value $s$, Let $R_1(j,s)$ and $R_2(j,s)$ denote the regions we want to get (with respect to the coordinate/feature $j$), i.e. $R_1(j,s) = \\left\\{\\mathbf{x}|x_j\\leq s\\right\\} $ and $R_2(j,s) = \\left\\{\\mathbf{x}|x_j>s\\right\\}$.\n", "\n", "In order to find the optimal value $s$, we solve the following problem\n", "\n", "$$\\min_{(j,s)} \\left[\\min_{c_1} \\sum_{\\mathbf{x}_i\\in R_1(j,s)} (t_i - c_1)^2 + \\min_{c_2} \\sum_{\\mathbf{x}_i\\in R_2(s,j)} (t_i - c_2)^2\\right]\\quad (*)$$\n", "\n", "As explained before, we can take $c_1$ and $c_2$ to be the averages of the targets from each region. To determine the split, for convenience, we can take it to happen at one of the points from the training set, $\\mathbf{s} = \\mathbf{x}_i$ for some $i\\in \\mathcal{D}$. From this computing the optimal splitting for the criterion (*) becomes easier as we are left with computing the value of this criterion for $s = \\mathbf{x}_i$ given by any of the points from the training set. And return the value that achieves the minimum. \n", "\n", "Consider the dataset below. Compute the Regression tree for this dataset. 
You can stop when you reach a certain node size (take it to be $2$ for example)" ] }, { "cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAD8CAYAAACMwORRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAF7lJREFUeJzt3X9s1Ped5/HnqwZkjk1htZjF/EgC\nK0rDtuhgfemqK117SQq0lkm2qSKQerqsukW7Ko0hPbSJ2iCL6NTbIpW4Ok46dlU1t1LLsihC9pke\n3cumOm3TnHAgAQFywnqzC8Yc3qiwEms3wL3vjxnTsWM837G/nu/M16+HZHm+n/nI88pAXv7y/cz3\n+1VEYGZm+fKRrAOYmVn6XO5mZjnkcjczyyGXu5lZDrnczcxyyOVuZpZDLnczsxxKVO6Stkjqk3RR\n0nMTPP+ApFclnZH0U0kr0o9qZmZJqdxJTJIagHeAzwGXgZPA9og4XzLnr4D/EREvS3oE+IOI+Pcz\nF9vMzCYzJ8Gch4GLEdEPIOkw8DhwvmTOOuDZ4uPXgGPlfujixYvjwQcfrCismdls9+abb/5TRDSV\nm5ek3JcDl0q2LwOfGjfnbeCLQCfw+8B9kn4jIt4vnSRpB7AD4P7776e3tzfBy5uZ2ShJ/5BkXloL\nqv8R+Iyk08BngAHgzvhJEXEoIloioqWpqewvHjMzm6Ike+4DwMqS7RXFsbsi4gqFPXck/RrwZERc\nTyukmZlVJsme+0lgjaRVkuYB24Cu0gmSFksa/VnPA99PN6aZmVWibLlHxG1gJ3ACuAAciYhzkvZJ\n2lqc9lmgT9I7wG8C/2mG8pqZWQJlPwo5U1paWsILqmZmlZH0ZkS0lJvnM1TNzHLI5W5mlkMudzOz\nHHK5m5nlUJLPuc9qx04PsP9EH1euD7Ns0Xz2bF7LExuWZx3LzGxSLvdJHDs9wPOvnGX4VuFk24Hr\nwzz/ylkAF7yZ1TQflpnE/hN9d4t91PCtO+w/0ZdRIjOzZFzuk7hyfbiicTOzWuFyn8SyRfMrGjcz\nqxUu90ns2byW+XMbxozNn9vAns1rM0pkZpaMF1QnMbpo6k/LmFm9cbmX8cSG5S5zM6s7PixjZpZD\nLnczsxzyYRmrST4z2Gx6XO5Wc3xmsNn0+bCM1RyfGWw2fS53qzk+M9hs+hKVu6QtkvokXZT03ATP\n3y/pNUmnJZ2R9IX0o9ps4TODzaavbLlLagAOAp8H1gHbJa0bN+1bFG6cvQHYBvzXtIPa7OEzg82m\nL8mC6sPAxYjoB5B0GHgcOF8yJ4CPFh8vBK6kGdJmF58ZbDZ9Scp9OXCpZPsy8KlxczqAn0j6OrAA\neCyVdDZr+cxgs+lJa0F1O/CDiFgBfAH4C0kf+tmSdkjqldQ7NDSU0kubmdl4Scp9AFhZsr2iOFbq\nK8ARgIj4OdAILB7/gyLiUES0RERLU1PT1BKbmVlZScr9JLBG0ipJ8ygsmHaNm/OPwKMAkh6iUO7e\nNTczy0jZco+I28BO4ARwgcKnYs5J2idpa3HaN4CvSnob+BHwdETETIU2M7PJJbr8QEQcB46PG9tb\n8vg88HvpRjMzs6nyGapmZjnkcjczyyGXu5lZDrnczcxyyOVuZon09Pew6egm1r+8nk1HN9HT35N1\nJJuEb9ZhZmX19PfQ8XoHI3dGABi8OUjH6x0AtK5uzTCZ3Yv33M2srM5TnXeLfdTInRE6T3VmlMjK\ncbmbWVlXb16taNyy53I3s7KWLlha0bhlz+VuZmW1b2ynsaFxzFhjQyPtG9szSmTleEHVzMoaXTTt\nPNXJ1ZtXWbpgKe0b272YWsNc7maWSOvqVpd5HfFhGTOzHHK5m5nlkMvdzCyHXO5mZjnkcjczyyF/\nWqYGHTs9wP4TfVy5PsyyRfPZs3ktT2xYnnUsM6sjLvcac+z0AM+/cpbhW3cAGLg+zPOvnAVwwZtZ\nYokOy0jaIqlP0kVJz03w/AFJbxW/3pF0Pf2os8P+E313i33U8K077D/Rl1EiM6tHZffcJTUAB4HP\nAZeBk5K6ijfFBiAidpfM/zqwYQayzgpXrg9XNG5mNpEke+4PAxcjoj8iPgAOA49PMn878KM0ws1G\nyxbNr2jczGwiScp9OXCpZPtycexDJD0ArAL+5h7P75DUK6l3aGio0qyzwp7Na5k/t2HM2Py5DezZ\nvDajRDYVN7q7efeRR7nw0DrefeRRbnR3Zx3JZpm0Pwq5DTgaEXcmejIiDkVES0S0NDU1pfzS+fDE\nhuV8+4ufZPmi+QhYvmg+3/7iJ72YWkdudHcz+MJebl+5AhHcvnKFwRf2uuCtqpJ8WmYAWFmyvaI4\nNpFtwNemG2q2e2LDcpd5Hbt24CViZOxdi2JkhGsHXmJhW1tGqWy2SbLnfhJYI2mVpHkUCrxr/CRJ\nHwd+Hfh5uhHN6svtwcGKxs1mQtlyj4jbwE7gBHABOBIR5yTtk7S1ZOo24HBExMxENasPc5qbKxo3\nmwmJTmKKiOPA8XFje8dtd6QXy6x+Ldm9i8EX9o45NKPGRpbs3pVhKpttfIaqWcpGj6tfO/AStwcH\nmdPczJLdu3y83aqqrsvd12CxWrWwrc1lbpmq23L3NVjqh38Jm1Vf3V7y19dgqQ+jv4QHrg8T/OqX\n8LHT9/o0rZmloW7L3ddgqQ/+JWyWjbotd1+DpT74l7BZNuq23H0NlvrgX8Jm2ajbcvc1WOqDfwmb\nZaNuPy0DvgZLPRj98/GnZcyqq67L3eqDfwmbVV/dHpYxM7N7c7mbmeWQy93MLIdc7mYzwLfZs6x5\nQdUsZaO32Ru95O/obfYAX0zMqsZ77mYpm+w2e2bV4nI3S5lvs2e1wOVuljLfZs9qQaJyl7RFUp+k\ni5Keu8ecpySdl3RO0g/TjWmWvpla9FyyexdqbBwz5tvsWbWVXVCV1AAcBD4HXAZOSuqKiPMlc9YA\nzwO/FxG/kLRkpgKbpWEmFz19mz2rBUk+LfMwcDEi+gEkHQYeB86XzPkqcDAifgEQEdfSDmqWpskW\nPdMoYd9mz7KW5LDMcuBSyfbl4lipjwEfk/QzSW9I2pJWQLOZ4EVPy7u0FlTnAGuAzwLbgT+TtGj8\nJEk7JPVK6h0aGkrppc0q50XP2aGnv4dNRzex/uX1bDq6iZ7+nqwjVU2Sch8AVpZsryiOlboMdEXE\nrYj4e+AdCmU/RkQcioiWiGhpamqaamazafOiZ/719PfQ8XoHgzcHCYLBm4N0vN4xawo+SbmfBNZI\nWiVpHrAN6Bo35x
iFvXYkLaZwmKY/xZxmqVrY1kbzi/uYs2wZSMxZtozmF/f5OHmOdJ7qZOTO2HWV\nkTsjdJ7qzChRdZVdUI2I25J2AieABuD7EXFO0j6gNyK6is9tknQeuAPsiYj3ZzK42XR50TPfrt68\nWtF43iS6tkxEHAeOjxvbW/I4gGeLX2ZmmVu6YCmDNz+8QL50wdIM0lSfz1A1s1xq39hOY8PYdZXG\nhkbaN7ZnlKi6fFVIM8ul1tWtQOHY+9WbV1m6YCntG9vvjuedy93Mcqt1deusKfPxfFjGzCyHXO5m\nZjnkcjczyyGXu5lZDrnczcxyyJ+WMY6dHmD/iT6uXB9m2aL57Nm8lic2jL/wp5nVE5f7LHfs9ADP\nv3KW4Vt3ABi4Pszzr5wFcMGb1TEflpnl9p/ou1vso4Zv3WH/ib6MEplZGlzus9yV68MVjZtZfXC5\nz3LLFs2vaNzM6oPLfZbbs3kt8+c2jBmbP7eBPZvXZpTIzNLgBdVZbnTR1J+WMcsXl7vxxIblLnOz\nnPFhGTOzHHK5m5nlkMvdzCyHEpW7pC2S+iRdlPTcBM8/LWlI0lvFrz9MP6qZmSVVdkFVUgNwEPgc\ncBk4KakrIs6Pm/qXEbFzBjKamVmFkuy5PwxcjIj+iPgAOAw8PrOxzMxsOpKU+3LgUsn25eLYeE9K\nOiPpqKSVE/0gSTsk9UrqHRoamkJcMzNLIq0F1W7gwYhYD/w18PJEkyLiUES0RERLU1NTSi9tZmbj\nJSn3AaB0T3xFceyuiHg/In5Z3Pxz4HfSiWdmZlORpNxPAmskrZI0D9gGdJVOkNRcsrkVuJBeRDPL\nUk9/D5uObmL9y+vZdHQTPf09WUeyBMqWe0TcBnYCJyiU9pGIOCdpn6StxWnPSDon6W3gGeDpmQps\nVutudHfz7iOPcuGhdbz7yKPc6O7OOtKU9fT30PF6B4M3BwmCwZuDdLze4YKvA4qITF64paUlent7\nM3lts5lyo7ubwRf2EiMjd8fU2Ejzi/tY2NaWYbKp2XR0E4M3Bz803rygmZ986ScZJDJJb0ZES7l5\nPkPVLEXXDrw0ptgBYmSEawdeyijR9Fy9ebWicasdLnezFN0e/PBe7mTjtW7pgqUVjVvtcLmbpWhO\nc3NF47WufWM7jQ2NY8YaGxpp39ieUaL6lMWitMvdLEVLdu9CjWPLUI2NLNm9K6NE09O6upWOT3fQ\nvKAZIZoXNNPx6Q5aV7dmHa1uZLUo7QVVs5Td6O7m2oGXuD04yJzmZpbs3lWXi6mWjrQXpZMuqPpO\nTGYpW9jW5jK3u7JalPZhGTOzGZTVorTL3cxsBmW1KO3DMmZmM2h08bnzVCdXb15l6YKltG9sn/FF\naZe7mdkMa13dWvVPGPmwjJlZDrnczcxyyOVuZpZDLnczsxxyuZuZ5ZDL3cwsh1zuZmY55HI3M8uh\nROUuaYukPkkXJT03ybwnJYWkslcsMzOzmVO23CU1AAeBzwPrgO2S1k0w7z6gHfg/aYc0M7PKJNlz\nfxi4GBH9EfEBcBh4fIJ5LwJ/CoxM8JzZzDtzBA58AjoWFb6fOZJ1IrPMJCn35cClku3LxbG7JG0E\nVkbEzN87ymwiZ45A9zNw4xIQhe/dz7jgbdaa9oKqpI8A3wW+kWDuDkm9knqHhoam+9Jmv/LqPrg1\nPHbs1nBh3GwWSlLuA8DKku0VxbFR9wGfAH4q6T3gd4GuiRZVI+JQRLREREtTU9PUU5uNd+NyZeNm\nOZek3E8CayStkjQP2AZ0jT4ZETciYnFEPBgRDwJvAFsjwjdItepZuKKycbOcK1vuEXEb2AmcAC4A\nRyLinKR9krbOdECzRB7dC3Pnjx2bO78wbjYLJbpZR0QcB46PG5vw/5qI+Oz0Y5lVaP1The+v7isc\nilm4olDso+Nms4zvxGT5sf4pl7lZkS8/YGaWQy53syq60d3Nu488yoWH1vHuI49yo7s760iWUz4s\nY1YlN7q7GXxhLzFSOIn79pUrDL5QWLpa2NaWZTTLIe+5m1XJtQMv3S32UTEywrUDL2WUyPLM5W5W\nJbcHBysaN5sOl7tZlcxpbq5o3Gw6XO5mVbJk9y7U2DhmTI2NLNm9K6NElmdeUDWrktFF02sHXuL2\n4CBzmptZsnuXF1NtRrjczapoYVuby9yqwodlKuUbQphZHfCeeyVGbwgxet3w0RtCgE97N7Oa4j33\nSviGEGZWJ1zulfANIcysTrjcK+EbQphZnXC5V8I3hDCzOuFyr8T6p6Dte7BwJaDC97bveTHVzGqO\nPy1TKd8QwszqgPfczcxyKFG5S9oiqU/SRUnPTfD8H0k6K+ktSX8raV36Uc3MLKmy5S6pATgIfB5Y\nB2yfoLx/GBGfjIh/DXwH+G7qSc3MLLEke+4PAxcjoj8iPgAOA4+XToiIfy7ZXABEehHNzKxSSRZU\nlwOXSrYvA58aP0nS14BngXnAI6mkMzOzKUltQTUiDkbEbwF/AnxrojmSdkjqldQ7NDSU1kubmdk4\nScp9AFhZsr2iOHYvh4EnJnoiIg5FREtEtDQ1NSVPaWZmFUlS7ieBNZJWSZoHbAO6SidIWlOy2Qq8\nm15EMzOrVNlj7hFxW9JO4ATQAHw/Is5J2gf0RkQXsFPSY8At4BfAf5jJ0GZmSfT099B5qpOrN6+y\ndMFS2je207q6NetYVaGIbD7Y0tLSEr29vZm8tpnlX09/Dx2vdzByZ+TuWGNDIx2f7qjrgpf0ZkS0\nlJvnM1TNLJc6T3WOKXaAkTsjdJ7qzChRdbnczSyXrt68WtF43rjczSyXli5YWtF43rjczSyX2je2\n09jQOGassaGR9o3tGSWqLl/y18xyaXTRdLZ+Wsblbma51bq6ddaU+Xg+LGNmlkMudzOzHHK5m5nl\nkMvdzCyHXO5mZjnkcjczyyGXu5lZDrnczcxyyOVuZpZDLnebHc4cgQOfgI5Fhe9njmSdyGxG+fID\nln9njkD3M3BruLB941JhG2D9U9nlMptB3nO3/Ht136+KfdSt4cK4WU653C3/blyubNwsBxKVu6Qt\nkvokXZT03ATPPyvpvKQzkl6V9ED6Uc2maOGKysbNcqBsuUtqAA4CnwfWAdslrRs37TTQEhHrgaPA\nd9IOajZlj+6FufPHjs2dXxg3y6kke+4PAxcjoj8iPgAOA4+XToiI1yLiX4qbbwDeJbLasf4paPse\nLFwJqPC97XteTLVcS/JpmeXApZLty8CnJpn/FeDHEz0haQewA+D+++9PGNEsBeufcpnbrJLqgqqk\nLwMtwP6Jno+IQxHREhEtTU1Nab60mZmVSLLnPgCsLNleURwbQ9JjwDeBz0TEL9OJZ2ZmU5Fkz/0k\nsEbSKknzgG1AV+kESRuA/wZsjYhr6cc0M7NKlC33iLgN7AROABeAIxFxTtI+SVuL0/YDvwb8laS3\nJHXd48eZmVkVJLr8QEQcB46PG9tb8vixlHOZmdk0+AxVM7MccrmbmeWQy93
MLIdc7vXA1yI3swr5\neu61ztciN7Mp8J57rfO1yM1sClzutc7XIjezKXC51zpfi9zMpsDlXut8LXIzmwKXe63ztcjNbAr8\naZl64GuRm1mFvOduZpZDLnczsxxyuZuZ5ZDL3cwsh1zuZmY55HI3M8shl7uZWQ4pIrJ5YWkI+Id7\nPL0Y+Kcqxpkq50yXc6arHnLWQ0aorZwPRERTuUmZlftkJPVGREvWOcpxznQ5Z7rqIWc9ZIT6yVnK\nh2XMzHLI5W5mlkO1Wu6Hsg6QkHOmyznTVQ856yEj1E/Ou2rymLuZmU1Pre65m5nZNGRa7pK2SOqT\ndFHScxM8/28lnZJ0W9KXsshYzFEu57OSzks6I+lVSQ/UaM4/knRW0luS/lbSulrMWTLvSUkhqeqf\nUkjwXj4taaj4Xr4l6Q+rnTFJzuKcp4p/P89J+mG1MxYzlHs/D5S8l+9Iul6jOe+X9Jqk08X/37+Q\nRc5EIiKTL6AB+DtgNTAPeBtYN27Og8B64L8DX6rhnP8O+FfFx38M/GWN5vxoyeOtwP+sxZzFefcB\n/xt4A2iptYzA08B/yeLvZIU51wCngV8vbi+pxZzj5n8d+H4t5qRw7P2Pi4/XAe9l+Xdgsq8s99wf\nBi5GRH9EfAAcBh4vnRAR70XEGeD/ZRGwKEnO1yLiX4qbbwBZ3OA0Sc5/LtlcAGSx4FI2Z9GLwJ8C\nI9UMV5Q0Y9aS5PwqcDAifgEQEdeqnBEqfz+3Az+qSrKxkuQM4KPFxwuBK1XMV5Esy305cKlk+3Jx\nrNZUmvMrwI9nNNHEEuWU9DVJfwd8B3imStlKlc0paSOwMiJ6qhmsRNI/8yeL/zQ/KmlldaKNkSTn\nx4CPSfqZpDckbalaul9J/P9Q8ZDmKuBvqpBrvCQ5O4AvS7oMHKfwr4ya5AXVFEn6MtAC7M86y71E\nxMGI+C3gT4BvZZ1nPEkfAb4LfCPrLGV0Aw9GxHrgr4GXM85zL3MoHJr5LIU94j+TtCjTRJPbBhyN\niDtZB7mH7cAPImIF8AXgL4p/Z2tOlqEGgNK9nRXFsVqTKKekx4BvAlsj4pdVylaq0vfzMPDEjCaa\nWLmc9wGfAH4q6T3gd4GuKi+qln0vI+L9kj/nPwd+p0rZSiX5M78MdEXErYj4e+AdCmVfTZX83dxG\nNodkIFnOrwBHACLi50AjhevO1J6sDvZT2KPop/BPsNHFi9++x9wfkN2CatmcwAYKCzFravn9LM0H\ntAG9tZhz3PyfUv0F1STvZXPJ498H3qjF9xLYArxcfLyYwmGH36i1nMV5Hwfeo3j+TY2+nz8Gni4+\nfojCMfdM8pb978n0xQv/rHmnWIzfLI7to7D3C/BvKOx53ATeB87VaM7/Bfxf4K3iV1eN5uwEzhUz\nvjZZqWaZc9zcqpd7wvfy28X38u3ie/nxWnwvAVE4zHUeOAtsq8Wcxe0O4D9nka+C93Md8LPin/tb\nwKYs80725TNUzcxyqCYXAszMbHpc7mZmOeRyNzPLIZe7mVkOudzNzHLI5W5mlkMudzOzHHK5m5nl\n0P8HFPaBm+wCbjoAAAAASUVORK5CYII=\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from scipy.io import loadmat\n", "Points1 = loadmat('Points1_ExTree.mat')\n", "Points2 = loadmat('Points2_ExTree.mat')\n", "Points3 = loadmat('Points3_ExTree.mat')\n", "Points4 = loadmat('Points4_ExTree.mat')\n", "\n", "\n", "Points1 = Points1['Points1_ExTree']\n", "Points2 = Points2['Points2_ExTree']\n", "Points3 = Points3['Points3_ExTree']\n", "Points4 = Points4['Points4_ExTree']\n", "\n", "\n", "import matplotlib.pyplot as plt\n", "plt.scatter(Points1[:,0], Points1[:,1])\n", "plt.scatter(Points2[:,0], Points2[:,1])\n", "plt.scatter(Points3[:,0], Points3[:,1])\n", "plt.scatter(Points4[:,0], Points4[:,1])\n", "plt.show()\n", "\n", "\n", "target_1 = np.ones(np.shape(Points1)[0])+np.random.normal(0, 0.2, np.shape(Points1)[0])\n", "target_2 = 2*np.ones(np.shape(Points1)[0])+np.random.normal(0, 0.2, np.shape(Points1)[0])\n", "target_3 = 5*np.ones(np.shape(Points1)[0])+np.random.normal(0, 0.2, np.shape(Points1)[0])\n", "target_4 = -3*np.ones(np.shape(Points1)[0])+np.random.normal(0, 0.2, np.shape(Points1)[0])\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# put your code here\n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Exercise I.1.2. Weakest link (5pts)\n", "\n", "Once we have built a sufficiently deep tree, we are left with determining whether one cannot reduce the depth of this tree (i.e. how coarse the model can be while still achieving sufficient accuracy). A (too) fine model will obviously lead to overfitting. For any tree, we define the set of leaf nodes (i.e. final set of regions) as $M$. We will use this parameter to encode the complexity of the tree. \n", "\n", "We can define the error that we make with a particular tree by considering the error on each region and summing those error terms across each region. If we let $R_m$ to denote the (average) error induced by region $R_m$, \n", "\n", "$$Q_m(T) = \\frac{1}{N_m} \\sum_{x_i\\in R_m} (t_i - \\hat{c}_m)^2$$\n", "\n", "To account for the complexity of the tree, one can also extend this cost into a loss $\\ell(T)$ which penalize complex trees (with a large number of leaf nodes $M$) as\n", "\n", "$$\\ell(T) = \\sum_{m=1}^M N_m Q_m(T) + \\alpha M$$\n", "\n", "We can then look for the optimal tree for any particular value of $\\alpha$. \n", "\n", "The search for the optimal $\\alpha$ and the associated tree is usually done through a procedure known as _weakest link prunning_ which works as follows \n", "\n", "For any node other than the leaf nodes, we can study how much of an improvement one can get by deleting the subtree located below the node. To do this, we proceed as follows. For any given subtree, we can define the costs $\\tilde{R}(t) = Q(t) + \\alpha$ and $R(T_t) = Q(T_t) + \\alpha|T_t|$. The first one is the cost of the node $t$ (that is the contribution if we removed the whole subtree below $t$) and the second one is the contribution of the term if we were to keep all the leaf of the subtree. For each node in $T_0$, we can look for the $\\alpha$ at which $R(T_t)>R(t)$. In other words, we look for the value $\\alpha$ such that \n", "\n", "$$Q(t) + \\alpha < Q(T_t) + \\alpha|T_t|$$\n", "\n", "This is equivalent to looking for the $\\alpha$ such that $\\alpha>\\frac{Q(t)-Q(T_t)}{|T_t|-t}$. We can do this for every node $t$. The _weakest link_ $t'$ is then the connection for which the $\\alpha(t')$ is the smallest. 
If there are multiple nodes achieving the same minimum, we remove all the sub-branches associated with those nodes. We define the next subtree by removing the sub-branches below the _weakest link_ $t'$. Let us denote that subtree as $T'$. We then repeat the procedure on $T'$, looking for the nodes with the smallest $\\alpha$. The procedure generates a sequence of subtrees $T, T', T'', \\ldots$ with associated values $\\alpha_0, \\alpha_1, \\ldots$. \n", "\n", "Implement _weakest link pruning_ below. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import numpy as np\n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Exercise I.1.3. Cross validation (5pts)\n", "\n", "Given the sequence of subtrees $T_0,T_1, T_2,\\ldots$ and its accompanying sequence of weights $\\alpha_0,\\alpha_1,\\ldots$, one can show that this sequence contains the optimal $\\alpha^*$. To find this $\\alpha^*$, one can use $K$-fold cross validation. \n", "\n", "Divide the dataset into $K$ bins of size $N/K$. In turn, we will use each bin as our validation/test set and the remaining $K-1$ bins as our training set. \n", "\n", "1- Using the training set, compute the tree for each of the values of $\\alpha$ obtained above. You don't need to optimize anything, as the value on a region is defined as the average of the targets in this region, and the number of levels can be set by only retaining those subtrees for which $R_\\alpha(t)>R_{\\alpha}(T_t)$.\n", "\n", "2- Once you have computed all the subtrees, evaluate the prediction error of those subtrees on the held-out bin (the remaining $N/K$ points). The prediction error of a new point is just the average target ($c_m$) of the region in which this point is located, minus its true target.\n", "\n", "Then compute the average error as \n", "\n", "$$E(\\alpha) = \\frac{1}{N}\\sum_{i=1}^N (\\text{prediction}_{T_k(\\alpha)}(\\mathbf{x}_i) - t_i)^2$$\n", "\n", "where $N$ is the total number of points in the dataset and $\\text{prediction}_{T_k(\\alpha)}$ is the prediction obtained from the tree $T_k(\\alpha)$ learned on the bins to which $\\mathbf{x}_i$ did not belong (i.e. learned without $\\mathbf{x}_i$).\n", "\n", "\n", "Select the $\\alpha$ which gives the smallest error.\n", "\n", "Take $K$ between $2$ and $4$ and find the optimal $\\alpha$. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# put your code here\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "#### Exercise I.2. Bias-Variance (10pts)\n", "\n", "In this exercise, we will study the decomposition of the prediction error into its bias and variance contributions for a simple regression model. 
For a given learning model, the prediction error can read as\n", "\n", "$$\\mathbb{E}\\left\\{(f(\\mathbf{x},\\theta)) - t(\\mathbf{x}))^2\\right\\}$$\n", "\n", "After some calculation, this expression can reduce to\n", "\n", "$$\\mathbb{E}_D\\left\\{(f_D(\\mathbf{x},\\theta)) - t(\\mathbf{x}))^2\\right\\} = \\left(\\mathbb{E}\\left\\{f_d(\\mathbf{x}, \\theta)\\right\\} - t(\\mathbf{x})\\right)^2 + \\mathbb{E}_D\\left\\{\\left(f_D(\\mathbf{x}; \\theta) - \\mathbb{E}_D\\left\\{f_D(\\mathbf{x}, \\theta)\\right\\}\\right)^2\\right\\}$$\n", "\n", "In this expression, the first term represents the squared bias (that is how much the choice of the family of models we pick differs is able to capture of the measurements we have on average). In the second term, you can recognize the expression of the variance. This second term captures how much the models vary within a particular family of models when we change the subset $D$ on which we learn the models. \n", "\n", "\n", "In this exercise, we will illustrate this decomposition. for a simple regression model on noisy degree $3$ datasets with model of degrees from $0$ to $5$\n", "\n", "Consider the noisy dataset below. For the sake of the exercise, we will not change the dataset each time when computing the average, but rather just generate new points by changing the noise. " ] }, { "cell_type": "code", "execution_count": 25, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXYAAAD8CAYAAABjAo9vAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAAIABJREFUeJzt3Xl8VPW9//HXl2yENUACCQkQ1iCr\nQEARV1yC1gWoqLVWQS2uXW5bbOly29v+utJ7vVWrLfUqbtWWRbRUjCi41QUStoQlJKxZyAaENevM\n9/dHBhswmGXOrHk/Hw8ezJxz5ns+nBneOTnzPd+vsdYiIiLho1OgCxAREWcp2EVEwoyCXUQkzCjY\nRUTCjIJdRCTMKNhFRMKMgl1EJMwo2EVEwoyCXUQkzEQGYqfx8fE2NTU1ELsWEQlZ2dnZldbahJa2\nC0iwp6amkpWVFYhdi4iELGPM/tZsp0sxIiJhRsEuIhJmFOwiImFGwS4iEmYU7CIiYcaRXjHGmH3A\nccAFNFhr051oV0RE2s7J7o5XWGsrHWxPRETaISD92EVEOoqVm4r55T93UHGilv49O/PIjJHMnJDs\n0306dY3dAm8ZY7KNMfOb28AYM98Yk2WMyaqoqHBotyIiwWvlpmK+v3wrFSdqASg5WsPCFTms3FTs\n0/06FewXW2snAtcCDxljLj17A2vtYmtturU2PSGhxTtiRURC3m9X76S2wX3Gsup6F4sy83y6X0eC\n3Vpb7Pm7HHgVmOJEuyIioaquwc3BYzXNriupqvbpvr0OdmNMV2NM99OPgWuAXG/bFREJVdZaFq7I\nOef6/nGxPt2/E2fs/YAPjTFbgPXAP621bzrQrohISHpibQHLNxYxY3QisVERZ6yLjYpgQUaaT/fv\nda8Ya+0eYLwDtYiIhLzXNhfz32t2MXtiMv89ZzyvbS5hUWYeJVXV9I+LZUFGms97xai7o4iIQ9bv\nPcyCpVu5cEhvfjN7HMYYZk5I9nmQn01DCoiIOGBPxQnmv5BFSu9Y/nxHOtGRgYtXBbuIiJcOnahl\n3pINRBjDkrlT6NklKqD16FKMiIgXaupdzH8hm9KjNbw8/0IG9ukS6JIU7CIi7eV2W767dAvZ+4/w\n5FcnMnFgr0CXBOhSjIhIuy16K49/bj3IwmtHct3YpECX8xkFu4hIO7y8/gBPvbub2y8YyPxLhwS6\nnDMo2EVE2uj9XRX8eGUul41I4Oc3jsYYE+iSzqBgFxFpg7zS4zz40kaG9+3GE7dPIDIi+GI0+CoS\nEQlS5cdqmPfserrGRPDsvMl07xzYbo3nol4xIiKtcKqugXuey6Kqup6/3zeVpJ6+HcjLGwp2EQlb\nKzcVOzJOi8tt+ebLm9hWcpSn70pnTHJPH1TrHAW7iISllZuKWbgih+p6FwDFVdWfDaXb1nD/xart\nvL2jnJ/fNJrpI/s5XqvTdI1dRMLSosy8z0L9tPbMXvTsv/ay5KN93HPxYO6cmupghb6jYBeRsHSu\nWYraMnvRmu1l/HzVdjJG9+OH153nVGk+p2AXkbB0rlmKWjt7UU7RUb758ibGJffkf2+dQESn4Oqr\n/kUU7CISlhZkpLV79qKiI6e4+7kN9O4azdN3TSY2OqLF1wQTfXkqImHp9Bekbe0Vc6ymnruXbKCm\n3sVf772AhO4x/ijXUQp2EQlbbZ29qN7l5qGXNrKn4iTP3T2F4f26+7A633HsUowxJsIYs8kYs8qp\nNkVE/MVay09W5vJBfiW/mj2WacPiA11Suzl5jf1bwA4H2xMR8Zun3tvNKxsKefiKYdySPiDQ5XjF\nkWA3xqQAXwKedqI9ERF/+seWEn73Zh43ju/Pd68ZEehyvObUGfv/Ao8AbofaExHxi6x9h/nu0i1M\nTu3Fojnjgm4I3vbwOtiNMdcD5dba7Ba2m2+MyTLGZFVUVHi7WxERr+2rPMnXn88iOS6WxV9LJyYy\ntLo1nosTZ+zTgBuNMfuAV4DpxpgXz97IWrvYWpturU1PSEhwYLciIu135GQd85ZsAODZuZPp1TU6\nwBU5x+
tgt9YutNamWGtTgduAtdbaO7yuTETER2obXNz3QjbFVdX85c50UuO7BrokR+nOUxHpUKy1\nPLJsK+v3Heb3c8aTnto70CU5ztEblKy17wLvOtmmiIiT/mfNLl7bXMKCjDRuHN8/0OX4hM7YRaTD\nWJpVyONrC7g1fQAPXj400OX4jIJdRDqEjwoqWbgih4uHxfP/Zo0Ji26N56JgF5Gwl192nPtezGZI\nQleevGMiURHhHX3h/a8TkQ6v4ngt85ZsoHNUBM/MnUyPzlGBLsnnFOwiEraOVtdz73MbOHSijv+7\nK52UXl0CXZJfaNheEQlLxVXVzH1mPfsOneTJr05iXEpcoEvyGwW7iISdbSVHmffsBqrrXDx39xQu\nGhq6Q/C2h4JdRMLK+7sqeODFbHrERrHsgYtISwzNyTK8oWAXEcet3FTc5inpnLA0q5CFK3IY1rcb\nS+ZNIbFnZ5/vMxgp2EXEUSs3FbNwRQ7V9S6g8Vr3whU5AD4Ld2stj71TwKNv7+LiYfE8dcdEuneA\n3i/nol4xIuKoRZl5n4X6adX1LhZl5vlkf/UuNz9YnsOjb+9i9sRknpk7uUOHOuiMXUQcVlJV3abl\n3jhR28BDL23kvV0VfHP6MP7j6hFhfUdpaynYRcRR/eNiKW4mxPvHxTq6n/JjNcxbsoGdpcf59eyx\nfGXKQEfbD2W6FCMijlqQkUZs1JkzEcVGRbAgI82xfRSUH2fWkx+xt/IkT9+ZrlA/i87YRcRRp78g\n9VWvmPV7D/P157OIiujE3+ZPZWxKT0faDScKdhFx3MwJyT7pAbNqawnf+dsWUnrH8ty8KQzo3TGG\nCGgrBbuIBD1rLU9/sJdfvrGD9EG9ePqudOK6hM8cpU5TsItIUHO5Lb9YtZ0lH+3jurGJ/M8t59P5\nrGv4ciYFu4gErZp6F996ZROZ28q45+LB/Oi68+jUSd0ZW+J1sBtjOgPvAzGe9pZZa3/qbbsi0rEd\nPlnHvc9tYFNhFf95/SjuvnhwoEsKGU6csdcC0621J4wxUcCHxpjV1tpPHGhbRDqg/YdOMvfZDZRU\nVfPk7RO5dmxSoEsKKV4Hu7XWAic8T6M8f6y37YpIx7S5sIp7lmzAZS1//foFTBrUO9AlhRxHblAy\nxkQYYzYD5cAaa+2nTrQrIh3L29vLuG3xx3SJiWD5Axcp1NvJkWC31rqstecDKcAUY8yYs7cxxsw3\nxmQZY7IqKiqc2K2IhJEXP9nP/BeyGNGvOysemMbQhG6BLilkOTqkgLW2ClgHzGhm3WJrbbq1Nj0h\nIcHJ3YpICHO7Lb99cyc/XpnL5Wl9eWX+hSR0jwl0WSHN62A3xiQYY+I8j2OBq4Gd3rYrIuGvrsHN\nd/6+mafe3c3tFwxk8dcm0SVavbC95cQRTAKeM8ZE0PiD4u/W2lUOtCsiYexodT33v5DNx3sOsSAj\njQcvH6ohdx3iRK+YrcAEB2oRkQ6ipKqauc+uZ2/lSR69dTyzJqQEuqSwot95RMSvtpccY96S9Zyq\ndbFk3hSmDYsPdElhR8EuIn7zYX4l97+YTbeYSJY+MJWRiT0CXVJYUrCLiF8szy7i+8u3MqxvN56d\nN5mkns7OqCT/pmAXEZ+y1vLE2gL+e80upg3rw1N3TKJHB59s2tcU7CLiMw0uNz9emcsrGwqZNSGZ\n3355HNGRmpHT1xTsIuITJ2sbePivG1mXV8FDVwzle9ekqTujnyjYRcRxG/Yd5pFlW9l/6CS/mjWW\n2y/QZNP+pGAXEcecrG1gUWYez328j5Resbx47wVcNFTdGf1NwS4ijviooJLvr9hK0ZFq7pqayoKM\nNLrGKGICQUddRLxyrKaeX7+xk5fXH2BIfFf+ft9UJqdquN1AUrCLSLutyyvnhytyKDtWw32XDuE/\nrh6hiaaDgIJdRNqs6lQdv1i1g+UbixjRrxtP3TGN8wfEBbos8VCwi4SZlZuKWZSZR0lVNf3jYlmQ\nkcbMCcmOtZ+5rZQfr8zl8Mk6vjF9GA9PH0ZMpM7Sg4mCXSSMrNxUzMIVOVTXuwAorqpm4YocAK/D\n/dCJWn72j+38Y0sJo5J68OzcyYxJ7ul1zeI8BbtIGFmUmfdZqJ9WXe9iUWZeu4PdWsuqrQf56evb\nOF5Tz3evHsH9lw8lKkJ3kAYrBbtIGCmpqm7T8paUH6vhJ6/lkrmtjPEpPVk050JG9OvuTYniBwp2\nkTDSPy6W4mZCvH9c20ZStNayYmMxP1+1nZp6Fz+8biR3TxtMpM7SQ4LeJZEwsiAjjdizuhvGRkWw\nICOt1W2UVFUzb8kGvrt0C8P7dmP1ty5h/qVDFeohRGfsImHk9HX09vSKsdbyyoZCfvnPHbjclp/d\nMIo7p6bSqZMG7go1CnaRMDNzQnKbvygtPHyKH6zYyr8KDnHR0D78ZvY4Bvbp4qMKxde8DnZjzADg\neaAfYIHF1to/eNuuiPie22154ZP9/PbNnXQyhl/NGstXpgzQ8Lohzokz9gbgu9bajcaY7kC2MWaN\ntXa7A22LiI/srTzJI8u2sGHfES4bkcCvZ49t85esEpy8DnZr7UHgoOfxcWPMDiAZULCLBCGX2/LM\nh3v5/Vt5xER24vdzxvPlick6Sw8jjl5jN8akAhOAT5tZNx+YDzBwoAbdFwmE/LLjfG/ZVrYUVnH1\nqH78cuYY+vboHOiyxGGOBbsxphuwHPi2tfbY2euttYuBxQDp6enWqf2KSMvqXW7+/N5uHnungK4x\nETz2lQncMC5JZ+lhypFgN8ZE0RjqL1lrVzjRpog4Y1vJUR5ZtpVtJce4flwSP7txNPHdYgJdlviQ\nE71iDPB/wA5r7f94X5KIOKG2wcUf1xbw5Lu7iesSzZ/umMSMMYmBLkv8wIkz9mnA14AcY8xmz7If\nWmvfcKBtEWmHLYVVLFi2hV1lJ5g9MZn/vH4UcV2iA12W+IkTvWI+BHShTiQI7Cw9xuNrC3gj5yD9\nunfm2bmTuWJk30CXJX6mO09FwkBu8VEeeyeft7aX0S0mkgcvH8p9lw2lR+eoQJcmAaBgFwlhmw4c\n4fG1BazdWU6PzpF868rh3D1tMD27KNA7MgW7SAjasO8wj72Tzwf5lcR1ieJ714zgzotSdYYugIJd\nJGRYa/lkT2Ogf7znEH26RrPw2pHcceEgusbov7L8mz4NIkHOWssH+ZU8vjafDfuOkNA9hh9/6Ty+\nesEgYqM1ibR8noJdJEhZa1mXV85j7xSwubCKpJ6d+a8bR3Pr5AF0jlKgy7kp2EWCjNttWbOjjMfX\n5pNbfIzkuFh+NWssX56UTEykAl1apmAXCRJut2V1bimPr81nZ+lxBvXpwu9uHsesCclEaVo6aQMF\nu0iAudyWVVtLeHxtAQXlJxia0JVHbx3PDeP6a55RaRcFu4SFlZuK2zXPZyDVu9y8trmEP64rYG/l\nSdL6defxr0zgurFJRGieUfGCgl1C3spNxSxckUN1vQuA4qpqFq7
IAQjKcK9rcLN8YxFPvltA4eFq\nRiX14E93TOSaUYmaOFocoWCXkLcoM++zUD+tut7Fosy8oAr2mnoXS7MKeerd3ZQcrWF8Sk9+ev1o\nrjyvr8ZFF0cp2CXklVRVt2m5v1XXuXh5/QH+/P5uyo7VMmlQL341eyyXjUhQoItPKNgl5PWPi6W4\nmRAP9MTMJ2sbeOnT/Sx+fw+VJ+q4YHBvHr3lfKYO7aNAF59SsEvIW5CRdsY1doDYqAgWZKQFpJ7j\nNfU8//F+nv5gD0dO1XPxsHi+MX0YFwzpE5B6pONRsEvIO30dPdC9YnZXnGBZdhF//fQAR6vruTwt\ngW9MH86kQb38WoeIgl3CwswJyQH5ovR4TT2rth5kaVYhGw9UEdHJcOXIvjw8fRjjUuL8Xo8IKNhF\n2szttnyy5xBLs4tYnXuQmno3w/p2Y+G1I5k1IZm+PToHukTp4BTsIq1UePgUy7KLWL6xiKIj1XSP\niWT2xBTmTErh/AFx+kJUgoYjwW6MeQa4Hii31o5xok2RYFBd52J17kGWZhXx8Z5DGAPThsazICON\njNGJGmVRgpJTZ+xLgCeA5x1qTyRgrLVsPHCEpVlFrNp6kBO1DQzs3YXvXD2CL09KITnA3ShFWuJI\nsFtr3zfGpDrRlkiglB2rYfnGIpZlF7Gn4iRdoiO4bmwSN09KYUpq7xZv9w/F8WokPOkau3RotQ0u\n3t5eztLsQt7fVYHbwpTU3tx/2VCuG5tEt1ZOORdq49VIePNbsBtj5gPzAQYOHOiv3Yp8jrWW3OJj\nLM0u5LXNJRytriepZ2cevHwYN09KITW+a5vbDJXxaqRj8FuwW2sXA4sB0tPTrb/2K3LaoRO1vLqp\nmGXZRewsPU50ZCcyRicyZ1IK04bFezVUbrCPVyMdiy7FSFird7l5N6+CpVmFrN1ZToPbMn5AHL+Y\nOYYbx/WnZ5coR/YTrOPVSMfkVHfHl4HLgXhjTBHwU2vt/znRtkh77Co7ztKsQl7dVELliVriu8Vw\n98WDuXlSCiP6dXd8f8E2Xo10bE71ivmKE+2IeOPoqXpe31rCsqxCthQdJbKT4crz+jJn0gAuS0vw\n6byhwTJejQjoUoyEuCMn61izo4zM3FI+yK+kzuVmZGJ3fnL9KGae358+3WL8VkugxqsROZuCXUJO\n+fEaMrc1hvnHew7hcluS42L52tRBzJqQzOj+PXR7v3RoCnYJCUVHTvFmbimZ20rJ2n8Ea2FIfFfu\nu3QI145JYkyywlzkNAW7eMWXd1vuqTjBak+Yby06CsB5ST349pUjuHZsIsP7dlOYizRDwS7t5vTd\nltZadpYebwzz3FLyyo4DcP6AOH5w7UhmjE5s181DIh2Ngl3azYm7La21bCk6yurcg2TmlrLv0Ck6\nGZic2puf3jCKjNGJ6gsu0kYKdmm39t5t6XJbsvYd/uwyy8GjNUR2Mlw0LJ75lw7lmtH9iPdjbxaR\ncKNgl3Zry92W9S43H+8+xOrcUtZsL6XyRB3RkZ24bEQC37smjavO6+fYXaAiHZ2CXdqtpbsta+pd\nfJBfyercg7y9vYxjNQ10jY7gipF9mTEmkSvS+tK1laMnikjr6X+VtFtzd1t+c/owIiMMD/11I+t2\nlnOqzkXP2CiuHpXIjDGJXDI8XrMOifiYgl28MnNCMlek9eXtHWWszi3lJ69vo67BTXy3aGZOSGbG\n6ESmDu3j09v5ReRMCnZpl4rjtazZXsbq3IN8vPsQDW5L/56d+eoFA7l2TBKTBvXyahhcEWk/Bbu0\nWklVNZnbSlmdW0rWvsO4LaT26cK9lwxhxphExqf01A1DIkFAwS5faF/lSVbnlvLmtlK2FFYBkNav\nO9+YPpxrxyaS1q+7wlwkyCjY5QzWWnaVnWB17kHezC1lZ2nj3Z/jUnryyIw0ZoxOZEhCtwBXKSJf\nRMEuWGvJKT762a38eypPYgykD+rFT64fRcbofqT06hLoMkWklRTsHZTLbdl44Aircxrv/iyuqiai\nk2HqkD7cffFgrhndj77dOwe6TBFpBwV7B1LvcvPpnsOszj3IW9vLqDheS3REJy4ZHs+3rxrO1aP6\nEdclOtBlioiXFOxhrqbexb8KKlmdW8rbO8qoOlVPbFQEV4xMYMaYJK5IS6B7Z93KLxJOnJrMegbw\nByACeNpa+xsn2pX2OVnbwHu7KlidW8q6neWcqG2ge+dIrjqvHzPGJHLZiATd/SkSxrwOdmNMBPBH\n4GqgCNhgjHndWrvd27al9Y5W1/POjjLezC3lvV0V1Da46dM1mhvGJ5ExOpGLhsYTHam7P0U6AifO\n2KcABdbaPQDGmFeAmwAFu48dra5ndc5BVueW8tHuSupdln49Yrht8gBmjElicmovInUrv0iH40Sw\nJwOFTZ4XARc40K40w+W2fLS7kqVZRWRuK6W2wc2A3rHMmzaYGWMSOT8ljk66lV+kQ/Pbl6fGmPnA\nfICBAwf6a7dhY/+hkyzLLmJ5dhElR2vo0TmSW9IHcPOkFMbpVn4RacKJYC8GBjR5nuJZdgZr7WJg\nMUB6erp1YL9h72RtA2/kHGRpdhHr9x7GGLhkeAILrzuPq0f1483cUh58aaNPJpIWkdDlRLBvAIYb\nYwbTGOi3Abc70G6HZK1l/d7DLM0u4o2cg5yqczE4visLMtKYPTGZpJ6NsxM5PZG0iIQPr4PdWttg\njHkYyKSxu+Mz1tptXlfWwZRUVbM8u4hlG4vYf+gUXaMjuGFcf+akpzBpUK/PXWpxYiJpEQlPjlxj\nt9a+AbzhRFsdSU29i8xtpSzLLuLDgkqshQuH9OabnpETu0Sf++1p70TSIhL+dOepn1lr2VJ0lKVZ\nhby+pYTjNQ0kx8XyjenDuXliCgP7tG6wrbZMJC0iHYuC3U/Kj9ewclMxS7OKyC8/QUxkJ64dk8ic\n9AFMHdKnzV0UW5pIWkQ6LgW7D9U1uFm7s4ylWUW8u6sCl9sycWAcv549li+NS6KHF2O0NDeRtHrF\niAgo2H1ie8kxlmYX8trmEg6frKNv9xi+fskQbp6UwrC+zk1SMXNCsoJcRD5Hwe6QIyfreG1zMUuz\ni9hWcozoiE5cNaovcyYN4JLh8bq1X0T8RsHuhQaXmw/yK1maXcjb28upc7kZk9yD/7pxNDeO70+v\nrhrbXET8T8HeDlWn6njmX/t4Zf0Byo/X0rtrNHdcOIg56Smcl9Qj0OWJSAenYG+DQydqefrDvTz/\n0T5O1rmYPrIvt6QPYPrIvhoSV0SCRsgF+8pNxX7vCVJ+vIa/vL+HFz85QE2Diy+NTeLh6cMYmaiz\ncxEJPiEV7P4eH6X0aA1/em83L68/QL3LzU3nJ/PQFUMZ1re74/sSEXFKSAW7v8ZHKTpyij+9t5u/\nbyjCbS2zJiTz0BXDSI3v6tg+RER8JaSC3dfjoxw4dIon3y1gWXYRxsDNkwbw4OVDGdC7dbf5i4gE\ng5AKdl+Nj7Kn4gR/XL
eblZuLiehkuP2Cgdx/2VCNuyIiISmkgt3p8VHyy47z+NoCVm0tITqyE3Mv\nSmX+pUPo16OzUyWLiPhdSAW7U+Oj7Dh4jCfWFvBG7kFioyL4+iVDuPeSISR0j/FF2SIifhVSwQ7e\njY+SU3SUx9bms2Z7Gd1iInno8mHcffFgeusOUREJIyEX7O2x8cARHn8nn3V5FfToHMm3rxrOvIsG\n07NL+0dXFBEJVmEd7Bv2Heaxd/L5IL+SuC5RLMhI42tTB3k1XK6ISLALu2C31vLxnkM89k4+n+w5\nTHy3aBZeO5I7LhxE15iw++eKiHxO2CSdtZYP8it57J18svYfoW/3GH5y/ShunzKQ2OiIQJcnIuI3\nXgW7MWYO8DPgPGCKtTbLiaLawlrLurxy/vBOAVsKq0jq2Zmf3zSaW9IH0DlKgS4iHY+3Z+y5wGzg\nzw7U0iZut+Wt7WU8sS6f3OJjpPSK5VezxvLlScnERCrQRaTj8irYrbU7AIxp20TM3nC5LatzD/LE\n2gJ2lh4ntU8XfnfzOGZNSCZKsxSJiPjvGrsxZj4wH2DgwIHtbuf9XRU8/NdNDE3oyqO3jueGcf01\n7ZyISBMtBrsx5m0gsZlVP7LWvtbaHVlrFwOLAdLT022rKzzLZSMS+Mud6Uwf2ZeITv77TUFEJFS0\nGOzW2qv8UUhrdepkuHpUv0CXISIStHQNQ0QkzHgV7MaYWcaYImAq8E9jTKYzZYmISHt52yvmVeBV\nh2oREREH6FKMiEiYUbCLiIQZBbuISJhRsIuIhBkFu4hImFGwi4iEGQW7iEiYUbCLiIQZBbuISJhR\nsIuIhBkFu4hImFGwi4iEGQW7iEiYUbCLiIQZBbuISJhRsIuIhBkFu4hImFGwi4iEGQW7iEiY8WrO\nU2PMIuAGoA7YDcyz1lY5UVgwW7mpmEWZeZRUVdM/LpYFGWnMnJAc6LJERADvz9jXAGOsteOAXcBC\n70sKbis3FbNwRQ7FVdVYoLiqmoUrcli5qTjQpYmIAF4Gu7X2LWttg+fpJ0CK9yUFt0WZeVTXu85Y\nVl3vYlFmXoAqEhE5k5PX2O8GVjvYXlAqqapu03IREX9r8Rq7MeZtILGZVT+y1r7m2eZHQAPw0he0\nMx+YDzBw4MB2FRsM+sfFUtxMiPePiw1ANSIin9disFtrr/qi9caYucD1wJXWWvsF7SwGFgOkp6ef\nc7tgtyAjjYUrcs64HBMbFcGCjLQAViUi8m/e9oqZATwCXGatPeVMScHtdO8X9YoRkWBlvuAku+UX\nG1MAxACHPIs+sdbe39Lr0tPTbVZWVrv3KyLSERljsq216S1t59UZu7V2mDevFxER5+nOUxGRMKNg\nFxEJMwp2EZEwo2AXEQkzXvWKafdOjakA9nvZTDxQ6UA5TgrGmkB1tUUw1gSqqy2CsSZwpq5B1tqE\nljYKSLA7wRiT1ZpuP/4UjDWB6mqLYKwJVFdbBGNN4N+6dClGRCTMKNhFRMJMKAf74kAX0IxgrAlU\nV1sEY02gutoiGGsCP9YVstfYRUSkeaF8xi4iIs0I2mA3xswxxmwzxriNMef8JtkYM8MYk2eMKTDG\n/KDJ8sHGmE89y/9mjIl2qK7expg1xph8z9+9mtnmCmPM5iZ/aowxMz3rlhhj9jZZd76/6vJs52qy\n79ebLHf8eLXyWJ1vjPnY815vNcbc2mSdo8fqXJ+VJutjPP/2As+xSG2ybqFneZ4xJsObOtpR13eM\nMds9x+cdY8ygJuuafT/9UNNcY0xFk33f22TdXZ73PN8Yc5dTNbWyrkeb1LTLGFPVZJ2vjtUzxphy\nY0zuOdYbY8xjnpq3GmMmNlnnm2NlrQ3KP8B5QBrwLpB+jm0iaJxEewgQDWwBRnnW/R24zfP4T8AD\nDtX1O+AHnsc/AH7bwva9gcNAF8/zJcDNPjheraoLOHGO5Y4fr9bUBIwAhnse9wcOAnFOH6sv+qw0\n2eZB4E+ex7cBf/M8HuXZPgYY7Gknwo91XdHk8/PA6bq+6P30Q01zgSfO8Xnf4/m7l+dxL3/Vddb2\n3wCe8eWx8rR7KTARyD3H+utonF3OABcCn/r6WAXtGbu1doe1tqWJRKcABdbaPdbaOuAV4CZjjAGm\nA8s82z0HzHSotJs87bW23Zuv7FB0AAAEAUlEQVSB1db349W3ta7P+PB4tViTtXaXtTbf87gEKAda\nvAGjHZr9rHxBvcuAKz3H5ibgFWttrbV2L1Dgac8vdVlr1zX5/PhjbuHWHKtzyQDWWGsPW2uP0Djh\n/YwA1fUV4GWH9n1O1tr3aTx5O5ebgOdto0+AOGNMEj48VkEb7K2UDBQ2eV7kWdYHqLL/nmj79HIn\n9LPWHvQ8LgX6tbD9bXz+w/VLz69kjxpjYvxcV2djTJYx5pPTl4fw3fFq07Eyxkyh8Uxsd5PFTh2r\nc31Wmt3GcyyO0nhsWvNaX9bV1D2cObdwc++nv2r6sue9WWaMGdDG1/qyLjyXqwYDa5ss9sWxao1z\n1e2zY+XVeOzeMq2YTzUQvqiupk+stdYYc85uRZ6fymOBzCaLF9IYctE0dn/6PvBzP9Y1yFpbbIwZ\nAqw1xuTQGGDt4vCxegG4y1rr9ixu97EKR8aYO4B04LImiz/3flprdzffgqP+Abxsra01xtxH4286\n0/2w39a6DVhmrXU1WRaoY+V3AQ1228J8qq1QDAxo8jzFs+wQjb/uRHrOvE4v97ouY0yZMSbJWnvQ\nE0blX9DULcCr1tr6Jm2fPoOtNcY8C3zPn3VZa4s9f+8xxrwLTACW087j5URNxpgewD9p/IH+SZO2\n232smnGuz0pz2xQZYyKBnjR+llrzWl/WhTHmKhp/WF5mra09vfwc76e3YdViTdbaQ02ePk3j9ymn\nX3v5Wa9918t6Wl1XE7cBDzVd4KNj1RrnqttnxyrUL8VsAIabxh4d0TS+ma/bxm8m1tF4fRvgLsCp\n3wBe97TXmnY/d43PE3Cnr2vPBJr9Jt0XdRljep2+nGGMiQemAdt9eLxaU1M08CqN1yCXnbXOyWPV\n7GflC+q9GVjrOTavA7eZxl4zg4HhwHovamlTXcaYCcCfgRutteVNljf7fvqppqQmT28EdngeZwLX\neGrrBVzDmb+x+rQuT20jafwy8uMmy3x1rFrjdeBOT++YC4GjnpMW3x0rp74ZdvoPMIvGa061QBmQ\n6VneH3ijyXbXAbto/Mn7oybLh9D4n68AWArEOFRXH+AdIB94G+jtWZ4OPN1ku1QafyJ3Ouv1a4Ec\nGkPqRaCbv+oCLvLse4vn73t8ebxaWdMdQD2wucmf831xrJr7rNB4aedGz+POnn97gedYDGny2h95\nXpcHXOvwZ72lut72/B84fXxeb+n99ENNvwa2efa9DhjZ5LV3e45hATDPn8fK8/xnwG/Oep0vj9XL\nNPbmqqcxs+4B7gfu96w3wB89NefQpJefr46V7jwVEQkzoX4pRkREzqJgFxEJMwp
2EZEwo2AXEQkz\nCnYRkTCjYBcRCTMKdhGRMKNgFxEJM/8f7wM08uRvXPsAAAAASUVORK5CYII=\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "import numpy as np\n", "import matplotlib.pyplot as plt\n", "\n", "x = np.linspace(-1,1,10)\n", "y = x**3 + x**2 + 2*x + 1\n", "\n", "numTest = 400\n", "\n", "ynoisy = y +np.random.normal(0,.6,len(x))\n", "\n", "plt.scatter(x, ynoisy)\n", "plt.plot(x, y)\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To represent the three terms (bias, variance and Mean Squared Error) code the following steps \n", "\n", "__1.__ We will write two nested loops. The first one on the maximum degree of the regression model. The second one on the number of experiments. For each experiments we will generate new points as we did above but with a different noise vector. That is re-use a line of the form ynoisy = y +np.random.normal(0,.6,len(x)) for each XP \n", "\n", "\n", "__2.__ In each experiment, for each maximum degree of the regression model (0,1,..5), generate the polynomial features up to this degree and learn the regression model on the noisy points with the LinearRegression function from scikit (just use the plain simple synthax : LinearRegression(), no need for any argument and fit the model to the noisy points)\n", "\n", "\n", "__3.__ For each experiment keep track of the models you learn by using the points xprediction below and computing the prediction of the model at those points. Store those predictions in a matrix of size num_MaxDegree x numXP\n", "\n", "\n", "__4.__ Compute each of the three terms (bias, variance and Mean squared error = Bias + variance) by using the expressions given above. \n", "\n", "\n", "__5.__ plot the results" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import matplotlib.pyplot as plt\n", "\n", "x = np.linspace(-1,1,10)\n", "y = x**3 + x**2 + 2*x + 1\n", "\n", "maxDegree = 5\n", "numTest = 400\n", "\n", "from sklearn.linear_model import LinearRegression\n", "\n", "\n", "xprediction = np.linspace(-1,1,100)\n", " \n", " \n", "averagePrediction = np.zeros((len(xprediction),maxDegree)) \n", "true = np.zeros((len(xprediction),maxDegree)) \n", "variance = np.zeros((len(xprediction),maxDegree)) \n", "bias = np.zeros((len(xprediction),maxDegree)) \n", "\n", " \n", "for dmax in np.arange(0,maxDegree):\n", "\n", " \n", " ynoisy = np.zeros((len(x),numTest))\n", " predictionTesti = np.zeros((len(xprediction),numTest))\n", "\n", " clf = LinearRegression()\n", "\n", " for testi in np.arange(0,numTest):\n", "\n", " ynoisy[:,testi] = y +np.random.normal(0,.6,len(x))\n", " \n", " \n", " '''put your code here. 
 " \n", " \n", " predictionTesti[:,testi] = 0 # fill in the matrix with the predictions from each model\n", " \n", "\n", " averagePrediction[:,dmax] = 0 # Compute the average prediction across the XPs\n", " \n", " \n", " centeredPrediction = predictionTesti - np.expand_dims(averagePrediction[:,dmax],axis=1)\n", " \n", " variance[:,dmax] = 0 # Compute the variance by averaging over the XPs (see expression above)\n", " \n", " \n", " true[:,dmax] = 0 # compute the prediction from the noiseless model to get the targets of the points in xprediction\n", " \n", " bias[:,dmax] = 0 # Compute the bias by averaging over the XPs (see expression above)\n", " \n", "\n", "'''plot the result using the lines below''' \n", "\n", "plt.semilogy(np.arange(0,maxDegree), np.mean(bias,axis=0), label='bias')\n", "plt.semilogy(np.arange(0,maxDegree), np.mean(variance,axis=0), label = 'variance')\n", "plt.semilogy(np.arange(0,maxDegree), np.mean(bias,axis=0) + np.mean(variance,axis=0), label = 'MSE')\n", "\n", "plt.legend() \n", "\n", "plt.show()\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Desired result: a semilog plot of the bias, the variance and the MSE as functions of the model degree (figure not included).\n" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "### Part II. Unsupervised Learning (20pts)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "### Exercise II.1. Clustering and latent representation\n", "\n", "The lines below can be used to load and display (low resolution) images of digits from 0 to 9. The labels associated with each image are stored in the vector $y$. Using this vector, only retain the images representing $3$ and $4$. We will temporarily forget about the labels and learn a 2D representation of the images through ISOMAP. 
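As a starting point, the filtering step might look like the following sketch (X and y are the arrays loaded in the cell below; X34 and y34 are illustrative names):

```python
# keep only the images of 3s and 4s and set the labels aside
mask = (y == 3) | (y == 4)
X34, y34 = X[mask], y[mask]
```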
" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPgAAAD8CAYAAABaQGkdAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi40LCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcv7US4rQAACrdJREFUeJzt3V+IXOUZx/Hfr6vSWq2G1hbZDU0i\nEpBCjQkBSRGaxBKraC9qSEChUlhvFKUFjb3rnVdiL4oQolYwVbpRQcRqE1Ss0Fp3Y2xNNpZ0sWQX\nbSKJRL1oSHx6sScQJXbOZs5558zj9wOL+2fY95nEb87Z2ZnzOiIEIKevDHoAAO0hcCAxAgcSI3Ag\nMQIHEiNwIDECBxIjcCAxAgcSO6eNb2o75dPjFi1aVHS90dHRYmsdO3as2Fpzc3PF1jp58mSxtUqL\nCPe6TSuBZ7V+/fqi691///3F1tq1a1extbZs2VJsraNHjxZbq4s4RQcSI3AgMQIHEiNwIDECBxIj\ncCAxAgcSI3AgsVqB295g+x3bB2yXe5YCgL70DNz2iKTfSrpO0hWSNtu+ou3BAPSvzhF8taQDETET\nEcclPSnppnbHAtCEOoGPSjp42sez1ecAdFxjLzaxPS5pvKnvB6B/dQKfk7T4tI/Hqs99RkRslbRV\nyvtyUWDY1DlFf0PS5baX2j5P0iZJz7Y7FoAm9DyCR8QJ23dIelHSiKRHImJv65MB6Futn8Ej4nlJ\nz7c8C4CG8Uw2IDECBxIjcCAxAgcSI3AgMQIHEiNwIDECBxJjZ5MFKLnTiCQtW7as2Folt2U6cuRI\nsbU2btxYbC1JmpiYKLpeLxzBgcQIHEiMwIHECBxIjMCBxAgcSIzAgcQIHEiMwIHE6uxs8ojtQ7bf\nLjEQgObUOYL/TtKGlucA0IKegUfEq5LKPXkYQGP4GRxIjK2LgMQaC5yti4Du4RQdSKzOr8mekPQX\nScttz9r+eftjAWhCnb3JNpcYBEDzOEUHEiNwIDECBxIjcCAxAgcSI3AgMQIHEiNwILGh37po5cqV\nxdYquZWQJF122WXF1pqZmSm21s6dO4utVfL/D4mtiwAUROBAYgQOJEbgQGIEDiRG4EBiBA4kRuBA\nYgQOJEbgQGJ1Lrq42PbLtvfZ3mv7rhKDAehfneein5D0y4jYbftCSVO2d0bEvpZnA9CnOnuTvRcR\nu6v3P5I0LWm07cEA9G9BryazvUTSCkmvn+FrbF0EdEztwG1fIOkpSXdHxLHPf52ti4DuqfUouu1z\nNR/39oh4ut2RADSlzqPolvSwpOmIeKD9kQA0pc4RfI2kWyWttb2nevtxy3MBaECdvclek+QCswBo\nGM9kAxIjcCAxAgcSI3AgMQIHEiNwIDECBxIjcCCxod+bbNGiRcXWmpqaKraWVHa/sJJK/zl+mXEE\nBxIjcCAxAgcSI3AgMQIHEiNwIDECBxIjcCAxAgcSq3PRxa/a/pvtt6qti35dYjAA/avzVNX/Slob\nER9Xl09+zfYfI+KvLc8GoE91LroYkj6uPjy3emNjA2AI1N34YMT2HkmHJO2MiDNuXWR70vZk00MC\nODu1Ao+IkxFxpaQxSattf+8Mt9kaEasiYlXTQwI4Owt6FD0iPpT0sqQN7YwDoEl1HkW/xPbF1ftf\nk3StpP1tDwagf3UeRb9U0mO2RzT/D8IfIuK5dscC0IQ6j6L/XfN7ggMYMjyTDUiMwIHECBxIjMCB\nxAgcSIzAgcQIHEiMwIHE2LpoAXbt2lVsrcxK/p0dPXq02FpdxBEcSIzAgcQIHEiMwIHECBxIjMCB\nxAgcSIzAgcQIHEisduDVtdHftM312IAhsZAj+F2SptsaBEDz6u5sMibpeknb2h0HQJPqHsEflHSP\npE9bnAVAw+psfHCDpEMRMdXjduxNBnRMnSP4Gkk32n5X0pOS1tp+/PM3Ym8yoHt6Bh4R90XEWEQs\nkbRJ0ksRcUvrkwHoG78HBxJb0BVdIuIVSa+0MgmAxnEEBxIjcCAxAgcSI3AgMQIHEiNwIDECBxIj\ncCCxod+6qOTWNCtXriy2VmkltxMq+ec4MTFRbK0u4ggOJEbgQGIEDiRG4EBiBA4kRuBAYgQOJEbg\nQGIEDiRW65ls1RVVP5J0UtIJrpwKDIeFPFX1hxHxQWuTAGgcp+hAYnUDD0l/sj1le7zNgQA0p+4p\n+g8iYs72tyXttL0/Il49/QZV+MQPdEitI3hEzFX/PSTpGUmrz3Abti4COqbO5oNft33hqfcl/UjS\n220PBqB/dU7RvyPpGdunbv/7iHih1akANKJn4BExI+n7BWYB0DB+TQYkRuBAYgQOJEbgQGIEDiRG\n4EBiBA4kRuBAYo6I5r+p3fw3/QLLli0rtZQmJyeLrSVJt99+e7G1br755mJrlfw7W7Uq70sjIsK9\nbsMRHEiMwIHECBxIjMCBxAgcSIzAgcQIHEiMwIHECBxIrFbgti+2vcP2ftvTtq9uezAA/at7XfTf\nSHohIn5q+zxJ57c4E4CG9Azc9kWSrpH0M0mKiOOSjrc7FoAm1DlFXyrpsKRHbb9pe1t1fXQAHVcn\n8HMkXSXpoYhYIekTSVs+fyPb47YnbZd9yRWAL1Qn8FlJsxHxevXxDs0H/xlsXQR0T8/AI+J9SQdt\nL68+tU7SvlanAtCIuo+i3ylpe/UI+oyk29obCUBTagUeEXskceoNDBmeyQYkRuBAYgQOJEbgQGIE\nDiRG4EBiBA4kRuBAYgQOJDb0e5OVND4+XnS9e++9t9haU1NTxdbauHFjsbUyY28y4EuOwIHECBxI\njMCBxAgcSIzAgcQIHEiMwIHECBxIrGfgtpfb3nPa2zHbd5cYDkB/el50MSLekXSlJNkekTQn6ZmW\n5wLQgIWeoq+T9K+I+HcbwwBoVt3rop+ySdITZ/qC7XFJZV+NAeD/qn0ErzY9uFHSxJm+ztZFQPcs\n5BT9Okm7I+I/bQ0DoFkLCXyzvuD0HEA31Qq82g/8WklPtzsOgCbV3ZvsE0nfbHkWAA3jmWxAYgQO\nJEbgQGIEDiRG4EBiBA4kRuBAYgQOJNbW1kWHJS30JaXfkvRB48N0Q9b7xv0anO9GxCW9btRK4GfD\n9mTWV6JlvW/cr+7jFB1IjMCBxLoU+NZBD9CirPeN+9VxnfkZHEDzunQEB9CwTgRue4Ptd2wfsL1l\n0PM0wfZi2y/b3md7r+27Bj1Tk2yP2H7T9nODnqVJti+2vcP2ftvTtq8e9Ez9GPgpenWt9X9q/oox\ns5LekLQ5IvYNdLA+2b5U0qURsdv2hZKmJP1k2O/XKbZ/IWmVpG9ExA2Dnqcpth+T9OeI2FZdaPT8\niPhw0HOdrS4cwVd
LOhARMxFxXNKTkm4a8Ex9i4j3ImJ39f5HkqYljQ52qmbYHpN0vaRtg56lSbYv\nknSNpIclKSKOD3PcUjcCH5V08LSPZ5UkhFNsL5G0QtLrg52kMQ9KukfSp4MepGFLJR2W9Gj148e2\n6nqEQ6sLgadm+wJJT0m6OyKODXqeftm+QdKhiJga9CwtOEfSVZIeiogVkj6RNNSPCXUh8DlJi0/7\neKz63NCzfa7m494eEVmuSLtG0o2239X8j1NrbT8+2JEaMytpNiJOnWnt0HzwQ6sLgb8h6XLbS6sH\nNTZJenbAM/XNtjX/s9x0RDww6HmaEhH3RcRYRCzR/N/VSxFxy4DHakREvC/poO3l1afWSRrqB0UX\nujdZ4yLihO07JL0oaUTSIxGxd8BjNWGNpFsl/cP2nupzv4qI5wc4E3q7U9L26mAzI+m2Ac/Tl4H/\nmgxAe7pwig6gJQQOJEbgQGIEDiRG4EBiBA4kRuBAYgQOJPY/qbaNczQ1iIEAAAAASUVORK5CYII=\n", "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "from sklearn import datasets\n", "import matplotlib.pyplot as plt\n", "\n", "digits = datasets.load_digits(n_class=10)\n", "X = digits.data\n", "y = digits.target\n", "n_samples, n_features = X.shape\n", "\n", "plt.figure()\n", "plt.imshow(np.reshape(X[0,:], (8,8)),cmap='gray') \n", "plt.show() " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question II.1.1 Building the graph (5pts)\n", "\n", "We will start by building the graph representing the data. For this, we will follow the steps below \n", "\n", "\n", "__1.__ Center the dataset as $\\mathbf{x}_i \\leftarrow \\mathbf{x}_i - \\mathbb{E}_i\\mathbf{x}_i$\n", "\n", "__2.__ Compute the matrix of pairwise distances between the centered images. You can do this either by hand, noting that $D(\\mathbf{x}_i, \\mathbf{x}_j) = \\|\\mathbf{x}_i\\|^2 + \\|\\mathbf{x}_j\\|^2 - 2\\langle \\mathbf{x}_i, \\mathbf{x}_j\\rangle$ or using a call to the 'pairwise_distances' function from scikit learn. \n", "\n", "__3.__ Once you have the distance matrix, obtain the matrix of scalar products by squaring the distances and applying double centering\n", "\n", "$$\\mathbf{S} = -\\frac{1}{2}(\\mathbf{I} - \\frac{1}{n}\\mathbf{1}\\mathbf{1}^T)\\mathbf{D}^2(\\mathbf{I} - \\frac{1}{n}\\mathbf{1}\\mathbf{1}^T)$$\n", "\n", "where $\\mathbf{1} = \\left[1,1,\\ldots,1\\right]$ is a vector of all ones and $\\mathbf{I}$ is the indentity matrix.\n", "\n", "__4.__ Compute the graph representation. The graph $\\mathcal{G} = (\\mathcal{V}, \\mathcal{E})$ si defined on a set of vertices $\\mathcal{V}$ and a set of edges between those vertices $\\mathcal{E}$. The set of vertices corresponds to the set of images in the original dataset. The set of edges will be defined according to the $K$-rule as explained below.\n", "\n", "We will represent the graph through its adjacency matrix $A$ where $A_{ij} = 1$ if we draw an edge between vertex $i$ and vertex $j$. To build this adjacency matrix, we will add an edge between image $\\mathbf{x}_i$ and image $\\mathbf{x}_j$ whenever $\\mathbf{x}_j$ is among the $K$ nearest neighbors of $\\mathbf{x}_i$. \n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question II.1.2 Computing the geodesic distances (5pts)\n", "\n", "__1.__ Once we have the graph representation of the data, we need to compute the shortest path between any two vertices in this graph (shortest geodesic distance between any two images). To do that, connect the vertices that were not connected by the K nearest neighbors approach, with an edge of sufficiently large weight (To avoid having to take huge values, you might want to normalize the distances (resp scalar product) for example by normalizing by the norm of the matrix). You should then have an adjacency matrix $\\mathbf{A}$ with $0$ on the diagonal and such that $A_{ij} = 1$ if the two images are connected and $A_{ij} = \\inf$ or some large number if they are not.\n", "\n", "\n", "__2.__ Let us denote the updated adjacency matrix as $\\tilde{\\mathbf{A}}$. From this matrix, we will now compute the shortest geodesic distance. 
These shortest geodesic distances can be computed through the Floyd-Warshall algorithm, as indicated below.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "\n", "\n", "# code the Floyd-Warshall algorithm as follows\n", "# for k = 1 to n\n", "# for i = 1 to n\n", "# for j = 1 to n\n", "# \\tilde{A}_{ij} \\leftarrow \\min(\\tilde{A}_{ij}, \\tilde{A}_{ik} + \\tilde{A}_{kj})\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question II.1.3 Low dimensional projection (2pts)\n", "\n", "To conclude, from the matrix of geodesic distances, compute the low dimensional representation. Do this by\n", "\n", "1. First getting the singular value decomposition $\\mathbf{S}_\\mathcal{G} = \\mathbf{U}\\mathbf{\\Lambda}\\mathbf{U}^T$ of the matrix of scalar products obtained by applying the double centering of Question II.1.1 to the squared geodesic distances. Define the projection as $\\mathbf{I_{P\\times N}}\\mathbf{\\Lambda}^{1/2}\\mathbf{U}^T$ with $P=2$ (that is, retain the first two rows of the matrix $\\mathbf{\\Lambda}^{1/2}\\mathbf{U}^T$).\n", "\n", "2. Represent each image $\\mathbf{x}_i$ by the $2$-tuple encoded in the $i^{th}$ column of $\\mathbf{I}_{2\\times N}\\mathbf{\\Lambda}^{1/2}\\mathbf{U}^T$. Display the result below.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import numpy as np\n", "import matplotlib.pyplot as plt\n", "\n", "\n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Exercise II.2. (K-means)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question II.2.1 (8pts)\n", "\n", "Now that we have a two-dimensional representation of the images, we will use a clustering algorithm to learn how to distinguish between the two digits. \n", "\n", "\n", "__1.__ Start by splitting the dataset into a training and a validation set (let us take $90\\%$ training and $10\\%$ validation). \n", "\n", "__2.__ Initialize the $K$-means algorithm with $2$ centroids located at random positions\n", "\n", "__3.__ Assign each point to its nearest centroid as \n", "\n", "$$\\mathcal{C}(\\mathbf{x}_i) \\leftarrow \\underset{k}{\\operatorname{argmin}} \\|\\mathbf{x}_i - \\mathbf{c}_{k}\\|^2$$\n", "\n", "__4.__ Update the centroids as \n", "\n", "$$\\mathbf{c}_k \\leftarrow \\frac{1}{N_k}\\sum_{\\ell\\in \\mathcal{C}_k}\\mathbf{x}_\\ell,\\quad k=1,2.$$\n", "\n", "__5.__ Make sure to properly treat empty clusters. If you end up with an empty cluster, restart the iterations by splitting the single cluster you have into two sub-clusters and define your new centroids as the centers of mass of those sub-clusters. \n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "import numpy as np\n", "\n", "\n", "# put your code here (e.g. iterate steps 3 and 4 in a while loop until the assignments no longer change)\n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "### Part III. Reinforcement learning (10pts)" ] }, { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "In this last exercise, we will tackle a simple reinforcement learning problem. Consider the map given below. There are 5 rooms + the garden. We would like to train an agent to get out of the house as quickly as possible. To set up the environment, we will consider 6 possible states (the room in which the agent is located) and 6 possible actions (moving from one room to any other room). \n", "\n", "The Q-table can thus be encoded by a $6$ by $6$ matrix. We will consider three types of rewards: impossible moves (e.g. 1 to 4) will be penalized with a reward of $-1$, possible moves will be associated with a $0$ reward, and any move leading to an escape (e.g. 2 to 6) will be rewarded with $100$. 
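The reward table can be encoded in the same way. A minimal sketch of the pattern (only a sketch: the actual $0$ and $100$ entries depend on the map below, and 0-based indexing with the garden as state 5 is an assumption):

```python
import numpy as np

R = -1 * np.ones((6, 6))   # start with every move marked impossible (reward -1)
# for each pair of connected rooms (s, sp) on the map:  R[s, sp] = 0
# for each room s with a door to the garden:            R[s, 5] = 100
```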
\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Map of the house (image not included)\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question III.1 (5pts)\n", "\n", "As a first approach, we will just run a couple of pure exploration iterations" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "done = False \n", "\n", "while not done: \n", " \n", " \n", " '''complete the lines below by sampling an action at random and updating the state of the environment\n", " until the variable done is set to True. Set this variable to True when the agent is able to escape the house'''\n", " \n", " \n", " \n", " \n", " \n", " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question III.2 (5pts)\n", "\n", "Now that you can explore, we will start to exploit, and we will do that through the use of a $Q$-table. In this case, as indicated in the statement of the exercise, the Q-table is 6x6. Train the agent by alternating between exploitation and exploration. \n", "\n", "Since we want to update the $Q$-table, we will now add a line of the form \n", "\n", "$$Q[s, a] \\leftarrow (1-\\alpha)Q[s,a] + \\alpha\\left(R[s,a] + \\gamma\\max_{a'}Q[s',a']\\right)$$\n", "\n", "When in the exploration framework, we will sample the action at random as in Question III.1. When in the exploitation framework however, we will simply choose the action that maximizes the entry in the $Q$-table for the particular state we are in. Hence we have $a^* = \\underset{a}{\\operatorname{argmax}} Q[s,a]$. \n", "\n", "\n", "Code this epsilon-greedy approach below. You can start with $\\epsilon = 0.8$. \n", "Take a sufficiently small learning rate (you can for example start with $\\alpha = 0.5$) and a relatively large discount factor $\\gamma=0.9$ (you can later change those values to see how they affect the learning).\n", "\n", "Once you are done with the algorithm, try a couple of different values for $\\epsilon$ and describe the evolution of the learning. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": true }, "outputs": [], "source": [ "done = False \n", "\n", "epsilon = 0 # e.g. 0.8\n", "gamma = 0 # e.g. 0.9\n", "alpha = 0 # e.g. 0.5\n", "\n", "NumEpisodes = 100 # number of training episodes, adjust as needed\n", "\n", "for episode in range(NumEpisodes):\n", "\n", " done = False \n", " \n", " while not done: \n", " \n", " \n", " '''Draw a number at random from the uniform distribution between 0 and 1''' \n", " \n", " \n", " '''If the number is less than epsilon, explore; if it is larger, exploit'''\n", " \n", " if randomDraw < epsilon:\n", " \n", " # exploration\n", " \n", " '''update the Q-table'''\n", " \n", " else:\n", " \n", " # exploitation\n", " \n", " '''update the Q-table'''\n", " \n", " \n", " " ] } ], "metadata": { "kernelspec": { "display_name": "Python 2", "language": "python", "name": "python2" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 2 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython2", "version": "2.7.13" } }, "nbformat": 4, "nbformat_minor": 2 }