{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Artificial Intelligence, Assignment 2\n", "\n", "## Search agents\n", "
\n", "\n", "__Total:__ 27pts + 5pts\n", "\n", "__Given date:__ Tuesday September 28\n", "\n", "__Due date:__ Friday October 15" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Question 1. A story of robots and batteries \n", "\n", "##### (15pts + 2pts)\n", "\n", "We consider the simple 12$\times$12 world depicted below. In this first exercise, we will study the behavior of an agent that can only see the immediately adjacent cells (that is, it only sees the cells that are directly in front of it, behind it, or to its left/right). Your agent is a simple robot that enters the maze from the bottom left cell and must reach the exit, which is located in the uppermost rightmost cell. \n", "\n", "The objective of the agent is twofold:\n", "\n", " - 1) It has to find the exit (in a first approach, we won't take any step cost into account) while avoiding all the holes.\n", "\n", " - 2) It has to collect all the batteries.\n", "\n", "##### Question 1.1. (5pts) A simple reflex agent \n", "\n", "Using a simple while loop, follow the ideas discussed during the recitations to code a simple reflex agent that achieves this objective. When the agent faces a pit, it should avoid it. When the agent is on a cell containing a battery, it should collect it. Finally, the agent can only move to the four cells immediately adjacent to its current position. When it sees no pit and there are no batteries in any of the adjacent cells, it should move at random. Consider adding on the order of 14 batteries and 18 holes (first manually, then at random).\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# put your solution here\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Question 1.2. (5pts) Search agent\n", "\n", "We will now assume that our agent has a map of the world. On top of the pits from above, the world now also contains walls, which are additional obstacles in the search for the exit.\n", "\n", "Solve the problem using Breadth First Search. The children of a node are given by the adjacent cells. Once you have the path to a battery, store it. Then continue BFS from the location of this battery and store your second path, and so on. Proceed like this until you have all the batteries. From the last battery, find the exit. \n", "

\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# put your code here\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Question 1.3. (5pts) Informed search agent\n", "\n", "In this third question, we will use an informed search strategy to improve our agent. We want to use as our heuristic the $\ell_1$ distance to the closest battery that has not yet been picked up. Code a Best First Search agent whose heuristic changes as it picks up new batteries. As soon as it has picked up the last battery, the heuristic becomes the $\ell_1$ distance to the exit. You can assume that the cells have unit side length. Also recall that the $\ell_1$ distance is given by $\|\boldsymbol x_1 - \boldsymbol x_2\|_1 = |x_{11} - x_{21}| + |x_{12} - x_{22}|$ where $\boldsymbol x_1 = (x_{11}, x_{12})$, $\boldsymbol x_{2} = (x_{21}, x_{22})$. \n", "
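As an illustration, the changing heuristic can be packaged in a small helper. The function names below are one possible choice, not required by the assignment:

```python
def l1_distance(x1, x2):
    # l1 (Manhattan) distance between two cells
    return abs(x1[0] - x2[0]) + abs(x1[1] - x2[1])

def heuristic(cell, remaining_batteries, exit_cell):
    """Distance to the closest battery not yet picked up, or to the
    exit once the list of remaining batteries is empty."""
    if remaining_batteries:
        return min(l1_distance(cell, b) for b in remaining_batteries)
    return l1_distance(cell, exit_cell)
```

Each time a battery is collected, remove it from `remaining_batteries` so that subsequent evaluations automatically target the next closest battery (and, eventually, the exit).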

\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# put your code here\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Bonus (2pts) \n", "Generate and display the movie of the search for each of the questions above." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# put your code here\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Question 2. Rook Jumping\n", "\n", "##### (12pts + 3pts)\n", "\n", "In this second question, we consider a \"rook jumping\" maze. An example of such a maze is given below (the starting position is shown in red and the goal position is shown in green). \n", "\n", "Each state in the maze has an associated jump number that gives the exact number of cells one may move horizontally or vertically in a straight line to change state. For example, in the maze above, the first move from (0,0) may be either 2 cells to the right or 2 cells down, to (2,0)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Question 2.1. Generate the maze (2pts)\n", "\n", "Start by completing the function Maze_generation, which takes as arguments the dimension of the maze as well as a maximum jump number (don't take it much larger than n/2), and returns a matrix of random integers between 0 and the maximum jump number. 
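One possible sketch of such a generator, under the assumption that non-goal cells carry jump numbers in 1..max_jump and the goal (bottom-right) cell carries 0 — one common convention for these mazes:

```python
import numpy as np

def Maze_generation(n, max_jump):
    """Return an n x n array of random jump numbers in 1..max_jump,
    with 0 on the goal (bottom-right) cell."""
    # np.random.randint uses a half-open upper bound, hence max_jump + 1
    maze = np.random.randint(1, max_jump + 1, size=(n, n))
    maze[n - 1, n - 1] = 0  # assumed convention: the goal needs no jump
    return maze
```

Adapt the range and the goal-cell treatment to whatever convention you settle on for the rest of the question.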
" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "\n", "def Maze_generation(n, max_jump):\n", "    \n", "    '''The function should return a random n x n maze of jump numbers'''\n", "    \n", "    # put your solution here\n", "    \n", "    return maze\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can use the lines below to display your maze." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import matplotlib.pyplot as plt\n", "\n", "fig, ax = plt.subplots()\n", "\n", "min_val, max_val = 0, 5\n", "\n", "intersection_matrix = np.random.randint(min_val, max_val, size=(max_val, max_val))\n", "\n", "ax.matshow(intersection_matrix, cmap=plt.cm.Blues)\n", "\n", "for i in range(max_val):\n", "    for j in range(max_val):\n", "        c = intersection_matrix[j, i]\n", "        ax.text(i, j, str(c), va='center', ha='center')\n", "\n", "ax.set_xticks([])\n", "ax.set_xticklabels([])\n", "ax.set_yticks([])\n", "ax.set_yticklabels([])\n", "plt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Question 2.2. (5pts) Maze Evaluation\n", "\n", "We now want to solve the maze (or, equivalently, make sure it has a solution).\n", "\n", "Using _Breadth First Search_ (start with a reasonably small maze, e.g. 5 by 5), compute the minimum distance (depth, in number of moves) to each cell from the start cell (take the start cell to be the _uppermost leftmost_ cell). For this, keep track of a list of the depth distances to each node and update this list each time you encounter the corresponding node. Once BFS has completed, return the minimum element from the list. \n", "\n", "Finally, return the negative of the minimum length of the path from the start to the goal, and a large positive number (e.g. 1e6) if there is no such path.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def Maze_Evaluation(maze):\n", "    \n", "    '''The function takes a maze (random n x n array of integers)\n", "    and should return the minimum \n", "    distance from the start node to the goal'''\n", "    \n", "    # put your solution here\n", "    \n", "    return dist_start2Goal\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Question 2.3. 
(5pts) Stochastic local search (Hill climbing)\n", "\n", "Now that we have a way of evaluating each maze, we will try to improve our maze with a stochastic local search. In this question, each state in our graph will encode a whole maze. Our local search algorithm works as follows:\n", "\n", "- For a random, non-goal cell, change the jump number to a different random legal jump number.\n", "\n", "- Re-evaluate the start-to-goal distance according to the _Maze_Evaluation_ function that you implemented in Question 2.2.\n", "\n", "- If the objective function has not increased, accept the change and store the new maze if its evaluation is the best seen so far. Otherwise, reject the change and revert the cell to its previous jump number.\n", "\n", "Perform a few iterations of Stochastic Hill Climbing and return the Rook Jumping Maze (RJM) with the best (minimum) objective value." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def Maze_improvement(maze_init, maxIter):\n", "    \n", "    '''The function takes an initial Rook Jumping Maze and \n", "    a maximum number of iterations as inputs and returns \n", "    the improvement of the original maze obtained \n", "    through maxIter iterations of Stochastic Hill Descent'''\n", "    \n", "    # put your code here\n", "    \n", "    return\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "##### Bonus (3pts) Random Restart\n", "\n", "One problem with pure hill descent is that stochastic local search may become trapped in local minima. A possible escape strategy is to restart the search periodically. Another way of viewing this is that we iteratively perform pure hill descent, starting each descent at a random state. The end result is the best result from all descents.\n", "\n", "Add Random Restart to your 'Maze_improvement' function." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def Maze_improvement_RR(maze_init, max_iter_SHD, max_iter_RR):\n", "    \n", "    '''The function takes an initial (random) Rook Jumping Maze and \n", "    two maximum numbers of iterations as inputs and evaluates \n", "    the improvement of the original maze obtained \n", "    through max_iter_SHD iterations of Stochastic Hill Descent. \n", "    This process is repeated max_iter_RR times \n", "    (with different random initializations) and each of the \n", "    obtained solutions is stored. The max_iter_RR solutions are finally compared in \n", "    terms of the cost between the start and the goal node, and \n", "    the best solution is returned as the final output'''\n", "    \n", "    # put your code here\n", "    \n", "    return\n" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.5" } }, "nbformat": 4, "nbformat_minor": 4 }