{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Introduction to neural networks\n", "In this notebook we will start with the basics of neural networks for tasks such as regression and classification." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Keras\n", "\n", "Keras is a high-level neural networks API, written in Python and capable of running on top of either TensorFlow or Theano, as well as other frameworks. It was developed with a focus on enabling fast experimentation.\n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Exercise 1 (Regression)\n", "\n", "The first talk we are going to solve using neural networks is Regression. This is a supervised machine learning task, where the goal is to approximate an underlying function based on data observations. For this exercise the underlying function will be \n", "$$ f: \\mathbb{R} \\rightarrow \\mathbb{R}$$\n", "$$ f(x) = 10\\sin(\\pi x^2) + 20 (x-0.5)^ 2 + 15 *x$$\n", "\n", "**a)** Construct a dataset by first generating $500$ uniformly distributed $x_i$-samples and then computing $y_i = f(x_i) + 5\\eta_i$ where $\\eta_i \\sim \\mathcal{N}(0,1)$. Create a plot with the data-points and the underlying function.\n" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfgAAAFACAYAAABQsW5nAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xl4VOX5//H3M5OAsitJ2JGqoCyyKCoQFnGp1qqtXVxa992q9VftYluX1q1Wq9Zq0W9t1boraltrXWoRFBIXQEBUcGcJhBB2iAiZmfv3x5lJZiaTZBJmy+Tzuq65ZubMc8555mTgPs/uzAwRERHJL75sZ0BERERSTwFeREQkDynAi4iI5CEFeBERkTykAC8iIpKHFOBFRETykAK8iIhIHlKAFxERyUMK8CIiInmoINsZ2BVFRUU2aNCgbGdDREQkY+bPn7/OzIqbS9emA/ygQYOYN29etrMhIiKSMc655cmkUxW9iIhIHlKAFxERyUMK8CIiInlIAV5ERCQPKcCLiIjkIQV4ERGRPKQALyIikocU4EVERPKQAryIiMiuCIWgqgrMsp2TGArwIiIirRUKwdSp0K8fTJgAwWC2c1QnbQHeOfeAc26tc+79qG1POecWhh/LnHMLw9sHOee2R312X7ryJSIikjLV1VBW5gX2t96CSZO8oJ8D0jkX/UPAPcDDkQ1mdnLktXPudmBzVPrPzGx0GvMjIiKSWiUlcPDBXnAHeOcdWLsWevfObr5IYwnezN4ANiT6zDnngJOAJ9J1fhERkbRzDubMgXHjvPehEJx8ck6U4rPVBj8JqDKzT6K2fc05t8A597pzblKW8iUiItIyfj889xwUFHgd7crLvar7LMtWgD+V2NJ7JTDQzMYAVwCPO+e6JdrROXeBc26ec25edQ5cQBEREXr39jrZFRR4zyUl2c5R5gO8c64A+A7wVGSbme0ws/Xh1/OBz4AhifY3s7+Y2VgzG1tc3Ox69yIiIunnHMycCRUVMGuW9z4QgPffz1p1fTo72TXmSGCpmVVENjjnioENZhZ0zu0NDAY+z0LeREREmhcKedXwJSXgHCtXruS2227jk08+Ydu2bWzbupUdixdzZyjE0d27w7p1Xuk+g9J2NufcE8BhQJFzrgK4zsz+BpxCw851k4HrnXMBIAhcZGYJO+iJiIhkVWTse3k5TJjAkxdeyIUXX8yWLVsaJB0KsHkzLF0KI0ZkNJvOcmzmnZYYO3aszZs3L9vZEBGR9qSqCvr3h0CA13w+vu4cwfAENyNHjqSoqIgunTsz8JVXuHvnTujeHTZsAF9qWsWdc/PNbGxz6bJRRS8iItJ2lZTAhAlYWRnXde5McMsWdtttN6ZPn85xxx1Xny4Q8Eruw4alLLi3hAK8iIhIS4Q71Lnqav6z226cceaZfPe73+W4Y4/1SvfhdnkKCjJeLR9NAV5ERKSlfD7o1YtuwD/+8Q+cWUy7PDNnZqXUHpPFrJ5dRESkjdm2bVvMe+ec16O+vNyrlm/nE92IiIi0ORs3bmTffffl4osvZu3atfUfhNvl2/VENyIiIm3Vn/70J6qqqrjvvvuYPXt2/QeJJrrJMrXBi4iIJOGrr75i2rRpABw4ciTfOfHE2AThdvlcoRK8iIhIEp544om6avkr3n8fN3VqTqwa1xgFeBERkWaYGXfeeScAfYDvh0I505muMQrwIiIijQmFoKqK12bMYPHixQBcOmgQHXKoM11j1AYvIiKSSNSc83/s5q1gvvvuu3PhO+94n0UmtMlRCvAiIiKJVFXB7Nl8bMYLG7z1z8444wx6tpGlylVFLyIikohzYEYNMCW86fLLL89mjlpEAV5ERCSRXr1g0iTG+P3MmjSJTz7+mKFDh2Y7V0lTFb2IiEgiznmT1lRXQ0kJ++Zwe3siCvAiIiKNybHJa1pCVfQiIiIJvPXWW3zjG9/gscceY/v27dnOTospwIuIiCTwyCOP8PLLL3PGGWewefPmbGenxRTgRURE4uzcuZOnnnoKgKOOOorevXtnOUctpwAvIiIS55VXXmH9+vUAnHbaaVnOTesowIuIiMR57LHHAOjUqRPf/va3s5yb1lGAFxERibJlyxb+9a9/AXDiiSfSpUuXLOeodRTgRUREojz33HN89dVXAPzwhz/Mcm5aTwFeREQkyqOPPgpAcXExRx11VJ
Zz03oK8CIiImGbNmygbOZMAE4tLKTA13bDZNpy7px7wDm31jn3ftS23zjnVjnnFoYfx0Z99kvn3KfOuY+cc0enK18iIiKN6VFby2rn+AtwflWVN01tG5XOW5OHgGMSbL/TzEaHHy8COOeGAacAw8P7THPO+dOYNxERkYZKStijtJTzCwoYUVrqrfneRqUtwJvZG8CGJJN/C3jSzHaY2RfAp8Ah6cqbiIhIjFDIW/8dYOZMqKjwFpppYwvMRMtG48Klzrn3wlX4e4S39QNWRqWpCG8TERFJr1AIpk6lvF8/qiZM8Lb16tWmgztkPsDfC+wDjAYqgdvD2xNdRUt0AOfcBc65ec65edVtuG1ERERyRHU1obIyTgkG6ffWW/zs0kuznaOUyGiAN7MqMwuaWQi4n/pq+ApgQFTS/sDqRo7xFzMba2Zji4uL05thERHJfyUlzBo2jJVAENh7+PBs5yglMhrgnXN9ot6eCER62D8PnOKc6+ic+xowGHgnk3kTEZF2yjn+PmoUAB2Akx97zKu2b+MK0nVg59wTwGFAkXOuArgOOMw5Nxqv+n0ZcCGAmX3gnHsa+BAIAJeYWTBdeRMREYnYtm0bzz73HAAnAHvOnesNj+vVK7sZ20VpC/BmdmqCzX9rIv1NwE3pyo+IiEgizz77LDVffgnAmT4fTJjQpofHRaQtwIuIiLQFDz30EAAlJSUcPX8+9OvX5nvQgwK8iIi0V6EQy+bPZ9asWYC3sExh//7ZzVMKtd1JdkVERForPPb9kXHj6jadeeaZWcxQ6qkELyIi7UsoBB9+COXlTAiF+LZzrBo1ilHhnvT5QgFeRETaj3DJnbIy6NKFI2pqOGLCBAL/+1+2c5ZyCvAiItJ+VFdDeTkEg7BtGyxcCMOHU5AHneriqQ1eRETaj5ISmDCBgN+PTZgAw4fnRY/5RBTgRUSk/XAOZs7k/ptuYuTGjdz/179SW1ub7VylhQK8iIi0KyHgTw89xPvvv8/NN9+Mz5efoTA/v5WIiEgjXn31VZYuXQrApZdeit/vz3KO0kMBXkRE8l8oBFVVYMZdd90FQOfOnTn33HOznLH0UYAXEZHcERWIU3rMqVOhf3+WHnIIL730EuBNbNOjR4/UnSfHKMCLiEhuiArEHHZY6pZsjQyNCwS4Yf58AJxzXHbZZak5fo5SgBcRkdwQFYgpL/fep0J4aNxSv58nwjUDJ510Evvvv39qjp+jFOBFRCQ3hAMxBQWxS7buarV9eGjcVUcdheGV3q+99tqUZTtXKcCLiEjWmBmLFi2KvKl7fLF9O1/W1KSs2t6c4+vHH0+PHj0444wzGDZsWOq+RI5ylsqODBk2duxYmzdvXrazISIiLREKQXU1Szds4KKLL2b27NmsmDuXfoWFcOCBEAgwBVjcowfn//CH/Oi++9grGPRK9hUV0KtXq09dXV2Nc46ioqLUfZ8Mc87NN7OxzaZTgBcRkYwJl8hnzJnDd51jczAIwM3ALwG6d2fRtm2MDm8Hr6r5287xq9GjOWj+/LydWjZZyQZ4VdGLiEjmVFfz2Jw5HBMK1QX3y5zj/Mjn27YxbO5cnn7qKSZOnAh4M889Z8YhixZx5U9/Sk1NTdKn+/e//83bb7+d2u/QRijAi4hIxry+ZAlnmREAOjrH4/vtx5/MKAKvZF5aSuHo0Xz/pJOYPXs27777LmeffTaFhYWEQiHuuOMODjjgAF5++eVmz/Xyyy9z6qmnMmnSJB544IF0f7WcowAvIiIZ8cUXX/Dd732PgBkdCgt55ZlnOPWzz7wPfT5YtAhmzYqpgh8zZgwPPPAACxYsYNy4cXXHefbZZxs9TyAQ4N577+W4446jpqaGQCBASXFxOr9aTlIbvIiIpF0oFGLq1Km88cYbADzk83FmaakXzMvLvWFxccE97gAE16xh2jPP8LtbbmH+/Pn06dOn7uOnnnqKdevWUVVVxYMPPkhFRQUAu/l8PAZ8Z+JEmDnTu5Fo45Jtgy/IRGZERKR9++STT1i8eDEAFzvHmaEQvPkmrFjhBd2SkiaDO4cdhr+8nMtKS7nw00/psHWrN6TOOWpqajj77LPZvn17zG4lRUU8t3EjpcFg/cQ5xcXec1PnyxNt/1ZGRERy3n777ceHH37IxRddxK3Rk9n07u0F3bVrG5/IprISZs+GYBDeeIMOhx8eMy5+1qxZMcH9gAMO4P777+eLZcsoLS2tP1dRUXqmws1RqqIXEZHUCo9zryslR2aicw569oQlS7yg3ru3F9SnTq2vpk9Ujb54MYwc2fA8UePi169fTzAYpGPHjnTv3j1xXtau9YJ7IJCSMfXZomFyIiKSefEzzwUC3nPfvtCnD+y+O4waBUOHeiXyZOafHz4cunSJ3RY3nW3Pnj0pKSmJDe7g3Sz06uXdXDQ2FW6eSluAd8494Jxb65x7P2rbbc65pc6595xz/3DO9QhvH+Sc2+6cWxh+3JeufImISBrFBezLzzqL386Zw+bI55EJbDZvhqVLkwu6Ph9s2AAHHeSlmzQJVq5stlNeg/nrw3PSU1HR9L55Im1V9M65ycA24GEzGxHe9nXgNTMLOOd+D2Bmv3DODQJeiKRLlqroRURyjJlXYi8v5+PRoxn67ruEQiFOBx4G8Pu9IN+9uxe0fb6GVfqNaUm65qr927CsV9Gb2RvAhrht/zWzQPjtW0D/dJ1fRESyIKqUfP1++xEKhXDO8fPXXvM6y23f7rWpR4I7xFajN6WpdNEl9nQtO9vGZPOW5hzgpaj3X3POLXDOve6cm9TYTs65C5xz85xz86rb6R9NRCSn+Xws2bCBxx9/HPDWXh8xdarXqa6wEEaMaH2JOlHV+86dcMghXrv/lCleDUE7amtvTFYCvHPu10AAeCy8qRIYaGZjgCuAx51z3RLta2Z/MbOxZja2uB3OTCQikpPiAu9vf/tbzAznHNddd13qzhHfgW/VKm/42/z53vvZs2Gvvbx8rFjRLtraG5PxAO+cOxM4DvihhTsAmNkOM1sffj0f+AwYkum8iYhIK8QF3sWLFvH0008D8IMf/IChQ4em5jzRVe9lZTB5MgwcCFu3xqYLBLxJdHy+dhvcIcMB3jl3DPAL4AQz+zJqe7Fzzh9+vTcwGPg8k3kTEZFWqqqKafO+4ZprMDN8Ph/XXntt6s4T3eP+4INh7tzYyWq6dPF62LfzqvmItE1V65x7AjgMKHLOVQDX4S332xF41Xl3VW+Z2UXAZOB651wACAIXmdmGhAcWEZHcEQrBySd7wd05lo8Zw7P//jcAPywqYsi++6buXJEOfJEpZyM95ceNg2nTvPHy0G6mom2OZrITEZHWq6qqnx3O76fsr3/lrLPP5lNgATB69Wpvgpt0SHbYXJ7J+jA5ERFpB0pKYPx4b3z7hAmUHn00HwHlwGhIb+BNdnhdO6UALyIirRde0a3uUVSEb9w4xvv9Xnt4G5zrPV8owIuISOuEQvDhh7E926dM8Tq/HXyw116u0nXWKMCLiEjLRYbGjR4NnTvznt/PD3v0oPydd7BgEObNg3Xrsp3Ldk0BX
kREWi4yJj0YhG3bmPad7/D4+vVMDAapCLfHt/dhatmmAC8iIi0XNSZ986GH8uiLLwLwzW9+kwGrVrXrGeRyhQK8iIi0XNSiMo+ccgo1NTUA/OiSS9SzPUcowIuISOv4fFhJCdPuvReAvffem6OPPjrLmZIIBXgREWm1WbNmsWTJEgAuPu00fCq55wwFeBERabVp06YB0NE5zr7pJm+Vt+j54SVrFOBFRKRVVq9ezT/+8Q8ATgF6BoNez/rq6uxmTAAFeBERaaUZM2YQCpfWfzRqlFZxyzFpW01ORETy2+mnn86ECRP417/+xcGXX+5NbNPOFn7JZVpNTkREpA3RanIiIiLtmAK8iIi0yJIlS3jwwQfZvn17trMiTVCAFxGR5IVC3HnzzZxzzjkMGjSIbdu2ZTtH0gh1shMRkeSEQmyYOJFH33wTgHGHHkqXmhro3Fkd63KQSvAiIpKc6moeePttIhXzl336KfTvr8ltcpQCvIiIJCW4xx78ubAQgKG7784RH38MgYAmt8lRCvAiIuKVwKuqoLGh06EQ/znwQJbt2AHApVdfjSst1eQ2OUwBXkQkl4VCUFkJa9Y0HnxTcY6pU5uubq+u5o8ffABAN+CMa6/18rNihdZ+z1EK8CIiuSoU8gJu377eI11t3dXVXjV7E9Xtb3/2GTPDr88DugSD8Oab4PMpuOcoBXgRkVwVCbzglZbT1dZdUuJVszdR3X7vXXcBUAj8BFQ13wYowIuI5KpI4AWvlJyugOoczJwJFRWNVrff99BD3LvvvvzSOfpPmgQrV6pqPseldRy8c+4B4DhgrZmNCG/bE3gKGAQsA04ys43OOQfcBRwLfAmcZWbvpjN/IiI5zTkviFZVea979UpfQPX5vONHC4W8GoOSEnbbfXcu+uijuvcK7Lkv3SX4h4Bj4rZdBcwws8HAjPB7gG8Ag8OPC4B705w3EZHcFenV7hz06QO9e2c2qCbqeBe5CVBwbxPSGuDN7A1gQ9zmbwF/D7/+O/DtqO0Pm+ctoIdzrk868ycikpOS6dWeblVVUF7O7YEAn86ZAx9+mL5e/JIW2WiD72VmlQDh50iDUj9gZVS6ivC2GM65C5xz85xz86o1sYKI5KMkerU30Nw49pYIheDkkykLBPgpMDQU4rFRozRjXRuTS53sEtX5NPilmtlfzGysmY0tLi7OQLZERDIsiV7tMeJL/IHArgX76mqsvJxfhN8WAFNCIc1Y18ZkI8BXRarew89rw9srgAFR6foDqzOcNxGR7EuiV3uM6BJ/WRlMnuwF+ylTvElyWhroS0p4evBgysJvf9y/P/01LK7NyUaAfx44M/z6TOBfUdvPcJ5xwOZIVb6ISLsR3bku2Q5t0SX+gw+GuXO9YD97NgwcmFzVelQV/4qVK7lozRoAioqKuGrhwuRvNiRnpDXAO+eeAN4E9nPOVTjnzgVuAY5yzn0CHBV+D/Ai8DnwKXA/8KN05k1EJOdEVbVvnjiRO++4g2eeeSYmSU1NDbNnz2b79u31G6NL/GVlXrD3+73tybTjR503MGUKP/jBD9i0aRMA9913H3v07Kne822QszbcK3Ls2LE2b968bGdDRCQ1qqqo6dePe4JBbsUbgnTkkUfy6quv1iX573//y9FHH01hYSEHHngghx12GJdddhn9+vWrH7deVOQ9n3yyF9wnTGi69F1V5VXpBwJc6xw3hOPChRdeyH333Zf2ry0t45ybb2Zjm0uXS53sRETatdc/+IAhPh9XUT++eNXy5YSCwbo05eGpa2tra3n77bf5/e9/z7777ssvfv5zNkyc6AXqww/3qu2TbccPV/HP9vu5MRzcR4wYwZ133pmeLyoZoRK8iEg2RM0Sh3OUzZ7NkYcdxlfhtvLhw4ZxA/Dtjz7ylmWdORN8Pqqqqpg9ezZlZWXMnj2b+fPn1x2yB/AL4Md+P51WrWo4M10z+dny+ef86De/4bnnnmPu3LkMHzpUM9floGRL8ArwIiKZFmnzDlefL5k2jdKJE9m4aRN+4B6fj/PffRf/2LFeG3pBgVcSTxCwZ8+ezVVXXVVXsgf4xYAB3LJ8edNBOe4GI7LN1q7l0y1bGLzvvjF5jNxgSPapil5EJFdFDWtbXVbGMUcfzcZwp7a/+nxcNHEi/gMOaH4sfCjEpCFDmDN7Ns8//zwjRoygc6dO/OSdd2KC+7Zt2xrsx9SprO3Xj/P69mXN6tV129yAAQw+//y6mexaNNmO5JS0LjYjIiIJRIa1lZezY8wYOmzcCMCNN9zAWeefX1+qnjmz8SryqFoAN2ECx8+cybHHHsv7779Pr96965KtW7eOQYMGMXXqVCZMmEDfvn3ZuGIFn8yezRNmbFyzhi0XXcTT998fG9Ajq9dFSvAa/97mKMCLiGRaVPD+WkkJ5evW8fDDD3PFFVfEBvJEK7xFJJjO1t+rF6NGjYpJ9sQTT1BTU8MLL7zACy+8kPBQgYICdnTvTsfx471jjR/vnbepGwzJeaqiFxHJhqiV2YqLi7nyyitxLQmiSU5nO2HCBM455xz69+8fs71Dhw5MLS3lH889x3PPPUfHggKorfUCuXPe7HdaPa5NUyc7EZEM++Mf/0i3bt0455xzdu1AjXSUS1jqDoXY/NlnVAM99tiDPffcE1+k01woBKWl8NZb3vsmOvVJ9iXbyU5V9CIiGfTFF1/wq1/9iu3bt/Pll19y6aWXtv5g8VX40b3zx4+Hp57y1pE3g6lT6V5eTvdEPeKrq73pbSMOPlht7nlAVfQiIhliZvzoRz+qm2Z25MiRqT1BdLt89Dz0lZWxi9HEr+1eUuKV4P1+GDcO5sxRtXweaDTAO+dedM4NylxWRETySIL12adPn87LL78MwDnnnMPkyZOT2i9pkXb56Hnoy8rge9/zXgN06QJjxsQuQBPp9LdqlXcjoPHueaGpv+JDwH+dc792zhVmKD8iIm1f/PrsoRCbNm3i8ssvB7wV2m699dak9muR6EVnJk2qX10u0lfJ54OamsRj29WhLu802gZvZk875/4DXAvMc849AoSiPr8jA/kTEWl7Egxh++VvfsOa8BKsd9xxBz179kxqvxZ3dPP5vHb3yBC34uLYdnnnNLa9nWiuk10tUAN0BLoSFeBFRKQRURPZMGECiyor+b//+z8AjjjiCE477bSk9tulABzdAS96PLtZ7Ou1azXOPU81GuCdc8cAdwDPAwea2ZcZy5WISFsWNwvdz44+GjPD7/dzzz33ND7evbnZ61orOtg7572Omw9fc83nn6b+mr8Gvm9mVym4i4i0UDioBkMhxo0bx+67786FF17I/vvvn9R+rQruLemgl6g5QPKKJroREcmAVatW0bFjR4qKitJzgpaWyM28jnyR9M2tGS85QxPdiIjkkH79+qX3BC3toJeu5gDJGWpwERFJg2AwyKbwErAZkeTc9DE0NC6vKcCLiKTB448/zqBBg7jhhhuoqalJ/wmjx8Crul1QgBcRSblAIMBvf/tbNm/ezLRp03Br10Iw2PoZ6pKlErlEUYAXEUmxhx9+mM8++wyAX3bu
TKfBg6Fnz9bPUCfSCgrwIiIptHPnTq6//noA+vXpwwXLlnml982bNSRNMkoBXkQkhR78299Yvnw5AL/u1IndIqu0de/esg5wIrtIw+RERFLkq6++4sYbbwRgL+DcZcu8ZVt9PigqgnXrNCRNMibjAd45tx/wVNSmvfEWtOkBnA9E6q5+ZWYvZjh7IiKt9te//pWK1asBuMbno0NpqbfwSySgt3ThGJFdkNWZ7JxzfmAVcChwNrDNzP6Q7P6ayU5EcsX27dvZZ599qKysZJ999mHJrFkU9uun0rqkXLIz2WW7Df4I4DMzW57lfIiI7JIOHTpw++23M2TIEK699loK+/dXcJesynYJ/gHgXTO7xzn3G+AsYAswD7jSzDYm2OcC4AKAgQMHHhTpzCIikgsCgQDOOfx+f7azInkq50vwzrkOwAnA9PCme4F9gNFAJXB7ov3M7C9mNtbMxhYXF2ckryIiySooKFBwl5yQzSr6b+CV3qsAzKzKzIJmFgLuBw7JYt5ERJKydetWFixYkO1siDSQzQB/KvBE5I1zrk/UZycC72c8RyIiLXTXXXdx4IEH8p3vfIfNmzdnOzsidbIyDt451wk4CrgwavOtzrnRgAHL4j4TEck5mzZt4vbbvdbEzz//nK5du2Y5RyL1shLgzexLoGfcttOzkRcRkda6884765aEvf766/H5sj0wSaSefo0iIq2wfv167rzzTgAOPvhgjj/++CznSCSWAryISDJCoZjlXv/whz+wdetWwCu9O415lxyjAC8i0pxQCKZOrVvude2aNdx9990AjB8/nqOPPjrLGRRpSIvNiIg0JhTylnY185Z5DS/3+purrqKmpgaAG264QaV3yUkqwYtI/oirRt/lY0VK7Sed5C3zWlBA9dixPPDkkwAcPXUqRxx++K6fSyQNFOBFJD/EVaMTCu3a8aqr60vtb74JTz4JFRUUl5ezYP58TujZk1vfeCM15xJJAwV4EcmsVJayo0UH5PJy7/2uKCmpK7UzYYK37GuvXuAcQ4uK+NfmzYwMBlNzLpE0UIAXkcxJdSk7WnxALinZteM5BzNnQkUFzJoVuzJcqs8lkgZZXU1uV2k9eJE2pqrKC+6BgBccV6wAn88LkKnoqBbpFJeq40WZNm0aABdeeKG3mEwazyXSlJxfTU5E2qHoku/48XDKKaktzft8ddXoqbRy5Up+/vOfc8kll3D66aen9VwiqaJhciKSOZFq78jQswEDYtvMe/WKTR9pr3eu6WCaxtK0mXH++efXDYs775xzvDyp5C45TiV4EcmsSMm3V6+m27F37oSDD4a+fb1HY6X8dLTrR3UEvPfee3nllVcAOP200zj8hhvS04dAJMXUBi8iqdPSknRj6QMB2HNPCE8FC3g3AhUVDUv58e36idK09DtMnQrl5Xw8ejSjP/iA7du3M2DAABbPmEH3YcNSdy6RVlAbvIhkVipL0kuXxgZ38NrszRoOr0t1j/bwcLtAIMDp8+axfft2AP7+97/Tfd991Xte2gwFeBFJjZaOQ2/qhmDYMOje3XvdpQssXw61tV6bfSRtpBodGh/O1holJTB+PL8E3glv+snllzN12LDUn0skjVRFLyKpYeYF3/Jyr3TbRAD88ssv6bR1a0zV+mknnMD6jRvZe//92Xuffdh7r70YUlDAsG9+Ezd5Mrz1lrdzZHjdKafUn2vmTK9tP0UW/e9/jD7qKABGAHMPPZTd5s3z+gTMmQN+f8rOJdJSyVbRK8CLSOo00wa/bNkyrrnmGqZPn857CxYw5NxzYe5cGD+eIxY6+5wbAAAgAElEQVQv5rVNmxrsM6BvX06orOT7ZkwG3Pjx8Nxz9T3wo8fTFxXBunX1529Jn4DotMDTw4fz0yVLmDV6NHsvXgzBoJdu/HgvyKfwhkKkJdQGLyKZ18jY8I0bN/LTn/6U/fbbj0cffZQdO3bwz29+E955B8aOhSef5PAtWxgDdIs75MrVq/mzGYcBv+rb1wuu0T3wI+Pp+/WDnj3rq/wDgeT7BMQ3F5hx0vvv88ny5ew9f75Xco+YO1dT00qboAAvImmzY8cO7rjjDvbZZx9uv/12du7cCcB3v/lNJqxY4ZWK586F9ev59cSJvFtQwKZJk1j/4YfM9fu5GzjSOQoKvCk7vvvPf3o3EeHx9OsWLYKnn/aq6oNB2Ly5vg/A0qXJ9wmoquJfb7zBp4EAzJ7tte37fHQcONA735w53o2EOtdJG6IALyKpEzV+fMGCBQwbNowrr7ySjRs3AjBp0iTeKi/nmb/+lYmlpV5bdpcucOCBXhv+ihW4119nz/33Z2xpKZcWFPDqpElUVlTw4B//yNix9bWSQTPGHnssh55wAo8UFrIDvONFgvCwYUn1eF+/fj0/vOQSvg0cBnxq1rA63+/3grw610lbYmZt9nHQQQeZiOSIYNBs8mSzggKbN2aMderUyQADbL/99rN//vOfFgoE6tLYpElmixZ5r8F7XrMm9nhr1phF7zN5stmOHWaLF9u//vGPuuMDVgz2K7Blr7xiFgrFHiPyPs706dOtpKSk7hjdwf4zfHij6UVyATDPkoiRWQ/Su/JQgBfJEcGg2eLFZn6/GdhOn88OnzjRfD6f3XLLLbZz504v3Zo1sQG9sjI2eCcKrNH7+HxmXbuaga3u2tWu+fWvYwJ05DFixAi78sor7dVXXzWrrfXyFgza5s2b7d1337Vp99xjk8eNi9nn+OOOs1ULFyq4S85TgBeRzIiU3H2+ugBvXbvahjVr7L9PPRUbMEOhhgG9mVJ2zD4HHmhWP92N2eLF9tVXX9ljjzxi48eObRDox48cada9u5e2e3f77XXXNUjTs2dPe/zxxy2kwC5tRLIBXsPkRGTXVFUR6tePZ4JBvg/UtU537Qrbtzccp96ahWECAa/T3P77e0PhNm/2JsLZsCFmuNqCBQt4/vnneeXll3n7rbe4BvhN1GEevvFGzrz6agD6AD90jp++9x69RozYpUsgkkkaBy8iGWGhEJf278+0ykouLizkntra2N67uzpne9Tc8EyYAK++Ch9/7HWia2ws+vvvs+mAAwgARVGbVx5yCG9feSX7/v73jFy0CF9pqTrNSZuTbIDP2nKxzrllwFYgCATMbKxzbk/gKWAQsAw4ycw2ZiuPItK8391yC9MqKwGYMWgQmzp1Ys/33/d6x9fU7PqwsvgpcDduhOZK3MOG0aN7d6+k36kT7NgBwSAD3n2XAVOmwPe+l7blZUVyRbaHyU01s9FRdyJXATPMbDAwI/xeRHLUE088wa9//WsA9tprL17bc08vuB98sBdAUzGsrDWLyfh83ox2ixbBQQd5Y+Sdq9+/kQl5RPJJtgN8vG8Bfw+//jvw7SzmRUQaEwox5/nnOeusswDo3r07Lz78MP3mz/eC6bx5Xvt4KoJoeFKbFt8s+Hze4803698/9ZTXPa+qystneMy+SD7KZoA34L/OufnOuQvC23qZWSVA+FnTRYnkmlCIT8aN49vf+hY7d+6koKCAZ599lmGlpdC5s5emc2evM1yqtLTEHWm
3Hz3ay0tBAZSWQnGxtz1+WttdWdpWJEdlrQ0eKDWz1c65EuBV59zSZHYK3wxcADBw4MB05k9EEtjy+eccP3cu68Pv/+/WWzniiCO80vC2bd7Gbdu8KvLWdqzbVZF2+2DQW1d+xgyYPDl2++bNXtrINLbZyqtImmStBG9mq8PPa4F/AIcAVc65PgDh57UJ9vuLmY01s7HFxcWZzLKIAI+8/DIfhV//csAAzvl//897U1LilZIjpeVsztceabf3+73hekcd5ZXci4rqt3fr5j2PH6+55SUvZSXAO+c6O+e6Rl4DXwfeB54HzgwnOxP4VzbyJyKN+9Ell/Dne+7hxGOP5cYvvqivNm9tW3k6RPKycKHXkz/SA3/dOm/7ypUwapSXzjm1w0teyso4eOfc3nildvCaCR43s5uccz2Bp4GBwArg+2a2obHjaBy8SPaYGS7Xe6GbeW3skTH0kRuPqiqv/T2ynvyujNMXybCcHgdvZp8DoxJsXw8ckfkciUhC4Vnnajp3pqCwkI4dO9Z9lPPBHbwA/+STXlCP7qQXqcKPBH5V0UseyrVhciKSKyI90fv359J992XcuHF89NFHze+XKyL5HzgQTj45tho+l5oTRNJEAV5EEgv3OH88EOChqioWLlzIz372s8Rpo9aBzxnxM+BVV8d+rsluJM8pwItIYiUlfD5mDBeF3xYVFXHfffc1TBdV0s+pMeWtmQFPJI9kcxy8iOSw2kCAU51ja/j9gw8+SN++fRsmTFRSzoUOa5FqeM05L+2USvAiktB1113HO++8A8CPf/xjjjvuOO+D+Or4XC4pqxpe2jEFeBFpYMaMGdxyyy0AjBo1it///vfeB4mq49VhTSQnKcCLSIzq6mpOP/10zIxOnTrx5JNPsttuu0U+TNxxrbGSci52vhNpJxTgRSTGypUr68a4/+lPf2L//fev/7Al1fG52vlOpJ3Iykx2qaKZ7ETSIBRi3dKlPPzSS/zkiisaTmgTnvym2Y5rmi1OJC2SnclOJXgRqRcudReNGsUVzz+PS1QASLbjWi53vhNpBxTgRYSamhpmRoaUNTU5TEuo851IVinAi7RHcZ3ffvKTn3D44Yfz01tvpXbcuNSVujVMTSRrNNGNSHsT6fwWXmjlmUsu4f777wdgbnk5vtdfh40bNTmMSBunAC/S3kRVwy8vK+P8RYsA2KOggEfnzsV/1FFe1bqCu0ibpip6kfamqAg6d6YW+AGwafNmAP4WCjEgGNz1tncRyQkK8CLtzZo1sHUrVwPlwSAAF114ISdOnKge7yJ5RFX0Iu1JIADDhvGfUIhbw5tGjRrFHXfeCR07amEWkTyiAC/SnixdyqotWzgj/LZLp05Mnz6d3Xff3dugiWhE8oaq6KV9ae9zow8bRkm3blwQfnv//fczePDgrGZJRNJDJXhpP+KGhzFzpjdOuz3x+Shcv57fLV3KD0MhRowcme0ciUiaKMBLyyU7F3muSTRLWzuqkjYzb+rZ9eth+HBGtKW/nYi0WDsrvsgua8srhDU2N3o7qLb/5z//ydSpU6kuLW2bfzsRaTGtJictkwsrhLWmBiE6iEdPn5qtavsM1oIsXLiQiRMnUlNTw0hgAeCL/O2Ki9tmbYxIO6bV5CQ9MrFCWFMl6tbUIIRCMGUK9O3r7XfSSd5YcLPULq6SrEzUgoRCUFnJmvfe44QTTqCmpgbnHDcNH+4F9/HjIRhsu7UxItIsBXhpmXSvENZc8GtNQK6qgjlzvNdmMHs2DBzoHb+oKPaGpago/dX16b6pCIXgsMPY3rcv3x41ipUrVwJw2223cdx778GKFd7fbeBA71pk8uZGRDIm4wHeOTfAOTfTObfEOfeBc+7y8PbfOOdWOecWhh/HZjpvkqRUrBDWWCm9ueDX0hqEUAjWrWu4PXL8devqb1heew0OP7z5Eu2uttmnoxYkOk9VVYTKyjgXeDv88bk/+AFXXHGF97fz+bzvHgx66TV7nUheykYJPgBcaWZDgXHAJc65YeHP7jSz0eHHi1nImyRjVwNcU6X05oJfS2oQIucZMwa6dQO/Hw49FCZN8oLcQQd5bdCRG5Z165ovWaeiej3VtSDReZoyBTvpJC4NhXgi/PHk7t2Z9sADuMh5oq/xpEmwcqXWaxfJQxkP8GZWaWbvhl9vBZYA/TKdD2mlVAS4pkrpyQS/ZGsQIucJBqGmBhYuhLKy+u/xzjved4l8h2RK1lVV9dXas2d771sjleukx13PX5WVcW/4o/2BZ4cOpUNhYX366Gv8+uvQu7eCu0geymobvHNuEDCG+prES51z7znnHnDO7ZG1jEnjUtF+3FwgTVXwiz5PaSkMH15fSgevBqKsDNau9d4nc3PhXH3NhVl2AmN8DUrc9Txu5Ei6AvsCM4CiefMa/p1SeYMhIjkpawHeOdcFeBb4f2a2BbgX2AcYDVQCtzey3wXOuXnOuXnV6hSUealoP25pFXVrmwQSnSeS/+hjn3xyfSm+ucDXq5dXre33e8/ZGCIYX4MS/T1nzqT0pZd47aWXmHHIIfRV+7pIu5WVcfDOuULgBeAVM7sjweeDgBfMbERTx9E4+DRrbKx2S7e39PjxaaZOJVRWRvXBB7PyrrtYuWoVFRUVbN26lZqaGr788ku+/PJL+vfvzzXXXFO369q1a3nttdfo3r073bp1o1+fPgzYbTf8vXrBBx94bfAtHc8fudlwLjsl4ATzEGzs0IGlH37I+L33hlNOqR/TP2OGN2udxriL5JVkx8FjZhl9AA54GPhj3PY+Ua9/AjzZ3LEOOuggkzQJBs0mTzYrKPCeg8Hm00+aZOb3e8/JpE9w/FAoZB9//LFVVFR46dasMSsosEvBaOZx+OGHx5xixowZDdIUgg3ZfXc7/rjj7JqBA+1Zn892TJxoFgp5eVizxiwQ8J5DoebzHNknkjb+fXPXINm0EaFQTB7K5syxgQMH2u4+n5X7fGbOmYH3+Zo1yR9XRNoMYJ4lE2+TSZTKBzAx/J/te8DC8ONY4BFgcXj789EBv7GHAnwahQNr0sFi9WovbeSxenVSx98B9pbfb3+47jo78cQTrbi42AC7/vrrvXThgPY7ny9hUPf5fNa1a1fr1auXnXfeeTGneO6555q9KejQoYPt3LGjPnj7/fZm5872ld/f8MYm/pqsXh0b8Gtrk78paukNVNy+wdWr7eabbjK/31/3Xc6OXPvIMVty4yAibUayAT7ji82Y2Ry8Unw8DYvLJZG26kh1b3NtuM7Vd0CLvG7E8uXLeeHf/+aFrl15feNGtgeD8NvfxqRZvHhx/XFnzuTrM2bQ6cMPGTBwIAMHDqR/377sEQhQ2K8fzuerr+6P6vh21FFH8cEHH7BlyxY2bdzI8h//mE8/+4xP99yTD3v04JPPP2fEiBEUdujgVX2Xl7MiGGR8TQ0dgYmzZ/P1667jqO98h1GjRuGLvybOxXY4XLo0+cVsqqu9Dn7BoPdcXZ142tgEzRhV1dWcfuaZvPrqqwB06NCBP/h8XPrVV95wwCVLoE8fVcuLtHfJ3AXk6kMl+DRrSR
VyXNVxwn3Cxzv/vPMSlqb79etnJ598st199922aNGipvPVmpJzMBhT6t5aWmqfffJJTP4fcy5h3oqLi+3UU0+1B//2N6tYsMBLH/2dJ01qWKJv6roFAmbdu3sl7u7dzXbubNjEEfc9a3fssGnTpllRUVFdvgYPHmzz//tfbz/wnlU1L5LXyNUq+lQ+2kWAb007bbZE5TUUCllZWZlddNFFdvfdd8cEq38OG2aA9e3b184991x79NFHbdmyZRZq7jtGjl9ZGVtVvnhx080J0dcwuprd7/f2jZy3ttbWjh1rTzlnF5WU2D677ZYw2O+xxx4WjNxERG4aIsF54kTvfXPfJb66f9Gihk0cUWmCfr8dNHJkTD5OO+0027JlS3I3VyKSNxTg88GutNOmIy/N3WgEg/ZRWZldc/XVtvfee9cFoqFDh1ooKijX+P224H//az6gx58/urQcf10aC3DxnwUCdW3t1r177DHig39BgX0Gdp/PZ9/55jetR48eBtj3vve9mKzNf/llGwN2JdgLYJs/+qj56xcflFevru8g55yFVq+2mm3bYtJcdOGFBtigQYNs+vTpLf/7iEheUIDPBy3t6NYaSQbupm40qqqq7E933WWHdO3aoLTrB/vGMcfYls2bd62UGRcAraIiud7ria5hMJi41B9f5R6X39raWnvzzTdt/vz5Mae49eqrY76zc86GDBliJ510kt188832n3//25Ydckhsk0JUb/1QMGgbN2yw9w46yJ5yzi7p1csGDBhgl112Wcz3+vjjj+22226z7TU1CuYi7ZgCfD5IZ9VrXHt0g8AdHTBXr270RmPu3LkxPbkjj7Fgd4GtiW4T3pUhZZWVFlOFXVmZ3Pds7Bo2tj06T0nm7+933WXjwjcziar0ATsycnPi95uNH2+v+nx2QOfONnjwYOvSpUvCffbaa6+GtRy5VKsjIlmhAJ8v0lH1Ghmz3ti46fjq8EmTzMACYK+PHGmhqKBSW1trJSUlXtVxx452tXO25KCDkutwl2zHuEipO7oTWkur9xsb156KaxsKmU2aZJt9Pnth+HC75uqr7fjjj7cBAwbUBetziou9vB94oFlBgT3cxM1AQfiG4K4bb7SdO3fG5re5/gYikveSDfBZmckuVTSTXQLJzA5XWQl9+9a/j0xnGpnONWq2NPP5WOQcjwSDPIE3h/CCBQsYPXp03e7Tp0+nb9++TBg3DrdunXdus6bzkWBGtgZDyiLTsiYzM1tLZ9FLtcj5i4q8+e7D+VhfXc3iI4+kx+LFjO7SBb78Erp04dWtW5m2xx50PPJISkpKGDBgAP379WPv225j5OLF7F5aGjuNb+RalJVBly7e4jnRfzMRaTdydia7VD6yUoLP5c5MyZaKKyvrS+5g9t57sd8nFLLlhxxiv/P5bHinTg1KmFdeeWXD8zZ1TRJ9nkzzQ7J9EHKl2jpRPqK/Q+QR33s//hjN9SVoan8RyXuoij4NcimQJNuhLJEmguuTTz5pU6ZMSVh1PHnyZLv//vttw4YNsXlp6po09XlzNwbJ9kFId2fEZG/qEuUj8h3ie+23NDhrKJyIhCnAp0MmerU3p6mA2ZIgUFvrlQLjAvLll18eE9SHDh1qN998sy1btizxcZq7Jrt6zZIJrqkOfvEd7ZK9qWuu415jc9w3de7WdkoUkbylAJ8O6SpFNfUfd/xnzQXMJHqBB3butNdGjrQLnLMb9tqrPmgFgzb35Zetd+/edsVPfmLvvvpqTIe6hJq7JpkqeaZqkZf4gN7ECIJdzkdT50401l9ExBTgUyPRf9apLkU1VkJsbBhbooDZWD6jep0Ha2ttzpw5dtlll1nvcK93wPqABVatqstHyO+32okTW76SXEvb4LOluRJ5/A1UZeWu3aC05LsnmGhHveVFJJ4C/K5KZ3t7Y1OnRk/EEmm3bWwYW2NVyJFJVFatshDYXLCfgg3o2zfhcKxj99zTqtasic2Hz1c/t3lztQRtTXM1IM3dQLW0pqAlv6FmJtoRETFTgN916Wpvb2zq1Oj/yON7Xifb0zw8iYoVFJgdeqgtStBRzhceY33/4MG2/oMPGk784vebdetmdTPGNTf1a1urOm7BojhJrQfflJY0pyTa1pZvpEQkbRTgd1W62o4bK7E3Nk/5xIneMLbGgkkoZJsnTLCnfT47p7jYdkStKhY69FAbEg7skyZNsj//7nfezHJNBZzoiVT8/oYzxuVCR8Nd1drA2dLv3tRvqK3fKIlI1ijAp0I6SlDJ3jjU1potWGA2blxsEAgGrbaiwt5680278cYbberUqVZYWFhXQv/vAQfE1A6Uv/CCrVyxIvlz50qnuVzUmu++q0MaRUTiJBvgNZNdIi2dFS0+fXP7J/P5YYfB7Nn12woKuPeGG3j5D39g1vr1bEmQjc6dO/PHO+7gvG99q/XnTjZ/2Zw1LptS9d3NvL9xZJY+zUgnIklKdiY7XyYy06ZEpgTt39/7DzgUqt9eVeX9x9xU+kDAe+7XD6ZMqd8/On2iABF1/J2rVvFOWRnrovc7+GAeefZZno8L7kOGDOFHZ53FSy++yLp16zjvggu8KV8bCxY+X9OfN5Umkkfnmj9Gvkrm+iXDOZg505uiV8FdRNJAAT5edbVXqgoEvOfq6sRBPxLs1q6NTf/hh17JOxj0ntesqT92guNs2rSJ12fO5K4hQzi7Tx/GdO1Kl3324dBQiP9E9hs3DubM4Yijj6Zvhw6c7hwPDRnCymXL+Kh3b/786KMcc8st7NahQ/quS2M3PtJ6Ph8UF3u/oTZckyYiOSqZevxcfaSlDT5RO2t8e2n0+PTIcCa/32svX7XKYuYdHzfOAjt3equCRR3np87ZoKjVxhI9LjjtNK+TW7j99quvvrJQ9GxomWzHVZtx6qmjnYi0Akm2wRdk79Yi95gZO3fupHDGDHzr13srg61d65WyJkwgVFZG7bhxFJrhi5Ta33yTsunTWbNmDRveeYcNpaWs79OHispKVgIr33qLVZ068dif/8xJ553ntbeWl7Nqjz1YtnJlzPn3BMb06MHoc87hwIMOYuLEidC7d93nHTt29F5EVl0rKak7HhMmeO/TJZPnai8S1RbFr6gnItJKCvBRduzYwe677w6A3++n0IyCUIiAz0etcwSDQZgzh/mrV3NgVLA7/+c/Z8knn3gHWbas4YEDAVZefDE89ljdkqel06ezc9YsRo8e7T1GjqRfhw64lrTvxtYVeI90teVG2ozba+e6dNBNk4ikkXrRR9m6dSvdunVrNt1bo0dz6Ny5dWuTT5w0ibKysrrPCwsL6Vtby0BggHMMAI4zY2Jj6563VjJrqktua88jEkSkVZLtRa8SfJTCwkJuuukmamtrqd25k50PP0xg1SoK+/Wj8LvfpfDuuykMhei3eLEX3MPBdNq0aVgwyJ5m7DF4MJ07d8ZNneqVzMaP9/7jTkcpTSXAti/SK19EJMVUgm9KdOkKmh+3HJ3eLPHrVJfSVAIUEWlXNA4+FaLHPJvBk0/CypUNg3soBJWV9cPIpkzxqs8jQTfRcKjGx
tXvSh5FRETCci7AO+eOcc595Jz71Dl3VbbzA9SPAR84EE46yRvbHh2op06FAQO8ce+BgPc8cGDsmPn4yXCaGlOequAvIiLtVk4FeOecH/gz8A1gGHCqc25YxjLQWGCNHs4UH7wjnwWD3n5+v1eajh76FD8caunShsOjovOgCWVERGQX5VSABw4BPjWzz81sJ/Ak8K2MnLmpwBrpzJYoeBcVQefOXrpu3WD5cpg0yevVHun4Fp2mc2fYf3/vs+g0EYnGRouIiLRQrgX4fkD07C8V4W11nHMXOOfmOefmVacy+DUVWKPnDY8P3uvWwbZtXrqaGu+z+DnGo9Ns2+b1wG9sHvLIzUSi4C8iIpKkXAvwiXqKxdSXm9lfzGysmY0tLi5O3ZmbC6w+nzerXHxgLimB0lJvv9JS7318x7dk0tRdAS1CIiIiuy7XxsFXAAOi3vcHVmfkzMnO1BY/bjmZ/Vo6C1wyY6M1PE5ERJqQayX4ucBg59zXnHMdgFOA5zN29tYOOduVJVhbQx3xRESkGTkV4M0sAFwKvAIsAZ42sw+ym6scpI54IiLSjFyrosfMXgRezHY+cpqmqBURkWbkXICXJGhlNxERaYYCfFulRUpERKQJOdUGLyIiIqmhAC8iIpKHFOBFRETykAK8iIhIHlKAFxERyUMK8CIiInlIAV5ERCQPKcCLiIjkIQV4ERGRPOTMrPlUOco5Vw0sT9HhioB1KTpWvtA1iaXr0ZCuSUO6JrF0PRra1Wuyl5kVN5eoTQf4VHLOzTOzsdnORy7RNYml69GQrklDuiaxdD0aytQ1URW9iIhIHlKAFxERyUMK8PX+ku0M5CBdk1i6Hg3pmjSkaxJL16OhjFwTtcGLiIjkIZXgRURE8pACvIiISB5qdwHeOXeMc+4j59ynzrmrEnze0Tn3VPjzt51zgzKfy8xK4pqc5Zyrds4tDD/Oy0Y+M8U594Bzbq1z7v1GPnfOuT+Fr9d7zrkDM53HTEriehzmnNsc9fu4NtN5zCTn3ADn3Ezn3BLn3AfOucsTpGlvv5Fkrkl7+53s5px7xzm3KHxNfpsgTXrjjZm1mwfgBz4D9gY6AIuAYXFpfgTcF359CvBUtvOdA9fkLOCebOc1g9dkMnAg8H4jnx8LvAQ4YBzwdrbznOXrcRjwQrbzmcHr0Qc4MPy6K/Bxgn8z7e03ksw1aW+/Ewd0Cb8uBN4GxsWlSWu8aW8l+EOAT83sczPbCTwJfCsuzbeAv4dfPwMc4ZxzGcxjpiVzTdoVM3sD2NBEkm8BD5vnLaCHc65PZnKXeUlcj3bFzCrN7N3w663AEqBfXLL29htJ5pq0K+G//bbw28LwI75Xe1rjTXsL8P2AlVHvK2j4I6xLY2YBYDPQMyO5y45krgnAd8NVjc845wZkJms5K9lr1p6MD1dFvuScG57tzGRKuEp1DF7pLFq7/Y00cU2gnf1OnHN+59xCYC3wqpk1+jtJR7xpbwE+0Z1R/B1VMmnySTLf99/AIDMbCfyP+jvO9qq9/Uaa8y7e3NijgLuBf2Y5PxnhnOsCPAv8PzPbEv9xgl3y/jfSzDVpd78TMwua2WigP3CIc25EXJK0/k7aW4CvAKJLn/2B1Y2lcc4VAN3J7+rJZq+Jma03sx3ht/cDB2Uob7kqmd9Ru2FmWyJVkWb2IlDonCvKcrbSyjlXiBfIHjOz5xIkaXe/keauSXv8nUSY2SZgFnBM3EdpjTftLcDPBQY7577mnOuA16nh+bg0zwNnhl9/D3jNwj0g8lSz1ySu7fAEvPa19ux54IxwT+lxwGYzq8x2prLFOdc70m7onDsE7/+V9dnNVfqEv+vfgCVmdkcjydrVbySZa9IOfyfFzrke4de7A0cCS+OSpTXeFKTqQG2BmQWcc5cCr+D1Hn/AzD5wzl0PzDOz5/F+pI845z7Fu5M6JXs5Tr8kr8mPnXMnAAG8a3JW1jKcAc65J/B6/BY55yqA6/A6yGBm9wEv4vWS/hT4Ejg7OznNjCSux/eAi51zAWA7cEqe3xSXAqcDi8PtqwC/AgZC+/yNkNw1aW+/kz7A351zfrybmafN7JURPUoAAAEaSURBVIVMxhtNVSsiIpKH2lsVvYiISLugAC8iIpKHFOBFRETykAK8iIhIHlKAFxERyUMK8CKSlPCKYV845/YMv98j/H6vbOdNRBpSgBeRpJjZSuBe4JbwpluAv5jZ8uzlSkQao3HwIpK08HSk84EHgPOBMeFVCEUkx7SrmexEZNeYWa1z7mfAy8DXFdxFcpeq6EWkpb4BVALxK2OJSA5RgBeRpDnnRgNHAeOAn8QtRCQiOUQBXkSSEl4J7F68tb5XALcBf8hurkSkMQrwIpKs84EVZvZq+P00YH/n3JQs5klEGqFe9CIiInlIJXgREZE8pAAvIiKShxTgRURE8pACvIiISB5SgBcREclDCvAiIiJ5SAFeREQkD/1/iuxUbGHdsQsAAAAASUVORK5CYII=\n", "text/plain": [ "<Figure size 576x360 with 1 Axes>" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "%matplotlib inline\n", "import numpy as np\n", "import matplotlib.pyplot as plt\n", "\n", "np.random.seed(10)\n", "\n", "X = np.sort(np.random.uniform(0, 3, 300))\n", "Y_true = 10 * np.sin(np.pi * X * X) + 20 * (X - 0.5) ** 2 + 15 * X \n", "Y = Y_true + np.random.normal(size=X.shape) * 5\n", "\n", "plt.figure(figsize=(8,5))\n", "plt.scatter(X, Y, color='red', s=5)\n", "plt.plot(X, Y_true, color='black', linestyle='--', linewidth=2.5)\n", "\n", "plt.xlabel('X')\n", "plt.ylabel('Y');" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**b)** Split the data set into **training**, and **validation** sets and scale the values for achieving faster convergence. Plot the training points and the testing points with different colors." 
] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "<matplotlib.legend.Legend at 0x2b145ace198>" ] }, "execution_count": 2, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAe8AAAEyCAYAAADTKVg8AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3X+QW+V5L/DvI8mGYDvGWCt57TUYCEnYNcY2a2Pven8ZkgJtQ6DGmDRNyCR1w03ntnN7Z8rkzkDDnd6bTtM0l5ukuSRlknaYACEl0IZME+xdsK01rPwLYzAxCWDLu6vVLmTB/LR0nvvHkbRH2qNfq6MfR/p+ZjQrrY7OeY+09qPnPc/7vqKqICIiIvfw1LoBREREVBoGbyIiIpdh8CYiInIZBm8iIiKXYfAmIiJyGQZvIiIil2HwJiIichkGbyIiIpdh8CYiInIZX60bkIvf79dVq1bVuhlERERVc+DAgUlVbSm0Xd0G71WrViEcDte6GURERFUjIq8Vsx27zYmIiFyGwZuIiMhlGLyJiIhcpm6veds5e/YsIpEI3nvvvVo3pWGce+65aGtrw7x582rdFCIiKpKrgnckEsGiRYuwatUqiEitm+N6qoqpqSlEIhFcfPHFtW4OEREVyVXd5u+99x6WLl3KwO0QEcHSpUvZk0FE5DKuCt4AGLgdxveTiMh9XBe8iYiImh2Dd4l+97vf4bvf/W7Jr7vhhhvwu9/9Lu82d911F5588sm5No2IiJoEg3eJcgXvRCKR93VPPPEEzj///Lzb3HPPPbj22mvLah8RETnHMIBoFFCtdUsyNX7wdvidv/POO/Gb3/wGa9euxYYNGzAwMIDPfOYzuOKKKwAAn/70p3HVVVeho6MD9913X/p1q1atwuTkJF599VVcfvnl+NM//VN0dHTgk5/8JN59910AwO23345HHnkkvf3dd9+N9evX44orrsDx48cBALFYDJ/4xCewfv16/Nmf/RkuuugiTE5OOnJuREQ0wzCAgQFgxQqgqwsokKNVVWMH79Q739YG9Pebj8v09a9/HZdeeikOHz6Mv//7v8ezzz6Lv/3bv8ULL7wAALj//vtx4MABhMNh3HvvvZiampq1jxMnTuArX/kKjh07hvPPPx8//elPbY/l9/tx8OBB3HHHHfjGN74BAPja176GrVu34uDBg7jppptw8uTJss+JiIhmi8WAffvMoL1/P9DT40gYcURjB+9YDAiFgHjc/BmLOX6IjRs3ZoyRvvfee3HllVdi06ZNOHXqFE6cODHrNRdffDHWrl0LALjqqqvw6quv2u775ptvnrXN3r17sWPHDgDAddddhyVLljh4NkRElBIIABs2zDweGalIGJmTxg7egYDZ1+HzmT8DAccPsWDBgvT9oaEhPPnkkxgeHsaRI0ewbt062zHU55xzTvq+1+tFPB633XdqO+s2Wm8XXoiIGpQI8PTTwKJF5uMFCwC/v7ZtSik7eIvIShEZFJEXReSYiPyFzTb9IjItIoeTt7vKPW6RjQMGB4FIBBgaMh+XadGiRXjrrbdsn5uensaSJUtw3nnn4fjx49i/f3/Zx8u2ZcsWPPzwwwCAX/7yl3jjjTccPwYREZlefx145x3z/pkzQL2UGDkxPWocwF+p6kERWQTggIj8SlVfyNpuj6r+gQPHK43HAwSDju1u6dKl6O7uxurVq/GhD30IQcu+r7vuOnzve9/DmjVr8LGPfQybNm1y7Lgpd999N2677TY89NBD6OvrQ2trKxalvhYSEZGjAgGgu9u88lqhDtw5Eae7YUXkMQDfVtVfWX7XD+C/lxK8Ozs7NRwOZ/zuxRdfxOWXX+5UU13p/fffh9frhc/nw/DwMO644w4cPny4rH3yfSUiys0wzGvdgYDZgRuPA8ePA+3tZn7oJBE5oKqdhbZzdGESEVkFYB2AZ2ye3iwiRwCMwgzkx2xevxPATgC48MILnWxawzh58iS2b98OwzAwf/58fP/73691k4iIGkd2pEZmB248bl73np4GFi82u9F9NVjiy7FDishCAD8F8Jeq+mbW0wcBXKSqZ0TkBgA/A3BZ9j5U9T4A9wFm5u1U2xrJZZddhkOHDtW6GUREjSc1vDjVRz44OCu1Pn7cDNyA+fP4cWD16uo31ZGEX0TmwQzcD6jqv2U/r6pvquqZ5P0nAMwTkTqp2SMiIkJRw4vb282MGzB/trdXuY1JTlSbC4B/BvCiqn4zxzbLkttBRDYmjzt79hIiIqJaKWJ4scdjdpUfPWpWojt9zbtYTnSbdwP4EwBHRSRVOfVVABcCgKp+D8A2AHeISBzAuwB2KAcsExFRPUkNL8665p19Hdznq01XuVXZwVtV9wLIO4BaVb8N4NvlHouIiKiisocXF3EdvBZq34IGt3DhQgDA6Ogotm3bZrtNf38/sofFZfvWt76Fd1IzBaC4JUaJiKhMVZhmey4YvKtk+fLl6RXD5iI7eBezxCgREZWpCtNsz0XDB2+n12L967/+64z1vP/mb/4GX/va13DNNdekl+987LHHZr3u1VdfxerkRZJ3330XO3bswJo1a3DrrbemlwQFgDvuuAOdnZ3o6OjA3XffDcBc7GR0dBQDAwMYGBgAMLPEKAB885vfxOrVq7F69Wp861vfSh8v19KjRERUpApMs+0IVa3L21VXXaXZXnjhhVm/yyeRUO3tVfX5zJ+JREkvt3Xw4EHt7e1NP7788sv1tdde0+npaVVVjcVieumll6phGKqqumDBAlVVfeWVV7Sjo0NVVf/hH/5Bv/CFL6iq6pEjR9Tr9erIyIiqqk5NTamqajwe176+Pj1y5Iiqql500UUai8XSx009DofDunr1aj1z5oy+9dZb2t7ergcPHtRXXnlFvV6vHjp0SFVVb7nlFv3Xf/1X23Mq9X0lImpEiYTq+Lhq8r/vmgAQ1iJiZENn3pW4VLFu3TpMTExgdHQUR44cwZIlS9Da2oqvfvWrWLNmDa699lqcPn0a0Wg05z6efvppfPaznwUArFmzBmvWrEk/9/DDD2P9+vVYt24djh07ll4nPJe9e/fipptuwoIFC7Bw4ULcfPPN2LNnD4Dilx4lImp2qbq0tjagv79+1u3OpQaTulVP6lKF0xPKb9u2DY888gjGx8exY8cOPPDAA4jFYjhw4ADmzZuHVatW2S4FaiU2XS+vvPIKvvGNb2BkZARLlizB7bffXnA/mud6QPbSo+w2JyKyZ5fsObimleMaOvOu1KWKHTt24MEHH8QjjzyCbdu2YXp6Go
FAAPPmzcPg4CBee+21vK/v7e3FAw88AAB4/vnn8dxzzwEA3nzzTSxYsACLFy9GNBrFL37xi/Rrci1F2tvbi5/97Gd455138Pbbb+PRRx9FT0+PMydKRNTILEVRdVqXllNDZ96A4yuCAgA6Ojrw1ltvYcWKFWhtbcUf//Ef4w//8A/R2dmJtWvX4uMf/3je199xxx34whe+gDVr1mDt2rXYuHEjAODKK6/EunXr0NHRgUsuuQTd3d3p1+zcuRPXX389WltbMTg4mP79+vXrcfvtt6f38aUvfQnr1q1jFzkRUT6GYfaPJ7tmZWgIg4OeWfOz1CvHlwR1CpcErR6+r0TUdMbGgOXLZx6PjgKtrbVrT1KxS4I2dLc5ERGRLZGZ9Np63yUYvImIqPkEg0BPj3mRu6envqvTbLjumreq2lZq09zU62UTIqKKyrUIiUu4KvM+99xzMTU1xYDjEFXF1NQUzj333Fo3hYio+lIVzS4L3IDLMu+2tjZEIhHE6mRi+EZw7rnnoq2trdbNICKiErgqeM+bNw8XX3xxrZtBRERUU67qNiciIiIGbyIiItdh8CYiInIZBm8iIiKXYfAmIqKmYFmHxPUYvImIqOG5bb3uQhi8iYio4dmt1+1mDN5ERNTw3LZedyFlB28RWSkigyLyoogcE5G/sNlGROReEXlZRJ4TkfXlHpeIiKhYqanMIxFgaMiVM6JmcCLzjgP4K1W9HMAmAF8Rkfasba4HcFnythPAPzlwXCIiooJShWoirp3KfJayg7eqjqnqweT9twC8CGBF1mY3AvgXNe0HcL6I1H7VcyIiamiNVqiW4ug1bxFZBWAdgGeynloB4JTlcQSzAzxEZKeIhEUkzMVHiIioXI1WqJbiWPAWkYUAfgrgL1X1zeynbV4ya6Sdqt6nqp2q2tnS0uJU04iIqEmlCtW8XqCzE2iU0OJI8BaReTAD9wOq+m82m0QArLQ8bgMw6sSxiYiIchEBdu0CNm4EwmGzC70Rus6dqDYXAP8M4EVV/WaOzR4H8Llk1fkmANOqOlbusYmIiAqZmgJGRhqr69yJ9by7AfwJgKMicjj5u68CuBAAVPV7AJ4AcAOAlwG8A+ALDhyXiIiooFTXeSjUGGO8AQeCt6ruhf01bes2CuAr5R6LiIioVKkx3rGYGbg5VIyIiKgeZa1C4vE0zhhvgMGbiIgaTaMO7rZg8CYiosaQyrYnJhpzcLcFgzcREbmfNdvevr2xViGx4US1ORERUW1Zp1IbHgZOnjQvdDdKhVoWZt5EROR+2Wt+LlvWWBVqWZh5ExGR+zXieLA8GLyJiKgxpMaDNQF2mxMREbkMgzcREblW1lwsTYPBm4iIqsPhSNsEc7HkxOBNRESVV4FIax0d1qBzseTE4E1ERJVXgUibPTqsAediyYnBm4iIKs7wBxDt/H2oNzPSltOTnhodFokAQ0MNPzosA4M3ERFVlGEAA1sFbeFH0b/xbRi7hwARR3rSG221sGIxeBMRkWOsmbRhANExAxPHYgiFFPG4IDQyH7FJM9I28zXrcjF4ExGRI6yZdF8f0NWlWLHcwPY1L6LrvCPw+TTj2nQzX7Mul2idDo7r7OzUcDhc62YQEVGRolEzcMfjqd8oAIEXZ3HKczE8Rw4h0NGS0cVtGE0zo2lRROSAqnYW2o6ZNxEROSKVSXu9qUAsABQbEMay7ksR7PDPCtDlXLNu1glaAAZvIiJyiAiwaxdw6BDQ02N2h2/eBOwbvQTy1FDuCD2HKNzME7QADN5EROQQwwCuuQZYv96MwydPAvtCAk9rntR6jlG42YvdGLyJiMgR1oA6PGx2iRfqDjfGooju+TU0HoexN4ToC1NFJeCzit38zdWHzuBNRESOyA6ofn/+eGrEDQy0R9GmJ9GHQQzIINrW+YtKwDMmaNltQLY2Vx+6I8FbRO4XkQkReT7H8/0iMi0ih5O3u5w4LhERVVnW9el4HHj+eeCDD4Bjx8xr3pEIsHs3sHVr/ngaOz6F0JsdiGMeQuhGKHG1ORa8yG7wdLHbZPP1oTuVef8QwHUFttmjqmuTt3scOi4REVVL1vXp+AcG/H7giiuAc84xf/r9wNKlwORk4XgaaPeja/Ex+HAWXdiHrg8fmzUWvChNOGDc58ROVPVpEVnlxL6IiKhOZVWJHQ+9julpf8Ym09PA8eNAR4cZR0Oh3PFUPILByTWIvRBDoOXj0EAQsUnJO+bbdlx4qg+9iQaMV/Oa92YROSIivxCRDrsNRGSniIRFJBxrgm4PIiJXycpw23uWYvHizE0WLwba24tfNMTj8yC4JghpXQaPV/KO+c5bmN5kk5w7NsNaMvP+D1VdbfPchwEYqnpGRG4A8H9U9bJ8++MMa0REdSgr9Y3HzUz7ox8Ffv1rM3B7HEwL04fzG5h4cQpt6/yIxwU+n/nFIBh07lj1oK5mWFPVN1X1TPL+EwDmiYi/wMuIiKjeZGW4Ph+wejUwf775c86B22aiFiNuYKD7fbS1KfoveA7+K1ega8HsOdKbUVWCt4gsE0lOlieyMXncqWocm4iIylPxaUjt+sPjccSu/gOE9nvMCvQ3OzBpLMHgmY2IHJpsuvW7szk1VOzHAIYBfExEIiLyRRH5soh8ObnJNgDPi8gRAPcC2KH1uiIKERGlGXEDA1s+MLPffvOx45E8e7q0aBTo6UHg4C/QhX3pavSAZwqe7s22c6Q3G6eqzW8r8Py3AXzbiWMREVGVGAZiPTcjtP8niEMQCiliPTcjGP65Wbg2OOjMBe5UIVyqNF0EGBmBABjEVsTQgsCWj0F+EmmqorR8OMMaERHZi8UQGPl5OvvdvOYd6LMjUKcnQ8kuTQ8Gge5uwOuF5+qNCI4ehjz9FLBsGQN3EoM3ERHZCwSgXd140PNZvLZxO2TBeVipr6EfT8HY3O1sxZi1EC4VzE+fNidJb21l0M7iSLc5ERE1HkMFAzKIkAfo9ADhYUFcfQj5ehB7eBDBSgbUVDAnW8y8iYjIlllHJojHBSMjgs7O1PwsgkCQmXAtMXgTEVGm5NiwQIumJ1Tr7jZ/qs7cqHbYbU5ERDNSY65DIUhXFwZ3DSI25YEqsHIlkEiYl6FjMfZq1xIzbyIimpE15tozFUMwaAbqJlu4q64xeBMR0Ywcy2sWu9AIVQe7zYmIaEae5TVZAF4/GLyJiCgTo3TdY7c5ERHlVfGFSahkDN5ERJST3YJfVHsM3kRElFP2gl9OTWdO5WHwJiKinHIUn1ONsWCNiIhyylN8TjXE4E1ERHmx+Lz+sNuciIjIZRi8iYiIXIbBm4iIOJjbZRi8iYiaXfZg7nicgbzOMXgTETW7aBTYt88M2vv2Ab29nJWlzjF4ExE1unxd4oYB7NgxE6Q7O4GREc7KUucYvImIaiEVUBOJynZRF5rfNDWFmqo5E8tPfwps2MBZWeqcI8FbRO4XkQkReT7H8yIi94rIyyLynIisd+K4RESuZA2oS5cCK1ZUrou60Pymfj+wYIF5f8ECMwt/9lkzA9+9m7Oy1CmnMu8fArguz/PXA7gsedsJ4J8cO
i4RkftYA+r0tJl9V6qLutD8ppOTwJkz5v0zZ4DhYbM94bD5HNUlR4K3qj4N4PU8m9wI4F/UtB/A+SLS6sSxiYhcxxpQFy8GvN7KdVGn5jeNRIChodmZdCAAdHebbenu5kTmLlGt6VFXADhleRxJ/m7MupGI7ISZmePCCy+sUtOIiKrMOmG4329muJWcONxmflPDSM1XLhDr5OWqnMjcBapVsGb3FzCrOkNV71PVTlXtbGlpqUKziIhqYCZymll3MFjVQDmrhg2emTZ4PFVvD5WuWsE7AmCl5XEbgNEqHZuIqH4Uqv6uAq7R7X7VCt6PA/hcsup8E4BpVR0r9CIiooZTB5HT788cDabKydTcxqmhYj8GMAzgYyISEZEvisiXReTLyU2eAPBbAC8D+D6A/+LEcYmIXKdQ9Xc2h+ccNwxg61ZzNNhVV5m7XbmSk6m5jWidft3q7OzUcDhc62YQETnPes0737XlVBd7KGQG+sFB8/fRqPm6OVybjkbNHvt43LzcLmLe9/nMgnSu211bInJAVTsLbccZ1oiIqiSdREuRRWHZXezRqBnMV6wAWluBvr6S0+XsxJ8jw9yJwZuIqArmVKeWHWlFZqYyBYq6Zp7d624d9v3UU/mHgFP9YvAmIqqCOdWpZU+wEgzOBHGgYLqc6wuDhyPDXI/Bm4ioCvx+c7rwkidT83iAlhZgYsJ8PDgInD4NjI2ZqXOeqFsHhe1UIQzeREROsqkOT1V4h8PAxo3Ak0+asbioeuHs9Bkwr3cvW1YwXc5Z2O5wBTtVH4M3EZFTcvRTWzPgZ58168yKvvZdRvpsO615HUwSQ+Vj8CYickqOQGvNgDdsAEZGSojFJY4Lz06qPR4g2GJAJqIz85azL931GLyJiJySI9BaM+B9+4qMxakoDBRdEm6bVGf/0u/n+LAGwElaiIicVMQELAU3sZucxVM417JOwJKedAU2v2xp4cphdYqTtBAR1UIRY68KbjLHrm3bxD81kbnXa5a7t7RwfFgDYPAmIqo3ObrfcxWJ5+xh12SZ+zPPAOedZ5a7DwywSK0BMHgTEdUbmzLxXEXidiPJ0kl1KoM3DOCtt1ik1kAYvImI6lFW13Z2T3pqzpZY1EAopIjHzWK41O8BzGTwXi+weDGL1BoIgzcRUQnmPL9JmROjpOJwahWw7dsBI24gcOsANsf3ADBgGIpbb7X0iqcy+NOngakpTmLeQBi8iYiKNOf5TRyYGEUEePDBmaLz4WEgdnwKMhzCQ9gOHxJQldm94qkM3utlkVoDYfAmIirSnOc3cWhilGXLgO5uS+93uzlme5l3Cl2Lj8HnU/aKNwkGbyKiIpU42ZkDL8w0q47NY/5CTkcwOHUlIhGZqTLn3OUNzVfrBhARuUUqeJY8v8mcXzhbqhc8+xceJH8/xwleyF34iRIRlWDO85uUMTFKSbVunLu8KTB4ExHVsZJr3Rzqoqf6xm5zIqI6ZpdIZ3SbZ3Owi57qFzNvIqI6NqdEmnOXNzxm3kREdYyJNNlxJPMWketE5CUReVlE7rR5/nYRiYnI4eTtS04cl4iopsqcNa1YTKQpW9nBW0S8AL4D4HoA7QBuE5F2m00fUtW1ydsPyj0uEVFNWSrJjL4BRMcMDqumqnEi894I4GVV/a2qfgDgQQA3OrBfIqL6lawkM+IJDOy5B20XylxnPiUqmRPBewWAU5bHkeTvsv2RiDwnIo+IyEq7HYnIThEJi0g4xrGJRFSPUl3lLS1AVxdi3laEpAvxuM284kQV4kTwtrsKk9159O8AVqnqGgBPAviR3Y5U9T5V7VTVzpaWFgeaRkTkIOug64EBYNcuBCIH0dXj5bBqqionqs0jAKyZdBuAUesGqjplefh9AH/nwHGJiKore9D11BRkWZDV4FR1TmTeIwAuE5GLRWQ+gB0AHrduICKtloefAvCiA8clIqquHIOuWQ1O1VZ25q2qcRH5cwD/CcAL4H5VPSYi9wAIq+rjAP6riHwKQBzA6wBuL/e4RERVx0HXVCdE63RsQ2dnp4bD4Vo3g4iIqGpE5ICqdhbajtOjEhERuQyDNxERkcsweBMREbkMgzcREZHLMHgTERG5DIM3ERGRyzB4E1FTq9KqnkSOYvAmoqZlnaqcK4KRmzB4E1HzSabbsQnNmKqcK4KRWzB4E1Hdc7Rr25JuB7b3o6tLZ60Ixq50qncM3kRU1xzv2rasDCbDIQw+OIFIBBgaMqcqZ1c6uQGDNxE5o0LpavYqnGV3bWetDOZZFshYEczx4xFVAIM3EZWvgulqjlU45y61Mpg13a7k8YgqgKuKEVH5olEzcMfjZtSLRICWFseWzjSM6q7CWe3jEaVwVTEiqp7sdNXvdzQT93iQ0bVdadU+HlGpGLyJqHzZXdGTkwUvHBsGEB0zoON5rpNXseybFebkJgzeROQMa7qa58JxPA489xzQ16doW2Ggv/UlGH0Ds7PzKpZ9s8Kc3IbXvImoKCVfB7Z5QTxu9qhPTwOAAhD4cBYR7yoETx80g3+K3XV06/MOnkeFDkVUMl7zJiLHzCUzNeBBFEEoZiL98eOpwG3yII4uDCOw+VKzv9qaTFSg7DvXebDCnNyGwZuICip17HOuINneDixebN5fvFgQiXgwdOpSSPwssHJl5sYFhnSVfx6K2IQChgGZiGJwtzp5KKKKYvAmooJKzUxnBckXYoAqPB6zlu3oUeD114HWVkBu2Qbs35/xzSBdPCbOln2b56HwSRxd8T0I3NKX/pbh2dqPYIvBwE2uwOBNRAWVlAQbBgIaTc4ZruhacASBtcvTWbXPB6xebda3IRYDRkZmXrthAwx/IDNrjztQBp78NiBQczpUz0UYQh9kOATs22d+cdi3j9OpkWsweBNRUYoa+5zsL5eVbRjUAUQOTGDozAZIIqu/PZVat7QA3d2A1wts2gTs3YvYpGRm7T03m5G8rw8YGwNUSxvWldWH7wn4Eez+CMTnAzZvBhYuNLdbuNCspiNyAUeCt4hcJyIvicjLInKnzfPniMhDyeefEZFVThyXiOqMpb/cM7wPQX8CsnFDZn+7NZgODAC7dgGnT5uv83gyu+g7P0Dg2f8wI/mePcCFF8LoG8DAgBZfPBeLZWbXk5Mz3Qg/+Qnw9tvmdm+/bT5H5AJlB28R8QL4DoDrAbQDuE1E2rM2+yKAN1T1IwD+EcDflXtcIqqeYjNdwx9AtPP3od5ksN6xA3j2WaCzE9i920zbrRfE9+0zS9At47bSXfQnDQzN+yRELdE5HkcsdKK0hUP8/tnZdaobIRhkmTm5khOZ90YAL6vqb1X1AwAPArgxa5sbAfwoef8RANeIsCyEyA2KHSZmGMDAVkFb+FH0b3wbxo8fAoaHgUQCCIdnstpUau31msF03bqZHcfjwPPPwwMDQU/MvCatagbXTZsAnw+Bro+UFm8nJ3Nn1xWoaCeqBieC9woApyyPI8nf2W6jqnEA0wCWZu9IRHaKSFhEwjEWjhDVhVjUQCikmcOr7LZLJ9SC0Mh8xDw5stpUwDx82AymqRR6bMzMiq+4ArjgAvNm
ff3evUAkAnlqCIODgpMngYcemt2OWb0EhUrlOZE5uZATwdvuLz77X3cx20BV71PVTlXtbGlpcaBpRFQWw0Dg1gF0xZ+GF2fRGd+Pllv6bdPvWTEymCer9XiAjo7MF7z++swMLtPTwEsvZb7e650JsoaBHTd/gJUrNaM3wLaXgNk1NSAngncEwErL4zYAo7m2EREfgMUAXnfg2ERUSTGz63oXtmIjRhBGJwb2/k8YY9FZF8FtY2S+rDb5AuNkBNGHhqDtHdYZXMwZXexebxiIbvkj7NnvQzwu2LNHEX3eHEeeczIZZtfUYJwI3iMALhORi0VkPoAdAB7P2uZxAJ9P3t8GYLfW66TqRDQjmU5PeYIYwQbEMQ8h6UJs2x22F8FLjZEGPBjYEUTbSkH/Vg+MCcsMLp4c/z1Fo5BnhtNdd6oKWbcW6O9HwG+w/oyaQtnBO3kN+88B/CeAFwE8rKrHROQeEflUcrN/BrBURF4G8N8AzBpORkR1KJkdB04fQlePFz6vouvqBAIjPy+h3Du3WZnyG9YZXHK3KYgoemB25fdgL4LGKBAKQSZj7CGnpuBzYieq+gSAJ7J+d5fl/nsAbnHiWERUZR4PJNCCQWxFDCcQ8F0K6e4yo22Z6W3qOnlJuwoGIVu2YGjvVsTQgsCH34e8M5Nqe4QrglHj45KgRJTBdunP7DUzT540s+Oi1wct8XiFnD7S3B+nAAAXTUlEQVQNXHSROQzN6zUr1zs6YKggFjUQkBgkWH7biKqNS4ISUclyjunOLiVftsyxArCSa8kMA/jMZ2Ya19WVDtwDA4q2FQb6W1+C0TdQ3NqlRC7E4E1EaTmrtVNrbVtvtW5kavKW73wns+3qQwibEQud4EIj1LAYvIkoLed8JrHYzGxpw8O1DYrWRi5YkJ6hLV1pLnF0YRjauQHawnJzakwM3kSUlnM+k1IX9K5GIw8dAs6cMb9QpCvNBSdPeaCbNmHlgUfRPyDsOaeGxOBNRBlsr0HX2yxlqRnaurszvlB4PIDH58FweL45TWt5I9mI6haDN1GTKmlNbKCuZikzDCA6IdDds79Q1FMnAVGlMHgTNaFiVwqrRxlt3+qB0ZL5haLeOgmIKoHBm6gJ5awqt1Nyil5ZxbS9jjoJiCqCwZuoCRXdtVyHKTq7xYkcmh6ViNwl1bVccGYzuzS3xnOPFt12ogbGzJuoSdl2LWd3kddpmstucWp2DN5EZLLrImf1F1FdYvAmIlOuSrBcaW6dFbIRNRMGbyIyldJFXoeFbETNhAVrRE2i4NKbpVSC1WEhG1EzYeZN1ASKTpSLrQSr00I2ombBzJuoCTieKHO8FlFNMfMmajB2dWQVSZQ5XouoZhi8iRqIXfe4YQATE8Du3RzxRdQoGLyJGkg0mtk9Ho3OBPOtW4GWFgZuokbA4E3UIAwDuPVWM3CLmN3jIiUsQEJErsHgTdQgYjFgeNi80O3xKB56yLwkzaJwosZTVvAWkQtE5FciciL5c0mO7RIicjh5e7ycYxKRvcAFcXTpPvhwFt3Yh+DSOGc3JWpQ5WbedwLYpaqXAdiVfGznXVVdm7x9qsxjEpENeek4BhO9iKANQ4keyEvHAbAonKgRlRu8bwTwo+T9HwH4dJn7I5qzpp9qu70dnsUfRhATkMWLgfb2WreIiCqk3OAdVNUxAEj+zHVF7VwRCYvIfhHJGeBFZGdyu3CMlTVUAk61DTPFnpwEjh4FXn/dfExEDUm0QJoiIk8CWGbz1P8A8CNVPd+y7RuqOuu6t4gsV9VREbkEwG4A16jqb/Idt7OzU8PhcDHnQA4qOP91nYpGzcAdj5vFWZFIc0217dbPjYgyicgBVe0stF3Br+aqeq2qrra5PQYgKiKtyQO2ApjIsY/R5M/fAhgCsK6Ec6EqcXP2mmsGsWboSnfz50ZEc1Nuv9rjAD6fvP95AI9lbyAiS0TknOR9P4BuAC+UeVyqgFzLOVfVXKKtYUDGxzD443FETmm6qrpmQa3K3xhyfW7N8MWFqFmVG7y/DuATInICwCeSjyEinSLyg+Q2lwMIi8gRAIMAvq6qDN51qBoLReUNKHOJtoYBo28A0eVrISuWI7i9DzI+BqjW5stINb4xGAYwNgaMjwOqsz83vwFjLIqBAWU2TtSgCl7zrhVe866NSl47TcW1UMgMMoODWTVVc7hwbYxFMbD8OELoQhf2YRBb4fF6gO5u6O5B9G/1pI+3e7dZz1XR68KVvvieehP37DG/AfX0AENDMOAxPze/Adk6gOi+l9FmvIa4+pqyBoDIrRy75k3NxZExwTnS64KZcKmpv2EgNikISTfimIcQuhFDC5BIAKEQZDKWnqBk925zbu+CmWi5fc0Od19kNMcwgBdeMN+8VPuSb2T6c5s03+RAYhRdGoLPp5xZjagBMXhThrKvk+bpNi4Y10qZDix5nMC6Feha9LwZpDYZ8Hd/HFFPK/SqTqClJR3UJicLd6EbcQPR7puhK8roa3ZwSrPMt1Jh9G8F1q6Fcd5CRBGEArPfyOSbLD4fBnvuQuQUZ1YjakiqWpe3q666Sqm6EgnV3l5Vn8/8mUjMYSfj4+YOAPPn+PisY4yPqxpGmY21HCfhnafjRyc0Hlft7UmoD2e1F09poqcvfRKGkXlu2cdPJFR7N7+vPnygvRjUhHferLZXW+Zbaei4d7kmINqLIfV5De3d9J4m4jZvpGNvMhFVG4CwFhEjmXlTmiMFXgXSa8em6rQcx9O9GcEOv5ldDwvi8CGEzYiFTqRPolBCHIsBoZF5M93vG26oTV+zpetj1lvZ9RHEvK3mZYKEIBQ+B7FJmzeS86ESNTwGb0pz5HJtqd3Gc+2ntzlOIAB0bQZ8OIsu7ENgwduA359+Sb6YZp67mN3vmxWBfY9WP/hlXXIQNSynKNDdg9BDh7F5i5erhBE1OQbvJmUXM/PF3ZJirMcDoyWI6ITk377cYVXWaGwYkIkoBh+MIuK5CEMYgLx9xrzYXQQRYNcu4NAhweDe+RBPDbJWm64PjwcIthjQ8SgGrhGsXN8CEcHJk7yWTdTMGLybUL6YaZedGoa53YoVQF9f4RhbdEx2aiC25YCe225FsPsjEJ8P6O5Op6apLx+JhP2XEMMArrkGWLfO3JVhIOMbS6kdBHPqULDr+kieW2zleoT2JNJvlcfDwE3U1Iq5MF6LGwvWKqdATdkso6Pmtqnb6KhD+y9URVas7AOOjmYUbFkL8RYvVvV6ZxfkzWrz6MyLEj192ttrFF3IV1bhX1axWWJ0PFmoBu3FU+rzGWW9VURU38CCNcql1GvbIjNZnvV+2fvP1U+flbYWzGItBzQ2dyMqy6CBme4Da4I/PZ0eBp6R6M9qs8y8KBY6UVIHgW2Hgs1J2J6XpevDMICBHQG0Ga9hAE9h15a7OfSLiACw27wplVpTFgyaE3n5fOZPu5m6rIGopP1n99Nn9bkbcaNwF3zygMbJCAZkEG0rJWPbVGD2eoFFi8xDdnUpAjoTOWe1OTgTzQNdH0FX5wdFT3ji9wMLFpj3FywA/BcYMPq3IrpiPbTPbFg
xlxbMLwGCuPoQ8vVg6ie7EVwmDNxExG7zWnLTcNx8bXVifHh6/2OZ/dfjRyfydsFb22Xt+vZ6VY8enWnv2bOqmzebz2/eZGh8S1/hBicSqpGI6tVXa8Lj0/FNN6oRL3xy4+Pm8VPtGD0yYXZ54wNz/PnoeFGXFpy6qkBE7gF2m9e3ulrGscTqquzNy607y3gvbg3A2Nyd7r8OtPtzdsFnv4d+/0yGvXChWXyWem+npoCREbONI2FgcvhEcQ3etg145hl4jDiC+x+DTEQLvh+BgFkrl6qZkxY/QtJljiGXLsQkUNSlBQcnayOiRlNMhK/FrdEz71KLxkpWbFpfRNps3aSnZ/bm5WaIo6OqIuZ7IaI6GknMKjizOxW79zCRMDPu7N9nttFQo6eIBlsPkLqNjeV5+wxNjJoHs7bZPLahXq+hmzYZ6bfY7rzc1BtDRM5DkZl3zYN0rlujB+9KdomefT+hR9d/ThMe+4CcESBGR2f6eHN8i8jujrb70pEddEoJQmNjeeNjTrnew1y/z2hTMQ1M7Sj1zWLLllnbj48m1OczzPdDzuq4d7nte27ttrf9jpRIaGJ0vKSqdiJqPAzeLlCJLOvsWdXFixIKGLoYb+hZz/yMgDwrU+zpmwlOPT22jbEGw+zMe67XwLMz054e84tBjibklOs9dOy9TSTMLzhjY6rx+KxvKEZPb/J69lndjJAmcnwJytvTknzDxr3L1SdnK9cbQ0R1j8G7ERURkY4eVQWMZBZr6NH1n8vY3m6xi3S0yJPyWg9dqBmFLgnYBfdCBXE170q2a3TyRM/Co5uwT73WBVGyGpu3pyW5H4NjuYmaXrHBmwVrVmWvh1lBRVa4tbcDixcLAMXiRYr2Z3+YUelkt9hF+oHNGLDUWyIyM6IrY3SXzXtWqBjLrsAt17zjdVPYZ9fo5IlOeZchLBuRgDmkK/bw4KwTyVt8xmU8iahUxUT4Wtyqnnk7sh6mQ+0otjorh7NnzQw83wioYq79FnxL8myQL1su5Xp/pQv7is7q81xIN8Yyr1XPKWOui+4FIqo1MPMukVPzbJcjX5pZwrRoHg/Q0pI7c8vIcvMstVXwLcmzQb4VvEoZAuXISmdJ2Z0EJWX1uRrt8UCWBTE4KIXPJ99c6VzGk4hKwOCd4mSUsMrXFV/MgOnUNkBG8DBUci6wYRuQrMcq8vJAwbekjPes2FhVSqAv9FZnvy8lf1/L0+iC52NpgNE3gIEBrf2lACJyr2LS81rcqtJtXs74pmL3b9etnKpgLjRgOh63fb25WyNZ2GRkdGfbdjMXGqhdwltU+gbVUaiL3+59KWe4XsmnbWnAuHf5zPAyVpUTkQVYbV5Apa5x55qv0zqLSG+vOS4qNUQr14DpHKtljZ+OzwwpkrPmCljJl46N2QQk6348nvzjuuskGJeq0LVxx8Z+6xz/dCwNMHp6y79GTkQNqSrBG8AtAI4BMAB05tnuOgAvAXgZwJ3F7LviwbsSlVDZ/6tnZ87ZATl17Fz/g+cYYG1s2qy9GErPlW2Mjc9KrkdHLbtM7cfrVf3wh3OP666Xor05KCaLzhubSzj3Yv50bI9l+aVLvyMRUYVVK3hfDuBjAIZyBW8AXgC/AXAJgPkAjgBoL7TvigfvSkxxlivTtv4vbTluYkuvjj8XVSOR59ip14+NZew7cfVmHfcuN6f5NIzCAcVu3tDscd0Vn7O1ssoKiCWce6E/HRd/ByKiGis2eJdVsKaqL6rqSwU22wjgZVX9rap+AOBBADeWc1xHzHXVh3xVUXYFXNmVTCIwdg3i9Mgous8OoW19AP0D5trNedd3DgYz9u3ZtwfB0wchT5ltL1g75vEAHR2ZG2WP665U0V6VlFWwXcK5F/rTqYeBC0TU2KpRbb4CwCnL40jyd7VhjZDJ/+1zVW5nb15wbFGO/9WzC70HrvGgbV0L9j8j6f/go9ECw5ay9+31ZkSqor6L5NvIMICJCWD37uZcxqrEL3P5vii4/DsQEblAweAtIk+KyPM2t2KzZ7v/BW3HKInIThEJi0g4Vol0xSb45ovHs56L5k+pDAOIxjzQQDAjcFv3EY2aL7XasMHcfNaus1PxAqllUZmn3UbWRm7dmn+QeCNzaKw1l/IkokorGLxV9VpVXW1ze6zIY0QArLQ8bgMwmuNY96lqp6p2trS0FLn7Etj0Z+YbWj0xkfWc+oEFC8x9fehDwNKl6V3n+hKQvX8RMxtL/Yd+9dXA3r2zesUR8FdxXlD28zrO4wGCLYa5/rddlw4RURmq0W0+AuAyEblYROYD2AHg8Socdzab/szsX/n9MzFz+/bUc4quDR8gIDHgzBkYEETf+hB0S086qOaKf9n7DwbNrOz0aWBsDBgeNv+jn5WtTVYxoLKf13l1Myk7ETUi0TKyAhG5CcD/BdAC4HcADqvq74nIcgA/UNUbktvdAOBbMCvP71fVvy20787OTg2Hw3NuW06pqbX8fmByEggEYKggFjUQkBgmEEDbSvNatM8HnHzVgGfbzQiM/BzS3QXj/bMYeOZ/IYRudCGEwchH4fF5oC1m4VkoZMY/a3dp6pCBQAldqKpAXx/SO3zqqcr2v86pkZRTNGoG7tQfUiRiu/ALEZGViBxQ1c5C25Vbbf6oqrap6jmqGlTV30v+fjQVuJOPn1DVj6rqpcUE7opKTfy9dWs6K/IYcQR3DEBWtiGwvR9dXZpOQpd5YwiGfw5JmBlw7P/9G0LYgjjmISTdiG27A2hrgwz0Y3CXYXudc06XUlXNF6Rule565dzazmJvBhFVUFmZdyVVLPMGZmdFhw4B69alHxsnI4h5gmYSCjW7PZMZsA4OoX8g+bDzAwyNLDQDu9PZFTM392NvBhGVqCqZt2tlZ0Xt7ZljqJcFZpLQrIvR4pHkClKCoX3zId0Vyq6YubkfezOIqEKaM/MGZmdFhbKkXM9XMrti5kZE1FSYeRdiyYrsxmdbGXED0e6boSvM5RyjY8bMJejUNfSJifR16SJX3CypjURERCnNG7yTskf0jI1lBl3DAAZ642jb/xP0J55E/5570HahzIz+ydqBETfyjhByLLATEVHTap5u8xxd0Na6MBEz2e3uNi9zezxmMF+5UpFICHw4C4UgAd9MDRmiMFasRCyxBAHv65g4PIq2dS22dWapOJ8a/ZU6BhEREcBu80x5Jsyw1oWpAolE5kxrO3YAhiEAFJuvVnT1eDNqyAx/AAMLn0UbIuhfOAL/x/0568w4kRkRETmhOTLvAsOuUmtybN9uzniWmmRlYiLzZadOmcHYmsCbu1bE4wKfTxGJCFpa7OvMNHPUGee9JiKiDMy8rQoMu/J4gGXLzGBqnWTFbmrT7BoycxtJbiO2q4CmcMEKIiJyQnNk3sCch10V8zLHR3RxiBgRUVNi5p1tjsOuinmZoyO6uKAFEREV0DzB2y1Y1UZERAUweNcbTotKREQF+GrdAMqSqmrjNW8iIsqBwbsepS6iExER2WC3ORERkcsweBMREbkMgzcREZHLMHgTERG5DIM3ERGRyzB4ExERuQyDNxERkcsweB
MREbkMgzcREZHL1O2SoCISA/BardtRIj+AyVo3ogp4no2F59l4muVcG/E8L1LVlkIb1W3wdiMRCRezDqvb8TwbC8+z8TTLuTbLedphtzkREZHLMHgTERG5DIO3s+6rdQOqhOfZWHiejadZzrVZznMWXvMmIiJyGWbeRERELsPgTURE5DIM3mUQkVtE5JiIGCKSc7iCiLwqIkdF5LCIhKvZRieUcJ7XichLIvKyiNxZzTY6QUQuEJFficiJ5M8lObZLJD/LwyLyeLXbOVeFPh8ROUdEHko+/4yIrKp+K8tXxHneLiIxy2f4pVq0s1wicr+ITIjI8zmeFxG5N/k+PCci66vdRicUcZ79IjJt+TzvqnYba4HBuzzPA7gZwNNFbDugqmtdOiax4HmKiBfAdwBcD6AdwG0i0l6d5jnmTgC7VPUyALuSj+28m/ws16rqp6rXvLkr8vP5IoA3VPUjAP4RwN9Vt5XlK+Hv8CHLZ/iDqjbSOT8EcF2e568HcFnythPAP1WhTZXwQ+Q/TwDYY/k876lCm2qOwbsMqvqiqr5U63ZUWpHnuRHAy6r6W1X9AMCDAG6sfOscdSOAHyXv/wjAp2vYFqcV8/lYz/8RANeIiFSxjU5ohL/Doqjq0wBez7PJjQD+RU37AZwvIq3VaZ1zijjPpsTgXR0K4JcickBEdta6MRWyAsApy+NI8nduElTVMQBI/gzk2O5cEQmLyH4RcUuAL+bzSW+jqnEA0wCWVqV1zin27/CPkl3Jj4jIyuo0reoa4d9ksTaLyBER+YWIdNS6MdXgq3UD6p2IPAlgmc1T/0NVHytyN92qOioiAQC/EpHjyW+TdcOB87TL0OpuHGK+8yxhNxcmP89LAOwWkaOq+htnWlgxxXw+rvgMCyjmHP4dwI9V9X0R+TLM3oatFW9Z9TXC51mMgzDnAz8jIjcA+BnMSwUNjcG7AFW91oF9jCZ/TojIozC79uoqeDtwnhEA1gymDcBomft0XL7zFJGoiLSq6liye3Eixz5Sn+dvRWQIwDoA9R68i/l8UttERMQHYDHc111Z8DxVdcry8Ptw4bX9Irni32S5VPVNy/0nROS7IuJX1UZbsCQDu80rTEQWiMii1H0An4RZANZoRgBcJiIXi8h8ADsAuKYSO+lxAJ9P3v88gFk9DiKyRETOSd73A+gG8ELVWjh3xXw+1vPfBmC3um8Wp4LnmXXd91MAXqxi+6rpcQCfS1adbwIwnbos1EhEZFmqNkNENsKMa1P5X9UAVJW3Od4A3ATz2+37AKIA/jP5++UAnkjevwTAkeTtGMxu6Jq33enzTD6+AcCvYWahbjzPpTCrzE8kf16Q/H0ngB8k73cBOJr8PI8C+GKt213C+c36fADcA+BTyfvnAvgJgJcBPAvgklq3uULn+b+T/xaPABgE8PFat3mO5/ljAGMAzib/fX4RwJcBfDn5vMCsvP9N8m+1s9ZtrtB5/rnl89wPoKvWba7GjdOjEhERuQy7zYmIiFyGwZuIiMhlGLyJiIhchsGbiIjIZRi8iYiIXIbBm4iIyGUYvImIiFzm/wOL2Kxp0Z5jVAAAAABJRU5ErkJggg==\n", "text/plain": [ "<Figure size 576x360 with 1 Axes>" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "from sklearn.preprocessing import scale\n", "from sklearn.model_selection import train_test_split\n", "\n", "X = scale(X)\n", "Y = scale(Y)\n", "X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.5)\n", "\n", "plt.figure(figsize=(8,5))\n", "plt.scatter(X_train, Y_train, color='red', s=5, label='training')\n", "plt.scatter(X_test, Y_test, color='blue', s=5, label='validation')\n", "plt.legend()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**c)** Create a feed-forward neural network with one hidden layer. Train the network using the training set. Play around with the number of neurons and the number of layers to give more o less complexity to the model." 
] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Using TensorFlow backend.\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "WARNING:tensorflow:From c:\\users\\dany\\miniconda3\\lib\\site-packages\\tensorflow\\python\\framework\\op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Colocations handled automatically by placer.\n", "_________________________________________________________________\n", "Layer (type) Output Shape Param # \n", "=================================================================\n", "dense_1 (Dense) (None, 10) 20 \n", "_________________________________________________________________\n", "dense_2 (Dense) (None, 5) 55 \n", "_________________________________________________________________\n", "dense_3 (Dense) (None, 1) 6 \n", "=================================================================\n", "Total params: 81\n", "Trainable params: 81\n", "Non-trainable params: 0\n", "_________________________________________________________________\n" ] } ], "source": [ "import keras\n", "from keras.models import Sequential\n", "from keras.layers import Dense\n", "from keras.optimizers import adam\n", "\n", "\n", "model = Sequential()\n", "\n", "hidden1 = Dense(10, input_dim=1, activation='tanh')\n", "hidden2 = Dense(5, activation='tanh')\n", "output = Dense(1, activation='linear')\n", "\n", "model.add(hidden1) \n", "model.add(hidden2) \n", "model.add(output)\n", "\n", "model.compile(optimizer=adam(0.01), loss='mse')\n", "\n", "model.summary()" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "WARNING:tensorflow:From c:\\users\\dany\\miniconda3\\lib\\site-packages\\tensorflow\\python\\ops\\math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Use tf.cast instead.\n", "Train on 150 samples, validate on 150 samples\n", "Epoch 1/5000\n", "150/150 [==============================] - 0s 3ms/step - loss: 2.2133 - val_loss: 1.6177\n", "Epoch 2/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.9431 - val_loss: 0.5719\n", "Epoch 3/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.2933 - val_loss: 0.1743\n", "Epoch 4/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.1289 - val_loss: 0.1694\n", "Epoch 5/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.1728 - val_loss: 0.1967\n", "Epoch 6/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.1751 - val_loss: 0.1644\n", "Epoch 7/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.1238 - val_loss: 0.1238\n", "Epoch 8/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0856 - val_loss: 0.1144\n", "Epoch 9/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0786 - val_loss: 0.1222\n", "Epoch 10/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0801 - val_loss: 0.1193\n", "Epoch 11/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0745 - val_loss: 0.1060\n", "Epoch 12/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0665 - val_loss: 0.0918\n", "Epoch 13/5000\n", "150/150 
[==============================] - 0s 73us/step - loss: 0.0598 - val_loss: 0.0838\n", "Epoch 14/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0570 - val_loss: 0.0788\n", "Epoch 15/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0548 - val_loss: 0.0729\n", "Epoch 16/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0506 - val_loss: 0.0704\n", "Epoch 17/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0480 - val_loss: 0.0677\n", "Epoch 18/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0463 - val_loss: 0.0664\n", "Epoch 19/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0446 - val_loss: 0.0622\n", "Epoch 20/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0429 - val_loss: 0.0598\n", "Epoch 21/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0413 - val_loss: 0.0553\n", "Epoch 22/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0405 - val_loss: 0.0544\n", "Epoch 23/5000\n", "150/150 [==============================] - 0s 147us/step - loss: 0.0388 - val_loss: 0.0520\n", "Epoch 24/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0380 - val_loss: 0.0504\n", "Epoch 25/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0374 - val_loss: 0.0488\n", "Epoch 26/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0365 - val_loss: 0.0473\n", "Epoch 27/5000\n", "150/150 [==============================] - 0s 566us/step - loss: 0.0357 - val_loss: 0.0460\n", "Epoch 28/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0354 - val_loss: 0.0451\n", "Epoch 29/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0347 - val_loss: 0.0453\n", "Epoch 30/5000\n", "150/150 [==============================] - 0s 260us/step - loss: 0.0349 - val_loss: 0.0444\n", "Epoch 31/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0341 - val_loss: 0.0425\n", "Epoch 32/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0341 - val_loss: 0.0420\n", "Epoch 33/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0339 - val_loss: 0.0426\n", "Epoch 34/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0335 - val_loss: 0.0411\n", "Epoch 35/5000\n", "150/150 [==============================] - 0s 260us/step - loss: 0.0332 - val_loss: 0.0400\n", "Epoch 36/5000\n", "150/150 [==============================] - 0s 300us/step - loss: 0.0331 - val_loss: 0.0400\n", "Epoch 37/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0329 - val_loss: 0.0395\n", "Epoch 38/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0325 - val_loss: 0.0391\n", "Epoch 39/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0329 - val_loss: 0.0393\n", "Epoch 40/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0326 - val_loss: 0.0381\n", "Epoch 41/5000\n", "150/150 [==============================] - 0s 160us/step - loss: 0.0322 - val_loss: 0.0383\n", "Epoch 42/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0320 - val_loss: 0.0378\n", "Epoch 43/5000\n", "150/150 [==============================] - 0s 227us/step - loss: 0.0323 - val_loss: 0.0370\n", "Epoch 44/5000\n", "150/150 
[==============================] - 0s 107us/step - loss: 0.0319 - val_loss: 0.0383\n",
"[... training log truncated: epochs 45-652 continue in the same fashion, with loss decreasing gradually from ~0.032 to ~0.027 and val_loss from ~0.038 to ~0.027 ...]\n",
"Epoch 653/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0280 - 
val_loss: 0.0263\n", "Epoch 654/5000\n", "150/150 [==============================] - 0s 160us/step - loss: 0.0270 - val_loss: 0.0281\n", "Epoch 655/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0268 - val_loss: 0.0271\n", "Epoch 656/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0266 - val_loss: 0.0269\n", "Epoch 657/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0272 - val_loss: 0.0270\n", "Epoch 658/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0267 - val_loss: 0.0268\n", "Epoch 659/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0270 - val_loss: 0.0268\n", "Epoch 660/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0269 - val_loss: 0.0278\n", "Epoch 661/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0270 - val_loss: 0.0279\n", "Epoch 662/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0266 - val_loss: 0.0264\n", "Epoch 663/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0278 - val_loss: 0.0267\n", "Epoch 664/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0266 - val_loss: 0.0298\n", "Epoch 665/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0276 - val_loss: 0.0268\n", "Epoch 666/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0280 - val_loss: 0.0262\n", "Epoch 667/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0274 - val_loss: 0.0294\n", "Epoch 668/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0280 - val_loss: 0.0272\n", "Epoch 669/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0275 - val_loss: 0.0265\n", "Epoch 670/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0267 - val_loss: 0.0280\n", "Epoch 671/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0264 - val_loss: 0.0264\n", "Epoch 672/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.020 - 0s 80us/step - loss: 0.0271 - val_loss: 0.0271\n", "Epoch 673/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0265 - val_loss: 0.0270\n", "Epoch 674/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.028 - 0s 93us/step - loss: 0.0265 - val_loss: 0.0267\n", "Epoch 675/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0266 - val_loss: 0.0265\n", "Epoch 676/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0264 - val_loss: 0.0269\n", "Epoch 677/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0270 - val_loss: 0.0262\n", "Epoch 678/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0272 - val_loss: 0.0278\n", "Epoch 679/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0268 - val_loss: 0.0266\n", "Epoch 680/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0268 - val_loss: 0.0275\n", "Epoch 681/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0268 - val_loss: 0.0271\n", "Epoch 682/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0273 - val_loss: 0.0268\n", "Epoch 683/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0269 - val_loss: 0.0274\n", "Epoch 684/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0276 - val_loss: 0.0276\n", "Epoch 685/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0279 - val_loss: 0.0273\n", "Epoch 686/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0263 - val_loss: 0.0267\n", "Epoch 687/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0268 - val_loss: 0.0265\n", "Epoch 688/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0268 - val_loss: 0.0265\n", "Epoch 689/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0279 - val_loss: 0.0264\n", "Epoch 690/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0259 - val_loss: 0.0295\n", "Epoch 691/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0277 - val_loss: 0.0260\n", "Epoch 692/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0285 - val_loss: 0.0261\n", "Epoch 693/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0276 - val_loss: 0.0272\n", "Epoch 694/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0268 - val_loss: 0.0265\n", "Epoch 695/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 87us/step - loss: 0.0265 - val_loss: 0.0264\n", "Epoch 696/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.017 - 0s 73us/step - loss: 0.0267 - val_loss: 0.0280\n", "Epoch 697/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0265 - val_loss: 0.0260\n", "Epoch 698/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0265 - val_loss: 0.0270\n", "Epoch 699/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0266 - val_loss: 0.0267\n", "Epoch 700/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0276 - val_loss: 0.0269\n", "Epoch 701/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0266 - val_loss: 0.0269\n", "Epoch 702/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0266 - val_loss: 0.0264\n", "Epoch 703/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0271 - val_loss: 0.0272\n", "Epoch 704/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0264 - val_loss: 0.0260\n", "Epoch 705/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0266 - val_loss: 0.0262\n", "Epoch 706/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0276 - val_loss: 0.0287\n", "Epoch 707/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0267 - val_loss: 0.0259\n", "Epoch 708/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0266 - val_loss: 0.0271\n", "Epoch 709/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0273 - val_loss: 0.0274\n", "Epoch 710/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0259 - val_loss: 0.0263\n", "Epoch 711/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0267 - val_loss: 0.0260\n", "Epoch 712/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0256 - val_loss: 0.0286\n", "Epoch 713/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0279 - val_loss: 0.0266\n", "Epoch 714/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0269 - val_loss: 0.0265\n", "Epoch 715/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0262 - val_loss: 0.0261\n", "Epoch 716/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0263 - val_loss: 0.0260\n", "Epoch 717/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0306 - val_loss: 0.0269\n", "Epoch 718/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0291 - val_loss: 0.0263\n", "Epoch 719/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0280 - val_loss: 0.0295\n", "Epoch 720/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0277 - val_loss: 0.0254\n", "Epoch 721/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0263 - val_loss: 0.0268\n", "Epoch 722/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0261 - val_loss: 0.0266\n", "Epoch 723/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0263 - val_loss: 0.0263\n", "Epoch 724/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0265 - val_loss: 0.0268\n", "Epoch 725/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0268 - val_loss: 0.0253\n", "Epoch 726/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0261 - val_loss: 0.0270\n", "Epoch 727/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0261 - val_loss: 0.0257\n", "Epoch 728/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0259 - val_loss: 0.0263\n", "Epoch 729/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0259 - val_loss: 0.0255\n", "Epoch 730/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0259 - val_loss: 0.0255\n", "Epoch 731/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0274 - val_loss: 0.0268\n", "Epoch 732/5000\n", "150/150 [==============================] - 0s 160us/step - loss: 0.0277 - val_loss: 0.0254\n", "Epoch 733/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0282 - val_loss: 0.0254\n", "Epoch 734/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0271 - val_loss: 0.0270\n", "Epoch 735/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0274 - val_loss: 0.0257\n", "Epoch 736/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0259 - val_loss: 0.0273\n", "Epoch 737/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0272 - val_loss: 0.0258\n", "Epoch 738/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0260 - val_loss: 0.0249\n", "Epoch 739/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0278 - val_loss: 0.0256\n", "Epoch 740/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0261 - val_loss: 0.0249\n", "Epoch 741/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0259 - val_loss: 0.0268\n", "Epoch 742/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0266 - val_loss: 0.0259\n", "Epoch 743/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0255 - val_loss: 0.0252\n", "Epoch 744/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0260 - val_loss: 0.0254\n", "Epoch 
745/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0262 - val_loss: 0.0253\n", "Epoch 746/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0260 - val_loss: 0.0259\n", "Epoch 747/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0256 - val_loss: 0.0250\n", "Epoch 748/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0266 - val_loss: 0.0248\n", "Epoch 749/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0266 - val_loss: 0.0282\n", "Epoch 750/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0259 - val_loss: 0.0246\n", "Epoch 751/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0264 - val_loss: 0.0251\n", "Epoch 752/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0256 - val_loss: 0.0246\n", "Epoch 753/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0258 - val_loss: 0.0246\n", "Epoch 754/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0257 - val_loss: 0.0250\n", "Epoch 755/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0251 - val_loss: 0.0244\n", "Epoch 756/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0253 - val_loss: 0.0255\n", "Epoch 757/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0261 - val_loss: 0.0242\n", "Epoch 758/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0269 - val_loss: 0.0242\n", "Epoch 759/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0258 - val_loss: 0.0261\n", "Epoch 760/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0253 - val_loss: 0.0242\n", "Epoch 761/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0255 - val_loss: 0.0249\n", "Epoch 762/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0260 - val_loss: 0.0250\n", "Epoch 763/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0259 - val_loss: 0.0243\n", "Epoch 764/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0254 - val_loss: 0.0257\n", "Epoch 765/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0264 - val_loss: 0.0238\n", "Epoch 766/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0252 - val_loss: 0.0263\n", "Epoch 767/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0255 - val_loss: 0.0239\n", "Epoch 768/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0256 - val_loss: 0.0242\n", "Epoch 769/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0248 - val_loss: 0.0248\n", "Epoch 770/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0251 - val_loss: 0.0239\n", "Epoch 771/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0247 - val_loss: 0.0247\n", "Epoch 772/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0251 - val_loss: 0.0243\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 773/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0286 - val_loss: 0.0244\n", "Epoch 774/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0277 - val_loss: 0.0248\n", "Epoch 775/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0303 - val_loss: 0.0240\n", "Epoch 776/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0265 - val_loss: 0.0265\n", "Epoch 777/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0255 - val_loss: 0.0246\n", "Epoch 778/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0249 - val_loss: 0.0267\n", "Epoch 779/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0263 - val_loss: 0.0243\n", "Epoch 780/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0254 - val_loss: 0.0239\n", "Epoch 781/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0244 - val_loss: 0.0240\n", "Epoch 782/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0246 - val_loss: 0.0243\n", "Epoch 783/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0250 - val_loss: 0.0233\n", "Epoch 784/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0252 - val_loss: 0.0234\n", "Epoch 785/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0245 - val_loss: 0.0254\n", "Epoch 786/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0245 - val_loss: 0.0232\n", "Epoch 787/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0248 - val_loss: 0.0230\n", "Epoch 788/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0249 - val_loss: 0.0229\n", "Epoch 789/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0247 - val_loss: 0.0238\n", "Epoch 790/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0236 - val_loss: 0.0228\n", "Epoch 791/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0251 - val_loss: 0.0233\n", "Epoch 792/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0245 - val_loss: 0.0232\n", "Epoch 793/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0247 - val_loss: 0.0236\n", "Epoch 794/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0258 - val_loss: 0.0226\n", "Epoch 795/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0240 - val_loss: 0.0245\n", "Epoch 796/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0240 - val_loss: 0.0228\n", "Epoch 797/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0244 - val_loss: 0.0231\n", "Epoch 798/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0238 - val_loss: 0.0224\n", "Epoch 799/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.016 - 0s 67us/step - loss: 0.0239 - val_loss: 0.0225\n", "Epoch 800/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0237 - val_loss: 0.0229\n", "Epoch 801/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0242 - val_loss: 0.0226\n", "Epoch 802/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0245 - val_loss: 0.0224\n", "Epoch 803/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0249 - val_loss: 0.0233\n", "Epoch 804/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0245 - val_loss: 0.0220\n", "Epoch 805/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0233 - val_loss: 
0.0235\n", "Epoch 806/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0250 - val_loss: 0.0234\n", "Epoch 807/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0239 - val_loss: 0.0219\n", "Epoch 808/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0242 - val_loss: 0.0236\n", "Epoch 809/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0237 - val_loss: 0.0219\n", "Epoch 810/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0241 - val_loss: 0.0224\n", "Epoch 811/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0230 - val_loss: 0.0225\n", "Epoch 812/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0243 - val_loss: 0.0221\n", "Epoch 813/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0247 - val_loss: 0.0255\n", "Epoch 814/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0240 - val_loss: 0.0217\n", "Epoch 815/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0234 - val_loss: 0.0230\n", "Epoch 816/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0244 - val_loss: 0.0222\n", "Epoch 817/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0231 - val_loss: 0.0217\n", "Epoch 818/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0243 - val_loss: 0.0264\n", "Epoch 819/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0256 - val_loss: 0.0222\n", "Epoch 820/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0238 - val_loss: 0.0214\n", "Epoch 821/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0227 - val_loss: 0.0225\n", "Epoch 822/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0231 - val_loss: 0.0211\n", "Epoch 823/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0229 - val_loss: 0.0216\n", "Epoch 824/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0230 - val_loss: 0.0217\n", "Epoch 825/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0228 - val_loss: 0.0209\n", "Epoch 826/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0225 - val_loss: 0.0228\n", "Epoch 827/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0229 - val_loss: 0.0209\n", "Epoch 828/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0231 - val_loss: 0.0216\n", "Epoch 829/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0239 - val_loss: 0.0223\n", "Epoch 830/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0244 - val_loss: 0.0212\n", "Epoch 831/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0214 - val_loss: 0.0255\n", "Epoch 832/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0248 - val_loss: 0.0216\n", "Epoch 833/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0232 - val_loss: 0.0214\n", "Epoch 834/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0230 - val_loss: 0.0217\n", "Epoch 835/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0242 - val_loss: 0.0206\n", "Epoch 836/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0257 - 
val_loss: 0.0223\n", "Epoch 837/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0234 - val_loss: 0.0211\n", "Epoch 838/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0231 - val_loss: 0.0238\n", "Epoch 839/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0234 - val_loss: 0.0215\n", "Epoch 840/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0232 - val_loss: 0.0215\n", "Epoch 841/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0230 - val_loss: 0.0219\n", "Epoch 842/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0227 - val_loss: 0.0214\n", "Epoch 843/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0251 - val_loss: 0.0219\n", "Epoch 844/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0238 - val_loss: 0.0209\n", "Epoch 845/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0214 - val_loss: 0.0235\n", "Epoch 846/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0224 - val_loss: 0.0214\n", "Epoch 847/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0223 - val_loss: 0.0206\n", "Epoch 848/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0229 - val_loss: 0.0230\n", "Epoch 849/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0257 - val_loss: 0.0210\n", "Epoch 850/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 87us/step - loss: 0.0230 - val_loss: 0.0220\n", "Epoch 851/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0220 - val_loss: 0.0214\n", "Epoch 852/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0231 - val_loss: 0.0214\n", "Epoch 853/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0211 - val_loss: 0.0209\n", "Epoch 854/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0220 - val_loss: 0.0215\n", "Epoch 855/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0216 - val_loss: 0.0211\n", "Epoch 856/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0218 - val_loss: 0.0207\n", "Epoch 857/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0224 - val_loss: 0.0206\n", "Epoch 858/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0218 - val_loss: 0.0206\n", "Epoch 859/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0215 - val_loss: 0.0213\n", "Epoch 860/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0222 - val_loss: 0.0206\n", "Epoch 861/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0238 - val_loss: 0.0201\n", "Epoch 862/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0224 - val_loss: 0.0264\n", "Epoch 863/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0229 - val_loss: 0.0215\n", "Epoch 864/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0232 - val_loss: 0.0224\n", "Epoch 865/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0234 - val_loss: 0.0234\n", "Epoch 866/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0211 - val_loss: 0.0211\n", "Epoch 867/5000\n", 
"150/150 [==============================] - 0s 73us/step - loss: 0.0230 - val_loss: 0.0212\n", "Epoch 868/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0213 - val_loss: 0.0203\n", "Epoch 869/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0213 - val_loss: 0.0200\n", "Epoch 870/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0208 - val_loss: 0.0212\n", "Epoch 871/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0215 - val_loss: 0.0204\n", "Epoch 872/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0210 - val_loss: 0.0203\n", "Epoch 873/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0212 - val_loss: 0.0199\n", "Epoch 874/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0204 - val_loss: 0.0218\n", "Epoch 875/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0214 - val_loss: 0.0205\n", "Epoch 876/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0206 - val_loss: 0.0201\n", "Epoch 877/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0214 - val_loss: 0.0209\n", "Epoch 878/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0209 - val_loss: 0.0201\n", "Epoch 879/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0207 - val_loss: 0.0199\n", "Epoch 880/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0211 - val_loss: 0.0206\n", "Epoch 881/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0213 - val_loss: 0.0199\n", "Epoch 882/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0209 - val_loss: 0.0227\n", "Epoch 883/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0216 - val_loss: 0.0198\n", "Epoch 884/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.022 - 0s 87us/step - loss: 0.0211 - val_loss: 0.0219\n", "Epoch 885/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0213 - val_loss: 0.0201\n", "Epoch 886/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0207 - val_loss: 0.0214\n", "Epoch 887/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0211 - val_loss: 0.0203\n", "Epoch 888/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0215 - val_loss: 0.0198\n", "Epoch 889/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0209 - val_loss: 0.0213\n", "Epoch 890/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0206 - val_loss: 0.0201\n", "Epoch 891/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0206 - val_loss: 0.0204\n", "Epoch 892/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0211 - val_loss: 0.0199\n", "Epoch 893/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0208 - val_loss: 0.0214\n", "Epoch 894/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0218 - val_loss: 0.0199\n", "Epoch 895/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0206 - val_loss: 0.0202\n", "Epoch 896/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0207 - val_loss: 0.0217\n", "Epoch 897/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0216 - 
val_loss: 0.0198\n", "Epoch 898/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0212 - val_loss: 0.0205\n", "Epoch 899/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0214 - val_loss: 0.0204\n", "Epoch 900/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0225 - val_loss: 0.0201\n", "Epoch 901/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0205 - val_loss: 0.0222\n", "Epoch 902/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0211 - val_loss: 0.0202\n", "Epoch 903/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0208 - val_loss: 0.0207\n", "Epoch 904/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0208 - val_loss: 0.0205\n", "Epoch 905/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0216 - val_loss: 0.0198\n", "Epoch 906/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0221 - val_loss: 0.0212\n", "Epoch 907/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0210 - val_loss: 0.0203\n", "Epoch 908/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0207 - val_loss: 0.0221\n", "Epoch 909/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0203 - val_loss: 0.0207\n", "Epoch 910/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0206 - val_loss: 0.0202\n", "Epoch 911/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0213 - val_loss: 0.0199\n", "Epoch 912/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0206 - val_loss: 0.0233\n", "Epoch 913/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0212 - val_loss: 0.0197\n", "Epoch 914/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0207 - val_loss: 0.0205\n", "Epoch 915/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0212 - val_loss: 0.0199\n", "Epoch 916/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0224 - val_loss: 0.0206\n", "Epoch 917/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0206 - val_loss: 0.0201\n", "Epoch 918/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0205 - val_loss: 0.0207\n", "Epoch 919/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0202 - val_loss: 0.0205\n", "Epoch 920/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0210 - val_loss: 0.0197\n", "Epoch 921/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0203 - val_loss: 0.0205\n", "Epoch 922/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0201 - val_loss: 0.0203\n", "Epoch 923/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0203 - val_loss: 0.0204\n", "Epoch 924/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0204 - val_loss: 0.0203\n", "Epoch 925/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0201 - val_loss: 0.0214\n", "Epoch 926/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0219 - val_loss: 0.0202\n", "Epoch 927/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0207 - val_loss: 0.0199\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 928/5000\n", 
"150/150 [==============================] - 0s 73us/step - loss: 0.0213 - val_loss: 0.0212\n", "Epoch 929/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0207 - val_loss: 0.0202\n", "Epoch 930/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0218 - val_loss: 0.0206\n", "Epoch 931/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0203 - val_loss: 0.0211\n", "Epoch 932/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0203 - val_loss: 0.0206\n", "Epoch 933/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0211 - val_loss: 0.0206\n", "Epoch 934/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0205 - val_loss: 0.0211\n", "Epoch 935/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0208 - val_loss: 0.0199\n", "Epoch 936/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0199 - val_loss: 0.0215\n", "Epoch 937/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0200 - val_loss: 0.0198\n", "Epoch 938/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0200 - val_loss: 0.0200\n", "Epoch 939/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0215 - val_loss: 0.0209\n", "Epoch 940/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0206 - val_loss: 0.0206\n", "Epoch 941/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0209 - val_loss: 0.0197\n", "Epoch 942/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.021 - 0s 147us/step - loss: 0.0202 - val_loss: 0.0217\n", "Epoch 943/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0200 - val_loss: 0.0213\n", "Epoch 944/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0210 - val_loss: 0.0202\n", "Epoch 945/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0220 - val_loss: 0.0206\n", "Epoch 946/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0198 - val_loss: 0.0232\n", "Epoch 947/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0203 - val_loss: 0.0209\n", "Epoch 948/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0210 - val_loss: 0.0213\n", "Epoch 949/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0210 - val_loss: 0.0202\n", "Epoch 950/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0213 - val_loss: 0.0208\n", "Epoch 951/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0212 - val_loss: 0.0237\n", "Epoch 952/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0212 - val_loss: 0.0209\n", "Epoch 953/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0209 - val_loss: 0.0216\n", "Epoch 954/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0195 - val_loss: 0.0211\n", "Epoch 955/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0201 - val_loss: 0.0203\n", "Epoch 956/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0221 - val_loss: 0.0208\n", "Epoch 957/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0195 - val_loss: 0.0218\n", "Epoch 958/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0200 - 
val_loss: 0.0206\n", "Epoch 959/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0202 - val_loss: 0.0206\n", "Epoch 960/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0198 - val_loss: 0.0201\n", "Epoch 961/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0195 - val_loss: 0.0203\n", "Epoch 962/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0200 - val_loss: 0.0202\n", "Epoch 963/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0203 - val_loss: 0.0203\n", "Epoch 964/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.016 - 0s 73us/step - loss: 0.0221 - val_loss: 0.0214\n", "Epoch 965/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0207 - val_loss: 0.0202\n", "Epoch 966/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0206 - val_loss: 0.0232\n", "Epoch 967/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0208 - val_loss: 0.0204\n", "Epoch 968/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0214 - val_loss: 0.0219\n", "Epoch 969/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0206 - val_loss: 0.0205\n", "Epoch 970/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0201 - val_loss: 0.0200\n", "Epoch 971/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0211 - val_loss: 0.0222\n", "Epoch 972/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0207 - val_loss: 0.0204\n", "Epoch 973/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0195 - val_loss: 0.0222\n", "Epoch 974/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0197 - val_loss: 0.0202\n", "Epoch 975/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0198 - val_loss: 0.0205\n", "Epoch 976/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0215 - val_loss: 0.0215\n", "Epoch 977/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0200 - val_loss: 0.0205\n", "Epoch 978/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0198 - val_loss: 0.0227\n", "Epoch 979/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0198 - val_loss: 0.0204\n", "Epoch 980/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0204 - val_loss: 0.0204\n", "Epoch 981/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0200 - val_loss: 0.0211\n", "Epoch 982/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.016 - 0s 80us/step - loss: 0.0200 - val_loss: 0.0207\n", "Epoch 983/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0213 - val_loss: 0.0221\n", "Epoch 984/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0193 - val_loss: 0.0205\n", "Epoch 985/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0204 - val_loss: 0.0221\n", "Epoch 986/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0206 - val_loss: 0.0209\n", "Epoch 987/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0196 - val_loss: 0.0199\n", "Epoch 988/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0196 - val_loss: 0.0210\n", "Epoch 989/5000\n", "150/150 
[==============================] - 0s 60us/step - loss: 0.0196 - val_loss: 0.0202\n", "Epoch 990/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0192 - val_loss: 0.0203\n", "Epoch 991/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0203 - val_loss: 0.0212\n", "Epoch 992/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0193 - val_loss: 0.0202\n", "Epoch 993/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0196 - val_loss: 0.0202\n", "Epoch 994/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0194 - val_loss: 0.0204\n", "Epoch 995/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0201 - val_loss: 0.0207\n", "Epoch 996/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0192 - val_loss: 0.0200\n", "Epoch 997/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0193 - val_loss: 0.0218\n", "Epoch 998/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0230 - val_loss: 0.0206\n", "Epoch 999/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0200 - val_loss: 0.0210\n", "Epoch 1000/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0206 - val_loss: 0.0203\n", "Epoch 1001/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0212 - val_loss: 0.0229\n", "Epoch 1002/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.021 - 0s 73us/step - loss: 0.0218 - val_loss: 0.0203\n", "Epoch 1003/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0204 - val_loss: 0.0243\n", "Epoch 1004/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0211 - val_loss: 0.0207\n", "Epoch 1005/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 67us/step - loss: 0.0189 - val_loss: 0.0244\n", "Epoch 1006/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0204 - val_loss: 0.0207\n", "Epoch 1007/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0200 - val_loss: 0.0202\n", "Epoch 1008/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0205 - val_loss: 0.0211\n", "Epoch 1009/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0187 - val_loss: 0.0201\n", "Epoch 1010/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0204 - val_loss: 0.0229\n", "Epoch 1011/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0199 - val_loss: 0.0203\n", "Epoch 1012/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0194 - val_loss: 0.0212\n", "Epoch 1013/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0198 - val_loss: 0.0203\n", "Epoch 1014/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0195 - val_loss: 0.0199\n", "Epoch 1015/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0211 - val_loss: 0.0225\n", "Epoch 1016/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0222 - val_loss: 0.0198\n", "Epoch 1017/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0205 - val_loss: 0.0271\n", "Epoch 1018/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0222 - val_loss: 0.0199\n", "Epoch 1019/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0210 - val_loss: 0.0228\n", "Epoch 1020/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0189 - val_loss: 0.0208\n", "Epoch 1021/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0216 - val_loss: 0.0225\n", "Epoch 1022/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0203 - val_loss: 0.0204\n", "Epoch 1023/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0198 - val_loss: 0.0208\n", "Epoch 1024/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0194 - val_loss: 0.0209\n", "Epoch 1025/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0190 - val_loss: 0.0209\n", "Epoch 1026/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0191 - val_loss: 0.0206\n", "Epoch 1027/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0192 - val_loss: 0.0207\n", "Epoch 1028/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0193 - val_loss: 0.0203\n", "Epoch 1029/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0202 - val_loss: 0.0205\n", "Epoch 1030/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0209 - val_loss: 0.0203\n", "Epoch 1031/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0188 - val_loss: 0.0225\n", "Epoch 1032/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0197 - val_loss: 0.0205\n", "Epoch 1033/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0199 - val_loss: 0.0199\n", "Epoch 1034/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0199 - val_loss: 0.0216\n", "Epoch 1035/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0195 - val_loss: 0.0207\n", "Epoch 1036/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0194 - val_loss: 0.0216\n", "Epoch 1037/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0192 - val_loss: 0.0207\n", "Epoch 1038/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0193 - val_loss: 0.0203\n", "Epoch 1039/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0237\n", "Epoch 1040/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0190 - val_loss: 0.0202\n", "Epoch 1041/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0194 - val_loss: 0.0205\n", "Epoch 1042/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0201 - val_loss: 0.0211\n", "Epoch 1043/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0197 - val_loss: 0.0199\n", "Epoch 1044/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0192 - val_loss: 0.0232\n", "Epoch 1045/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0194 - val_loss: 0.0200\n", "Epoch 1046/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0195 - val_loss: 0.0210\n", "Epoch 1047/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0188 - val_loss: 0.0199\n", "Epoch 1048/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0195 - val_loss: 0.0208\n", "Epoch 1049/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0199 - 
val_loss: 0.0201\n", "Epoch 1050/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0188 - val_loss: 0.0226\n", "Epoch 1051/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0220 - val_loss: 0.0204\n", "Epoch 1052/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0217 - val_loss: 0.0211\n", "Epoch 1053/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0207 - val_loss: 0.0220\n", "Epoch 1054/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0209 - val_loss: 0.0210\n", "Epoch 1055/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0195 - val_loss: 0.0236\n", "Epoch 1056/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0203\n", "Epoch 1057/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.014 - 0s 73us/step - loss: 0.0194 - val_loss: 0.0208\n", "Epoch 1058/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0190 - val_loss: 0.0202\n", "Epoch 1059/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0188 - val_loss: 0.0200\n", "Epoch 1060/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0209\n", "Epoch 1061/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0193 - val_loss: 0.0195\n", "Epoch 1062/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0184 - val_loss: 0.0215\n", "Epoch 1063/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0185 - val_loss: 0.0198\n", "Epoch 1064/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0192 - val_loss: 0.0208\n", "Epoch 1065/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.019 - 0s 67us/step - loss: 0.0192 - val_loss: 0.0209\n", "Epoch 1066/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0214 - val_loss: 0.0200\n", "Epoch 1067/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0197 - val_loss: 0.0217\n", "Epoch 1068/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0183 - val_loss: 0.0202\n", "Epoch 1069/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0188 - val_loss: 0.0218\n", "Epoch 1070/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0202\n", "Epoch 1071/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0185 - val_loss: 0.0196\n", "Epoch 1072/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0184 - val_loss: 0.0208\n", "Epoch 1073/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0187 - val_loss: 0.0201\n", "Epoch 1074/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0197 - val_loss: 0.0214\n", "Epoch 1075/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0191 - val_loss: 0.0195\n", "Epoch 1076/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0183 - val_loss: 0.0220\n", "Epoch 1077/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0196 - val_loss: 0.0200\n", "Epoch 1078/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0191 - val_loss: 0.0209\n", "Epoch 1079/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0206 - val_loss: 0.0216\n", 
"Epoch 1080/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0212 - val_loss: 0.0220\n", "Epoch 1081/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0194 - val_loss: 0.0197\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 1082/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0194 - val_loss: 0.0227\n", "Epoch 1083/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0208 - val_loss: 0.0206\n", "Epoch 1084/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0199 - val_loss: 0.0204\n", "Epoch 1085/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0197 - val_loss: 0.0225\n", "Epoch 1086/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0187 - val_loss: 0.0195\n", "Epoch 1087/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0187 - val_loss: 0.0211\n", "Epoch 1088/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0186 - val_loss: 0.0194\n", "Epoch 1089/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0181 - val_loss: 0.0203\n", "Epoch 1090/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0180 - val_loss: 0.0197\n", "Epoch 1091/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0185 - val_loss: 0.0201\n", "Epoch 1092/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0185 - val_loss: 0.0213\n", "Epoch 1093/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0195 - val_loss: 0.0193\n", "Epoch 1094/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0181 - val_loss: 0.0208\n", "Epoch 1095/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0186 - val_loss: 0.0201\n", "Epoch 1096/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0186 - val_loss: 0.0199\n", "Epoch 1097/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0185 - val_loss: 0.0193\n", "Epoch 1098/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0181 - val_loss: 0.0212\n", "Epoch 1099/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0181 - val_loss: 0.0193\n", "Epoch 1100/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0200\n", "Epoch 1101/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0195 - val_loss: 0.0224\n", "Epoch 1102/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0198 - val_loss: 0.0197\n", "Epoch 1103/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0181 - val_loss: 0.0268\n", "Epoch 1104/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0205 - val_loss: 0.0190\n", "Epoch 1105/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0191\n", "Epoch 1106/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0176 - val_loss: 0.0221\n", "Epoch 1107/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0189 - val_loss: 0.0206\n", "Epoch 1108/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0201 - val_loss: 0.0194\n", "Epoch 1109/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0197 - val_loss: 0.0238\n", "Epoch 
1110/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0199 - val_loss: 0.0192\n", "Epoch 1111/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0237\n", "Epoch 1112/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0185 - val_loss: 0.0192\n", "Epoch 1113/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0185 - val_loss: 0.0217\n", "Epoch 1114/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0191 - val_loss: 0.0195\n", "Epoch 1115/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0180 - val_loss: 0.0193\n", "Epoch 1116/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0180 - val_loss: 0.0208\n", "Epoch 1117/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0181 - val_loss: 0.0194\n", "Epoch 1118/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0178 - val_loss: 0.0199\n", "Epoch 1119/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0189 - val_loss: 0.0195\n", "Epoch 1120/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0201 - val_loss: 0.0208\n", "Epoch 1121/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0188 - val_loss: 0.0190\n", "Epoch 1122/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0196 - val_loss: 0.0211\n", "Epoch 1123/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0208 - val_loss: 0.0193\n", "Epoch 1124/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0211 - val_loss: 0.0228\n", "Epoch 1125/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0173 - val_loss: 0.0197\n", "Epoch 1126/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0197 - val_loss: 0.0209\n", "Epoch 1127/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0186 - val_loss: 0.0195\n", "Epoch 1128/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0183 - val_loss: 0.0209\n", "Epoch 1129/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0175 - val_loss: 0.0197\n", "Epoch 1130/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0182 - val_loss: 0.0211\n", "Epoch 1131/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0195 - val_loss: 0.0206\n", "Epoch 1132/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0187 - val_loss: 0.0193\n", "Epoch 1133/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0181 - val_loss: 0.0221\n", "Epoch 1134/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0184 - val_loss: 0.0196\n", "Epoch 1135/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0184 - val_loss: 0.0210\n", "Epoch 1136/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0194 - val_loss: 0.0194\n", "Epoch 1137/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0212 - val_loss: 0.0255\n", "Epoch 1138/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0197 - val_loss: 0.0203\n", "Epoch 1139/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0186 - val_loss: 0.0225\n", "Epoch 1140/5000\n", "150/150 [==============================] - 0s 67us/step - 
loss: 0.0194 - val_loss: 0.0206\n", "Epoch 1141/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0183 - val_loss: 0.0193\n", "Epoch 1142/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0200\n", "Epoch 1143/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0196\n", "Epoch 1144/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0176 - val_loss: 0.0201\n", "Epoch 1145/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0178 - val_loss: 0.0193\n", "Epoch 1146/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0178 - val_loss: 0.0206\n", "Epoch 1147/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0175 - val_loss: 0.0191\n", "Epoch 1148/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0180 - val_loss: 0.0204\n", "Epoch 1149/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0203\n", "Epoch 1150/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0175 - val_loss: 0.0195\n", "Epoch 1151/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0186 - val_loss: 0.0192\n", "Epoch 1152/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0182 - val_loss: 0.0196\n", "Epoch 1153/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0213\n", "Epoch 1154/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0178 - val_loss: 0.0191\n", "Epoch 1155/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0180 - val_loss: 0.0203\n", "Epoch 1156/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0176 - val_loss: 0.0190\n", "Epoch 1157/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0174 - val_loss: 0.0201\n", "Epoch 1158/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0193\n", "Epoch 1159/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0176 - val_loss: 0.0207\n", "Epoch 1160/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0176 - val_loss: 0.0194\n", "Epoch 1161/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0177 - val_loss: 0.0198\n", "Epoch 1162/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0193\n", "Epoch 1163/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0202\n", "Epoch 1164/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0177 - val_loss: 0.0189\n", "Epoch 1165/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0174 - val_loss: 0.0211\n", "Epoch 1166/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0173 - val_loss: 0.0190\n", "Epoch 1167/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0175 - val_loss: 0.0197\n", "Epoch 1168/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0191 - val_loss: 0.0206\n", "Epoch 1169/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0189\n", "Epoch 1170/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0181 - 
val_loss: 0.0215\n", "Epoch 1171/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0176 - val_loss: 0.0186\n", "Epoch 1172/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0177 - val_loss: 0.0188\n", "Epoch 1173/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0184 - val_loss: 0.0207\n", "Epoch 1174/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0188 - val_loss: 0.0186\n", "Epoch 1175/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0177 - val_loss: 0.0205\n", "Epoch 1176/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0174 - val_loss: 0.0195\n", "Epoch 1177/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0180 - val_loss: 0.0198\n", "Epoch 1178/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0181 - val_loss: 0.0212\n", "Epoch 1179/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0175 - val_loss: 0.0189\n", "Epoch 1180/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0171 - val_loss: 0.0212\n", "Epoch 1181/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0179 - val_loss: 0.0191\n", "Epoch 1182/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0174 - val_loss: 0.0197\n", "Epoch 1183/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0183 - val_loss: 0.0209\n", "Epoch 1184/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0193 - val_loss: 0.0185\n", "Epoch 1185/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0183 - val_loss: 0.0222\n", "Epoch 1186/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0190\n", "Epoch 1187/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0176 - val_loss: 0.0230\n", "Epoch 1188/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0184 - val_loss: 0.0189\n", "Epoch 1189/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0183 - val_loss: 0.0244\n", "Epoch 1190/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0185 - val_loss: 0.0185\n", "Epoch 1191/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0189 - val_loss: 0.0199\n", "Epoch 1192/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0174 - val_loss: 0.0191\n", "Epoch 1193/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0178 - val_loss: 0.0197\n", "Epoch 1194/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0186\n", "Epoch 1195/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0173 - val_loss: 0.0208\n", "Epoch 1196/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0194\n", "Epoch 1197/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0179 - val_loss: 0.0187\n", "Epoch 1198/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0171 - val_loss: 0.0215\n", "Epoch 1199/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0185 - val_loss: 0.0190\n", "Epoch 1200/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0175 - val_loss: 0.0204\n", "Epoch 1201/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0170 - val_loss: 0.0187\n", "Epoch 1202/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0180 - val_loss: 0.0202\n", "Epoch 1203/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0185 - val_loss: 0.0202\n", "Epoch 1204/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0175 - val_loss: 0.0187\n", "Epoch 1205/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.018 - 0s 73us/step - loss: 0.0186 - val_loss: 0.0218\n", "Epoch 1206/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0170 - val_loss: 0.0187\n", "Epoch 1207/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0193\n", "Epoch 1208/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0174 - val_loss: 0.0226\n", "Epoch 1209/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0171 - val_loss: 0.0183\n", "Epoch 1210/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0179 - val_loss: 0.0193\n", "Epoch 1211/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0206\n", "Epoch 1212/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0185\n", "Epoch 1213/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0187\n", "Epoch 1214/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0182 - val_loss: 0.0202\n", "Epoch 1215/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0189\n", "Epoch 1216/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0176 - val_loss: 0.0254\n", "Epoch 1217/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0186 - val_loss: 0.0183\n", "Epoch 1218/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0193\n", "Epoch 1219/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0179 - val_loss: 0.0197\n", "Epoch 1220/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0174 - val_loss: 0.0187\n", "Epoch 1221/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0206\n", "Epoch 1222/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0182 - val_loss: 0.0189\n", "Epoch 1223/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0185\n", "Epoch 1224/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0208\n", "Epoch 1225/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0179 - val_loss: 0.0184\n", "Epoch 1226/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0194 - val_loss: 0.0203\n", "Epoch 1227/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0179 - val_loss: 0.0185\n", "Epoch 1228/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0182 - val_loss: 0.0223\n", "Epoch 1229/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0176 - val_loss: 0.0185\n", "Epoch 1230/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0200 - val_loss: 0.0217\n", "Epoch 1231/5000\n", "150/150 [==============================] - 0s 73us/step - 
loss: 0.0176 - val_loss: 0.0182\n", "Epoch 1232/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0180 - val_loss: 0.0187\n", "Epoch 1233/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0177 - val_loss: 0.0197\n", "Epoch 1234/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0184 - val_loss: 0.0190\n", "Epoch 1235/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0172 - val_loss: 0.0210\n", "Epoch 1236/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0189\n", "Epoch 1237/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0176 - val_loss: 0.0202\n", "Epoch 1238/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0193\n", "Epoch 1239/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0190\n", "Epoch 1240/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0168 - val_loss: 0.0188\n", "Epoch 1241/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0168 - val_loss: 0.0191\n", "Epoch 1242/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0172 - val_loss: 0.0190\n", "Epoch 1243/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0168 - val_loss: 0.0184\n", "Epoch 1244/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0203\n", "Epoch 1245/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0182\n", "Epoch 1246/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0177 - val_loss: 0.0187\n", "Epoch 1247/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0171 - val_loss: 0.0190\n", "Epoch 1248/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0171 - val_loss: 0.0194\n", "Epoch 1249/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0177 - val_loss: 0.0184\n", "Epoch 1250/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0175 - val_loss: 0.0194\n", "Epoch 1251/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0170 - val_loss: 0.0198\n", "Epoch 1252/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0181\n", "Epoch 1253/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0170 - val_loss: 0.0203\n", "Epoch 1254/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0174 - val_loss: 0.0200\n", "Epoch 1255/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0177 - val_loss: 0.0183\n", "Epoch 1256/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0187 - val_loss: 0.0198\n", "Epoch 1257/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0174 - val_loss: 0.0200\n", "Epoch 1258/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0197\n", "Epoch 1259/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0176 - val_loss: 0.0194\n", "Epoch 1260/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0173 - val_loss: 0.0188\n", "Epoch 1261/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - 
val_loss: 0.0206\n", "Epoch 1262/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0184 - val_loss: 0.0180\n", "Epoch 1263/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0175 - val_loss: 0.0195\n", "Epoch 1264/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0184\n", "Epoch 1265/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0179 - val_loss: 0.0181\n", "Epoch 1266/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0181 - val_loss: 0.0230\n", "Epoch 1267/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0175 - val_loss: 0.0195\n", "Epoch 1268/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0187 - val_loss: 0.0214\n", "Epoch 1269/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0176 - val_loss: 0.0184\n", "Epoch 1270/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0198\n", "Epoch 1271/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0171 - val_loss: 0.0189\n", "Epoch 1272/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0184\n", "Epoch 1273/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0197\n", "Epoch 1274/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0187\n", "Epoch 1275/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0191\n", "Epoch 1276/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0185\n", "Epoch 1277/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0179 - val_loss: 0.0191\n", "Epoch 1278/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0188 - val_loss: 0.0199\n", "Epoch 1279/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0182 - val_loss: 0.0181\n", "Epoch 1280/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0228\n", "Epoch 1281/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0184\n", "Epoch 1282/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0175 - val_loss: 0.0181\n", "Epoch 1283/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0208\n", "Epoch 1284/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0178 - val_loss: 0.0180\n", "Epoch 1285/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0178 - val_loss: 0.0211\n", "Epoch 1286/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0179\n", "Epoch 1287/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0168 - val_loss: 0.0206\n", "Epoch 1288/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0182 - val_loss: 0.0189\n", "Epoch 1289/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0185 - val_loss: 0.0190\n", "Epoch 1290/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0171 - val_loss: 0.0194\n", "Epoch 1291/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0197\n", "Epoch 1292/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0173 - val_loss: 0.0185\n", "Epoch 1293/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0171 - val_loss: 0.0190\n", "Epoch 1294/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0173 - val_loss: 0.0199\n", "Epoch 1295/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0169 - val_loss: 0.0180\n", "Epoch 1296/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0179 - val_loss: 0.0196\n", "Epoch 1297/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0172 - val_loss: 0.0185\n", "Epoch 1298/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0174 - val_loss: 0.0182\n", "Epoch 1299/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0165 - val_loss: 0.0217\n", "Epoch 1300/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0174 - val_loss: 0.0177\n", "Epoch 1301/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0166 - val_loss: 0.0193\n", "Epoch 1302/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0198\n", "Epoch 1303/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0175 - val_loss: 0.0187\n", "Epoch 1304/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0180 - val_loss: 0.0208\n", "Epoch 1305/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0182 - val_loss: 0.0178\n", "Epoch 1306/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0169 - val_loss: 0.0221\n", "Epoch 1307/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0176 - val_loss: 0.0193\n", "Epoch 1308/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0195 - val_loss: 0.0211\n", "Epoch 1309/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0205 - val_loss: 0.0177\n", "Epoch 1310/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0178 - val_loss: 0.0190\n", "Epoch 1311/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0185\n", "Epoch 1312/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0177 - val_loss: 0.0187\n", "Epoch 1313/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 67us/step - loss: 0.0164 - val_loss: 0.0195\n", "Epoch 1314/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0172 - val_loss: 0.0183\n", "Epoch 1315/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0170 - val_loss: 0.0192\n", "Epoch 1316/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0177 - val_loss: 0.0177\n", "Epoch 1317/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0162 - val_loss: 0.0224\n", "Epoch 1318/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0187 - val_loss: 0.0187\n", "Epoch 1319/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0176 - val_loss: 0.0185\n", "Epoch 1320/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0174 - val_loss: 0.0212\n", "Epoch 1321/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0171 - val_loss: 0.0180\n", "Epoch 1322/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0181 - val_loss: 0.0197\n", "Epoch 1323/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0171 - val_loss: 0.0189\n", "Epoch 1324/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0171 - val_loss: 0.0183\n", "Epoch 1325/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0169 - val_loss: 0.0183\n", "Epoch 1326/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0173 - val_loss: 0.0192\n", "Epoch 1327/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0173 - val_loss: 0.0179\n", "Epoch 1328/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0192 - val_loss: 0.0180\n", "Epoch 1329/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0185 - val_loss: 0.0199\n", "Epoch 1330/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0194 - val_loss: 0.0178\n", "Epoch 1331/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0178 - val_loss: 0.0204\n", "Epoch 1332/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0177 - val_loss: 0.0186\n", "Epoch 1333/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0173 - val_loss: 0.0190\n", "Epoch 1334/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0165 - val_loss: 0.0203\n", "Epoch 1335/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0175 - val_loss: 0.0197\n", "Epoch 1336/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0184 - val_loss: 0.0191\n", "Epoch 1337/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0178 - val_loss: 0.0184\n", "Epoch 1338/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0174 - val_loss: 0.0212\n", "Epoch 1339/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0169 - val_loss: 0.0178\n", "Epoch 1340/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0175 - val_loss: 0.0205\n", "Epoch 1341/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0177 - val_loss: 0.0180\n", "Epoch 1342/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0188 - val_loss: 0.0188\n", "Epoch 1343/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0192 - val_loss: 0.0196\n", "Epoch 1344/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0176 - val_loss: 0.0189\n", "Epoch 1345/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0192 - val_loss: 0.0233\n", "Epoch 1346/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0186 - val_loss: 0.0186\n", "Epoch 1347/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0223\n", "Epoch 1348/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0175 - val_loss: 0.0179\n", "Epoch 1349/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0183\n", "Epoch 1350/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0165 - val_loss: 0.0188\n", "Epoch 1351/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0193\n", "Epoch 1352/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 
0.0177\n", "Epoch 1353/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0171 - val_loss: 0.0181\n", "Epoch 1354/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0168 - val_loss: 0.0195\n", "Epoch 1355/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0176\n", "Epoch 1356/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0166 - val_loss: 0.0185\n", "Epoch 1357/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0167 - val_loss: 0.0186\n", "Epoch 1358/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0171 - val_loss: 0.0175\n", "Epoch 1359/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0193\n", "Epoch 1360/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0182\n", "Epoch 1361/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0183\n", "Epoch 1362/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0170 - val_loss: 0.0182\n", "Epoch 1363/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0194\n", "Epoch 1364/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0180\n", "Epoch 1365/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0172 - val_loss: 0.0183\n", "Epoch 1366/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0169 - val_loss: 0.0189\n", "Epoch 1367/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0167 - val_loss: 0.0179\n", "Epoch 1368/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0190\n", "Epoch 1369/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0172 - val_loss: 0.0178\n", "Epoch 1370/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0173 - val_loss: 0.0204\n", "Epoch 1371/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0188 - val_loss: 0.0178\n", "Epoch 1372/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0199\n", "Epoch 1373/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0164 - val_loss: 0.0181\n", "Epoch 1374/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0184\n", "Epoch 1375/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0167 - val_loss: 0.0199\n", "Epoch 1376/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0176\n", "Epoch 1377/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0163 - val_loss: 0.0200\n", "Epoch 1378/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0178\n", "Epoch 1379/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.014 - 0s 73us/step - loss: 0.0177 - val_loss: 0.0176\n", "Epoch 1380/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0196 - val_loss: 0.0204\n", "Epoch 1381/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0237 - val_loss: 0.0177\n", "Epoch 1382/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0192 - val_loss: 0.0272\n", "Epoch 1383/5000\n", "150/150 
[==============================] - 0s 60us/step - loss: 0.0188 - val_loss: 0.0177\n", "Epoch 1384/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0183 - val_loss: 0.0203\n", "Epoch 1385/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0182 - val_loss: 0.0181\n", "Epoch 1386/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0186\n", "Epoch 1387/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0166 - val_loss: 0.0198\n", "Epoch 1388/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0188\n", "Epoch 1389/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0172 - val_loss: 0.0190\n", "Epoch 1390/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 67us/step - loss: 0.0171 - val_loss: 0.0184\n", "Epoch 1391/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0168 - val_loss: 0.0183\n", "Epoch 1392/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0181 - val_loss: 0.0176\n", "Epoch 1393/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0225\n", "Epoch 1394/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0181 - val_loss: 0.0175\n", "Epoch 1395/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0181 - val_loss: 0.0211\n", "Epoch 1396/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0193 - val_loss: 0.0177\n", "Epoch 1397/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0182 - val_loss: 0.0203\n", "Epoch 1398/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0172 - val_loss: 0.0185\n", "Epoch 1399/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0187 - val_loss: 0.0189\n", "Epoch 1400/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0193 - val_loss: 0.0202\n", "Epoch 1401/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0173 - val_loss: 0.0173\n", "Epoch 1402/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0174 - val_loss: 0.0184\n", "Epoch 1403/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0172 - val_loss: 0.0191\n", "Epoch 1404/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0176\n", "Epoch 1405/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0168 - val_loss: 0.0196\n", "Epoch 1406/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0179\n", "Epoch 1407/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0167 - val_loss: 0.0177\n", "Epoch 1408/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0174 - val_loss: 0.0188\n", "Epoch 1409/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0173\n", "Epoch 1410/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0159 - val_loss: 0.0204\n", "Epoch 1411/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0188 - val_loss: 0.0180\n", "Epoch 1412/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0202 - val_loss: 0.0180\n", "Epoch 1413/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0191 - val_loss: 0.0210\n", "Epoch 1414/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0167 - val_loss: 0.0176\n", "Epoch 1415/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0174 - val_loss: 0.0225\n", "Epoch 1416/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0190 - val_loss: 0.0184\n", "Epoch 1417/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0177 - val_loss: 0.0176\n", "Epoch 1418/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0173 - val_loss: 0.0209\n", "Epoch 1419/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0173\n", "Epoch 1420/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0172 - val_loss: 0.0180\n", "Epoch 1421/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0181 - val_loss: 0.0187\n", "Epoch 1422/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0174 - val_loss: 0.0181\n", "Epoch 1423/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0221\n", "Epoch 1424/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0180 - val_loss: 0.0181\n", "Epoch 1425/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0162 - val_loss: 0.0190\n", "Epoch 1426/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0167 - val_loss: 0.0176\n", "Epoch 1427/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0181 - val_loss: 0.0182\n", "Epoch 1428/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0176 - val_loss: 0.0203\n", "Epoch 1429/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0184\n", "Epoch 1430/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0180 - val_loss: 0.0211\n", "Epoch 1431/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0170 - val_loss: 0.0178\n", "Epoch 1432/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0171 - val_loss: 0.0188\n", "Epoch 1433/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0169 - val_loss: 0.0191\n", "Epoch 1434/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0177\n", "Epoch 1435/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0163 - val_loss: 0.0184\n", "Epoch 1436/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0164 - val_loss: 0.0185\n", "Epoch 1437/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0166 - val_loss: 0.0183\n", "Epoch 1438/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0190 - val_loss: 0.0186\n", "Epoch 1439/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0174\n", "Epoch 1440/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0165 - val_loss: 0.0196\n", "Epoch 1441/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0182\n", "Epoch 1442/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0164 - val_loss: 0.0174\n", "Epoch 1443/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0171 - val_loss: 
0.0194\n", "Epoch 1444/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0165 - val_loss: 0.0171\n", "Epoch 1445/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0168 - val_loss: 0.0180\n", "Epoch 1446/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0176 - val_loss: 0.0182\n", "Epoch 1447/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0186\n", "Epoch 1448/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0174 - val_loss: 0.0174\n", "Epoch 1449/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0192 - val_loss: 0.0197\n", "Epoch 1450/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0181 - val_loss: 0.0186\n", "Epoch 1451/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0160 - val_loss: 0.0224\n", "Epoch 1452/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0193 - val_loss: 0.0176\n", "Epoch 1453/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0167 - val_loss: 0.0174\n", "Epoch 1454/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.019 - 0s 67us/step - loss: 0.0161 - val_loss: 0.0203\n", "Epoch 1455/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0168 - val_loss: 0.0170\n", "Epoch 1456/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0174 - val_loss: 0.0180\n", "Epoch 1457/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0166 - val_loss: 0.0178\n", "Epoch 1458/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0167 - val_loss: 0.0173\n", "Epoch 1459/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0184\n", "Epoch 1460/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0166 - val_loss: 0.0181\n", "Epoch 1461/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0161 - val_loss: 0.0201\n", "Epoch 1462/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0166 - val_loss: 0.0173\n", "Epoch 1463/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0162 - val_loss: 0.0187\n", "Epoch 1464/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0160 - val_loss: 0.0178\n", "Epoch 1465/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0162 - val_loss: 0.0178\n", "Epoch 1466/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0166 - val_loss: 0.0189\n", "Epoch 1467/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0169\n", "Epoch 1468/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0172 - val_loss: 0.0202\n", "Epoch 1469/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0160 - val_loss: 0.0172\n", "Epoch 1470/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0165 - val_loss: 0.0182\n", "Epoch 1471/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0171 - val_loss: 0.0189\n", "Epoch 1472/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0162 - val_loss: 0.0179\n", "Epoch 1473/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - 
val_loss: 0.0180\n", "Epoch 1474/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0176\n", "Epoch 1475/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0163 - val_loss: 0.0192\n", "Epoch 1476/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0162 - val_loss: 0.0178\n", "Epoch 1477/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0172 - val_loss: 0.0182\n", "Epoch 1478/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0165 - val_loss: 0.0172\n", "Epoch 1479/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0169 - val_loss: 0.0190\n", "Epoch 1480/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0156 - val_loss: 0.0183\n", "Epoch 1481/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0169 - val_loss: 0.0187\n", "Epoch 1482/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0163 - val_loss: 0.0171\n", "Epoch 1483/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0175 - val_loss: 0.0182\n", "Epoch 1484/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0165 - val_loss: 0.0190\n", "Epoch 1485/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0161 - val_loss: 0.0177\n", "Epoch 1486/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0187\n", "Epoch 1487/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0162 - val_loss: 0.0183\n", "Epoch 1488/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0162 - val_loss: 0.0180\n", "Epoch 1489/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0168 - val_loss: 0.0184\n", "Epoch 1490/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0158 - val_loss: 0.0173\n", "Epoch 1491/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0162 - val_loss: 0.0195\n", "Epoch 1492/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0168 - val_loss: 0.0185\n", "Epoch 1493/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0173\n", "Epoch 1494/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0158 - val_loss: 0.0198\n", "Epoch 1495/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0172\n", "Epoch 1496/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0175\n", "Epoch 1497/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0161 - val_loss: 0.0174\n", "Epoch 1498/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0166 - val_loss: 0.0186\n", "Epoch 1499/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0188\n", "Epoch 1500/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0158 - val_loss: 0.0173\n", "Epoch 1501/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0175 - val_loss: 0.0183\n", "Epoch 1502/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0173\n", "Epoch 1503/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0161 - val_loss: 0.0178\n", "Epoch 1504/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0176\n", "Epoch 1505/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0172 - val_loss: 0.0177\n", "Epoch 1506/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0166 - val_loss: 0.0188\n", "Epoch 1507/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0176\n", "Epoch 1508/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0171 - val_loss: 0.0186\n", "Epoch 1509/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0197\n", "Epoch 1510/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0165 - val_loss: 0.0170\n", "Epoch 1511/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0192\n", "Epoch 1512/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0169 - val_loss: 0.0173\n", "Epoch 1513/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0171\n", "Epoch 1514/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0159 - val_loss: 0.0209\n", "Epoch 1515/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0166\n", "Epoch 1516/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0171 - val_loss: 0.0171\n", "Epoch 1517/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0156 - val_loss: 0.0212\n", "Epoch 1518/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0171\n", "Epoch 1519/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0168 - val_loss: 0.0199\n", "Epoch 1520/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0174 - val_loss: 0.0187\n", "Epoch 1521/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0163 - val_loss: 0.0177\n", "Epoch 1522/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0162 - val_loss: 0.0208\n", "Epoch 1523/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0180 - val_loss: 0.0175\n", "Epoch 1524/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0159 - val_loss: 0.0201\n", "Epoch 1525/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0169\n", "Epoch 1526/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0178\n", "Epoch 1527/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0183\n", "Epoch 1528/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0165 - val_loss: 0.0171\n", "Epoch 1529/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0173 - val_loss: 0.0187\n", "Epoch 1530/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0173\n", "Epoch 1531/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0166 - val_loss: 0.0196\n", "Epoch 1532/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0165 - val_loss: 0.0174\n", "Epoch 1533/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0162 - val_loss: 0.0182\n", "Epoch 1534/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 
0.0177\n", "Epoch 1535/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0162 - val_loss: 0.0182\n", "Epoch 1536/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0161 - val_loss: 0.0184\n", "Epoch 1537/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0156 - val_loss: 0.0174\n", "Epoch 1538/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0168 - val_loss: 0.0171\n", "Epoch 1539/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0179\n", "Epoch 1540/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0161 - val_loss: 0.0177\n", "Epoch 1541/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0173\n", "Epoch 1542/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0159 - val_loss: 0.0185\n", "Epoch 1543/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0167 - val_loss: 0.0172\n", "Epoch 1544/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 60us/step - loss: 0.0159 - val_loss: 0.0178\n", "Epoch 1545/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0165 - val_loss: 0.0175\n", "Epoch 1546/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0158 - val_loss: 0.0184\n", "Epoch 1547/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0173 - val_loss: 0.0168\n", "Epoch 1548/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0161 - val_loss: 0.0192\n", "Epoch 1549/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0176\n", "Epoch 1550/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0161 - val_loss: 0.0179\n", "Epoch 1551/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0158 - val_loss: 0.0181\n", "Epoch 1552/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0162 - val_loss: 0.0170\n", "Epoch 1553/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0161 - val_loss: 0.0176\n", "Epoch 1554/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0162 - val_loss: 0.0172\n", "Epoch 1555/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0179\n", "Epoch 1556/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0168\n", "Epoch 1557/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0184\n", "Epoch 1558/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0156 - val_loss: 0.0180\n", "Epoch 1559/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0178 - val_loss: 0.0187\n", "Epoch 1560/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0172 - val_loss: 0.0184\n", "Epoch 1561/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0166\n", "Epoch 1562/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0194\n", "Epoch 1563/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0160 - val_loss: 0.0179\n", "Epoch 1564/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0179 - val_loss: 0.0170\n", 
"Epoch 1565/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0226\n", "Epoch 1566/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0173 - val_loss: 0.0168\n", "Epoch 1567/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0165 - val_loss: 0.0193\n", "Epoch 1568/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0175\n", "Epoch 1569/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0168 - val_loss: 0.0174\n", "Epoch 1570/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0160 - val_loss: 0.0183\n", "Epoch 1571/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0161 - val_loss: 0.0168\n", "Epoch 1572/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0164 - val_loss: 0.0174\n", "Epoch 1573/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0170 - val_loss: 0.0177\n", "Epoch 1574/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0193 - val_loss: 0.0194\n", "Epoch 1575/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0169 - val_loss: 0.0175\n", "Epoch 1576/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0182 - val_loss: 0.0232\n", "Epoch 1577/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0169 - val_loss: 0.0165\n", "Epoch 1578/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0173 - val_loss: 0.0184\n", "Epoch 1579/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0172\n", "Epoch 1580/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0177\n", "Epoch 1581/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0161 - val_loss: 0.0173\n", "Epoch 1582/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0167 - val_loss: 0.0182\n", "Epoch 1583/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0171\n", "Epoch 1584/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0157 - val_loss: 0.0199\n", "Epoch 1585/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0171 - val_loss: 0.0169\n", "Epoch 1586/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0172\n", "Epoch 1587/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0184 - val_loss: 0.0177\n", "Epoch 1588/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0204\n", "Epoch 1589/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0174 - val_loss: 0.0189\n", "Epoch 1590/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0174 - val_loss: 0.0203\n", "Epoch 1591/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0168 - val_loss: 0.0182\n", "Epoch 1592/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0165 - val_loss: 0.0180\n", "Epoch 1593/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0162 - val_loss: 0.0174\n", "Epoch 1594/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0178\n", "Epoch 1595/5000\n", "150/150 [==============================] - 0s 
87us/step - loss: 0.0158 - val_loss: 0.0167\n", "Epoch 1596/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0180\n", "Epoch 1597/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0165 - val_loss: 0.0169\n", "Epoch 1598/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0176 - val_loss: 0.0166\n", "Epoch 1599/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0230\n", "Epoch 1600/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0167 - val_loss: 0.0199\n", "Epoch 1601/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0176 - val_loss: 0.0202\n", "Epoch 1602/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0168\n", "Epoch 1603/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0175 - val_loss: 0.0171\n", "Epoch 1604/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0207\n", "Epoch 1605/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0164 - val_loss: 0.0176\n", "Epoch 1606/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0159 - val_loss: 0.0171\n", "Epoch 1607/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0188\n", "Epoch 1608/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0168 - val_loss: 0.0166\n", "Epoch 1609/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0158 - val_loss: 0.0183\n", "Epoch 1610/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0173\n", "Epoch 1611/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0158 - val_loss: 0.0174\n", "Epoch 1612/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0159 - val_loss: 0.0173\n", "Epoch 1613/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0187\n", "Epoch 1614/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0154 - val_loss: 0.0168\n", "Epoch 1615/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0166 - val_loss: 0.0173\n", "Epoch 1616/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0173\n", "Epoch 1617/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0161 - val_loss: 0.0180\n", "Epoch 1618/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0154 - val_loss: 0.0168\n", "Epoch 1619/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0158 - val_loss: 0.0173\n", "Epoch 1620/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0168\n", "Epoch 1621/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0176\n", "Epoch 1622/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0155 - val_loss: 0.0172\n", "Epoch 1623/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0157 - val_loss: 0.0172\n", "Epoch 1624/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0162 - val_loss: 0.0179\n", "Epoch 1625/5000\n", "150/150 [==============================] - 0s 67us/step - 
loss: 0.0163 - val_loss: 0.0173\n", "Epoch 1626/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0178 - val_loss: 0.0187\n", "Epoch 1627/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0170 - val_loss: 0.0179\n", "Epoch 1628/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0180 - val_loss: 0.0172\n", "Epoch 1629/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0187\n", "Epoch 1630/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0161 - val_loss: 0.0173\n", "Epoch 1631/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0157 - val_loss: 0.0186\n", "Epoch 1632/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0162 - val_loss: 0.0168\n", "Epoch 1633/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0157 - val_loss: 0.0178\n", "Epoch 1634/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0164 - val_loss: 0.0176\n", "Epoch 1635/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0158 - val_loss: 0.0180\n", "Epoch 1636/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0167 - val_loss: 0.0167\n", "Epoch 1637/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0167 - val_loss: 0.0188\n", "Epoch 1638/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0152 - val_loss: 0.0172\n", "Epoch 1639/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0160 - val_loss: 0.0180\n", "Epoch 1640/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0177\n", "Epoch 1641/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0155 - val_loss: 0.0166\n", "Epoch 1642/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0159 - val_loss: 0.0179\n", "Epoch 1643/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0158 - val_loss: 0.0167\n", "Epoch 1644/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0168\n", "Epoch 1645/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0154 - val_loss: 0.0181\n", "Epoch 1646/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0171 - val_loss: 0.0165\n", "Epoch 1647/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0205 - val_loss: 0.0189\n", "Epoch 1648/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0203 - val_loss: 0.0173\n", "Epoch 1649/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0180 - val_loss: 0.0222\n", "Epoch 1650/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0169 - val_loss: 0.0174\n", "Epoch 1651/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0182\n", "Epoch 1652/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0159 - val_loss: 0.0169\n", "Epoch 1653/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0159 - val_loss: 0.0177\n", "Epoch 1654/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0159 - val_loss: 0.0176\n", "Epoch 1655/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0164 - val_loss: 0.0167\n", "Epoch 1656/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0154 - val_loss: 0.0203\n", "Epoch 1657/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0171 - val_loss: 0.0165\n", "Epoch 1658/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0159 - val_loss: 0.0176\n", "Epoch 1659/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0159 - val_loss: 0.0179\n", "Epoch 1660/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0169 - val_loss: 0.0167\n", "Epoch 1661/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0192\n", "Epoch 1662/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0158 - val_loss: 0.0167\n", "Epoch 1663/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0179\n", "Epoch 1664/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0159 - val_loss: 0.0172\n", "Epoch 1665/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0165 - val_loss: 0.0175\n", "Epoch 1666/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0159 - val_loss: 0.0163\n", "Epoch 1667/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0157 - val_loss: 0.0187\n", "Epoch 1668/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0164 - val_loss: 0.0173\n", "Epoch 1669/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0161 - val_loss: 0.0166\n", "Epoch 1670/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0158 - val_loss: 0.0177\n", "Epoch 1671/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0164 - val_loss: 0.0177\n", "Epoch 1672/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0178\n", "Epoch 1673/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0178\n", "Epoch 1674/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0158 - val_loss: 0.0167\n", "Epoch 1675/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0174\n", "Epoch 1676/5000\n", "150/150 [==============================] - 0s 187us/step - loss: 0.0155 - val_loss: 0.0171\n", "Epoch 1677/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0157 - val_loss: 0.0169\n", "Epoch 1678/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0165 - val_loss: 0.0178\n", "Epoch 1679/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0158 - val_loss: 0.0179\n", "Epoch 1680/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0153 - val_loss: 0.0169\n", "Epoch 1681/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0163 - val_loss: 0.0175\n", "Epoch 1682/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0155 - val_loss: 0.0173\n", "Epoch 1683/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0167\n", "Epoch 1684/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0154 - val_loss: 0.0177\n", "Epoch 1685/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0157 - val_loss: 0.0175\n", "Epoch 1686/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0157 - 
val_loss: 0.0174\n", "Epoch 1687/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0158 - val_loss: 0.0169\n", "Epoch 1688/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0157 - val_loss: 0.0183\n", "Epoch 1689/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0154 - val_loss: 0.0181\n", "Epoch 1690/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0174\n", "Epoch 1691/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0163 - val_loss: 0.0179\n", "Epoch 1692/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0163 - val_loss: 0.0168\n", "Epoch 1693/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0175\n", "Epoch 1694/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0158 - val_loss: 0.0170\n", "Epoch 1695/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0154 - val_loss: 0.0171\n", "Epoch 1696/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0163\n", "Epoch 1697/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0186\n", "Epoch 1698/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0159 - val_loss: 0.0166\n", "Epoch 1699/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0154 - val_loss: 0.0170\n", "Epoch 1700/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0183\n", "Epoch 1701/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0158 - val_loss: 0.0169\n", "Epoch 1702/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0154 - val_loss: 0.0176\n", "Epoch 1703/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0162 - val_loss: 0.0170\n", "Epoch 1704/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0154 - val_loss: 0.0170\n", "Epoch 1705/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0157 - val_loss: 0.0182\n", "Epoch 1706/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0155 - val_loss: 0.0165\n", "Epoch 1707/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0153 - val_loss: 0.0180\n", "Epoch 1708/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0160 - val_loss: 0.0166\n", "Epoch 1709/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0154 - val_loss: 0.0168\n", "Epoch 1710/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0159 - val_loss: 0.0168\n", "Epoch 1711/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0178\n", "Epoch 1712/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0160 - val_loss: 0.0185\n", "Epoch 1713/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0156 - val_loss: 0.0168\n", "Epoch 1714/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0177 - val_loss: 0.0168\n", "Epoch 1715/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0161 - val_loss: 0.0169\n", "Epoch 1716/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0153 - val_loss: 
0.0181\n", "Epoch 1717/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0161 - val_loss: 0.0170\n", "Epoch 1718/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0157 - val_loss: 0.0168\n", "Epoch 1719/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0186\n", "Epoch 1720/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0160 - val_loss: 0.0170\n", "Epoch 1721/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0173 - val_loss: 0.0171\n", "Epoch 1722/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0166 - val_loss: 0.0171\n", "Epoch 1723/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0186 - val_loss: 0.0185\n", "Epoch 1724/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0177 - val_loss: 0.0161\n", "Epoch 1725/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0164 - val_loss: 0.0190\n", "Epoch 1726/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0160 - val_loss: 0.0169\n", "Epoch 1727/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0160 - val_loss: 0.0171\n", "Epoch 1728/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0157\n", "Epoch 1729/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0174 - val_loss: 0.0177\n", "Epoch 1730/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0186 - val_loss: 0.0172\n", "Epoch 1731/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0186 - val_loss: 0.0205\n", "Epoch 1732/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0159 - val_loss: 0.0166\n", "Epoch 1733/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0190\n", "Epoch 1734/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0166 - val_loss: 0.0163\n", "Epoch 1735/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0157 - val_loss: 0.0181\n", "Epoch 1736/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0154 - val_loss: 0.0166\n", "Epoch 1737/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0153 - val_loss: 0.0167\n", "Epoch 1738/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0160 - val_loss: 0.0171\n", "Epoch 1739/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0155 - val_loss: 0.0165\n", "Epoch 1740/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0160 - val_loss: 0.0171\n", "Epoch 1741/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0169 - val_loss: 0.0173\n", "Epoch 1742/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0161 - val_loss: 0.0180\n", "Epoch 1743/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0150 - val_loss: 0.0175\n", "Epoch 1744/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0160 - val_loss: 0.0179\n", "Epoch 1745/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0164\n", "Epoch 1746/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0162 - val_loss: 0.0177\n", "Epoch 1747/5000\n", "150/150 [==============================] 
- 0s 67us/step - loss: 0.0163 - val_loss: 0.0175\n", "Epoch 1748/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0172\n", "Epoch 1749/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0154 - val_loss: 0.0175\n", "Epoch 1750/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0157 - val_loss: 0.0172\n", "Epoch 1751/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0155 - val_loss: 0.0162\n", "Epoch 1752/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0154 - val_loss: 0.0164\n", "Epoch 1753/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0168 - val_loss: 0.0170\n", "Epoch 1754/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0156 - val_loss: 0.0163\n", "Epoch 1755/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0156 - val_loss: 0.0181\n", "Epoch 1756/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0154 - val_loss: 0.0160\n", "Epoch 1757/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0153 - val_loss: 0.0167\n", "Epoch 1758/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0152 - val_loss: 0.0169\n", "Epoch 1759/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0150 - val_loss: 0.0174\n", "Epoch 1760/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0157 - val_loss: 0.0177\n", "Epoch 1761/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0160\n", "Epoch 1762/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0159\n", "Epoch 1763/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0161 - val_loss: 0.0184\n", "Epoch 1764/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0157 - val_loss: 0.0166\n", "Epoch 1765/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0194\n", "Epoch 1766/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0153 - val_loss: 0.0160\n", "Epoch 1767/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0168\n", "Epoch 1768/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0159 - val_loss: 0.0179\n", "Epoch 1769/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0154 - val_loss: 0.0172\n", "Epoch 1770/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0152 - val_loss: 0.0176\n", "Epoch 1771/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0166\n", "Epoch 1772/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0162\n", "Epoch 1773/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0167 - val_loss: 0.0180\n", "Epoch 1774/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0163 - val_loss: 0.0168\n", "Epoch 1775/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 100us/step - loss: 0.0166 - val_loss: 0.0180\n", "Epoch 1776/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0153 - val_loss: 0.0159\n", "Epoch 1777/5000\n", "150/150 [==============================] - 0s 
100us/step - loss: 0.0151 - val_loss: 0.0179\n", "Epoch 1778/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0157\n", "Epoch 1779/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0171 - val_loss: 0.0164\n", "Epoch 1780/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0165 - val_loss: 0.0190\n", "Epoch 1781/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0149 - val_loss: 0.0171\n", "Epoch 1782/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0160 - val_loss: 0.0178\n", "Epoch 1783/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0153 - val_loss: 0.0162\n", "Epoch 1784/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0149 - val_loss: 0.0161\n", "Epoch 1785/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0156 - val_loss: 0.0162\n", "Epoch 1786/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0166 - val_loss: 0.0159\n", "Epoch 1787/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0181\n", "Epoch 1788/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0156 - val_loss: 0.0172\n", "Epoch 1789/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0155 - val_loss: 0.0184\n", "Epoch 1790/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0157 - val_loss: 0.0162\n", "Epoch 1791/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0171\n", "Epoch 1792/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0154 - val_loss: 0.0171\n", "Epoch 1793/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0150 - val_loss: 0.0165\n", "Epoch 1794/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0162 - val_loss: 0.0167\n", "Epoch 1795/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0151 - val_loss: 0.0175\n", "Epoch 1796/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0151 - val_loss: 0.0168\n", "Epoch 1797/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0159 - val_loss: 0.0169\n", "Epoch 1798/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0154 - val_loss: 0.0163\n", "Epoch 1799/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0155 - val_loss: 0.0166\n", "Epoch 1800/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0151 - val_loss: 0.0160\n", "Epoch 1801/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0151 - val_loss: 0.0174\n", "Epoch 1802/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0156 - val_loss: 0.0161\n", "Epoch 1803/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0149 - val_loss: 0.0160\n", "Epoch 1804/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0148 - val_loss: 0.0175\n", "Epoch 1805/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0161 - val_loss: 0.0169\n", "Epoch 1806/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0163 - val_loss: 0.0179\n", "Epoch 1807/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0150 - val_loss: 0.0177\n", "Epoch 1808/5000\n", 
"150/150 [==============================] - ETA: 0s - loss: 0.017 - 0s 67us/step - loss: 0.0170 - val_loss: 0.0163\n", "Epoch 1809/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0162 - val_loss: 0.0182\n", "Epoch 1810/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0149 - val_loss: 0.0175\n", "Epoch 1811/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0155 - val_loss: 0.0176\n", "Epoch 1812/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0156 - val_loss: 0.0167\n", "Epoch 1813/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0154 - val_loss: 0.0169\n", "Epoch 1814/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0151 - val_loss: 0.0179\n", "Epoch 1815/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0158 - val_loss: 0.0164\n", "Epoch 1816/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0146 - val_loss: 0.0173\n", "Epoch 1817/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0155 - val_loss: 0.0161\n", "Epoch 1818/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0149 - val_loss: 0.0165\n", "Epoch 1819/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0169\n", "Epoch 1820/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0156 - val_loss: 0.0162\n", "Epoch 1821/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0154 - val_loss: 0.0164\n", "Epoch 1822/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0168\n", "Epoch 1823/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0152 - val_loss: 0.0164\n", "Epoch 1824/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0148 - val_loss: 0.0171\n", "Epoch 1825/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0147 - val_loss: 0.0169\n", "Epoch 1826/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0149 - val_loss: 0.0168\n", "Epoch 1827/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0149 - val_loss: 0.0160\n", "Epoch 1828/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0161\n", "Epoch 1829/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0150 - val_loss: 0.0169\n", "Epoch 1830/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0166\n", "Epoch 1831/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0171 - val_loss: 0.0168\n", "Epoch 1832/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0165\n", "Epoch 1833/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0154 - val_loss: 0.0161\n", "Epoch 1834/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0169 - val_loss: 0.0173\n", "Epoch 1835/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0159 - val_loss: 0.0168\n", "Epoch 1836/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0150 - val_loss: 0.0177\n", "Epoch 1837/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0154 - val_loss: 0.0162\n", "Epoch 1838/5000\n", "150/150 [==============================] - 0s 
67us/step - loss: 0.0159 - val_loss: 0.0164\n", "Epoch 1839/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0151 - val_loss: 0.0162\n", "Epoch 1840/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0151 - val_loss: 0.0171\n", "Epoch 1841/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0151 - val_loss: 0.0160\n", "Epoch 1842/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0145 - val_loss: 0.0164\n", "Epoch 1843/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0149 - val_loss: 0.0162\n", "Epoch 1844/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0154 - val_loss: 0.0167\n", "Epoch 1845/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0159 - val_loss: 0.0175\n", "Epoch 1846/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0149 - val_loss: 0.0162\n", "Epoch 1847/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0170\n", "Epoch 1848/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0148 - val_loss: 0.0156\n", "Epoch 1849/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0152 - val_loss: 0.0158\n", "Epoch 1850/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0149 - val_loss: 0.0177\n", "Epoch 1851/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0149 - val_loss: 0.0173\n", "Epoch 1852/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 67us/step - loss: 0.0157 - val_loss: 0.0167\n", "Epoch 1853/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0160 - val_loss: 0.0159\n", "Epoch 1854/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0150 - val_loss: 0.0172\n", "Epoch 1855/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0146 - val_loss: 0.0161\n", "Epoch 1856/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0155 - val_loss: 0.0173\n", "Epoch 1857/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0145 - val_loss: 0.0161\n", "Epoch 1858/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0148 - val_loss: 0.0170\n", "Epoch 1859/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0147 - val_loss: 0.0158\n", "Epoch 1860/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0150 - val_loss: 0.0162\n", "Epoch 1861/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0158 - val_loss: 0.0165\n", "Epoch 1862/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0146 - val_loss: 0.0156\n", "Epoch 1863/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0143 - val_loss: 0.0184\n", "Epoch 1864/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0157 - val_loss: 0.0155\n", "Epoch 1865/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0161\n", "Epoch 1866/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0160 - val_loss: 0.0166\n", "Epoch 1867/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0155\n", "Epoch 1868/5000\n", "150/150 [==============================] - 0s 80us/step - 
loss: 0.0145 - val_loss: 0.0179\n", "Epoch 1869/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0152 - val_loss: 0.0160\n", "Epoch 1870/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0146 - val_loss: 0.0164\n", "Epoch 1871/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0149 - val_loss: 0.0171\n", "Epoch 1872/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0156\n", "Epoch 1873/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0154 - val_loss: 0.0174\n", "Epoch 1874/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0149 - val_loss: 0.0160\n", "Epoch 1875/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0170 - val_loss: 0.0158\n", "Epoch 1876/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0153 - val_loss: 0.0183\n", "Epoch 1877/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0153 - val_loss: 0.0160\n", "Epoch 1878/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0146 - val_loss: 0.0175\n", "Epoch 1879/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0146 - val_loss: 0.0159\n", "Epoch 1880/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0152 - val_loss: 0.0158\n", "Epoch 1881/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0150 - val_loss: 0.0173\n", "Epoch 1882/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0148 - val_loss: 0.0159\n", "Epoch 1883/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0151 - val_loss: 0.0163\n", "Epoch 1884/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0148 - val_loss: 0.0161\n", "Epoch 1885/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0155 - val_loss: 0.0156\n", "Epoch 1886/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0156 - val_loss: 0.0177\n", "Epoch 1887/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0148 - val_loss: 0.0157\n", "Epoch 1888/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0144 - val_loss: 0.0172\n", "Epoch 1889/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0155 - val_loss: 0.0169\n", "Epoch 1890/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0150 - val_loss: 0.0161\n", "Epoch 1891/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0163 - val_loss: 0.0162\n", "Epoch 1892/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0164 - val_loss: 0.0180\n", "Epoch 1893/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0145 - val_loss: 0.0154\n", "Epoch 1894/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0151 - val_loss: 0.0172\n", "Epoch 1895/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0149 - val_loss: 0.0156\n", "Epoch 1896/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0145 - val_loss: 0.0169\n", "Epoch 1897/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0149 - val_loss: 0.0157\n", "Epoch 1898/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0150 - val_loss: 0.0165\n", "Epoch 1899/5000\n", "150/150 
[==============================] - 0s 60us/step - loss: 0.0144 - val_loss: 0.0159\n", "Epoch 1900/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0144 - val_loss: 0.0161\n", "Epoch 1901/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0146 - val_loss: 0.0158\n", "Epoch 1902/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0141 - val_loss: 0.0161\n", "Epoch 1903/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0142 - val_loss: 0.0157\n", "Epoch 1904/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0145 - val_loss: 0.0160\n", "Epoch 1905/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.015 - 0s 73us/step - loss: 0.0168 - val_loss: 0.0163\n", "Epoch 1906/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0172 - val_loss: 0.0150\n", "Epoch 1907/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0153 - val_loss: 0.0185\n", "Epoch 1908/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0156 - val_loss: 0.0171\n", "Epoch 1909/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0167 - val_loss: 0.0181\n", "Epoch 1910/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0150 - val_loss: 0.0152\n", "Epoch 1911/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0149 - val_loss: 0.0175\n", "Epoch 1912/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0146 - val_loss: 0.0161\n", "Epoch 1913/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0146 - val_loss: 0.0159\n", "Epoch 1914/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0142 - val_loss: 0.0157\n", "Epoch 1915/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0147 - val_loss: 0.0161\n", "Epoch 1916/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0141 - val_loss: 0.0170\n", "Epoch 1917/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0181\n", "Epoch 1918/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0148 - val_loss: 0.0158\n", "Epoch 1919/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0146 - val_loss: 0.0157\n", "Epoch 1920/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0147 - val_loss: 0.0162\n", "Epoch 1921/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0145 - val_loss: 0.0156\n", "Epoch 1922/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0143 - val_loss: 0.0158\n", "Epoch 1923/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0143 - val_loss: 0.0160\n", "Epoch 1924/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0145 - val_loss: 0.0158\n", "Epoch 1925/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0141 - val_loss: 0.0158\n", "Epoch 1926/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0140 - val_loss: 0.0155\n", "Epoch 1927/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0145 - val_loss: 0.0156\n", "Epoch 1928/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0143 - val_loss: 0.0160\n", "Epoch 1929/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ 
"150/150 [==============================] - 0s 67us/step - loss: 0.0144 - val_loss: 0.0153\n", "Epoch 1930/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0143 - val_loss: 0.0162\n", "Epoch 1931/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0154 - val_loss: 0.0155\n", "Epoch 1932/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0150 - val_loss: 0.0155\n", "Epoch 1933/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0142 - val_loss: 0.0171\n", "Epoch 1934/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0143 - val_loss: 0.0150\n", "Epoch 1935/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0157\n", "Epoch 1936/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0140 - val_loss: 0.0175\n", "Epoch 1937/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0147 - val_loss: 0.0155\n", "Epoch 1938/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0151 - val_loss: 0.0168\n", "Epoch 1939/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0141 - val_loss: 0.0149\n", "Epoch 1940/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0146 - val_loss: 0.0163\n", "Epoch 1941/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0148 - val_loss: 0.0153\n", "Epoch 1942/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0156 - val_loss: 0.0160\n", "Epoch 1943/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0157 - val_loss: 0.0170\n", "Epoch 1944/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0141 - val_loss: 0.0158\n", "Epoch 1945/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0145 - val_loss: 0.0164\n", "Epoch 1946/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0146 - val_loss: 0.0159\n", "Epoch 1947/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0153 - val_loss: 0.0151\n", "Epoch 1948/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0145 - val_loss: 0.0168\n", "Epoch 1949/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0137 - val_loss: 0.0158\n", "Epoch 1950/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0156 - val_loss: 0.0158\n", "Epoch 1951/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0162 - val_loss: 0.0171\n", "Epoch 1952/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0154 - val_loss: 0.0154\n", "Epoch 1953/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0155 - val_loss: 0.0187\n", "Epoch 1954/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0150 - val_loss: 0.0152\n", "Epoch 1955/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0153 - val_loss: 0.0159\n", "Epoch 1956/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0143 - val_loss: 0.0173\n", "Epoch 1957/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0138 - val_loss: 0.0159\n", "Epoch 1958/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0148 - val_loss: 0.0160\n", "Epoch 1959/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0151 - 
val_loss: 0.0161\n", "Epoch 1960/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0136 - val_loss: 0.0152\n", "Epoch 1961/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0146 - val_loss: 0.0161\n", "Epoch 1962/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0142 - val_loss: 0.0151\n", "Epoch 1963/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0140 - val_loss: 0.0155\n", "Epoch 1964/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0144 - val_loss: 0.0154\n", "Epoch 1965/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0138 - val_loss: 0.0146\n", "Epoch 1966/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0148 - val_loss: 0.0161\n", "Epoch 1967/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0143 - val_loss: 0.0154\n", "Epoch 1968/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0145 - val_loss: 0.0160\n", "Epoch 1969/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0147 - val_loss: 0.0155\n", "Epoch 1970/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0153 - val_loss: 0.0153\n", "Epoch 1971/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0146 - val_loss: 0.0154\n", "Epoch 1972/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0143 - val_loss: 0.0172\n", "Epoch 1973/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0147 - val_loss: 0.0147\n", "Epoch 1974/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0141 - val_loss: 0.0155\n", "Epoch 1975/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0137 - val_loss: 0.0159\n", "Epoch 1976/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0151\n", "Epoch 1977/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0158 - val_loss: 0.0167\n", "Epoch 1978/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0140 - val_loss: 0.0153\n", "Epoch 1979/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0155 - val_loss: 0.0163\n", "Epoch 1980/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0137 - val_loss: 0.0157\n", "Epoch 1981/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0140 - val_loss: 0.0150\n", "Epoch 1982/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0146 - val_loss: 0.0160\n", "Epoch 1983/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0155\n", "Epoch 1984/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0155 - val_loss: 0.0171\n", "Epoch 1985/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0140 - val_loss: 0.0150\n", "Epoch 1986/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0148 - val_loss: 0.0154\n", "Epoch 1987/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0136 - val_loss: 0.0149\n", "Epoch 1988/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0144 - val_loss: 0.0151\n", "Epoch 1989/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0139 - val_loss: 0.0155\n", "Epoch 1990/5000\n", "150/150 
[==============================] - 0s 73us/step - loss: 0.0139 - val_loss: 0.0151\n", "Epoch 1991/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0143 - val_loss: 0.0149\n", "Epoch 1992/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0139 - val_loss: 0.0170\n", "Epoch 1993/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0163 - val_loss: 0.0147\n", "Epoch 1994/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0154 - val_loss: 0.0187\n", "Epoch 1995/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0143 - val_loss: 0.0156\n", "Epoch 1996/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0154 - val_loss: 0.0157\n", "Epoch 1997/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0149 - val_loss: 0.0173\n", "Epoch 1998/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0136 - val_loss: 0.0153\n", "Epoch 1999/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0158 - val_loss: 0.0162\n", "Epoch 2000/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0148 - val_loss: 0.0157\n", "Epoch 2001/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0148 - val_loss: 0.0149\n", "Epoch 2002/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0132 - val_loss: 0.0176\n", "Epoch 2003/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0151 - val_loss: 0.0150\n", "Epoch 2004/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0137 - val_loss: 0.0150\n", "Epoch 2005/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0141 - val_loss: 0.0151\n", "Epoch 2006/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 80us/step - loss: 0.0142 - val_loss: 0.0155\n", "Epoch 2007/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0139 - val_loss: 0.0156\n", "Epoch 2008/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0137 - val_loss: 0.0143\n", "Epoch 2009/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0137 - val_loss: 0.0152\n", "Epoch 2010/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0140 - val_loss: 0.0147\n", "Epoch 2011/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0140 - val_loss: 0.0150\n", "Epoch 2012/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0143 - val_loss: 0.0157\n", "Epoch 2013/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0136 - val_loss: 0.0144\n", "Epoch 2014/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0139 - val_loss: 0.0151\n", "Epoch 2015/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0136 - val_loss: 0.0154\n", "Epoch 2016/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0152 - val_loss: 0.0146\n", "Epoch 2017/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0142 - val_loss: 0.0162\n", "Epoch 2018/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0142 - val_loss: 0.0149\n", "Epoch 2019/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0138 - val_loss: 0.0156\n", "Epoch 2020/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0138 - val_loss: 0.0147\n", "Epoch 2021/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0135 - val_loss: 0.0147\n", "Epoch 2022/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0137 - val_loss: 0.0146\n", "Epoch 2023/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0134 - val_loss: 0.0148\n", "Epoch 2024/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0148 - val_loss: 0.0149\n", "Epoch 2025/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0136 - val_loss: 0.0148\n", "Epoch 2026/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0141 - val_loss: 0.0162\n", "Epoch 2027/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0139 - val_loss: 0.0144\n", "Epoch 2028/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0137 - val_loss: 0.0153\n", "Epoch 2029/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0135 - val_loss: 0.0147\n", "Epoch 2030/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0133 - val_loss: 0.0154\n", "Epoch 2031/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0138 - val_loss: 0.0147\n", "Epoch 2032/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0136 - val_loss: 0.0148\n", "Epoch 2033/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0133 - val_loss: 0.0153\n", "Epoch 2034/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0133 - val_loss: 0.0146\n", "Epoch 2035/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0137 - val_loss: 0.0144\n", "Epoch 2036/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0136 - val_loss: 0.0146\n", "Epoch 2037/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0133 - val_loss: 0.0149\n", "Epoch 2038/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0133 - val_loss: 0.0144\n", "Epoch 2039/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0135 - val_loss: 0.0148\n", "Epoch 2040/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0131 - val_loss: 0.0153\n", "Epoch 2041/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0132 - val_loss: 0.0142\n", "Epoch 2042/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0135 - val_loss: 0.0145\n", "Epoch 2043/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0133 - val_loss: 0.0151\n", "Epoch 2044/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0134 - val_loss: 0.0146\n", "Epoch 2045/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0143 - val_loss: 0.0140\n", "Epoch 2046/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0139 - val_loss: 0.0163\n", "Epoch 2047/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0142 - val_loss: 0.0147\n", "Epoch 2048/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0143 - val_loss: 0.0148\n", "Epoch 2049/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0143 - val_loss: 0.0151\n", "Epoch 2050/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0134 - val_loss: 
0.0142\n", "Epoch 2051/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0133 - val_loss: 0.0147\n", "Epoch 2052/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0134 - val_loss: 0.0140\n", "Epoch 2053/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0132 - val_loss: 0.0141\n", "Epoch 2054/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0135 - val_loss: 0.0148\n", "Epoch 2055/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0136 - val_loss: 0.0149\n", "Epoch 2056/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0132 - val_loss: 0.0140\n", "Epoch 2057/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.008 - 0s 93us/step - loss: 0.0135 - val_loss: 0.0145\n", "Epoch 2058/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0134 - val_loss: 0.0157\n", "Epoch 2059/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0148 - val_loss: 0.0140\n", "Epoch 2060/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0141 - val_loss: 0.0149\n", "Epoch 2061/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0134 - val_loss: 0.0143\n", "Epoch 2062/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0135 - val_loss: 0.0149\n", "Epoch 2063/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0135 - val_loss: 0.0142\n", "Epoch 2064/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0142 - val_loss: 0.0141\n", "Epoch 2065/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0135 - val_loss: 0.0162\n", "Epoch 2066/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0135 - val_loss: 0.0140\n", "Epoch 2067/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0141 - val_loss: 0.0142\n", "Epoch 2068/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0140 - val_loss: 0.0154\n", "Epoch 2069/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0139 - val_loss: 0.0138\n", "Epoch 2070/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0131 - val_loss: 0.0149\n", "Epoch 2071/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0140 - val_loss: 0.0139\n", "Epoch 2072/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0130 - val_loss: 0.0151\n", "Epoch 2073/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0137 - val_loss: 0.0139\n", "Epoch 2074/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0131 - val_loss: 0.0146\n", "Epoch 2075/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0131 - val_loss: 0.0150\n", "Epoch 2076/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0132 - val_loss: 0.0140\n", "Epoch 2077/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0133 - val_loss: 0.0141\n", "Epoch 2078/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0141 - val_loss: 0.0145\n", "Epoch 2079/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0131 - val_loss: 0.0149\n", "Epoch 2080/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0135 - val_loss: 0.0144\n", "Epoch 2081/5000\n", "150/150 
[==============================] - 0s 73us/step - loss: 0.0142 - val_loss: 0.0150\n", "Epoch 2082/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0143 - val_loss: 0.0140\n", "Epoch 2083/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 67us/step - loss: 0.0136 - val_loss: 0.0155\n", "Epoch 2084/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0129 - val_loss: 0.0141\n", "Epoch 2085/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0134 - val_loss: 0.0140\n", "Epoch 2086/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0128 - val_loss: 0.0144\n", "Epoch 2087/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0135 - val_loss: 0.0142\n", "Epoch 2088/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0128 - val_loss: 0.0138\n", "Epoch 2089/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0132 - val_loss: 0.0148\n", "Epoch 2090/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0132 - val_loss: 0.0142\n", "Epoch 2091/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0127 - val_loss: 0.0154\n", "Epoch 2092/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0135 - val_loss: 0.0142\n", "Epoch 2093/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0135 - val_loss: 0.0135\n", "Epoch 2094/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0137 - val_loss: 0.0145\n", "Epoch 2095/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0146 - val_loss: 0.0134\n", "Epoch 2096/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0134 - val_loss: 0.0160\n", "Epoch 2097/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0138 - val_loss: 0.0140\n", "Epoch 2098/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0134 - val_loss: 0.0146\n", "Epoch 2099/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0138 - val_loss: 0.0140\n", "Epoch 2100/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0142 - val_loss: 0.0140\n", "Epoch 2101/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0147 - val_loss: 0.0167\n", "Epoch 2102/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0123 - val_loss: 0.0142\n", "Epoch 2103/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0155 - val_loss: 0.0135\n", "Epoch 2104/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0147 - val_loss: 0.0148\n", "Epoch 2105/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0131 - val_loss: 0.0136\n", "Epoch 2106/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0139 - val_loss: 0.0152\n", "Epoch 2107/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0130 - val_loss: 0.0145\n", "Epoch 2108/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0139 - val_loss: 0.0144\n", "Epoch 2109/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0128 - val_loss: 0.0155\n", "Epoch 2110/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0135 - val_loss: 0.0138\n", "Epoch 2111/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0128 - val_loss: 0.0135\n", "[ per-epoch progress output for epochs 2112-2747 omitted for brevity; over this span the training loss decreased from roughly 0.013 to 0.009 and the validation loss from roughly 0.0135 to 0.0115 ]\n", "Epoch 2748/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0097 - val_loss: 0.0117\n", "Epoch 2749/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0098 - val_loss: 0.0124\n", "Epoch 2750/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0100 - val_loss: 0.0125\n", "Epoch 2751/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0092 - val_loss: 0.0127\n", "Epoch 2752/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0096 - val_loss: 0.0129\n", "Epoch 2753/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0091 - val_loss: 0.0113\n", "Epoch 2754/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0116\n", "Epoch 2755/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 2756/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 2757/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0114\n", "Epoch 2758/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 2759/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0096 - val_loss: 0.0118\n", "Epoch 2760/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0126\n", "Epoch 2761/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.009 - 0s 73us/step - loss: 0.0090 - val_loss: 0.0116\n", "Epoch 2762/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0115\n", "Epoch 2763/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0116\n", "Epoch 2764/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 2765/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 2766/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0093 - val_loss: 0.0121\n", "Epoch 2767/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0106 - val_loss: 0.0115\n", "Epoch 2768/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0097 - val_loss: 0.0117\n", "Epoch 2769/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0130\n", "Epoch 2770/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 2771/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0123\n", "Epoch 2772/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0113\n", "Epoch 2773/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0118\n", "Epoch 2774/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0114\n", "Epoch 2775/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 2776/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 93us/step - loss: 0.0088 - val_loss: 0.0114\n", "Epoch 2777/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0114\n", "Epoch 
2778/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0114\n", "Epoch 2779/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0116\n", "Epoch 2780/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 2781/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 2782/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0115\n", "Epoch 2783/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 2784/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0094 - val_loss: 0.0117\n", "Epoch 2785/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 2786/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0114\n", "Epoch 2787/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 2788/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 2789/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0126\n", "Epoch 2790/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0116\n", "Epoch 2791/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 2792/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0088 - val_loss: 0.0116\n", "Epoch 2793/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 2794/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 2795/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 2796/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 2797/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 2798/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 2799/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0091 - val_loss: 0.0116\n", "Epoch 2800/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0101 - val_loss: 0.0115\n", "Epoch 2801/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 2802/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 2803/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0096 - val_loss: 0.0121\n", "Epoch 2804/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0101 - val_loss: 0.0119\n", "Epoch 2805/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0097 - val_loss: 0.0124\n", "Epoch 2806/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0111 - val_loss: 0.0124\n", "Epoch 2807/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0119 - val_loss: 0.0121\n", "Epoch 2808/5000\n", "150/150 [==============================] - 0s 
87us/step - loss: 0.0101 - val_loss: 0.0117\n", "Epoch 2809/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0096 - val_loss: 0.0122\n", "Epoch 2810/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 2811/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0130\n", "Epoch 2812/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0114\n", "Epoch 2813/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 2814/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 2815/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 2816/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0116\n", "Epoch 2817/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0115\n", "Epoch 2818/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 2819/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0115\n", "Epoch 2820/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0126\n", "Epoch 2821/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0117\n", "Epoch 2822/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0096 - val_loss: 0.0117\n", "Epoch 2823/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0115\n", "Epoch 2824/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 2825/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 2826/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0115\n", "Epoch 2827/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0118\n", "Epoch 2828/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 2829/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0119\n", "Epoch 2830/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 2831/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 2832/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 2833/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 2834/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0114\n", "Epoch 2835/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 2836/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 2837/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0116\n", "Epoch 2838/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 2839/5000\n", 
"150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 2840/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0092 - val_loss: 0.0118\n", "Epoch 2841/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 2842/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 2843/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 2844/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0094 - val_loss: 0.0120\n", "Epoch 2845/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0093 - val_loss: 0.0121\n", "Epoch 2846/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0091 - val_loss: 0.0126\n", "Epoch 2847/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0094 - val_loss: 0.0118\n", "Epoch 2848/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 2849/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0093 - val_loss: 0.0114\n", "Epoch 2850/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0126\n", "Epoch 2851/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0090 - val_loss: 0.0118\n", "Epoch 2852/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0120\n", "Epoch 2853/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 2854/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.011 - 0s 80us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 2855/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 2856/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0114\n", "Epoch 2857/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 2858/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0093 - val_loss: 0.0117\n", "Epoch 2859/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 2860/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0092 - val_loss: 0.0118\n", "Epoch 2861/5000\n", "150/150 [==============================] - 0s 147us/step - loss: 0.0092 - val_loss: 0.0119\n", "Epoch 2862/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0102 - val_loss: 0.0121\n", "Epoch 2863/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0093 - val_loss: 0.0121\n", "Epoch 2864/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 2865/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0128\n", "Epoch 2866/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 2867/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0093 - val_loss: 0.0122\n", "Epoch 2868/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0120\n", 
"Epoch 2869/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 2870/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 2871/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0114\n", "Epoch 2872/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0097 - val_loss: 0.0125\n", "Epoch 2873/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 2874/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0094 - val_loss: 0.0126\n", "Epoch 2875/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 2876/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 2877/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0116\n", "Epoch 2878/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 2879/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 2880/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 2881/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 2882/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0126\n", "Epoch 2883/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 2884/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 2885/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 2886/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 2887/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 2888/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 2889/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0118\n", "Epoch 2890/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 2891/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 2892/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 2893/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 2894/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 2895/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.009 - 0s 93us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 2896/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 2897/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 2898/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 2899/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 2900/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0088 - val_loss: 0.0121\n", "Epoch 2901/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0106 - val_loss: 0.0118\n", "Epoch 2902/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0119 - val_loss: 0.0131\n", "Epoch 2903/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0110 - val_loss: 0.0119\n", "Epoch 2904/5000\n", "150/150 [==============================] - 0s 147us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 2905/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 2906/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0107 - val_loss: 0.0125\n", "Epoch 2907/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0119 - val_loss: 0.0123\n", "Epoch 2908/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0099 - val_loss: 0.0120\n", "Epoch 2909/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0105 - val_loss: 0.0131\n", "Epoch 2910/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0097 - val_loss: 0.0118\n", "Epoch 2911/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0095 - val_loss: 0.0133\n", "Epoch 2912/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0093 - val_loss: 0.0127\n", "Epoch 2913/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0135\n", "Epoch 2914/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0120\n", "Epoch 2915/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0128\n", "Epoch 2916/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0118\n", "Epoch 2917/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 2918/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 2919/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0099 - val_loss: 0.0125\n", "Epoch 2920/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0109 - val_loss: 0.0120\n", "Epoch 2921/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0108 - val_loss: 0.0124\n", "Epoch 2922/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0130\n", "Epoch 2923/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0123\n", "Epoch 2924/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0133\n", "Epoch 2925/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0094 - val_loss: 0.0117\n", "Epoch 2926/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0098 - val_loss: 0.0130\n", "Epoch 2927/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0110 - val_loss: 0.0115\n", "Epoch 2928/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0103 - val_loss: 0.0116\n", "Epoch 2929/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0110 - 
val_loss: 0.0126\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 2930/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0120\n", "Epoch 2931/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0088 - val_loss: 0.0131\n", "Epoch 2932/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0103 - val_loss: 0.0118\n", "Epoch 2933/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0128\n", "Epoch 2934/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 2935/5000\n", "150/150 [==============================] - 0s 200us/step - loss: 0.0084 - val_loss: 0.0127\n", "Epoch 2936/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0090 - val_loss: 0.0114\n", "Epoch 2937/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0087 - val_loss: 0.0128\n", "Epoch 2938/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0091 - val_loss: 0.0120\n", "Epoch 2939/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 2940/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0128\n", "Epoch 2941/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 2942/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 2943/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 2944/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 2945/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 2946/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 2947/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0123\n", "Epoch 2948/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 2949/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 2950/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0115\n", "Epoch 2951/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 2952/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 2953/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0114\n", "Epoch 2954/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 2955/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0087 - val_loss: 0.0116\n", "Epoch 2956/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 2957/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 2958/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 2959/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0083 
- val_loss: 0.0117\n", "Epoch 2960/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 2961/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 2962/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 2963/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 2964/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 2965/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 2966/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 2967/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 2968/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 2969/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 2970/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0116\n", "Epoch 2971/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 2972/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 2973/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0129\n", "Epoch 2974/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0093 - val_loss: 0.0118\n", "Epoch 2975/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 2976/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 2977/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 2978/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 2979/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 2980/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0116\n", "Epoch 2981/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0088 - val_loss: 0.0118\n", "Epoch 2982/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0115\n", "Epoch 2983/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 2984/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 2985/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 2986/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 2987/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 2988/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0116\n", "Epoch 2989/5000\n", "150/150 [==============================] - 0s 213us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 2990/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 2991/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 2992/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 2993/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 2994/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 2995/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 2996/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 2997/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 2998/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 2999/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0116\n", "Epoch 3000/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 3001/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 3002/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 3003/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 3004/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 3005/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 3006/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 3007/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 3008/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 3009/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0093 - val_loss: 0.0116\n", "Epoch 3010/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0123\n", "Epoch 3011/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 3012/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 3013/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 3014/5000\n", "150/150 [==============================] - 0s 233us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 3015/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 3016/5000\n", "150/150 [==============================] - 0s 326us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 3017/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 3018/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 3019/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 3020/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 3021/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0086 - val_loss: 0.0116\n", "Epoch 3022/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 3023/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 3024/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 3025/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0131\n", "Epoch 3026/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0099 - val_loss: 0.0122\n", "Epoch 3027/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0115\n", "Epoch 3028/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0119\n", "Epoch 3029/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 3030/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 3031/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0126\n", "Epoch 3032/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 3033/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0089 - val_loss: 0.0116\n", "Epoch 3034/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 3035/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0126\n", "Epoch 3036/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 3037/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 3038/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 3039/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 3040/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 3041/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 3042/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 3043/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 3044/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0123\n", "Epoch 3045/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0093 - val_loss: 0.0114\n", "Epoch 3046/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 3047/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 3048/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 3049/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 3050/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - 
val_loss: 0.0122\n", "Epoch 3051/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3052/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0095 - val_loss: 0.0120\n", "Epoch 3053/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0092 - val_loss: 0.0116\n", "Epoch 3054/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 3055/5000\n", "150/150 [==============================] - 0s 207us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 3056/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0127\n", "Epoch 3057/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 3058/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 3059/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 3060/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 3061/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 3062/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0094 - val_loss: 0.0117\n", "Epoch 3063/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0095 - val_loss: 0.0122\n", "Epoch 3064/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0119\n", "Epoch 3065/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0115\n", "Epoch 3066/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0123\n", "Epoch 3067/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 3068/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0130\n", "Epoch 3069/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 3070/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0087 - val_loss: 0.0132\n", "Epoch 3071/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0096 - val_loss: 0.0119\n", "Epoch 3072/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0122\n", "Epoch 3073/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 3074/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0090 - val_loss: 0.0124\n", "Epoch 3075/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0116\n", "Epoch 3076/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0102 - val_loss: 0.0124\n", "Epoch 3077/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 3078/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 3079/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 3080/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3081/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 3082/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0098 - val_loss: 0.0122\n", "Epoch 3083/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0093 - val_loss: 0.0123\n", "Epoch 3084/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 3085/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0124\n", "Epoch 3086/5000\n", "150/150 [==============================] - 0s 153us/step - loss: 0.0101 - val_loss: 0.0117\n", "Epoch 3087/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0100 - val_loss: 0.0124\n", "Epoch 3088/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 3089/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0133\n", "Epoch 3090/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0098 - val_loss: 0.0120\n", "Epoch 3091/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0109 - val_loss: 0.0117\n", "Epoch 3092/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0095 - val_loss: 0.0116\n", "Epoch 3093/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0142\n", "Epoch 3094/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 3095/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0092 - val_loss: 0.0125\n", "Epoch 3096/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3097/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0129\n", "Epoch 3098/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0116\n", "Epoch 3099/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 3100/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 3101/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 3102/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0096 - val_loss: 0.0120\n", "Epoch 3103/5000\n", "150/150 [==============================] - 0s 240us/step - loss: 0.0103 - val_loss: 0.0125\n", "Epoch 3104/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0103 - val_loss: 0.0131\n", "Epoch 3105/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0097 - val_loss: 0.0143\n", "Epoch 3106/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0095 - val_loss: 0.0134\n", "Epoch 3107/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0093 - val_loss: 0.0128\n", "Epoch 3108/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 3109/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0122\n", "Epoch 3110/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0093 - val_loss: 0.0127\n", "Epoch 3111/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0094 - val_loss: 0.0121\n", "Epoch 3112/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 3113/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 3114/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 3115/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0114\n", "Epoch 3116/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 3117/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3118/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0121\n", "Epoch 3119/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 3120/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0100 - val_loss: 0.0122\n", "Epoch 3121/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0130\n", "Epoch 3122/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0100 - val_loss: 0.0125\n", "Epoch 3123/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 3124/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0115\n", "Epoch 3125/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0127\n", "Epoch 3126/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0089 - val_loss: 0.0117\n", "Epoch 3127/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 3128/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 3129/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 3130/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 3131/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 3132/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 3133/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 3134/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 3135/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 3136/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 3137/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 3138/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0094 - val_loss: 0.0120\n", "Epoch 3139/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0127\n", "Epoch 3140/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0093 - val_loss: 0.0121\n", "Epoch 3141/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - 
val_loss: 0.0120\n", "Epoch 3142/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0117\n", "[... training log truncated: epochs 3143-3774 continue in the same pattern, with training loss fluctuating around 0.008-0.012 and validation loss around 0.012-0.016 ...]\n", "Epoch 3775/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 80us/step - loss: 0.0080 - val_loss: 0.0123\n", "Epoch 3776/5000\n", "150/150 
[==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 3777/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 3778/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0097 - val_loss: 0.0124\n", "Epoch 3779/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0097 - val_loss: 0.0127\n", "Epoch 3780/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0099 - val_loss: 0.0121\n", "Epoch 3781/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0111 - val_loss: 0.0120\n", "Epoch 3782/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0111 - val_loss: 0.0130\n", "Epoch 3783/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0099 - val_loss: 0.0135\n", "Epoch 3784/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0092 - val_loss: 0.0129\n", "Epoch 3785/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 3786/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0098 - val_loss: 0.0126\n", "Epoch 3787/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0129\n", "Epoch 3788/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0134\n", "Epoch 3789/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 3790/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 3791/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 3792/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0122\n", "Epoch 3793/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 3794/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 3795/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 3796/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 3797/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 3798/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0130\n", "Epoch 3799/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 3800/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0124\n", "Epoch 3801/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0115\n", "Epoch 3802/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 3803/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 3804/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0127\n", "Epoch 3805/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 3806/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 
- val_loss: 0.0120\n", "Epoch 3807/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 3808/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0125\n", "Epoch 3809/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 3810/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 3811/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 3812/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0125\n", "Epoch 3813/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0117\n", "Epoch 3814/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 3815/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3816/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0123\n", "Epoch 3817/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0090 - val_loss: 0.0124\n", "Epoch 3818/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 3819/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 3820/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0127\n", "Epoch 3821/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 3822/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0126\n", "Epoch 3823/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 3824/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 3825/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 3826/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 3827/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 3828/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0130\n", "Epoch 3829/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 3830/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 3831/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 3832/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 3833/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 3834/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 3835/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 3836/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 3837/5000\n", "150/150 
[==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 3838/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 3839/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 3840/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 3841/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0117\n", "Epoch 3842/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0080 - val_loss: 0.0118\n", "Epoch 3843/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 3844/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 3845/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 3846/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 3847/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 3848/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 3849/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 3850/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.008 - 0s 93us/step - loss: 0.0083 - val_loss: 0.0127\n", "Epoch 3851/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 3852/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 3853/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0133\n", "Epoch 3854/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 3855/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 3856/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 3857/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0129\n", "Epoch 3858/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 3859/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 3860/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0080 - val_loss: 0.0126\n", "Epoch 3861/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0128\n", "Epoch 3862/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 3863/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 3864/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.008 - 0s 93us/step - loss: 0.0083 - val_loss: 0.0125\n", "Epoch 3865/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 3866/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 
0.0120\n", "Epoch 3867/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0127\n", "Epoch 3868/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.005 - 0s 87us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 3869/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 3870/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.007 - 0s 93us/step - loss: 0.0080 - val_loss: 0.0121\n", "Epoch 3871/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 3872/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0128\n", "Epoch 3873/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 3874/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0095 - val_loss: 0.0120\n", "Epoch 3875/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0093 - val_loss: 0.0121\n", "Epoch 3876/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0098 - val_loss: 0.0123\n", "Epoch 3877/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0128\n", "Epoch 3878/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0120\n", "Epoch 3879/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0095 - val_loss: 0.0118\n", "Epoch 3880/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 3881/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0088 - val_loss: 0.0129\n", "Epoch 3882/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0092 - val_loss: 0.0119\n", "Epoch 3883/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0092 - val_loss: 0.0129\n", "Epoch 3884/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 3885/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0128\n", "Epoch 3886/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 3887/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 3888/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 3889/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 3890/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 3891/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0096 - val_loss: 0.0121\n", "Epoch 3892/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0129\n", "Epoch 3893/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 3894/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 3895/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 3896/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 
3897/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 3898/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 3899/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 3900/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 3901/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0132\n", "Epoch 3902/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 3903/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 3904/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 3905/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0131\n", "Epoch 3906/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0118\n", "Epoch 3907/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 3908/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3909/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0097 - val_loss: 0.0122\n", "Epoch 3910/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0093 - val_loss: 0.0123\n", "Epoch 3911/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0112 - val_loss: 0.0124\n", "Epoch 3912/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0124 - val_loss: 0.0119\n", "Epoch 3913/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0102 - val_loss: 0.0124\n", "Epoch 3914/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0122 - val_loss: 0.0119\n", "Epoch 3915/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0098 - val_loss: 0.0140\n", "Epoch 3916/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0099 - val_loss: 0.0130\n", "Epoch 3917/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0154\n", "Epoch 3918/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0127\n", "Epoch 3919/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0101 - val_loss: 0.0144\n", "Epoch 3920/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0096 - val_loss: 0.0121\n", "Epoch 3921/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0123\n", "Epoch 3922/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 3923/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 3924/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0130\n", "Epoch 3925/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 3926/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0094 - val_loss: 0.0120\n", "Epoch 3927/5000\n", "150/150 [==============================] - 0s 67us/step - 
loss: 0.0088 - val_loss: 0.0127\n", "Epoch 3928/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0126\n", "Epoch 3929/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 3930/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0126\n", "Epoch 3931/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 3932/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0125\n", "Epoch 3933/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 3934/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0092 - val_loss: 0.0120\n", "Epoch 3935/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0124\n", "Epoch 3936/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0091 - val_loss: 0.0134\n", "Epoch 3937/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0089 - val_loss: 0.0140\n", "Epoch 3938/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0127\n", "Epoch 3939/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 3940/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0118\n", "Epoch 3941/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0133\n", "Epoch 3942/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0136\n", "Epoch 3943/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0091 - val_loss: 0.0131\n", "Epoch 3944/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 3945/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 3946/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0128\n", "Epoch 3947/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0093 - val_loss: 0.0121\n", "Epoch 3948/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0133\n", "Epoch 3949/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 3950/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0134\n", "Epoch 3951/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 3952/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 3953/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 3954/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0129\n", "Epoch 3955/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0125\n", "Epoch 3956/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 3957/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 
0.0088 - val_loss: 0.0120\n", "Epoch 3958/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 3959/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3960/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 3961/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 3962/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 3963/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 3964/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 3965/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0131\n", "Epoch 3966/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 3967/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 3968/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0080 - val_loss: 0.0123\n", "Epoch 3969/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 3970/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0093 - val_loss: 0.0123\n", "Epoch 3971/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0122\n", "Epoch 3972/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 3973/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 3974/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 3975/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0129\n", "Epoch 3976/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0088 - val_loss: 0.0123\n", "Epoch 3977/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 3978/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0093 - val_loss: 0.0118\n", "Epoch 3979/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0092 - val_loss: 0.0124\n", "Epoch 3980/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 3981/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0093 - val_loss: 0.0118\n", "Epoch 3982/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 3983/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0125\n", "Epoch 3984/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0097 - val_loss: 0.0123\n", "Epoch 3985/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0123\n", "Epoch 3986/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0127\n", "Epoch 3987/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 3988/5000\n", "150/150 
[==============================] - 0s 60us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 3989/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 3990/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 3991/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 3992/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 3993/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 3994/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 3995/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0090 - val_loss: 0.0125\n", "Epoch 3996/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0116\n", "Epoch 3997/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 3998/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0127\n", "Epoch 3999/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4000/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0126\n", "Epoch 4001/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 4002/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0132\n", "Epoch 4003/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0130\n", "Epoch 4004/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0095 - val_loss: 0.0117\n", "Epoch 4005/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0123\n", "Epoch 4006/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4007/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4008/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0115\n", "Epoch 4009/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4010/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4011/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0125\n", "Epoch 4012/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4013/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4014/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 4015/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0128\n", "Epoch 4016/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4017/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0117\n", "Epoch 4018/5000\n", "150/150 
[==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4019/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0080 - val_loss: 0.0121\n", "Epoch 4020/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 4021/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4022/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4023/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4024/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0123\n", "Epoch 4025/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4026/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4027/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4028/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4029/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 4030/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4031/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0093 - val_loss: 0.0128\n", "Epoch 4032/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0128\n", "Epoch 4033/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0129\n", "Epoch 4034/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4035/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0118\n", "Epoch 4036/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 4037/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0124\n", "Epoch 4038/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0128\n", "Epoch 4039/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4040/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 4041/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4042/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 4043/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0080 - val_loss: 0.0116\n", "Epoch 4044/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.006 - 0s 93us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4045/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4046/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 4047/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4048/5000\n", "150/150 [==============================] - 0s 
67us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4049/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0093 - val_loss: 0.0117\n", "Epoch 4050/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0096 - val_loss: 0.0123\n", "Epoch 4051/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0093 - val_loss: 0.0127\n", "Epoch 4052/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0093 - val_loss: 0.0123\n", "Epoch 4053/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0095 - val_loss: 0.0126\n", "Epoch 4054/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0097 - val_loss: 0.0115\n", "Epoch 4055/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0137\n", "Epoch 4056/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4057/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0094 - val_loss: 0.0127\n", "Epoch 4058/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0123\n", "Epoch 4059/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0129\n", "Epoch 4060/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4061/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 4062/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0104 - val_loss: 0.0123\n", "Epoch 4063/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0116 - val_loss: 0.0117\n", "Epoch 4064/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0109 - val_loss: 0.0128\n", "Epoch 4065/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0097 - val_loss: 0.0126\n", "Epoch 4066/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0134\n", "Epoch 4067/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4068/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4069/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4070/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0096 - val_loss: 0.0126\n", "Epoch 4071/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0091 - val_loss: 0.0116\n", "Epoch 4072/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 4073/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0083 - val_loss: 0.0131\n", "Epoch 4074/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4075/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4076/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4077/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0126\n", "Epoch 4078/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4079/5000\n", 
"150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4080/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0088 - val_loss: 0.0121\n", "Epoch 4081/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0140\n", "Epoch 4082/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0125\n", "Epoch 4083/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0094 - val_loss: 0.0130\n", "Epoch 4084/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0095 - val_loss: 0.0126\n", "Epoch 4085/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4086/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.008 - 0s 73us/step - loss: 0.0081 - val_loss: 0.0123\n", "Epoch 4087/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4088/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 4089/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4090/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0117\n", "Epoch 4091/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4092/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 4093/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 4094/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0116\n", "Epoch 4095/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0098 - val_loss: 0.0133\n", "Epoch 4096/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0094 - val_loss: 0.0127\n", "Epoch 4097/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0091 - val_loss: 0.0116\n", "Epoch 4098/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0125\n", "Epoch 4099/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4100/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0086 - val_loss: 0.0133\n", "Epoch 4101/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4102/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 4103/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 4104/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0123\n", "Epoch 4105/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4106/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 4107/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4108/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 
4109/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 4110/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4111/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0081 - val_loss: 0.0125\n", "Epoch 4112/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4113/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4114/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4115/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0116\n", "Epoch 4116/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0092 - val_loss: 0.0125\n", "Epoch 4117/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4118/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0090 - val_loss: 0.0126\n", "Epoch 4119/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0100 - val_loss: 0.0124\n", "Epoch 4120/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0087 - val_loss: 0.0122\n", "Epoch 4121/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0089 - val_loss: 0.0134\n", "Epoch 4122/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4123/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0088 - val_loss: 0.0129\n", "Epoch 4124/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 4125/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0087 - val_loss: 0.0131\n", "Epoch 4126/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 4127/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0080 - val_loss: 0.0125\n", "Epoch 4128/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4129/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4130/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0117\n", "Epoch 4131/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4132/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4133/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.006 - 0s 60us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4134/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4135/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 4136/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4137/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0080 - val_loss: 0.0121\n", "Epoch 4138/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0090 - val_loss: 0.0117\n", "Epoch 4139/5000\n", "150/150 
[==============================] - 0s 60us/step - loss: 0.0089 - val_loss: 0.0126\n", "Epoch 4140/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 4141/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0136\n", "Epoch 4142/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0086 - val_loss: 0.0116\n", "Epoch 4143/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 4144/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0089 - val_loss: 0.0127\n", "Epoch 4145/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0081 - val_loss: 0.0124\n", "Epoch 4146/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4147/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4148/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4149/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4150/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.010 - 0s 60us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 4151/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0084 - val_loss: 0.0128\n", "Epoch 4152/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4153/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4154/5000\n", "150/150 [==============================] - 0s 53us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4155/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4156/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4157/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 4158/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4159/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0085 - val_loss: 0.0125\n", "Epoch 4160/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 4161/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4162/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 4163/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4164/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 4165/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0095 - val_loss: 0.0129\n", "Epoch 4166/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0093 - val_loss: 0.0127\n", "Epoch 4167/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0098 - val_loss: 0.0126\n", "Epoch 4168/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0128\n", "Epoch 
4169/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0129\n", "Epoch 4170/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4171/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4172/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 4173/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0078 - val_loss: 0.0131\n", "Epoch 4174/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4175/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.009 - 0s 87us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4176/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4177/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4178/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4179/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 4180/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0127\n", "Epoch 4181/5000\n", "150/150 [==============================] - 0s 153us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4182/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0126\n", "Epoch 4183/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 4184/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0118\n", "Epoch 4185/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0095 - val_loss: 0.0133\n", "Epoch 4186/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0121\n", "Epoch 4187/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0079 - val_loss: 0.0131\n", "Epoch 4188/5000\n", "150/150 [==============================] - 0s 346us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4189/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4190/5000\n", "150/150 [==============================] - 0s 213us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 4191/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0101 - val_loss: 0.0120\n", "Epoch 4192/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0093 - val_loss: 0.0120\n", "Epoch 4193/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0101 - val_loss: 0.0125\n", "Epoch 4194/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0107 - val_loss: 0.0126\n", "Epoch 4195/5000\n", "150/150 [==============================] - 0s 247us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4196/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0094 - val_loss: 0.0133\n", "Epoch 4197/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.005 - 0s 87us/step - loss: 0.0094 - val_loss: 0.0122\n", "Epoch 4198/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0128\n", "Epoch 4199/5000\n", 
"150/150 [==============================] - 0s 113us/step - loss: 0.0093 - val_loss: 0.0123\n", "Epoch 4200/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4201/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4202/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0127\n", "Epoch 4203/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4204/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0090 - val_loss: 0.0134\n", "Epoch 4205/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0126\n", "Epoch 4206/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4207/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4208/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 4209/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4210/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4211/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 4212/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0089 - val_loss: 0.0122\n", "Epoch 4213/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4214/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4215/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 4216/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4217/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0121\n", "Epoch 4218/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4219/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4220/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4221/5000\n", "150/150 [==============================] - 0s 207us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4222/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4223/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4224/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4225/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4226/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4227/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0122\n", "Epoch 4228/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4229/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 
0.0083 - val_loss: 0.0121\n", "Epoch 4230/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4231/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4232/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4233/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4234/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0092 - val_loss: 0.0122\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 4235/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0094 - val_loss: 0.0117\n", "Epoch 4236/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 4237/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0080 - val_loss: 0.0121\n", "Epoch 4238/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0126\n", "Epoch 4239/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 4240/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4241/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4242/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 4243/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0123\n", "Epoch 4244/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0119\n", "Epoch 4245/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0095 - val_loss: 0.0123\n", "Epoch 4246/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0096 - val_loss: 0.0116\n", "Epoch 4247/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0094 - val_loss: 0.0131\n", "Epoch 4248/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 4249/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4250/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4251/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 4252/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4253/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4254/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4255/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.007 - 0s 87us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 4256/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4257/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0123\n", "Epoch 4258/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 4259/5000\n", "150/150 [==============================] - 0s 
67us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4260/5000\n", "150/150 [==============================] - 0s 160us/step - loss: 0.0088 - val_loss: 0.0121\n", "Epoch 4261/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4262/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 4263/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4264/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0126\n", "Epoch 4265/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4266/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4267/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 4268/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4269/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0122\n", "Epoch 4270/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 4271/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4272/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 4273/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0093 - val_loss: 0.0125\n", "Epoch 4274/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4275/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0093 - val_loss: 0.0127\n", "Epoch 4276/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0125\n", "Epoch 4277/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 4278/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0126\n", "Epoch 4279/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0127\n", "Epoch 4280/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4281/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0116\n", "Epoch 4282/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0131\n", "Epoch 4283/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4284/5000\n", "150/150 [==============================] - 0s 160us/step - loss: 0.0084 - val_loss: 0.0129\n", "Epoch 4285/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 4286/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4287/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0116\n", "Epoch 4288/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 4289/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 
4290/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0135\n", "Epoch 4291/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0122\n", "Epoch 4292/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4293/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0116\n", "Epoch 4294/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 4295/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 4296/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4297/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0128\n", "Epoch 4298/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4299/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4300/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4301/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4302/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 4303/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4304/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4305/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4306/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 4307/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0080 - val_loss: 0.0121\n", "Epoch 4308/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 4309/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0095 - val_loss: 0.0119\n", "Epoch 4310/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0095 - val_loss: 0.0124\n", "Epoch 4311/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0126\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 4312/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4313/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0130\n", "Epoch 4314/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 4315/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0093 - val_loss: 0.0125\n", "Epoch 4316/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 4317/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 4318/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0092 - val_loss: 0.0126\n", "Epoch 4319/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 
4320/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4321/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0080 - val_loss: 0.0119\n", "Epoch 4322/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4323/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4324/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 4325/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4326/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4327/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 4328/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0081 - val_loss: 0.0117\n", "Epoch 4329/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4330/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 4331/5000\n", "150/150 [==============================] - 0s 153us/step - loss: 0.0094 - val_loss: 0.0126\n", "Epoch 4332/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0125\n", "Epoch 4333/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 4334/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0091 - val_loss: 0.0122\n", "Epoch 4335/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 4336/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4337/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4338/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0128\n", "Epoch 4339/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4340/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0131\n", "Epoch 4341/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 4342/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 4343/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4344/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4345/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4346/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0126\n", "Epoch 4347/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0098 - val_loss: 0.0121\n", "Epoch 4348/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0118\n", "Epoch 4349/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4350/5000\n", "150/150 [==============================] - 0s 
93us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4351/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4352/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4353/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0086 - val_loss: 0.0115\n", "Epoch 4354/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4355/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4356/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4357/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 4358/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 4359/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4360/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4361/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 4362/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4363/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4364/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4365/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4366/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0127\n", "Epoch 4367/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0097 - val_loss: 0.0117\n", "Epoch 4368/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 4369/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0079 - val_loss: 0.0136\n", "Epoch 4370/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4371/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4372/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4373/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4374/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4375/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0080 - val_loss: 0.0115\n", "Epoch 4376/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 4377/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4378/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4379/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4380/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0132\n", "Epoch 4381/5000\n", 
"150/150 [==============================] - 0s 60us/step - loss: 0.0093 - val_loss: 0.0118\n", "Epoch 4382/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 4383/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 4384/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0131\n", "Epoch 4385/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4386/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0096 - val_loss: 0.0128\n", "Epoch 4387/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0094 - val_loss: 0.0119\n", "Epoch 4388/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0123\n", "Epoch 4389/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 4390/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0094 - val_loss: 0.0117\n", "Epoch 4391/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.010 - 0s 66us/step - loss: 0.0097 - val_loss: 0.0129\n", "Epoch 4392/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0078 - val_loss: 0.0130\n", "Epoch 4393/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0094 - val_loss: 0.0138\n", "Epoch 4394/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 4395/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0123\n", "Epoch 4396/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0094 - val_loss: 0.0125\n", "Epoch 4397/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0126\n", "Epoch 4398/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 4399/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4400/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4401/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4402/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4403/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0115\n", "Epoch 4404/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4405/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4406/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4407/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4408/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4409/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4410/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 
4411/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4412/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4413/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.005 - 0s 87us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4414/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4415/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 4416/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0126\n", "Epoch 4417/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0123\n", "Epoch 4418/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4419/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0080 - val_loss: 0.0118\n", "Epoch 4420/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4421/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0080 - val_loss: 0.0120\n", "Epoch 4422/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4423/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4424/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0117\n", "Epoch 4425/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 4426/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 4427/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 4428/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0080 - val_loss: 0.0128\n", "Epoch 4429/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4430/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0130\n", "Epoch 4431/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0095 - val_loss: 0.0120\n", "Epoch 4432/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0131\n", "Epoch 4433/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0127\n", "Epoch 4434/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0129\n", "Epoch 4435/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4436/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 4437/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0128\n", "Epoch 4438/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0131\n", "Epoch 4439/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0133\n", "Epoch 4440/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 4441/5000\n", "150/150 
[==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4442/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0127\n", "Epoch 4443/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4444/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4445/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0079 - val_loss: 0.0127\n", "Epoch 4446/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 4447/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4448/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4449/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4450/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4451/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0120\n", "Epoch 4452/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0119\n", "Epoch 4453/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4454/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0092 - val_loss: 0.0121\n", "Epoch 4455/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4456/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 4457/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0126\n", "Epoch 4458/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0124\n", "Epoch 4459/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4460/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0091 - val_loss: 0.0121\n", "Epoch 4461/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4462/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 4463/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4464/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4465/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0122\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 4466/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0093 - val_loss: 0.0127\n", "Epoch 4467/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0102 - val_loss: 0.0119\n", "Epoch 4468/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0122\n", "Epoch 4469/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 4470/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 4471/5000\n", "150/150 
[==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4472/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4473/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.005 - 0s 80us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4474/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4475/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.004 - 0s 80us/step - loss: 0.0083 - val_loss: 0.0125\n", "Epoch 4476/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 4477/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4478/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4479/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 4480/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0088 - val_loss: 0.0127\n", "Epoch 4481/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4482/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0126\n", "Epoch 4483/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0132\n", "Epoch 4484/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 4485/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0124\n", "Epoch 4486/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4487/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0127\n", "Epoch 4488/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 4489/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0123\n", "Epoch 4490/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4491/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0123\n", "Epoch 4492/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0128\n", "Epoch 4493/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4494/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4495/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4496/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 4497/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0099 - val_loss: 0.0118\n", "Epoch 4498/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4499/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 4500/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4501/5000\n", "150/150 
[==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 4502/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 4503/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 4504/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4505/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0127\n", "Epoch 4506/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0089 - val_loss: 0.0125\n", "Epoch 4507/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0116\n", "Epoch 4508/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0128\n", "Epoch 4509/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 4510/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4511/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 4512/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0127\n", "Epoch 4513/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4514/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 4515/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4516/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0097 - val_loss: 0.0118\n", "Epoch 4517/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0120\n", "Epoch 4518/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0129\n", "Epoch 4519/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0126\n", "Epoch 4520/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 4521/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0124\n", "Epoch 4522/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 4523/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0125\n", "Epoch 4524/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 4525/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4526/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4527/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0119\n", "Epoch 4528/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0127\n", "Epoch 4529/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4530/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0131\n", "Epoch 4531/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - 
val_loss: 0.0117\n", "Epoch 4532/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 4533/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4534/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0126\n", "Epoch 4535/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4536/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0080 - val_loss: 0.0119\n", "Epoch 4537/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4538/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 4539/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4540/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4541/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0094 - val_loss: 0.0119\n", "Epoch 4542/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 4543/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4544/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4545/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4546/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4547/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4548/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4549/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0122\n", "Epoch 4550/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 4551/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4552/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4553/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 4554/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0125\n", "Epoch 4555/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4556/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 4557/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0116\n", "Epoch 4558/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4559/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 4560/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4561/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 
0.0121\n", "Epoch 4562/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 4563/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4564/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4565/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4566/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4567/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4568/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.004 - 0s 73us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4569/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4570/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0124\n", "Epoch 4571/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4572/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4573/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4574/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4575/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 4576/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 4577/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0099 - val_loss: 0.0121\n", "Epoch 4578/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0107 - val_loss: 0.0121\n", "Epoch 4579/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0117\n", "Epoch 4580/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0128\n", "Epoch 4581/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4582/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4583/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4584/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4585/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4586/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0093 - val_loss: 0.0122\n", "Epoch 4587/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0121\n", "Epoch 4588/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 4589/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0126\n", "Epoch 4590/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 4591/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4592/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 4593/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0123\n", "Epoch 4594/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4595/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4596/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0095 - val_loss: 0.0120\n", "Epoch 4597/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0099 - val_loss: 0.0121\n", "Epoch 4598/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0095 - val_loss: 0.0126\n", "Epoch 4599/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4600/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 4601/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0116\n", "Epoch 4602/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4603/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 4604/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0081 - val_loss: 0.0123\n", "Epoch 4605/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0092 - val_loss: 0.0122\n", "Epoch 4606/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4607/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0093 - val_loss: 0.0118\n", "Epoch 4608/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0134\n", "Epoch 4609/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4610/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4611/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0091 - val_loss: 0.0122\n", "Epoch 4612/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4613/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0125\n", "Epoch 4614/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0116\n", "Epoch 4615/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4616/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4617/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0127\n", "Epoch 4618/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0123\n", "Epoch 4619/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4620/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4621/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4622/5000\n", "150/150 
[==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4623/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4624/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4625/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4626/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4627/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0095 - val_loss: 0.0121\n", "Epoch 4628/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0093 - val_loss: 0.0121\n", "Epoch 4629/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0093 - val_loss: 0.0123\n", "Epoch 4630/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0093 - val_loss: 0.0116\n", "Epoch 4631/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0093 - val_loss: 0.0124\n", "Epoch 4632/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4633/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0081 - val_loss: 0.0129\n", "Epoch 4634/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 4635/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4636/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4637/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4638/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4639/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4640/5000\n", "150/150 [==============================] - 0s 227us/step - loss: 0.0080 - val_loss: 0.0119\n", "Epoch 4641/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4642/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0128\n", "Epoch 4643/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4644/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0120\n", "Epoch 4645/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4646/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4647/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0127\n", "Epoch 4648/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0116\n", "Epoch 4649/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4650/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 4651/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4652/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - 
val_loss: 0.0124\n", "Epoch 4653/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4654/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4655/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4656/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 4657/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0124\n", "Epoch 4658/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 4659/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0127\n", "Epoch 4660/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4661/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0124\n", "Epoch 4662/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4663/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4664/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0097 - val_loss: 0.0127\n", "Epoch 4665/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0090 - val_loss: 0.0125\n", "Epoch 4666/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0108 - val_loss: 0.0125\n", "Epoch 4667/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0104 - val_loss: 0.0120\n", "Epoch 4668/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0127\n", "Epoch 4669/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4670/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4671/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4672/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4673/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0115\n", "Epoch 4674/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4675/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4676/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4677/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0094 - val_loss: 0.0122\n", "Epoch 4678/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4679/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0080 - val_loss: 0.0128\n", "Epoch 4680/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0091 - val_loss: 0.0119\n", "Epoch 4681/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0121\n", "Epoch 4682/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0084 - val_loss: 0.0126\n", "Epoch 4683/5000\n", "150/150 
[==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0127\n", "Epoch 4684/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4685/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4686/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0083 - val_loss: 0.0126\n", "Epoch 4687/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4688/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4689/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4690/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0127\n", "Epoch 4691/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 4692/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 4693/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4694/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4695/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4696/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0117\n", "Epoch 4697/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4698/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0080 - val_loss: 0.0124\n", "Epoch 4699/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4700/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 4701/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4702/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4703/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0122\n", "Epoch 4704/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0130\n", "Epoch 4705/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 4706/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0130\n", "Epoch 4707/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0125\n", "Epoch 4708/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4709/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4710/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4711/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4712/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0080 - val_loss: 0.0120\n", "Epoch 4713/5000\n", "150/150 
[==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 4714/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0127\n", "Epoch 4715/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4716/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4717/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4718/5000\n", "150/150 [==============================] - 0s 86us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4719/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0117\n", "Epoch 4720/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4721/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0121\n", "Epoch 4722/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0125\n", "Epoch 4723/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4724/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0119\n", "Epoch 4725/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0103 - val_loss: 0.0123\n", "Epoch 4726/5000\n", "150/150 [==============================] - 0s 120us/step - loss: 0.0108 - val_loss: 0.0129\n", "Epoch 4727/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0128\n", "Epoch 4728/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0127\n", "Epoch 4729/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0116\n", "Epoch 4730/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0125\n", "Epoch 4731/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4732/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4733/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0119\n", "Epoch 4734/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4735/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4736/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4737/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4738/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0115\n", "Epoch 4739/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4740/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4741/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0124\n", "Epoch 4742/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 4743/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - 
val_loss: 0.0123\n", "Epoch 4744/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4745/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0122\n", "Epoch 4746/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0116\n", "Epoch 4747/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0129\n", "Epoch 4748/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0093 - val_loss: 0.0119\n", "Epoch 4749/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 4750/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4751/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0123\n", "Epoch 4752/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0124\n", "Epoch 4753/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0122\n", "Epoch 4754/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0092 - val_loss: 0.0123\n", "Epoch 4755/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4756/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0126\n", "Epoch 4757/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0125\n", "Epoch 4758/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0104 - val_loss: 0.0126\n", "Epoch 4759/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0095 - val_loss: 0.0130\n", "Epoch 4760/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0085 - val_loss: 0.0132\n", "Epoch 4761/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0091 - val_loss: 0.0126\n", "Epoch 4762/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 4763/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4764/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0122\n", "Epoch 4765/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4766/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4767/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0115\n", "Epoch 4768/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0116\n", "Epoch 4769/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0125\n", "Epoch 4770/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0128\n", "Epoch 4771/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4772/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0116\n", "Epoch 4773/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 
0.0125\n", "Epoch 4774/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 4775/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4776/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4777/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0115\n", "Epoch 4778/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4779/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 4780/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0092 - val_loss: 0.0121\n", "Epoch 4781/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0093 - val_loss: 0.0117\n", "Epoch 4782/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0092 - val_loss: 0.0132\n", "Epoch 4783/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0125\n", "Epoch 4784/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0130\n", "Epoch 4785/5000\n", "150/150 [==============================] - 0s 147us/step - loss: 0.0090 - val_loss: 0.0120\n", "Epoch 4786/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0097 - val_loss: 0.0133\n", "Epoch 4787/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0093 - val_loss: 0.0127\n", "Epoch 4788/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 4789/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0126\n", "Epoch 4790/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4791/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0118\n", "Epoch 4792/5000\n", "150/150 [==============================] - 0s 133us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4793/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 4794/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4795/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4796/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4797/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4798/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4799/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0088 - val_loss: 0.0121\n", "Epoch 4800/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0080 - val_loss: 0.0124\n", "Epoch 4801/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0116\n", "Epoch 4802/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4803/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4804/5000\n", "150/150 
[==============================] - 0s 100us/step - loss: 0.0088 - val_loss: 0.0125\n", "Epoch 4805/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4806/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4807/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0127\n", "Epoch 4808/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0120\n", "Epoch 4809/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0090 - val_loss: 0.0122\n", "Epoch 4810/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0119\n", "Epoch 4811/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0125\n", "Epoch 4812/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0117\n", "Epoch 4813/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0079 - val_loss: 0.0131\n", "Epoch 4814/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0120\n", "Epoch 4815/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0122\n", "Epoch 4816/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4817/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4818/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0082 - val_loss: 0.0127\n", "Epoch 4819/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4820/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0117\n", "Epoch 4821/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 4822/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0088 - val_loss: 0.0118\n", "Epoch 4823/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4824/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4825/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0122\n", "Epoch 4826/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0123\n", "Epoch 4827/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4828/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4829/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0121\n", "Epoch 4830/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4831/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0128\n", "Epoch 4832/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4833/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4834/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0085 - 
val_loss: 0.0125\n", "Epoch 4835/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4836/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4837/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4838/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4839/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0126\n", "Epoch 4840/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0080 - val_loss: 0.0123\n", "Epoch 4841/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 4842/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 4843/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4844/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0135\n", "Epoch 4845/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0090 - val_loss: 0.0118\n", "Epoch 4846/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 4847/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4848/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0126\n", "Epoch 4849/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0118\n", "Epoch 4850/5000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0122\n", "Epoch 4851/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0126\n", "Epoch 4852/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4853/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4854/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4855/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0119\n", "Epoch 4856/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0092 - val_loss: 0.0119\n", "Epoch 4857/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0089 - val_loss: 0.0117\n", "Epoch 4858/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0079 - val_loss: 0.0129\n", "Epoch 4859/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0121\n", "Epoch 4860/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0085 - val_loss: 0.0124\n", "Epoch 4861/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4862/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0120\n", "Epoch 4863/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 4864/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0084 - 
val_loss: 0.0121\n", "Epoch 4865/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4866/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0122\n", "Epoch 4867/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4868/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.008 - 0s 107us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4869/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4870/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0121\n", "Epoch 4871/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0094 - val_loss: 0.0118\n", "Epoch 4872/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4873/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4874/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4875/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 4876/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4877/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4878/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4879/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4880/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0118\n", "Epoch 4881/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 4882/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0084 - val_loss: 0.0118\n", "Epoch 4883/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4884/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4885/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4886/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0119\n", "Epoch 4887/5000\n", "150/150 [==============================] - 0s 107us/step - loss: 0.0083 - val_loss: 0.0118\n", "Epoch 4888/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4889/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4890/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0080 - val_loss: 0.0120\n", "Epoch 4891/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4892/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0094 - val_loss: 0.0121\n", "Epoch 4893/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0091 - val_loss: 0.0120\n", "Epoch 4894/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0095 - val_loss: 0.0122\n", "Epoch 4895/5000\n", 
"150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0130\n", "Epoch 4896/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4897/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0095 - val_loss: 0.0124\n", "Epoch 4898/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4899/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.011 - 0s 113us/step - loss: 0.0088 - val_loss: 0.0131\n", "Epoch 4900/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0124\n", "Epoch 4901/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4902/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 4903/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0119\n", "Epoch 4904/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0127\n", "Epoch 4905/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4906/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 4907/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0126\n", "Epoch 4908/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0087 - val_loss: 0.0120\n", "Epoch 4909/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0129\n", "Epoch 4910/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0093 - val_loss: 0.0124\n", "Epoch 4911/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0099 - val_loss: 0.0120\n", "Epoch 4912/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0092 - val_loss: 0.0126\n", "Epoch 4913/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0093 - val_loss: 0.0118\n", "Epoch 4914/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0094 - val_loss: 0.0128\n", "Epoch 4915/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4916/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4917/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0117\n", "Epoch 4918/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 4919/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 4920/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0080 - val_loss: 0.0122\n", "Epoch 4921/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 4922/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0084 - val_loss: 0.0122\n", "Epoch 4923/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0080 - val_loss: 0.0118\n", "Epoch 4924/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0079 - val_loss: 0.0126\n", "Epoch 4925/5000\n", "150/150 [==============================] - 
0s 87us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4926/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0088 - val_loss: 0.0126\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 4927/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0084 - val_loss: 0.0121\n", "Epoch 4928/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4929/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0086 - val_loss: 0.0125\n", "Epoch 4930/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4931/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0123\n", "Epoch 4932/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0118\n", "Epoch 4933/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 4934/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0091 - val_loss: 0.0125\n", "Epoch 4935/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0090 - val_loss: 0.0126\n", "Epoch 4936/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0089 - val_loss: 0.0124\n", "Epoch 4937/5000\n", "150/150 [==============================] - 0s 127us/step - loss: 0.0088 - val_loss: 0.0122\n", "Epoch 4938/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0120\n", "Epoch 4939/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0089 - val_loss: 0.0118\n", "Epoch 4940/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 4941/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4942/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 4943/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4944/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0117\n", "Epoch 4945/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0118\n", "Epoch 4946/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0082 - val_loss: 0.0123\n", "Epoch 4947/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4948/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4949/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0087 - val_loss: 0.0119\n", "Epoch 4950/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0123\n", "Epoch 4951/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0127\n", "Epoch 4952/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4953/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0118\n", "Epoch 4954/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0095 - val_loss: 0.0121\n", "Epoch 4955/5000\n", "150/150 [==============================] - 0s 
80us/step - loss: 0.0093 - val_loss: 0.0122\n", "Epoch 4956/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0088 - val_loss: 0.0123\n", "Epoch 4957/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0119\n", "Epoch 4958/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0129\n", "Epoch 4959/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0084 - val_loss: 0.0124\n", "Epoch 4960/5000\n", "150/150 [==============================] - 0s 113us/step - loss: 0.0092 - val_loss: 0.0133\n", "Epoch 4961/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0103 - val_loss: 0.0118\n", "Epoch 4962/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0089 - val_loss: 0.0125\n", "Epoch 4963/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0123\n", "Epoch 4964/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0082 - val_loss: 0.0117\n", "Epoch 4965/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0086 - val_loss: 0.0116\n", "Epoch 4966/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0128\n", "Epoch 4967/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0087 - val_loss: 0.0121\n", "Epoch 4968/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0101 - val_loss: 0.0120\n", "Epoch 4969/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0114 - val_loss: 0.0125\n", "Epoch 4970/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0105 - val_loss: 0.0123\n", "Epoch 4971/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0106 - val_loss: 0.0135\n", "Epoch 4972/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0094 - val_loss: 0.0124\n", "Epoch 4973/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0092 - val_loss: 0.0134\n", "Epoch 4974/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0122\n", "Epoch 4975/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0086 - val_loss: 0.0127\n", "Epoch 4976/5000\n", "150/150 [==============================] - 0s 60us/step - loss: 0.0086 - val_loss: 0.0117\n", "Epoch 4977/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0086 - val_loss: 0.0121\n", "Epoch 4978/5000\n", "150/150 [==============================] - 0s 80us/step - loss: 0.0085 - val_loss: 0.0121\n", "Epoch 4979/5000\n", "150/150 [==============================] - 0s 93us/step - loss: 0.0083 - val_loss: 0.0119\n", "Epoch 4980/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4981/5000\n", "150/150 [==============================] - 0s 167us/step - loss: 0.0082 - val_loss: 0.0120\n", "Epoch 4982/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4983/5000\n", "150/150 [==============================] - 0s 140us/step - loss: 0.0082 - val_loss: 0.0116\n", "Epoch 4984/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0083 - val_loss: 0.0121\n", "Epoch 4985/5000\n", "150/150 [==============================] - ETA: 0s - loss: 0.005 - 0s 100us/step - loss: 0.0085 - val_loss: 
0.0120\n", "Epoch 4986/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0085 - val_loss: 0.0122\n", "Epoch 4987/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0081 - val_loss: 0.0129\n", "Epoch 4988/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0124\n", "Epoch 4989/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0085 - val_loss: 0.0117\n", "Epoch 4990/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0084 - val_loss: 0.0123\n", "Epoch 4991/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0126\n", "Epoch 4992/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0087 - val_loss: 0.0122\n", "Epoch 4993/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0082 - val_loss: 0.0117\n", "Epoch 4994/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0083 - val_loss: 0.0120\n", "Epoch 4995/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0081 - val_loss: 0.0120\n", "Epoch 4996/5000\n", "150/150 [==============================] - 0s 100us/step - loss: 0.0081 - val_loss: 0.0121\n", "Epoch 4997/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0087 - val_loss: 0.0117\n", "Epoch 4998/5000\n", "150/150 [==============================] - 0s 73us/step - loss: 0.0082 - val_loss: 0.0116\n", "Epoch 4999/5000\n", "150/150 [==============================] - 0s 87us/step - loss: 0.0081 - val_loss: 0.0119\n", "Epoch 5000/5000\n", "150/150 [==============================] - 0s 67us/step - loss: 0.0082 - val_loss: 0.0123\n" ] } ], "source": [ "h = model.fit(X_train, Y_train, epochs=5000, verbose=1, validation_data=(X_test, Y_test), batch_size=32)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**d)** Plot the training and validation error curves. Did the training converge? If not then go back and increase the number of training epochs. Also try different batch sizes." 
] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[[-5.8469856e-01 5.6556869e-01 7.2507758e+00 -1.4728354e+00\n", " -8.0758873e-05 1.3176533e+00 1.6164204e+00 -3.1695042e+00\n", " 1.6821798e+00 -1.8079444e+00]]\n", "[ 6.4661992e-01 3.1282279e-01 -1.8029467e+00 -2.0476460e+00\n", " 1.0672025e-03 -1.4143237e+00 -1.8302863e+00 -9.1023803e-01\n", " -2.0157876e+00 2.3770895e+00]\n" ] }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYAAAAD8CAYAAAB+UHOxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xl8VNXdx/HPyWQjAULYNyEgsssaEcWiiAuIoLXUolbrUrF2sdb2eYTqI7hVWq11RYvWrQJKcUMFFwSKKCA7ssoOIeyQEMg6M+f54072yWTIwgyT7/v1yit3ztx755wh3N896zXWWkREpO6JCnUGREQkNBQARETqKAUAEZE6SgFARKSOUgAQEamjFABEROooBQARkTpKAUBEpI5SABARqaOiQ52BQJo2bWpTUlJCnQ0RkTPGihUrDltrmwWzb1gHgJSUFJYvXx7qbIiInDGMMbuC3VdNQCIidZQCgIhIHaUAICJSR4V1H4CIRI6CggLS0tLIzc0NdVYiQnx8PG3btiUmJqbK51AAEJHTIi0tjQYNGpCSkoIxJtTZOaNZazly5AhpaWl06NChyudRE5CInBa5ubk0adJEF/8aYIyhSZMm1a5NKQCIyGmji3/NqYnvMiIDwBvf7OCTtemhzoaISFiLyADw7yW7mLNuf6izISJhJCMjg8mTJ5/ycVdddRUZGRkB93nooYeYO3duVbMWMhEZAADQs+5FpISKAoDH4wl43OzZs2nUqFHAfR555BEuu+yyauUvFCIyAKidUUTKGjduHNu2baNPnz6cd955DBkyhBtvvJFzzz0XgGuvvZb+/fvTo0cPpkyZUnRcSkoKhw8fZufOnXTr1o0777yTHj16cMUVV5CTkwPArbfeysyZM4v2nzBhAv369ePcc89l06ZNABw6dIjLL7+cfv36cdddd9G+fXsOHz58mr+F0iJ2GKhVFUAkbD388Xo2pB+v0XN2b92QCSN7VPj+pEmTWLduHatXr2bBggWMGDGCdevWFQ2jfO2112jcuDE5OTmcd955/OQnP6FJkyalzrFlyxamT5/OK6+8wvXXX897773Hz3/+83Kf1bRpU1auXMnkyZN56qmnePXVV3n44Ye59NJLGT9+PJ999lmpIBMqYVkDMMaMNMZMyczMrNrxNZwfEYk8AwYMKDWG/rnnnqN3794MHDiQPXv2sGXLlnLHdOjQgT59+gDQv39/du7c6ffc1113Xbl9Fi1axJgxYwAYNmwYycnJNViaqgnLGoC19mPg49TU1Durfo4azJCI1KhAd+qnS2JiYtH2ggULmDt3LosXLyYhIYFLLrnE7xj7uLi4om2Xy1XUBFTRfi6XC7fbDTiTt8JNWNYAqktdACJSVoMGDcjKyvL7XmZmJsnJySQkJLBp0yaWLFlS459/0UUXMWPGDAC++OILjh07VuOfcarCsgYgIlLTmjRpwqBBg+jZsyf16tWjRYsWRe8NGzaMl19+mV69etGlSxcGDhxY458/YcIEbrjhBt59910uvvhiWrVqRYMGDWr8c06FCcdqSaHU1FRblQfCXPGP/9KxaX1evrl/LeRKRKpi48aNdOvWLdTZCJm8vDxcLhfR0dEsXryYu+++m9WrV1frnP6+U2PMCmttajDHR2QNwKgbWETCzO7du7n++uvxer3ExsbyyiuvhDpLkRkAfpL3AVFZLQHVAEQkPJxzzjmsWrUq1NkoJSIDwPD8z9mffU6osyEiEtYichQQoKUgREQqEZEBwGIwigAiIgFFbAAQEZHAIjIAOJd/1QBEpOrq168PQHp6OqNHj/a7zyWXXEJlQ9WfeeYZsrOzi14Hs7z06RKRAcBidP0XkRrRunXropU+q6JsAAhmeenTJSIDgEMRQESK3X///aWeBzBx4kQefvhhhg4dWrR080cffVTuuJ07d9KzZ08AcnJyGDNmDL169eJnP/tZqbWA7r77blJTU+nRowcTJkwAnAXm0tPTGTJkCEOGDAGKl5cGePrpp+nZsyc9e/bkmWeeKfq8ipadrmkROQzUgjqBRcLZnHGw//uaPWfLc2H4pArfHjNmDPfeey+//vWvAZgxYwafffYZf/jDH2jYsCGHDx9m4MCBjBo1qsJnirz00kskJCSwdu1a1q5dS79+/Yree/zxx2ncuDEej4ehQ4eydu1a7rnnHp5++mnmz59P06ZNS51rxYoVvP766yxduhRrLeeffz4XX3wxycnJQS87XV2RWQPQanAiUkbfvn05ePAg6enprFmzhuTkZFq1asWf//xnevXqxWWXXcbevXs5cOBAhedYuHBh0YW4V69e9OrVq+i9GTNm0K9fP/r27cv69evZsGFDwPwsWrSIH//4xyQmJlK/fn2uu+46vv76ayD4ZaerKyJrACIS5gLcqdem0aNHM3PmTPbv38+YMWOYOnUqhw4dYsWKFcTExJCSkuJ3GeiS/NUOduzYwVNPPcWyZctITk7m1ltvrfQ8gdZhC3bZ6eqKyBqAhoGKiD9jxozhnXfeYebMmYwePZrMzEyaN29OTEwM8+fPZ9euXQGPHzx4MFOnTgVg3bp1rF27FoDjx4+TmJhIUlISBw4cYM6cOUXHVLQM9eDBg/nwww/Jzs7m5MmTfPDBB/zoRz+qwdJWLmJrAOoDEJGyevToQVZWFm3atKFVq1bcdNNNjBw5ktTUVPr06UPXrl0DHn/33Xdz22230atXL/r06cOAAQMA6N27N3379qVHjx507NiRQYMGFR0zduxYhg8fTqtWrZg/f35Rer9+/bj11luLzvHLX/6Svn371lpzjz8RuRz0jkf7cCy2Ff3un1P5ziJyWtT15aBrwxmzHLQxJhGYDOQDC6y1U2vrs8I3pImIhI9q9QEYY14zxhw0xqwrkz7MGLPZGLPVGDPOl3wdMNNaeycwqjqfG1TeFAZERAKqbifwG8CwkgnGGBfwIjAc6A7cYIzpDrQF9vh281TzcyuhTmCRcBTOTc5nmpr4LqsVAKy1C4GjZZIHAFuttduttfnAO8A1QBpOEKj25waZu9r
/CBEJWnx8PEeOHFEQqAHWWo4cOUJ8fHy1zlMbfQBtKL7TB+fCfz7wHPCCMWYE8HFFBxtjxgJjAdq1a1elDFg9FFIk7LRt25a0tDQOHToU6qxEhPj4eNq2bVv5jgHURgDwd+211tqTwG2VHWytnQJMAWcUUFUyYAHdZIiEl5iYGDp06BDqbEgJtdEUkwacVeJ1WyC9Fj6nYkYPhBERqUxtBIBlwDnGmA7GmFhgDDCrFj5HRESqobrDQKcDi4Euxpg0Y8wd1lo38Fvgc2AjMMNau/4UzzvSGDMlMzOzGrlTDUBEJJBq9QFYa2+oIH02MLsa5/0Y+Dg1NfXOKh2vZwKLiFQqIheD0zwAEZHKRWgAADUBiYgEFpYBoLp9ALr0i4hULiwDgLX2Y2vt2KSkpCqfwygKiIgEFJYBoNqMQfUAEZHAIjIA6IlgIiKVi8gAAFoOWkSkMmEZAKo/EUw1ABGRyoRlAKiJTmD1AYiIBBaWAaC6NBNYRKRyERkAQMtBi4hUJmIDgHoBREQCC8sAUP2ZwLr8i4hUJiwDgDqBRURqX1gGgOqyeiKYiEilIjIAqAdARKRyERoAQE1AIiKBRXAAEBGRQMIyANTEM4G1HLSISGBhGQCqOwrIGQaqCCAiEkhYBoDqUyewiEhlIjQAaDloEZHKRGQAsEY1ABGRykRkAHCoBiAiEkjEBgDVAUREAovIAKDF4EREKheWAaAm5gGoCUhEJLCwDADVXw3UYPREGBGRgMIyAIiISO2LyADgDANVDUBEJJCIDAAaAyQiUrkIDQCaCSwiUpmIDAAaBioiUrmIDAAiIlK5iA0AagISEQksIgOAmoBERCoXlgGgujOBjVENQESkMmEZAGrkiWCaCSwiElBYBoDqUxOQiEhlIjQAKASIiFQmIgOAGn9ERCoXkQHAuf9XGBARCSRCA4BGAYmIVCZCA4B6AEREKhOhAQDUBCQiElhEBgDneQAiIhJIRAYAUCOQiEhlIjQA6JnAIiKVicgAYDUGSESkUhEZAEDDQEVEKhMd6gz4Y4wZCYzs1KlTlY7vmruKRO+Jms2UiEiECcsaQHVXA9XFX0SkcmEZAEREpPYpAIiI1FEKACIidZQCgIhIHaUAICJSR0VkAHjZfTW5NibU2RARCWsRGQC8RGkimIhIJSI0ABhceEOdDRGRsBaRAcBDFFGqAYiIBBSRAcBiiDIW61UtQESkIhEZADzWKZZXAUBEpEIRGQC8FAYAd4hzIiISviI6AKgJSESkYhEaAJwHQno9qgGIiFQkIgNAl1aNANUAREQCicgAcEHBYmcjc1doMyIiEsYiMgC0ylwNgElbHuKciIiEr4gMAIWs9YQ6CyIiYSsiA8Cirg8C4Emu2jOFRUTqgtMWAIwxHY0x/zLGzKztzzpev6Oz4VUNQESkIkEFAGPMa8aYg8aYdWXShxljNhtjthpjxgU6h7V2u7X2jupkNlg2ylkK2noKTsfHiYickaKD3O8N4AXgrcIEY4wLeBG4HEgDlhljZgEu4Ikyx99urT1Y7dwGyUY5xVIAEBGpWFABwFq70BiTUiZ5ALDVWrsdwBjzDnCNtfYJ4OqazOSpsi7VAEREKlOdPoA2wJ4Sr9N8aX4ZY5oYY14G+hpjxgfYb6wxZrkxZvmhQ4eqljNfDQBPftWOFxGpA4JtAvLH+EmrcBF+a+0R4FeVndRaOwWYApCamlqlRf2tKawBaCkIEZGKVKcGkAacVeJ1WyC9etmpIb4mILxqAhIRqUh1AsAy4BxjTAdjTCwwBphVE5kyxow0xkzJzMys2gk0CkhEpFLBDgOdDiwGuhhj0owxd1hr3cBvgc+BjcAMa+36msiUtfZja+3YpKSkKh3vdRX2ASgAiIhUJNhRQDdUkD4bmF2jOaoJvhqAAoCISMUicikIUzgKSH0AIiIVisgA8J9VBwDYeyQrxDkREQlfYRkAqtsJPLhrSwASXHogjIhIRcIyAFS3E7hPu8YUWBcxRgFARKQiYRkAqis2Ogo3LryaCSwiUqGIDABx0VEU4MIWKACIiFQkLANAdfsAYqOjyCMWPLk1nDMRkcgRlgGgun0AcdFRnLTxmPzsGs6ZiEjkCMsAUF2x0VFkE0+U+2SosyIiErYiMgDEuVycJA5XgWoAIiIVicwAEBNFto3H5VYAEBGpSEQGgFhXFCeJJ9qjJiARkYqEZQCo7iigqChDrokn2p1TwzkTEYkcYRkAqjsKCCDX1CPGoyYgEZGKhGUAqAl5UfWI9aoGICJSkYgNAPlR9Yi2BeDWbGAREX8iOgAAUKCOYBERfyI2ABS4EpyNfAUAERF/wjIAVPuh8ECBy1cDUAAQEfErLANATYwCckcX1gBO1FCuREQiS1gGgJrgKQwAOcdCmxERkTAVsQEg2uVyNjZ+EtqMiIiEqYgNAHsSejgbDVuHNiMiImEqYgNAVGw8maYBZO0LdVZERMJSxAaAuOgoDtIEjisAiIj4E7EBIDkhlmPeBNj5daizIiISlsIyANTEPICm9ePowTZnGKg7rwZzJyISGcIyANTEPICm9WPZbVs4L+bcX/pNr8f5ERGpw8IyANSEpg3i+NAzyHmx4vXiNw5vhUcaw5OdQpMxEZEwEbEBoFn9OF73DCtOSFsOH9wNL/R3XuccDU3GRETCRHSoM1BbmtaPI5+Y4oRXh/rfccfXEN8QWvUunV6QC/++FkwU3Da79jIqIhIiERsAmtSPBWBvTAptCnb63+nbF+CLB5ztiWU6nN+7A3Yvrr0MioiEWMQ2AcW4nKINynq84p0KL/4Acx92Oos3z4GJSbApiCUkThyCH74IKj+b92eRMu5TFm87EtT+IiK1LWIDADgjgcDwnudHle+86GlY+jJMH1P+vb+0gaVTwFrndX427F4KU38C034KU3/qBI1/XgyeAnispZNmLWQfxfuvK8n44q98GPt/fLtY8xJEJDwYW3hRC0Opqal2+fLlVT4+z+2hy4OfAZad8TfVTKYeOuY0D61/3//7QyfAVw8721c/AzsWltr3uE2gwb1LMMntAXB7vEz/bjdjBrQjxhVFTr4Hr7UkxkVs65yI1CJjzAprbWow+0Z0DSAu2sW0X54PGM7NfbVmTvpIcsUXfyi++APsWVpu34YmG/NsL/jqUThxiBnfbuKfs+bzxqLtAJz3+Fx6TPi8ZvIqIhJAWAaAmpgJXOjCTk1Z89AVZJFASu40+ua+XAM5DNKa6RW/9/VT8FQnbvzqAhbF3UvnLVMAOJHnPk2ZE5G6LiwDQE3MBC4pKSGGnZNGsPmxYdw1fAADcl9kZN5jNXLumpK8+wvyZv6KeLRshYicHnWqoTku2sWvLj6bX118Ntn5br7bexNH9u0gffaT3BE9J6R56xW1A9bt4DpXAtM8FcxZqEzmXkhoDDH1ajZzh35w5kM01expkUgS0Z3AwcrIzicvN5sWcR689RpjXh0CJopjHa7GrHuP5Ix1tZ6HQh5rWG9T6PXIati9BFr1AU++059wzuWBD56YBGcPhZtL9DtM+xlYL9z0n6pnaqKvJlZ2roSIhJ1T6Q
SuUzWAijRKiIUEZ+JYFMDYBQA0BrjsvnL7H8rK46s127gwZgv3rmjKe9c3J6sAGq56hXddI0jI2s6l+18j8fBa3nYP5efRXwWdF5ex9DI7+Pzrb7nyq+HsbzkErzW0PjAP7v0eGrULfIJtZT7rh8+c3+488LohNjHovIhIZFMAqIJmDeIYc1F3oDvvn++kNQRo9Td+BsDFwG0A3Oi1bDqYRdeWDYtPcHQHBfvWsW7uv+l7zP+Inyu/Gg5Ay/3zixPzs6ue6edTIXN36bv4iUnQYTD84uOqn1dEzlgKALUsKsqUvvgDNO5ATOMO9O0xErfHi/38AWK+m1y9D5oXYMYzOBf/Qoe3wq5FzvaOhdX7XBE5Y4XlKKC6JNoVRcxVT8A9qyvd11pv6YTvZ0LGHmd74d+K0xdMCnxhf2UIfPz7KuRWRCKJAkC4aNwBfr824C4b50+DrXOdppv0Vc6M5Gd6YtNWlN5xwRPw5siKT5R3vHzarm+doOHOd157vc7nLP3nKRYEmHIJfP4AbJh16seKyGmjUUDhxut1ZhtXwpPUDlfJZp1gTcwsHtVTaPxeeKKNs33OldB1BETHwQd3ld7vwt9Bo/bQspezUupF95Y//8InYV6JORYTM52ltbOPQFKbU8+viJwSjQI6k0VFwagXYNZvA+5WpYs/+H8+8tTRxdtbPnd+/Pn2+dKv/QWAeX4m2M28DTbP1jBSkTCjJqBw1O/m2jv3Y83Lp1X3uQc7FsL8Jyp+f7PvgTphXNsUqYsUAMLVmXS3/OZI+O8k5+lqZR3bVbztLbHOUWaa+ghEQkwBIJzdtxEatA51Lio2MQmyDhS/9td0tLTE4nvz/1K8/eplMKMWazoiUqmwDAA1uRroGa1ha/jjxqCGiIbM3zsXb5ftIwDIySje/m5K8XbWPuf3kW1O85Gah0ROO40COpNk7nXW9Vn/PrbDxfx86ma+OVK/xA4WMCSSw/r4O0KVy8AKm7YKRyLVbwkn9pd+L5DMvZDQxFmcLucYNGhRO/kUOUOdyiggBYAIcPRkPv0e/dLvexdGreP+6HeY4r6ai6K+54bo+X73CwvXTIbWfaFFdziwHr56BAbd6/yOcsGtnziBo/NwZ5jqhg9hQgYYE+qci4QNBYA6LjO7gN6P+H9YfTLHeTTmDa52LTm9mToVt82B14eXT//xP0vMTTCAhfFpENfgdOZOJKwpAAgAaceyueivge/4Yymgg9nH2OhPGBW1mBjjOU25qyEdLoZfaDSRSCEFAClyMs/NzBVpTJi1Puhj4sgnj1hiKeA30R+y3NuFf8dOqsVcVlPZvoNjOyGxOcQmhCQ7IqGkACAB7c3IYfbafTw+e2OVjm9CJrdEf0mWrcd/PBezJn4sADk2lnomvyazGpw758ORrVC/ObS/CB5tomWupc5SAJBTlu/2MuSpBTRvGMeq3c7QzVsuaM9bi3dVciQkkkMsBRyjIT3NdvpEbeOgbUSSOcmnnoH8KXoGt0c7D6aZ4zmP4a5ltVeQLlcVzzw+kybThYt9a6BxR/WrnMEUAKRaMrML8FhL48TYojSP13LfjNXcckEKMS7DqBe+qdZn7Iy/0W96jQYIjRA6NZ4CeLQppPzIGXF1up084gTv2lwKpQ7QYnBSLUkJMeXSXFGGZ8f0LXq9c9IIvt12mChjaJ1Ujw37MsnO9+D2WI5m5/PU55txeyu+uUjNfYlYCkinKXHkE4ubLJw2+wYF2Twd8xKXu1ZUeHxQvB5w6U88aAc3OL93+lnS43R473bYvgDaXwhNzg5NHuoY/e+QKrvw7KZF2+2alO5wvWtwR9akZdK7bRKr92TQu20jZq1J56zGCeTke0hNSeayp/8Lx3LII5Y8imsbWSRwZ8EfocB5/Uj062QTx6+iT/GuNO84/K0DjH4Nev6kyuWsM0rO1A6FLN+EQH8r1kqtUBOQhJzb4yUr1832wye47fVlHM91+92vISdoao4zL+5PTHUP5abor4L/kEjuDziwAV67EoY8AAN/VfXzvHg+HNrkbBd+X0+eA826nFqTkNcDh3+A5t1g9TSnX6Zeo+A//+7FzmTAQOfPPgr1mwWfpzrkVJqAwnItIKlbol1RJCfG0r99Y9ZOvJKdk0aw8H+G8OerujLvjxcX7Xec+my3rUnJncYD7ju4Ju8RXnX7mTDmz8QkeH+s83v7Aidt1u/g3z8O7vhvX4BNn55awU6Xly5wajuf3X/qx+Znw+4AkwJPHnSahPaugIObgjvns31g8kD4+u/w4d3w0W+CzIyvv+alCypeKXbLl875nurkBIHqOrLNmWkexjfCtUkBQMJSuyYJjB18Nh2b1WfnpBEsGT+U+y7vXGqfNbYTj7lvpkPu20wsuKXyk6591/n93SvO75VvwbZ5cOJg5cd+8QC847/julZ8cp8TrI7uCLyfp6Di904eKf362xdgw0eQl+Usxw3w7k1O7SF9NXS8pHhfa0tfFF+5FCafX/78mz9zvkNrne9x3ftQ+LCirx5xfm/6BDxu+OZZp0x5WU661+s8irRwO/tw8bln3OyklS3r1NGwZrrzevafnAUIJyY5q8sC7FkGf+/mPDq1pLeugRcGlE6bmATP93MCVWaak//DW8t+i+VlH3XK5g1i0uSJg07wLPR8f1j+mv99l/7TydO2eZWft4aoCUjOKHluD7n5XrLyCvzOcr42ahHPxE6u/ESDfu9ckABu/xzaDQQgO9/NiVw3zRvGl96/cPG6qjQlbV8AGbuh3y3OXfSeJXB4CwydAPtWAwbOOq9EIbPgibbFrx866typJjSGxKaw5l1w50DfW8o/PvQ330HTzvC3jpBzFMZMd5pTEpvBX3xLizft7DTR3DgDpl1fPr+t+8LxdDhxoPx7VzzuXPw81Win7zwMLrzHCcBr34GBv4YlFfyb3fopvDECfjnPGdH1ypDgP+fedTD7f2Dks8Wr1g6dALkZzr9DyeXL71kNz/VxtidmwmMtnSfe9b0Z/vkj+Mm/4OwhTiD7YCysew+ufRm6XV08ZHb/OvjiQRj1vPP9vn1d8fn7/QJa9IA5/+u8fuiY0+mdvhp+s9RZ26rko1qr0WSpYaBSJ2xIP067Jgls3n+cF+ZtZf7mQ0Xv3e6aw0Mx/w7uRLfNce4uj+9lxMKzWJ9+nJ2TRjjvpa+G5BT4a3vn9a+XQNpy/0MVD2yAtO+g/62l08s+g7lQ8x5w0DdD+7bP4PVhgfMZ3wjunOfctYJzIf3hs2BKWDclpzizwmvK2ZfW3N35JeNhQYmn6DXuCEe3F7+uxsAFBQCps+58azlfbnDuXIdHLWWZtyvvxU6gfVSAZp7Cu0yfDd72dH9krfNiYpJzR5y+ynkd2wDys5y78ihX6fMUXuh/8Ykznn3JZOfO8b0wXZpbwtuDB52awSnSPACps165JZUpC7eRnpHLG77m5Yvzn6lw4hngPGOhhO5RvtnPhW28hRd/cC7+4LR19/qps73zG3jjquJ93ry6eFsXf6mqqNq/PKsGIBHL7fHy5Beb+ed/t2PwcpfrE8bFvBPUsXZCBsadC
4+3rOVcilSgiv0AGgYqgjO8dNywrjx2bU/euH0gL3tGcUf+HzliK1/nZvtfB0HGntOQy1rUsG35tGbdSr38h+d6Vno7BXe+RD/j7l2x5dPOHgr1ksunY5x+i5K6XAU3fwjXvMjv8n/LnwvucDrL/7Aebvf/TIuAhk1ymuD6/Nz/+1c9BQ3b+H+vfku44jFnccFrJkPbEh3zvX4W3Of3GhPcfkMfgu7XwFkDIbkDNGyDN7pe8fvtLgzuPNWkGoDUGW6Pl04PzAGgBUdZGv/bEOcI3nRfTgYN+P2g5uDOhTb9mZ59HlM+/YYTNp6BCXt5vm86XPqQc1GNioL8k3BoM2Tsgm6jWJl2nLzcXC44PBMG3AXRfi7K1kL2EWcUUQkdx3+K18KKBy8j7ehJHnrp36yxnYo6wVPGOXMfPvrNIFKaJJJkj0NiE/+F2ToX2l0AsYl+377v3dV8vfUwy/481O8aTYWfVdQBXyj7KETHQ2xC0T5bHx9OtKsa969eDxTkOCOlGrR28lO2TydIuXl5RB3dQmyrnr5ze515GfUaOZ9jooJak+rmfy3l6y2Heev2AQzuXPVJbmHZB2CMuRYYATQHXrTWViG8i1RdtCuKLY8P54p/LGTHYUjJncrO+Jtq7wMbtIasdGd7xN+h4xBo1I5862LPsWyaN4hjwkTnv8Hvhxdf9LxLd7HDtgLg4+xknh9ZZoJXbCK06ef8ANdNdjo7dk76XcV5MabcxR+gcLkmt9dyzeTFgP/awDUvfkOXFg34/A+DK/6MTpdV/B7w/qq9xXk5FQmNyyUVeCzRfq7XS7Yf4duth7nvii6Bzxnlgrj6zk+Q8t1eHv90A/cMPYcm9Ys7Z7tOmEubRvX4Zhzsy8zhgifm8cZt53FJF04pqBTeixd4vIF3rEFBBQBjzGvA1cBBa23PEunDgGcBF/CqtbbCp4ZYaz85AS2VAAAOQUlEQVQEPjTGJANPAQoActrFuKKY/6dLOHIijyfmbCJlxTQSyGVD/O1VO+E1Lzp3p8DRTV9j+t1M8tmpvLtsN0n1YhjWs1W5Q/5v5lreXb6Hr0rMci7JULUVTA9l5dGswamPGgF4ft6WcmllWwc2H8gq9fq9FWm0b5JAakr5C3RtK/B6qUf5i+uYKc6s5koDQAAn89zERkcRU6aG8eWGA7y5eBeZOQU8U2JhRHCesQEULaX+9pJdrNh1jF9dfDaJccHdZy/a6kyE25B+nKHdWlQ5/6ci2DrUG0CpxjtjjAt4ERgOdAduMMZ0N8aca4z5pMxP8xKHPug7TiRkmtSP48nRvQDIJp6U3Glcnvc3huc9wXR3+clGTxZcz0LPudySfz9D856kc+6bpORO41PXUFKm1iNlaj36rbiCvq8c4A/vrub+977nV2+vBGD6d7tJGfcpeW4Px3MLeHe507ewZk9Gqc/444w1DP5b+cltG/cdL/Xa7fGS73buEh/9ZENR+nmPz2VfZk5Q5c9ze4qaUwDeXrK73D4VreZ67GQ+90xfxR//s4bRLy8u977Ha/nBFyyO5xbw9Jc/4D6Fu9rsfP9rQZXKm8fJ2/zNB8nJr9nHmPaY8Dm/fHM5mTkFpYKgx7ftCdBqXrj73I0HeX7eVp79qnxgrUyb5HqV71RDggpN1tqFxpiUMskDgK3W2u0Axph3gGustU/g1BZKMcYYYBIwx1q7sqLPMsaMBcYCtGvXLpjsiVSJMYadk0awYPNBbn19GVus02k63n0n4913+h6NGUPhGjUveq4td47fTCv/p/xBYVMHkJPvYfz73wNwPMfN3W8XLwtw34w1pY57b6WzPIOnzBIIWWUWxxv1wjds2OdMVvvXotJLRVzwxDxm/XYQv5u+inOa1+fha3oSZaBVUj0ycwpIiHUR44oqmisRiNvPle6L9ftZuTuDWWvSi9LmbTrApV1bsDcjh60HT7Bi1zGe+2oLqe2TOZiVx+6j2WwpUXtwe7yl2u9fWrCN3m2LJ8tN/24Pd1zUAa/XsmHfcXq2KT+RbtHWw8zdcIBZa9K5PrUtj17bkydmB7dWkcdreWfZbs5KTiClSSL5Hg+dmpceGPDfHw7R++EvuLRrcyZddy7NG8YXBYOP16Qz6OwmjBlQ/vpkKf2dnchzczLPXWEtIGXcp1zevQWv3FLcZH86u2Wr0wfQBig5TCINOL+CfQF+B1wGJBljOllrX/a3k7V2CjAFnE7gauRPJCiXdGnOzkkj+OWby5i7sXjCWMklqquq20PFM3XPe3xuhfvN+X5f0fb/fVT6+c1l7543+GoEs0scU1Lhw3p2Hclm7kZn5urOSSPo/fAXXHVuSybf1J+VuzL8HlvoRJ6bnhM+L5f+5YYD7D+eWyrt9jeWM+OuC7jzLeeuudDyXceKy7duf9H2mrRM+rcvHiX0189KX7h3HzkJwMsLt/G3zzYzqndrHh7Vg7V7i4dF3jO9eG7GziPZvLdiL298u7PUed5fmcasNelMuTmV2Ogodh05yddbDrPj8MlygXPbX67i0U82cOfgjqXS5206yPBnv2bufaWb6x74cB05BR7GnFccBPLd3nIX72lLdzNt6W52PHEVxtf3UeDxsmjrYR7/1Hkk65cbDpBbUFyLiXadvocYVScA+MtlhRdsa+1zwHPV+DyRWvXKLalk5bnpNfH0d0/dPbXCSjF5bi+PfbKBNsn1SrXx/zrAMWUVNhnN/t65EC/4IfACeP4u/uD8B/96y+Fy6df/s3xTUEVKNqv4G4X45uJdvFniUaSz1qTz9ZZDHMv2v/CdwWm3L2nxtiNFNazOD86pNE/PfbWFN77dWS6IABw5mU/fR79kwsjiJao9XsvDH28oVdt7/NMN9K+gPyS3wEu9WBder+XKfyxk++GTpd5//ZvizzWn8Sl2QQ8D9TUBfVLYCWyMuQCYaK290vd6PICvCahGaBiohNL499cy/TunkhtlikfMnOlG92/LzBVpIc1Dw/holj14GXe/vZJ5m4JYjfUMcV2/Nry/cm+59Mu7t2DFrmMcPZlf6Tmevr431/XzM4cjSLWyFpCfABAN/AAMBfYCy4AbrbXrKzpHsIwxI4GRnTp1unPLllPvRBGpSfluL7HRUWzcd5zhz4bocYkRKKleTKkmI3E0SojhkWt6Mqp36yodX+MBwBgzHbgEaAocACZYa/9ljLkKeAZnGOhr1trHq5TjCqgGIOEo7Vg2E2dtoGF8NBNG9WDB5oNMWbid9enHKz9YJEjlJsQFSauBioRIdr6btGM5fLhqL0dP5vPNtsPsOZpDYqyLFknxdGvZkM4tGvCPuT+EOqsS5jY+Mox6sac+OzksZwKL1AUJsdF0btGA/x3WtSjtRJ6b+mWGAd5+UQqzv9/HT/uf5ZsVHM8Hq/ZSLzaKs5ITWLc3k8Gdm7H14Ake+3Qju49mn3Je/nh5Z/7+5ZkVaGJdUeSfxpmw4ex0jAYKyxqA+gBEAtu47zjJCbG0TIonPSOH9Iycohm5y3ceJSvXzZCuzVmzJ4PrXvqWpX8eSqN6MUS7olizJ4NmDeLIyC7g3ndX8cKN/bjiHwuZ
OLI7mTluBnVqwvur9tKucQKp7ZPZfugk/91yiPuv7MrbS3dx88D2rNx9jNnf76Nd4wRW7s7g1VtSeWvxLr81mzt/1IFXvnaGXd42KIUbBrTjyc83l5qLsOnRYcTHFN/trtubydXPL2L88K48MWcTjRNjaZIYS+PEWLq0bMCCzYf4+/W9WbMng8c+3chP+7flP76O7Y5NE5kwqge/nbqSB6/uRtP6cezNyOHLDQeKRjD9z5VdePLzzX6/28Gdm/G7SzvRpWUDpi7ZzV8/28RP+rXlsm7NqRfrYtXuDM7v0Jj/+2gd2w6dZNzwrkyaU3oo6+8u7cR9l3fGGMO0pbv58wffkxjrokOzRNbtLW4qHN6zZdEQ2at7teL3Q8/hgQ/XcVXPltw6qMMp/12AmoBEJES+WL+f1o3q8fHadDo1q0+bRvW4sFP5NYhqw1cbD7Dj8El+3LdNqbV6KnIoK48Dx3NZtfsYN1+QgrWW1Xsy6NvO30qm5Vlr8XhtUIvSpWfkkBgbTVJCDJ+t28fgzs1IiHVqhXszcmjZMB5XVM3c8SsAiIjUUXoegIiIVEoBQESkjgrLAGCMGWmMmZKZWbVHoomISOXCMgBYaz+21o5NSiq/CqCIiNSMsAwAIiJS+xQARETqKAUAEZE6KiwDgDqBRURqX1hPBDPGHAJ2Vbqjf02B8k+uiGwqc91Q18pc18oL1Stze2tts2B2DOsAUB3GmOXBzoaLFCpz3VDXylzXygunr8xh2QQkIiK1TwFARKSOiuQAMCXUGQgBlbluqGtlrmvlhdNU5ojtAxARkcAiuQYgIiIBRFwAMMYMM8ZsNsZsNcaMC3V+qsMY85ox5qAxZl2JtMbGmC+NMVt8v5N96cYY85yv3GuNMf1KHPML3/5bjDG/CEVZgmWMOcsYM98Ys9EYs94Y83tfesSW2xgTb4z5zhizxlfmh33pHYwxS335f9cYE+tLj/O93up7P6XEucb70jcbY64MTYmCY4xxGWNWGWM+8b2O6PICGGN2GmO+N8asNsYs96WF7m/bWhsxP4AL2AZ0BGKBNUD3UOerGuUZDPQD1pVI+xswzrc9Dvirb/sqYA5ggIHAUl96Y2C773eybzs51GULUOZWQD/fdgPgB6B7JJfbl/f6vu0YYKmvLDOAMb70l4G7fdu/Bl72bY8B3vVtd/f9zccBHXz/F1yhLl+Act8HTAM+8b2O6PL68rwTaFomLWR/25FWAxgAbLXWbrfW5gPvANeEOE9VZq1dCBwtk3wN8KZv+03g2hLpb1nHEqCRMaYVcCXwpbX2qLX2GPAlMKz2c1811tp91tqVvu0sYCPQhgguty/vJ3wvY3w/FrgUmOlLL1vmwu9iJjDUGGN86e9Ya/OstTuArTj/J8KOMaYtMAJ41ffaEMHlrUTI/rYjLQC0AfaUeJ3mS4skLay1+8C5WALNfekVlf2M/U58Vf2+OHfEEV1uX3PIauAgzn/obUCGtdbt26Vk/ovK5ns/E2jCmVXmZ4D/Bby+102I7PIWssAXxpgVxpixvrSQ/W1HV+WgMObvqcp1ZZhTRWU/I78TY0x94D3gXmvtceeGz/+uftLOuHJbaz1AH2NMI+ADoJu/3Xy/z+gyG2OuBg5aa1cYYy4pTPaza0SUt4xB1tp0Y0xz4EtjzKYA+9Z6uSOtBpAGnFXidVsgPUR5qS0HfNVAfL8P+tIrKvsZ950YY2JwLv5TrbXv+5IjvtwA1toMYAFOm28jY0zhTVrJ/BeVzfd+Ek5T4ZlS5kHAKGPMTpxm2ktxagSRWt4i1tp03++DOIF+ACH82460ALAMOMc3miAWp8NoVojzVNNmAYW9/r8APiqRfotv5MBAINNXnfwcuMIYk+wbXXCFLy0s+dp2/wVstNY+XeKtiC23MaaZ784fY0w94DKcvo/5wGjfbmXLXPhdjAbmWad3cBYwxjdqpgNwDvDd6SlF8Ky14621ba21KTj/R+dZa28iQstbyBiTaIxpULiN8ze5jlD+bYe6V7ymf3B6zn/AaUN9INT5qWZZpgP7gAKcqH8HTtvnV8AW3+/Gvn0N8KKv3N8DqSXOcztOB9lW4LZQl6uSMl+EU51dC6z2/VwVyeUGegGrfGVeBzzkS++Ic0HbCvwHiPOlx/teb/W937HEuR7wfRebgeGhLlsQZb+E4lFAEV1eX/nW+H7WF16fQvm3rZnAIiJ1VKQ1AYmISJAUAERE6igFABGROkoBQESkjlIAEBGpoxQARETqKAUAEZE6SgFARKSO+n/7KLted4jZtwAAAABJRU5ErkJggg==\n", "text/plain": [ "<Figure size 432x288 with 1 Axes>" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "plt.plot(h.history['loss'], label='training')\n", "plt.plot(h.history['val_loss'], label='validation')\n", "plt.yscale('log')\n", "plt.legend();\n", "\n", "print(hidden1.get_weights()[0])\n", "print(hidden1.get_weights()[1])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**e)** Plot the data points and the neural network output to see if the model was able to well approoximate the underlying function. If it is not the case then go back to **b)** and try to solve it." 
] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "[<matplotlib.lines.Line2D at 0x2b14b8cdb38>]" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAX8AAAD8CAYAAACfF6SlAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xl8VNXd+PHPdxL2XZKwIy644QIS9h03pCrt44baqq19qK27fR7r0tr+Wm1r+7TWvbXWrS6g1IUqWgVB2SUIyKaCCBISQgIS1kBm7vf3x5nJTCYz2Wayzvf9es0rdzlz75mrfO+555x7jqgqxhhjUouvoTNgjDGm/lnwN8aYFGTB3xhjUpAFf2OMSUEW/I0xJgVZ8DfGmBRkwd8YY1KQBX9jjElBFvyNMSYFpTd0BuLJyMjQfv36NXQ2jDGmSVmxYkWRqmZWla7RBv9+/fqRk5PT0NkwxpgmRUS2ViedVfsYY0wKsuBvjDEpyIK/McakIAv+xhiTgiz4G2NMCrLgb4wxKciCvzHGpKBG28/fGGOaPc+DggIQgW7d3N96YiV/Y4xpCJ4H48dDz57uM36821ZPLPgbY0xDKCyExYvdsiosWuS21ZOEg7+I9BGReSKyQUTWicgtMdKMF5FiEVkV/Nyb6HmNMaZJy8qCESPC6+3bQ0ZGvZ0+GXX+fuCnqvqJiHQAVojI+6q6PirdAlW9IAnnM8aYpk8EXnkF+vYFvx8OHICiIsjMdE8AWVl12gaQcMlfVfNV9ZPg8j5gA9Ar0eMaY0yz1707jBwJ6enu71FHwejR0Lt3nbcBJLXOX0T6AYOAZTF2jxCR1SLyjogMiPP9aSKSIyI5hfVY92WMMfUq1MsHYN48yM2FDz6AsWNhyRL3JLB4cZ22ASQt+ItIe+BfwK2qujdq9yfA0ap6BvAI8EasY6jqk6qararZmZlVDkdtjDFNj+fBhAnh0j1At254O4s49PEadtOFPHpwZPAIV/VTR5IS/EWkBS7wv6iqr0XvV9W9qro/uDwbaCEi9deyYYwxjUWwl88uf0fuXziWoYMDtG8PaT270dbbT1d204s8Vv1lfp3W+Sfc4CsiAvwD2KCqf46TpjtQoKoqIkNxN51diZ7bGGOamvxAFvdlvsIz+edxyGsLq2KnKzlStz3xk9HbZxTwPWCNiIR+xt1AXwBV/StwCfBjEfEDh4CpqqpJOLcxxjQZn30G48YJO3d+J+b+1r7DtO7YkjZt6v5N34SDv6ouBCrNqao+Cjya6LmMMaapOngQLr0Udu4Mbxs4EG6/rphJt55ERmAH4kuHz3LdUA91zMb2McaYenDrrbB2rVtu3RpmzoTJk0HoCK+eAIuLXHfPOmzkjWTB3xhj6tj06fD3v4fXH75vL9+a3CHYoCuuu2c9vNgVycb2McaYOrRpE0ybFl6/PHMuP/xZ1/Ivcfl8NqqnMcY0J3fcAfv2ueXj+vl5cvelSKDuX+KqigV/Y4ypIzt2wKxZ4Y6NL3W+gY6jTgsP51BP9fuxWJ2/McbUkRdegEDAVeWMZgFDV/8dtm1zwb8e6/djseBvjDF1QBWefjq8fh3/cBvT0uqlK2dVLPgbY0wdWLYMNmxwy+3TDnKJvg6jxjSKwA9W52+MMXUistR/+TVtaJ/3BXz4YYNW9USykr8xxiTZgQOub3/ID66TRlPiD7GSvzHGJNlrr4W7d554YvnZGhsLC/7GGJNkb0TMWHLttY2mpqccC/7GGJNEpaUwZ054fcqUhstLZSz4G2NMEi1dCnuDcxn27QsnndSw+YnHgr8xxiTRu++GlyeNO4jQOKcuseBvjDFJNH9+ePncF68tP4BbI5Jw8BeRPiIyT0Q2iMg6EbklRhoRkYdFZJOIfCoiZyZ6XmOMaWwOHoTly8PrY715DT6AWzzJ6OfvB36qqp+ISAdghYi8r6rrI9KcD/QPfoYBTwT/GmNMs7FsmWvwBTil7VdkHtnT4AO4xZOMaRzzgfzg8j4R2QD0AiKD/xTg+eC8vUtFpLOI9Ah+1xhjmoWPPvQIVaiM7bgaPvkaundvlH09k1rnLyL9gEHAsqhdvYBtEeu5wW3GGNNsfDS3tGx57M6ZbpKWRhj4IYnBX0TaA/8CblXVvdG7Y3ylQhO4iEwTkRwRySlshHVkxhgTz5EjsGRFy7L1MWfshczMBsxR5ZIS/EWkBS7wv6iqr8VIkgv0iVjvDeRFJ1LVJ1U1W1WzMxvxRTPGmGg5OXDokCvnHttqO73XvAMTJjTKnj6QnN4+AvwD2KCqf46TbBZwdbDXz3Cg2Or7jTHNyUcfhZfHHpkD/oafqrEyyejtMwr4HrBGRFYFt90N9AVQ1b8Cs4HJwCbgIPD9JJzXGGMajXLBv38+bG74qRorI64DTuOTnZ2tOTk5DZ0NY4ypUiAAXboo+/a5ap9NX3gc17GwQaZqFJEVqppdVTp7w9cYYxLheayeU1gW+Hu2LOTYY3Hj9zfSnj5gwd8YY2rP82DCBD46/3dlm8aWzkWKGmc9fyQL/sYYU1uFhbB4MR/qmLJNY4/La7T1/JEs+BtjTG1lZeGNGMWHjCvbNP7N2xp1dU+IBX9jjKktEdY89AHfcBTgCvwnndz4Az9Y8DfGmITM/ygcRsePbxKFfsCCvzHG1I7nQUEB8+eHu8uPH99w2akpC/7GmNQUDN7U5l2nYC8fr1cfPnxrX9lmC/7GGNOYBYM3vXvXbqatYC+fNYGT+cbfEQjW9zfS+XpjseBvjEk9BQVu3J3ajr+TlQUjR/K+77yyTePGNZ36fkjO2D7GGNMoHT4M773n4nvLlnD00fDtizyOuvxyF/hF3Pg7GRnuhlDd4RhEYN48Zo/1wyK3adKkuv0tyWbB3xjTLC1bBlOnwpYt5bf/pCVMK72M+1lJB98hePllmDjR3SFGjoR589wkLFXYu9/HgmXh8fubWvC3ah9jTLPzxhswZkzFwA9w+IiPR/RGBrCO906+xQX6WlQBvfuu+wrAoEHQs2fy8l8fLPgbY5qVzZvhiivCE6l37OBxizzEz/kNZ7KiLN02+jJp3f/x8IxursSfXrMhmF98Mbw8ZUoyf0H9sOBvjGlW7r4bSkrc8vHHw6pPlL+c+U9+w73kkM2LXElXigBQFW65Vbht0HwCX30NM2ZU6xy7Cj3eeSfcRfSqq5L+M+qcBX9jTLPx8cfl4/dzz3gcc91EWLkSOnRA0tK4cthm1q0OMHx4OHj/5SHh0oEbOdj7hKq7fnoer458kNJS1zA8bJhy/PF19IPqk
AV/Y0yzoAr/8z/h9YsvhpH9XX98PA8OHoRVq2DJErqd3o0PPhD+67/C6V/fNZaJgffYueBzWL8+7stfXkEhj24Kd/G8akJ+7V4Ua2DJmsD9aRHZKSJr4+wfLyLFIrIq+Lk3Gec1xpiQWbNgwQK3nJ6u/O6nRdC1KwwZ4urzR42Ck0+GnTtBlTZt4JVX4LbbwsdYxnBG6CK+GHhZ3CeAfy3IYh2nAtCefVz5x0G1e1GsgSWr5P8sUFVHpwWqOjD4+XWSzmuMMajC3XeHS98/znqN/qO7QZcurs/n4MEwZ47r0hnxVm9aGvz5z/DwTRsRXPDezHGMCCzg7YWdKvT82bsX7vl5+D2Am+QxugZ2NuqJ2uNJSvBX1Y+A3ck4ljHG1NSiBR7r17ug3N53gHt3/MSVxPfvd3+XLYMNG+J26bzpwWN53XcxbTgIwG66coE3ixv+Xxb5+S7NwYNw1ZXKxo1uvX175fbhi2vcS6ixqM86/xEislpE3hGRAbESiMg0EckRkZzCJnYXNcY0nJeeLilbvlJfJGNQn4qJMjPjd+lMS2PKoRl8+MQGevYMP0E8/oTQrx8MGwY9eypvvR0u9f/1cSVj4RuQmwvz5zetsR2ov+D/CXC0qp4BPAK8ESuRqj6pqtmqmp2ZmVlPWTPGNGWlpfDq223K1q88bS0sXQrDh4cTjR0L3bu7t3fjBeuWLRkybRCr3y/kggvCN4AjR1wvouLicPqfyR+46txC94JYI5+oPZ56Cf6quldV9weXZwMtRCSjPs5tjGne5r7vUVTkgm+vHgHGfPKQK90vWgR5eZCfHw72lQXr4EifGWf0YlbxeN6cWcqwduX7sPSTrTzuu5Hfjp7d5Kp5otXL2D4i0h0oUFUVkaG4m86u+ji3MaYZ8/t56Zr3gfMBmDpV8BXtdIHZ54MePap/rOAwzfj9yOJFXHRgOBce+ISN9GcrR9OSI4ySpaSvXgEDBjTJ0n6kZHX1fBlYApwoIrkicp2IXC8i1weTXAKsFZHVwMPAVNUm2DHWGNN4eB6HRp3N60WjyzZdMee62o/RHxymmfR01z101SoEOIGNnMMcxvER6aOHN4vAD0kq+avqFVXsfxR4NBnnMsYYAAoLeTsni/10AOCEVls4c/0LEIjozdOtW/WPFxymmcJC1zg8fnz4xYGhQ91ocd27N4vADzakszGmqcrK4s2M62CnW53abR7ydcQY/bWpkw+1CYBrJygocMdroo26lbHgb4xpkgJ+5Z3dw8rWL9r+V7fg87kBfhIN1jVtM2hibGwfY0yT9PG7u9nl7wxAD/IY1OYzt6N9e1dtYyplwd8Y0/R4Hm/fMLtsdXJmDr5DB9zK/v1QVNRAGWs6LPgbY5qewkLe2nZG2eq3fj/aDdwWGsCtiffBrw9W52+MaXJyj2SxGtcw20JKOfuSLnBtsKdOdSdhT3FW8jfGNDmz3wkH93ET0+lwaGez7ZVTVyz4G2OanLffDi9/66tHa/9iVwqz4G+MaVJKStzQ/CHf2vp4zGGaTeUs+BtjmpT5893Y+gAnnKD0H5XVZMfUb0jW4GuMaVLeeiu8fMEFAn+0ht7asOBvjGkyVKODP+WHZDDVZtU+xpgmY9062LrVLXfsCKNHV57exGfB3xjTZLz173BvnkmToEWLBsxME2fB3xjTNHgeb/0+PLPWBaP3uHogUysW/I0xTULR+p0s2TsAAMHj/NtOsr79CUjWTF5Pi8hOEVkbZ7+IyMMisklEPhWRM5NxXmNMM+R5bhz9yFK95zH74n/gkQbACJaQESiwvv0JSFbJ/1lgUiX7zwf6Bz/TgCeSdF5jTEPzPDdJ+o4diVfDBCdRr/DGbmEh0zeGy4wXtfyPW2jXDjIyEjtnikpK8FfVj4DdlSSZAjyvzlKgs4g031kSjEkVnueCdM+e7pNoNUzEJOqRpfoiXxbvc05ZssuPPO8WbPjmWquvOv9ewLaI9dzgNmNMUxYK1uBK/YlWw0ROoh7xxu5rrwt+da8lDWcJ/dhqwzcnqL6Cf6zX7io8H4rINBHJEZGcQqvHM6bxy8iAIUPcciJz54aEJlHPzXXjOIiA5/Hyc0fKklzhewXGjIFt28JpTI3VV/DPBfpErPcG8qITqeqTqpqtqtmZNg2bMY2b58HEibB8OQwbVj5gJyL0xm4w8G8c/l3mL24Z3KVcuvJu+PBD6N7dAn8C6iv4zwKuDvb6GQ4Uq2p+PZ3bGFMXQlU+gQCsWAFpabGDcazeO9VVUMDDOaPKVs+feJgep2da0E+CpIztIyIvA+OBDBHJBX4JtABQ1b8Cs4HJwCbgIPD9ZJzXGNOAsrL4Zsi5PLFsEDNbX81XJ2XRty+ceCKccIL7O3qkxzE/mOBuEiNGwIwZ1S+xex7FF/+AZ/WVsk23FtwN3v+5pwOTkKQEf1W9oor9CtyQjHMZYxqHefOF7259izxPYL/btmcPfPppOI3PJ3xXf8gvdRvHLlgAffq4NoKFC92TQmUKC3l62QD20wGAAazlrPWPQOHPbCC3JLDbpzGmRlThvt94nHWWkpdXeQne84Tn9XucyOfczEPsCbSHpUtdg228LqHBaqL9bTL5Y/qdZZtv8T2KjLIx+5PFgr8xpkYefcTjF/f6UHWBPyNDeeYZ+OorWLIEnn0W7rrLvasV4qcFj3AzJ/EZL3AV+vHy2F1CI17y+tlJb5J/xL3A1bOnctWm/2e9e5JItJEOjJSdna05OTkNnQ1jTIRNm+D005VDh1wAHi/zeXHlAHqeEbt33sKFcM89ykcflQ/Y4zqt5LGFAxlwalQgLyiA3r15xv9dfsAzZZuffhq+by2F1SIiK1Q1u6p0VvI3xlTN8/DyC/jBD8KB/3RW826Hy+h5Sue4vXlGj/SYr+OZ4buCni3DJf0PiwcxcJBwxx1RL+hmZfHeSTfz44gRYC69sIRrr2mchdSmzIK/MaZywaqYx3r/jgULXOBPw8+zXEur/btg3LiKY/GEFBYiSxZzmTedzwIncPuPDpS18/r98Mc/Qq9ecN65yvcuPcRxx8N5a//EYVoDcFq7L3l6dndkQoxjm4RY8DfGVK6wkM2L8rnTu79s0119X2RQ+loYOtS95BU1Fg/ggrVq2XANHUadzp+eaMvKla69N+TIEXjvfeGFmW3YvDlcDZTJTl4/cB7tA8U2emcdsOBvjKmUl5HFde1ncJB2AJx6qvLzz77n3uhdtCjmWDxlDbd9+rgbwNdflzXWnnaae0F3+nTIjlEz3aa1xyXMZCWDOI4vKx7bJIU1+BpjKvW3v8H117vltDRl6VIpH7Q9z5XKs7LCPXGCDbf4/S545+ZCZmbFdMDWLcqHk37HNxuLGHzaEYYu/DMte2bAvn1uot4NG6BHD+vlU03VbfBNyktexpjmqaQEfvWr8Podd0jF0npoLJ5IodE5Fy92fzMy3JNAaH3ePJeusJCjj87i6vV3uhtDRoablX3fPrf/4MH4w0aYhFi1jzEmruefd3O0gBuu/957q/nF6NE5i4rC4/QvWgR5eS7I9+rlbgaq7gZSVOTaEEKGDLHq
njpiwd8YExYxCFsgAH/4Q3jX7bdD69YV08UVOTpnVpYb2wfcQHCnnOLeCAsEyr/xm5XlxuhPS4Phw92LAlbqrxMW/I0xTtQUijNf8fjyS7erSxeYNi12ump1wRRxg7qlB2uaQ9U6IcuDb/yGnhi2b3dPCjaAW52xK2uMcSKmUNRFi/n9bwNlu268ETp0qJiuRl0wu3d3VTxpadCpkwvsHTq4G8KIEe4pQrX8E4OpMxb8jTFOxBSKc065mVVrWwDQpg3cfHPsdDXqghlZqt+1y9X7f/ON6wYq4rqFJjoHsKk26+1jjHFCwbmwkEf+OwvWuM0//KHrhBMrXXS3zZgiu4JG9gwK/fX5Kj5J2JDNdc5K/saYMJ+PLYe68dZb4YB+002x01WraqY67QO1fZIwCbGSvzGmnL/9LdyJ59xzoX//BA4Wq30gulRf0ycJkxRJKfmLyCQR+VxENonInTH2XysihSKyKvj5YTLOa4xJrpISeOqp8PoNic6/V91SvTXy1ruES/4ikgY8BpwD5ALLRWSWqq6PSjpDVW9M9HzGmLrz6qvhIZb79vLzrcFFoAkEZSvVN1rJKPkPBTap6mZVPQJMB6Yk4bjGmHr22GPh5eu3/4K0Pj0T74FjpfpGKRnBvxewLWI9N7gt2sUi8qmIzBSRPkk4rzEmiVYs91i2zC235DA/5ClX+W/DKTdLyQj+sW7n0e98/xvop6qnA3OA52IeSGSaiOSISE6h/c9mTP3xPB678N2y1cuy5pNJkSutWw+cZikZwT8XiCzJ9wbyIhOo6i5VPRxc/TswONaBVPVJVc1W1ezMzNhzghpjkq9g/S5eKphYtn7DPwa7l7Dy8mzS9GYqGV09lwP9ReQYYDswFbgyMoGI9FDV/ODqRcCGJJzXGJMkj83I4HDwIX5oh/UMm3wy+CzgN2cJB39V9YvIjcB/gDTgaVVdJyK/BnJUdRZws4hcBPiB3cC1iZ7XGJMcBw/C40+EA/1P/34yYoG/2UvKS16qOhuYHbXt3ojlu4C7knEuY0xyPf+8G2oH4Oij4b8utsCfCmx4B2NSmOfBgw+G12+9NTzqsmneLPgbk8Leegu++MItd+oE113XsPkx9ceCvzEp7E9/Ci//6EcRY/abZs+CvzEpavly+Ogjt5yeHmf0TtNsWfA3JlVEzbsbWeqfOtWNumxShwV/Y1JB1Lj6W7/ymDkzvPunP224rJmGYcHfmKYgqtRe4++uX19uXP2Hfn+QQHCK3okTYeDA5GbXNH4W/I1p7KozG1ZV3x04ENq1g/R0ioeczVP/iHip6zabMzcVWfA3pi4lUmIPiTUbVk2/GwjA/v2wciWPjn2FfYF2AJzMeiYNtkEUU5EFf2PqSiIl9kiJzHEb+d1Ro9iZOYAHHm9ftvt/Tvg3vu42Ymcqsnf5jKkr0SX2ggI3sUlNZ7RKZDasqO/+6gZh3z636+T+fq5ee4eN2JmirORvTF2JLHWPGBHuTxn9FOD3w9q1sZ8MQtVGIrWfDSs4k9aq1cKTT4arn/74YDrpLSzwpyoL/sbUlVCpOzcXXnkldr19SQl06QKnnQZHHeX2hyRabRTR3hAIwH//txIIuGB/VuccJk+yht5UZsHfmGSI17Abmr+2W7eK9fZ+P2RkuIZYgOJiWLgwfIxEGnqjbhx/edAjJ8cF/laU8Pi+q5Eia+hNZRb8jUlUdUroIjB3LoeWrGLx/fNYvERY9cYWNh7owRFauDQ+H5xzTvgYiTT0Rtw4li70c9fd4eqdX/ju54RRmTY1Y4oTTaQLWh3Kzs7WnJychs6GMVUrKHCB3+93gTo315X0I+R+7fHLoe/wcsEEDtG23L5O7OGitLe5PvA4I1nsbhTbtrljZWRAUVHNG3pVYdw4ihasZxAryQ3OtJrdfgOL1namZd/u1tDbTInIClXNriqdlfyNSVQlJXRVePxxOOFE4emCb1UI/ADFdOafgasYxSIm8AHLdAhccom7oUyY4J4CQtVKgUDV7w14HuzcyeHnpjNVZpQF/i7s5tWDF9CylVjgN8kJ/iIySUQ+F5FNInJnjP2tRGRGcP8yEemXjPMa0yhENuxGTXb+v/8LN9wAh0rC245nI8NZwhmsokeL8vXu85nAcJZx3bJpFPo7w4IF0KcPdO0KvXq5v5VVLwWroI70OoarhnzBXD2rbNc/+R799Cu4/PLav3Ngmg9VTeiDm7f3S+BYoCWwGjglKs1PgL8Gl6cCM6o67uDBg9WYpuyxx1RdEd19TmO1fsAE1eHDVdPTVYcPVy8tXZczWK+R5zQtzSuXvjO79WFu0lLSyh8I3Pd37Kh40rw8/YbOOpE55ZLfd0ex+05l3zXNAm7u9CpjdzJK/kOBTaq6WVWPANOBKVFppgDPBZdnAmeJ2HOnaSZi9PR5/324+eZwkm/zOssYxgSZDy1awNdfw6JFyKiRZKev5tnRT/H5wiIuujB8jD104WYeZiCrmOs7B9LS3HRblTQAz/6gFUP4mA8Il/hvuw3u/l2H2jcem+apOneIyj7AJcBTEevfAx6NSrMW6B2x/iWQEeNY04AcIKdv3751eXM0JjlKS1VHjFBNS3Mler9fN2xQ7dQpXOrOZrkeoE3sUnsgoJqXpzp2rDtGx476Nufr8XxRobB/1rB9+sLzAd3/xXbVTz913w0eY9XcIp00yavwnft+46nnafhcO3ZoeINpjqjHkn+sEnx0a1R10qCqT6pqtqpmZ2ZmJiFrxtQhz4MxY2DJEtcQu3Qpu4aezwXn+ykudkl69VJmDbuftmlHYpfafT73WbTIHWPvXibzDms5lQfkTtr7DpSdbu6y9nz3ah/dTujIhNOLuLTVLIYN9chotZeBZ3Xl3XfD/8w6dvB48QXlnp9LuAki9M6BPXQbktPgmwvB7gRObyAvXhoRSQc6AbuTcG5jGk5hoZsLMchDuPKTn/LlFjdkVtu2yr//LfRY/C/Yvh127YrZKExGBrQPDrbmc/8kW0kpd4xZwhdbW3P1pYcQCZeVDtCe+Uxgpv/bfLzcxy5/57J9Isq0abBxk48rr7Igb+JLRvBfDvQXkWNEpCWuQXdWVJpZwDXB5UuAD4KPJ8Y0XVlZMGpUWSB/hJt4j/PKdv/zkWIGDSJc4k5Li13yLiqCA8ESvgisXg15eTB/Pj16p/HcK23YulX43S07OIkNMbPSlgNclvkBn6yAv/3NqvRN1ZLykpeITAb+guv587Sq3i8iv8bVPc0SkdbAP4FBuBL/VFXdXNkx7SUv0yR4HuzYwboTvsPgAx9ymNYA/KzPi/x+65XVq2JRdV03Fy92VULRTwYR59LOXdiy7yjWcwrFdKYXuRzn20LPlW/jO22AVemYar/kZW/4GlNTnldueOUj2woY1jePVQwCYOBJh1i2qrV7maqWx4xr2zY45hjK5mAMjRg6fbrV6RvA3vA1pm7EGMfn7gczywJ/K0p4YUbLmgV+qF7g9jy46qpw4B82DLZsccu9ekHPnolNGmNSigV/Y2oiaqTN2S8X86cHw/+MHpC7GNCtqO7OvWRJeH3FCjcMxOL
Fbl3V9RqqyeifJmVZ8DemJiLG8ckbfCHX3BruaTNZZnPT6JV119oaOndamntC8Ptdb6PsiCf89u1d7yFjqmDB35iqRL7BGxzHJ7A1l++2/RdFRa6apmdP5dk12fg+nFd3de7BYaFZudK9XxCcl5fXXnPL4HoNFdXRk4dpViz4G1OZWGP1+3z87uluzJvngrwIvPiikDmghsMu1yYvZ50FZ57pbkRff+16BvXoYUM3mBqzCdyNqUyM2bQWfNGNX/4ynOQXt+1n/Lh2xH6RvY7ysmSJayQO3WxqO8G7SVlW8jemMlFj9eceyWLq1HCHmjEdV/OLhzLqp5dNZTN7WTdPU0NW8jemMqGx+gsL2dcmiwvHCXnBwUuO6uLx0t4ppAcOh+fYjZrBq67yYiV8kygr+RtTFZ8Pf9duXHGlsGqV25SerrzyitB71NH1W9duJXyTJBb8jamG22+Ht98Or//t2D9w1kSNPYNXjPH9jWlsLPgbU4WHH4ZHHgmv38Vv+cHmn7vql+iSeKzeQcY0Qhb8jYklWHr/10zlttvCmy/L/ID70n4Vv5onRu8gYxoja/A1Jlqw9P72wk5M1dfw1P0zGT4cnp0zHt/+bfEbXEM9ckIjdFqfe9NIWfA3JnpEzcJC5ixszcXeK/iD/0ROPBHefBPatPNBu0p69FiPHNNEWLWPSW0x6ug/Wp/BRbxZNjb/sccqc+dvvPl3AAAUS0lEQVTWoBBvPXJME2DB36S2qDr6Ze/s5lvnHOaQ5wJ/nz7K3LlCr14NnE9jkiyh4C8iR4nI+yKyMfi3S5x0ARFZFfxET/FoTMM56iho0waAVa2HM+mqo9gfaAtAd/KZO2MX/fo1YP6MqSOJlvzvBOaqan9gbnA9lkOqOjD4uSjBcxqTHJ4HY8fCvn2s52TO2f8ae4rdP4kMCpk7+Gf0H961gTNpTN1INPhPAZ4LLj8HfDvB45mmpKm/zFRYCB9/zEaO5yzmUkQmAJ07K++97+OU5c9Zvb1pthIN/t1UNR8g+Ddek1hrEckRkaUiYjeI5qA5vMyUlcXWMy7iLOaygx4AtG/n8e67wqCzu1rgN81alV09RWQO0D3GrntqcJ6+qponIscCH4jIGlX9Msa5pgHTAPr27VuDw6eY6k72XZdivcxUl4Oa1YHtucrE3a+yLfjPoI2vhLffbsWwYQ2cMWPqQZUlf1U9W1VPjfF5EygQkR4Awb874xwjL/h3MzAfgrNdV0z3pKpmq2p2ZmZmLX9SM9dYStyxhhduQtVABfkeZ524jc1bXeBv2VJ5851WjB1npX2TGhKt9pkFXBNcvgZ4MzqBiHQRkVbB5QxgFLA+wfOmrvoYPqA6QVwVXnoJcnJg+vS6vSkl+aayezecc1aAzw8dDUA6pcx8ag/nnK1N5uZlTKISDf6/B84RkY3AOcF1RCRbRJ4KpjkZyBGR1cA84PeqasG/tiqb0KMm4gXU6gRxz4Nx41yagQOhVy8YPbr8TamgIDmBNJk3Fc+j+PMdnDexlDUbWgDgI8BLJ/2GC8861DieqIypL6raKD+DBw9WE0cgoLpjh6rn1f77Y8eqpqe7v4FAeN+OHW47uL87dlT4eumWXPVcWA9/0tJUR4xw3xkzJv7xa5r/auSnyt+6Y4eq36/7Rp6ro1hQlmURT59/+BuXx7Q0VZHan8eYRgLI0WrEWBvbJxXEGLsmbmNtVhY6YiRbF29nyfFXs+S+LD5dA7t2wTffwDffKAcP9qINB8hiJ0ezlTP5hKH9dnPu67+kK7tcbO3TJ/bx/X7Xt375cvfkMm+eGw4hnkQGSgs9NSxezL6BY5ic82sWMbps91//sI/vXXYYbl8MgYDbaJOgm1RRnTtEQ3ys5B9HZaX26qb3vPLbPE8DAdW5c1V/8hPV3r09jS7YV+eTRqmOH+/pM097emj02eWOX5aX4cPDX6huCbsmTwqRaYNPDcV00BEsKpfXB1vfqer3l78WY8ao5ufX/onKmEaAapb8GzzIx/tY8I+jptUg8dIHg+TmLz29+27VPn1qHuwr+2RkeHr3Lfu0YIdXPi9paeFEI0YkN9BG3+j8ft0z8nwdzpJyefszt7p8RF0LC/qmObDg31xEB6YYpfZKg1cofVqaK3UHnxSWLlW95BJVny928O7QQfXss1V/8QvVt95SXbVKdetW1b17VT1/QPduKtCNG0p11on/o/fyKx3BIhUCFY7Trp3qPfe478XLS9JE3egK1xXokOzyTzEPtb6j4hOJMc2IBf/mIF4VT2SwDwRcdUVamvsbK6CWlqqOGKGBtBb6xil36ejRsat1uqZ/ozfe4Om8ee4r1cpbWprqmWeqpqdrPt30Ad/PtG+v0grH7tFD9cV/BtQbHSOv1Sl5VydNxI1x45Ar9Pjjy//OR+VG1dGjVfPyLPCbZsuCf3NQnSqevLzyUTYvr0KSw1/v0Cd9P9IT+Cxm0D9H3tPXmaKH09pUv5dLZN6ievqUjpmgM32X6Km+tRXONY75uoYBLm1envtU1YZRk3aOQECXzi7SjIxw4BcC+gQ/qvw6GtNMWPBvDmJV8UTLzw93URRx60ElBwP6xAPF2rdvxZJ+ixaqV1+tunpVFdVI8Urc0Q2l27e7c+fnl90U/Pj0eb6r3cmr0DB8a88Zumfk+dXrYhnZVpCWppqbq7pmTcwnh5kzVdu0CZ+rNQf1X3zHfc+qe0wKsOBfn+qywbCqY8e4QZSUqD7+aED7tNpRIeh36uTpz37m4mfMc0SWsivrrx8IuFL79u3l6/FLS933Ik5anD1Rf3q7V/agEPp0I1+f57vunYHKArPfr9qpk5Y1RnTsGPoxqocPq44dqyVpbfW2XjPKV2MdFdDFvlFuxedzNwwL/KaZs+BfX2ra9TKR88S7CZSWqq5Zo4f2leqjvy2O2VUzs2tAH3gg2PAaT2mp6rx55atz4vUUCv3m4cMr9uDJzdVykT54bdatCeiECVohb6NYoEsH/MAF+Vi/N7KKKbqFet48XZI2Sk+hfBXTccepfvF5NZ6cjGlmLPjXl0TfQA2pLLhXdoMJBHTvyPP0j/K/2kPyKgTWLHboH495TPfvqyLwlZaGS9ehoB9d8g/lLbq+f/Dg8oE+Pz/uW7Oepzp9umqvXhVvAhecfVDfeUc1UBqoWKUUWh89uiyfq9sO16mXByr0MrrwAk93bSiouieUMc2QBf/6Up16+aqUlroSdLzqlTVrYt5gdu5U/flt+7QzuysG/Qy//un/AnpgcxV196Htn35aoUQd93vRvznYm6hCu0HoJhDj2uzbp3rH93dqCw5XyPvRvUv117579WOy9QjpZTcB/7Y8Xb/O02f+XqqTOy6o8L127Tx95L496o2phycxYxopC/71KZGuioGA6rBh4QgmEu6xEyrx+3zhqpVOnXTrZr/edFP5hs3Qpzt5+udjH9ED+73y54jXZTRGiVo7daq6G2b09uqmi9r3RbuBejkvx+yFBKo+/JrFDs2kQNu2if/m8YUXeLpl6KU2Ro9JeRb842mIaoCqBlKLrscO9diJqF7xQOcxTi+VVz
Q9vWIQPK5fqf7tiYAe2uIGMSv3G+NVTUVvj9WLpq7bM0pLVVev1s+yr9Lb5c96VPqeuAE++iME9OKMeboiJ6pdoKoGZGOaMQv+sTRU42xl7QKeV753zJgx5d7m3TPyfH1YbtaTfRtiBsCBrNTpJ/1S/UcqCdjxqqaqqrJKVntGrGsS5zyHfG31xePv1avkBT2aryr83m7dVC+4QPX++wK6cUlh+EYX3UvJxugxKcqCfyzJCGZVPTlUFXxjBaZg6Tf05mnJwYC+8ew3evnlnraJU9UxbsRhfcc3OdxNMl5pvqrxa6ozNERtStHxuo/GuulGn8fvL2svOJTWTvOGfVsLdni6e3dEFkJdTaPbHqxx16Q4C/4hkUEo0cbZWEEsFIRCAT06+ObllY0nr3l5FYdiCB6zNK2Vvnfa7fr9awLaKW1vzIDfvr3qj3/s2mZrXZpP5PpVd3/0dcrLq/qmG+848W5YNga/MTFZ8FeNH6xrWjoMfSfi7dWywB6qshEJl1qjX5IKvQC1bVtZJPdAv1hYoI/9do9+W17XTnwTt277tHab9PFHA+E++hETlNS4NJ9M8Ur00TfAyF4/VVXJVCfvVr9vTFz1EvyBS4F1gAdkV5JuEvA5sAm4szrHTkrwT1Y1T7w3XvPzy7/gFDpH5M0iuN8D/XLAhfoyU/WHPBmzPjvycwxf6l3cr59yqnrE6AHUGLoyxru+8UYerWocnyreZ4j5BGf1+8aUU1/B/2TgRGB+vOAPpAFfAscCLYHVwClVHTspwT8ZVSDxqnE8z5W8IycniRheYfVq1Zdf8vTuns/oebyjXSmssvdK755+veUW1aWLA+qtjup3H6MHUINXdVR2fWOV4KvKe2XtFcl4gjMmBVQ3+Cc0jaOqbgAQkcqSDQU2qermYNrpwBSg7idxF3HTBEZOYRgpcnpD1djpoqcR7N6dQyXCzi0e+f91I1tW9eOrFufylb83X306kK/6w5YtSiAggADXxs1ehw7K+BaLOWfPq5xz5i5OXPY84gPwgZ4KY8aEzxsxzWKtpzVMtsqur88XznNIVXmPtz/etJPRxzfGVFt9zOHbC9gWsZ4LDIuVUESmAdMA+vbtm/iZo4P7zp34j8riUIlQctCj5KLLKFmxjv2nj2Svvy3F67axt/9gim+8h737fBQXw55vYGe7/7BzkI+CvBYUdIJ9+wB8wBPuPKXB8+0JfuLo0kXJzhZGjIBzzoFhw4QWaSOg8PiKwVME5s8vH1hDv+eDD6CoKPYNrb7FCvLxVHUzjre/Md3wjGkmqgz+IjIH6B5j1z2q+mY1zhErOmmshKr6JPAkQHZ2dsw0VXnwQXjpJSgpUUo25VFSohxKL6bEa0mJ15VAWXZ8wEy3uDLiAJ8DN0Vnv3WN8iB4HMNXnNIlnwF7FnPGiSUMefPnHNffFyNWVxI8IwNrxGTkZROfN3Tgr42qbhax9oduCgUFTfM3G9MIVRn8VfXsBM+RC/SJWO8N5CV4zLi2b4ecHHBBu7fb6E/uOdIpJYuddDs1k77bl3LMnpUc0zfAMX+5hWPuu45jV71G2yEDXEbUD5vSodOPQBKopohX9ZFKpk4tf/Pz+Ro6R8Y0WfVR7bMc6C8ixwDbganAlXV1stZVFNJ9BGjDIdpwiNaU0LqV0vbwN3Ty7aeTt5uO7KUje+n0/YvpeFwGnZ59iMzNH9Pt1EyyXvoL3f77Irp8/B9k1EhXLaOjofDEcDXFRU9D4QOQmVm+pJ5oVUWqV33Yzc+YpEoo+IvId4BHgEzgbRFZparniUhP4ClVnayqfhG5EfgPrufP06q6LuGcxzFtGkyZ4m4CrffupPW4YbQJ7KO1HKH1oJNJX70C2reHAwcgOzv4mOAHFcrVRv3zZhdkN8yFXbvCwX3hrPJ10iLlg1Co2sLzKnboSaTKoqr68uYu1W9+xiSZuJ5BjU92drbmuPqb2lOF8eNdwAgFer8f0tJg1So45ZRw6XzECPedxYvDgTs9HXJza1fCLCiA3r3d+RI5jgmLbMBPtZufMdUkIitUNbuqdM270jRUWs7NhUWLXIkxPR1GjYIBA1wpfe5cWLECZswIpx0zxt0gsrNd9U1thEqq6elWUk2W0FOVBX5jEta8gz+U7z0yfTps3eoCPbiS5MSJcMYZ0KuXewpQhTlzYOhQ96QwYYIrvRcUuH2RPC/2dih/45k/3wKWMaZRaZ7BPzooh7pJ9unjqnr69HHVQfn5rpoHXNoFC6BvXxg7FpYvd0F/0SK33ru3+47nlT9m9PZIVlI1xjRSzS/4xwrKoZ4igQAUF4eD+iWXuG0hIm7fxx9DmzZuW9u24RtBqJcJxO59YowxTUTzC/6xgnKo/j0tDTp1cvXwQ4aEXghwJfTVq11df3q6q/I5eNDtO3DApY2uu7c6fWNME1Yf/fzrV6wugZHdJDMy3NAI0f3wTzstnCZ6X6zhFFK966Uxpklrnl09q9slsLJ0iXQrtC6JxpgGktpdPavb0FpZuto21lanIdgYYxpY8wz+Dckago0xTYAF/2SzhmBjTBPQ/Bp8G5o1BBtjmgAL/nWhJhOcGGNMA7BqH2OMSUEW/I0xJgVZ8DfGmBRkwd8YY1KQBX9jjElBFvyNMSYFNdqxfUSkENja0PkAMoCihs5ENTWlvILlty41pbyC5TeZjlbVKqcgbLTBv7EQkZzqDJLUGDSlvILlty41pbyC5bchWLWPMcakIAv+xhiTgiz4V+3Jhs5ADTSlvILlty41pbyC5bfeWZ2/McakICv5G2NMCrLgH0VELhWRdSLiiUjc1nwR2SIia0RklYjUcr7JxNQgr5NE5HMR2SQid9ZnHqPycZSIvC8iG4N/u8RJFwhe11UiMqsB8lnp9RKRViIyI7h/mYj0q+88RuSlqrxeKyKFEdfzhw2Rz2BenhaRnSKyNs5+EZGHg7/lUxE5s77zGJWfqvI7XkSKI67tvfWdx4Soqn0iPsDJwInAfCC7knRbgIzGnlcgDfgSOBZoCawGTmmg/P4BuDO4fCfwQJx0+xvwmlZ5vYCfAH8NLk8FZjTivF4LPNpQ1zMqL2OBM4G1cfZPBt4BBBgOLGvk+R0PvNXQ17W2Hyv5R1HVDar6eUPnozqqmdehwCZV3ayqR4DpwJS6z11MU4DngsvPAd9uoHxUpjrXK/J3zATOEmmQWXsa03/bKqnqR8DuSpJMAZ5XZynQWUR61E/uKqpGfps0C/61p8B7IrJCRKY1dGYq0QvYFrGeG9zWELqpaj5A8G+8OS5bi0iOiCwVkfq+QVTnepWlUVU/UAx0rZfcxclHULz/thcHq1Fmikif+slarTSm/1era4SIrBaRd0RkQENnpiZSciYvEZkDdI+x6x5VfbOahxmlqnkikgW8LyKfBUsKSZWEvMYqkdZZF6/K8luDw/QNXttjgQ9EZI2qfpmcHFapOterXq9pJaqTj38DL6vqYRG5HvfEMrHOc1Y7jeW6VtcnuKEU9ovIZOANoH8D56naUjL4q+rZSThGXvDvThF5HfcInvTgn4S85gKRpb3eQF6Cx4yrsvyKS
IGI9FDV/ODj/M44xwhd280iMh8YhKvbrg/VuV6hNLkikg50omGqB6rMq6ruilj9O/BAPeSrtur1/9VEqereiOXZIvK4iGSoamMd86ccq/apBRFpJyIdQsvAuUDMHgGNwHKgv4gcIyItcQ2U9d6DJmgWcE1w+RqgwpOLiHQRkVbB5QxgFLC+3nJYvesV+TsuAT7QYAtgPasyr1F15hcBG+oxfzU1C7g62OtnOFAcqiZsjESke6itR0SG4uLprsq/1Yg0dItzY/sA38GVQA4DBcB/gtt7ArODy8fielasBtbhqmAaZV6D65OBL3Cl5wbJazAfXYG5wMbg36OC27OBp4LLI4E1wWu7BriuAfJZ4XoBvwYuCi63Bl4FNgEfA8c24DWtKq+/C/4/uhqYB5zUgHl9GcgHSoP/314HXA9cH9wvwGPB37KGSnrbNZL83hhxbZcCIxsyvzX92Bu+xhiTgqzaxxhjUpAFf2OMSUEW/I0xJgVZ8DfGmBRkwd8YY1KQBX9jjElBFvyNMSYFWfA3xpgU9P8BRuK17um1pAsAAAAASUVORK5CYII=\n", "text/plain": [ "<Figure size 432x288 with 1 Axes>" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "plt.scatter(X, Y, color='red', s=5)\n", "plt.plot(sorted(X), model.predict(sorted(X)), color='b', linewidth=3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Exercise 2 (Binary classification)\n", "\n", "Now will solve a binary classification task. The training data consists of data points $(x_i,y_i)$ with $x_i \\in \\mathcal{R}^2$ and $y_i \\in \\lbrace 0, 1\\rbrace$. \n", "\n", "**a)** Generate the data using the methods `make_moons` or `make_circles` from the package `sklearn.datasets`. Plot the data points (blue for class 0 and red for class 1). Also split the data set into training and validation set." ] }, { "cell_type": "code", "execution_count": 154, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAeYAAAFpCAYAAABJdYvCAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJztnX+snUd5579zr+9dc+20xcfQFhyfG0iFSNLIJO5CtUs2JBWwoaL8qiDcpE4IdXNTUKj6B2WvtK2ELq2EVIh2oZUpAYdzKCtgEbsUCQKChlXbVW6ipJs2bRrA19ilNLmIENeAHd/ZP14f33PPeX/MzDvzzjPv+/1Ir2wfn/O+M/POzHfmmWeeUVprEEIIIUQGM7ETQAghhJAtKMyEEEKIICjMhBBCiCAozIQQQoggKMyEEEKIICjMhBBCiCAozIQQQoggKMyEEEKIICjMhBBCiCAozIQQQoggdsR46N69e/Xi4mKMRxNCCCGN88ADDzyptX6OyXejCPPi4iLW1tZiPJoQQghpHKXUuul3acomhBBCBEFhJoQQQgRBYSaEEEIEEWWNmRBCSLqcPXsWJ06cwI9//OPYSRHHzp07sW/fPszNzTnfg8JMCCHEihMnTuCiiy7C4uIilFKxkyMGrTU2NjZw4sQJXHLJJc73oSmbEEKIFT/+8Y/R6/UoyhMopdDr9WpbEijMhBBCrKEo5+OjXCjMhBBCkuNf/uVf8Ja3vAUvfOELcdlll+GGG27AY489hmPHjuGKK64I8syf/OQnePOb34xLL70UL33pS3Hs2LEgz6EwE0IISQqtNV7/+tfj2muvxTe/+U38/d//Pd73vvfhe9/7XtDnfvSjH8Wzn/1sPP744/id3/kdvPvd7w7yHAozIYSQoAyHwOIiMDOT/Tkc1rvf1772NczNzeH222+/8NmBAwfw8pe/fNv3jh07hpe//OW46qqrcNVVV+Gv/uqvAADf/e53cc011+DAgQO44oor8I1vfAPnzp3DLbfcgiuuuAK/+Iu/iA984ANTz/385z+PQ4cOAQDe9KY34atf/Sq01vUykwO9sgkhjTIcAisrwPHjwP79wOoqsLQUO1UkFMMhcPgwcPp09u/19ezfgPt7f+SRR3D11VdXfu+5z30u7r33XuzcuRP/9E//hBtvvBFra2v45Cc/iVe96lVYWVnBuXPncPr0aTz00EM4efIkHnnkEQDAD37wg6n7nTx5EhdffDEAYMeOHfjpn/5pbGxsYO/evW4ZKYDCTAhpjBCdNJHNysrW+x5x+nT2eeh3fvbsWbzjHe/AQw89hNnZWTz22GMAgF/6pV/C2972Npw9exave93rcODAAbzgBS/At771Lbzzne/Ea17zGrzyla+cul/e7DiEExxN2YSQxijrpEk7OX7c7nMTLr/8cjzwwAOV3/vABz6An/3Zn8XDDz+MtbU1nDlzBgBwzTXX4L777sPzn/983Hzzzbjnnnvw7Gc/Gw8//DCuvfZafOhDH8Lb3/72qfvt27cP3/nOdwAAzzzzDJ566ins2bPHPSMFUJgJIY0RopMmstm/3+5zE6677jr85Cc/wUc+8pELn91///34y7/8y23fe+qpp/DzP//zmJmZwSc+8QmcO3cOALC+vo7nPve5+M3f/E3cdtttePDBB/Hkk09ic3MTb3zjG/He974XDz744NRzX/va1+Lo0aMAgM985jO47rrrgsyYacomhDTG/v2Z+Trvc9JOVle3L18AwMJC9rkrSil87nOfw7ve9S780R/9EXbu3InFxUV88IMf3Pa9O+64A2984xvx6U9/Gq94xSuwa9cuAMDXv/51vP/978fc3Bx2796Ne+65BydPnsStt96Kzc1NAMAf/uEfTj33tttuw80334xLL70Ue/bswac+9Sn3TJTlL4RHWRUHDx7UPI+ZkO4xucYMZJ30kSNcY06JRx99FC9+8YuNv981h7+88lFKPaC1Pmjye86YCSGNMeqMu9RJk+z98h2bwzVmQmrge39mF1haAo4dAzY3sz/ZYROyHc6YCXGEW38IISHgjJkQR7j1hxASAgozIY5w6
w8hJAQUZkIcCbE/kxBCKMyEOLK6mm31Gafu/kxCiBkxjn287777cNVVV2HHjh34zGc+E+QZAIWZEGeWlrL9t/0+oFT2J/fjEhKeWMc+7t+/Hx//+Mfx1re+NehzKMyE1IBbfwgxwPO+wljHPi4uLuLKK6/EzExY6eR2KUIIIeEIsK8w1rGPTUFhJoQQEo6I5z76PvaxKWjKJoQQEo4A+wpjHfvYFBRmQgghF/AeZjbAvsJYxz42BYWZEEIIgK3l4PV1QOut5eBa4hxgX+Ho2Md7770XL3zhC3H55ZfjD/7gD/C85z1v2/fuuOMOHD16FC972cvw2GOPbTv28cCBA3jJS16Cz372s7jzzjtx8uRJXHvttThw4ABuueWW3GMf77//fuzbtw+f/vSn8Vu/9Vu4/PLLnfNQmj8e+0gIIQTIZsh552X3+9mugxG2xz527dxHHvtICCHEC8HCzPLcRytoyiaEEAKAYWalQGEmhBACgGFmpUBhJoQQAsAuzGwM/6QU8FEuXGMmhBByAZPl4J07d2JjYwO9Xg9KqWYSlgBaa2xsbGDnzp217kNhJiQgHXNGJR1h3759OHHiBJ544onYSRHHzp07sW/fvlr3oDCTViFJCAOECCZEBHNzc7jkkktiJ6O1cI2ZtIYgwRFqUBYimLjhPSoVIQKhMJPWIE0Ig+0J7SgmAy8KN2kDFGbSGqQJIfeE+qVq4CXNYkKIK7WFWSl1sVLqa0qpR5VSf6eUutNHwgixRZoQck+oX6oGXtIsJoS44mPG/AyA39VavxjAywD8tlLqMg/3JcQKaUJosyeUVFM18JJmMSHEldrCrLX+rtb6wfN/fxrAowCeX/e+hNgiUQiXlrLg/5ub2Z+h0+J7jVXSmm3VwEuaxYQQZ7TW3i4AiwCOA/ipsu9dffXVmhDil8FA64UFrbMV1uxaWMg+j3m/wUDrfl9rpbI/XdNTdS/f+SfEJwDWtKmWmn6x8kbAbgAPAHhDwf8fBrAGYG3//v3BC4GQOvgUk6bo97eL0ujq9+Pdr2mxTPG9kW5gI8xezmNWSs0B+AKAL2mt/7jq+zyPmdjQdNCQycAgQGYyjW0Wr2JmJpO+SZTKTOkx7md6vi8hbcfmPGYfXtkKwEcBPGoiyoTYEGMLTKrevb7XWF3vN74unSfKAB2yCCnDh1f2fwBwM4DrlFIPnb9u8HBfQqKIZKrevb690l3uNzmQKoIOWYQU48Mr+/9orZXW+kqt9YHz1xd9JI6QGCKZqndvnlf6oUPZIMbFq9rFyz1vIDUJ93ITUg4jfxHRxBBJafuhbRjfnrW6Chw9Wm8ZwHa7V9mAqe4WNklbt8pIJZ1EMKZeYj4vbpcipsTaAtMG717fXtoxn5nKVqhU0kmaB017ZdtCr2xig6SjHFPCt5e2CaE82lPx7k4lnaR5bLyyKcyEtJRYIhFiIBVjkOFCKukkzdPodilCiExCrJWbrJ+GCEOaikOej3RyjZpQmAlJAJfOusqr2vaeMY9VDOWQ51sE66aTR1cSAHT+IkQ6IRyKXO4Zw5lsMs0+HfJCOWrVSWfsMibhAJ2/CGkPIdaKXe7ZtvVTiY5abStjsgXXmAlpASMza4iwli6BW1JZ5zVFYoS3tpUxcYPCTIhAxtcai6jTWbsIQMqBV/KQKIJtK2PiBoWZkIC4OhdVhbas21nnCYBS2UCgzNvaNkSnZPLKYG4OOHUqnkd028qYOGK6GO3zovMXmaQNkbYmqeNcpFS+E9DIEci1fMbLudfLLmD6ealFq3KtP5PlMTu7vRzm5tIqByIX0PmLpESq5x9XUce5KIRjUlE5P+tZwMaG32c1ia/6s3dvfjn0esCTT9ZPJ+k2dP4iSZHq+cdV1HEuCrHWWFTOeWIEyD/mcoSv+lNUDqPPGfiDNAWFmUSnrnes1A6zjnNRiLVGW6FNxRO4Ce9qBv4gTUJhJlEYF9OZglpoIgySO8y6s17foS2LyrPXS9sT2Jd3da9X/HlbrTpEJhRm0jiTYnru3PR3TIVBcocpzcO2aKBw111h0tmUJcOX2f+uu4D5+e2fzc9nn0vc80xajKmXmM+LXtndY9z7ddLzdXTNztp71RZ5LysVMjdpkeeJHdr7velziX159Rfdh6EySV1g4ZVNYSbByeukfYkpO8xymhbIET7fi4StdLHKkbQHG2GmKZsEY2TKvOmm8mAZI1ycjRgpqZxYpn5fpl8pPgSSliWkOjsSj5gquM+LM+b2YzpL9jH7kDCjkkosU7/tjJkmZDM4c08X0JRNYlPUodZdU3ahy8IdS9hsBGQw0Hp+fvt35+ezz+lDsB0OVNLFRphpyiZBqDJZLiwAR4/62wpURF1TaOpmw1imfhvT7513AmfObP/szJnsc4kHTcSE3uEdwVTBfV6cMbefshlzk7PWOjOMtpgNpVsMyqwqbXkHvuCMOV3AGTOJTdFMbTAIO0OepM4MQ/IeaRt8ByppEklOV6GwscrQ2bEbUJgbJnXTqGn6pXSodUyhNBs2Q1nELUDewMJnG85barn11uxAjbz7S2lXJDCmU2ufV1dN2amb5VJMf50002zYDINBdrzieBlLPW7RRxswCbaTUhsjZoBe2TJJvaNPNf11zupNbSCSKtLXwUfUbQO22whTaWOkGhth5nnMDTIzkzWzSZTKzHTSST39LgyH2Zry8eOZ+Xt1lWbDLlO3DRSds11Fm9tYV+B5zEJJfeuHa/pTXleXtr7ZZSTUo6o2UJVGV/+EVPoI4gcKc4Ok7lHpkn4pIRWlIEFcfNJUfqTUo7w2oFSWnr17M8etsjQWCezsbHafXm/6hKuU+gjiCVObt8+rq2vMWqezllaEbfpTXZcOQdvWrJvMj6R6NGoDowhkNuvDJmWWeh9B8gGdv4hvXDsL6SEV8/IVqmOUJC4+aDI/EuuRSdjZvDRSeLuJjTDT+YtUMjIjjgfbWFgADh0CvvjFcseoImeXfj9bs41JXr7m57Pu9OzZrc8WFvzsFW2b81yT+ZFYj4ryP4mEuk7iQ+cv4pWiCFh/+qfVa34S19XLjqM8c2a7KAP+on2l7vw3SZP5kViPTPIZO42haJuvhDhMp9Y+L5qyZVJkYjNZRyszY0oy3bnuI/VhMvUdnEJiWYZcM5eU91F6JvM/P691rycnjSFom69EU4BrzMSWssZmupYWe83PBJu8hFg3rSMuEjtEaWLZNF3Mf9t8JZqCwkysKWtseYJQNIuW3jirZv/z89PhIWOL3wh2iG50QTybzKNER7wUsBFmrjETAOUHNuQFzr/9dnlrfibs2VP8f/0+cPfdwMc+JvOQAB6qYY+U/c8haTqPbfOVEImpgvu8OGOWh8tsLLWZSN5hCaNZsvS0a80ZswtdKLOm8yhxSSUFwBkzsaXK6zXPCzO1cJUrK9Me1wAwNyc/7YBMz2TpFFkT1tfb41XctCWFR082gKmC+7w4Y5ZJ0Qy4zghZ0qy6bH05+mjfsKDqnJQl5T00SdFscrIupDzj64JVoA2Azl/EJ64NX5rJq8wjO2onFrigpL2HJmmT42KIgTNpDgoz
8YqrF6a0kfxgUCzMUT1KAxeUtPfQNJOCJrIOVFAlvlUWka5aTCRhI8wMyUkqcQ2HWBayUKk45xvv3QtsbEx/HjVsYuDYlm0LBVoXieE9q6iT5qKQulwXbhaG5CRT1Amh5+p0VLZ9Qus4W1fuukugA1Xg/Sfc3rKdFJ3o6jh4FYXU9RFmlgTCdGrt86Ipu1lihYI0DX/ZtElVnFmPa8yNI64OVFBnOYIBQWQArjGTcWKuMY53gCmu7TVGYKVITYjIduoMrrruYyAFG2HmGnMHkLLGmOLaHiFSGA4z83PZMatFv+Mac3y4xky2IWWN0efaHo+dI22krF67BvRhQJD0oDB3ABtBDCl4vjqILsQ/Jt0jZL1OLUpf1/FiylZK3Q3gVwH8q9b6iqrv05TdPCZmsFRMXjSJkzbCet1ubEzZvoT5GgCnANxDYU6XGB2Dy7qZlDVzQnzCet1uGl9j1lrfB+D7Pu5F4lG1V9K3mdvVdCdlzZwQn7BekxFcYyYXKOsYQqx/uQY+SDFABCFV5NXr+Xng1Ck6OXaNxoRZKXVYKbWmlFp74oknmnossaBM8EJED3KNZlTXiYwe3UQik/W618sGwRsbdHLsGt72MSulFgF8gWvMaVO05hti/SvWmnYKDm6E0BmsXXAfM3GmaFvFnj3536+z/uVqkq4z42193ODJwrnjDpoHEqVOfGySNl6EWSn15wD+GsCLlFInlFK3+bgvkcFwCDz99PTnc3P11nVdTNJ117pb3dnlFc6f/Ak3fCdK0aC3aJBMWoRp7E6fF2NlhyNETOSiWLu9Xv17+0qLadzfVscNLjtsuHWZbT+DgdZzc9Ovb36+2VjnjLPuB1jEyqYpu0WUzSbrmH+LZpPf97hBzjR9dWe8rfboNi0EX+YBn1509MibYmkJ+Kmfmv78zJnmll4YZS8Spgru8+KMOQxlM9s6x/6FnmXanJzjIy1JzQBsEtvkjNnnWZIdOpfStu7FPrKx1RamhgGPfewmZUcr1mlcoftNm8afVB9edwRgm1mTA7B9FZbPHrsjvb9L3Y1dNLEHBm2CwtxRTCdMLo3LRmNCzwqSmPHWHUEMBlrPztr3yoOBfrrX1+eg9LfR1x/ftayf7vX9F5bPHrsjvb+LyMYeiPZ6ssdMSfQF5+mUMKf0YkJT1IibbFwpzgq84yqq478vm/mWCFZjHTlnzNa4jj9i9XGDQeZoNpneuTkZ/WzsQYstnRHm1F5ME+Q14ibLKcVZgVdqiOoFqkwfk4U59tK/M9vXN2IQXuOKXtrysp2KDAb5I8dkK0AxqY0/JO3GyCO18uyMMKf2YmLS1Kg7tVmBd2xFNY8yZ4FJwcoRyFNYmBLnIFbhyZe2vOxnTbzXS7gCFJPaAFT6CoP09E3SGWFO7cV0gc4PlmxEtYiiQpydnf59wXe/jX7z5W/y8sfFvI65P1GaGID6eob0tiw9fZN0RphTezFdILVZgXdsRLWIvEIcCf5kT1swEDgH1Xz5V42UTbzGGxhZt8Y6k0OXdrFJT98knRHm1F5MVxh1fCM9ytOT1uKrUo4X4qTgjd+vYCDwndl+88JTNVIWEJms7X1G2StwGZBIH8RIT984nRFmrdN6MV2i7R1gKT4rZZXYmRR0U42kKi0mG+0DV5K2W9mqVlI62R6F0ClhDgUFvx5t7wAbw8SRoqyyNj1CKktLmZm/oYbWdr+UsiL22R7ZP9pDYa5Jp2d7nmh7B9gYdUc4pr9vyispcsNq+4CxqIiLZtEu7VHAa0wSCnNN2t54m4Bl6Inl5fI15ipMZ9xN9bSRp1pdEJW8ImY8mPhQmGvC2V59utABBqfIO3t5ufw3473yrl35lXk8SoRrVJhEbZkJJ90Zn+2R/aMbFOaacETohy52gF6x3Xpluh1pUphdgpX79DxnBWkEX8UtPX62VCjMNeFsr0Wk3PnbBisx3Y40Kbq2I1EfI1c2siQZDGTHz5YMhdkDKffn5Dyunb+Ul28b3tPm3M/JaFw25VQ2wzYtO5qlkqTotUmJny0ZCjMhWruvnUqZyVWZpidNzUX5LXIeGwy22yVnZrbKZzK/JqE0ez3zsuNCZZLwtbljI8wzIKRBhkNgcRGYmcn+HA4DPuz4cbvPAWBlBTh9evtnp09nnzfN0hJw5AgwO5v///v3b//36iqwsLD9s4UF4PbbgX4fUCr788iR7P9uvRXY2Nj67uYmMD+f3Wdpaevz4RA4fBhYX8/64XPnptMyeq5p2U2mfcTMTEOVY4JGK2a6FL22os+JI6YK7vPijFkmoS24jU9GXWbMEqcENgW3vDw9o817mWVm8snyMQkMsrxsPrMvylPRTD/0coIkK4lwWFTugKZsYksTDa7xZUWXTEld+zQZNVWJ3e7dW7+vEsRx6h5OUVR2JubxJnp/qe9cKFJcMFKDwkysaaJvijIZte1FJE0JitJe9LmtV3bdGfPoe2XPnZ83KzsTxzVflXGy/Gxm+oQ4QmEm1jQhmslMTEzFPOTUoWiAsLxcPHCw8cq2EdI6h1OYuuuaDCp8VMaioC0RKiZnnt2CwkysaUI0JU1GbcjtQH1nZvIhRVEcyjyiq8zBRb8b/7vLIMTHrNNkzdlHZbT1XA9E3eojXdSlpy8GFGZiTVPBnFJrsEXl8nSv7088bCJ2+bxchW7yJebF83Z5xnhQ58n7zc1lA4eqNfaiylUWMHo8rQ1VzDoD4dBjQpP7VBV1igPw0FCYiRN1RbONDbKoAz0Hj7Z/m7Vhl1lx3uX6YooGETt3+nvG6DmjytjrTYebyjtzuqjyNTUbt6DO0pFP65ZLm636TTJLVg1DYSZRSK5BGoxEijrQb6PvL7Oma8NFa8xVgjMSt8kZp8tIrMrRq2hW6+KEV+WxPV7WZZWvauATYfRYp6349AdxSUfVbyTuOJQAhZlEIakGaThVKOqEPr6r5nGMJg/p9bb+byROI9OxyXp0We/qat6oGkTkPdP2Waam/fGKVVb5qjzQS/IcaumljnXJ5wDYpc1W/Sa5AXpDUJhJFJJqkIaJzetAb5kb6LPzlscxlmFrhrUx49bMv/Hvynr0soGHyzPy0uoyYx7/fc66+dO9vj4Hpb+Nvr4RAyvxNMFV9H0uGYWYMbdxScsHFOaWI9WBKqkGaTFVmCxv745febNi2wU720rhat6oms3mlUHZjDUvnSam/cmKVRR5bHm5umIazNBPYeGCOEsYaPrqA0zb7OSS/9xc+W+k9lExoTC3GOniF7VB2jw8xiJfnjdz1csMtT5Q1y04z3xeVBHLZsB5s+ai70+GALUx57tu9xq7vo2+l6KXhslOislqWuZOQPKhMLeYpMzFTeJjHTPkIl/e80y2GIV64T5GeKYDocGgXPRMlMByhms1gDF0vjsH1cm2xj7HDxTmFpOUg1WTuIqlqbCMf+/66/OfVbbGbLMlavxlDgbTdkNfp9LnzeBDmTuKZrTj78nUFmpTlibqYTFjlmSdagr2OX6gMLcYjl4LCNV7uM50TdNXdZ/BYHoPr2n86ap82ZrV6zzHJN+mz7PZXmZyP8M15nf2Bp0
TZa3Z5/iCwtxipK8xRyNU7+E60zW9T9WWKx/5MhFhn/Gix8XYNn53nRnuaHuZy4y/rvWgxd5O7HP8QGFuOS3uA9zx1XtMFq6NqJQJQ1H6qgSgStjyPLmrysVGLG0tDjbrv67Pc33XEjclJwL7nPpQmEk3MXEvtXU/LTORj/97bs4sbKRt72YzOMgTA9vBRd0Zc1PPsy3LvHjevsSTtl5iAIWZkElMZjU25mYf0bdMKDsgwuR5tqbkycGG7Z4Yk+eN7pU3sy474cqVwcCvqX4SekeJR8KMn8IsFAmVo0lE5bdsb+woYVWhG8syEqJzdjELTz7PZQY7iiKxY8f2z02czqqeN7ntyWY/tCtlafIhnpwxi0bKSgOFWSBSKkdTiMtvmeiOElangw3RObuI6uTzbMV99PsiC0BRGM2y543KPm9Q04SoVQ246iKuspNxpIybKMwCkVI5mkJcfqtEbiQarluTXJ27yrA1QxeJwaTpokh0ldr6fdlzTMrCNM9leXS5Xx5lSxQ+HcDEmIe2IzhpjSBlpYHCLBCTyiGhAflKg5TGcIGqmeOok64TzMP33mDbQBo26cyb1Y4HSCl7ls+KWiWaPmajJvlNEJPXwMm8nEkChVkgVZVDQgPymQYpjWEbg0H52b5l69AuIlS3EPJm8JOXa9Diql69aFa9a5d9JSl7Vplj1uxscTrGfQN85DcxTNuqyHbYMBL6Vq0pzCKpqhwSGpDPNEhpDFYJcznVqIy6ZoOiE5PGr5mZ8vRNOliZej0XWQ9svc9NtilV5dHHu2gZpm1VnOUqEhLGZRRmoZRVDgkNyHcaJDSGXIoSZmo6znOwsrlf1UinyFvZ9Bo3w+TNuMvM8+N56fWmZ+M2lcR0m1Kdvc9dmvqNYfoaJAz4SQaFOUEkNCAJaYiKqQfzpGNA0QzcxWzgskWqKH1lgjcS3fF/m6yJF93T5vhGkzK0zWuDSBhwmrZV2yrYVN4klGHTUJgTREIDEmt+bpLxgi1bjx5h4jxg86LqzB4nn10nuEhRfvNM3EC+97rNNqWy9f9ez+xdNICUNmKTDtMq2FTeQqQ9BSjMiSKhAbWpIdTGpKB92/9txHR+flokTRwXbK/JvJiuM9tuU/JtfQiAJKuS77baVN5CzfalQ2FuOZI6h9ZT1fvZmHZH9wKKD5+wWeceCVZR+kycx0yfNZ6HKhEfT5vtNqUqD+7Io0YJviChaCpvXV0fpzC3nDZ3DslRZNodbfUp29OcJ1Ym662mL9plxlzmQW2zFjw+040lpgGe3TaxGEfajLlt/Vzjwgzg1QD+EcDjAH6v6vsU5nq0uXNIjsFgesuSyzVu3h2fWdd50bZrzDt2ZPuUR/8eba2qSo9pOusKpc3vA9lB22ZeHUfaGnPb+rlGhRnALIBvAngBgHkADwO4rOw3FOZ6tLlzSAofHtTj12Qgk6IXbRrm01ZM89ary2b6JoONsrKyqbS2vw/YqwuwqAdDkld22/q5poX5lwF8aezf7wHwnrLfUJjr0+bOIQnKvIh9XCPz9vLy1nNmZ7W+/no7l1bTWXNRXkzy6MN7vQqbtXyt22cH7Sht6udshFll33dHKfUmAK/WWr/9/L9vBvBSrfU7in5z8OBBvba2Vuu5hERjOAQOHwZOnw7/rPl54MyZrX8rlUnMJP0+cOzY9OdKVT9jYcE9LwsLwKFDwNGj2++xsAAcOQIsLWX/npnJT7dSwOZm9XOKfg8Ag0H258oKcPw4sH8/cOoUsLEx/d2iciIkMEqpB7TWB02+O+PjeTmfTbUgpdRhpdSaUmrtiSee8PBYQiKxslIuZEplguqDcVEGisXp+PHpz4bDamHu9YBnPav4/2dny3975Ajw4Q9nf/b72fP6/e2iDGRimcfMTJbOKop+DwA33QTcfDOwvp6Vz/q3UpJgAAAba0lEQVQ68MMf5r+DU6fMnkdIRHwI8wkAF4/9ex+Af578ktb6iNb6oNb64HOe8xwPjyUkEnkiOGJhAfjEJ4C7786EK+//B4PsKhM9W/bsyQRncTETu717gd/4jWIhB7L0/ehH+TPLUVoPHy4W9927t8R3aSmbiW5uZn+OizIArK5m95vk3LnsGVViubpa/v+T+Tx7Frjooul3sLFh9jxCYmJq8y66AOwA8C0Al2DL+evyst+0YY059bWP1NMflaL1zrwTj8oK2ldkLkDr3bvtnLQWFsrjcY+ntWwd3DSvWpfvqzZZa7aNHz5KR9Xz2BhIAyDCdqkbADyGzDt7per7qQtz6t6Cqac/Or4K0FdkLttrNICoG+lBKbuzp01jZ9uUe9k1Etuy57ExkIZoXJhtr9SFOfX9damnXwRVsyzX/SChL5ezRk0iiJmcImUbOzuv/Ez3VI+2elV5jLMxkIagMAcm9Z0YqadfPHWi9LuKrYmZd2ZmOsRlVTp9DB5GwmoaO9s1XaOKXRVpbfxebAxR6OLqAYU5MKkPslNPf3Rc42fnrUFPYiLOc3PT5ySbCGhZ/G7bvLgMHvLEMi92ts0pB0VpN30HbAyN09XVAwpzYFKvWKmnPwrjM76yeNJal5tsJ49HHL/vyOya9/udO82OHau7jjuJjYNa1XdHM9nxgCl5B1r4SL/pTDjhxpDqrLOrYyEKcwOk2ihGpJ7+RjGZjZpEuZqcudoeClE3FKdLz2fqCV21pjv+vTIRLHNKM7E4uJRBgo0h4fGE0+pBgq9oCgozIT4xMedWxYWevEzvWzYjLVu39tVrFwmzUtPmdK3LZ811Qnfa5iNl5TIg5Vmnbdrb8iopzKQ2bRihesPEnJu3/ln2fZs41mXXrl35pmFfL9B2elPm4FU1UHEt6yJaXIlT9lkTdB5Jo1CYiRM2y6idwnUWVxXAI9Q+5rx1W995t5nejBy8TLyyba0THSV1sbIZM6U8CBmHwkxyKWsMtsuonaJqa06ZM1aZuITaxzw7a58/m4pRtfVrlIbJ8imzEowqFyuiEW0x75qQ+iBkBIWZTFHVkDlRqcDVLFo0ax4XoiIzRZ3LNA+me4ZdgqXkKUWV4I7WrkflRtNNIS221G+jLYMQCjOZomrU6XNpj4xh06uYeEDPzm71xDMzxd8Zf37Z+kTVwMGUuvuGiwS4K+pDSmlDNaAwkymq1ml8OcOSHEx7FZOZ8LjZoihU5vXXbz3X1VRuax4pG9mNV57l5envmoTzJCRxbITZx7GPJAGKjrMdfZ53Kt/otL+843WbYPwUw8XFSCf1+UhE1ZGINoy/yA9/GLj++unv/PVfZ+msOjfa9Dl1v3/6dJaW4RA4ejST3RFKbf/3OGXHaxLSZkwV3OfVxRlzSFOMryVASeYiEetKTSeiypSd9+yyNQqT9YmZGT95rJqdl8UCN9nfTKaQ1F5JNaApWxYh+3cXp9kUGrIIT0wfMZttGAy03rEj/5m9Xv59y9YoTLdj5XlTjxywbPJUdd5y2UCBTl5W+OxTUuoXUobCLIyQIiNCwAIgYu+iSSJ8j7oGg+0z5yJBHlHmvGWz9ajqu1V5Kvv9KD541UDBZAuaD1qgRL7avQjLVEegMAsjpM
iIELAAiBhwmCQidEKr9hjPzRUL4fjvq2alJrPrsjyV/X5urnhrVtMvuCVK5Kvdi2hnHYHCLAzOmO0R0X+aJCLkyKjO5vPxfcqT+4PzRN5kPbosT1W/37VrK09l6Q49omxJg/GVjbYO7CVCYRaGlDXm1BBhcaxKRMxRV5UYzs9Pz6iLKkfdGbPJHuzRNq7Q5VZGS5TIV7tvyTglCSjMAontlU0CEWpkNBiUi5zJmq2NwJaFy6zKU5FJvSjdIcutihYpkY923+aBvTQozIQ0ie+Rkcla7Oj8Y9cAInnpzAv+AVQ7oNkMEMYFMMaIkko0BQf2zUBhJiRlTIVu1Iu6zpzzBMmll7aN8R2796cSkQjYCDMjf3UcEdG1yHZMI14dP74VVWwwmA7dVsUoItc4RVHKiirKcJh9ZsP6OnD4cL3KVqfi+ozEVhO2P5KLqYL7vDhjlgGtekIxnQH3ett/NxgUR9EqukycnsrWkOuciFW1rls0s21JxW1JNoghoCm7XYSyvJVFSKSVLyIma8x5wlz027JTpExMyyYe1y5X2aCgTLVa4sDVkmwQQyjMLSLkqNpkssMRfCQmI4DZCFveSK5OZK8QolylQC4xwCVsebIYRUvOBvEPhblFxNgmyxG8AEJEyapyFCu6X10B7vXsR5cuMcBjV1TLUbTUbJAw2Agznb+EU+QH5ONEvLyjHkM9i1hicmTj6qrdPUdOT6PzPCcpetG9nt1zxllYAO66Kzs3tN/Pnm1yjmjZOaV5FXdhwb48fJP3zvIc7M4jNRtEAKYK7vPijNmcJkMx8/Q9QVStM+StL5tiW6kGgyyKWFFaxg+fWF72d9JW2exT4pYnB9u0xGyQMICm7PZQ1j81EdfCVxArdj6WVB0KMR7z2lYMXV60acxtn6RWcWibJiVQmFuGqS+PRBFtzZaQUCJRtiWoaItS0ZnNNgWcmuilgMDKztcsBwpzB0hlcO5iNRXXkfjucMedsMqOYqyzTUlaRegKgiqwpHGCoGKJBoW5A0jaalHW6GzSKakj2YbPUZCNt3Wd4B2AWVq63lu2GNdqS6tZGCjMHcC3Vrg2xKpGZ5NOsVYAn6Mg0z1qNt/Nu2Zny9PB3rL1uFTbENVCbLtuGApzB/DVgOrep6rR2dxfkhVgGz57FpNZsFJmQUHqzJjZW7Yel1ccolqUVfkujQMpzB3Bh8mpbkM0EVPTdIrVCtPRhUlGbaO6VAUFqfp9EWJHQcQXLoPuENWirPp2yUhDYSbG1G2IoZdfpxpurHXRqufaiLfJLHjyBZRNO1xMHmJHQcQnRX6GRUdsh6gWVVW+K1WOwkyMqdsQQzks5+qf5HVRm4J0iepSdP/R70d/mg5WJJcl8UrRzrv5effxpUsaTMegbYXCTIzx0RAbm8Q2OMuzzpOr6cHnTFvsiyNNkfdKy0zJVeNGn9Wi60YaCjOxIpn+uaF1UafBSp1ex/QFuMy0myKZSiSbEDskysZyTc5Wu26koTCTqATroxsacjs9puleR5LzVtd7XE+E2iFRNIaLMY7r8viNwkyiEbSPbkgAnDWvyV5Hkl1QUloSJtQOCcB8jZmEw0aYeewj8YrlyXd2LC3ZHx/oQNmJg5XpO3YM2NzM/vScrm1IOjMw5NmkCTMcAouLwMxM9udwWP79usVYVD/7feBjH9t+emevB9x99/YqapteX8R6rmhMFdznxRlzeylbz0qF4BNzXzNrKXZBzpincKlDMXdIxFqN6NIqCGjKJrEoWs+qihBZhxD6FEzzJPZEdTMrMU+RcRHZmDskYo2tujSmozCTaDQ9Y05OE6T1RD5ju0qYvQuhzu65GMUYy5dQkg9jaGyEWWXfb5aDBw/qtbW1xp9LwrO4CKyvT3/e72fLrqk/rzYzM1nfM4lS2dp004QswOEwcy44fjxbAF1dDbvuLojU6mWs9KZWTnVQSj2gtT5o8l06fxGvNO2T5N3vKLQnirNnWSBCOW4Nh8Dhw1mvq3X25+HDnfHskeSbZ0Ks9EosJxHOaKZTa58XTdnxaMJUluyuoSbs4tJs76FM69JM9hEoagdSrf5Sw9A3ScjmCa4xkzykaYIPvOapiehdtt8NTahK0aXFQwva2AbrIKkpaB12PElhbghplaoKH5VOYp69pSl0vGuphHipnDHnwmLZQmKzCTmepDA3gMRKVUXdSpdinvMo1CHXXpO97TRtqSyeoSFhC4nNphUzZgC/DuDvAGwCOGj6uzYIs8RKVUXdNKeY50lK9cJVTDrU21pNriWaVyLThjbkC4nNphVrzABeDOBFAL7eNWGWWKmqqFvpUszzJJUdo4uYdKS3naw/N2Kg11Vfb4LCawoNCVtIbTahxpONm7K7KMxSK1UVg4HWvd5Wens9+dGBfBJkcNGR3nb8/d+IgT6F9uc5BDQkZHSk2VzARpi5j9kRifvvTPnRj7b+vrFhvr005TyPCLKNuKHDNWIzvrX5fVjBLoQ6raTdmJx1ImIvbeD0dKTZuFGl3AC+AuCRnOvXxr7zdVTMmAEcBrAGYG3//v0NjVHCkuLI10eg/FGee73sSin/XRul+2S87pxDC9Y1hCKtjkpLT6qApmxShC9TbsqNNcUBlQTG3/m30U9/XcMDUnaahazTbVjCkgCFmRTiq5GxsXaTkQC8FQP9b2r7yOzf1IL+xnJ3RjlSYrOEHiS3welTAjbCXGuNWSn1eqXUCQC/DOAvlFJfqnM/Eh5f68ShQiz7RtpaXeqM1keHegkP3n4Ex1Ufm1A4hj7ero/gVUeXOlPGKyvZsvo4PpbZbf0gfKYjr71IC+/eCUwV3OfFGXNcfJi9Upgxp2xuT4EU6kBIQs0kbett6OWp5WW2Ix+Akb9IaFIQPTHC0dSidsOL53mCcCMG2fpzBxbwQ9Yvm1fZxPIU/TLqQ2EmjSCxsY6nKa+TaXxtrKkRTISR0mRHLnlvc4i6urw8Xc9Ms+szPb5ePdeSw0JhJp0kr4OKPmNuatoewTwwWd5SPbVDjFny7qlUJtax0tOF5amUoTATKyTOfF0o6liiTuCamoZEmu6M151G9zZbVNoQguN6z8FA69lZmQKYwvJUylCYiTEpNEbTPrjKfB1l0NHiGXO0NFhW2hBjFpd7Vll0JJiM2zJIlwiFmRgzHjdb0uh9hE0fLEGbpmjpGnNuB95UGixftJQZc5VFJ1Q9pdjKgMJMjBgMijsJCaN3re06QLGz/5Z5ZZeWcxNpsJyuNrXGXHXPMlEOVU/FtokOQmEmRpSN4KXMmF2iIHF2EJbolgmHBISoF7b3LFpbBsLV0+jvilzARphV9v1mOXjwoF5bW2v8uWQ7MzNZM81jMJBxysviIrC+Pv15v59FoOoKw2EWyen48Szi0upqvPdTVG+Uyk5MCs5wmB2JNh7uamFB/NFEShX/X6huOPq7IhdQSj2gtT5o8l0e+9hhikLq9Xpy+rc6IUTbEo5zpEPr61knu75uflRnCKxCNAo8L9A1SXWz0u/bfe4DhtNMFNOptc+LpmwZpLL+5GKGTCVvJkgzRxqXrcCX4JokH1mJURwCX0FnA
deYiSltXZOVJmZ18Lndx9f7NrqPwJfgmiRfWYnR3mye2db+QAI2wsw1ZlKKpLVNG9q0tuZrnb3xpVmBL8E1SQKz4p1El+6TgWvMxAvS1jZtaNPamq+jOkMdU1iIwJfgmiSBWfFO4/WDFEJhJoWk3FDb5DRW09fpAo2foe1rROER1yS5/k5aXSojlTPWO4GpzdvnxTXmNEj9tJmuO41NEmXJV+CipWuSbH+XWl0S6BLQKsA1ZuKDLu4hbnOeuYbYLKnVJdaPsHCNmXghlCXSh3kvlImwzeY8XyZxYkZqdYn1Qw6cMZNSfHtl+xiVhxzZpzbLIXJhXSLjcMZMvLG0lHUim5vZn3WFz4dDWUintNVVYH5++2fz81H9lZxJyfGojQj0fSOJQGEmjeLDvBfaRDhpRIpgVKpNqK1uFHtzaBomrlCYSaP42A8ack/pygpw9uz2z86ebX6LWF0BDGFVSGFfu7SBQ57FSVoaiUBM3bd9Xtwu1V2kxxyWsEXMR/5C5EP6dpoUtielkEYSBjBWNpFM1X7Qsv8f/R+wdb6tz+2xEsTHRxpC5EPCoKUMCe+uihTSSMJAYSbJUjajaGK2IWFG40MAQ+QjtqhUDeikDxy0TiONJAwUZiIClwhLZZ1/U8IQO1iV1JOMYg5aTJ4de+BgQgppJGGwEWY6f5EguDoKlXlcpxawwRVf22x8b3WL6WVs4syWwvakFNJIBGCq4D4vzphl4nOGFeLc2yZmGxJM2aN0CAsxHRVTE7BpucUs36aezTokC9CUTWzxLUiua2mx15hpapSJz/ciZfAVki7kMTUozMQa34JU534mXtmhZgF0zpGJT6Hp9do/+OIAUx42wsxY2QRAFuwgryoola1R2pLqSTWMbywXH3Hbh0Pgppvy/8+1rkvEd3sm9WGsbGKN72haqYYjpHOOXHw4s5VFPvMROU4KIaPjkfBQmBPHV3i/EILk2yu4CVIdUBAzyjz42zT44gAzbWjKThjf5mLfRzwSIo2ipYpeD3jyycaTExS2Z1nQlN0RfB9UkOIMlxAbimaSd90V/tm21q261jC253ShMCdMWwNu8PQdEopYSxW2AXdSOMmLhIOm7IRpowdxqt7cXYSmUnNs22ob23bXoSm7I7TRwSPEOcJkC1/WiLwZ3dveBuzdS0tHHrbWrbZaw4gZFOaEaaMHsY8OiabwfHyaR/MGUGfOABsbNL3mYbt9idudug2FOXHa5uBRt0Pi2lwxPq0RJgMlWjq2sLVutdEaRsyhMBNR1O2Q2mwKr2sJqGONmHz2nj31ntk1bK1bbbSGEQtMY3f6vBgrm5RRJx52UaxrIO0A/j5iRbvGT8579tyc1vPzxWU9uno9nnBEiNZ2sbI5YybiqGOeLzN5F5m0U1iT9mEJcLVG5D377Fngoou2ZnS9HjA3t/07c3PA00/bbRGS/h4IaQRTBfd5ccZMQpE3uyubHaZyPJ6vU69crBGuZyHbnOKUynsgxBXwdCnSZWxOEEplv2jMdLo+2+aEI5/54/5qIhHuYyadZmkp69DzmDR1p7JfNKaXruuzbTzsfb0HeuWTNkBhJq3EVExS2S8a00vX9dk2gl71HkzXn9vslU86hKnN2+fFNWbSBCbrqVzb9ENRWZuuaZe9B9N3NBgU+xaUrcXX2QVAiCmwWGOmMBPxhO442THXw0Q4TQdJed8x2eZl6/Rnk3ZCfGAjzHT+IqLhoRbyqXLcqvsOTZzIitJQ9axUnP9I+tg4f1GYiWjYccqnSjjrvkOT3xelAQAGg+IBgI3nOCF1aMwrWyn1fqXUPyil/lYp9Tml1M/UuR8hk6TiNd1lqhy36r5DEyeyojT0++Wz8lSc/0i3qOuVfS+AK7TWVwJ4DMB76ieJkC3YccqnSjjrvkMTr3DXLV08LIJIpJYwa62/rLV+5vw//wbAvvpJImQLdpzyqRJOH++wKkyr65YuHhZBRGLqJVZ1AfjfAG4y+S69sokN9JqWhcv7kP4OpaePpA98emUrpb4C4Ody/mtFa/35899ZAXAQwBt0wQ2VUocBHAaA/fv3X71e5EJJCBFLG73k25gnIo9GvbKVUocA3A7geq316arvA/TKJiRV2ugl38Y8EXnYCPOOmg96NYB3A/hPpqJMCEmXNnrJtzFPJG3qemX/dwAXAbhXKfWQUupPPaSJECKUNnrJtzFPJG3qemVfqrW+WGt94Px1u6+EEULk0UYv+TbmiaQNT5cihBjTxu1FbcwTSRuG5CSEXGA4zI5IPH48M+WurlKgCPFBYyE5CekypmcEp8Jo29D6ehY/en09+3fq+SIkNSjMhDjQRhFbWdm+lxfI/n3oUHsGH4SkAIWZEAeKRGxlJU56fFC0PejcOffBR9usCoQ0AYWZEAfauPfVZHuQzeCjjVYFQpqAwkySQ8IsrA17XyfL8YYbgLm56t+ZDj7aaFUgpAkozCQpJMzChkPg1Knpz132vsYaZOSV45/9Wfb3KkwHH220KhDSBBRmkhSxZ2EjQdvY2P55r2e/9zXmICOvHM+eBZ55Jv/7I2wGH22wKhASAwozSYrYs7A8QQOA3bvt9/vGHGTYlpdL4A1G1CLEDQozSYrYszCfA4OYgwyb8ur3gc3N7KQlm8EHI2oR4gaFmSRF7FmYz4FBzEFGXjnOzQHz89s/q1u2S0uZoLsIOyFdhcJMkiL2LMznwCDmICOvHD/2MeDuu+3LVoKXPCFtgrGyCbHEZzzp1GNTjxzYxtfKFxZosiZkEptY2RRmQogzi4uZN/kk/X5muiaEZPAQC0JII+SJMsC9yoTUgcJMCHFiOMzWovPgXmVC3KEwE0KcWFnJjxSmFPcqE1IHCjMhxIkic7XWdPwipA4UZkKIE0Xm6n6/2XQQ0jYozIQQJ2IHeyGkrVCYCSFOxA72Qkhb2RE7AYSQdFlaohAT4hvOmAkhhBBBUJgJIYQQQVCYCSGEEEFQmAkhhBBBUJgJIYQQQVCYCSGEEEFQmAkhhBBBUJgJIYQQQVCYCSGEEEFQmAkhhBBBUJgJIYQQQSidd9J56Icq9QSA9Zz/2gvgyYaTkyIsJzNYTtWwjMxgOZnBciqmr7V+jskXowhzEUqpNa31wdjpkA7LyQyWUzUsIzNYTmawnPxAUzYhhBAiCAozIYQQIghpwnwkdgISgeVkBsupGpaRGSwnM1hOHhC1xkwIIYR0HWkzZkIIIaTTiBNmpdR7lVJ/q5R6SCn1ZaXU82KnSSJKqfcrpf7hfFl9Tin1M7HTJA2l1K8rpf5OKbWplKKn6ARKqVcrpf5RKfW4Uur3YqdHIkqpu5VS/6qUeiR2WqSilLpYKfU1pdSj59vbnbHTlDrihBnA+7XWV2qtDwD4AoD/GjtBQrkXwBVa6ysBPAbgPZHTI5FHALwBwH2xEyINpdQsgA8B+M8ALgNwo1LqsripEsnHAbw6diKE8wyA39VavxjAywD8NutSPcQJs9b6h2P/3AWAi+A5aK2/rLV+5vw//wbAvpjpkYjW+lGt9T/GTodQ/j2Ax7XW39JanwHwKQC/FjlN4tBa3wfg+7HT
IRmt9Xe11g+e//vTAB4F8Py4qUqbHbETkIdSahXAbwB4CsArIicnBd4G4H/ETgRJiucD+M7Yv08AeGmktJCWoJRaBPASAP83bkrSJoowK6W+AuDncv5rRWv9ea31CoAVpdR7ALwDwO83mkAhVJXT+e+sIDMlDZtMmxRMyojkonI+o3WKOKOU2g3gswDeNWH5JJZEEWat9a8YfvWTAP4CHRXmqnJSSh0C8KsArtcd3fdmUZfIdk4AuHjs3/sA/HOktJDEUUrNIRPlodb6f8ZOT+qIW2NWSv3C2D9fC+AfYqVFMkqpVwN4N4DXaq1Px04PSY77AfyCUuoSpdQ8gLcA+F+R00QSRCmlAHwUwKNa6z+OnZ42IC7AiFLqswBeBGAT2QlUt2utT8ZNlTyUUo8D+HcANs5/9Dda69sjJkkcSqnXA/hvAJ4D4AcAHtJavypuquSglLoBwAcBzAK4W2u9GjlJ4lBK/TmAa5GdmvQ9AL+vtf5o1EQJQyn1HwF8A8D/Q9ZvA8B/0Vp/MV6q0kacMBNCCCFdRpwpmxBCCOkyFGZCCCFEEBRmQgghRBAUZkIIIUQQFGZCCCFEEBRmQgghRBAUZkIIIUQQFGZCCCFEEP8fxlEOKAybH0wAAAAASUVORK5CYII=\n", "text/plain": [ "<Figure size 576x432 with 1 Axes>" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "import numpy as np\n", "import matplotlib.pyplot as plt\n", "from sklearn.preprocessing import scale\n", "from sklearn.model_selection import train_test_split\n", "from sklearn.datasets import make_moons, make_circles\n", "from sklearn.model_selection import train_test_split\n", "\n", "np.random.seed(10)\n", "\n", "# X, Y = make_moons(noise=0.2, random_state=0, n_samples=1000)\n", "X, Y = make_circles(noise=0.2, factor=0.2, random_state=0, n_samples=500)\n", "X = scale(X)\n", "\n", "plt.figure(figsize=(8, 6))\n", "plt.scatter(X[Y==0, 0], X[Y==0, 1], color='b', label='Class 0')\n", "plt.scatter(X[Y==1, 0], X[Y==1, 1], color='r', label='Class 1')\n", "plt.legend();\n", "\n", "\n", "X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=.5)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**b)** Create a neural network using `binary_crossentropy` as loss function and `accuracy` as a metric to evaluate the results and train it using the training set." 
] }, { "cell_type": "code", "execution_count": 147, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "_________________________________________________________________\n", "Layer (type) Output Shape Param # \n", "=================================================================\n", "dense_54 (Dense) (None, 3) 9 \n", "_________________________________________________________________\n", "dense_55 (Dense) (None, 1) 4 \n", "=================================================================\n", "Total params: 13\n", "Trainable params: 13\n", "Non-trainable params: 0\n", "_________________________________________________________________\n" ] } ], "source": [ "import keras\n", "from keras.models import Sequential\n", "from keras.layers import Dense\n", "from keras.optimizers import adam\n", "\n", "model = Sequential()\n", "model.add(Dense(3, input_dim=2, activation='tanh')) \n", "model.add(Dense(1, activation='sigmoid'))\n", "model.compile(optimizer=adam(0.03), loss='binary_crossentropy', metrics=['accuracy'])\n", "\n", "model.summary()" ] }, { "cell_type": "code", "execution_count": 148, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Train on 250 samples, validate on 250 samples\n", "Epoch 1/1000\n", "250/250 [==============================] - 2s 9ms/step - loss: 0.7638 - acc: 0.5080 - val_loss: 0.6990 - val_acc: 0.5680\n", "Epoch 2/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.6867 - acc: 0.5960 - val_loss: 0.6679 - val_acc: 0.6840\n", "Epoch 3/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.6586 - acc: 0.7720 - val_loss: 0.6497 - val_acc: 0.8120\n", "Epoch 4/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.6320 - acc: 0.7880 - val_loss: 0.6263 - val_acc: 0.8000\n", "Epoch 5/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.6089 - acc: 0.8000 - val_loss: 0.6025 - val_acc: 0.8000\n", "Epoch 6/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.5833 - acc: 0.8160 - val_loss: 0.5767 - val_acc: 0.8040\n", "Epoch 7/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.5566 - acc: 0.8280 - val_loss: 0.5542 - val_acc: 0.7960\n", "Epoch 8/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.5348 - acc: 0.8320 - val_loss: 0.5337 - val_acc: 0.8040\n", "Epoch 9/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.5103 - acc: 0.8400 - val_loss: 0.5138 - val_acc: 0.8000\n", "Epoch 10/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.4906 - acc: 0.8400 - val_loss: 0.4989 - val_acc: 0.8040\n", "Epoch 11/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.4736 - acc: 0.8440 - val_loss: 0.4844 - val_acc: 0.8040\n", "Epoch 12/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.4557 - acc: 0.8440 - val_loss: 0.4748 - val_acc: 0.7960\n", "Epoch 13/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.4465 - acc: 0.8280 - val_loss: 0.4667 - val_acc: 0.8000\n", "Epoch 14/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.4327 - acc: 0.8400 - val_loss: 0.4582 - val_acc: 0.8000\n", "Epoch 15/1000\n", "250/250 [==============================] - 0s 224us/step - loss: 0.4250 - acc: 0.8480 - val_loss: 0.4528 - val_acc: 0.8080\n", "Epoch 16/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.4175 
- acc: 0.8480 - val_loss: 0.4484 - val_acc: 0.8040\n", "Epoch 17/1000\n", "250/250 [==============================] - ETA: 0s - loss: 0.2764 - acc: 0.937 - 0s 148us/step - loss: 0.4103 - acc: 0.8400 - val_loss: 0.4444 - val_acc: 0.8080\n", "Epoch 18/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.4079 - acc: 0.8240 - val_loss: 0.4416 - val_acc: 0.7960\n", "Epoch 19/1000\n", "250/250 [==============================] - 0s 256us/step - loss: 0.4008 - acc: 0.8440 - val_loss: 0.4356 - val_acc: 0.8040\n", "Epoch 20/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.3980 - acc: 0.8400 - val_loss: 0.4310 - val_acc: 0.8000\n", "Epoch 21/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.3957 - acc: 0.8400 - val_loss: 0.4286 - val_acc: 0.8040\n", "Epoch 22/1000\n", "250/250 [==============================] - 0s 148us/step - loss: 0.3921 - acc: 0.8320 - val_loss: 0.4236 - val_acc: 0.8040\n", "Epoch 23/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.3897 - acc: 0.8360 - val_loss: 0.4231 - val_acc: 0.8120\n", "Epoch 24/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.3863 - acc: 0.8480 - val_loss: 0.4210 - val_acc: 0.8040\n", "Epoch 25/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.3868 - acc: 0.8440 - val_loss: 0.4199 - val_acc: 0.8160\n", "Epoch 26/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.3858 - acc: 0.8480 - val_loss: 0.4149 - val_acc: 0.8160\n", "Epoch 27/1000\n", "250/250 [==============================] - 0s 248us/step - loss: 0.3830 - acc: 0.8320 - val_loss: 0.4137 - val_acc: 0.8160\n", "Epoch 28/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.3809 - acc: 0.8280 - val_loss: 0.4137 - val_acc: 0.8120\n", "Epoch 29/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.3794 - acc: 0.8440 - val_loss: 0.4151 - val_acc: 0.8080\n", "Epoch 30/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.3784 - acc: 0.8360 - val_loss: 0.4111 - val_acc: 0.8160\n", "Epoch 31/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.3764 - acc: 0.8400 - val_loss: 0.4097 - val_acc: 0.8120\n", "Epoch 32/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.3748 - acc: 0.8400 - val_loss: 0.4112 - val_acc: 0.8240\n", "Epoch 33/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.3760 - acc: 0.8480 - val_loss: 0.4096 - val_acc: 0.8160\n", "Epoch 34/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.3742 - acc: 0.8440 - val_loss: 0.4085 - val_acc: 0.8240\n", "Epoch 35/1000\n", "250/250 [==============================] - 0s 176us/step - loss: 0.3740 - acc: 0.8400 - val_loss: 0.4090 - val_acc: 0.8240\n", "Epoch 36/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.3714 - acc: 0.8440 - val_loss: 0.4071 - val_acc: 0.8160\n", "Epoch 37/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.3724 - acc: 0.8400 - val_loss: 0.4039 - val_acc: 0.8280\n", "Epoch 38/1000\n", "250/250 [==============================] - 0s 172us/step - loss: 0.3696 - acc: 0.8440 - val_loss: 0.4061 - val_acc: 0.8200\n", "Epoch 39/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.3696 - acc: 0.8440 - val_loss: 0.4053 - val_acc: 0.8200\n", "Epoch 40/1000\n", "250/250 [==============================] - 0s 112us/step - 
loss: 0.3732 - acc: 0.8440 - val_loss: 0.4095 - val_acc: 0.8120\n", "Epoch 41/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.3680 - acc: 0.8440 - val_loss: 0.3999 - val_acc: 0.8400\n", "Epoch 42/1000\n", "250/250 [==============================] - 0s 148us/step - loss: 0.3673 - acc: 0.8480 - val_loss: 0.3997 - val_acc: 0.8320\n", "Epoch 43/1000\n", "250/250 [==============================] - 0s 188us/step - loss: 0.3642 - acc: 0.8520 - val_loss: 0.4019 - val_acc: 0.8200\n", "Epoch 44/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.3648 - acc: 0.8560 - val_loss: 0.4018 - val_acc: 0.8240\n", "Epoch 45/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.3641 - acc: 0.8480 - val_loss: 0.3970 - val_acc: 0.8320\n", "Epoch 46/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.3615 - acc: 0.8480 - val_loss: 0.3936 - val_acc: 0.8360\n", "Epoch 47/1000\n", "250/250 [==============================] - 0s 204us/step - loss: 0.3587 - acc: 0.8480 - val_loss: 0.3935 - val_acc: 0.8240\n", "Epoch 48/1000\n", "250/250 [==============================] - 0s 196us/step - loss: 0.3559 - acc: 0.8640 - val_loss: 0.3928 - val_acc: 0.8160\n", "Epoch 49/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.3552 - acc: 0.8560 - val_loss: 0.3921 - val_acc: 0.8200\n", "Epoch 50/1000\n", "250/250 [==============================] - 0s 196us/step - loss: 0.3529 - acc: 0.8560 - val_loss: 0.3833 - val_acc: 0.8320\n", "Epoch 51/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.3480 - acc: 0.8480 - val_loss: 0.3771 - val_acc: 0.8360\n", "Epoch 52/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.3467 - acc: 0.8560 - val_loss: 0.3783 - val_acc: 0.8240\n", "Epoch 53/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.3374 - acc: 0.8600 - val_loss: 0.3679 - val_acc: 0.8320\n", "Epoch 54/1000\n", "250/250 [==============================] - 0s 96us/step - loss: 0.3349 - acc: 0.8520 - val_loss: 0.3589 - val_acc: 0.8320\n", "Epoch 55/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.3279 - acc: 0.8560 - val_loss: 0.3551 - val_acc: 0.8400\n", "Epoch 56/1000\n", "250/250 [==============================] - 0s 176us/step - loss: 0.3211 - acc: 0.8560 - val_loss: 0.3465 - val_acc: 0.8360\n", "Epoch 57/1000\n", "250/250 [==============================] - 0s 168us/step - loss: 0.3106 - acc: 0.8560 - val_loss: 0.3318 - val_acc: 0.8400\n", "Epoch 58/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.3049 - acc: 0.8520 - val_loss: 0.3228 - val_acc: 0.8560\n", "Epoch 59/1000\n", "250/250 [==============================] - 0s 168us/step - loss: 0.2955 - acc: 0.8960 - val_loss: 0.3128 - val_acc: 0.8920\n", "Epoch 60/1000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "250/250 [==============================] - 0s 212us/step - loss: 0.2847 - acc: 0.9240 - val_loss: 0.3024 - val_acc: 0.9120\n", "Epoch 61/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.2722 - acc: 0.9280 - val_loss: 0.2924 - val_acc: 0.9360\n", "Epoch 62/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.2611 - acc: 0.9400 - val_loss: 0.2814 - val_acc: 0.9280\n", "Epoch 63/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.2508 - acc: 0.9560 - val_loss: 0.2695 - val_acc: 0.9400\n", "Epoch 64/1000\n", "250/250 
[==============================] - 0s 120us/step - loss: 0.2386 - acc: 0.9600 - val_loss: 0.2582 - val_acc: 0.9360\n", "Epoch 65/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.2215 - acc: 0.9680 - val_loss: 0.2441 - val_acc: 0.9520\n", "Epoch 66/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.2090 - acc: 0.9800 - val_loss: 0.2313 - val_acc: 0.9600\n", "Epoch 67/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.1962 - acc: 0.9760 - val_loss: 0.2188 - val_acc: 0.9440\n", "Epoch 68/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.1854 - acc: 0.9760 - val_loss: 0.2084 - val_acc: 0.9560\n", "Epoch 69/1000\n", "250/250 [==============================] - 0s 164us/step - loss: 0.1766 - acc: 0.9720 - val_loss: 0.2000 - val_acc: 0.9680\n", "Epoch 70/1000\n", "250/250 [==============================] - 0s 196us/step - loss: 0.1670 - acc: 0.9800 - val_loss: 0.1924 - val_acc: 0.9640\n", "Epoch 71/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.1617 - acc: 0.9800 - val_loss: 0.1863 - val_acc: 0.9640\n", "Epoch 72/1000\n", "250/250 [==============================] - 0s 256us/step - loss: 0.1552 - acc: 0.9760 - val_loss: 0.1806 - val_acc: 0.9680\n", "Epoch 73/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.1509 - acc: 0.9760 - val_loss: 0.1763 - val_acc: 0.9640\n", "Epoch 74/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.1470 - acc: 0.9760 - val_loss: 0.1725 - val_acc: 0.9680\n", "Epoch 75/1000\n", "250/250 [==============================] - 0s 212us/step - loss: 0.1428 - acc: 0.9760 - val_loss: 0.1682 - val_acc: 0.9680\n", "Epoch 76/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.1389 - acc: 0.9760 - val_loss: 0.1648 - val_acc: 0.9640\n", "Epoch 77/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.1368 - acc: 0.9800 - val_loss: 0.1632 - val_acc: 0.9680\n", "Epoch 78/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.1345 - acc: 0.9800 - val_loss: 0.1590 - val_acc: 0.9640\n", "Epoch 79/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.1314 - acc: 0.9760 - val_loss: 0.1569 - val_acc: 0.9680\n", "Epoch 80/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.1291 - acc: 0.9760 - val_loss: 0.1547 - val_acc: 0.9640\n", "Epoch 81/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.1274 - acc: 0.9760 - val_loss: 0.1531 - val_acc: 0.9600\n", "Epoch 82/1000\n", "250/250 [==============================] - 0s 216us/step - loss: 0.1259 - acc: 0.9760 - val_loss: 0.1514 - val_acc: 0.9640\n", "Epoch 83/1000\n", "250/250 [==============================] - 0s 172us/step - loss: 0.1238 - acc: 0.9760 - val_loss: 0.1499 - val_acc: 0.9600\n", "Epoch 84/1000\n", "250/250 [==============================] - 0s 196us/step - loss: 0.1218 - acc: 0.9760 - val_loss: 0.1476 - val_acc: 0.9640\n", "Epoch 85/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.1210 - acc: 0.9760 - val_loss: 0.1464 - val_acc: 0.9600\n", "Epoch 86/1000\n", "250/250 [==============================] - 0s 168us/step - loss: 0.1193 - acc: 0.9760 - val_loss: 0.1447 - val_acc: 0.9600\n", "Epoch 87/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.1180 - acc: 0.9720 - val_loss: 0.1438 - val_acc: 0.9600\n", "Epoch 88/1000\n", "250/250 
[==============================] - 0s 120us/step - loss: 0.1166 - acc: 0.9760 - val_loss: 0.1424 - val_acc: 0.9600\n", "Epoch 89/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.1154 - acc: 0.9760 - val_loss: 0.1420 - val_acc: 0.9600\n", "Epoch 90/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.1147 - acc: 0.9760 - val_loss: 0.1414 - val_acc: 0.9640\n", "Epoch 91/1000\n", "250/250 [==============================] - 0s 208us/step - loss: 0.1135 - acc: 0.9760 - val_loss: 0.1393 - val_acc: 0.9600\n", "Epoch 92/1000\n", "250/250 [==============================] - 0s 180us/step - loss: 0.1128 - acc: 0.9760 - val_loss: 0.1382 - val_acc: 0.9600\n", "Epoch 93/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.1122 - acc: 0.9760 - val_loss: 0.1379 - val_acc: 0.9640\n", "Epoch 94/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.1108 - acc: 0.9720 - val_loss: 0.1362 - val_acc: 0.9600\n", "Epoch 95/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.1097 - acc: 0.9760 - val_loss: 0.1356 - val_acc: 0.9600\n", "Epoch 96/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.1095 - acc: 0.9760 - val_loss: 0.1355 - val_acc: 0.9600\n", "Epoch 97/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.1082 - acc: 0.9760 - val_loss: 0.1344 - val_acc: 0.9600\n", "Epoch 98/1000\n", "250/250 [==============================] - 0s 204us/step - loss: 0.1082 - acc: 0.9760 - val_loss: 0.1343 - val_acc: 0.9600\n", "Epoch 99/1000\n", "250/250 [==============================] - 0s 168us/step - loss: 0.1070 - acc: 0.9760 - val_loss: 0.1330 - val_acc: 0.9600\n", "Epoch 100/1000\n", "250/250 [==============================] - 0s 232us/step - loss: 0.1055 - acc: 0.9760 - val_loss: 0.1320 - val_acc: 0.9600\n", "Epoch 101/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.1052 - acc: 0.9760 - val_loss: 0.1309 - val_acc: 0.9600\n", "Epoch 102/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.1056 - acc: 0.9760 - val_loss: 0.1310 - val_acc: 0.9600\n", "Epoch 103/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.1037 - acc: 0.9760 - val_loss: 0.1299 - val_acc: 0.9600\n", "Epoch 104/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.1044 - acc: 0.9760 - val_loss: 0.1296 - val_acc: 0.9600\n", "Epoch 105/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.1029 - acc: 0.9760 - val_loss: 0.1302 - val_acc: 0.9600\n", "Epoch 106/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.1028 - acc: 0.9760 - val_loss: 0.1308 - val_acc: 0.9640\n", "Epoch 107/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.1018 - acc: 0.9760 - val_loss: 0.1296 - val_acc: 0.9640\n", "Epoch 108/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.1016 - acc: 0.9760 - val_loss: 0.1282 - val_acc: 0.9600\n", "Epoch 109/1000\n", "250/250 [==============================] - 0s 228us/step - loss: 0.1008 - acc: 0.9760 - val_loss: 0.1278 - val_acc: 0.9640\n", "Epoch 110/1000\n", "250/250 [==============================] - 0s 160us/step - loss: 0.1001 - acc: 0.9760 - val_loss: 0.1265 - val_acc: 0.9600\n", "Epoch 111/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0995 - acc: 0.9760 - val_loss: 0.1263 - val_acc: 0.9600\n", "Epoch 112/1000\n", "250/250 
[==============================] - 0s 172us/step - loss: 0.0995 - acc: 0.9760 - val_loss: 0.1271 - val_acc: 0.9600\n", "Epoch 113/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0993 - acc: 0.9760 - val_loss: 0.1259 - val_acc: 0.9600\n", "Epoch 114/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0983 - acc: 0.9760 - val_loss: 0.1266 - val_acc: 0.9640\n", "Epoch 115/1000\n", "250/250 [==============================] - 0s 184us/step - loss: 0.0987 - acc: 0.9760 - val_loss: 0.1255 - val_acc: 0.9640\n", "Epoch 116/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0993 - acc: 0.9760 - val_loss: 0.1240 - val_acc: 0.9600\n", "Epoch 117/1000\n", "250/250 [==============================] - 0s 96us/step - loss: 0.0975 - acc: 0.9760 - val_loss: 0.1244 - val_acc: 0.9600\n", "Epoch 118/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0970 - acc: 0.9760 - val_loss: 0.1238 - val_acc: 0.9600\n", "Epoch 119/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0971 - acc: 0.9760 - val_loss: 0.1234 - val_acc: 0.9640\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Epoch 120/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0962 - acc: 0.9760 - val_loss: 0.1230 - val_acc: 0.9640\n", "Epoch 121/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0968 - acc: 0.9760 - val_loss: 0.1239 - val_acc: 0.9640\n", "Epoch 122/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0963 - acc: 0.9760 - val_loss: 0.1231 - val_acc: 0.9560\n", "Epoch 123/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0964 - acc: 0.9760 - val_loss: 0.1231 - val_acc: 0.9600\n", "Epoch 124/1000\n", "250/250 [==============================] - 0s 172us/step - loss: 0.0948 - acc: 0.9760 - val_loss: 0.1223 - val_acc: 0.9600\n", "Epoch 125/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0956 - acc: 0.9760 - val_loss: 0.1230 - val_acc: 0.9640\n", "Epoch 126/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0945 - acc: 0.9760 - val_loss: 0.1224 - val_acc: 0.9640\n", "Epoch 127/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0949 - acc: 0.9760 - val_loss: 0.1215 - val_acc: 0.9600\n", "Epoch 128/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0940 - acc: 0.9760 - val_loss: 0.1226 - val_acc: 0.9640\n", "Epoch 129/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0940 - acc: 0.9760 - val_loss: 0.1210 - val_acc: 0.9600\n", "Epoch 130/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0944 - acc: 0.9760 - val_loss: 0.1212 - val_acc: 0.9640\n", "Epoch 131/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0934 - acc: 0.9760 - val_loss: 0.1226 - val_acc: 0.9640\n", "Epoch 132/1000\n", "250/250 [==============================] - 0s 172us/step - loss: 0.0934 - acc: 0.9760 - val_loss: 0.1221 - val_acc: 0.9640\n", "Epoch 133/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0928 - acc: 0.9760 - val_loss: 0.1200 - val_acc: 0.9600\n", "Epoch 134/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.0936 - acc: 0.9760 - val_loss: 0.1209 - val_acc: 0.9640\n", "Epoch 135/1000\n", "250/250 [==============================] - 0s 164us/step - loss: 0.0918 - acc: 0.9760 - val_loss: 
0.1200 - val_acc: 0.9640\n", "Epoch 136/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0923 - acc: 0.9760 - val_loss: 0.1187 - val_acc: 0.9560\n", "Epoch 137/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0918 - acc: 0.9760 - val_loss: 0.1196 - val_acc: 0.9560\n", "Epoch 138/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0913 - acc: 0.9760 - val_loss: 0.1204 - val_acc: 0.9640\n", "Epoch 139/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0921 - acc: 0.9760 - val_loss: 0.1205 - val_acc: 0.9640\n", "Epoch 140/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0913 - acc: 0.9760 - val_loss: 0.1196 - val_acc: 0.9640\n", "Epoch 141/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.0924 - acc: 0.9760 - val_loss: 0.1180 - val_acc: 0.9600\n", "Epoch 142/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0902 - acc: 0.9760 - val_loss: 0.1191 - val_acc: 0.9640\n", "Epoch 143/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0917 - acc: 0.9760 - val_loss: 0.1194 - val_acc: 0.9640\n", "Epoch 144/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0907 - acc: 0.9760 - val_loss: 0.1186 - val_acc: 0.9640\n", "Epoch 145/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0903 - acc: 0.9760 - val_loss: 0.1175 - val_acc: 0.9560\n", "Epoch 146/1000\n", "250/250 [==============================] - 0s 200us/step - loss: 0.0906 - acc: 0.9760 - val_loss: 0.1190 - val_acc: 0.9640\n", "Epoch 147/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0904 - acc: 0.9760 - val_loss: 0.1189 - val_acc: 0.9640\n", "Epoch 148/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0899 - acc: 0.9760 - val_loss: 0.1176 - val_acc: 0.9600\n", "Epoch 149/1000\n", "250/250 [==============================] - 0s 192us/step - loss: 0.0901 - acc: 0.9720 - val_loss: 0.1170 - val_acc: 0.9640\n", "Epoch 150/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0892 - acc: 0.9760 - val_loss: 0.1175 - val_acc: 0.9640\n", "Epoch 151/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0883 - acc: 0.9760 - val_loss: 0.1188 - val_acc: 0.9640\n", "Epoch 152/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0893 - acc: 0.9760 - val_loss: 0.1200 - val_acc: 0.9600\n", "Epoch 153/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0901 - acc: 0.9760 - val_loss: 0.1195 - val_acc: 0.9640\n", "Epoch 154/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0882 - acc: 0.9760 - val_loss: 0.1175 - val_acc: 0.9640\n", "Epoch 155/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0880 - acc: 0.9760 - val_loss: 0.1155 - val_acc: 0.9600\n", "Epoch 156/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0886 - acc: 0.9760 - val_loss: 0.1162 - val_acc: 0.9600\n", "Epoch 157/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0886 - acc: 0.9760 - val_loss: 0.1159 - val_acc: 0.9640\n", "Epoch 158/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0878 - acc: 0.9760 - val_loss: 0.1171 - val_acc: 0.9640\n", "Epoch 159/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0873 - acc: 0.9760 - val_loss: 
0.1170 - val_acc: 0.9640\n", "Epoch 160/1000\n", "250/250 [==============================] - 0s 192us/step - loss: 0.0871 - acc: 0.9760 - val_loss: 0.1168 - val_acc: 0.9640\n", "[... per-epoch training log for epochs 161-635 truncated: training loss decreases gradually from about 0.087 to 0.053, training accuracy stays between 0.968 and 0.980, and validation accuracy fluctuates between roughly 0.948 and 0.972 ...]\n", "Epoch 636/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0536 - 
acc: 0.9720 - val_loss: 0.1182 - val_acc: 0.9640\n", "Epoch 637/1000\n", "250/250 [==============================] - 0s 164us/step - loss: 0.0540 - acc: 0.9760 - val_loss: 0.1210 - val_acc: 0.9640\n", "Epoch 638/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0547 - acc: 0.9720 - val_loss: 0.1181 - val_acc: 0.9640\n", "Epoch 639/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0575 - acc: 0.9680 - val_loss: 0.1200 - val_acc: 0.9600\n", "Epoch 640/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0550 - acc: 0.9800 - val_loss: 0.1176 - val_acc: 0.9640\n", "Epoch 641/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0546 - acc: 0.9680 - val_loss: 0.1174 - val_acc: 0.9640\n", "Epoch 642/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0572 - acc: 0.9680 - val_loss: 0.1209 - val_acc: 0.9640\n", "Epoch 643/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0508 - acc: 0.9760 - val_loss: 0.1221 - val_acc: 0.9640\n", "Epoch 644/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0563 - acc: 0.9800 - val_loss: 0.1231 - val_acc: 0.9600\n", "Epoch 645/1000\n", "250/250 [==============================] - 0s 96us/step - loss: 0.0564 - acc: 0.9760 - val_loss: 0.1180 - val_acc: 0.9640\n", "Epoch 646/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0554 - acc: 0.9720 - val_loss: 0.1213 - val_acc: 0.9640\n", "Epoch 647/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.0516 - acc: 0.9760 - val_loss: 0.1238 - val_acc: 0.9600\n", "Epoch 648/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0533 - acc: 0.9800 - val_loss: 0.1202 - val_acc: 0.9640\n", "Epoch 649/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0531 - acc: 0.9720 - val_loss: 0.1185 - val_acc: 0.9640\n", "Epoch 650/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0550 - acc: 0.9720 - val_loss: 0.1202 - val_acc: 0.9640\n", "Epoch 651/1000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "250/250 [==============================] - 0s 112us/step - loss: 0.0532 - acc: 0.9760 - val_loss: 0.1198 - val_acc: 0.9640\n", "Epoch 652/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0514 - acc: 0.9760 - val_loss: 0.1184 - val_acc: 0.9640\n", "Epoch 653/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0564 - acc: 0.9720 - val_loss: 0.1181 - val_acc: 0.9640\n", "Epoch 654/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0669 - acc: 0.9640 - val_loss: 0.1292 - val_acc: 0.9600\n", "Epoch 655/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0472 - acc: 0.9800 - val_loss: 0.1203 - val_acc: 0.9640\n", "Epoch 656/1000\n", "250/250 [==============================] - 0s 148us/step - loss: 0.0637 - acc: 0.9800 - val_loss: 0.1190 - val_acc: 0.9600\n", "Epoch 657/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0572 - acc: 0.9720 - val_loss: 0.1249 - val_acc: 0.9600\n", "Epoch 658/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0555 - acc: 0.9720 - val_loss: 0.1217 - val_acc: 0.9640\n", "Epoch 659/1000\n", "250/250 [==============================] - 0s 160us/step - loss: 0.0520 - acc: 0.9760 - val_loss: 0.1221 - val_acc: 0.9640\n", "Epoch 660/1000\n", "250/250 
[==============================] - 0s 112us/step - loss: 0.0526 - acc: 0.9800 - val_loss: 0.1230 - val_acc: 0.9640\n", "Epoch 661/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0532 - acc: 0.9680 - val_loss: 0.1215 - val_acc: 0.9640\n", "Epoch 662/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0532 - acc: 0.9720 - val_loss: 0.1222 - val_acc: 0.9640\n", "Epoch 663/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0529 - acc: 0.9760 - val_loss: 0.1251 - val_acc: 0.9640\n", "Epoch 664/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0503 - acc: 0.9760 - val_loss: 0.1215 - val_acc: 0.9640\n", "Epoch 665/1000\n", "250/250 [==============================] - 0s 176us/step - loss: 0.0528 - acc: 0.9720 - val_loss: 0.1209 - val_acc: 0.9640\n", "Epoch 666/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0527 - acc: 0.9720 - val_loss: 0.1219 - val_acc: 0.9640\n", "Epoch 667/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0520 - acc: 0.9720 - val_loss: 0.1199 - val_acc: 0.9640\n", "Epoch 668/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0513 - acc: 0.9680 - val_loss: 0.1233 - val_acc: 0.9640\n", "Epoch 669/1000\n", "250/250 [==============================] - 0s 204us/step - loss: 0.0522 - acc: 0.9720 - val_loss: 0.1237 - val_acc: 0.9640\n", "Epoch 670/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0507 - acc: 0.9800 - val_loss: 0.1217 - val_acc: 0.9640\n", "Epoch 671/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0519 - acc: 0.9760 - val_loss: 0.1202 - val_acc: 0.9640\n", "Epoch 672/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0535 - acc: 0.9720 - val_loss: 0.1202 - val_acc: 0.9640\n", "Epoch 673/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0570 - acc: 0.9680 - val_loss: 0.1263 - val_acc: 0.9640\n", "Epoch 674/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0521 - acc: 0.9800 - val_loss: 0.1250 - val_acc: 0.9600\n", "Epoch 675/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0512 - acc: 0.9720 - val_loss: 0.1218 - val_acc: 0.9640\n", "Epoch 676/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0525 - acc: 0.9720 - val_loss: 0.1236 - val_acc: 0.9600\n", "Epoch 677/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0534 - acc: 0.9720 - val_loss: 0.1224 - val_acc: 0.9640\n", "Epoch 678/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0529 - acc: 0.9680 - val_loss: 0.1269 - val_acc: 0.9560\n", "Epoch 679/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0547 - acc: 0.9720 - val_loss: 0.1208 - val_acc: 0.9640\n", "Epoch 680/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0522 - acc: 0.9760 - val_loss: 0.1229 - val_acc: 0.9640\n", "Epoch 681/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.0507 - acc: 0.9760 - val_loss: 0.1251 - val_acc: 0.9600\n", "Epoch 682/1000\n", "250/250 [==============================] - 0s 148us/step - loss: 0.0535 - acc: 0.9720 - val_loss: 0.1238 - val_acc: 0.9640\n", "Epoch 683/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0547 - acc: 0.9760 - val_loss: 0.1250 - val_acc: 0.9600\n", "Epoch 684/1000\n", "250/250 
[==============================] - 0s 116us/step - loss: 0.0525 - acc: 0.9800 - val_loss: 0.1233 - val_acc: 0.9600\n", "Epoch 685/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0530 - acc: 0.9680 - val_loss: 0.1230 - val_acc: 0.9640\n", "Epoch 686/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0517 - acc: 0.9720 - val_loss: 0.1220 - val_acc: 0.9640\n", "Epoch 687/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0513 - acc: 0.9720 - val_loss: 0.1233 - val_acc: 0.9600\n", "Epoch 688/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0514 - acc: 0.9720 - val_loss: 0.1261 - val_acc: 0.9560\n", "Epoch 689/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0530 - acc: 0.9760 - val_loss: 0.1246 - val_acc: 0.9600\n", "Epoch 690/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0507 - acc: 0.9760 - val_loss: 0.1236 - val_acc: 0.9640\n", "Epoch 691/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0551 - acc: 0.9720 - val_loss: 0.1228 - val_acc: 0.9600\n", "Epoch 692/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0511 - acc: 0.9720 - val_loss: 0.1239 - val_acc: 0.9600\n", "Epoch 693/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0514 - acc: 0.9760 - val_loss: 0.1244 - val_acc: 0.9640\n", "Epoch 694/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0495 - acc: 0.9800 - val_loss: 0.1238 - val_acc: 0.9600\n", "Epoch 695/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0510 - acc: 0.9680 - val_loss: 0.1234 - val_acc: 0.9600\n", "Epoch 696/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0496 - acc: 0.9800 - val_loss: 0.1255 - val_acc: 0.9600\n", "Epoch 697/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0514 - acc: 0.9760 - val_loss: 0.1266 - val_acc: 0.9600\n", "Epoch 698/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0512 - acc: 0.9760 - val_loss: 0.1257 - val_acc: 0.9600\n", "Epoch 699/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0505 - acc: 0.9720 - val_loss: 0.1245 - val_acc: 0.9600\n", "Epoch 700/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0517 - acc: 0.9760 - val_loss: 0.1248 - val_acc: 0.9600\n", "Epoch 701/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0513 - acc: 0.9760 - val_loss: 0.1229 - val_acc: 0.9600\n", "Epoch 702/1000\n", "250/250 [==============================] - 0s 176us/step - loss: 0.0500 - acc: 0.9680 - val_loss: 0.1242 - val_acc: 0.9560\n", "Epoch 703/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0509 - acc: 0.9720 - val_loss: 0.1257 - val_acc: 0.9560\n", "Epoch 704/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0511 - acc: 0.9760 - val_loss: 0.1284 - val_acc: 0.9600\n", "Epoch 705/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0521 - acc: 0.9720 - val_loss: 0.1239 - val_acc: 0.9600\n", "Epoch 706/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0530 - acc: 0.9760 - val_loss: 0.1253 - val_acc: 0.9560\n", "Epoch 707/1000\n", "250/250 [==============================] - 0s 92us/step - loss: 0.0518 - acc: 0.9720 - val_loss: 0.1227 - val_acc: 0.9600\n", "Epoch 708/1000\n", "250/250 
[==============================] - 0s 112us/step - loss: 0.0498 - acc: 0.9760 - val_loss: 0.1243 - val_acc: 0.9640\n", "Epoch 709/1000\n", "250/250 [==============================] - ETA: 0s - loss: 0.0993 - acc: 0.968 - 0s 112us/step - loss: 0.0507 - acc: 0.9800 - val_loss: 0.1271 - val_acc: 0.9560\n", "Epoch 710/1000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "250/250 [==============================] - 0s 104us/step - loss: 0.0544 - acc: 0.9800 - val_loss: 0.1286 - val_acc: 0.9520\n", "Epoch 711/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0504 - acc: 0.9720 - val_loss: 0.1240 - val_acc: 0.9560\n", "Epoch 712/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0502 - acc: 0.9680 - val_loss: 0.1235 - val_acc: 0.9600\n", "Epoch 713/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0528 - acc: 0.9720 - val_loss: 0.1268 - val_acc: 0.9600\n", "Epoch 714/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0495 - acc: 0.9800 - val_loss: 0.1250 - val_acc: 0.9560\n", "Epoch 715/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0538 - acc: 0.9680 - val_loss: 0.1266 - val_acc: 0.9560\n", "Epoch 716/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0509 - acc: 0.9800 - val_loss: 0.1265 - val_acc: 0.9600\n", "Epoch 717/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0501 - acc: 0.9720 - val_loss: 0.1237 - val_acc: 0.9560\n", "Epoch 718/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0525 - acc: 0.9720 - val_loss: 0.1271 - val_acc: 0.9560\n", "Epoch 719/1000\n", "250/250 [==============================] - 0s 312us/step - loss: 0.0501 - acc: 0.9760 - val_loss: 0.1265 - val_acc: 0.9560\n", "Epoch 720/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0512 - acc: 0.9720 - val_loss: 0.1238 - val_acc: 0.9600\n", "Epoch 721/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0503 - acc: 0.9760 - val_loss: 0.1275 - val_acc: 0.9560\n", "Epoch 722/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0510 - acc: 0.9680 - val_loss: 0.1256 - val_acc: 0.9560\n", "Epoch 723/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0513 - acc: 0.9720 - val_loss: 0.1256 - val_acc: 0.9640\n", "Epoch 724/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0519 - acc: 0.9760 - val_loss: 0.1294 - val_acc: 0.9560\n", "Epoch 725/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0526 - acc: 0.9680 - val_loss: 0.1274 - val_acc: 0.9520\n", "Epoch 726/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0500 - acc: 0.9800 - val_loss: 0.1273 - val_acc: 0.9600\n", "Epoch 727/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0580 - acc: 0.9680 - val_loss: 0.1243 - val_acc: 0.9520\n", "Epoch 728/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0480 - acc: 0.9720 - val_loss: 0.1275 - val_acc: 0.9560\n", "Epoch 729/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0528 - acc: 0.9720 - val_loss: 0.1282 - val_acc: 0.9520\n", "Epoch 730/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0514 - acc: 0.9760 - val_loss: 0.1286 - val_acc: 0.9560\n", "Epoch 731/1000\n", "250/250 [==============================] - 0s 112us/step - 
loss: 0.0483 - acc: 0.9800 - val_loss: 0.1251 - val_acc: 0.9600\n", "Epoch 732/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0527 - acc: 0.9800 - val_loss: 0.1246 - val_acc: 0.9520\n", "Epoch 733/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0523 - acc: 0.9760 - val_loss: 0.1238 - val_acc: 0.9600\n", "Epoch 734/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0567 - acc: 0.9760 - val_loss: 0.1303 - val_acc: 0.9560\n", "Epoch 735/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0508 - acc: 0.9760 - val_loss: 0.1260 - val_acc: 0.9560\n", "Epoch 736/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0509 - acc: 0.9680 - val_loss: 0.1243 - val_acc: 0.9560\n", "Epoch 737/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0520 - acc: 0.9720 - val_loss: 0.1244 - val_acc: 0.9600\n", "Epoch 738/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0514 - acc: 0.9720 - val_loss: 0.1271 - val_acc: 0.9560\n", "Epoch 739/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0507 - acc: 0.9760 - val_loss: 0.1284 - val_acc: 0.9560\n", "Epoch 740/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0519 - acc: 0.9760 - val_loss: 0.1261 - val_acc: 0.9600\n", "Epoch 741/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.0487 - acc: 0.9800 - val_loss: 0.1258 - val_acc: 0.9600\n", "Epoch 742/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0546 - acc: 0.9760 - val_loss: 0.1283 - val_acc: 0.9560\n", "Epoch 743/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0544 - acc: 0.9680 - val_loss: 0.1285 - val_acc: 0.9520\n", "Epoch 744/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0488 - acc: 0.9720 - val_loss: 0.1271 - val_acc: 0.9560\n", "Epoch 745/1000\n", "250/250 [==============================] - 0s 196us/step - loss: 0.0542 - acc: 0.9680 - val_loss: 0.1275 - val_acc: 0.9480\n", "Epoch 746/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0539 - acc: 0.9760 - val_loss: 0.1291 - val_acc: 0.9600\n", "Epoch 747/1000\n", "250/250 [==============================] - 0s 172us/step - loss: 0.0508 - acc: 0.9720 - val_loss: 0.1282 - val_acc: 0.9560\n", "Epoch 748/1000\n", "250/250 [==============================] - 0s 160us/step - loss: 0.0513 - acc: 0.9720 - val_loss: 0.1262 - val_acc: 0.9600\n", "Epoch 749/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0523 - acc: 0.9800 - val_loss: 0.1278 - val_acc: 0.9600\n", "Epoch 750/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0497 - acc: 0.9720 - val_loss: 0.1271 - val_acc: 0.9520\n", "Epoch 751/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0496 - acc: 0.9640 - val_loss: 0.1249 - val_acc: 0.9600\n", "Epoch 752/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0525 - acc: 0.9720 - val_loss: 0.1279 - val_acc: 0.9600\n", "Epoch 753/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0595 - acc: 0.9720 - val_loss: 0.1252 - val_acc: 0.9520\n", "Epoch 754/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0597 - acc: 0.9680 - val_loss: 0.1307 - val_acc: 0.9560\n", "Epoch 755/1000\n", "250/250 [==============================] - 0s 96us/step - 
loss: 0.0533 - acc: 0.9760 - val_loss: 0.1318 - val_acc: 0.9480\n", "Epoch 756/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0515 - acc: 0.9680 - val_loss: 0.1275 - val_acc: 0.9560\n", "Epoch 757/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0499 - acc: 0.9680 - val_loss: 0.1297 - val_acc: 0.9520\n", "Epoch 758/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0509 - acc: 0.9800 - val_loss: 0.1297 - val_acc: 0.9520\n", "Epoch 759/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0494 - acc: 0.9760 - val_loss: 0.1291 - val_acc: 0.9520\n", "Epoch 760/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0498 - acc: 0.9720 - val_loss: 0.1279 - val_acc: 0.9560\n", "Epoch 761/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0508 - acc: 0.9720 - val_loss: 0.1281 - val_acc: 0.9560\n", "Epoch 762/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.0502 - acc: 0.9760 - val_loss: 0.1314 - val_acc: 0.9560\n", "Epoch 763/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0507 - acc: 0.9720 - val_loss: 0.1275 - val_acc: 0.9560\n", "Epoch 764/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0541 - acc: 0.9680 - val_loss: 0.1300 - val_acc: 0.9560\n", "Epoch 765/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0506 - acc: 0.9760 - val_loss: 0.1286 - val_acc: 0.9560\n", "Epoch 766/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0511 - acc: 0.9800 - val_loss: 0.1314 - val_acc: 0.9520\n", "Epoch 767/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0497 - acc: 0.9800 - val_loss: 0.1313 - val_acc: 0.9520\n", "Epoch 768/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0486 - acc: 0.9760 - val_loss: 0.1269 - val_acc: 0.9600\n", "Epoch 769/1000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "250/250 [==============================] - 0s 108us/step - loss: 0.0514 - acc: 0.9680 - val_loss: 0.1274 - val_acc: 0.9520\n", "Epoch 770/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0489 - acc: 0.9680 - val_loss: 0.1274 - val_acc: 0.9600\n", "Epoch 771/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0522 - acc: 0.9760 - val_loss: 0.1319 - val_acc: 0.9520\n", "Epoch 772/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0495 - acc: 0.9760 - val_loss: 0.1300 - val_acc: 0.9520\n", "Epoch 773/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0503 - acc: 0.9680 - val_loss: 0.1291 - val_acc: 0.9560\n", "Epoch 774/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0501 - acc: 0.9760 - val_loss: 0.1276 - val_acc: 0.9600\n", "Epoch 775/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0512 - acc: 0.9720 - val_loss: 0.1300 - val_acc: 0.9520\n", "Epoch 776/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0478 - acc: 0.9800 - val_loss: 0.1290 - val_acc: 0.9560\n", "Epoch 777/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0498 - acc: 0.9760 - val_loss: 0.1262 - val_acc: 0.9600\n", "Epoch 778/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0521 - acc: 0.9760 - val_loss: 0.1293 - val_acc: 0.9560\n", "Epoch 779/1000\n", 
"250/250 [==============================] - 0s 124us/step - loss: 0.0540 - acc: 0.9760 - val_loss: 0.1302 - val_acc: 0.9440\n", "Epoch 780/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.0484 - acc: 0.9720 - val_loss: 0.1283 - val_acc: 0.9600\n", "Epoch 781/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0505 - acc: 0.9800 - val_loss: 0.1284 - val_acc: 0.9600\n", "Epoch 782/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0531 - acc: 0.9760 - val_loss: 0.1287 - val_acc: 0.9520\n", "Epoch 783/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0538 - acc: 0.9680 - val_loss: 0.1315 - val_acc: 0.9520\n", "Epoch 784/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0496 - acc: 0.9760 - val_loss: 0.1272 - val_acc: 0.9520\n", "Epoch 785/1000\n", "250/250 [==============================] - 0s 184us/step - loss: 0.0505 - acc: 0.9720 - val_loss: 0.1278 - val_acc: 0.9600\n", "Epoch 786/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0514 - acc: 0.9680 - val_loss: 0.1287 - val_acc: 0.9560\n", "Epoch 787/1000\n", "250/250 [==============================] - 0s 168us/step - loss: 0.0489 - acc: 0.9760 - val_loss: 0.1325 - val_acc: 0.9480\n", "Epoch 788/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0511 - acc: 0.9760 - val_loss: 0.1317 - val_acc: 0.9440\n", "Epoch 789/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0510 - acc: 0.9760 - val_loss: 0.1293 - val_acc: 0.9560\n", "Epoch 790/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0500 - acc: 0.9800 - val_loss: 0.1294 - val_acc: 0.9560\n", "Epoch 791/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0511 - acc: 0.9680 - val_loss: 0.1296 - val_acc: 0.9480\n", "Epoch 792/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0502 - acc: 0.9680 - val_loss: 0.1292 - val_acc: 0.9520\n", "Epoch 793/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0508 - acc: 0.9680 - val_loss: 0.1303 - val_acc: 0.9520\n", "Epoch 794/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0490 - acc: 0.9760 - val_loss: 0.1321 - val_acc: 0.9560\n", "Epoch 795/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0517 - acc: 0.9720 - val_loss: 0.1301 - val_acc: 0.9520\n", "Epoch 796/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0483 - acc: 0.9680 - val_loss: 0.1305 - val_acc: 0.9560\n", "Epoch 797/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0541 - acc: 0.9760 - val_loss: 0.1311 - val_acc: 0.9560\n", "Epoch 798/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0530 - acc: 0.9680 - val_loss: 0.1295 - val_acc: 0.9480\n", "Epoch 799/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0542 - acc: 0.9720 - val_loss: 0.1300 - val_acc: 0.9560\n", "Epoch 800/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0496 - acc: 0.9760 - val_loss: 0.1291 - val_acc: 0.9560\n", "Epoch 801/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0498 - acc: 0.9760 - val_loss: 0.1272 - val_acc: 0.9600\n", "Epoch 802/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0502 - acc: 0.9720 - val_loss: 0.1317 - val_acc: 0.9480\n", "Epoch 803/1000\n", 
"250/250 [==============================] - 0s 112us/step - loss: 0.0539 - acc: 0.9760 - val_loss: 0.1299 - val_acc: 0.9600\n", "Epoch 804/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0492 - acc: 0.9760 - val_loss: 0.1307 - val_acc: 0.9480\n", "Epoch 805/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0500 - acc: 0.9760 - val_loss: 0.1259 - val_acc: 0.9560\n", "Epoch 806/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0501 - acc: 0.9720 - val_loss: 0.1307 - val_acc: 0.9520\n", "Epoch 807/1000\n", "250/250 [==============================] - 0s 96us/step - loss: 0.0482 - acc: 0.9760 - val_loss: 0.1314 - val_acc: 0.9520\n", "Epoch 808/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0496 - acc: 0.9720 - val_loss: 0.1333 - val_acc: 0.9560\n", "Epoch 809/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0502 - acc: 0.9760 - val_loss: 0.1310 - val_acc: 0.9520\n", "Epoch 810/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0493 - acc: 0.9800 - val_loss: 0.1282 - val_acc: 0.9600\n", "Epoch 811/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0499 - acc: 0.9760 - val_loss: 0.1321 - val_acc: 0.9480\n", "Epoch 812/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0496 - acc: 0.9680 - val_loss: 0.1321 - val_acc: 0.9480\n", "Epoch 813/1000\n", "250/250 [==============================] - 0s 148us/step - loss: 0.0500 - acc: 0.9720 - val_loss: 0.1276 - val_acc: 0.9600\n", "Epoch 814/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0493 - acc: 0.9760 - val_loss: 0.1282 - val_acc: 0.9560\n", "Epoch 815/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0494 - acc: 0.9680 - val_loss: 0.1294 - val_acc: 0.9440\n", "Epoch 816/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0500 - acc: 0.9720 - val_loss: 0.1291 - val_acc: 0.9560\n", "Epoch 817/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0512 - acc: 0.9760 - val_loss: 0.1319 - val_acc: 0.9560\n", "Epoch 818/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0512 - acc: 0.9760 - val_loss: 0.1284 - val_acc: 0.9520\n", "Epoch 819/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0484 - acc: 0.9680 - val_loss: 0.1286 - val_acc: 0.9600\n", "Epoch 820/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0485 - acc: 0.9760 - val_loss: 0.1296 - val_acc: 0.9520\n", "Epoch 821/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0481 - acc: 0.9760 - val_loss: 0.1297 - val_acc: 0.9560\n", "Epoch 822/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0506 - acc: 0.9720 - val_loss: 0.1293 - val_acc: 0.9480\n", "Epoch 823/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.0485 - acc: 0.9800 - val_loss: 0.1289 - val_acc: 0.9560\n", "Epoch 824/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0486 - acc: 0.9760 - val_loss: 0.1325 - val_acc: 0.9560\n", "Epoch 825/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0482 - acc: 0.9760 - val_loss: 0.1313 - val_acc: 0.9520\n", "Epoch 826/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0486 - acc: 0.9680 - val_loss: 0.1320 - val_acc: 0.9520\n", "Epoch 827/1000\n", 
"250/250 [==============================] - 0s 132us/step - loss: 0.0482 - acc: 0.9800 - val_loss: 0.1313 - val_acc: 0.9560\n", "Epoch 828/1000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "250/250 [==============================] - 0s 116us/step - loss: 0.0489 - acc: 0.9800 - val_loss: 0.1325 - val_acc: 0.9480\n", "Epoch 829/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0514 - acc: 0.9680 - val_loss: 0.1294 - val_acc: 0.9560\n", "Epoch 830/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0494 - acc: 0.9680 - val_loss: 0.1279 - val_acc: 0.9600\n", "Epoch 831/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0500 - acc: 0.9760 - val_loss: 0.1299 - val_acc: 0.9560\n", "Epoch 832/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.0491 - acc: 0.9800 - val_loss: 0.1304 - val_acc: 0.9520\n", "Epoch 833/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0489 - acc: 0.9800 - val_loss: 0.1331 - val_acc: 0.9480\n", "Epoch 834/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0526 - acc: 0.9720 - val_loss: 0.1277 - val_acc: 0.9520\n", "Epoch 835/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0480 - acc: 0.9720 - val_loss: 0.1287 - val_acc: 0.9600\n", "Epoch 836/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0510 - acc: 0.9760 - val_loss: 0.1322 - val_acc: 0.9480\n", "Epoch 837/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0520 - acc: 0.9760 - val_loss: 0.1336 - val_acc: 0.9440\n", "Epoch 838/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0514 - acc: 0.9720 - val_loss: 0.1320 - val_acc: 0.9560\n", "Epoch 839/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0501 - acc: 0.9720 - val_loss: 0.1300 - val_acc: 0.9560\n", "Epoch 840/1000\n", "250/250 [==============================] - 0s 204us/step - loss: 0.0486 - acc: 0.9800 - val_loss: 0.1306 - val_acc: 0.9480\n", "Epoch 841/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0505 - acc: 0.9760 - val_loss: 0.1345 - val_acc: 0.9440\n", "Epoch 842/1000\n", "250/250 [==============================] - 0s 164us/step - loss: 0.0497 - acc: 0.9840 - val_loss: 0.1308 - val_acc: 0.9560\n", "Epoch 843/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0507 - acc: 0.9760 - val_loss: 0.1306 - val_acc: 0.9560\n", "Epoch 844/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0513 - acc: 0.9680 - val_loss: 0.1302 - val_acc: 0.9520\n", "Epoch 845/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0497 - acc: 0.9680 - val_loss: 0.1312 - val_acc: 0.9480\n", "Epoch 846/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0491 - acc: 0.9800 - val_loss: 0.1271 - val_acc: 0.9600\n", "Epoch 847/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0497 - acc: 0.9760 - val_loss: 0.1332 - val_acc: 0.9520\n", "Epoch 848/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0507 - acc: 0.9760 - val_loss: 0.1318 - val_acc: 0.9560\n", "Epoch 849/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0467 - acc: 0.9800 - val_loss: 0.1320 - val_acc: 0.9560\n", "Epoch 850/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0512 - acc: 0.9800 - 
val_loss: 0.1317 - val_acc: 0.9480\n", "Epoch 851/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0487 - acc: 0.9760 - val_loss: 0.1301 - val_acc: 0.9560\n", "Epoch 852/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0476 - acc: 0.9760 - val_loss: 0.1315 - val_acc: 0.9520\n", "Epoch 853/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0488 - acc: 0.9760 - val_loss: 0.1321 - val_acc: 0.9560\n", "Epoch 854/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0499 - acc: 0.9760 - val_loss: 0.1326 - val_acc: 0.9480\n", "Epoch 855/1000\n", "250/250 [==============================] - 0s 96us/step - loss: 0.0489 - acc: 0.9680 - val_loss: 0.1320 - val_acc: 0.9480\n", "Epoch 856/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0480 - acc: 0.9800 - val_loss: 0.1286 - val_acc: 0.9600\n", "Epoch 857/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0491 - acc: 0.9800 - val_loss: 0.1298 - val_acc: 0.9600\n", "Epoch 858/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0493 - acc: 0.9760 - val_loss: 0.1342 - val_acc: 0.9480\n", "Epoch 859/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0478 - acc: 0.9720 - val_loss: 0.1317 - val_acc: 0.9520\n", "Epoch 860/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0538 - acc: 0.9720 - val_loss: 0.1307 - val_acc: 0.9560\n", "Epoch 861/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0468 - acc: 0.9760 - val_loss: 0.1291 - val_acc: 0.9560\n", "Epoch 862/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0503 - acc: 0.9800 - val_loss: 0.1290 - val_acc: 0.9480\n", "Epoch 863/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0488 - acc: 0.9760 - val_loss: 0.1281 - val_acc: 0.9600\n", "Epoch 864/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0482 - acc: 0.9760 - val_loss: 0.1349 - val_acc: 0.9480\n", "Epoch 865/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0516 - acc: 0.9760 - val_loss: 0.1321 - val_acc: 0.9480\n", "Epoch 866/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0474 - acc: 0.9840 - val_loss: 0.1309 - val_acc: 0.9600\n", "Epoch 867/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0497 - acc: 0.9720 - val_loss: 0.1341 - val_acc: 0.9560\n", "Epoch 868/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.0506 - acc: 0.9760 - val_loss: 0.1347 - val_acc: 0.9520\n", "Epoch 869/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0472 - acc: 0.9760 - val_loss: 0.1313 - val_acc: 0.9480\n", "Epoch 870/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0482 - acc: 0.9720 - val_loss: 0.1306 - val_acc: 0.9480\n", "Epoch 871/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0483 - acc: 0.9680 - val_loss: 0.1285 - val_acc: 0.9560\n", "Epoch 872/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0535 - acc: 0.9720 - val_loss: 0.1344 - val_acc: 0.9520\n", "Epoch 873/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0481 - acc: 0.9760 - val_loss: 0.1341 - val_acc: 0.9480\n", "Epoch 874/1000\n", "250/250 [==============================] - 0s 200us/step - loss: 0.0527 - acc: 0.9760 - 
val_loss: 0.1291 - val_acc: 0.9560\n", "Epoch 875/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0482 - acc: 0.9760 - val_loss: 0.1334 - val_acc: 0.9560\n", "Epoch 876/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0549 - acc: 0.9760 - val_loss: 0.1333 - val_acc: 0.9520\n", "Epoch 877/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0520 - acc: 0.9680 - val_loss: 0.1326 - val_acc: 0.9440\n", "Epoch 878/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0476 - acc: 0.9760 - val_loss: 0.1306 - val_acc: 0.9560\n", "Epoch 879/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0529 - acc: 0.9720 - val_loss: 0.1322 - val_acc: 0.9520\n", "Epoch 880/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0508 - acc: 0.9720 - val_loss: 0.1345 - val_acc: 0.9440\n", "Epoch 881/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0495 - acc: 0.9760 - val_loss: 0.1294 - val_acc: 0.9600\n", "Epoch 882/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0486 - acc: 0.9760 - val_loss: 0.1302 - val_acc: 0.9520\n", "Epoch 883/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0483 - acc: 0.9760 - val_loss: 0.1307 - val_acc: 0.9480\n", "Epoch 884/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0495 - acc: 0.9720 - val_loss: 0.1340 - val_acc: 0.9520\n", "Epoch 885/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0469 - acc: 0.9760 - val_loss: 0.1297 - val_acc: 0.9560\n", "Epoch 886/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0495 - acc: 0.9680 - val_loss: 0.1297 - val_acc: 0.9520\n", "Epoch 887/1000\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "250/250 [==============================] - 0s 104us/step - loss: 0.0476 - acc: 0.9720 - val_loss: 0.1305 - val_acc: 0.9560\n", "Epoch 888/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0500 - acc: 0.9760 - val_loss: 0.1297 - val_acc: 0.9520\n", "Epoch 889/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0465 - acc: 0.9760 - val_loss: 0.1299 - val_acc: 0.9560\n", "Epoch 890/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0477 - acc: 0.9760 - val_loss: 0.1308 - val_acc: 0.9520\n", "Epoch 891/1000\n", "250/250 [==============================] - 0s 160us/step - loss: 0.0472 - acc: 0.9760 - val_loss: 0.1329 - val_acc: 0.9480\n", "Epoch 892/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0477 - acc: 0.9760 - val_loss: 0.1305 - val_acc: 0.9600\n", "Epoch 893/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0479 - acc: 0.9760 - val_loss: 0.1358 - val_acc: 0.9560\n", "Epoch 894/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0498 - acc: 0.9720 - val_loss: 0.1324 - val_acc: 0.9520\n", "Epoch 895/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0492 - acc: 0.9760 - val_loss: 0.1312 - val_acc: 0.9560\n", "Epoch 896/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0492 - acc: 0.9720 - val_loss: 0.1272 - val_acc: 0.9560\n", "Epoch 897/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0500 - acc: 0.9720 - val_loss: 0.1282 - val_acc: 0.9600\n", "Epoch 898/1000\n", "250/250 
[==============================] - 0s 124us/step - loss: 0.0473 - acc: 0.9760 - val_loss: 0.1338 - val_acc: 0.9480\n", "Epoch 899/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0489 - acc: 0.9800 - val_loss: 0.1365 - val_acc: 0.9520\n", "Epoch 900/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0505 - acc: 0.9680 - val_loss: 0.1343 - val_acc: 0.9440\n", "Epoch 901/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0522 - acc: 0.9680 - val_loss: 0.1289 - val_acc: 0.9600\n", "Epoch 902/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0480 - acc: 0.9800 - val_loss: 0.1329 - val_acc: 0.9520\n", "Epoch 903/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0496 - acc: 0.9680 - val_loss: 0.1356 - val_acc: 0.9480\n", "Epoch 904/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0493 - acc: 0.9800 - val_loss: 0.1299 - val_acc: 0.9520\n", "Epoch 905/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0492 - acc: 0.9800 - val_loss: 0.1282 - val_acc: 0.9520\n", "Epoch 906/1000\n", "250/250 [==============================] - 0s 168us/step - loss: 0.0478 - acc: 0.9720 - val_loss: 0.1325 - val_acc: 0.9520\n", "Epoch 907/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0485 - acc: 0.9760 - val_loss: 0.1360 - val_acc: 0.9480\n", "Epoch 908/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0495 - acc: 0.9720 - val_loss: 0.1302 - val_acc: 0.9520\n", "Epoch 909/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0478 - acc: 0.9800 - val_loss: 0.1314 - val_acc: 0.9560\n", "Epoch 910/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0472 - acc: 0.9760 - val_loss: 0.1311 - val_acc: 0.9560\n", "Epoch 911/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0509 - acc: 0.9720 - val_loss: 0.1362 - val_acc: 0.9520\n", "Epoch 912/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0499 - acc: 0.9760 - val_loss: 0.1296 - val_acc: 0.9600\n", "Epoch 913/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0500 - acc: 0.9760 - val_loss: 0.1317 - val_acc: 0.9520\n", "Epoch 914/1000\n", "250/250 [==============================] - 0s 164us/step - loss: 0.0500 - acc: 0.9760 - val_loss: 0.1321 - val_acc: 0.9560\n", "Epoch 915/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0495 - acc: 0.9760 - val_loss: 0.1340 - val_acc: 0.9440\n", "Epoch 916/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0479 - acc: 0.9760 - val_loss: 0.1327 - val_acc: 0.9560\n", "Epoch 917/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0502 - acc: 0.9720 - val_loss: 0.1297 - val_acc: 0.9520\n", "Epoch 918/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0475 - acc: 0.9680 - val_loss: 0.1315 - val_acc: 0.9560\n", "Epoch 919/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0535 - acc: 0.9720 - val_loss: 0.1348 - val_acc: 0.9520\n", "Epoch 920/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0429 - acc: 0.9760 - val_loss: 0.1355 - val_acc: 0.9440\n", "Epoch 921/1000\n", "250/250 [==============================] - 0s 136us/step - loss: 0.0513 - acc: 0.9760 - val_loss: 0.1297 - val_acc: 0.9560\n", "Epoch 922/1000\n", "250/250 
[==============================] - 0s 120us/step - loss: 0.0486 - acc: 0.9760 - val_loss: 0.1295 - val_acc: 0.9560\n", "Epoch 923/1000\n", "250/250 [==============================] - 0s 172us/step - loss: 0.0480 - acc: 0.9760 - val_loss: 0.1340 - val_acc: 0.9480\n", "Epoch 924/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0541 - acc: 0.9720 - val_loss: 0.1329 - val_acc: 0.9440\n", "Epoch 925/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0525 - acc: 0.9680 - val_loss: 0.1340 - val_acc: 0.9560\n", "Epoch 926/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0496 - acc: 0.9760 - val_loss: 0.1347 - val_acc: 0.9480\n", "Epoch 927/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0508 - acc: 0.9720 - val_loss: 0.1336 - val_acc: 0.9440\n", "Epoch 928/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0472 - acc: 0.9720 - val_loss: 0.1308 - val_acc: 0.9600\n", "Epoch 929/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0491 - acc: 0.9760 - val_loss: 0.1341 - val_acc: 0.9520\n", "Epoch 930/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0502 - acc: 0.9720 - val_loss: 0.1316 - val_acc: 0.9600\n", "Epoch 931/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0466 - acc: 0.9800 - val_loss: 0.1325 - val_acc: 0.9520\n", "Epoch 932/1000\n", "250/250 [==============================] - 0s 96us/step - loss: 0.0544 - acc: 0.9760 - val_loss: 0.1359 - val_acc: 0.9400\n", "Epoch 933/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0468 - acc: 0.9760 - val_loss: 0.1311 - val_acc: 0.9600\n", "Epoch 934/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0490 - acc: 0.9800 - val_loss: 0.1346 - val_acc: 0.9520\n", "Epoch 935/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0483 - acc: 0.9720 - val_loss: 0.1328 - val_acc: 0.9440\n", "Epoch 936/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0498 - acc: 0.9720 - val_loss: 0.1335 - val_acc: 0.9480\n", "Epoch 937/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0477 - acc: 0.9800 - val_loss: 0.1312 - val_acc: 0.9600\n", "Epoch 938/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0553 - acc: 0.9800 - val_loss: 0.1343 - val_acc: 0.9560\n", "Epoch 939/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0522 - acc: 0.9760 - val_loss: 0.1369 - val_acc: 0.9440\n", "Epoch 940/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0526 - acc: 0.9720 - val_loss: 0.1312 - val_acc: 0.9560\n", "Epoch 941/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0517 - acc: 0.9760 - val_loss: 0.1298 - val_acc: 0.9600\n", "Epoch 942/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0473 - acc: 0.9760 - val_loss: 0.1328 - val_acc: 0.9440\n", "Epoch 943/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0483 - acc: 0.9760 - val_loss: 0.1328 - val_acc: 0.9440\n", "Epoch 944/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0469 - acc: 0.9800 - val_loss: 0.1320 - val_acc: 0.9560\n", "Epoch 945/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0496 - acc: 0.9760 - val_loss: 0.1336 - val_acc: 0.9560\n", "Epoch 946/1000\n" ] }, { 
"name": "stdout", "output_type": "stream", "text": [ "250/250 [==============================] - 0s 112us/step - loss: 0.0475 - acc: 0.9760 - val_loss: 0.1309 - val_acc: 0.9560\n", "Epoch 947/1000\n", "250/250 [==============================] - 0s 156us/step - loss: 0.0469 - acc: 0.9760 - val_loss: 0.1329 - val_acc: 0.9440\n", "Epoch 948/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0487 - acc: 0.9760 - val_loss: 0.1334 - val_acc: 0.9440\n", "Epoch 949/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0513 - acc: 0.9760 - val_loss: 0.1351 - val_acc: 0.9520\n", "Epoch 950/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0486 - acc: 0.9800 - val_loss: 0.1306 - val_acc: 0.9520\n", "Epoch 951/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0480 - acc: 0.9760 - val_loss: 0.1348 - val_acc: 0.9480\n", "Epoch 952/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0474 - acc: 0.9800 - val_loss: 0.1354 - val_acc: 0.9480\n", "Epoch 953/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0468 - acc: 0.9760 - val_loss: 0.1343 - val_acc: 0.9520\n", "Epoch 954/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0471 - acc: 0.9760 - val_loss: 0.1329 - val_acc: 0.9560\n", "Epoch 955/1000\n", "250/250 [==============================] - 0s 176us/step - loss: 0.0502 - acc: 0.9720 - val_loss: 0.1320 - val_acc: 0.9520\n", "Epoch 956/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0487 - acc: 0.9680 - val_loss: 0.1315 - val_acc: 0.9480\n", "Epoch 957/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0481 - acc: 0.9720 - val_loss: 0.1336 - val_acc: 0.9520\n", "Epoch 958/1000\n", "250/250 [==============================] - 0s 148us/step - loss: 0.0468 - acc: 0.9760 - val_loss: 0.1321 - val_acc: 0.9560\n", "Epoch 959/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0498 - acc: 0.9800 - val_loss: 0.1345 - val_acc: 0.9440\n", "Epoch 960/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0475 - acc: 0.9720 - val_loss: 0.1336 - val_acc: 0.9520\n", "Epoch 961/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0482 - acc: 0.9760 - val_loss: 0.1378 - val_acc: 0.9560\n", "Epoch 962/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0506 - acc: 0.9800 - val_loss: 0.1388 - val_acc: 0.9480\n", "Epoch 963/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0598 - acc: 0.9720 - val_loss: 0.1370 - val_acc: 0.9400\n", "Epoch 964/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0487 - acc: 0.9800 - val_loss: 0.1297 - val_acc: 0.9600\n", "Epoch 965/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0505 - acc: 0.9720 - val_loss: 0.1340 - val_acc: 0.9520\n", "Epoch 966/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0473 - acc: 0.9760 - val_loss: 0.1392 - val_acc: 0.9440\n", "Epoch 967/1000\n", "250/250 [==============================] - 0s 144us/step - loss: 0.0503 - acc: 0.9720 - val_loss: 0.1363 - val_acc: 0.9480\n", "Epoch 968/1000\n", "250/250 [==============================] - 0s 100us/step - loss: 0.0481 - acc: 0.9800 - val_loss: 0.1345 - val_acc: 0.9600\n", "Epoch 969/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0490 - acc: 0.9720 - 
val_loss: 0.1337 - val_acc: 0.9520\n", "Epoch 970/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0467 - acc: 0.9720 - val_loss: 0.1343 - val_acc: 0.9520\n", "Epoch 971/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0483 - acc: 0.9680 - val_loss: 0.1346 - val_acc: 0.9440\n", "Epoch 972/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0507 - acc: 0.9800 - val_loss: 0.1342 - val_acc: 0.9560\n", "Epoch 973/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0510 - acc: 0.9760 - val_loss: 0.1328 - val_acc: 0.9440\n", "Epoch 974/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0495 - acc: 0.9840 - val_loss: 0.1333 - val_acc: 0.9560\n", "Epoch 975/1000\n", "250/250 [==============================] - 0s 192us/step - loss: 0.0504 - acc: 0.9760 - val_loss: 0.1365 - val_acc: 0.9520\n", "Epoch 976/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0490 - acc: 0.9760 - val_loss: 0.1356 - val_acc: 0.9480\n", "Epoch 977/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0511 - acc: 0.9720 - val_loss: 0.1372 - val_acc: 0.9480\n", "Epoch 978/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0471 - acc: 0.9800 - val_loss: 0.1376 - val_acc: 0.9560\n", "Epoch 979/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0509 - acc: 0.9680 - val_loss: 0.1331 - val_acc: 0.9520\n", "Epoch 980/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0489 - acc: 0.9680 - val_loss: 0.1331 - val_acc: 0.9480\n", "Epoch 981/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0574 - acc: 0.9760 - val_loss: 0.1387 - val_acc: 0.9520\n", "Epoch 982/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0456 - acc: 0.9800 - val_loss: 0.1360 - val_acc: 0.9480\n", "Epoch 983/1000\n", "250/250 [==============================] - 0s 160us/step - loss: 0.0491 - acc: 0.9760 - val_loss: 0.1335 - val_acc: 0.9480\n", "Epoch 984/1000\n", "250/250 [==============================] - 0s 116us/step - loss: 0.0505 - acc: 0.9720 - val_loss: 0.1310 - val_acc: 0.9600\n", "Epoch 985/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0482 - acc: 0.9760 - val_loss: 0.1349 - val_acc: 0.9560\n", "Epoch 986/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0459 - acc: 0.9760 - val_loss: 0.1358 - val_acc: 0.9520\n", "Epoch 987/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0489 - acc: 0.9680 - val_loss: 0.1372 - val_acc: 0.9440\n", "Epoch 988/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0473 - acc: 0.9800 - val_loss: 0.1358 - val_acc: 0.9480\n", "Epoch 989/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0494 - acc: 0.9760 - val_loss: 0.1342 - val_acc: 0.9600\n", "Epoch 990/1000\n", "250/250 [==============================] - 0s 132us/step - loss: 0.0463 - acc: 0.9760 - val_loss: 0.1348 - val_acc: 0.9520\n", "Epoch 991/1000\n", "250/250 [==============================] - 0s 152us/step - loss: 0.0533 - acc: 0.9640 - val_loss: 0.1403 - val_acc: 0.9400\n", "Epoch 992/1000\n", "250/250 [==============================] - 0s 128us/step - loss: 0.0483 - acc: 0.9800 - val_loss: 0.1336 - val_acc: 0.9560\n", "Epoch 993/1000\n", "250/250 [==============================] - 0s 124us/step - loss: 0.0485 - acc: 0.9760 - 
val_loss: 0.1345 - val_acc: 0.9560\n", "Epoch 994/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0463 - acc: 0.9760 - val_loss: 0.1357 - val_acc: 0.9440\n", "Epoch 995/1000\n", "250/250 [==============================] - 0s 112us/step - loss: 0.0471 - acc: 0.9760 - val_loss: 0.1339 - val_acc: 0.9480\n", "Epoch 996/1000\n", "250/250 [==============================] - 0s 120us/step - loss: 0.0483 - acc: 0.9760 - val_loss: 0.1310 - val_acc: 0.9600\n", "Epoch 997/1000\n", "250/250 [==============================] - 0s 108us/step - loss: 0.0494 - acc: 0.9720 - val_loss: 0.1357 - val_acc: 0.9520\n", "Epoch 998/1000\n", "250/250 [==============================] - 0s 104us/step - loss: 0.0504 - acc: 0.9760 - val_loss: 0.1401 - val_acc: 0.9480\n", "Epoch 999/1000\n", "250/250 [==============================] - 0s 140us/step - loss: 0.0473 - acc: 0.9800 - val_loss: 0.1369 - val_acc: 0.9400\n", "Epoch 1000/1000\n", "250/250 [==============================] - 0s 184us/step - loss: 0.0469 - acc: 0.9800 - val_loss: 0.1340 - val_acc: 0.9520\n" ] } ], "source": [ "h = model.fit(X_train, Y_train, epochs=1000, verbose=1, validation_data=(X_test, Y_test))" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "**d)** Plot the training and validation error curves." ] },
{ "cell_type": "code", "execution_count": 150, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "<matplotlib.legend.Legend at 0x2b170117f98>" ] }, "execution_count": 150, "metadata": {}, "output_type": "execute_result" }, { "data": { "text/plain": [ "<Figure size 720x360 with 2 Axes>" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "plt.figure(figsize=(10,5))\n", "plt.subplot(1,2,1)\n", "plt.title('loss')\n", "plt.plot(h.history['loss'], label='training')\n", "plt.plot(h.history['val_loss'], label='validation')\n", "plt.yscale('log')\n", "plt.legend()\n", "\n", "plt.subplot(1,2,2)\n", "plt.title('accuracy')\n", "plt.plot(h.history['acc'], label='training')\n", "plt.plot(h.history['val_acc'], label='validation')\n", "plt.legend()" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "**e)** Plot the data points and color the surface according to the probability of belonging to each class. This should make visible the decision area and its uncertainty."
] }, { "cell_type": "code", "execution_count": 155, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "<matplotlib.colorbar.Colorbar at 0x2b171691748>" ] }, "execution_count": 155, "metadata": {}, "output_type": "execute_result" }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAiwAAAFpCAYAAAC/CnOrAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzsvX94W+V99/+6HcWyLdsRiYmJE4wNpIlcXEpLf101W0IHZYMCy3g2OaQlzNDvs4LXsT3X1TYa6yiPMvrds4f2q9LtIQgCBKyuWQaM8qxdm7CStdsIbZnbiBBITAhOY2LiOLZlO4rv7x/nHOno6Bz9sCRbcu7XdemydHR0zq0fPud9Pp/353MLKSUKhUKhUCgU5UzVfA9AoVAoFAqFIhtKsCgUCoVCoSh7lGBRKBQKhUJR9ijBolAoFAqFouxRgkWhUCgUCkXZowSLQqFQKBSKskcJFoVCoVAoFGWPEiwKhUKhUCjKHiVYFAqFQqFQlD1KsCgUCoVCocgbIcR1QogDQog3hBBftnn+IiHEj4QQ/yWEeFEIscr03NeFEL/Ub3+Q0/7mozW/EB4J3jnfr0KhKC8+/OEVvPKK3XJ45ZVjcz8ghaJkDJ6QUp4/V3t7nxByvMBtDML3pZTX2T0nhFgEvA5cAxwFXga6pZT7Tet8F3heSvm4EOJq4HYp5WeFENcDfwL8NuAG/hW4Wko5mmk8rgLfzyzxAl+Yn10rFIqyYfv2u+jt9fLii8ll69ZBKDRCZ+dD8zUshaIE/Plbc7m3cQo/y/45NGV4+qPAG1LKQwBCiAhwE7DftE4HcI9+fw/wjGn5v0op40BcCPEqcB3w95nGo1JCCoVi3ggG9xAOT7NuHbhcmlgJh6cJBvfM99AUCkVmVgJvmx4f1ZeZeRX4Pf3+7wINQohl+vLfFkLUCSGagPXAhdl2OE8RFoVCoYBIRLsYC4XW4/MtIRo9RSCwJ7FcoVDMjiqgtvDNNAkh9pkePyylfFi/L2zWt3pM/gfwLSHEZuDHwDtAXEr5AyHER4CfAO8CPwXi2QajBItCkQd+fweBQPLkGgyqk2uhRCL71WeoUJQnJ6SUVzo8d5TUqMgqYNC8gpRyENgAIISoB35PSnlKfy4IBPXnngYOZhuMEiwKRY74/R0Eg9fT01PN3r3Q1eUlHL4eQJ1wFQrFucbLwGohRDta5MQPbDSvoKd73pNSzgBfAR7Vly8CvFLKYSHEB4APAD/ItkPlYVEociQQWE9PTzUvvgjxOLz4IvT0VBMIrJ/voSkUCsWcohtm7wa+D0SBv5dS/koI8TUhxI36auuAA0KI14Fm9IgKsBh4SQixH3gY2KRvLyMqwqJQ5IjPt4S9e1OX7d2rLVcoFIpyYhHQWOJ9SClfAF6wLPsL0/2dwE6b102iVQrlhYqwKBQ5Eo2eoqsrdVlXl7ZcoVAoFKVFCRaFIkdUCa5CoVDMHyolpFDkiCrBVSgUlUKRyprLCiVYFIo8UCW4CoVCMT+olJBCoVAoFIqyR0VYFAqFQqFYYMxFldBcoyIsCoVCoVAoyh4lWBQKhUKhUJQ9KiWkUCgUCsUCowpomO9BFBkVYVEoFApFwfj9HfT330U8voX+/rvw+/NuZKpQZERFWBQKhUJREGpiUMVcoCIsCoVCoSgINTGoYi5QERaFQqFQFISaGLT8WIidblWERaFQKBQFoSYGVcwFSrAoFAqFoiDUxKCKuUClhBQKhUJREGpi0PJjIXa6VYJFoVAoFAWjJgZVlBqVElIoiojqRaFQKBSlQQkWRcVRrqLA6EXR2+ulpkbQ2+slGLy+bManUCjOHYwqoUJu5YYSLIqKopxFgepFoVAoFKVDCRZFRVHOokD1opgfyjXiplAoiosSLIqKopxFgepFMffkGnFTokZxrlEFNLoKu5UbSrAoKopyFgWqF8Xck0vErZzTiAqFIncKFixCiAuFEHuEEFEhxK+EEF8sxsAUCjvKWRREIvsJBL5HKDTC5KQkFBohEPieKvUsIblE3Mo5jahQKHKnGEGfOPBnUsqfCSEagFeEEP8ipVRHaUXRKfcGVaoXxdyiRdy8vPhicpk14lbOaUSFQpE7BQsWKeUx4Jh+/7QQIgqsBNRRW1ESlCiwx+/vIBBICrlgsHyEXKnQIm7X09NTzd69mlgJh6cJBJIRt1xEjUKx0Kiqglp3gRuJF2UoRaOoHhYhRBtwBfAfxdyuQqHITCl9GsU0rBbb/JpLGq6c04gKhSJ3iuYDFkLUA/8A/ImUctTm+c8Dn9ceqVCsonxYCJEJs08Dkj6NUGh9Qe/FEELJCIaXcPh6gLy3W8xtmckWcSv3NKJCUakIIa4Dvok2ddEjUsoHLM+3Ao8DXn2dL0spXxBCLAYeAT6EpkOekFL+Vdb9SSmLMejFwPPA96WU/zv7+islfKHg/Soqn/kWC+knUSOlUFlm2Xh8CzU1grgphOtyweSkxOXaOuvt9vffRW9vajpl3ToIhUbo7Hxo3ralUFQef/6KlPLKudrblYuF3Le0sG2IIRzHLIRYBLwOXAMcBV4Gus3+VSHEw8DPpZR/K4ToAF6QUrYJITYCN0op/UKIOjQLyTop5UCm8RSjSkgAYSCai1hRKAzKodx0oVSQlKrcu5iGVWV+VSgWFB8F3pBSHpJSTgMR4CbLOpLkpNFLgEHTco8QwoU2C8A0kJaZsVIMD8sngc8CVwshfqHffqcI21UscMpBLCyUk2ipfBqFCiGzZ2VkZIZ77539thQKRVmxEnjb9PiovszMXwKbhBBHgReAXn35TmAcrWDnCPC/pJTvZdthMaqE9gKi0O0ozj3KQSwslAqSUvk0cqnCcSI93baIp5+WgOD++/PblkKhyJMqoKbgrTQJIfaZHj8spXxYv2933rd6TLqB7VLKvxFCfAJ4UghxGVp05izQApwHvCSE+KGU8lCmwZRh813FuUI5iIVCTsjlhp35tFCPUCFCyM4IvHGjYOfOswQCVcr8qlCUPycy+G6OAheaHq8imfIx6AGuA5BS/lQIUQM0ARuBf5ZSngGGhBD/BlwJKMGiKE/KQSws5AqSYlXlzLbvjVMEzeutKsgIbGa+TduzpVLHrVCYeBlYLYRoB94B/GhCxMwR4FPAdiGEDy3m866+/GohxA6gDvg48I1sO1SCRTFvlItYWKiN6EpV6pwrpY6glapMutRU6rgVCjNSyrgQ4m7g+2gly49KKX8lhPgasE9K+RzwZ8A2IcQ9aOmizVJKKYR4CHgM+CVaaukxKeV/Zd
tnUcqa80WVNSsUpadUpc65UuqS8Uotk67UcSsKZY7LmmuF3Nde2DZE1LmseT5QERaFYoEy3x6hUkfQysG0PRsqddwKxXxT1Nb8CoWifChlS/pcW+xHIvvp7HwIl2srnZ0PFTXlUareM6WmUsetUMw3SrAoFGXObOffyWWendlsuxwa/kHp5wgq9rxHBsUcd6nGqFgAVAGeAm9lhkoJKRRlTKEGzUyG4tlue77NvAalTDmV0hhbrHEr867iXEOZbhWKMqaUBs3Zbnu+zbxzQSUYYythjAozc2y69Qi5r8CAm9hXXqZblRJSKMqYUho0Z7vtc8GDUQnG2EoYo2IeEYC7wFuZoQSLQlFmzNX8O7MVHqX2jpQDlSDKKmGMCkUxUYJFoSgjrIbWW25ZxJ13Su67r/jiYLbCIxczb6VTCaKsEsaoUBQT5WFRKIpMIW3XnXwJO3eexeutKnobd/NY33orhsslWbmyTrWLJ/173L37IFdfvbqs2umrFv+VxBx7WJYIue8ThW1DfF95WBTnGOdS6WWhJb/Z5t8ptJeJ9bsA6Ox8iE2bngFc3HabZ15LlYtJob87cw+ZYHAPN9xwObt2eYlGBT6fl29/+2a++c1rSzT6/MdY7D43CkW5oQSLoqSUS8+OucJc8huPJ0t+A4H1Ob2+lL6ETN9FoeMuN4r9uwsE1rNjRzWbNkFvL9TUwIYNgltuuXLB/pYVinJDCRZFSVloJ8JsFFq5UUpfQqbvYqFVnBT7d+fzLWHDBujpIWWbt94qUrZ5LkUTFYq5RgkWRUkpxomwkk4ChUZISmlozfRdLLSKk2ILsGj0FD4fGbd5rkUTFWWOKmtWKPKj0BNhpZ0EihEhKZUvIdN3sdAqTootwILBPYyOyozbPNeiiQrFXKMEi6KoWKMhu3cfLOhEWGkngXIu+c0kSko97rmOkhVbgEUi+3nyyX089ZR03OZCS6spFOWGmktIUTTs5za5nOeff5VQaPWs5k2pxJNApvl75pNsc9iUatzzMedNKeYZ+uIXf8BPf3rUcZtaVCe1JL2S02qKCmcRZTmBYSEowaIoCHMfiJGRGR56aJHNpHirZz23iToJFM589+ooxWSJubynUgiwTNvUojpmYaZFYAKBykyrKRTlhhIsirwxThZr1y5haEirlNAO0IsIhyEahUhEW7fQaIg6CRRGOczoW+woWTm8JztKOXv0bJlvsapQFBMlWBR5YT5ZhEJaT4rUK2cIhZKCpdBoSDmeBCoJ5+jGp4hEDhdxTzHHZ4odJStFxKZYlFM6sFyFnWKOMKqEFhCqNb8iL8yt4+NxrYFWPJ583uWCyUlteTIaUh6m03OD2pRH8fg91NQIm+9I4nI9WMJxJAVM+okz++8iU2QgHt+S4T1tLeF7qiycpnkIhUZmnaJVFMIct+ZvEnLfZwrbhtiuWvMrKhhzeD8axbbMc2Tk7JxXyFRSr5biU2u6pRKNnnYoxT09Z2OKRA7nVYHk93fwwAOfSSllf+CBzyS+04XWM6ZUVKJhXaHIhBIsirwwnyyCQQiHSSvzvPvu5+Z0bpNi9WqpHNFTSyaRYiYYfIlw+IzlOzpDMPhS6YdpIhI5TGfno7hcD2b9XQSDn2bzZldKKfvmzS6CwU/rzy+snjGlQgm7c5wqoL7AW5mhPCyKvDCbYHfuBJ8Pdu2SNDYyb/6SYngayj/fn1mYOBGJHAAgFLoKn6+BaPQ0gcBLieXzg/m9pHtf2tpqbSMDbW3a65SvKTeUYV2x0FAeFkXelFvlQTE8DeWZ75+dSCmMxgJfPzrL1yWFy6lTW7jpJpH2XTz7rGTJknPbo5Lv/165/a+e28yxh2W5kPv+oLBtiG+Vl4dFRVjOcWZzQCunSggoThVKeeX750KoFCpM8t1uNiFjvOcYIyMTPPaYh9tvJxEZeOwxGBmZKOI4K4/ZRAHL7X9VoSgE5WHJk8rxOWSn0ubpcaIYnob5z/fn5kmZHY02t7km1/3X8qUvvUhNTZxt27SKs23boKYmzpe+9IO5GWoRKMVxwm6aih07qvnWt25cEMcjRZExOt0WciszVIQlD8rf55Af5dzPIh+K4WmYv3x/qQTKbClVdMfqVTGPMTX6YvhrAoGrgAYmJ09x772Vk8oo1nHCGv1cuzY1Cuj3w6ZNcMstixbE8UihyIbysORBefocZo/qZ5HK3OX7y0WkzIdHBjI1mcucOsr0uvKhGMcJu941Tz0leeQRwVe/auwntXHjbPajmEvm2MOyQsh9txW2DfH1zB4WIcR1wDfR4jmPSCkfsDz/IGDMVFsHLJdSeoUQ6wFzI6i1gF9K+UzG8SjBkjsL7QRf6IFVGfrypdgCIR+Rkuu+5yJdZCdK7MSIk3gpb+GS7TiRy/+N0//mrl2SDRu0qTCMBo0L5Xi08JljwdIi5L47C9uG+JqzYBFCLAJeB64BjgIvA91SStuTgBCiF7hCSvmHluVLgTeAVVLKjEY15WHJg/n3ORSXQrwfC8X/MjcU05uSjw8lmy8mk7eloUQ3u/06jdXpfSbXKUdPWabjRK7/N04m8MZGEg34RkbOLqjjkaLi+CjwhpTykJRyGogAN2VYvxvos1l+C/B/s4kVUIIlLxZaw6pIZH9eHUjN2BkAe3qqCQTWZ3vpOYR2Avb719Dffwfx+D3099+B379mFtsqhkjJZHy1ExbW7WW++f2r6O//BPH4b9Lf/wn8/lU262UTMOZ9Znrvtfj9HyxL0Ww9Ttx3nxYZ8fmW8K1v3ciOHdn/bzKJns7Oh3C5tnL33c8tqOORouJYCbxtenxUX5aGEOIioB3YbfO0H3shk4Yy3ebBQmxYNduyx/IqAy4nUoWC37+GYPBaenoW616ERsLhawFyaN6Wa3omU/TGbhsNNsucBE5u+/P7PQSDy+jpqdLfp5tw+GJgCZHIOMk0jrHNUdO2cknxxNJeGwhcVZamcfNxwpjR3Ejj5DqjeS4m8IV4PFIUkSqKUenTJITYZ3r8sJTyYf2+sFnfyWPiB3ZKKc+aFwohVgCdwPdzGYzysJzDFOJBqTQDst17BYrowbEXDf39d9Db22jzOY3S2fmIw7ZyjaQ4kYtIsb4+PYKRfZvJdfr73fT2Vtm8zxk6O6dIFyVmf0rMssy87mnH5+LxnrL3lDn/n0Bnp/lx+v+N8ogtNObYw7JKyH1/XNg2xJcyelg+AfyllPLT+uOvAEgp/8pm3Z8Dd0kpf2JZ/kXg/VLKz+cyHhVhOUdxKr38xCdWcfXVq7MeJCup7bfde92+/TO43dDd7SqwJDSzN8Xna3CIRFkFRKHRlHxFSiaB0uiw3Gbf+jWWz4fD+6wCUQvS+tpkozhtf6Mkoy3G40zEiEbH6OpqsGkYWOqJHXPHORKpiatM/zeq6ZuizHkZWC2EaAfeQYuibLSuJIRYA5wH/NRmG93AV3LdoRIs5yhOPVh27brSFL52PolXUjja7r1u3uxi2zYKSCfkZqLVZktuzHBSLSSaUohIsVvuIE6MwG+NZdOmcHP0gPa+0t7nAWCZ6TXjpu1P6n+leX+x5PNZxEsw+
Crh8MdM6TZtYsdA4CVyTzWVFqcuzNqM5lVl/X9TKlTkaI6oIv1/tohIKeNCiLvR0jmLgEellL8SQnwN2CelfE5ftRuISEs6RwjRBlwI/Guu+1QpoXMU59JL7a9BOad5MmE+KAoBbnf295pbOiG/ap90D0vypBqJHMry6lx9JVaTrN36mQSK/tecka4hNf9trFpvWUfHfyMEN0OPqZ1++DEIbIeIYbMbM7120rTM0BXj+vLEISmm30ZJTRvFMFJFfv95BAIfxOfzEI2OEQy+TCTyC1KZP+Fi109Fi6jkZm5faJzbn8ccp4Rahdz3PwrbhviimktIMUdkupJxnn8ndRuVaKS1HhSjUfur/8OHU1+XuSS0WLMljxEIvJxBrNjvx+9fSyDgw+erJRqNEQwOEIkMObzOLpKSRaSYBYpZnNSY7gO49b9mjVQPkUPA9yD0GPguhOjbEPiBvrzN8mYM4XIamDItmyQpYMb1gcSMiiFr5EX7G4nEiUT2kOpxsTP2wnwIl0qKRM4FC6W7tmJ+UIJlgZKtPbidB8XopGmmEvs6WA+K996rTZ5nnkxv+/Y4bjesW+fK4sEpvH9KJHKASORYlrWc0z5+fwvB4OX09BhjrSMcXgNUE4kMJ9ZL31aeIqXetLyeVHHiJFhMUZbIKYg8bRn+WstjQ6wYERZDtIxZ7qeIl1qYrDVFXhpJCpdMTeis6aTUNNFcpSaUFyWJqi5UFIISLAuUbFcydld+O3ceZNOmy9mzp/yNtJmwHhQjEaiqgmeflXg8mgD78pe19+R85VuMRm+z9aekvi4QeD89PS7Ld7mIUKiVSOSMZTs2IiWTQDFHUNwkxYlZmNTYrGuNvJjFjhPmNJDxd9Jy/wSpwuWE6W8MGF+qa47USEuqGDGEinO0xe9vX1DzguXLfPlIijGzuiJHilPWXFYowbJAyeVKxu7K76c/PVrx4eu33orR1VWXclAcHIQTJ2IsWfJgyrrp720uhEquJlotjOHzuR2+y8X6tjJEUpxEip1AMUdNmvT75vXN9xOCRTMGVbm13E5dQ/a0y8RpbXwzU24YcyWjKcbfE6QKmBPAMTQRkxAwRtRlKfAe9sLFIN28Gwh86pxNTcznJK6VVF2oKD+UYFmgzPZKZiGEr10umZYCeuwxqKrKZjAvVKwUV6gYr4lGp+nqctt8lzMk0yMWkQKaUMlFpNSQLkYMgdJkPI5T5Z6iriFGXZ3WQbuaaeqYoFo3otTpQqE6YUxJZxq3NgUaMEEt07iZoI5pqjlxfJkmYk640qMrq4B39fGc0MdmPDe5VC+btgqVBpJ9XMyMZig3X/ipifn0kShPj6IQlGBZoORyJbNQywtXrqzjc5/TmnP5fJrpNhCAJ56oc3jFfAiVTCLF/JpGgsEThMMrTF1kIRyWBAJngaXpKZ98REqT5X7idXHqm0aoq5tIESZ1xBJ/3UxRy0SaUKnLYm6d0N+bJlZqiVHHFG4mmmsZwct0q5sRvExM1DE20JSMslhvx0gKmBNmg+5x0v0tqX1enMvNTy3Y/wuD+faRLISLooqgimRqd4GgBMsCJduVTKFh4XI+qEejpxgc9CY6iYJWnj06qo27eD6VwoRKt38FWwMX0epbxpHoMFuCJ+mLnLSsV6v7VCYIhWrx+aqIRgWBgCDynWptF/mIFEOUmMVJE4kISlPzMNVM4+VkQpycx0hCmKSKlaRoAXAznbhvYBYvE6bPI6aHWaaoTgiWGLWcxMsIXmLUcbLOy0iHlxHOY2TCy9gJLwy4YIDUqMuA/l7eRTPoxtpITxOlRluCwX2Ew1dZys2nef75gwve26J8JIpKRfVhOUcppLV+ufdS8Ps7+Ju/uZlbbxWm8cGOHbBhwwidnY8WuIfCIyrd/ovZFvTi6dmUyFuNh5/izsBZ+iJjmCMsGQ20mYSKKXri/ygE3g++ZogOQ3ASIjIZRfEyQjVT6PKAWrRlhjjx6ssMUWKOsJiFipvpxHuss4gXg4k0sVINwAjnMUFtQrCM4OUkXo7RkhAyQzQzdKRFEy4nSIqXo6b77wLDgIyRGm0x+rdokRe/fwWBwEfw+eqJRk8TDL5EIHCVw1QKldeLyIly//9duMxxH5aLhdz3tcK2IT5bXn1YlGA5R3FuHJd9HpZKmEfozJktvPaaSKSEgkHYudN4fw9m34AtxUv9HO5/P229v4/1QxwI7aK90/iftAgVJwPtMlKjKeeTEj3xr4HghdBza9LT82jfDP/f8qP8pGoALyO4mWIZw4mIijeLaKnTkzlupqid0sXK1AyuRCdbkhVAVnSPTVx/L1PuKmJuLcoyzDKmqE6IlkFa9NF4OU5z4vEgLQweX8HMOx74JamiZYBk2igGWrTFECzG33f0waTOYxSP31P28xPlQrYIaC4R0nKOolYmcyxYLhVy318Xtg2xobwEi0oJnaMUEha2y4G3tEBr6xLi8S1lcXB77bVTtqJqdvPMFNujUkurb5nt5DutviVo5TA5RlNWkIyqGBEVS9on0C7p2SBSTJZ/2F1F+LmljDS8oMsBLZ7hZpplDKcIFC3ikhQnCWEyidbgbYpky/1cBIv+nlxo78vlmcHDGCwdY2XNMHEPjDTWM4WbNgYSYmWYZbpoWUEzx2lpHmSwuYXBlSuYOeDR3m8TmoAbwJQmWqr7W46bBrASO0NuJm9LpZBLujebj2Q+K4kUCieUYDlHKaS80Cp2/H4tgnHTTdnnIJor7N+fMc9MrhRfqBjrHImO0GbTfvdIdByE/honoWIIEnM0xRArFyTvVy0bp6l5GN/MhbYmy4s8Hj7Iz9PSQIZA8U6dTIqTYZLCxCpWIClUJi2PM+EhtaLJrf111UDT8jFwj7Fy2TDjS6sYcZ+nC5blDNLCMVoS4mWguZ2B5jaG2vRUUZP+uawgGW0ZMHtbrCQNuU7elkoquy1GFZDqSKsoR5RgqUCKEaotpLzQKgbuv18rIS6ng5u232pTS/zT+vw9B3LcQiaxMnuhYjzeEjzDtvDTeHo2mjwsfWy5z5MuVJqw96aYIymJ+5ovxVtnOEBGGBhfRleXJy1qcGJ8iCsafqGnfTSx4h0dSxcow6RGUuwiK0YzOHNFczbRYqS33CQbz3mApSQ/g6XgWTaDZ+mwLl7e4Ji7hUFdsLzJJbRwjGaOc7i1jYGmdsaampKfiZtkpcS7wLBRAm14WwxD7jtoBmdtyoRQ6ErL76ZyTtLFqAKa70oiRREwtzhYICjBUmEUM1Q72/JCq9gRAvbuTW3pP78HN00gaC3xcxUoBsUSKpkmG2ykL1ILCLaGnqPV5+HIa+NsecBD354qbe6dTGkfazSlKVnhk0zvJAXLzzzDPNn3u3y2252chqEvxoznfjrYnypS3iMpSsz3zQJljNTIink+IDPO7Vg0DCFhbvvvQdMS9SSjLsvQRMwy8Cyf4dKlR1nRMsgxd4smVGjDywgrGKSl7hgDHW28Xt8BTS5tO+eTrJKqRS+BbsYpRRSJHNKFi3Nb/3KmGFVAqpJIUY4owVJhlEuo1ix2+vvvKqOD22xLledGqJibvPV9
B/qerdGjKQ2p6R6ntE+WaEoLg4m0jiFcqBpmePkBvvvcnSz1XMDo+FHqq76M660+TXAYwsSIqlgjKmOm+0bLfEhPB1kjKpk8LJB69ecx/TUiLoaAOQI0kxAtLAXP2zNceqEmXLxu7X2/yaX6+z6Jt3WE/U0djNU3JT+verQUEeiipQ37FBE4t/Uvf9FSjG6yqiOtohxRgqXCKMdQbXkc3OZbqJjXdxIq+jJzxY+TP8Up7dOU9KZYoyktHEuJrJynRx3OYwRv1R688i+pegu8hkgxhIlx3ypQjHO5WbBYvCoxk0iJWSIqo3Ey0mg5+tS6odYQLsbnYwiYY2iCxQ20ogmXI+BpneEDF77OifbBhGBboYs2b90I+z/UwdGmNqh3ads0UkSGIffE0gwjtJtEcf5mfs6VYnSTVR1pFeWIEiwVRjmGaktxcMvPpzMbsZKvoTbfiIr5vs3cPnZlycZfQ6g0kWKirW87kRJNOY8RljFMM8cTwmUZw3g5STND2v2pk3jem0lN9xj3zWkfs0ixpnyMtBCaQDELE7MocTqFG6d66ycYM7221tjWeFLIJASM8ZkN65/Re2iixZhGaBia3hvDu/qXnNc4opdpT3OeXq7tbT3JfncHMzUe7TO3NhUeXmqaCdqO9LmIyiHakul/pBjdZFVH2gpnEWryQ8X8Uh7RjHSKeXDL3aczF1GVfCMqOQoVs5HWECoO/hRIjDWgAAAgAElEQVQj7WOke6wRFHMKKLHO6BguQ5gMkRQn5qiKnUixRFGMCEpsyl6cmE/h1udyWW58gmZRE4snBUzjlCZcGIbaZfrYPPpYDcGivzfXEFy69ijedq3iaRnD1BJjGcPUNcfY39DBmDGrYw2pX2WKGdfsbUlt6V8uokWVHSvORZRgqTDyjWZUYvOn3Hw6xY6qZEv/5JP6KVyoVK1MT/skBYs5BWR4V46lR1PMf3MRKQ4CxU6cWE/Ts+luY31dg2m7icRLHNAFjCFeaj1oImy56f216n9N0RZ347TeT+Yk1UxRXTfF67+xhqH61lT/jHF/uBZks/6gvEVLuXjZFIq5pCiCRQjxKHADMCSlvKwY21Q4k2s0Y76vwmYrlrL7dIopVuZJqNgZaW38Kc0c19M8I4nUj1m4LGOY5tHh1GiKNeVznNQSZRuRYqR5RuPpAsV8Oraf+7g4jJL8ZI39GIXHo+iRF1PUpdaoXjIEWSsJY7BrHD6w+nXqVkykzW/Eh2CIVvtBVIhoKUcvm6LMqEKlhBzYDnwLeKJI21MUgVyuwkoVgSlELDn7dE6Tv1iZR6FirvQxSpOzCJXluihJMcya7puXJdI+b5NuoDUiKO+RrOxxECrmSEouIiWfNFA+GMLEoFHf92mS0ZdaLMJl3JQqMtJEpvd56eRRattjKXMcAZlFywlzaq88RUs5etkUilJTFMEipfyxEKKtGNtSFI9sV2GljMAUErIufZfafA21pRUqLc3H0nwpZn+KuVR5xdSgc9rHbKrNEk3JJFLMAqUQcZIt8pJmws1xu1bh0owebYH0XjCTsHJ8GPdlWghJmwcpRh0x9n9omqNcmlgvhZTqofITLeXqZVMoSonysCxgsl2FlTIPXkjIOt2nU6wutblGVayVP3mUJxupnzyFSktCnBxL8aekVPsMzqSnfeyiKYZgKYJIyXT6LTQV5PR64xux2/eo5flatPfXbERbzP1jTE3vmhij47L03/T0h6q1SItdz5gyFi1OXjbQ+iJVkmdNUSIEyeaMC4Q5EyxCiM8Dn9ceqTzrXJDtKqyUefBCQ9ZJn04+KaBUoeL3X0wg8BF8vnqi0XGCwV8Qibxls24uQsV43kao5BJRaSOtNDlVnCT9KoZXJaM/xSpUskRTzL6U2YqUYnlVsmFNC0GqGdd2HOZoi93znnTRorlbarX00JgpPXSaZHffmCFarJ+KdZTzI1qsMzCryiHFQmbOBIuU8mHgYQAhVmbseqAoDpkqivz+DkZGZpicXEQ0qk1eGIkULw9eeMi6MK+K338xwaB5Ert6wuGPo80XM6ivlWeJcq5mWmuzt7bUjrTZhEoLg+n+lGxpH710OZ9oivXE7xTRyId8T9PZvmVrD5esEZ9xi2gxd+YFmjxjdLTvZ0qrG2IaN9O4mbiyTit5HkP7gIzXnUCf6dlppHbN5Yx3NrdGXFU5pFjoqJTQAseuosi4ErvllkUmMQE+H2zaVJw8eCHN5Pz+DxIIJCctDAYzpYPs0z+BwEfo6VlsOXi7CIV8umDJ5FPJQahYG745pH6yRVRaLD4V7+gYriPY+1OOk2qiNc31ExuHszf5qflKgBqfD080ytvBIBK4PBDA7fMRi0b5WTDIaCQCzF6gFPM07LQtqzywxjPM1UNmjkO6aIHkd1ajR1rao/r+6wCYqKtl/5paZoyyCnPXXtu5h+zexfyKFlU5pFjoFKusuQ9YBzQJIY4CX5VShouxbUXxsb8Sg507z3L33d8r2tXYbJrJ+f0fJBi81hQZaSQcvlbfnlW0OHtVfL56h4N3LZrKyMFQO1uhkiGiYi1NNv42HRtLChNDsJjLkrP4U87e4mfZ/UGqenpg717cXV20b98ObjeLurth717qurr4eDjMNHBIFy2QXaTMR2s0az8WM4aPxWlcKaLFSO1Ycvkra4aJrRggRh1TuBlmGdPNbl6f6oATLi2yMqbfYsBkLUjr3EIGmYTJ3ImWaPQU997rZcMG7eIjGoVdu1Tl0DmLKmu2R0rZXYztKJKUsuGb05WY11s1z6HjWgKBq2wiI4sJha6yCJbMxtpodJyurnobD80UzlEVi6F2GZlb6Del35wiKnkJFSPNcxxbf0psOL1vysWBgCZWTB/cos2bYdu2lGWunh4+EgpxKBLJKFTKZaYcJ+FiNd9aMURL45RW6Zx24PZAm+coE411nMRLjFqmcDPRWsfRE5dqgmWSPPwsTibcuWP37oPceeeVbNwoEpHTp5+WfPe7B+dlPFCZjSsV5YtKCZUhmcxzQMEHgLnu4ZDbQUs7Jfl8DQ6RESOFk1sFUDB4gHD4cnp6XKa011kCgaPklf6xzp5sI1KsVT/ZhEpKabKdRyUHf4rVRLvW58P2g2tvT1tW7/MVvZdKh9/P+kCAJT4fp6JR9gSD7DdFcYq1PfO3nNLOn/QUUQwgDrwHS42JDz3Jm8sDl1z2BiNuLzE04TJBLSNrvYyNmfwsoN0/it6+v9m0l9Oki5T5SQ1dffVqNm4UKWJ/40ZBKLQa+EHJ929FmYAXPkKI64Bvos1c9IiU8gGbdX4f+Eu0GbtelVJu1Je3Ao8AF+rP/Y6UciDT/pRgKUOczHOPP34t8fjigg8Ac9nDIbeDVvI0FI2epqur0aFpXC5ipVHf9iBQSyh0MT6fm2h0mkDgBJHIGXJO/5hLlM2zJ7eRsTw5k0clEVHJJlTeA672wxcCyDU+ZqJRxoJBTpgiI2YTbSwapa6ri7QP7vDh1I+qq4vT0WjiYTFOox1+P9cHg1Tr6ShvVxfXh7WM8GxESy7bswoXJ9HSDFoF0TDJmaBB+16XgmfpDJe
0v8kU1YzgZRo3I3Veftnm1VJDkyRTcTH0TriY9tRAUrRAsfwss4lMlJuHRZmA55kqUqegKDJCiEXAQ8A1aHL+ZSHEc1LK/aZ1VgNfAT4ppTwphFhu2sQTQFBK+S9CiHpgJts+lWApQ5xTNnXcdJMo+AAwl1PHZz9oWSMjLxEOmz0sRtO4fTZbz9QAroFIZIhI5LTlOXP6x9T4zS79Yy5RtsyebMz1s9wUQblm5nxuGV/PEs9KYuOHecuzncVVP8xPqBgRlev8yD8NInp6EHv3sqiri6XhMKeAWCSSVpL8s2CQj4XDLNZP8nR1Edc9LK516xLLzoTDvBQIFPV6f30goIkL05dc3dPD+lBoVoIll+0Z4sQgm2g5Pg5tw/qCerT02zKgJulnMaaQbOEYI63naakhw8dyDFNqqJZ0F43xjRTuZ5ltZKLcut+Wm4BSFJ2PAm9IKQ8BCCEiwE2A+Ud6J/CQlPIkgJRySF+3A3BJKf9FX25t+WiLEixliNOBp77ePuo/mwPAXE0dn/mglW6pNHwqoVCySigQ2EckcsiyppNYsZYq55D+MdI9NSQjK+YSZZNQoSnO8lYtWrLcFEG5duZ8bhy6jMXdtyUMrmv7wojqN2Ew4ixUzJMQmlI/1b0BFlk9KT09tIRCRC3+kxjwWiTCNPCRUIh6n4+zI6NULWlkZPAENeFHqb2oVUutBAIFpWrsWOKQjlri85V0e9lECyRlRDPw3nu6nwU0ITqIFlFbrvlZTjZ6GWaZHivzJlNDJ9CiajmVOmdKDUEuomW2kYly635bbgIqG8pvkzcr0Y5sBkeBj1nWeR+AEOLf0NJGfyml/Gd9+YgQYhfQDvwQ+LKU8mymHSrBUoY4HXgGBuJ0ddXNywFgtv/MmecFsicSOWAy2NqlgdJTQBp2pco5pH/M/hQ7Q20b0KRV/rTUDSbSPUYL/d+caeHTp3+bqu4NKQJDdPfAd0Lwi4izUHEoTa7q6IBQKNkgB2DvXmp1/4md7fMXkQgTCC4PbuNzPefpv51mngiP8/KmzxKN9Dl+5oVwKhrFa5OOOmVKPZVqe5lECyTnI6pFayxXOw61NWjfwXE0BXMEXG5ouewYxxjkOM0co4WROi+vG6mhE/pr0qqG7MSHWZTkb8KdbWRiLiOn2fD7O6irq+ZHP5IcOiT46ldhcLB8pw9YiH4bWQXxwquEmoQQ5vD2w3pPNdCOpGm7tTx2AavRqohXAS8JIS7Tl18FXIEW7/wOsBnIWF08L4Llwx9ewfbtdykF60CmttvzcQVVyD/z7OcFyiZUzOtki6pkSP9YfSrWm6lEuYXBlCZvKxjkYzOX8pGha6g6v9E+/LXMBwfJGlGxK01ONMiBRFe/8Wg05RRpPRVeEdjKph5PytX553o8PBnaWjLBsicY5PpwOOE5oauL6XCYPYFA3tvq8PtZ3NiI7OtD6OXY2baXi2iJ6bfjRrmzBy3NY+rP0tw6zIrGwUSE5SReRlq9DJ1o1QTLu2h/z8eUGnKKssw+NVRIZGKuIqeZSD9eQF+fZHp6gi996QfzPj47lN/GkRNSyisdnjuKZpg1WIUWt7Su8+9SyjPAYSHEATQBcxT4uSmd9AzwccpRsLzyCvT2Vr6CLSWZDjxzfQVVyD+zvfj6tyzzAs1GrOQRVTkf+34q5vTPqvTKH0OwGLMntzDIh8bvYVH37Vo0xM70ejSqCRazmdahdb5daTI9Pdq2f/1rzoTD/Kd+0jaEyhq/n6sCARp8Pk5Ho9SvvdBWN63wOcxMXASMFNP6UChR1XPw+edZHwhw844dOVcNJcy2t90GLS2wbRvy4ouJDQzw/SypLKtoMWN8VsbzsSnNhIsHLcJiVA0d0aIsI3g5RgstDDKCl6GmFmhyab8RIy3kweRlMY8CslcNGaOxFy3lltrJF7vjRXe3IBQ6U7bHeuW3mRUvA6uFEO3AO4Af2GhZ5xmgG9guhGhCSwUdAkaA84QQ50sp3wWuBuyMiikIKee+S74QV0rYx7p1EAqN0Nn50JyPQZE78fgWamoE8XhymcsFk5MSl2trnlvL1ox9tikgh6hKPmXKbTgaao0SZfNMyi0co2XmOMJdA7fcoqVvzNGR7WH4bgCeiaQJlePj2miN1vkx4DfjcURNDdYPWk5NMbZ/Pz8OBjlgOmmv8fu5NhhMMdrOPPU0Wx9p5t6vViXWW7cOngwN8HinpcS5RFirfIwIyfeyiI67+vvx9vamir516xgJhXioszOnfdv9UoxfhNEusBlo9kBtK9CKFl1rBT4O8U/C/sb38XM+SJQOfs4V/IIPMvSzVngR7bqwH+3wPADEErEbkt/kadInRHBKC9mLlkr2UxT3eDE39PffRW+v1/rTK/L56c9fyRCtKDof+rCQL/2kKvuKGaivmck4ZiHE7wDfQPOnPCqlDAohvgbsk1I+J4QQwN8A1wFn0aqCIvprr9GfE8ArwOellNOZxjOvHhalYCuD4pnniiVW7FJAeURVsqR/WuoGWc7xFGFilCgnBMvUoDZ7ckNU+zCME3EopLUZHR2Bb9wNL0Qc+6hYJyJ0Kk0+vX8/j9icsD8ZCGhixXQpW3XrRr686xl+tGdJQjc9ER7nPwNbMnzuxSXXqiFrr5Ula9cWbN41R1qcOuIaqaE2I8riRhOvw0aUZZDjLE9EWQZZkYyynED7/RivjZmaDSa2DrlFWZyZy9ROscVRpZltofKjWvOFlPIF4AXLsr8w3ZfAn+o362v/BfhAPvsrTH4VSLn/iBUa2j/zNOvWaVdK69Zp/8zBYD7/zMUWK4Z/wEasiFpt0TI0IXIhyQhKG1qm1bi1AWuh/soTrGodoKNuP20McClv0sH+9NvUfi49dhTPL2fgNeC/gvBEWPtQdu6E3l44+hb8r7thewSOaJ1p33tPO0kej2vX4sdJXosb1+M/CwY5E9a35XLBffchd+2iwefjjv5+1vj9QNKP4VRNs6ixnidDA0xOzvBkaICXA3eWzL9iRy5VPkYUxtvbi6ip0SIrQ0Nw772pryvAvAv2EiER+zCmOBjWb4Mwc6ab6lP9fGrmX7nt9BbWz6zAywj1TSPJ349hzq5Ftx1aI3x2k2pCbqnOucXwm/T2eqmpEfT2egkGr8fv75j1NotzvJhbIpH9BALfIxQaYXJSEgqNEAgUb5oSRXGYtwiL8SNWCrb8Kbz6IF+xkotfJYcUkOFLcTLVtpEoU9ZKlFM71Bo+lWaGaGFQ66UyiBYx0U9wvBeBA8AjIWjzwZtR+OsAfDeSNaICqYkDozT5ylCIhrVrYWgIsWED7N1LY1cX1+jzABlRikzVNI/nmEIpBblU+dhFYcSttyJ37ULs2VOQedfOz2IXZUnxsrwHM6u6GTp/G903e/Tdn88Tfb/NyeX/zlDdIK83NSV/O0aUJab/lYaAnn1UZT7I5k+bTfSlnKqV8qEcDMuKzMyLh+XKK6+U27f/sKLysgqN2YWPMwmW2YoVhxSQceVrLVU2m2rbSMz7ky3908IxmkeHkzMom+f7OYJmwrQ0f8tXqGBZBnBHfz+NWfwcuXpFit
02Pxu5jGuLk19nclJLDxVhrNbYhlnmLsfkZVkGXAinnzzMjX/YluZj2P7cKX6/4VX2Hf8wMwc8WmTtZTShOoDelwWSM1Vm8rKAvYiZn9mbMvlNNm16Jq3aR7vIVJGH2TG3HpYPXrlI/nBfXUHbOF+MzemYszFPVULHSmK0rWSjWiUwu/Lm2YqVTOZas1hZmrtXpY2Eqbal+VhK2/xkRMUkWAyfirnpm1mwWHqpGJMSmoUKJE9dkPvpqyGHtIpddY61MVyx2+bnQi7jyhSFydVgmy92Zc6xKbQZnd8DT2urbaXIKk+j9ptoPsbR4Uu139f5aGKl3rSxCiST36SUpb7qWK2YDQumcdxCbPxTbuR/ACtUrDiVLNukgKxVP/VoHhXL3D/mqIq1+scx/TNIUqAcJ1WwOAiVfCMqZmLk3jxtfySSUXjk0za/mJGYbOMqZu8WJ+wMuMb9Bv2vuZnc+OEjdHW1pZ28fz3+HssahvFyksFl48xc4Ek13xpelkRayEq2vizk8HxpyGQ23bHj5pKU+qpjtWK2LBjBohr/lJ7S9SrIJlYy+FXMKSBzX5V2HKMqttEUa/rHHFUZJFWomJq+vfeeNjKrWJmtUDEo1gl9ic+n9TTp79eql6JR+Ku/Squ8metITC5RmFJgiBij+605yuJ5cgt9j2+j+zZP4uT9dN8EA57H8BLX5oRqHmaoyaMJYqO6aBJNNMcyCXQz5eNtyeQ3CQTWl6TaRx2r5waJYAp3gVvJaYqfOWPBCBbV+Kf05FeumE90xbo8i1gRtZmNtZbIil1UpY3D9ukfI+2TZZLCbD4VO6EC+bkXzrhcLH72Waivz6l5mh0T77yDJxiE229P9od57DEm3nknZb3ZTmBYSFQmWxSmGGRqKBdDlw5GlOWFPpY3wHOPb8WzspXxk0eYWfo1xqoO0syV+tSII6mN5Ix2/R404SIzNZIrzyiLk9m0VKW+6litmC3zWtZcTLSTaeoyVTZdXIpTruiUCspFrCy1FyuGOFkFrNXvXwa0xVnV8QYddftZwwG9NDnKGg7QzgBrOMAlvEnb6FE8B/Uy5V+gdaYdRBMuB/W/R4AhiB3JrUQZ0qMqVrHiZH0woh2e225DLFuG+NSnqK2r49qvf50OvbTZjg6/n7v6+9kSj3NXfz8dfj8iHtfEyosvagbXF1+E22/XlpuYzQSGdqXJ1weDGceYCbvxFwNr6zazv8gQmjE9alb1wz4a/rqdqr9ZRMPL7XjGHjOmQmSF7nmqbxpJph2NW0IVmeWRtcTZjJNoLx9KVeqrjtWK2bJgIiyq8U/pyb1c0emadrZipZk0v4o5BdRGMrKit9V38qpcwpv2plpr9Y9D+qcYhtps19C2Jb/d3Xi2beP6YBBIT9M4pXQWX3SRrRCpveiilEWzmcBwtlEZO+bDHGx8b0aUZekkWrRkGK1l/zC4huG8xhGWMZyIsHjrRhirb0pG9ozUUA36l2t2y2DeSw6jmp8oixPFLvWttEkRK5kZqojNc5+fYrNgBEul1v5XGtkPYMUSKxnMtYZY0VNA3VfPsPXD47Q2ezhyaowHlkzxb1UDifl/2hhI8aq0M6CZaofRxIphqrWmgbKkfwr1qTjhFO2gvZ3q3/otW0HgJB7O7tzJohyEyGw8M7OJyjhRTPGTCXP3WztZwRTa9/5e8n4tMWMqRLycpI4JaIpDjSktZET9YmbBYddnt7wEyVxSiZMiKsqLBZMSAu1k2tn5EC7XVjo7H1L/AGWLVdTkIFaWkdpldAWwShMr264Yom3jjVS53bRtuIkHh85w20ycNg6zhgO0cTilU23T4TF9BmU/fKAfNsfhv/XDRf6U9A9ZOtSa0zy5RFVyPU2diurt/s1oMXNHQeAkHqqWLGHa3D133TpNiOiRGoP9kQjfCwQYCYWQk5OMhEJZ5/5xGuepaDTv9E4xxU+ujJIUnWDqfGtMcKiLV+/USdxMJ1JDXkaock9p0bx6tN+jkRbK2vnWTPmnhYqJ2WxrZCe7uwWjo+U7KaKivFhQgkUx3+QSXbFWBOUoVgy/ShtJv0obbP3oOJ5N3SkejdruW+kZX5XSUj/NqxL3Q3sQNvVCTQ38YS/cEoQP+rWoyjAMDCWFijmqYvWpmD0SufpUMrEnGEwTGYTD2uSKDmkaR/Hw2msZhUiH38+fvPUWW06d4uannqK6ro5nNm3ioc5OW7FiFiKLGxuJb9+eJoYO7t6dt7clk/gpBpkEZSJqZkRXxkhGWsbBM/77fPz04/w/M4/wwOkN/O5MLU3Nw1pUpQbtJ2ykhYDkb9ypCWI2FlYY30CZbeeWGaqYoK6gW7mhBIuiKPj9HfT330E8fg/9/Xfg96/Rn8lWEQSpV6QZxIoRXWknIVxamzy2V+ZeTwsdaPMCtTPAymPDuA6iRVYOApcG4HM9FjNqDzN/FEhEVczpnyFSoyqlECoGRrRjIhxGTk3Btm3aHDu//rVtdATsRY6x7v5IhIc6O9nqcqUIkQ6/n888+CCemRnETTch3G7qenr4zAMP2IoLq8nWc9tt4HZr4zSJodVXX51M7+ifbXVPD+szpJfsxh/fvp3FjY1FN+E6MRpHq/QxxMoU4PHDVJDqG/8I4a7hvBs/x58MtbBpZib5mzREiyFggNxFx7kTZVFmW0WhKMGyQNEExF3E41vo77+roMnMctlXMHgDvb2N+gRqjQSD15pEi4Gdb8V8YNfNtU5ipQ1NrCQmLYzzzvgp2yvzqfE3aWeAS6be0FJAvwCiJCuAVtqnIMRan21UJd/0T6Hsj0R48JJLeObWWxmZnEQ+8UTGNI1V5Mhnn8XV1MTNTz7peLJfHwjgGhtLqyJybd5sKy5SfCbGut3dTE9MJMQQwJKOjrzTO9aU1EQ4DG63VilVU4N31y5u/va3iypeDEF6muR3lpIWOg6sDsDG1Pdc072JPxuvh/p4aqVQPZqPRZj34jTD+LlHJU6KqCgvFozpVpFkrjtJarnpxZZGUIsJhX6DSOQ7+lpOJltjmUms1GJftmxUArWhlSy3DvDwzDSBvh3UdG9KmEXP9j3KhGcrbaNHtSZwB0lvr39IT0FYzKhnotFZm2pLYaXMt1eJCxCf+hS0tCDuuw8uvhjvkiXc+O1v8+lgkNqLLkr0S0kIiBzFRTafiRGBEYcO2X622dI75vd6V38/dYY48vth06bERJCzrSBy6nxrfh5ITQstt3/PF3iWUjURY6bepUVVzP25Ek3kDNPtbIy2C8+cqwojFIWiIiwLEDtzW09PNYHA+pLszzk3bST1czHZ6o8Fzj1WdLFStWacVa0DtHGY16te5pnlP2H0uceQU5NMP/e3jC//GkuPP5KaAjL6qhzR7/9dEB5NTUGcDYc5EgxmjKoUO/2TL5nMrIkIyAUXwP33w513gtsNjz3GoslJ6np6UjwlsbfegsOHc/aOZPOZJPb/wgvw9NMpn618+mkO7t6d8/tMEUeBAOSZYsoXQ5yevckP/7cf/jUO9/fDu2/Zvud3x99N+ljqSfpYsgZPzp0UkB2qMGLumEEwRXVBt3JDCZYFy
Fyb26LR0w65aWtb50wm26XOaSCjc+1lmlhpaTY61WolyiNVL/AfDZ+lv6qDM9Xvp/Gt7ZpIMVJARp+VYyTnB9oRYezPA8T1FMRUKMSBQIBoJJLRVGulFELFSZRka9SWOMlbT/AbNsDGjWknfOlyEa+vh8ceS/OO5OuTSdn/1VdrvptQCCYnIRRCbNvG6quvzvkzSBFHJa4gMr7XxX4/dfcH4S7diH1HLyyqg6e2p77nvid40nOIaqaTHhbDx5KSFrLOLeSUFsosYuYyvatQlDNKsCxA5trcFgy+RDh8xpKbPkMw+DLpqSDIS6wY9y/T5gO6tPlN2jjMpbxJB1F87OcS3kj4VRJVQEZkxUgFmbvV6uXKbz8d4ZXOTv7T5eKnnZ28GYk4RlVW+P3c0d/PPfE4d/T30+73l0ysOIkSOw9JdU8P137969zV3w9CaKXPa9emnuAdTvh1K1fyT/fcw3hVFfLZZ5FTU0yEw/zTl7+c0SfjVHGUEBk+nxbh6ezUfhCdnXD//XkJjBRxVMQKIut3Zn58YSBAleXz5fe7oUbAd0PIqUkmn/s//Gj5P/Orqle1fiz18WR5s9E8Lg1zFDEfahPp3d5er+4P8xIMXq9Ei+KcRAmWBcjcmttqiUQOEAj8gFBoVG/hfZpA4CUikUOWdQ2TraV82TqJodWzYhIryznOpbxJG1pKqJ0BOogmS5YNoXLEctObwcWGk31VzNU+pwGvv5tP9R/GHz/L1f2Hafd3Mwqs8fu5NhikURcRjQW2oDewi6Q4iRJjvp404dHSQl11tSZw3G4tDfTuu1pVkUGGE/7+SIRvXHQRW5csYeuiRTx4ySUZfSHmiqM9wSDrA4HE+A/u3q2JjDzSTJn2kxBHa9cin3oqay8ZJ6yf8xqb720UqHZq2LdkJezv5OyYi/6G3+Otqp9QzVSyH4sRYTF8LLX64yKYa4uR3i1VhKaUkR8VVSocSRUx6gq6lRtKsCxASjEHSMsQA4EAACAASURBVLYDSCRygM7OR3C5wnR2fkcXK3bRFTN6yLyGVLGyjJzFyiW8qc2wbI6oGH4VIwWkR1nsypWNvipefzftwW1097ZRU1PFxt423hfcxvv83VwVCLC4yB4Kp0jKEmt0BBLpD1sPyX33IbpT+9CwcSPyj/84eYLftQtp8ZTkc8LPdfyX33ADrz7/POPV1ci+voL3lxBHixfzzJ/9WVpkB8janM5unNcEg6zx+xkFLvb7uam/n43xOHJkJFXogfZ5H4vCuNam3800tUxQR4xqpqhriCWjK5j+JsiU7skuaApN75YqQlPIdrMdS1RUSeGEEiwLlGKa25wPIB/M4dXWEmYjuqJXBVmFitFnJUexkuivEiVdrAwDH9NMlHIwTuPP+1msn6isxtpLA1v5bI8n5bz/uR4PVwS20lACD4VTJGXmlH2ZtlHZY/WQyIsvto8KNDQkT/AbNrDvu9/Nq4ttXuO/5RYIhai+6CI+dOut/OBLX+KZe+4p6v6svWSAnJrTOX3OVwUCrPH7uSoYpF7fRtUttyDvvBPuuy8Zmtwehp8Gtf4s47CMYeqI4WZKuwatm7BvIOfJ9G5yN946+8NyS++WyoA/2+3mIkbmumhAUTmosmZFVswHEEgeQEKhq4hEDpjWdOpo22B63pIK8pDabt/UZ6VqpWawNYuVDvYnJjFceWxYEyhRtL9GyfIRtJLUj/mRfxpE9PQg9u7F1dXFBeEwp4Ah3a8CmnhZ7mu1Pe9f4GvlVHR/3hMDZsOxlb7Xy7TDnD7GCX99KJSIuFQPDFDnMDbjxG7wA8sYjBSUsS2jyVxe4/f7tQ68+ngX6SXH3wsE0vZfyH6tr1nc2Ej1bbdlnXvI6XNu8PlSI2f6NsTGjcidOyEQQLwRhe8G4EREm/17CtxM4eUktbpo8TLCUH0LGOXNkPwrAGk13kLqhAAG9pMjav6wa2Y9qWupDPiz3a7zsWR94qJKdcQtDjNUlWWlTyGoCIsiK84HEKc5UsypoAabZZbGcPUk5gZKtt+PJ6qBzJGVnMTKe8AQzPxRAGG5ul7U00NbIJBmrP119Ijtleyx6BH2BINpLeidKmlyJVOJsLWB2hmXi5t37NCMtZASafh+IJDTXEFWslUc5Tz+PEuOZ7Nfu9fUVVdDS0vqijZRL6fP+XQ06hg5w+vlZI0Lru2EH+rix5gIcWoCN9O4maKWCX5nxkt/YxXxm6H/C+C/Tt9OkXrDaf6w2ad3S2XAn+12cxEjqiOuwgklWBRZcT6AmK8Us+XqTS337XwrFrPt8tZBWhh0TAOliBWjbNlirhVr7U9ItfpJzVwF9PPgFp4Ij6cYlZ8Ij/OfwS3aSm63Vqo7Oan9dbsphFxa6T+zaRMuSHZ7tTm5z2bSQnBOleTqy0mMP890WUqvmJ//HH74Q6pnZrj261/Pa6yiu1tL3ZhxmIXa7nN+KRjktIOYOWONnI0DLX5Y20/d4mk+dvoJrjt7PY+M/jb/m1U0nRhn82dn6P0sBG8B/43662wrhvKnkPRuqQz4s91uLmJEdcRVOKEEyzxRSS54+wPIGYLBl2zWtkZXbKqCrEKljWQH21WwvONISkSlmeO0MJgqVowGcFaxMq61Vz8+jnbisTk6jkejab1VXo/08WrgTp4MDTA5OcOToQFeDtxJNNKntbDv7obVq7UPYPVqXN3dBTcuO+NypZQTW4VGrqLCaa6gTBQ6O7IhlM6OjORVEbTE59MiI8Eg9Or9Tu68k7rqascoi9NY5cUX5xRZigPyRz9CDg8z/vjjfC8Q4EAkwkvBIGfC6c0D3w0Gk/MKjQEdfrgsCH/Qi3DX4L7xv3P50Ptp+MbfIdxuLrjzRv7u/iFWXDBDz2YI3IplIkQ7MW+ezTkbsw/XlMKAn+92zce6urpq+vriGcVIqcZ8rjGzAKuElIdlHpjr1vmFYt9S+98s/pVcMPlW6kmabE3Cpb7tBMsTIuUYbQwk+qxcMH4NLA7AFT64IAqxIPwikmy5P5wUKzHgSDBIezjMIpMf5Ew4zH/qJ3yrY+DVSB+vRvrSRl3Iyd3OrwGaYbT6ttsS43LpreaLtd9snIpGC/blGMLoegfPjeN+77svmUaCRMTE6j/JNtbYwADTJj/PweefZ30gwM07dmiPd+/m8htuSBnbYv1zjgEH9H39RihEvc9HLBplIBBARCJJiTAFXGeaKNMY78aNWmO8r34VXnwRT083W0PPsfqKBnwX66/1YGr0Uot960Ez9j6WQolE9pfkuJLLdtOPdXVs3x7n8cfHWbmyzrE9f6nGrKhslGCZB3IxnpUbqQcQ6xWf9WrR/NgSXTGnghpINopr0022dYO0cIwVDCa62TYzxKrpq1h06n/CrcmTD9vD8BYJ0WIWK6NALBLhFOALhaj1+RiLRnk5EOCQyXBrkKkJ3MQ77+CJRqG9XetpEgzCr3+d9eRueC+ME6YxB84Zlysnw2gxRIUTe4LBvISGE3ZG4D0ZUlJ7gkFufuophJ0Q6+igw+9Pe63TWL9v2o/dZ33lrl3a/EMZPucDkQjH9Pu1wEqSicvY
uP6LbnXwupiF4969tPo8WnrjMBkayC2s+YGyYXes27zZRSg0xkUXbZ3PoSmKgBDiOuCbwCLgESnlA5bnNwN/DbyjL/qWlPIR/bmzQL++/IiU8kayoATLPHBuuOCtkxvq0RVzKsiIrrSRMNku19M/7QwkIyxTb1Az8Q+aWEk58vXA/wnBNyLpYgW9x0okwmuRSMaJC83PWSMiB3fvxi2l1pDNEEqPPUa8poY999yT8RNISenoY67u6WHxs8/mFDk5uHu3dtJtbNSE0q5dTG/alLeosMNJaIDW2ySfCp58JmjcH4nw6WDQtrJJHDrE9XoEyry9XESR3WctGhtnHaFKmQhxwH6iTMzCsauLocPjhB9vIPBd04YEIM1brsW+Smhhcm4c685NhBCLgIeAa4CjwMtCiOeklNar7u9IKe+22URMSplLb4wEysMyDywsF7w1P99ouW8x2hqpoHq0EuY2NN+KbrJtYZA2BlIiLJ7BGVjicJV7iY/YOMSm0sUKpseYHpuxihVrNcqVn/0srs2bUxuz3X470xMTWU/STikd6uuz+j46/H4uv+EGLUJQUwO9vcg77uDV558vqJ+Jmdn2NikUu8omwmG4916qe3q48VvfSttnNp+O7WedY0t/82/CLCViU/qdfwzC45b+N08/Dbt2JR/39XFmmYfATyDyKtrvOxGIdJr8c+GzsI51lYVEMIW7oFsWPgq8IaU8JKWcBiLATaV8T0qwzAOV7YJ3MgBa+65Y0kNGdMVosGWKtFQtG8fLSMK3YphsWzhG07ExzVg7ZH/ymXktSmwqtdX+bMQKOFSjOFyl1150kcPnkMSppDY2MJC1FNl2LLfemtcEggZ2UwDkOy1AMTD2efOOHQnDsTE5IoEARCJaLxevN2+hZPtZ79pVUEv/BD+MwI8C8GQIpiaZeTbE4PIfM3rP55BTkxx77u/Z5F5C68tVRP7LaSONdPvrOdx/CWfjv8Hh/o/S7V+T3zgqkMo+1imysBLt6GxwVF9m5feEEP8lhNgphLjQtLxGCLFPCPHvQoibc9mhSgnNA/Ym1nTjWeVhjbY4RFfMRtsmEqmgFQwmxEozQzSP6hVBw8B/6Fe5t5k8LI+GmfjzQFHECmS5Sp+FjyST9wIypzgyGW7v6u9nydq1zJw6RZXXmzF1Y+ft+ExfH0xNaZEjfdnNfX3Q0JBTb5N8SKTY1q6FoSHErbfC3r14urqQu3bBb/2WbaqlurfX0YRrh+1nvWkTr+7cyeoc/TUGMUy/5HH9748jMBCBj8PMJ2Fw0WW80nAlP+eD/KLhCvZPdAC12m/cDdaL026/i23BRXh6/hvs3UtbVxfbwjuAGH1pzRftjLe1VKL/ZeEe684ZmoQQ+0yPH5ZSPqzfFzbrS8vjfwL6pJRTQoj/DjwOGFddrVLKQSHExcBuIUS/lPLNTINRgmWeWBgueLtyTXMps6mM2ehouwLNaKs3iatvO5FIBRlm2xX6Y5dRrvw2MBjR7j8cgot9cCDK9H0BTjydbqDN9jiGffWOrclVv0o3TrRmc2q2jq3ZvBeZTpxOhltGR/Hu2gWbNiWqnwwzr9027bwdrrExzZNjqdJh2zbNUDwzo0U79H3O1uSbIpZCIa2M2bzPUCjtsyUc1qIteQqlTJ+1tcOvlTV+P1d//evUeL1QX8+ZgQGOBAKMRiIsBa20eSr9dSlzCtVNMGY12hqzN8dga0Di6bk15f17ejaxNdRHX+QIlShGcmVhHOsqjxmqmCi8g+EJKeWVDs8dBcwRk1Vol5gJpJTDpofbgK+bnhvU/x4SQrwIXAEowaIoFrmkg8zLTGXM55O88mwCLgDa4njrRvAykpIKOo8RLRX0HolyZY6jVQP9bUTrtTKcNNlC6txAF/n9XB4IUO/zcToa5aVgMFHCaogVu+qdV59/nstzvEoHbLcB6YbR2XhO7CIG8qmnEKEQbNiQVhZsV2UEDpGa9nZ7b017uxbx2LYNdu6cdeWQQYpYshvH/ffDli2c3bmTRV6vFs0yUkPr1uUtlGbzWa/x+/n0gw/impyEm26CvXup7uri4u3bGQbYbdreFDAJrnFwN04nFtcZv0JDoKRRS6vPbfuZt/qW5zVehaKMeBlYLYRoR6sC8gMbzSsIIVZIKY/pD29Ea/eJEOI8YEKPvDQBnwT+32w7VB4WRZGwNorTMcqYDe+KUcbcBPVNI4noipcRljFMM0OsmBrURIpxM4TLGAmxYmeyNcTKx4JBGnTjaGNvL9fqs/Ma4sbJq7H66qttu8b+4ItfTDN8ltrvAamN5Wb+8R+huVk7yefRn8XW23H4sK23hmg00ZDNeP+v6r1NMs2I7ESKWHIywL72Gs/dfTfTb72lRWB27oT77kPu2pVIfxXb9GsQA64KBLSI0+23p3yXVZs3c57xXZqjK+PAJNTp0RW3/mQ1SQFDg34ziZcj0dO27/9IdKjo70tReiqp8WepkFLGgbuB76MJkb+XUv5KCPE1IYRRovzHQohfCSFeBf4Y2Kwv9wH79OV7gAdsqovSUIIlR9QPNInfv4b+/juJx3vo7/fj99sZUPV0kNEkzvCvGNGVpjgtdZpQMQy3zRxnGcNaVdDbaDcjwjKl35/UxMpoXNuL2bcCcLl5Qjv95LO4p4frTJUnmfwhuXaNLWVTNyMC5LntNsSyZYhPfYqqWAwxNJQUFjl2l7Wd4fm88+CJJ9KrdIJBbTv797PV5WJPMMgVN92UUjX0mQceyH++IdC2nWEqgoRQjMWQd9yB2LChpJVKBg0+n2PEaZH5uxwn6WfR0eYTSoqWhE/LjD5r85agm/Hw0ynvfzy8gy3Bg8V7M4o5IZcZp8uBueh0K6V8QUr5PinlJVLKoL7sL6SUz+n3vyKlfL+U8nIp5Xop5Wv68p9IKTv15Z1SyvTOmTYowZIDlfIDnQv8/jUEg9fS29ugfxb1BIMfx+9vIW3OIEEyutJEWnRFSwUNmqqCBjWjrSFSjFSQ8Vhvuz8aT0ZXDIz79Q5Cwlx5kmniwVwpxjacsIvesHEjxGLaSX/XLseTvxW7uYZYsgS+/GVtG1NTWgro3nvh179O2c6ng8G0km7X5s18OscqmxSxtHMn7NiB3LXLds4jQyieeu01zdNSwsiVmdPRqGPEKW1OIUhOgmjynGhelonkOjZpob6I5M5AFQOhf2BmcoqB0N9zZ+CY7l9RVBLmZnjGv2dPTzWBwPr5HtqCRwmWHFA/UDB8KoHAVfT0LLZ8Fi4Cgfeb1tVTQsYVp/mmR1cM74r15hoGhtBEihFV0VNBRnTFLhWE/njMQUgQjSZOfLZRh6ee4uDu3Tl/Gk6T6h3cvTutXDhfHPu3XHSR5vHYsAFaW5HPPJPThIfWqNGp116DwUG45BK49VattPiJJzi7c2fKdmrb2uxLutvacnofaWJpwwae+cIXMkavShm5suOlYJB4fT089ljKdzmzfTvvBoPEjMjKFNrvUMeYrdltzhfVx5NixRppAfoiMdo7j7PI9WPaO39FX+RY+kplgoooO6Oa4c0fSrDkQD4/0HL7Ry/2eHy+Bof
PwqjjrE3+9ZCMrhi3mqR3xcvJhHfFywje0bFkNGUSLboyReKq1oiu2GFc775sM6FdIt1hSvu8+vzzWlmt3gdEPPIIl99wQ84Cwy5y8erzz3P5DTcU3HjNKXpDNKoZUjs74ZprOPX223lNeGiQFvno7f3/2Xv36Dbu+8z7MyBFEARJUaKiC21TF0e2oZbNTU6bPXRru10nbWLHVZU3oCzHF0bJNpbW66Z77Ajruq4XatN3d10XtnsSBZESS+LUUWVbjdPGTazsG7U+XTuNvWwEy4ktSpFoSxYtSCJ4E4V5//jNb+Y3gxlcSJAEKTznQMRlMBhAwMwz3+/zPF/Gjh5l/+bNzvUMDnpvx6A4cntlubhR6mBGx3uPRqG3F0ZGyKbT1vqLed1icVjX+f599zEUCFh6odFkktcfeIBzXtsqyYsJ1SlkodH11zJ/ljLwsPwodl9QrSjnx2wJwxPBcXWTulQaqi6hIiC+oC0ecRHOL2ilDTWciu1Jpc7T2dns8VmM4tghy3aQrKwoxEV1BgnScoZWBpzVFVlhKbK6IvGqGcP/u3v3EsjjPFl9443OOTNAHXDL449bw/MKxdK7XSn39PZ6RvGXkicC3g4h9uyB971PHMD37WP8jjs48MADRa/Tvd1QeP7PUDpNeMcOIUhVxhIMmeShGJdUqbDe+65dsHGjcEMdPEiNuf7LP/Yxx0DDcrzuYV3nsK47pl41AUvcC6pEZXSIYNDpFAoER8k2mrtUN3/3xdQMPHSjlH3BbJx1Np0QYXjqZynC8GKxahjeVKNaYSkCxaY1VlrraCq2Jx7/McnkBddnMU48ftxcQhlyqIZoLRK3RaqtIClBRu120OgZZ3Ulg01alOqKOxAOj9uHdZ1/2LyZC6rzxKXzyGk9RKMi22T9+glXR8rVzpAVoIt792KMjmI884zQmYRCIqJ/0yZ++txzAAUrDWuiUe578022XrzI1rNn+S9Hj1oDBgtVPl64/37G6+vFa4+MwPbtjNfX88L990/YJVWoOiIrVxfvuce2bq9fD4kEdcuXs/ZznxNkRr7u0qXUZbPcunv31DmKMjjaQVJ8K1pCY87qSlnglW8kXnGiKGVfUG155IeuHyIWe55EIs3IiEEikSYWe75K5qYB1QpLESg2rbHSfuhTsT26fhgIkUhcSyTSSCo1TCzWh66fR5yTKjtV1crcCCyCRUtE+6eBYVpIs8C8hN/LiuqKSVAYQBwkTmFVV1R4VVfU24d1nTH8qwg5wWyxWFHZJvnC4iY7XdmRCvvuu2jmgToncG3DBn4lmSxYaVgTjXLzX/yFlWhLZyfhHTu4+dFHHcv5bkckwvDRo4zW1dEAnB0Z4cCDD3JI17l1166SydlNjz3G2vXrraC4fNk11vqjUdHOM9+nJsPl5GdqPqaVscqjYnjURRNGxJ/gaDYnzRZwZrE4Hp/ZpNpS9gXFVpQvZVTD8GYG1QpLkdD1Q3R0PEFt7TY6Op7w/LJWWm+zfNvjPLPT9bfo6NhPbe0eOjp+iK6fwnFWKMPiVLFtvbjUMWaRFfViVVZk7oqiFRg2z2iHKVxdUZGvipAjmi2iOuI1HFGtwvgJcYuZXWOte98+tPPn0TZs8A9cM4WvhSocN8RinoMbawcHfSsh7vfY0N3NvPFxnt240fEZluqSWhONsvb224t2AFnrV4mkXRoQ93s8Vk5HUU6jJuP8G2TMmXbbVAwhmRn9Sin7gur8n7mB6bA1TzeqhKWMqLQf+tRsj3sas/uxkB0WJ9tB9cBSZztIFduGRofsyoosv8vbrtwVcOauSBQzK0iFWzR7MZ0ueAAu1AbxEuLmc++osNa9bh2oAxf9BLiDgwUJlq/baOVK5kcirIlG+S9Hj7L17Fm2XrzIfW++ycelLqUACbDI2cMPC13N+DjGvn2+TqsbYjHfQZJeVRlr/X7vIRIpKUBv0pAVvkzepSxEr4Pev4HxI+LjmaIYmaJRyr5gIi2PmTAbVJrBoYqpR7UlVEaUOugrGl1DLGYvG4+XdyjY9AweUyczK2jETvw0KywNTTZ3lwPMGxgS7SBVs6JYSNXqilfuymSgimbXRKOegwrVWPpiNCoTjeK31h2J2CRl6VJoaIAf/hDeegseegj6+xlLJrmQThMu0H7ynUd05AjDgQA3uyLpGzo7MX74w6JIwCFd5/KPfYy1n/+81eLROjv5QDLJ8Zde8h4RUMIgSfn8Wx5/nBqP51xMpwnMmydaRBNswYH4XnnVPM6r94/kPl6bAZqxvssNDFvzhKK/BvGroPsO54gkqMP5sTRh0++pbRmVui8opeUxE2aDUl9zqve1VUwPqoSlzCj2hz5dP/Lp67W6ZgdJ0e0i+3pDw1BOKyjEsC2ylX9dpfdyV1e8UIxzZrIalXyw1p1KiWC4nh4hdFUcOkZPD0NjY7xw//0ABQnWgXicW/72b6k5d06kuR45Ak1NjBsGxtiY9xDEt94qmlSsvvFGu8VjPt/PFXU2lRJDG5NJS48i5yMd+PKXPT8TuQ6v9/n85s1FfQYTwTn8Za+ARWAcYXEKYmvANecQMf8xRJmkNRPCVO0LZsJVVMprVpp7c7qQJcCop9Bq9qJKWGYIc9o66D5lNSssdYxZlZUQQ3bwlkpWpBtD6lfyGDDKfT5aqDrisBu3tYmZN6tWUdfXZzlv3FAFrNl0msD8+Zx9/fUcy3SOnffCBXu+DVgTlS+4yEAha7IxPCxIiSQIe/Zw9NVXWfXxj4Nh5FZTHnoIo6dHTG/OQwLWRKPMb2+HH/xAkKx4XFjHDx5k/jXXcE9vr0OYfCAeFzboXbuEkDgSwTh3jleeeqqgdbzQ+yz0GZQN6nfRo+oiEVni18Wamx34mTAblPKac3pfe4mhSlhmCJXmKCoOhYKvQs6kT6lfaQQax2kwSUrIaguNOfUrEvLAoBwUVDvz1KdWeEMeCG/61rdoqKtD6+pCM1spXu4Ud1ZJzYMPwpYttEQi3Prkk1z+sY/xwr33Op53QywmiEAggDbJ9pNDdAuWw2jVvn2ikgK51ZT+fobGxriQhwTI96WZrSSl5yFaWqdO0bJli8MJ9HwsxvOxmK/DqtDnni/Ft1SC4nZ6/bMyzduNc+OwUL3DRaDVeUJ1jEEjpE55FqlIpYyStnO2YCZcRaW85uzc11bhhblJ+WcBKs1R5A8/cuKezqwU0SVZUfQrgeCoZWdW9SvB0aztCJIXcBCYmTODeqM+HBYVCJcw9aavftWRefLpr33NzgxZv15UTtatg/p6tHXrWLt+vSM3xHI1zZvH2UOHJj2ryNLFyMTY8XGR6tvUJGYHeUTSj+/cyQv33583o8Vz1lF3NzzyCMaWLb5OINW1dSAen/AUaBWlpt56Ob3kNG/I813L5F4P+SwdfwOSnrMlfWKaZzlmwmxQymvOnn1teWGgMUxoUpdKQ1kIi6Zpn9A07bCmab/QNG1i8ZuXGCrNUVQWSP2KhGllphHL8lmntINCDAvxooS8rghu/aL4ZwLyYFfT0pJb829ro6GujobubrRgEO3Tnybw3nvwhS8IwuBhwdVuu4
2Px+OeB9xyzCo6m0oJYhKPiyyX+nrx99QpscB990EgAGYk/VAyyU+fe86TSKjEYH57u2iJqTh4EGPVKqfDSXmsFHt4qf8fpazHi2zN6+7mujIOV9SPQuyfIZE0Jz88LsOWL5btNSoJMxGkVsprzsl97SWKSbeENE2rAZ4A/iNwHHhZ07T9hmFUm4N5MD0OnhmCDIxTbjc0DDkSQYOYseY+QttKhHWwSyRya/4PP2xXXcDKPGH7dnG0ypOn0vDbv221T27dvdvRKlL1GT83ZxUVG0t/IB7n1iefdI4g+NGPxMDDZFIMP5QjCxIJW2fiWr87Dl9G9JPNYqlIOztFVQgKCpMdpMHcpomMMPBbz8eTSd/1+Dm9mkq1QjvSbnOFVvr/Bf1poA94HQ/Ny8wGyZUbMxGkVuxrzul97SWGcmhYPgr8wjCMtwA0TdOBTwPVb0MBzJ20RNMhBM4OkgyNww6Mk9oVwHNHb6HCiItDZHr0qHDxKMJUY9UqT80JK1eK6z6WXu2tt5wak9tuY+2+fYBw4ah6j1IP9PkSaVmxAjZssGzSB0x9idf6P7x3LzXr13uTsb17c4S55bCHFwO/9YRWrPAVQfs5vc57tNoknRjOQKgVZzw/zu+vCJBzuYbk8MMweUW6VUw95s6+tnjMRZdQOVpClwG/VG4fN++rYtZDZR9+Rk/lflVw26gs0ujs68iWkIUR8+I6IFQKHCLT+nrR2hkdhW99C0ZGuLh3L8N9fd4Bb0eOcDGdxrjmGow9e5zChp4eka2i4uBBtOZm1q5fn9PqmMiBfvjoUe/teustjL/5G0e4nd/6AwsWeN5vrFqVE5BXTHheqSm5fvBcz4MPoqXT3Lp7N/e9+aYIxlPaW16ttgvJJD8uIo0YKEykLZF5SW+liiqqKALlICyax305cnhN076gadormqa9UnGnz1WUgJDPdR+YO2559uk4Kx0dmhVfBU+R6Z13wtgYY0ePsn/zZr4fizG+c6eTkOzYwXhjI/s3b2bbvHk8+0d/xMW9e01hQwLGxqC/3/linZ2QTnsKV7PptNCkSAFtby88+GDeOPy6hoYcYS07dwqiFA7zREcHIIYoksl4EonsmTPepOfs2ZzIfsg/EgG89TlGTw/zmptL0rEciMcxenrs9Tz8sLBw/8EfoAWDNHR3E85m0T73OYv0AQ5CNZRMcqG2lt/btYvP9vayaqYjaauoogpflKMldBy4Qrl9OdDvXsgwjK8DXwfQtMvmpr/vkkKT993qmWW9DSkDRgAAIABJREFU86G8LaAKhl/lwVi1iudvu81xQP54MkloxQoYHGQoneaF++6zHrdaNPX1gnBEozlBauzeDV6i3oMHCcyfj7Fpk5gzpOSq/Pw738mx6v78xRf58G23iVbO0qVW9glHjoCmQX8/Z1Mpp/W6rU2QGyWsbiyZ5LXdu1m7e7eVaCttzFoiYbl/SoFc/vd6eqhraEBrbEQ7epTwU09ZpKJYu/NNX/0q4e3bRestnRZuLHfrKpGAjg6rfSZJlPXe7xCRtE2dnVyXTPKvwJl8r698jS0tVhVVVDHlKEeF5WVgtaZpKzVNqwOiwP4yrHdKMBfnT0z/e/JqDynVFjlDyEQgODuJioRvC+PQoZzKwqNXXsm2mhq2zZ/PXy1f7pn4aq1L14Ugd/t20WJ65hlBJnwqHWQy9lBE6TTasIFf+dSnctwyaz/zGdvNpOvQ0WEPebzsMmsoo6N6tGcPfOUrsH07xuio1c554d57YfFiceCX1aFYDB55xLcdVYzduC6bFW22YBDuvhs2bqRu166Shhe+cP/9jAUC8Du/40v0kNvoap/5OYY+6Hr9mvVReKYX/vc4/HEvrIw6SEvIJ/HWH3lzdKuooizIErD8mBO9VBomTVgMwxgHNgPfB1LA04Zh/Gyy650KyIjmLVtaqK/X2LKlhXj8k7OatMzF91RpmMwU5oLreucdxufNIzs8LCoEGzbAH/6hIA+uIA+tsbHoyc3ahg1ifR7E52I67a9b0XVxgDcM53Tm118XluiNG8Vyu3ZBKsXQiRM55OSmxx4raDe+IRbLaXthDn8sRXzr0Mz4ET3ZMnPpZPwqZ2Hl9cPRKPP+JA7/ybSF37kFropDS7V1VEUV042y5LAYhvE9wzCuMgzjSsMwSt+LTxPUiGZ7H1lHLHbDTG/ahDF176kUdm2eMYbzL+UJOXdIihXV+ysExQhJiw0w81rX3z/wANrFi3YU/5498L735VY0SpzczPz5ouXkIlr7N2+2iUiRAtgD8TgX9+2D//W/7EyXTZuoN4coOqo7t99ecOKz7yRps6VVCqRm5tkvfjGHWLJjB/z5nztIpvy/QtPEZ6r+X3V2klFef0EshubWL93eDfOdVZiGOWRRrqKKSsUlFc0/FyOaZ/49TbxsOBxsIJzHGhQKQ/MoMF5aHP9UJFzki4B3R/AXykfxWtetu3c7D+CyoqHabyMRDJeWxHdy84MPwtmz0N4urMctLaKyopAVcM1HKjBAMACiAqRoRGpuv120tFRrthoeJ0PzIhHmZzKW3djPXmycO+dZuXJrdLxi/d1zh4aPHsUIBGj49retEQOA4//KypQJBKC/nwvJJK8q773Gj1gtjAjR+Nxyjc4qVCcw54eBxtgc+4JeUoRlJmZeTDWm5z0VmiFUGobNRBZPhBH6gPdyt2BYuX4Om5jI21OBYg6UDi2EeYCuW76cWx5/HChOQDrc10eDegCPx3MFsBs38trevax2zfkBV/bJgw8Kce769Q5x7r995zsFD/J+AwRviMWcRERCzZqRkFWbpUvF+zC3S+vs5OadOwFvomTs3u05DLEUQlhottA9vb05WTPcdRfGc89x/tgxXonFeFvXrW/8xVSKWq/BQO+lfHXnVUw9LtUJzJc6LinCIiKa1S+5iGiOxWZvRPNseE/Z0SBDNJgThGzGP0oQ6gcFSZGXAfNB9boJSZfOT/H2WiTlmmvg1CmrouF3oHTM7FEO0DUeybWO9Ssk6PuxGDfv3CmGFR48CO+8w8WGBkaTSULLlzuIxAs+2y1JRzadzgl60zZsYHUi4fncYgYIzo9EfMPvOHLEufC+faISNDRkjyMwt6P2zjv5eDLJo1de6djms6kUB778Zc/tKFcyrvU+vEhXOMzTpsVbpeVn4nEWJZOiLSR/YE8l4WxsVhGWuVaNqE5gvjRxSRGWuRjRPNve0xAhRqljlDr7zjC2s0h1GNVDKFjcPCF3lWWibSHH2Xwi4WzL+BworfaGOi/IXF4m1x5/6SWnlVZWCx58kFuffBKamxk+epTxp58mGAxCYyOjfX18v0jbsEo6to6PlyVJVv1MOHcO9u3LsWEbe/ZwUdOovf76nErQ2s2bPdN/QytWlPT68yMRYbnu7RWC4BMn4OJF5i9fzj29vUVPfYbSkm4BMrrO/CDUPZmAqyJwPAVvxGC+DpfPjtmxc7EaMfOt8MqHdAnNJcyOX1wZoeuH6Oh4gtrabXR0PDFrf7AqZst7Un88oi0UYtwtrpXCW6X12lyb24yaKmOo42y+yGRZy/njs
7zW3GwJTh3rNyc4a+vWodXX09DdTXB8HO2v/soKPpPumlKmEpcrSVb9TLREQjiEdu2yxMDGM8/wyne+w9/fd1+OIPmFe+8VYmAfkXApgwuHTpywBzh+7nNihtHdd6MFgyUPTpxI0u3FvTp8ogN+qxb+RwccKa2qk4upamB6Yy6aDSY6gXkuxlpcSrjkCMulitJ+qJNk5aPmRcbtD9YyZlZUxgha+pUhGhgNBgQ5kS0hnH9DCqGRJEWtxOcbHjCRd+FoGRR54JfOn4s+NmLSadFecq/fa4Lzhg2wbp3DXXPTV79a0lTictqwrW1+5BGxvevWCWKWSkFjIy/ce69vsu1QOp2bsrtjB0PptE3cli6Fn/4UfvAD6rJZbvrqV3NeXxsftx1UX/mKfd3HgZQPXi6tF2IxDpfYWgIhGp8NmIvViIlMYK5GQJQfmqZ9QtO0w5qm/ULTtAfyLLde0zRD07S1rvvbNU0b1DTtj4t5vSphuQQwbT9Uv5j9ERgaamCMoNkSCnKGFkapEzt91dos20ONWMQlFBRVFihuutFk4KhOxOOetmCvA/8hXWf/5s0Yu3fn5KfwxBNw6hRrolHn+vNYe9XbDS0tnjbh33v0UR44fZqtFy/ywOnT3PTYY9a2FLJhT+gzUQPotmwR2Sx58ML99zNeXy9cRCMjsH074/X1vHD//XabR1ZOTJt0Q11dDhELLV9uf05lGJzoJlhFkZUCNvtKLr1PtBpRydD1Q8Riz5NIpBkZMUgk0sRiz+etLs/FStNMQtO0GuAJ4HeBNUCXpmk5BxVN05qA/wz8q8dqHgX+odjXrBKWOQxZVdm9+1ay2TqWLp3mH+ogjoGGQ4SsVpB0CgnhLXaFRZIVqWWpt6ss6iFhqqosjurE3r2waxfGvn1FHfgP6Tqv7N2L8eyzzvyUhx5Cu+02bojFnOv3y1VRKzidneAVGNfWRh1Qs349WjBIzfr1rP3MZxykJd88nwl/JiVUbA7pumgXjYxgAOmREf7eHFVwNpUSs3/cFaaurpxqiYPklbndNWEoLUtfx1uFYCLViNmAUlvhc7HSlA9ZAtZ+dqKXAvgo8AvDMN4yDGMM0IFPeyz3CPCXuGaWa5p2K/AWUHTQbJWwzFIUavGoVZVgUGPTJnEyK09ep+yHOuj6a14fOh9izHQJiYsQ3lo6FklWVMcQOKosIAiJF0HJR0xKIS051Yl163j2S18q+sD/wr33CoJRXy+qEXJ5swrgWP8119gVmQ0b4Oc/x/jhD0Vo3NGjMD6OsW8fY6dO5R6kH34Yrasrp5304dtuK+Hd2sinkZlIxUau79ZduwByhiQeiMcxVq0qTSN0/fUiBM7VZppMu2uycIjHKxQTqUbMRczFStMM4zLgl8rt4+Z9FjRN+xBwhWEY33XdHwbuBx4u5QUvKZfQXEExqn9v25846df1Kf6hDgKLEHzavEhrs5xRMUwDaRZwhjQtzWkWLRyEVoT2JYNdaTFvh7BD5FR4WZwnm8tSjM03H/ycKLIKoK5/TTTKTd/6Fg11dWhdXcJV09kp0m6ffBJt3z5qdu5kvKeH2q4u252zapWnAyewYIHvdvllyhSTc1LKZ1Ls+j4ejzuzZ1yfk7q9QydOMG5avIdOnEDzsHtPG8qSxTW9ybi6fuiSIyhuzIYIiArEIk3TXlFuf90cZAygeSxvDTbWNC2AaPnc6bHcw8CjhmEMaprXarxRJSyzEMVkEPiXP+2ScHl+qDK6bRhHLUMyCdkWMoW3YwRJ08KoqWex20KDzipLI7DQfK5SZfGyOLuD5NT7JKYi/dYPpaTHHtJ14cLZt8+eqJxKCd3Hli3wL/9C7Z13MpRMMqhkljSfPYvmcbDPnjnjuU35SEQ5c07yre+WvXut9wzw/VjM93Nyb2/YfOzZjRunjZxMSJViCc2xq4x+2q4qphWzLQJisjDQHLlXE8RpwzDW+jx2HLhCuX050K/cbgJ+FfiRSUqWAvs1TbsF+HVgvaZpfwm0AFlN00YMw3g838ZUCcssRDG9WL8E3EzGIJEo9w/VpAsjCFYgO5Xnleum8HaoQVZXWqzLAK20LDxDeGFW7NxllWUQu8oChEb8qyxepMWN6SItxabHSsy/5hphGVbDyZJJaGoSGpgPfYjQ8uU8Wmv/XG967DHW7tkjXEVqmu3u3dYyaoUim05T88QTnqTEL0xtorktfuuraWnhk2brRn4WF2prmffcc9DYyLCSO+OVSDsZEjUZWE41RQgO5BwMhrx6/iO5d00c1XlFk0W10lRWvAys1jRtJXACiAIb5IOGYZxF1NoB0DTtR8AfG4bxCnCdcv+fAoOFyApUCcusRDFx/H7lzy9+cZp619LarJxtDp0PMdYQzKmupGkhHVxAuHVARPK7qywSGVvLMtHW0HSSlmIPrNmzZ6lxx8V3dwvRbyTiKSqVybkf2bcPbf58GBxkbGiI4y+9BORWVGokCUqlcnQ1hVpYpcJvfaRS1G3Zwg2JBGDO9LnjDotw1ZoVH/AnPRMlUcWi6KqKSVzkd7igQNHI/3AVVcw2GIYxrmnaZuD7QA3wTcMwfqZp2p8BrxiGsb/cr1kV3VYISslJKUb1P3GhXRnsme55hlLHMhC2qiprsh9iw/mHWJd9jmvOP0cgGxXi24UILUureV1NvzVzWUIeVc4mvAW4XtZnr3dYSjBbuRFoafG2N7e0wJEjvqLS4y+9xPiZM2i//dtora0Eu7qsfBZHW8a2homKjURnJ8NHj5Y9t8VrfSSTQvVtkg6v7VPzVModfleZUGn21NPoamjapQWZdDuZSyEYhvE9wzCuMgzjSsMw4uZ9f+JFVgzDuN6srrjv/1PDMP5HMe+pWmEpElM5i6PU6Oxie7HTXv40EK2cEHZlRfb03wEWibbQb9Zfxq2nlnNHV635fpeyu+f/JbN4iPmtO+zQuQFgCXa1Jo8Adxjb6lxIz4Lr8VKnLRczELEU+E4uPn+eobo6Xrj//pJn7PjOzIlEBIkwpxQbgYBvCwvEsMBS36dc5pa9e6lpaRFVnVhMVHauv56zqVTBCkopOqBpQTj3+rBrhz5Gnf2d90SxpKT8SbhTEc8/1+YTVVH5qFZYisBUB69NJNBo+uL4i41nG3ZeHUGcQEodi0lcBk+3cHemnTu7ah3v97auEBczf0ZmYUBUVuoRVZYgdrWlFViMNWNIRvYXY3XOV2kpdLavopRI+WLhVZEwdu/mlW9/m79avtxf+5LnoO9XoSCddmTENFwmXIgyt+XZjRsBuPWpp7j1f/7PCb1PSegC8+djnDolxMN798L11zO+cycH4nHv7XvwQbLpNFvHx7khFuO17363bOF3fnCfQy6PRvn93l4+NT7OB3t7WbTB9X5d1T3RDgrlihvl979CUO7QtGpqbBUzgSphKQJTnZA4pwKN1LNLWRU5jTjzPA0M1nJVuNa7AxJuIx1cwHgrQnsuSYpXa6jVm7TIKksppKUUvUQp5KZYeOWcvLJ3L6tvvDFvi2r46FHftolnW2bHDti8Wdzu6ID+fkeLxUHGXn9dTKku8X061hEK
oW3fLoYmmkm3BMWBPWf7Hn4YY9MmEYRnEqQPfOpTHIjHyxJ+VwxWRaP8RjxOg0nSglu2EH4kDrean73KSeqLXKm7PTpDwtly72OqqbGVjywBxghO6lJpqBKWIjDVhGL2BxoNO/9Khw/mX1llMUlL6pzh+X7fzbzLAK2kmxudOpZWRGtoIbYQV9GzTJa0nC9BLzFVYlA1mfZAPM4HPvWpvNWNNdEodQ0NOSFqsoLhJkFDyaSIyX/nHV+dykQGP7qRQ+geekjMH0qlYPVqas0kW/f2XbznHuF4KiMRLBXXxmLUupN3u7vhv7i2wSQrMldIisiHhhrs77mKsjqFJoZy72Pm1ElWFbMGVcJSBKaaUMzu6GxXv111Q0iyIq+braH4+QDJHsPxfr/dM0Yq/LeWKNdqDUnSspBc0tJaHtLy43icC0WKTqdDDFpMFeeGWEwEyX3lK9b0ZP7u76i5/HJuMHNMVIwNDfHTp5/O22KZyOBHNzwJ3bp1cMUVcPEi7N1rDYJUSZqf8HiqXUEqGv10P1dF7OqKqV/JmTLuRk47yK+yMj2Tm8u9j5n9J1lVzEZURbdFYKoTEisn0KhUh5AaFnfOvq7msYwixLNNiJZQPej1QCTLk/uzXBWu5Wgmw4HwDzkT+BltLBFbEhwm2D5gf0HNLBZHCJfMZ1E3qciMFlWIK4ffXZdI0FQgN2U6xKDFVHGsZcbNNxyPQ3c3mikavnnnTggGrXTcls5OPpBM5tWBOMS/cvBjie8zR0D82GOwfj3cequdMbNnD9HvfQ/9937P/3kw7a6gwVSKJi879hvmNrjGRYjREqLKMkaQofMhp+h2FB9MpC00uVZSufcx1dTYKmYCVcJSBKaDUMy+QKPzOEcQgoPASLdQPeLs9Lx53ayy6O/W8N0VZ2gL9LO46STv522W0GZ2TkfpZxnB5lEWtQ46d/w+qaETJS2Y9x/WdcfUXr/DQ6mhcKViTTQqQt68DpyZDPf09lqiVesAH4vZQwSjUYjFqL38cjh7FnXiZaHgtQPxOLfu3i20KzIDZt8+aG7mYjrN85s3F3yfP3/xRWeg3caN8Ad/4HAyaRs2sOrZZ1kTjTpmC820K+jleJzfTCZFW0iG8SWTaE8o22C2g0aDgfzZK/I76/giTU81xQ/l3MdUzklWFX7IolX0FPGJoEpYisTsIxQzhWEwPH4kssoiLc71MNjYQrp9iDpGGaCVOkYJscRKAeinDZb1s0iKAkbIG3NeCmkxtxQoPWBusrOG/CAFqzVPPJFT3WDHDrQ//ENa+vv5ZDLJa9/9Lh+QB3hZbYlGrUqLIzEXhKW4rY357e1sHR/3tCkf0nUu/9jHWLtvH1pzs2gLJRKMbdxYtENn9Y03CqGtHDOgaZ7VIq2pydKyyNeGqSOCxeAtXacO+HAiQSgS4UIqBX8Wo+7/6NCOoy00HBRkZdQcNzFEiOxo0ENki09oXAVZiCaI6j6xiulGlbBUUQbIw72iDlHbQgPmfecRbaFG+beW08FW6paMcZJhQgzRwDBv0wZAkDFRcF84RHgkW9SW5CMt7i0uJqsFps/X4dCumGSBSERUSu65x0qorevuZrWpQbkhkWB+JiPmCqmVFnBOvASIx9E+/em8WTMv3Hsvx196yc6ZWbeuJOIwPxKBRx4RYluA06cFcXJXi86e9SRP00lQvM49j+o6Z3SdJoRkakUYQVYk5LwrcltCFrwmlueF+g2b2SpMFVVUMqqEZY5h+sOc3DoWiZCzLeShZeE0ZOvDpJtaaGgwKyrWs4c4aepZCMKytn7CZIsaJBeqxyZJHqSlmBaRY31MD2lxaFd0XVxqa4WgVj2Qm3oWeYBfE42Kdsry5f5hcY88AnfdVdRsnokQB5m9gqYJsvXgg2Kbd+8Wk6eVmUfs2QNQkDxNJ6RYuxnxFQ3JSxibpDTay8s4fkf+ymCtI3MIUL6v8htUJSRVTA8M09Y8l1AlLHMIk0uzLKXXeY6CgXKyDK4OQ1S1LKfN+1ytoQaGHVUWCyppkfByapi/zxBiWCLvURRpkZsKM1dt8Z3B098Pvb32JOd9+xxiVCtZ9vHHPbUvRiYDq1ahTdKF4074/fmLL7L6xhuF6+fUKaF9UVpY/If/AL/7uxhLlsCzz6I1NorgOk0TzqESBxsWShgudwKxJ8wcIDUsTlRZPCzNZbQzV1Nlq6iiamueU/ALc3r88VumcX7IOcRhfdjWnEjnhKyyvIt9JnoaOF3L6ZOtpFlAP8vop42TLOFt2uhnGSdZQj9tvB1sI9MWEKFy7Qhrs/zbCizDtkGHgXpYuBCWhGGJh+1Zzh/ysz77Bc1NlYzNK+wtu2sXRjgs0mLr62HLFoxNm/j5iy86nntI19m/ebPnTKBnv/hFzh46NCk7thUIt28fWipFSyTC2s99Ttz2CJnjrrswbr8dNm1CCwbRbr0V4+RJjPnzMSZgYS6UMDwVCcSALRoPK5d6ezKzJC5WLD/YCc8WXDlFFoqrtkSjK6upslVUQbXCMq2Y6rMkvzCnlpYa6uvLMz/ECdUpJHe+yuFcrbKAvfOXGhaJRsgSpp9lsAQaGCbIqCnCHSbImL2sV6UFxIHkPXLvO1VY11JqtUW+y3JXW9zC0+GjRwk1N6PdckuOy2Z1IsELBZ7vFq66XTjG7t3MX7zYch7lq0bcEItRt2uXcP2Y69CkqNenFaU1NTm3+7bbSJt6mlItzPnmJh3S9YKPl4pmRLaPBZmyXC8yWEapyw2NA2dVxXeuUGmCW/VEBOwTkUTihmqVpQpfiOGHBaaIzzJUCcs0YSqGj7khwpxacjoKqZSz4lL+HZ2qY3FhxHV9FPtMtN68vCNuSj1Lwd+YJC31WXs9bqgEJlNY16KiFG0LlJe4WKQjFmN+eztaQ0NJ1Qg//YmbzHDuHFoiAY88UpSGZH4kIto4XqLevXu9hbVuAmJu97MbN5ZsYS6UTTOZBGJ3JS3nmyy1K7LCYiJHcCtbQnIkhQPOb09XdBnbYtfSHlnMsdRJtsYP0aO/5rl91VTZKqoQqLaEpgnTMXvDO81SOF0lJrejK+bQPIyjLWRgkxY5wVnOFnrXvK7MGhrsW0R6SKTd5raFnO2h08saoQ0xELFNubTjjPWXUf6t/i0iv3TcQrH+XstNBo7WRjg84cRZL8hk2bOpFNq6dcLJU2QM/tlUyjeun/nzxRfNNbyRffs8t9trdlIh23ShhOFyJRCryUIh2QqSVUFXBkuaFsvSbIXGgf09H8asMqq/m2G6osvYHl/Gii1dBOqDrNiyge3xq+mKXuW5TdVU2SqqEKhWWKYJ03GW5A5zSqezPPFEjcNg4r2jm8zh1t0W8jmsS7eEKsBtRJAWcDgwBllE/wpyKi3Dru0cosHOaVF1BiqC2JWWsNiOYq3PUHy1BcpTccmxNu/bl5PJYuzezYEvf3nCrzGRasSBeJxbn3xStIHclZTXX4d9+zDMkLmzqRQ/37uXD2zcSN2BA55VlFKdSIWC5UoNnvP6xjd7Pa6m25qXdHABaVoYJcgArYwRtDNY8gptxbdmW+wKwt1RR6U
q3H072xK76NHfcD1nuJoqW8WEYKAJbdUcQpWwTBP82zXlPUtSw5xkG+rAganY0bnJidoWch2yjZC9I5eLjAJvI4SyUnyrYLCxhf5FOEiL26I3av4YR5cNsCQ8QK1KVuQBRiUr72G7iOqFiyiUgeFROJdH26KiWOIyUdLiIBPxuLjs2mVlshjnzvHKU09Nyv0ykRh8K1ROJuEq5InFi0Vey5e+5Niu4y+9VLYguEL6nIkGz3nRa6vCppIVqWFZaAtthy0NS4NtaZaCW0fIoaw4CrRH3udJGNsjSz23sZoqW0UVAlXCMk3Id5Y0VWLcqd7RRaNtxGIRIpEQqdQo8fhxdH0Az8O8TL9VBbiQJ1yrlkFySYu9xgaW0Q+YxKUZWoJnhBjXdHJIkaSjyqIOsStCkAuFRblQPn2LRSaWLhVBcO3tsHkzRksLZw8dsqzEa83ANctaXIKVd6Ix+FaonEoKvvxl39crdxBcofVN5vVCrr+W4FYSFZdDSBXcjlGXR7+S+w04lnqXFR6E8VjqHdeS9nOrqbJVVFElLNMGP/IATKkYd6p2dNHocuLxD9DdXWtudz3J5ErzNS/g3FErQxElXBUV73wjm7QMNcgo9KDj7zANtNEv/gb7aVmZZlH9YK4VVV5O4qy2vOdfbfFzCJXqJlKXk/DLDDkQj3NzTw+1IyMi6M0kFBd37uTnL77IBz71KYtotHR2irk927cXLZ5VH5tI9WO602hnAhaJrUe0KqV+RXEIOYYeDjWI77MkLfLi0x7aGn+D7cmnCHffbv3/ZpLfZmvsp1P8zopDNfOlikpFlbBMI7zIQ2/vPbPSshiLfZDu7lrXdteQSFyBrr+KZ+1BbQ1BkcFatQwOLmKoNQRLsIZ5yb9uAiN1LS3hQdEiKrbakrGJS93vRAl9JUbAnCdzLB5nwOMg7SYuxepbpLBWJR4q0fh4PC4G8Ckfbu2dd/LhvXupWb8+x+JMIiEEtCVYeS8F4uEHt35F3vYV3CqEVxXcysvg6RabpEjHssVQTfG51RY6T49ualkSe0yX0Cm2xn7ioV+ZfkyHm3Ey21YlUsUjmw3Ydvs5giphmWHMvGWxVMGt0KpEImGf7a7LWda+DtHPhojFzNDWNyD+DdAP4FNhsSFzWsaWiPWrxEVoCUQuRhv9jFJHW/PbokVUn3USk3zVFoCbozR+OQ53CzJR19nJqmSSEHBc1xlGHNjUNhEUFuZeHY1yXSxGUySCkU4TeOIJ38yQkE+uSWDBAv/ofeV2Kem1lzL89CuO/BVVv+IS3FoJt4O1zoqK/Os59FCgRz9Gj37YvOVFcadrgpUTlZr5UslEqorpQ9XWPMOYCstiNLqG3t57JpVuG41eTW/v5xkf76a397NEo6tc253x2W7ZwFd3wufMddYQjyuhrV+C+GaI3oDISDmv/D2de8kOhDl9spVTLOGUtDcrqbhv08abvN/8e6VlfR5vB65B2J297M8yIbcd+M8xQVYU/3mgu5slsRjNiIF4+ZJyvdJyr45GuSkep9lMYQ2sXy8C2B77ToxfAAAgAElEQVR7TETuj49DIiEi7vG36GbPnPG835F30tlJNp2mWKyJRrmnt5et4+Pc09s7+WTYCocfPc/JYlFnCMnriuA2TQtnaLH1K1JwK3OGHIJbL8wMISmEyZxAlWO/44fpiIWoovJRrbDMMMptWSzHmUg0ejXx+E10d88z19FEMnmduY63zO1+lWTyNxQNCySTF4nF3iK3shICzhGLzc/NHbsbEk+C/g/YwW6QG7xlnsFmR8KcGg0ytKiBoYYGxui3xI9tvM2omZErrtfRQjq32uKutLyH0CkMAFd6W35rIhGW1Hq7iVR46Vt+MxZjnvuN79oFn/88qPN3du9mTTSaK4p98EGMLVsINDVhuFw67NkD27eL/BMzeTYwv7jqXKHW1KWAkOt6CLMdpKYy1wOtMN4KZ2hhgFZLv3L6ZKudIyTJijU/Sx14KNpBlY6JuhmnugIy85Xo2YfsxYDIB5pDqBKWGUa5nTzlKOnGYtfR3T3PtY55JBLXWoRF14+a2/1BIpEwqdQwsVgKXVd3ytL6LNtIAe+dzlXYbqEBvOFIsxViXEwH0RChnBaRdBENm+MU24L9tCxL+2tbBsz7jqU8U1u1wylxIDPfUWg8/zmySlwavXJP1q0TZMUVXX9DIsETHR2AKYqVgwXXrbPJi5l3kk2nqTlwQKwrFhOVll27OLtuXZ4ts1HuOPtKRzHVleZapbrSiFVZ8dOvZAfCdjsoR2gr9SsqipkfNHPVl4meQE11K8mPSB09WpmVqiqmBlXCUgEop5OnHGcikUiTzzpkupsgILp+1CQu6i6/ybGMSlpSqYt0dtbmnr29jnMfPUBuhUUlLIPAoC3GHVtSxxhBxgiyhJOMETQrLUGGOWlVXVpIs6A5zbJgv3e1ZQB4JQ7fSsIdtuWXHUlIxAi1ClHusFnuD6+PsjoWoyYSYSyV4pfxOMddB/pzQCaVotFNgvKEt4WwRbH39PbSsmWL/dyHHkI7cIB0IiEqMUqFpFh7ssRk4uxnO9wtoCb1vnpytCssdOpX0ojrFlE5jSigyHZQHv2K88te3ADE6cJET6CmugISjx9g586bufNOu6K7YwfU19cRja6p6lguEVQ1LHMMpWlivM85U6nzPuvwC01Rd7peZW/xeDx+hmQymzs64L8jdvLD0HVjliM957l4IMuRr56n6wNZcTBQY/zl5ThkT4Q5dayN/qE2+mnjCCvoN+P8+1lGHyt4kystvUs/bbwZfD8nlrWSWa1Mfl5tXs7p8NMYfCsBIyPwzQTsjcHPdDECoNWM+L87yqJ4nFpTlxLcsoVV8TiXR6M5GpfD8Tjj7uj6c+c89SjnTT2KbE/kIxU3xGLMa2/n4t69RUfcqyhXnP1sgNc3XY5mUG873EFytIPZDhqglTNmZcUaeCi/i1K/4mgHnWM2tYMkdP0QHR1PsHHjszQ3z+NrX7uVixe38uab9/nqUqZ6fICuH0LTRtm+XfwsEwn4ylegq6u2qmPxg6GRHQ1O6lJpqFZY5hjKoYmJx39MMqlqWCCZvEAs9nKRa5Bx/c6hiLouwlcSiQVEIjWkUgax2Di6XgcadH0my/YvnyJ8dxccPMiKzk627+iBkcX0HA4I0jKCaAWBEjonWkRDQbvasthsC6nVlrR5uJFOouFgiJaVaVpaB6mVLaF6YECHZ3XxWu8hyNRCxEFMLvdHMTRXOyXQ3U17IsEFF2E4r+u8BkQSCUKRCJlUiuNPPcWVyaTQtpgf8oVkkh+7qiPnUymaveLwz50TlZeDB6mRlZUiQuNUTDRAbrbDq7piBcep7SDZEgpDurnR0Qpy2JnVoZ55qyvuakrluIPciEbX8OijNzMyUsunPy2/Hg3s3HkzkKtLmY7xAZdd1kB9vRDdStTWVnUslxKqhGWOoRyaGN20WyYS1xGJNJFKDRKLvWzpVwTcE5r95gg5W0O6fhpdlxYKOXpQJOFu+68ZQVbUGSt3dbHt6/vp+VKTMyhjECdxGawlu6iW08BQkxDkpj
ljCXJVbcswDbQyQJozLOEUWuNvUUuMpuVXkDlzjPDJrQSO9ghyoupbVBv0+70rH/PMlo4b53Wd/6Pr1nn2OeD0Sy/xwUSCcCTCYCrFy7EYb7sIx4/jcW5yERtj924xaXmS2pPJBMjNBTRjkxXLzix1TQudF5uoLGCAVtIsgNO1zuGd0i0EOPUrlVtd8cs2icVuYHCwlk2bnFrxO++s9dSlTMf4gOkab1JF8dA07RPAY0AN8A3DMP7C9fh/Au4BLiJ+JV8wDOOQpmkfBb4uFwP+1DCMZwq+nmHkbbZOCTTtMgO+NO2vW4UbxSrI880pzrdck8dyza77mq3rF8eDBOqDOadQ2ZFRaq4NiPJ8E+LMNwi8z7y+CHGgWWpeXzRO46I0DQ1DrnNiWWF527q+JvshLjt1L7d1hawzw549GRZf2ESgzyQtssoitTUngT/phbu3OCsf118PTyQY/miHlZir+kQg9xCmnmP7yTPVDJfzqRRN11yDFgrlfE7GyAjbaqvnIG74ffvchGVFGELtiBbhMiAC/AaMfxB+2vyrHGINr/IhXuEjvDr0IQYPLoJXgCPAYaAP+CWI/8mTyt/z5H4DJl9dmWyQWq6zR1ZFnmfXrlsBzbOiMTJiUFu7raRtLQfybe/s0LD8t58YhrF2ul5N+9W1Bk+/MrmV/Irmu82aptUAbwD/ETgOvAx0GYZxSFmm2TCMc+b1W4AvGYbxCU3TGoAxwzDGNU1bBrwGtBmGkdeHWdWwXLIoze4Wja6it/ezvrksAsVkTnjtuMX1Y6lBT03FsTcy4sx1AKeF1K1reQdx0OirZbBvkUPbIjQtQsfSxwr6WEkfK2nLbOK2rpAj36FrQ5jB+duEvkVqW6TWRea3/MgU56qCnB1JSMaFxmUhLAnDklrxSS/Bpmal5rgc1nW+0dHBo7W1fKOjg/Ovv17R2pNKynbxIivux63/FyVvhSUIgrzQbge9bX6PrHbQO4jv4gC2nRnIT0r8UDpZicc/yZYtLdTXa2zZ0kI8/smSsk/yZZukUmc5csQv9mdmKhqi8vM8iUSakRGDRCI9i8jKnMRHgV8YhvGWYRhjgA58Wl1AkhUTYcyGqWEYQwo5qaeATF2iejpWRUFEo6uIx6/zyGX5Mbr+dp5nSi0LOFtITruz3LFvjTewPdlDuLtLmbHSw9a/COcOSZSrlm4i6SJS7aWLvJ1EaptoaXihp7uhsamd0/MbaTnnskGPmNff1eFfgZ0JuDwCfSnoicG/muJcJepfzijyG66oopgBi15togsVoj2ppGwXP0quVles27Id1IqzJdQK/bRZriDx12wHjSBIs2M6sxrDD8VXV0pDOWzE+Zw9Gzc+y6OP3syOHbXqSCt27hzngQfKp0spFdUhkBWFyzBriiaOA7/uXkjTtHuAPwLqgBuV+38d+CawHLi9UHUFqhWWKopALHatlctin4nNIxa71mNp985Y7d97nUXaO/EefYhNsQb6Es+RHRmlL7GfTbHF9OwKiMUkETmNnYgrLaXvIn4u7mrL6VwnkXAPiQrL0Yx3Yu/pzCmOsIK+5ss5vbKR8dXkpuWe0+FvO+DPauGvOuAN3XKVsBCnq6iMFZe3dZ0XYjHOJRIYIyOcSyR4IRbjiK6XPGih3HBku5hflrrubm6YYTLlbkTK61K7Egoi/r+CiP8ck7xkFgZI00I/yzjJEnH95DL7ezaAXWHxDIvLD5EofXfJ6bDlsBHnc/bo+iHuu+/vCQQyPPecweioQTI5xAMP/H0OYZjKhNtiUQnbMEexSNO0V5TLF5THNI/lcyolhmE8YRjGlcD9wH9T7v9XwzB+BbgW+IqmafXu57pRrbBckijtsBaJNNLWJlLkIxGRT/bnfy5zWdziW8gV4HpVWtRl7Pt69GZ6dBClE/M58meRcb2c1JR4VVukm2gQoW1RclvSTS20NKQZIsS3w0fY2RPhzi473+GpnlEOhZ/mHG3WvJjR5gFagmcItmapXYjQtZxCHOBGsQYo0mi+5nvm/UH7Ma+KizqDqBjIZd/WdZ7W9byDFtXlpwuVku2S7xvurq6EgoJUOiorV4jL28E2k6gIsW0/bSIs7h1sO7Mk0Qb4E5Xc+3ITpYtPhy2HALWQs6eYakYlzPiphG2oSGQpcsBsXpzOo7s5jvilSFwO9OdZlw78jftOwzBSmqZlgF9FqMJ8Ua2wVFEQJ04MO2cAbYF4HE6cKOXX4FVp8ZKcquVz82JgOzAkaVHTRSVpUastqralD0duy2DfIvpPLuMUS+gJGDy++Chf3z/MyKjBN/ef58XF/8j/DvRzhBWiymJmubwZfD8nm1vJtAVEheUKhDBTnU0ktS5SuCkrLp+Jwj/1Qv84oZd7WXh3lCVh8+we/4oLFFd18YN7eS+UU3NSCdkuflJwd+6KVV2RLb9WbO2KK4rfqrAMtTi/WznVFedk5nxWZjVRutT5OIJsjLkyjcaIx4tv15RDE1IJM34qYRsuUbwMrNY0baWmaXVAFNivLqBp2mrl5ieBn5v3r9Q0rda8vhy4GrGnzotqhaWKghgfJ2cG0F13idA3gWKqLG449Su5mhb19rBzyvPrEH8M9H/EPljUk1ttOY9wEklis8j+q84k+nbDEPubfi6cRE3SUSTElS2krfyWBaTFfcE0rcsGaFl4hvB7ZmLuYvwrLmui8HtxuFNJz/1mktA8CH1HL6riIqdDy0/WT+ciH3fDr+pSbs3JTGe75CMrEtIZ5Jgb5LIxcwWcbG51DNfsp43BvkU2WXkbV3XFr57lfb9/onThtk65bMST1YRUwoyfStiGSxGmw2cz8H2ErfmbhmH8TNO0PwNeMQxjP7BZ07TfAS4AZ4A7zKd3Ag9omnYBUQv6kmEYpwu9ZpWwVFEQy5eHPHcIy5cXajnmaw1BYSGuuB2NNhKPG3R3a0rpGhgF/XsFNkGGzeVpEw02jlvDFAVJabFs0Gp+iyQwJROXT8Xgcy7Gd3c3fC0BL+o5rSJ1VlEz/hJNlbjITxPIEem6oZKdcs8Tmslsl0Ime6/clVAr4v+sEbu60i6qK9JZJqsrlnalD/G9kk4132Tb/EFxk23rVIIAtRKyUSphGyoSWXLNCmWGYRjfA77nuu9PlOv3+jzvKeCpUl+vSlguOZQuyxRR/c0eOwT11+BVZYGJkxYbYsqz5nJEiHhu/TvKgvmqLUHsfq5KXkaAelvf0tA0zFBDA0OEPCssMnhOJOe2FEdc2rx1HVwZsQbrqRoXOauoGXs6dCHiAhOrukyF5kTOQVKxJhrlhljMJjElpvIWQj6y4pW5EkIIoa1WkKystIm/XtWV7EDYbi9Ki72VbKu0MB3wcwYNT0s67FSjEt5DJWxDFdODKmGpoiAmH9U/EdJi3+875TmCTUIyiIOPRIE2UdfKLNs+mqF9aZhjpzNsfSdMz4kwg/VhywZdx5hlfxZ0xdUaIp2fuEhx7oD3BGiOpYTWRYbSeRCXUBArhK7YT9pNXMC/6uIX/V9OzclUW50LkRX3YyGEY8vRCmpHEJfFkGkL5FRX0kMttgvtXeVvTnUFv
KsruZiOdNipRiW8h0rYhnyQAX933vnMR2Z6W2Y7qoRlFmOySZfFwo7q/00ikUafqH7wr7J4oXjS4jvlOZUFI5CrhC+wGV0rsmz/yCnCG5WZRU/1wInF9AwGLH1LIDjK2JI60ixQ2kMerSE/4jJ6RkyFNuKwOwm3KRqWp5LwUkwcKGXkfwZBqgbEdoYy4j61XXSuSGdRoXaRXMYv0+WfY7GSHUx+KHfbScLrv9iLrKjVFZm5Egpj2c6tCospnn4z+H76WJmrXelDkBZHdcVdWSkkLlfvz9/Wma7f92RRCa2pStgGL6gOpp/8ZJpfvDwuoYpClbDMUkzMyjfxlA5dP1wgJE6i2NYQFCYtAM3E4ydJJpfR3R1QKjxicCJcBMN8jnQSqdIajzbRtt/JEL7DNbPo9i627d5Pzw+aTGtyLdnG2hziIqss12av4o7MChaFF5PO9PP/hQ/QHzjgQ1yepn4cavYlYH4ETqfgSEzkuLRh611kpUXeLtISna9dJD9Jv6rL27rOC8B1iYQV/f/jWIzDJpEoh0W63G0nv29xIbIidSsLlSnMjlbQFXB6WSMnWWxN+ra0K2rGjzo7KEe7AuUIhqtadYtHJRM7d8BfFZNDlbDMUpQj6VIiGr2aWEwOOjxPPP5jq6riRCHnj0S5SIt4jjwJTyTeRyQyj1TqIrHYCLo+5lyVERKZLQXaRO3Lw54H0PYlYXEwkq4iH+LyW9nLueeUQahrAxw8yILOTj7R8xTPLA7xZuBfaDFpjUOcG/wnWmqfJjiYpTaD+OWtxtEKyrleBHFRdS5e8HMXyU8a7EwXuYwf3P+jxRKYs6kULWVoOxXKVnEv5yYrS2RAnLsVZF7G2+EIK+injT5Wmi2hNrInTO3KEYQzSJIWYyKptsV9auX8fc9lVDqx83IwVTFxXLI5LLM9GbFcVj4ZXrVlS7M5k6SZePwmotGry7i1KrwOiefJLaU75w/p+jk6Ot6ktvZ1OjqOo+unyT04mJktZm5L9NPQ+yMYPwK9eyF6o7j/2JsZ75lFv8yIM+k+cjNc+mqtxNz/eq6BUNdtjjTXYNft/Ham08pteZs23xwXKzlXZrbI66ux8128slw80nMLJeguJjdF153p4pXrUgwtDXlcvHAgHmcs6Zy7NJZMciAeL+k1vODeVj+y4giIk60gqVsxP/O+5svNWVMrrQrL8WMrzPlU2Km272J+5dy5KxKTIytQteoWi0rPYPFKE542SJfQZC4VhkuywlLprLwYlG7l897lq+FVYMfuJxLXTVGVJd86imsR2Tt+d2aLjehnQ8Tjwk1ktZG+CYzB1r8Os/2bPYTvVmYW7ehh69+ZFRbpKJJToGVLybRCX9HZ5FmhaQ0LzYOqcfHTvKSbh2hpTtPSOkjtALZAtxVxUJR/1fRc2TYaVR7Du+oibdGF4Ocwkp+017L54PU/fkTX+SfstpO0Ok9mlIB720Ku+91kZUnYRVbURNtr4PTKRgfB7GMl/UNt0Fdr25jVsDhfG3NpWSx+qFp1i0OlEzvVwVRtC00ekyIsmqZ9BvhTRN7nRw3DmOQs6+nBXCi3lsvK5x9e1eT9hJJQbtLifp5/+FwslqW7O+D8P74bEo9Dx/UBYDHbntxP+1Vhjr2ZYevOMD3/HBAvrabnyuA5hbgcuybDCo8WR38mzZGmFaYV2u0qct622kbNaUFczg1SK0mKbAu5Q+jew9kmGjW3tQhbtIph1yesfrL5NCvq/9bSaBcfim1jaaSdd1LH+Gl8K2/oPR5rFTis65YuZrIolaw48lZaEVUrORNqtXAFuVtBfaxg8PVFThuzvJ7XxiwxOR1L1apbHCqd2KkOpjvvnNltmQvQDKOoqc7eT9a0CKLw9DXgj4slLJp2mQFfmvDrThbj41upr9cYV3bmtbUwMmJQW7ttxrarVBQvNvM/j+3t/TxbtjgzVq6/HhKJc3R0fMPnWcVUWIp7ff/1eR1SizWwNjM+vsDn/xhq5arD5tMazdv15vWg+fLy/kXKY/XQ9WtZtl9zivDtSoVmTw9fqJ/PDxa8Sx1jtJjkpIFh1/W0dXsBaUIMOUhMaHRIWKLlnKRTCKL0Hra2RVZa1JlFagUmg6XjGZYiXQR58RqAIKE25bwOt3LZldEuro5v5/busHUw3dszTGjoHULL24VwNx6fMoIi4UVU5P2OykoYu/0jyUo78EEYXw2Hmq/iMFdziDX8hI9wmKt549CvwevYl39HzKUdAIz3EJ/QSbyrKxNvBamoZDFppSC3Wi6JXWkjBqYH/+0neebylB3a+9ca/OUkawh/oE3rNhfCpCoshmGkADTNa2hj5aLSWXmxkD9IuVOTfdtSfqj+GSs/zvOsYttCEoXszn5iXCit2gKy4pJKzaezs8bj/5j8Nmiv/Baw20T10PN/AjCymG2799O+JMyxgQxbT4fpGQnA+TZPO7S8ribppjlDkDFaGbCqL8HgmJXlEhxVBi3K6ou8LecqebWLlNt+mS7FOozUT1p+RL8S20ZXd9j6bJctzRIeOUt9991w8CDNnZ3clEwSAt4qQsyrotC3yq23kfCsqrg1KypZMfVCfc2XWxqjQ0Q4wgp+cfJKW8t0BGeVxSG09Uq0LQ9ZgeKtupcysan0DJYZxUUqUocyGVySGpa5Um4tTouTv7phZ6zYLqFYzM8lNBlMhLSAd4sIZV1qO8h+nXj8PZLJVg8r9AUw6sSibht0IeKiJOb2/EuAnv/bZFZemhwaFy87tE1W0gzQSisDCnERf8V9ZxRn0QDB5lF/nYvaLhpFnPCr7SKXu4gRUW3wcxj5kRfVaQSwONLuaCNui2Wov8tpE5/X3c21iYRFWEqtyanIV1fzqqoURVaugBPLWi3dyhFTt3KKJbYrKIVTgG0MY1dVzuHtCppeVLoebzrIVKVmsFRRfhRsCWma9gPErtiNmGEYz5nL/IgCLSFN074AfEHcmv8R+K8T2+IyYS6clfT23sOWLS0e7Zw0HR1PmPeUJmss3uIMEzsMTaQ9BKW3iCAaXUQstoBIpIZUKks8Poyu1+Q+X8MmLWHloXytInldaRU5/i6Sf8cJBEdpaBqmpSFNg9kCaiFNkFGTpIjr8v4QQ1aLSLSSzOecGxSW6GLbRaOu+8HRMpJaF6+WEfi3jT7ae4TolhXW9+7ieJZAfRB3/80YGWFPbW1Jh/JircteRAVcOStq7P5qnGRlZSuHiHCINdalj5UcP/R+MeD+VWx30C8RhIX3EITFT2hbvupKsShuHzAzmF3tmunANLeEVq41eHiSLaE7KqslNCkNi7WSIgiLc/mZ1bDMFRTW4pROVuJxr/bQCz6kZaLnzcVsV/mIi3cjIeT8K7uaU0Vc6iHQmskhLnWMssCsrsjrLS5ti/P6sEjQfS+bS1akMFe2i9Sqi0e7CMirdwHvQ3FjtIuW+HY2mhqWX6bOs3TTLbiPmoOJBM91dDAR5FM1ubUrOVWVMIKkSLLirqysbDVbQGss7coh1giy8jqCsBynLLqVqT4xmoweb6q3rZLJ1MxgmgnL8rUGsUkSli9WFmG5JFtC
cwXl1uJMncXZjWIi/IttEcn1QaE2kXO9rrNfNS230OaqLiK1VfSueVu1Q8sAukWQHbFnFTU0DZNusFtCjqA5D2eRen0g2GpF/wdbs/7tItVp5K66uBxGfnoX2TZyxPTrPWSA7yS2sTDSzuCJIbI7dxK4805LhJzdvYdTe/dOqhXk/l/OS1TkbKDF5nuWOStXmNdXi8uJZU6y8iZX2nkrUmD7Lj66FTdZ8YNNVqa6XTPRfcB0bFulW46rmH2YrK3594EEwvz5vKZprxqG8fGybFkVBVFuLc7UWpzdmCxpAW/i4hblgnd2i/t+E8UQl0b8xbnqbbXKIolLo01cBhvHGVrUQEPDEGNmpL9KToSrqMXRMmplgCCjls6lITiUa4uWk6JHyBXpemldWskR6oI/eQFA7+FdvYd3zZstj/0179v3LIHmJn6ZyvDUN8LctfFuVr30Eu/msTvnQ766WA5RkVWVRuyqSpty3UVWpND2MFfzxrE18HqtU2Tbh4+FudCsIPv7NB3xCRPdB5R727yqNXPF3FBF5WCyLqFngGfKtC1VlIj8CvnSI7lSqfN0djZ77GDO+z5n4lUWKJ604PMaYrui0SuJxVYRiQRJpUaJx99C190mXW83kb3uCVZcVOISNO8bxCHQtchKPU7yUl/L4OAiBhvHOR1sZdGSgTwC3bQ9o0iJ/rfaRc1pQs3DTneRbBG14mwfycfc5AVyXUatWAMYQQh2wSYwIEhM/Y038x/XzTe/O4K9/fBAmL9LbGOwBMKSt5kno/XJQ1SC2BZmMzF4fLVwAx1hBW/yfvpYYVVYfnHySkFW/h3bGXQCQVaGwalbkd8pv1aQ8zs0HRWGibpkyrltftWa7373NZLJD8x6c8OshUy6nUOotoRmObwV8hPLD52YxRmmnrT4v0Y02kY8fhXd3bXmNteTTF4NHEbXTynr9moTwZQRFw9nkSM1V2pbRoB6p7Mo3dRCS0M6h5y0MqAQleGcdlGQMe+qiyQranqubA3JCozaMnKTF6nncVVfwK7ABF3OIRAHwJZIO0s8PjaJvCJbc++UQ1LASVSC5vUl2NoVM24/szrA20ExJkEKa2Vl5RcnryT7WtiewtyLICtW9L5bZIvHdX9MV4VhIi6Zcm6bf7VmNbHY81XLcRVlQ5WwVGFh+izObpRCWkAlLrFYhO7uWtfOsoZEYpVJWIrRt8jHy0dcum7Jsu22DO2rwhw7mmHrP4TpeTWQS1xUga5pic6nc2kwiYpXu2hN9oO8P/NJQuGVjGV+wenGv+Zi89/mal1Uh1EGbyIjyYtajQkqz1fGAgCcP3yMzs4VjgPgQw9mIX2W5ePjZFMphv88ztjTel5qK8mJddtNUsLmdkiiojqBZFXF1KycXtZopdZK27JsB/3i5JVkD5tk5XUEWZFzggagHI6gSo5PKOe25avWVC3HVZQTVcIy5zDR6SwCun54ggRlMlUWyCUW/ohGFxCLfZhIJEQmA21tzsfFztJ15Murb3G//uSIS9fvZtl+xylrVtGKzk627+iBkcX0vBbwrrioraM8Opc6qV9R2kV1jHJDto2Pnfp15nXdAQcPEuzs5H09O3hj8QIGgt+xqi5Wkq7aMnK3iaRzyIvASHeRrL6YBCa8eys939xO193COfTQg1m2bjpFYL34DGo6O2n8ZlK8v2d9EnDrleth1183SQkiKipSYKsQlvF2ONks9ConWcKbXOkgLGKgYa0lso22QuwJiFwpggXj8UF0ffJzgnT9EB/72OXs3fthFiwIcOZMlt27X5vANPXyu3nKGbhW1apUKLJ476dmMS9RSWAAACAASURBVKqEpYoyYrKkBQpVW6LR5cTjv6G0gGDHDshmQSbBi52l10HEixSVn7hs684IsqKUfcJ3dbHt6/vp+cem/K0iSVzy6FzqGMtpF/12ppN5XRscr1nTdRdX7E/yYtPLltZFJukGGbXJiyQnkry4yYr7NtgEBnFf4N97WKzD/u3bCK9sh7NnCfyB8zPg7m74WgJ+bP5HSTLihuSajcpttbIi20H12ERlMdBmV1VOspg3eT8nWSIqKlwpclaOrXBoVqKtEL8Nuu9ShmQmG4BxdH1ySbbR6Bo+9akPsH59jbnuGpLJD/DSS8eLJgZT6eYpV/WjkitJVcwtlCWHpeQXreawTBFCJQa/TRUmS1rAj7T09n6aLVsac7Idtm+HSETuLMeJxV5D1/vNJfxcTl6vUUyGi3q/K8cFuHjBO0gtOzJKzbUB73lFfnkui8jJd5F5Lg0NdpbLa9mPEAh6hLeNjvDXgS9ZIwBk+0iG04UYFvfJUDqVnJzCm6z4ERiwScxT41Bfn7M9jIzAJ33Ok9SimLuyIgmL2gKSNuaFYoDh+Xn/iQWZ/0xd+P2cz/yS58Kv8t3AWfpYSR8rOHWo3c5XSQED0PsXsOWLOREyJBIjdHT8E7nxef4iWzcmk0Miqypr1sznrbc0HnzQJuSVmGUyF4I4px7TnMNy2VqDeyaZwxKr5rBUMSUIeQS/NZNM3gQwA6SlOESjq4jFriUSaSSVGiQefxldfwu/FlEkEvbsl69aZTAyIiorsVhKISuQ3wbtfo1JVlyAYynvac7H3sjAiLkN7iwXt7PI3SaShMXVLpJVl/5Mmss9XvO9zDv0Na20rNCy0tLAMG+r5MV0GTkqL+42kZvQoNyH6/4TKcEe3X2CYykRk+8FSd5UsqK6gTw0K+Nh0f4Zyd7F8lObqe2605pn9JmeXfz74mG+N9LCYN8iO2dFWpcHRBvIW38RZDJkBSbuxPFOiBWP6XplZplUtSpVTAeqhGUOIR6/ocTgt6lCca2haHQV8fh1CsFqIpm8DsAkLeBuEaVSGTo7G3OOg4cOZejoeM68x++1J0tcvIS77gNXiK3xMNuTPYS7lWnOyR62/kXY2e4p1Vnk0S6S7qL7my6wvWcPDV0brNcc7tnN9vAJ3uRKy2EkJ0bLSosq5A0yRoihnLZRcDSbW30xCUx2aReZtm2EF7WTefcY4UNbCRzpgZ/F4akk3N5t91q+nYR/iwnbMXi3hOrxJyxmK2i8FUaDAdLBBQzQykkW87HMHwqyonz567s28qVnnuUv35pv25YlWTHdQCkfXpVKydLRxMgKTFzbIV03S5fCT38qKodHjsBXvyoIy3ToQ6oVkzmAqq25isqEqK6sWBGaxuC3QihMWmKxa30I1rUKYQGVUMTjr5JMOjUsogX0agmvPVHi4jbkej/WowO0sC2xn/ZImGOpDFtjYXqeC4gFwnhbohvJn6DrFf9vDlzc01iL8b4g2/Y9R/v8Rk5kzvLV8Aj/FAjSwhKLmMj5RXIMgNUSMistchlZjQkGx2gIDjmqL5LAZANdnLq4na5o2Py/WEHP7u0sXgCBd3oEKdAT8L4InErBGzE4r4vUWRUqcfFyBJmkJbMwwHCwwXJMpc1M4H6W8Tvh5Z6lkiuam0TUfh+CsLyN2K4+8X8Qj4+RTNa6hmTK79PEyQpMXNsRicynrQ0eeQS6Fb7X0wMPPwwbN06tPqTSBypWcemiqmGZ9RBHvd7
ez1Nf38ymTV79+HN0dHxjRrYuH3EYH+/OMwcl6fOsENHocmKxDxKJhEmlMsTjr6LrR0t+fRtTqXFRruebVwQ2CZHXpa7FfdtvblEerUsdY7SYdug6Rs1Ki31bJS8hhhyaF3lb/m1giBDDhM//G79/S3vO923/vj6a0iudjqJCkPoVxcY8HrYrKcMmnZLheW7C8sj5z7L4lq6cL3/frv2s/J9NIgiuD/vvMOY/J4lG5xGLtZrBg8PE4/+Grr9urmRiZEViIpWK3t57qK9v8fwt7917kc2b908pcajOAJoqTLOGZdlag7smqWH588rSsFQJy6yGfVAcH7+Pz31O8zgrM7jvvu/NsIbFmzT09n6WLVuaPHaM5+no+NsC6yzVvl2IuOSrQlUAcZGb6CYu6vJu8iJnGjXaE6NVoa4feWlwVV0kWWlwEZrO7EsEg4Ecwjk6muW9wHxCo0OOTyg4ms35FEeDAev6cLBB3EfQIiij1JGmhWEaGCXIAK05pKWfNj6aXc0jp4KOlljm2z1s+ofF9PxjwElYFLLinA8ExabYTiWi0TXs3n0rweDEhhpOFpMZqFhFPkwzYVmy1uC2SRKWRyuLsFRbQnMEqdR5+vubicUgkbD73mNjwxUguPVu0cTjL5NMXueRrPtyEessPrfF3gY8t0PAr03k91qlinOV8YFuS7RsCWVwtosaIXoDxL5oZoS8BfGnQf8JdtWiBK2LKtR1k5c0LQ6yoopzxW0nWQFYnRmgs/N9ORqNgcxJ+ppWQBBrWWtbPTBKnfJJC5IiCYqo5YjKyqg5bylNCwO0WoTlFEs4eLyNt8cDbNP3074ozLETGbbuD9PzbEAEwVlTl0H4t93DDMX/aTS6TBGBS5fdqxSDcuk+dP0Q8fjH6exsmJFsk2quShWViiphmbVwHqjVWP0Pfcg++N9/f6VkIeSSFqlTSSRsl1As9rJLv5If4gDz0SLbQ3I7yNkWG+oMonK5inygOIu88lyin4D4Zui+W8kI2QFcAP2flU0sRutitYz8yUud2e6pM6sraRZYJEVWXwCLxBwM/5Bv96zjc122RuOpnhF+Fv4O77LSeh9B83mXZX+LSOb/4f9v7/yD5qjrO/76JA8hIcnDQ6GtyG+mkD6pT4stCNakElGkyJi2k5YVsTA+GdsanrHUdiysM3bsrG3HmVrnTGuBh5Ei5NS0tGloCyow2I7YgE095TEKMcWIFJ/YkF9PeHJ5vv1jd5/b29u927vdu9u9+7xmbnK3+93dz+09ee79fH6uWHkRc0e/x8zKz7N3yVOL63xx4gsWoE6oHGOFN4ZgbPEx+79nsvCDlTAL2/bBtlk//OP9O0ute+2iWAl3sHU/R8s6O5QE7lfZzbcUHlnnfdj2I33rbaJ9VZSkiMh1wCeBpcA9xpg/D+3/A2AzUMX9n/heY8z/iMhlwN/g/uI8CTjGmFZudQ0JFZNor0I+erC0IoseLS6NVUZ+wuRTLURLO/ZkGSoK728eLqrsgqnbInKSPg0T76Kxh0tcTxdome8SDhsFxUs4VORa64qYX164gGuP/jLnrhzlB0df4d9XfpnvLtnlmVFLXhlfeD3Xv3z1Yide1q3jxLb7+JefeoLdS9wv9DlW8KrnhjnGCua9EFCDYDk2xpHZMZgdcb1I+3CFyX5c/XEgsG0WVwwaf9pyVLt99+rxIcrWuRvdyPvoZ6WOVgl1gx6HhH7ycsOvpQwJ3RMfEhKRpcB3gLfh/u/bBbzLGPNsYM0G4GvGmGMi8nvA1caYG0XkUsAYY74rIq8FngHGjTEHm74nFSxFI13r/XyQjWiJ/4I5EihxztKmrJJzw/ujhUv1RHzvtZGLSJbr4r8OiBPrUrB/EcZ/GmZmwXkJyq8QKV6ASAED1D33PTHA4jZ/DcDHDm/izHfe3KC+Znc8wB+t3sEx730HQ0Hz3r8HOYN5lrnelFdPdYXKLPASrhjZR82LEvSqHCWUrzJHnFiBVkngzXM3hj3vQwVOEgZOsLwR+BNjzNu913cAGGP+LGb964FPGWPeFLHvv4FNxpjvNjNHQ0KFIr1YyYcXplVYJhnj46tiyrjjer6ntamTcmj/nMlKohcxK+J7hOwJHBLu6QK10mjf1EDIyFoPzi/A5LsDYab7gYNQ3uedixEWlo9wZNVKWA5HVrnfwkEPDLAoYtzntXBR8LXPGSvPjiw7/omVr+FZ1gIERMppzLOMY8dO49jhFTWRcgRXjPj/+t6TWdxy5QOh7XX5Ks3FChxiZuYw69aNdpS7Mcx5H1oGPbScg5sd5rMfuLLJ+kngX8MbReQNwDLg+VYXVMFSGLIRK/nqhJtu9tDMzBHWrVsd8SVxhPaTcoM20cKudoRL1Dnj8lzqxYvjLGV6+hQmJ6Wu46ltU+s8G0zSDYuXcF8XwF4Pk79dP+Jn8j1Qut9L5g17bby8F6CW+wKLXhigzhPjsyyYbAuxnXh/cPQVN0EX6gXKkZFa8vBxat4UX6z8yHtvP/T+DXpVGkJA0EqsuPe7lgfWbu7GMOd9+I3u6vspLaNU2qCCpficJSJBN81dxpi7vOcSsT4yZCMiNwOXA28ObT8buB+4xRjTWEYYQgVLIcgmDGTb63PSCTdI56IlWZVRvHBp3s8lC+ESvm4z4RLeP+fNjllFqTTC+LgwMyPYtjdTJupXRRhfuAQGFY6fH9OK/mxq3glftASfQ03AgJu8ywisgiN4nhiAVdW6c/ui5kPLlnDXg9tYeVOg+++D2/jQwdN5+ftn1Ow9EvgXah4Vf1tYqIS9LQaiQ0DQTKyAL9rnO5pgnGb6cdHDKZ2OIFC6zALJeiA1Z7ZJGGs/cF7g9bnAi+FFIvJWwAbebIx5NbB9FHgY+LAx5qnwcVGoYMk92eWsjI+vzlEn3CCdiZb2qozqxUHj1OdVTE9f5Z03mLCbRrg0Xjf6nPHhonL5yOLQu7pzJC2NDnldZp6PCTO9gPvrJ1xt5IsWaEziJfB68XngV8pyXFEDPAiYs07lY5/fwflnruSF2aPc+cOVbPvfJTVxEhYrYZFyBPdW+x4V35G2WAHke1WiQkDQTKz429LMxOnk2EEIpwxzOGzI2QVcIiIX4faPtoCbggu8vJW/Ba4zxrwc2L4MeAj4O2PMF5JeUAVLrsk2wTY+Rn849pje0VleS7m8t60yaP+LybYvY3JyJORtGqFUuiymwqibwiV43iThooieLpBIvDh/C9P3NpZK2w/gei1804N9Xn4U2B4WKv7r4LYwXm+YbSxh2+KJAkMgg7b7osW/lQeoiZSgoJkjEP6Beq8KdCJW+sEghFOGORw2zBhjqiJyG/AIblnzvcaYb4nIR4GnjTE7gI/j/mb4gogAvGCMeSfwW8CvAGeKyK3eKW81xjRteqRVQrkl+2qgxhwWP4TyaCYhoewSekebTHHOhuYVIQ8msrE17ZZER523WXVReH/gebibLix21LVuBHsLjF8KM8+B8xkoP+KtCZZGQ/1ogOD24L7wdXyiBEx4GJsvVHxd8WpgTWKhEvaq+NujQkCQF7ECg1NdVPSwVm/ocZXQGZ
cbNqSsEnpIO90qLelO6bIvHkqlmqiw7WyqhLJM6G1s4BU1xTkd8Qm7R2OPqacdjwukz3Px1zWrPgrge17maMh3Kd8L5WCLpuBIgGDeC9Q8LVFv5QCNoiQoaH4UbVpdXD0oYA4H9oXDQy2FCrT2qgS3hbf3h0EJp6QJpSlKUtTDkjuK2WelUtnM1NRoRE+U9gcvxp8ryYyhZEQ3nTuBbX+tjaZzQZKGsrLwukStbfS8WNZSbHuE8fElzMyA44SESoT3pe7Uq2LWNhMpSYkSLcGwkK8ljga2NwgVIp4nCQEFt/eXxhwWP5zysAqAgUM9LGlRD0uuKKZYgWwTeuPPFZck0T6+p2Z6+o1ceOFyjhyBgwerwDy1pI92SJqDkyTPBZp7XYJroyqMwLJO4jjLQyXRBpBaEm/EOIBIc/zkXah5YXyC3ph2CQqU4PWOBvYv/j01R6MnJSxKiiVWIF11kaI05ST1ntEBQAVLbiiuWIFsE3rjz9XpN2MzRrjmGv9LfUVE6KlT4QKdh4ug/SRdf627xrbPZXJSQsmcQqm0QLnsuTeCCbvB0FGwCR3UkncDl6vLg/kOOFuhvCPibTQjLFD8a0MLoeI/9/cnDQGF9+UDDacoSjKWtF6idJ9iixXwm26d4Oqr3aTBq692QyyO85UMz/UkLQcKtoFtX7HYl6ZarfWlse0rAquCX5jtcohk9h7G/9K1rJ+iUrmKavXNVCpXYVmrY2w4ROP55xYf4+NLY7xUUreu7mGoPeZwxYP/OOA9joJ1PTh/AFPvd0cHTL3ffW1tCJ1ytsXDP+fxwDFmzmv69mPqBxUG+6ocChyQTKxY1hoqlfdSrd5JpbIFy1rb8CnkDctaS6WypVA2K0o3UQ9LXym+UPHJKqHXrzS64IIRtm9f4PTThW9/O3yubrf2jwo9ddo5F5Laa1mrcZxLA71hljM9vQaAcvnlJjY0el5mZk6wbt2yCC/VSWp9b1oIMRPxXufA/iOYnAx1y52EUgnK9zc/pXveiJM2PI8L68SFf8LnqRcrjvO2QvU6GYT+LEqfMWTROC5XqIelbwyOWPEpl/cwMXEPIyOfYGLinkix4v6lu5lq9XYqlc1Y1pq6fY5zLVNToyxfLmzatIQXXqg2KY9O521xK4Xqt7UOPWXhcYm227bHF3vD1Dw+S7Hti2NsiLLDPb/j/Ijp6YWQl2oBx/m/CFuCYqD1Y3zcxAg96r00DY+5QMt8//HjwHPfmxL0oIRfQztiBcC237TY66R2X5dh2xsi7l8+CPZnKYrNitJtVLD0hcETK0kIC5KpqVEc59pF0RIcHVAfolnf5KxJwy6NuK39o0JPu1odSjrhAlF2j4+viBECpxKfPRctXsrl/dj2XkqlVzl+3FAqzWPbP6Rcno05JrmAmZlZiBF6CzHH+OGd8PagQDkUsz0sVA4TtNGyzqZSeSfV6iSVyo1Y1tkN96eIreOLaLOidBsNCfWUwRAqnTaIazXLKF2lUfthovZa+8cRV9mTlJpomZmZY9260yLCOGHPQtz9qLelXD5AuXwgYl1UtVGQcL+XehznJNPTZzVUINn2MQgNPWwuJqPycqL2HY5cE92vJ9j7xz1HEXudFNFmRek26mHpGYMjVpp5SZrRSpC41UH1+9uvNEruccm+m246r4vjfJ3p6WrI41PFcWZCKw8TTNRtbkt82Cj+XkV5SWqUy0ex7VlKparnvali27Oe9ybu3HHnjPPqBN8nkWvik6bX19nsto6fD93XeRwnv63ji2izkjNO4rYMSPPIGeph6TqDIVR80kx8blX67FYHRY0OaL/SqNVAxcbGcVl20+0sQddvWFcq1SZI2/YeyuWGAagBWpVHB+3xaZa06xO+d2HRMucJlHaIE5Jh+6I9KuG18UnT9fehiL1OimizonQb7XTbVforVrKb7VOjWr29yeyTT7S0p9Uso27YHCVcKpUbmZpa3dVuuvWk/VnopCIqadO+Xv+cRnl9wt6i1v1TKpXfiumIfJCJia0dWzesc3GG9X33jh53uj3tcsPPpux0+1/a6XYI6L9XJcvZPkHSNIhLUvpcLu/JZLZRPY35Le2VNGdBmrJoSN6QLkgS7wvEh7Gy+jluFiZrJVSijnfXRHvk0k0JHtZy4mF93wONoXU364KhHpbM6b9YgWxn+wTp9sTn3jDaBw9LmKx+TjrtRdP+yIRsiBO2SYRK47p6j1x6r8Dzz9/O5ORpmXtt8k6lsoWpqbGhe9+9pccelhWXGy5K6WGZUQ/LgJIPoeKT5WyfIN2c+Nw7DuE4T8bkyyQpac6CtNVFPuEv+k68L0GyFDJJkqWTCpXota5Hbnc7RsViWWu58MK40vLBLifWMmqlCKhgyYR8iRXIdrZPmO6EbXpLo/DqpKQ5K9KGi4J0EjoK0otpae2UOneyvjNsewN79wrr1jF05cRaRj2A+FVCA4SWNadiBVl8yTTr/topWc72GVTqO/Pe3SexEqRZKXIntCpf7iWt7GhWgh1HdmIFXG/CRz4C09PU/b/Zts0MfDmxllErRUA9LB2RnUelW8mxgxG66TVpvRNZklXIKEiS8uWsSCqQmomO3okVcL0JL744hm27c5HGx+F734P5+WMDn3iqZdRKEdCk27bJNvzTreRYJSv6LVzC5C/82BmdCpVWx3ZOY6WMX3X0cCG+uLUsOe/0OOn2lMsNYymTbmfzlXSrIaHEZBP+CdOt5Ng8040QWPfIQzglSNZho16SxPb+iBVwvQy2/TCl0kGvg+/BvogVy1pLpbKFavVOKpUtWNbaRMc4zjuYmhrzOlCP4TjvSHSsohQFDQm1pLt/0XYzOTaPdCsE1n3an1XUfVp1sO0NlnUBtl3rzus4uxe79iYXGElEYfdFWrn8bF+9Ep32QwlOd4badOdSaYN6WZSBQT0ssXTHoxJm2JJjO5vInCfy5nEJEj//p1tY1gU4zlVMTa3y/rJfheNcySc/eVloivLFTc6SD7GSB4LCo/b/Yxm2vaHpcVqWrAwDKlgi6d1fquXyHmz7UUqlQ54b+lDBmrC1R1YhsP6HlfJSfdOMxoGDlnUBlcpGqtWbqFQ2YlkXpLqCbV/G5ORIgwB9z3vWMjW12hMxq3Gc9RGiJen9Gw6xAp0LD7csuX6bliUPOSeBoykfOUNDQnX0x6U+CH1NkpJFCCx/YaU8hosacQc+Xhm4b6uYnr4SmO+4pHt8fGXkF+zoqEQMyLwicJ0sqogGj077obhlyVEJw1qWrAwO6mEBehX+UbIJgeU3rNQ9r4tlXUylcmPCEEs0tn1FzH27omO7ZmaOxPxlX7+tNqepnfszXGIFOu+HkpeEYUXpJkPuYVGR0muy6A9TjMqq7LwurmdkfcAzsprpaVecteMZyWLgo2VdjG1fwfj4KmZmjvDYYy8wPX1p3YiDBx4w3HOP1B3XfiL58IkVSNcPpd8Jw0rOGMDhh0MqWFSo9JO0IbC4sNLBgwbLWpOz8Fr6ZnRBzwjEhVha43pDVkeEG5L1744WTpeyc+d3KJXOXxQx27e/wM03X8rjj9fPadq583kql
c2LQtVx4oTqcIoVHxUeihLNkIWENPQzCESHlWDr1iU4zrWxCbhFTdTNwjMC4Di7YsJxyQY+xoWU3vKW85mY+BwjI9NMTHyOD3zgqw2J5Dt3fpMbbngdU1OjXiLuaNPPCjrrR6IoSu8QketEZI+IPCcifxyx/1dE5OsiUhWRTaF9/yYiB0VkZ9LrDYmHRUXKIOH/Vb59+68yNraEmRmwbSiX4fHHT6FUWt/wl3uRE3XTekZ8fG9MqVQL6bQz8DGZcDrkXetQ3X2tVDbHeInCn5XrXem0H4miKAG62MheRJYCW4G3AfuBXSKywxgT/A/6AnAr8IcRp/g4cBrwO0mvOQQeFhUrg0i5vIexMWH5cpiYcMUKxOeyFDlRN61nJEi5vLfOG9J+SKl+W004NX8PyfKOaqGgTvuRNEM9NoqSKW8AnjPG7DXGzANlYGNwgTFmnzHmG8BC+GBjzJdpczz8AAsWDf8MOm4uS/22uOTO4iTqNgqYcnkvtv0VSqXDXojFT1Tu7XTpeOH0ZMtjW39W9XkrWTdC09b1ipI55wDfD7ze723rGgMYElKRMiy4uSzX1lWoTE+fwLYbS6SLOQIhKFp291yghHGvP9dRhVfzz6oxybbTfiRxaOt6ZfhYIIME9rNEJDhB8S5jzF3ec4lY39VpygMkWFSoDBvtlEi3I27ySzjk0otGdfXXDOemJCX+s9oduT5JI7Sk04ktay3nn386X/qS2x/GcdwQYhKPjU5AVoac2SbTmvcD5wVenwu82E1jBkCwqFApOpa1Btten6DctZGkJdJZ9H/JH62qjdoRNNk0u2v2Wfqflb/ms5+9Htt+U6QIaNWPJGlSrr9u40YJCB9330svNffYaOKvojRlF3CJiFwE/ACwgJu6eUExpqsenOiLyjkG3p/yLCpUBoHG6h3f8zG485QGlSSfZfSa+ba7slYqW5iaqg8ZXX01lEoHmZjY2nLd3XfDkiXNr5v0GoqSjA8/08RbkTkiv2jgP1Ke5bSmNovI9cBfAUuBe40xjoh8FHjaGLNDRK4AHgLOwG1j95Ix5ue8Y78C/CywCjgATBpjHmn6noonWFSoDBKVymampkYjvhQOMTFxT7/MUjogyWcZv6Y9EVCt3sny5UK1Wts2MgLHjxtGRj7Wct2rrxre/e5/bCqSkl5DUZLRa8FymYHHUp7lzJ7a3IoCVQlp1c8gUozqHSUJST7L+DXtVf8knU4ct+7ZZ19p6dHRCciKki9SCRYR+biIfFtEviEiD4nIWFaG1VChMsi0U5qs5Jskn2VWIiDpkMBOhwmmPVZRlOxJm3T7ReAOY0xVRP4CuAP4UHqzQEXKcDAY1TsKJPks5xJV/yQh6ZDAtMMEOz1WUfrPAt2YHN9PMsthEZFfBzYZY97dem2zHBYVKsNGmiohJV/Ef5ZzgTVaKqwMI73OYfl5AztSnuWiXOWwZClY/hn4nDHms63XRgkWFSqKUiSSC81iT19WgaVkgwqWtLQMCYnIl4DXROyyjTH/5K2xgSrwQJPzvA94n/sqmGCnQkVRikb+hkl2B+3FohSXTDrd5orUHhYRuQX4XeAaY8yxZMecZ+D2VNdVFKV/JC9HL/YvTO3FomRHrz0srzPwhZRnWVssD0szROQ63CTbNycVK4qiFJ92py8XlayHMCqK0jlp+7B8ClgNfFFEdovIpzOwSVGUnNPu9OWior1YlOJyksYJ8O0+8kUqwWKM+RljzHnGmMu8x+9mZZiiKPnFLWE+EepRcgLHGaxydO3Foij5YQCGHyqK0muaD5McDO8KaC8WRckTfZolpEm3ijKYDI5YUZRs6XXS7biBz6Q8y1W5Srot0CwhRVF6gWWtoVLZTLV6O5XKZixrTb9NUhSlbfyy5jSPfKGCRVESMCxf4n5/lampUZYvF6amRnGcaxO+3/z9glMUZXBQwaIoLUj3JV4sbHs9k5On8MQTUK3CE0/A5OQpOM6GoRBsiqLkFxUsitKCuC9x217fb9MyJ66/yoUXrmgh2Fp7VyxrLZXKFqrVO6lUtmBZa7M1XlGUAAvA4ZSPfKGCRVFakKxJ2mAQ119l715pItiSiRXHS0XxwwAABiVJREFUeQdTU2Oe6BnDcd6hokVRlMSoYFGUFrRuklZcwrk5jz32fEN/lW3bDB/5SP1x7Qo2297A5OSykOhZhm1vyPDdKIoyyKhgUXJJXpJcLWsNo6MjbNtmMmuSlqf3Fs7NueGG17Fz5ze5775jHDhg+PKXDaeealgTMrEm2JIl2mqLe0XpNYPX6VYbxym5Iy+TgH07brnlFF77Wrj7brj4YsO+fXNe87D2bcnLe4P63ByohXruu28N1eopbNwono3Cgw8aQPjTP3XFyvT0CWw7uWBzW9zXDxHUFveKorSDeliU3JGXJNegHQ8+CJdcAtdcIxw7Vu1YXOTlvUF8bs7Y2IoGG2+6SdiyZYHjxw2l0iFs+1HK5d2Jr6Ut7hVFSYt6WJTckZck127YkZf3Bn5uzmiD12PVKmKEjDAy8omOrqUt7hVFSYt6WJTckZck127YkZf3BvEDDPftm0tgY/tN4srlZ5mY2MrIyMeYmNiqYkVRuop2ulWUrpOXScDdsCMv7w3cnBnbfpRS6VBdqMe2H8+NjYqiKD4aElJyR/NJwMW2Iy/vLWhP3LXjbYz/y8uy1mLbtbCP42jYR1GUbNBpzYqitEm0YPGbw01OLvOqi9zEWtt+WEWLovR8WnMW37Mf1GnNiqIMHtocTlGUbqKCRVGUNogPB2lzOEUZLkTkOhHZIyLPicgfR+w/VUQ+5+3/mohcGNh3h7d9j4i8Pcn1VLAoipIay1rLwYMLHD8OlQpYlrtdm8MpSr/obpWQiCwFtgK/CqwF3iUi4eFgk8D/GWN+BvgE8BfesWsBC/g54Drgr73zNUWTbhVFSUjz3JVNm5YGcldgfBxuvnke29bmcIoygLwBeM4YsxdARMrARiCYsLYR+BPv+XbgUyIi3vayMeZV4Hsi8px3vq82u6AKFkVRUhHMXQE/dwW2bz/Jbbdpwq2iDCjnAN8PvN4PXBm3xhhTFZFXgDO97U+Fjj2n1QX7JFj2z8IH/6c/184NZwGz/Tai4Og9zIZU9/HWWx/6pWeeqd/2xBPw9rfDM8+E9wws+rOYnkG/hxf09nIvPgIfPivlSZaLyNOB13cZY+7ynkvE+nDZcdyaJMc20BfBYoz5yX5cN0+IyNN5KhcrInoPs0HvY3r0HqZH72G2GGOu6/Il9gPnBV6fC7wYs2a/iIwApwM/TnhsA5p0qyiKoihKu+wCLhGRi0RkGW4S7Y7Qmh3ALd7zTcBjxm3+tgOwvCqii4BLgP9sdUHNYVEURVEUpS28nJTbgEeApcC9xphvichHgaeNMTuAaeB+L6n2x7iiBm/d53ETdKvAFmPMyVbX7EunWwVE5H2BWKDSAXoPs0HvY3r0HqZH76HSChUsiqIoiqLkHs1hURRFURQl96hg6SMi8nER+baIfENEHhKRsX7bVDRE5DdF5FsisiAiWmHQBq3aaiutEZF7ReRlEflmv20p
KiJynog8LiIz3v/lD/TbJiWfqGDpL18EXmeM+XngO8AdfbaniHwT+A3gyX4bUiQSttVWWvMZ3NbiSudUgQ8aY8aBq4At+rOoRKGCpY8YYx41xlS9l0/h1qIrbWCMmTHG7Om3HQVksa22MWYe8NtqK21gjHkSt/pB6RBjzA+NMV/3nh8GZkjQ9VQZPlSw5If3Av/abyOUoSGqrbZ+SSh9xZvm+3rga/21RMkj2oely4jIl4DXROyyjTH/5K2xcd2iD/TStqKQ5B4qbdNRa2xF6RYisgr4e+D3jTGH+m2Pkj9UsHQZY8xbm+0XkVuAG4BrjNaYR9LqHiod0VFrbEXpBiJyCq5YecAY8w/9tkfJJxoS6iMich3wIeCdxphj/bZHGSqStNVWlK4jIoLbEXXGGPOX/bZHyS8qWPrLp4DVwBdFZLeIfLrfBhUNEfl1EdkPvBF4WEQe6bdNRcBL9vbbas8AnzfGfKu/VhUPEdkGfBVYIyL7RWSy3zYVkDcB7wHe4v0e3C0i1/fbKCV/aKdbRVEURVFyj3pYFEVRFEXJPSpYFEVRFEXJPSpYFEVRFEXJPSpYFEVRFEXJPSpYFEVRFEXJPSpYFEVRFEXJPSpYFEVRFEXJPSpYFEVRFEXJPf8PAYsVVSpNn6gAAAAASUVORK5CYII=\n", "text/plain": [ "<Figure size 720x432 with 2 Axes>" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "import numpy as np\n", "\n", "# create grid\n", "x = np.linspace(X[:,0].min(), X[:,0].max(), 100)\n", "y = np.linspace(X[:,1].min(), X[:,1].max(), 100)\n", "xx, yy = np.meshgrid(x, y, indexing='ij')\n", "grid = np.stack((xx, yy), axis=-1).reshape(-1, 2)\n", "\n", "# evaluate model in the whole grid\n", "prediction = model.predict(grid)\n", "\n", "plt.figure(figsize=(10, 6))\n", "contour = plt.contourf(xx, yy, prediction.reshape(100, 100), cmap='jet', levels=100)\n", "plt.scatter(X[Y==0, 0], X[Y==0, 1], color='b', label='Class 0', edgecolors='white')\n", "plt.scatter(X[Y==1, 0], X[Y==1, 1], color='r', label='Class 1', edgecolors='white')\n", "\n", "plt.xlim([X[:,0].min(), X[:,0].max()])\n", "plt.ylim([X[:,1].min(), X[:,1].max()])\n", "plt.colorbar(contour)" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.1" } }, "nbformat": 4, "nbformat_minor": 2 }