{ "cells": [ { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "%reload_ext autoreload\n", "%autoreload 2\n", "%matplotlib inline\n", "import os\n", "os.environ[\"CUDA_DEVICE_ORDER\"]=\"PCI_BUS_ID\";\n", "os.environ[\"CUDA_VISIBLE_DEVICES\"]=\"0\" " ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Using TensorFlow backend.\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "using Keras version: 2.2.4\n" ] } ], "source": [ "import ktrain\n", "from ktrain import graph as gr" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Node Classification in Graphs\n", "\n", "\n", "In this notebook, we will use *ktrain* to perform node classificaiton on the Cora citation graph. Each node represents a paper pertaining to one of several paper topics. Links represent citations between papers. The attributes or features assigned to each node are in the form of a multi-hot-encoded vector of words appearing in the paper. The dataset is available [here](https://linqs-data.soe.ucsc.edu/public/lbc/cora.tgz).\n", "\n", "The dataset is already in the form expected by *ktrain*, so let's begin." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### STEP 1: Load and Preprocess Data\n", "\n", "We will hold out 10% of the nodes as a test set. Since we set `holdout_for_inductive=False`, the nodes being heldout will remain in the graph, but only their features (not labels) will be visible to our model. This is referred to as transductive inference. Of the remaining nodes, 10% will be used for training and the remaining nodes will be used for validation (also transductive inference). As with the holdout nodes, the features (but not labels) of validation nodes will be available to the model during training. 
The return value `df_holdout` contains the features of the held-out nodes, and `G_complete` is the original graph including the held-out nodes. " ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Largest subgraph statistics: 2485 nodes, 5069 edges\n", "Size of training graph: 2485 nodes\n", "Training nodes: 223\n", "Validation nodes: 2013\n", "Nodes treated as unlabeled for testing/inference: 249\n", "Holdout node features are visible during training (transductive inference)\n", "\n" ] } ], "source": [ "(train_data, val_data, preproc, df_holdout, G_complete) = gr.graph_nodes_from_csv(\n", " 'data/cora/cora.content', # node attributes/labels\n", " 'data/cora/cora.cites', # edge list\n", " sample_size=20, \n", " holdout_pct=0.1, holdout_for_inductive=False,\n", " train_pct=0.1, sep='\\t')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `preproc` object includes a reference to the training graph and a dataframe showing the features and target for each node in the graph (both training and validation nodes)."
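, "\n", "\n", "Schematically, each row of this dataframe pairs a node's binary word-presence features with its topic label. A toy sketch of that layout (hypothetical paper IDs, feature vectors, and labels, not actual Cora data):\n", "\n", "```python\n", "from collections import Counter\n", "\n", "# Toy stand-in for the node dataframe: multi-hot word features plus a 'target' label\n", "nodes = {\n", "    'paper_35': {'features': [1, 0, 1], 'target': 'Neural_Networks'},\n", "    'paper_40': {'features': [0, 1, 0], 'target': 'Genetic_Algorithms'},\n", "}\n", "\n", "# class balance, analogous to value_counts() on the real dataframe\n", "counts = Counter(attrs['target'] for attrs in nodes.values())\n", "```"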
] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Neural_Networks 726\n", "Genetic_Algorithms 406\n", "Probabilistic_Methods 379\n", "Theory 344\n", "Case_Based 285\n", "Reinforcement_Learning 214\n", "Rule_Learning 131\n", "Name: target, dtype: int64" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "preproc.df.target.value_counts()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### STEP 2: Build a Model and Wrap in Learner Object" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "graphsage: GraphSAGE: http://arxiv.org/pdf/1607.01759.pdf\n" ] } ], "source": [ "gr.print_node_classifiers()" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Is Multi-Label? False\n", "done\n" ] } ], "source": [ "learner = ktrain.get_learner(model=gr.graph_node_classifier('graphsage', train_data, ), \n", " train_data=train_data, \n", " val_data=val_data, \n", " batch_size=64)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### STEP 3: Estimate LR \n", "Given the small number of batches per epoch, a larger number of epochs is required to estimate the learning rate. We will cap it at 100 here." ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "simulating training for different learning rates... 
this may take a few moments...\n", "Epoch 1/100\n", "3/3 [==============================] - 1s 441ms/step - loss: 1.9648 - acc: 0.1302\n", "Epoch 2/100\n", "3/3 [==============================] - 0s 158ms/step - loss: 2.0053 - acc: 0.0873\n", "Epoch 3/100\n", "3/3 [==============================] - 1s 191ms/step - loss: 1.9632 - acc: 0.1510\n", "Epoch 4/100\n", "3/3 [==============================] - 0s 138ms/step - loss: 1.9682 - acc: 0.1411\n", "Epoch 5/100\n", "3/3 [==============================] - 1s 176ms/step - loss: 1.9776 - acc: 0.1304\n", "Epoch 6/100\n", "3/3 [==============================] - 0s 153ms/step - loss: 1.9632 - acc: 0.1536\n", "Epoch 7/100\n", "3/3 [==============================] - 1s 186ms/step - loss: 1.9682 - acc: 0.1562\n", "Epoch 8/100\n", "3/3 [==============================] - 0s 130ms/step - loss: 1.9509 - acc: 0.1275\n", "Epoch 9/100\n", "3/3 [==============================] - 1s 175ms/step - loss: 1.9662 - acc: 0.1137\n", "Epoch 10/100\n", "3/3 [==============================] - 1s 174ms/step - loss: 1.9605 - acc: 0.1919\n", "Epoch 11/100\n", "3/3 [==============================] - 1s 198ms/step - loss: 1.9962 - acc: 0.1042\n", "Epoch 12/100\n", "3/3 [==============================] - 0s 140ms/step - loss: 1.9700 - acc: 0.1179\n", "Epoch 13/100\n", "3/3 [==============================] - 1s 178ms/step - loss: 1.9776 - acc: 0.1578\n", "Epoch 14/100\n", "3/3 [==============================] - 1s 193ms/step - loss: 1.9769 - acc: 0.1406\n", "Epoch 15/100\n", "3/3 [==============================] - 1s 170ms/step - loss: 1.9816 - acc: 0.1510\n", "Epoch 16/100\n", "3/3 [==============================] - 0s 133ms/step - loss: 1.9620 - acc: 0.1481\n", "Epoch 17/100\n", "3/3 [==============================] - 0s 162ms/step - loss: 1.9662 - acc: 0.1591\n", "Epoch 18/100\n", "3/3 [==============================] - 0s 166ms/step - loss: 1.9790 - acc: 0.1288\n", "Epoch 19/100\n", "3/3 [==============================] - 1s 193ms/step - loss: 
1.9705 - acc: 0.1198\n", "Epoch 20/100\n", "3/3 [==============================] - 0s 132ms/step - loss: 1.9617 - acc: 0.1508\n", "Epoch 21/100\n", "3/3 [==============================] - 1s 167ms/step - loss: 1.9818 - acc: 0.1317\n", "Epoch 22/100\n", "3/3 [==============================] - 0s 161ms/step - loss: 1.9709 - acc: 0.1288\n", "Epoch 23/100\n", "3/3 [==============================] - 1s 190ms/step - loss: 1.9560 - acc: 0.1615\n", "Epoch 24/100\n", "3/3 [==============================] - 0s 132ms/step - loss: 1.9715 - acc: 0.1508\n", "Epoch 25/100\n", "3/3 [==============================] - 1s 171ms/step - loss: 1.9741 - acc: 0.1851\n", "Epoch 26/100\n", "3/3 [==============================] - 1s 187ms/step - loss: 1.9706 - acc: 0.1406\n", "Epoch 27/100\n", "3/3 [==============================] - 0s 165ms/step - loss: 1.9826 - acc: 0.1398\n", "Epoch 28/100\n", "3/3 [==============================] - 0s 126ms/step - loss: 1.9698 - acc: 0.1262\n", "Epoch 29/100\n", "3/3 [==============================] - 1s 183ms/step - loss: 1.9711 - acc: 0.1523\n", "Epoch 30/100\n", "3/3 [==============================] - 0s 161ms/step - loss: 1.9680 - acc: 0.1549\n", "Epoch 31/100\n", "3/3 [==============================] - 1s 190ms/step - loss: 1.9472 - acc: 0.1615\n", "Epoch 32/100\n", "3/3 [==============================] - 0s 127ms/step - loss: 1.9847 - acc: 0.1646\n", "Epoch 33/100\n", "3/3 [==============================] - 0s 162ms/step - loss: 1.9565 - acc: 0.1411\n", "Epoch 34/100\n", "3/3 [==============================] - 1s 168ms/step - loss: 1.9785 - acc: 0.1549\n", "Epoch 35/100\n", "3/3 [==============================] - 1s 197ms/step - loss: 1.9499 - acc: 0.1927\n", "Epoch 36/100\n", "3/3 [==============================] - 0s 130ms/step - loss: 1.9497 - acc: 0.1578\n", "Epoch 37/100\n", "3/3 [==============================] - 0s 163ms/step - loss: 1.9379 - acc: 0.1880\n", "Epoch 38/100\n", "3/3 [==============================] - 1s 193ms/step - loss: 
1.9216 - acc: 0.1823\n", "Epoch 39/100\n", "3/3 [==============================] - 1s 167ms/step - loss: 1.9734 - acc: 0.1358\n", "Epoch 40/100\n", "3/3 [==============================] - 0s 126ms/step - loss: 1.9371 - acc: 0.1481\n", "Epoch 41/100\n", "3/3 [==============================] - 1s 175ms/step - loss: 1.9302 - acc: 0.1468\n", "Epoch 42/100\n", "3/3 [==============================] - 0s 163ms/step - loss: 1.9158 - acc: 0.2099\n", "Epoch 43/100\n", "3/3 [==============================] - 0s 141ms/step - loss: 1.8992 - acc: 0.2222\n", "Epoch 44/100\n", "3/3 [==============================] - 1s 181ms/step - loss: 1.8642 - acc: 0.3021\n", "Epoch 45/100\n", "3/3 [==============================] - 1s 178ms/step - loss: 1.8753 - acc: 0.2552\n", "Epoch 46/100\n", "3/3 [==============================] - 1s 186ms/step - loss: 1.8553 - acc: 0.3281\n", "Epoch 47/100\n", "3/3 [==============================] - 1s 169ms/step - loss: 1.8448 - acc: 0.3155\n", "Epoch 48/100\n", "3/3 [==============================] - 0s 122ms/step - loss: 1.8037 - acc: 0.3582\n", "Epoch 49/100\n", "3/3 [==============================] - 0s 166ms/step - loss: 1.7770 - acc: 0.4334\n", "Epoch 50/100\n", "3/3 [==============================] - 1s 181ms/step - loss: 1.7460 - acc: 0.4323\n", "Epoch 51/100\n", "3/3 [==============================] - 0s 164ms/step - loss: 1.6978 - acc: 0.4980\n", "Epoch 52/100\n", "3/3 [==============================] - 0s 128ms/step - loss: 1.6504 - acc: 0.5324\n", "Epoch 53/100\n", "3/3 [==============================] - 1s 183ms/step - loss: 1.6264 - acc: 0.5573\n", "Epoch 54/100\n", "3/3 [==============================] - 1s 176ms/step - loss: 1.5451 - acc: 0.5914\n", "Epoch 55/100\n", "3/3 [==============================] - 1s 172ms/step - loss: 1.4829 - acc: 0.7040\n", "Epoch 56/100\n", "3/3 [==============================] - 0s 127ms/step - loss: 1.4272 - acc: 0.8013\n", "Epoch 57/100\n", "3/3 [==============================] - 0s 160ms/step - loss: 
1.3344 - acc: 0.8698\n", "Epoch 58/100\n", "3/3 [==============================] - 0s 157ms/step - loss: 1.2562 - acc: 0.8808\n", "Epoch 59/100\n", "3/3 [==============================] - 1s 188ms/step - loss: 1.2021 - acc: 0.8646\n", "Epoch 60/100\n", "3/3 [==============================] - 0s 120ms/step - loss: 1.0503 - acc: 0.9575\n", "Epoch 61/100\n", "3/3 [==============================] - 0s 164ms/step - loss: 0.9593 - acc: 0.9562\n", "Epoch 62/100\n", "3/3 [==============================] - 1s 189ms/step - loss: 0.8614 - acc: 0.9479\n", "Epoch 63/100\n", "3/3 [==============================] - 1s 169ms/step - loss: 0.7299 - acc: 0.9836\n", "Epoch 64/100\n", "3/3 [==============================] - 0s 125ms/step - loss: 0.6011 - acc: 0.9781\n", "Epoch 65/100\n", "3/3 [==============================] - 1s 178ms/step - loss: 0.4877 - acc: 0.9836\n", "Epoch 66/100\n", "3/3 [==============================] - 1s 204ms/step - loss: 0.4136 - acc: 0.9740\n", "Epoch 67/100\n", "3/3 [==============================] - 0s 132ms/step - loss: 0.2811 - acc: 0.9941\n", "Epoch 68/100\n", "3/3 [==============================] - 1s 173ms/step - loss: 0.2441 - acc: 0.9896\n", "Epoch 69/100\n", "3/3 [==============================] - 1s 184ms/step - loss: 0.1701 - acc: 0.9948\n", "Epoch 70/100\n", "3/3 [==============================] - 1s 173ms/step - loss: 0.1220 - acc: 0.9945\n", "Epoch 71/100\n", "3/3 [==============================] - 1s 170ms/step - loss: 0.0776 - acc: 0.9945\n", "Epoch 72/100\n", "3/3 [==============================] - 0s 135ms/step - loss: 0.0630 - acc: 0.9945\n", "Epoch 73/100\n", "3/3 [==============================] - 1s 200ms/step - loss: 0.0780 - acc: 0.9844\n", "Epoch 74/100\n", "3/3 [==============================] - 1s 182ms/step - loss: 0.0392 - acc: 0.9945\n", "Epoch 75/100\n", "3/3 [==============================] - 0s 166ms/step - loss: 0.0540 - acc: 0.9836\n", "Epoch 76/100\n", "3/3 [==============================] - 0s 124ms/step - loss: 
0.0416 - acc: 0.9945\n", "Epoch 77/100\n", "3/3 [==============================] - 0s 165ms/step - loss: 0.0482 - acc: 0.9945\n", "Epoch 78/100\n", "3/3 [==============================] - 1s 167ms/step - loss: 0.0385 - acc: 1.0000\n", "Epoch 79/100\n", "3/3 [==============================] - 0s 144ms/step - loss: 0.0917 - acc: 0.9643\n", "Epoch 80/100\n", "3/3 [==============================] - 1s 170ms/step - loss: 0.1521 - acc: 0.9427\n", "Epoch 81/100\n", "3/3 [==============================] - 0s 161ms/step - loss: 0.1830 - acc: 0.9286\n", "Epoch 82/100\n", "3/3 [==============================] - 1s 183ms/step - loss: 0.2672 - acc: 0.9115\n", "Epoch 83/100\n", "3/3 [==============================] - 0s 157ms/step - loss: 0.1182 - acc: 0.9671\n", "Epoch 84/100\n", "3/3 [==============================] - 0s 124ms/step - loss: 0.0851 - acc: 0.9726\n", "Epoch 85/100\n", "3/3 [==============================] - 1s 186ms/step - loss: 0.1062 - acc: 0.9688\n", "Epoch 86/100\n", "3/3 [==============================] - 1s 170ms/step - loss: 0.0700 - acc: 0.9684\n", "Epoch 87/100\n", "3/3 [==============================] - 0s 165ms/step - loss: 0.0589 - acc: 0.9890\n", "Epoch 88/100\n", "3/3 [==============================] - 0s 127ms/step - loss: 0.0858 - acc: 0.9849\n", "Epoch 89/100\n", "3/3 [==============================] - 1s 188ms/step - loss: 0.0380 - acc: 0.9794\n", "Epoch 90/100\n", "3/3 [==============================] - 1s 191ms/step - loss: 0.1058 - acc: 0.9688\n", "Epoch 91/100\n", "3/3 [==============================] - 0s 164ms/step - loss: 0.1064 - acc: 0.9739\n", "Epoch 92/100\n", "3/3 [==============================] - 0s 130ms/step - loss: 0.0653 - acc: 0.9836\n", "Epoch 93/100\n", "3/3 [==============================] - 1s 179ms/step - loss: 0.1252 - acc: 0.9507\n", "Epoch 94/100\n", "3/3 [==============================] - 1s 179ms/step - loss: 0.0929 - acc: 0.9643\n", "Epoch 95/100\n", "3/3 [==============================] - 1s 190ms/step - loss: 
0.1500 - acc: 0.9583\n", "Epoch 96/100\n", "3/3 [==============================] - 0s 134ms/step - loss: 0.2589 - acc: 0.9343\n", "Epoch 97/100\n", "3/3 [==============================] - 0s 162ms/step - loss: 0.3288 - acc: 0.9246\n", "Epoch 98/100\n", "3/3 [==============================] - 0s 161ms/step - loss: 0.3882 - acc: 0.8931\n", "Epoch 99/100\n", "3/3 [==============================] - 1s 189ms/step - loss: 0.6683 - acc: 0.8854\n", "Epoch 100/100\n", "3/3 [==============================] - 0s 130ms/step - loss: 0.6474 - acc: 0.8957\n", "\n", "\n", "done.\n", "Please invoke the Learner.lr_plot() method to visually inspect the loss plot to help identify the maximal learning rate associated with falling loss.\n" ] } ], "source": [ "learner.lr_find(max_epochs=100)" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYgAAAEKCAYAAAAIO8L1AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO3deZgU5bn+8e/TPSswM2zDvgwIIgiIOm7geuKCK0lcookGXELISaImOckxx5NjNMkvizE5R40LKnFJ1LiLSwSTqLjDoMimyKoyigw7A7P38/ujC2yxBweZnuruuT/X1ddUvVXV/UyJfU/VW1WvuTsiIiK7ioRdgIiIpCcFhIiIJKWAEBGRpBQQIiKSlAJCRESSUkCIiEhSOWEX0Jq6d+/uZWVlYZchIpIx5s6du87dS5Mty6qAKCsro6KiIuwyREQyhpm919wynWISEZGkUhYQZtbfzJ4zs8VmtsjMLkuyjpnZ9Wa2zMzmm9lBCcsmmtnS4DUxVXWKiEhyqTzF1Aj8yN3fMLMiYK6ZPevuixPWORkYGrwOA24GDjOzrsBVQDngwbbT3X1jCusVEZEEKTuCcPeP3P2NYHor8DbQd5fVJgB3e9xrQGcz6w2cBDzr7huCUHgWGJ+qWkVE5LPapA/CzMqAA4HXd1nUF/ggYX510NZcu4iItJGUB4SZdQIeBi539y0peP/JZlZhZhVVVVWt/fYiIu1WSgPCzHKJh8Nf3f2RJKtUAv0T5vsFbc21f4a7T3X3cncvLy1Neinv55qxaA0r120jFtOjz0VEdkhZJ7WZGXAH8La7/6GZ1aYD3zOz+4l3Um9294/MbAbw/8ysS7DeicBPU1FnbUMT37/vTeobY0HdkBMxohGjIDdK1w55lBblM7i0E7lRY3NNA9W1jQzs1pHhvYvYVtfIltpGehTlM6pfCT2LC6hrjFHfGGPj9nre+Wgr723YRuKwG00xZ/XG7eRGI+RGI9Q3xXB33Im/8J3rO/E2MyjMjdIxP4qZ4e7EYvF1Yw6x+Ibk50bo0iGP+sYYdY0x6hqbaIw5pZ3yiUSMhsYYjTGnvilGUX4OBblRCvOi5EYjVNc2kp
tj1NQ3UdcYwwAMIma4w8dbaqlvjBGJGFGDSMSImBE1i7dFIGqGWXz/RXcsj8TfI/5PIr5/C3IjFORGiZjR0BSjsckxgy4d8siJBu9hRpeOufQoyqdP50I65GXVbTsiaS+V/8eNAy4AFpjZvKDtv4ABAO5+C/A0cAqwDNgOXBgs22BmvwDmBNtd4+4bUlFkfk6EGZcfzcvL1rG+up6mWPwLtCnmbK9vYsP2ej7aVMPMRWtojDklhbl0yIvy8vJ11DbEWvQZedEI0YjtnDeD3iUFNMXiX+450fiX4Y5lhmGfrL4zEGobmthW34Q7RCy+bsQs+PKNz9fUN7FxewP5OfEv4PycCBEz1m6tBdgZSrlRY0tt485gTGQW3y8AMQeC0OpRVEBBboSYE9TuxGJOkztNsXhIxTy+73a0x4L2Jnf2dmyqHb9TYW6U4sIcehYX0LO4gOKCXGoamqhvjFFUkENxQQ7FhbkUF+RSXJhDj+IC+nYu3BmSItIylk0jypWXl3tb3Und2BTj/Q3bKSrIpUuHXCo31TB/9WY2ba8nPzdKXjRCp/wchvUqol+Xwp1/PaeTHf/tq+saaWxyigpyaIw5+TmRlNbb2BSjtjFGbUMTsZjHAysnQlPM2bitPggTpzHmbNhWz9otdVRuqmFLTQO1DU1sr29ic00DH2+p5eMtdWypbaBDXg55UaO6rpGtdY1Jwyg3avQuKaRP5wL6dC6kb+dC+nfpwLBeRezbs4jCvGjKfmeRdGVmc929PNkyHbN/QTnRCINLO+2cH9itIwO7dQyxoj23IwSKCnJ3tuW0wXdkTjRCpyBAd1VSmJtkiz0TiznV9Y1sqWlgc00Da7fUsXpTDR8Gr8qNNby2fD1rttSyo9spGjEO6FfCiD7FDO1RxNAenRjSsxOlnfLTMtxF2oICQrJOJGLx00sFufTrAvv3Sb5eY1OM1RtreGfNVhZUbuL1FRt4fN6HbK1t3LlOr+ICysu6UD6wC+VlXRneu/hTpwtFspkCQtqtnGiEsu4dKevekfEjewHx025VW+tYuraaJWu28uYHm6hYtYEn538EQKf8HA4c0JnDB3fjpP17MqRHUZi/gkhKqQ9CpAUqN9VQsWoDc1ZtoGLVRt5ZsxWAIT06cfLIXowf2YsRvYt1Okoyzu76IBQQIl/AR5trmLnoY/6+8CNmr9xAzGFwaUdOH92HM8b0YZ+E/imRdKaAEEmh9dV1zFj0MdPfquT1lRtwh8MHd2XKMftwzL6lOqqQtKaAEGkjH2+p5dE3K7nz5VWs2VLL8N7FTDlmMKeO6k1OVMOvSPpRQIi0sfrGGI/Nq2TqrBUsW1vN4O4dufLU4fzbfj10RCFpZXcBoT9pRFIgLyfCOeX9mXn50dx6wcFgcPFdFVxwx2wWVm4OuzyRFlFAiKRQJGKctH8vZlx+NP9z2ggWfriZ0254icvuf5PVG7eHXZ7IbukUk0gb2lLbwC3PL+eOl1biwCVHDuKy44eS3xa3sIskoVNMImmiuCCXn4zfj+f+41hOG9Wbm55fzgV3zGbtltqwSxP5DAWESAj6dC7kD18bw/+dO4b5qzdxwh9n8fi8SrLpiF4ynwJCJEQTxvTl6UuPYp/Sjlx2/zym/GUuG7fVh12WCKCAEAnd4NJOPDhlLD89eT+ee6eKc259VaecJC0oIETSQDRifPuYfbjzokOo3FTD6Te+xNz3UjJGlkiLKSBE0sjYfbrz8HfGUpAb5dypr/FAxQdhlyTtmAJCJM0M713M9O8eyeGDu/GTh+Zz/+z3wy5J2ikFhEgaKumQy23fLOeYfUu54pEF3PDPpbrCSdqcAkIkTRXkRrntm+V89cC+XPfsu/zs8YXEYgoJaTsaUU4kjeXlRLjunAMoLc7n1hdWEHP41ZdH6oF/0iZSFhBmNg04DVjr7iOTLP8x8I2EOoYDpe6+wcxWAVuBJqCxudvARdoDM+OK8fsRMePm55czoGsHphyzT9hlSTuQylNMdw
Ljm1vo7te6+xh3HwP8FHjB3ROv6zsuWK5wkHbPzPjJScM4bXRvfvvMOzy7+OOwS5J2IGUB4e6zgJZeyH0ecF+qahHJBmbG788+gFF9S7js/jd5+6MtYZckWS70Tmoz60D8SOPhhGYHZprZXDOb/DnbTzazCjOrqKqqSmWpIqHb0XFdVJDDJXdVULW1LuySJIuFHhDA6cDLu5xeOtLdDwJOBr5rZkc3t7G7T3X3cncvLy0tTXWtIqHrWVzA7d88hPXb6pjyl7nUNTaFXZJkqXQIiHPZ5fSSu1cGP9cCjwKHhlCXSNoa1a+E684ew9z3NvLTRxboHglJiVADwsxKgGOAxxPaOppZ0Y5p4ERgYTgViqSvU0f35vLjh/LIG5XcOmtF2OVIFkrlZa73AccC3c1sNXAVkAvg7rcEq30FmOnu2xI27Qk8GlznnQPc6+7PpKpOkUx22ZeGsmxtNb995h32Ke3ECSN6hl2SZBENOSqS4Wobmjjn1ldZtraah78zluG9i8MuSTKIhhwVyWK6sklSRQEhkgV2XNm0rrqOKx6er05raRUKCJEsMapfCT8+aRj/fGctT8z/KOxyJAsoIESyyIXjBnFAvxKunr5IY1vLXlNAiGSRaMT4zZmj2VzTwC+eXBx2OZLhFBAiWWZ472K+c+w+PPJmJc8vWRt2OZLBFBAiWei7xw1hcGlHrnx0IdvqGsMuRzKUAkIkCxXkRvntmaOp3FTD72cuCbscyVAKCJEsdUhZVy44fCB3vrKKN97fGHY5koEUECJZ7Cfjh9GruIArHp5PfWMs7HIkwyggRLJYUUEuv/rKSN79uJqbnl8WdjmSYRQQIlnu3/bryRkH9OGm55bz3vptn7+BSEABIdIOXHnqcHKixq+eejvsUiSDKCBE2oGexQV897ghzFz8MS8vWxd2OZIhFBAi7cTFRw6if9dCrnliMY1N6rCWz6eAEGknCnKjXHnKcJZ8vJX75nwQdjmSARQQIu3ISfv34vDBXfnDzCVsrmkIuxxJcwoIkXbEzPjvU0ewcXsDf355ZdjlSJpTQIi0MyP7lnDiiJ5Me2klW2p1FCHNU0CItEOXfmkoW2obufPlVWGXImlMASHSDo3sW8Lxw3ty+4srdBQhzUpZQJjZNDNba2YLm1l+rJltNrN5wet/EpaNN7MlZrbMzK5IVY0i7dllwVHEXTqKkGak8gjiTmD856zzoruPCV7XAJhZFPgTcDIwAjjPzEaksE6RdmlUvxK+tF8Pbn9pJVt1FCFJpCwg3H0WsOELbHoosMzdV7h7PXA/MKFVixMRAC47fiibaxq4+9X3wi5F0lDYfRBHmNlbZvZ3M9s/aOsLJN7FszpoS8rMJptZhZlVVFVVpbJWkawzul9njhtWym0vrqBaI8/JLsIMiDeAge5+AHAD8NgXeRN3n+ru5e5eXlpa2qoFirQHlx2/L5u2N3D3q6vCLkXSTGgB4e5b3L06mH4ayDWz7kAl0D9h1X5Bm4ikwJj+nTlm31Jum7VC41fLp4QWEGbWy8wsmD40qGU9MAcYamaDzCwPOBeYHladIu3BZccPZeP2Bv7ymvoi5BM5qXpjM7sPOBbobmargauAXAB3vwU4C/iOmTUCNcC57u5Ao5l9D5gBRIFp7r4oVXWKCBw0oAvjhnRj2ssruXDcIPJywu6elHRg8e/k7FBeXu4VFRVhlyGSkV54t4qJ02Zz7VmjObu8/+dvIFnBzOa6e3myZfozQUQAOHpod/brVcTUWSuIxbLnD0f54hQQIgLEn/T67WMGs3RtNc+/uzbsciQNKCBEZKfTRvehT0kBt76wIuxSJA0oIERkp9xohIuOHMTrKzfw1gebwi5HQqaAEJFPOffQARQV5DD1RR1FtHcKCBH5lE75OXz9sAH8fcFHvL9+e9jlSIgUECLyGReNG0Q0Ytzxko4i2jMFhIh8Rs/iAiaM6csDFavZuK0+7HIkJAoIEUnqW0cNpqahSY/faMcUECKS1L
BeRRw7rJS7Xl1FbUNT2OVICBQQItKsyUcPZl11PY++qQcqt0cKCBFp1hGDuzGybzG3vajHb7RHCggRaZaZ8a2jBrOiahuzlmrExvZGASEiu3XyyN6UFuVz5yurwi5F2pgCQkR2Ky8nwjcOG8DzS6pYUVUddjnShhQQIvK5vn7YAHKjxt2v6pLX9kQBISKfq0dRAaeN7sNDc1dTrXGr2w0FhIi0yMSxZVTXNfLw3NVhlyJtRAEhIi0ypn9nxvTvzF2vrNIlr+2EAkJEWmzS2DJWrNMlr+1FygLCzKaZ2VozW9jM8m+Y2XwzW2Bmr5jZAQnLVgXt88ysIlU1isieOWVU/JLXu3TJa7uQyiOIO4Hxu1m+EjjG3UcBvwCm7rL8OHcf4+7lKapPRPbQjkten1tSxcp128IuR1IsZQHh7rOADbtZ/oq7bwxmXwP6paoWEWk9Oy551VFE9kuXPoiLgb8nzDsw08zmmtnkkGoSkSR6FBVw6qjeuuS1HQg9IMzsOOIB8Z8JzUe6+0HAycB3zezo3Ww/2cwqzKyiqkodZyJtYdK4QbrktR0INSDMbDRwOzDB3dfvaHf3yuDnWuBR4NDm3sPdp7p7ubuXl5aWprpkESF+yesBuuQ164UWEGY2AHgEuMDd301o72hmRTumgROBpFdCiUh4LtQlr1kvlZe53ge8Cgwzs9VmdrGZTTGzKcEq/wN0A27a5XLWnsBLZvYWMBt4yt2fSVWdIvLF7Ljk9c8vrwq7FEmRnFS9sbuf9znLLwEuSdK+Ajjgs1uISDrJy4lw/mED+eM/3mXZ2mqG9OgUdknSykLvpBaRzPX1wwaQF43oktcs1aKAMLPLzKzY4u4wszfM7MRUFyci6a20KJ/TD+jDw2+sZnNNQ9jlSCtr6RHERe6+hXiHcRfgAuA3KatKRDLGhePK2F7fxANzPgi7FGllLQ0IC36eAtzj7osS2kSkHRvZt4RDy7py5yuraGyKhV2OtKKWBsRcM5tJPCBmBJeh6l+CiABw0ZFlVG6q4R9vfxx2KdKKWhoQFwNXAIe4+3YgF7gwZVWJSEY5YUQv+nYuZJouec0qLQ2II4Al7r7JzM4H/hvYnLqyRCSTRCPGxLEDmb1yA4s+1FdDtmhpQNwMbA/GbPgRsBy4O2VViUjG+Vr5AApzo7pxLou0NCAa3d2BCcCN7v4noCh1ZYlIpinpkMuZB/dl+rwPWVddF3Y50gpaGhBbzeynxC9vfcrMIsT7IUREdpo0dhD1TTHuff39sEuRVtDSgPgaUEf8fog1xAf3uTZlVYlIRhrSoxNH71vKPa+9R32jLnTMdC0KiCAU/gqUmNlpQK27qw9CRD7jonFlVG2t46kFH4Zdiuyllj5q4xziT1Y9GzgHeN3MzkplYSKSmY4eWsrg0o5Me2kV8a5LyVQtPcV0JfF7ICa6+zeJD+Dzs9SVJSKZKhIxLho3iAWVm5mzauPnbyBpq6UBEQlGd9th/R5sKyLtzJkH9aNzh1xuf3FF2KXIXmjpl/wzZjbDzCaZ2STgKeDp1JUlIpmsMC/KNw4bwLNvf8yqddvCLke+oJZ2Uv8YmAqMDl5T3f0/U1mYiGS2bx5RRk7E+PPLK8MuRb6gFo8o5+4PAw+nsBYRySI9iws4/YA+PFCxmh+eMIySDrp1KtPs9gjCzLaa2ZYkr61mtqWtihSRzHTxkYOoaWji3tm6cS4T7TYg3L3I3YuTvIrcvbitihSRzLR/nxLG7tONu15ZpRvnMpCuRBKRlPrW0YNZs6WWx+dVhl2K7KGUBoSZTTOztWa2sJnlZmbXm9kyM5tvZgclLJtoZkuD18RU1ikiqXPsvqUM713MLS8sJxbTjXOZJNVHEHcC43ez/GRgaPCaTPyx4phZV+Aq4DDiN+VdZWZdUlqpiKSEmfGdY/dhedU2Zi7WiHOZJKUB4e6zgA27WWUCcLfHvQZ0NrPewEnAs+6+wd03As+y+6ARkTR2ysheDOjagZtfWK7Hb2
SQsPsg+gIfJMyvDtqaa/8MM5tsZhVmVlFVVZWyQkXki8uJRph89GDe+mATry5fH3Y50kJhB8Rec/ep7l7u7uWlpaVhlyMizTjr4H5075TPzS8sD7sUaaGwA6IS6J8w3y9oa65dRDJUQW6Ui48cxItL17FgtcatzgRhB8R04JvB1UyHA5vd/SNgBnCimXUJOqdPDNpEJIOdf/gAigpyuEVHERmhxY/a+CLM7D7gWKC7ma0mfmVSLoC730L8gX+nAMuA7cCFwbINZvYLYE7wVte4++46u0UkAxQV5HLB4QO5+YXlLK+qZp/STmGXJLth2XRFQXl5uVdUVIRdhojsRtXWOo7+3XOcMKIn1593YNjltHtmNtfdy5MtC/sUk4i0M6VF+Vx0ZBnT3/qQxR/qkW7pTAEhIm1u8lH7UFyQw3Uzl4RdiuyGAkJE2lxJh1ymHLsP/3xnLRWr1L2YrhQQIhKKSWPL6N4pn9/NWKK7q9OUAkJEQtEhL4dLvzSE2Ss3MGvpurDLkSQUECISmnMPGUC/LoVcO+MdHUWkIQWEiIQmLyfCD47fl4WVW/j7wjVhlyO7UECISKi+fGBfhvboxHUzl9DYpFHn0okCQkRCFY0YPzpxGMurtvHom3rkWjpRQIhI6E7avycH9Cvhf/+xlLrGprDLkYACQkRCZ2b8+KT9qNxUw32vvx92ORJQQIhIWhg3pBtHDO7Gjc8tY1tdY9jlCAoIEUkTZsaPxw9jXXU9f355ZdjlCAoIEUkjBw3owgkjenLLCyuo2loXdjntngJCRNLKT0/ej7rGJn4/Qw/yC5sCQkTSyuDSTkwaW8YDcz9gYaWGJg2TAkJE0s73vzSUrh3yuPqJRXoER4gUECKSdooLcvnxScOYs2ojT87/KOxy2i0FhIikpbPL+7N/n2J+/fTb1NTr5rkwKCBEJC1FI8ZVp+/Ph5truXXW8rDLaZcUECKStg4d1JVTR/fmlheW88GG7WGX0+6kNCDMbLyZLTGzZWZ2RZLlfzSzecHrXTPblLCsKWHZ9FTWKSLp68pThhMx42ePL1SHdRtLWUCYWRT4E3AyMAI4z8xGJK7j7j9w9zHuPga4AXgkYXHNjmXufkaq6hSR9NancyE/OnEYzy+p4qkF6rBuS6k8gjgUWObuK9y9HrgfmLCb9c8D7kthPSKSoSaNLWNU3xKufmIxm2sawi6n3UhlQPQFPkiYXx20fYaZDQQGAf9KaC4wswoze83Mvtzch5jZ5GC9iqqqqtaoW0TSTDRi/Pqro1hfXcdvn3kn7HLajXTppD4XeMjdE69lG+ju5cDXgf81s32SbejuU9293N3LS0tL26JWEQnByL4lXDRuEPe+/j4VqzaEXU67kMqAqAT6J8z3C9qSOZddTi+5e2XwcwXwPHBg65coIpnkByfsS9/OhfzXowuob9TwpKmWyoCYAww1s0Fmlkc8BD5zNZKZ7Qd0AV5NaOtiZvnBdHdgHLA4hbWKSAbomJ/DNRP2592Pq7n5ed0bkWopCwh3bwS+B8wA3gYecPdFZnaNmSVelXQucL9/+vq14UCFmb0FPAf8xt0VECLCl4b35IwD+nDDv5ay+MMtYZeT1SybrisuLy/3ioqKsMsQkRTbuK2eE/44ix5F+Tz+vXHkRtOlOzXzmNncoL/3M7RXRSTjdOmYx6++MpLFH23hD8++G3Y5WUsBISIZ6aT9e3HeoQO4+fnlPPHWh2GXk5UUECKSsa4+Y38OHtiFnz6ygPfWbwu7nKyjgBCRjJWXE+H68w4kGjG+f9+buvS1lSkgRCSj9e1cyG/PHM381Zu5dobusm5NCggRyXjjR/bi/MMHcNuLK3lxqR6501oUECKSFa48ZQRDe3Ti0vveZOnHW8MuJysoIEQkKxTmRbl9Yjk50Qjn3fYab76/MeySMp4CQkSyxsBuHbnvW4dTmBflG7e/riOJvaSAEJGsMqRHJx789lg65EX59j1zWbulNuySMpYCQkSyTq+SAm
4+/2DWbKnlzFteYe57ejz4F6GAEJGsdEhZV+791uHEYnD2La8yc9GasEvKOAoIEclaY/p35pnLj2JU3xIuu38ezyzUmNZ7QgEhIlmtqCCX2ycewtCenZjylzc47vfPa0S6FlJAiEjWKy3K56EpY/nVV0bSFHO+fc9c3l+/Peyy0p4CQkTahbycCN84bCDTJh1CfVOM0254kVnv6q7r3VFAiEi7MqRHJ578/pH06VzIt++Zyxu6oa5ZCggRaXcGduvIPRcfRo/ifC66cw7L1uqGumQUECLSLpUW5XPPRYeRG41wwR2z+XBTTdglpR0FhIi0WwO6deDOCw+huraRr9/2Gh9sUMd1opQGhJmNN7MlZrbMzK5IsnySmVWZ2bzgdUnCsolmtjR4TUxlnSLSfu3fp4S7Lj6Ujdsb+OrNr/D2R1vCLiltpCwgzCwK/Ak4GRgBnGdmI5Ks+jd3HxO8bg+27QpcBRwGHApcZWZdUlWriLRvBw3owoNTjiBqxjm3vsrrK9aHXVJaSOURxKHAMndf4e71wP3AhBZuexLwrLtvcPeNwLPA+BTVKSLCvj2LePjfx9KjKJ8Lps3WozlIbUD0BT5ImF8dtO3qTDObb2YPmVn/PdxWRKTV9O1cyINTxjKidzFT/jKXR95YHXZJoQq7k/oJoMzdRxM/SrhrT9/AzCabWYWZVVRV6aYXEdk7XTvm8ddLDuPwwd340YNv8dfX3wu7pNCkMiAqgf4J8/2Ctp3cfb271wWztwMHt3TbhPeY6u7l7l5eWlraKoWLSPvWMT+HaZMO4bhhPbjy0YXc/uKKsEsKRSoDYg4w1MwGmVkecC4wPXEFM+udMHsG8HYwPQM40cy6BJ3TJwZtIiJtoiA3yi3nH8wpo3rxy6fe5vp/LsXdwy6rTeWk6o3dvdHMvkf8iz0KTHP3RWZ2DVDh7tOBS83sDKAR2ABMCrbdYGa/IB4yANe4ux6/KCJtKi8nwvXnHkhB7nz+8Oy7VG6s4eoJ+1OQGw27tDZh2ZSI5eXlXlFREXYZIpJlYjHnD8++y43PLeOood2ZekE5hXnZERJmNtfdy5MtC7uTWkQk7UUixn+cNIxrzxrNy8vWMenPs9lW1xh2WSmngBARaaGzy/vzx6+NoeK9jZx+40ssWL057JJSSgEhIrIHJozpyz0XHcr2uia+ctPL3DZrRdZ2XisgRET20Ngh3Xnm8qM4fnhPfvX021zx8AIammJhl9XqFBAiIl9A5w553PSNg/jecUP4W8UHTPrzbDbXNIRdVqtSQIiIfEE7Oq9/f/YBzF65gdNveImZi9ZkzSknBYSIyF466+B+/PWSw8nLiTD5nrl8c9ps1lXXff6GaU4BISLSCg4d1JW/X3YUV5+xP3NWbeCMG17isTcraYpl7tGEAkJEpJXkRiNMHFvGg98eS3FhLpf/bR4n/98s5q/eFHZpX4gCQkSklY3qV8LTlx7FjV8/kOraRs66+VXueXVVxvVNKCBERFIgEjFOG92Hpy49inFDuvGzxxdx/h2vs6KqOuzSWkwBISKSQl065nHHxEO4ZsL+LFi9mVOvf4mfT1/EO2vSf+xrBYSISIpFIsY3jyhj5g+O4Zh9S7l/zvuccePLTHtpJfWN6XuDnZ7mKiLSxtZX1/HDB97ihXer6NelkCtO3o/TRvcJpRY9zVVEJI1065TPnRcewl0XHUpJYS7fu/dNJt9dwePzKlm/h/dPNDbF+HBTTUrqTNmAQSIi0jwz45h9Sxm3Tzdu+Ncy7n51FTMXf4wZHNi/M9//t6Hk5UTYXNPAMfuW8t767azdWsvIviVc/cRiauobueCIMu5+ZRXvrNnKzB8cTcf81v1K1ykmEZE00BRzFlRuZta7VfxtzgdUJhwVdO6Qy6bt8ec85UaNhianKD+HrXWNmMEvJozk/MMHfqHP3d0pJh1BiIikgWjEGNO/M2P6d+aSowbx4tJ1lBTmUlPfxA3/Wsqxw3pw0IAu3P7SCg4p68
qF4+WLvEsAAAkUSURBVMp4bcV6unXM54D+nVNSk44gRETaMXVSi4jIHlNAiIhIUikNCDMbb2ZLzGyZmV2RZPkPzWyxmc03s3+a2cCEZU1mNi94TU9lnSIi8lkp66Q2syjwJ+AEYDUwx8ymu/vihNXeBMrdfbuZfQf4HfC1YFmNu49JVX0iIrJ7qTyCOBRY5u4r3L0euB+YkLiCuz/n7tuD2deAfimsR0RE9kAqA6Iv8EHC/OqgrTkXA39PmC8wswoze83MvtzcRmY2OVivoqqqau8qFhGRndLiPggzOx8oB45JaB7o7pVmNhj4l5ktcPflu27r7lOBqRC/zLVNChYRaQdSeQRRCfRPmO8XtH2KmR0PXAmc4e47H0Li7pXBzxXA88CBKaxVRER2kbIb5cwsB3gX+BLxYJgDfN3dFyWscyDwEDDe3ZcmtHcBtrt7nZl1B14FJuzSwZ3sM6uA91r9l0md7sC6sIvIcNqHe0f7b+9kw/4b6O6lyRak7BSTuzea2feAGUAUmObui8zsGqDC3acD1wKdgAfNDOB9dz8DGA7camYx4kc5v/m8cAg+M+kvma7MrKK5OxilZbQP9472397J9v2XVY/ayDTZ/o+rLWgf7h3tv72T7ftPd1KLiEhSCohwTQ27gCygfbh3tP/2TlbvP51iEhGRpHQEISIiSSkgREQkKQWEiIgkpYBIU2YWMbNfmdkNZjYx7HoykZl1DJ7TdVrYtWQiM/uymd1mZn8zsxPDricTBP/m7gr22zfCrmdvKSBSwMymmdlaM1u4S/tux8fYxQTijydpIP6gw3ajlfYfwH8CD6SmyvTWGvvQ3R9z928BU/jkMfztzh7uy68CDwX77Yw2L7aV6SqmFDCzo4Fq4G53Hxm0RYk/emTn+BjAecTvMv/1Lm9xUfDa6O63mtlD7n5WW9UftlbafwcA3YACYJ27P9k21aeH1tiH7r422O464K/u/kYblZ9W9nBfTgD+7u7zzOxed/96SGW3irR4mmu2cfdZZla2S/PO8TEAzOx+4s+X+jXwmVMgZrYaqA9mm1JXbfpppf13LNARGAHUmNnT7h5LZd3ppJX2oQG/If6F1y7DAfZsXxIPi37APLLgDI0Cou0kGx/jsN2s/whwg5kdBcxKZWEZYo/2n7tfCWBmk4gfQbSbcNiNPf03+H3geKDEzIa4+y2pLC7DNLcvrwduNLNTgSfCKKw1KSDSVDDS3sVh15Hp3P3OsGvIVO5+PfEvPGkhd98GXBh2Ha0l4w+BMkiLxseQZmn/7T3tw9bTLvalAqLtzAGGmtkgM8sDzgWmh1xTJtH+23vah62nXexLBUQKmNl9xAc5GmZmq83sYndvBHaMj/E28EDi4EnyCe2/vad92Hra877UZa4iIpKUjiBERCQpBYSIiCSlgBARkaQUECIikpQCQkREklJAiIhIUgoICY2ZVbfBZ5zRwkeDt+ZnHmtmY7/Adgea2R3B9CQzu7H1q9tzZla266Ouk6xTambPtFVN0jYUEJLxgkcvJ+Xu0939Nyn4zN09x+xYYI8DAvgvMvTZR+5eBXxkZuPCrkVajwJC0oKZ/djM5pjZfDO7OqH9MTOba2aLzGxyQnu1mV1nZm8BR5jZKjO72szeMLMFZrZfsN7Ov8TN7E4zu97MXjGzFWZ2VtAeMbObzOwdM3vWzJ7esWyXGp83s/81swrgMjM73cxeN7M3zewfZtYzeCz0FOAHZjbPzI4K/rp+OPj95iT7EjWzImC0u7+VZFmZmf0r2Df/NLMBQfs+ZvZa8Pv+MtkRmcVHOHvKzN4ys4Vm9rWg/ZBgP7xlZrPNrCj4nBeDffhGsqMgM4ua2bUJ/62+nbD4MSDjR1GTBO6ul16hvIDq4OeJwFTAiP/R8iRwdLCsa/CzEFgIdAvmHTgn4b1WAd8Ppv8duD2YngTcGEzfCTwYfMYI4s/zBzgLeDpo7wVsBM5KUu/zwE0J81345GkElwDXBdM/B/4jYb17gS
OD6QHA20ne+zjg4YT5xLqfACYG0xcBjwXTTwLnBdNTduzPXd73TOC2hPkSIA9YARwStBUTf7JzB6AgaBsKVATTZcDCYHoy8N/BdD5QAQwK5vsCC8L+d6VX6730uG9JBycGrzeD+U7Ev6BmAZea2VeC9v5B+3rigyg9vMv7PBL8nEt86MdkHvP42BCLzaxn0HYk8GDQvsbMnttNrX9LmO4H/M3MehP/0l3ZzDbHAyPi4+8AUGxmndw98S/+3kBVM9sfkfD73AP8LqH9y8H0vcDvk2y7ALjOzH4LPOnuL5rZKOAjd58D4O5bIH60QXwsgzHE9+++Sd7vRGB0whFWCfH/JiuBtUCfZn4HyUAKCEkHBvza3W/9VGN8VLjjgSPcfbuZPU98CFGAWnffdaS9uuBnE83/265LmLZm1tmdbQnTNwB/cPfpQa0/b2abCHC4u9fu5n1r+OR3azXu/q6ZHQScAvzSzP4JPNrM6j8APiY+XGsESFavET9Sm5FkWQHx30OyhPogJB3MAC4ys04AZtbXzHoQ/+t0YxAO+wGHp+jzXwbODPoiehLvZG6JEj4ZA2BiQvtWoChhfibx0dkACP5C39XbwJBmPucV4o+Thvg5/heD6deIn0IiYfmnmFkfYLu7/wW4FjgIWAL0NrNDgnWKgk73EuJHFjHgAuJjVe9qBvAdM8sNtt03OPKA+BHHbq92ksyigJDQuftM4qdIXjWzBcBDxL9gnwFyzOxt4mMjv5aiEh4mPmTkYuAvwBvA5hZs93PgQTObC6xLaH8C+MqOTmrgUqA86NRdTLy/4FPc/R3iQ3sW7bqMeLhcaGbziX9xXxa0Xw78MGgf0kzNo4DZZjYPuAr4pbvXA18jPqTtW8CzxP/6vwmYGLTtx6ePlna4nfh+eiO49PVWPjlaOw54Ksk2kqH0uG8RYEefgJl1A2YD49x9TRvX8ANgq7vf3sL1OwA17u5mdi7xDusJKS1y9/XMAia4+8awapDWpT4Ikbgnzawz8c7mX7R1OARuBs7eg/UPJt6pbMAm4lc4hcLMSon3xygcsoiOIEREJCn1QYiISFIKCBERSUoBISIiSSkgREQkKQWEiIgkpYAQEZGk/j9GIM5xUhF9XQAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "learner.lr_plot()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### STEP 4: Train the Model\n", "We will train the model using `autofit`, which uses a triangular learning rate policy. The training will automatically stop when the validation loss no longer improves. We save the weights of the model during training in case we would like to reload the weights from any epoch." ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "early_stopping automatically enabled at patience=5\n", "reduce_on_plateau automatically enabled at patience=2\n", "\n", "\n", "begin training using triangular learning rate policy with max lr of 0.01...\n", "Epoch 1/1024\n", "4/4 [==============================] - 7s 2s/step - loss: 1.9479 - acc: 0.2029 - val_loss: 1.7514 - val_acc: 0.3060\n", "Epoch 2/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 1.6925 - acc: 0.4066 - val_loss: 1.6553 - val_acc: 0.3492\n", "Epoch 3/1024\n", "4/4 [==============================] - 6s 1s/step - loss: 1.5708 - acc: 0.5345 - val_loss: 1.5262 - val_acc: 0.4898\n", "Epoch 4/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 1.4280 - acc: 0.6994 - val_loss: 1.4030 - val_acc: 0.7074\n", "Epoch 5/1024\n", "4/4 [==============================] - 6s 1s/step - loss: 1.2972 - acc: 0.8828 - val_loss: 1.2960 - val_acc: 0.7765\n", "Epoch 6/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 1.1721 - acc: 0.9143 - val_loss: 1.2132 - val_acc: 0.7879\n", "Epoch 7/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 1.0570 - acc: 0.9572 - val_loss: 1.1320 - val_acc: 0.8003\n", "Epoch 8/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.9660 - acc: 0.9531 - val_loss: 1.0657 - val_acc: 0.8008\n", "Epoch 9/1024\n", "4/4 
[==============================] - 5s 1s/step - loss: 0.8845 - acc: 0.9685 - val_loss: 1.0068 - val_acc: 0.8053\n", "Epoch 10/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.8171 - acc: 0.9692 - val_loss: 0.9503 - val_acc: 0.8132\n", "Epoch 11/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.7351 - acc: 0.9612 - val_loss: 0.9076 - val_acc: 0.8127\n", "Epoch 12/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.6809 - acc: 0.9766 - val_loss: 0.8652 - val_acc: 0.8182\n", "Epoch 13/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.6138 - acc: 0.9886 - val_loss: 0.8332 - val_acc: 0.8102\n", "Epoch 14/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.5587 - acc: 0.9846 - val_loss: 0.8024 - val_acc: 0.8207\n", "Epoch 15/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.5186 - acc: 0.9886 - val_loss: 0.7824 - val_acc: 0.8236\n", "Epoch 16/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.4747 - acc: 0.9886 - val_loss: 0.7619 - val_acc: 0.8192\n", "Epoch 17/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.4317 - acc: 0.9927 - val_loss: 0.7434 - val_acc: 0.8152\n", "Epoch 18/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.4035 - acc: 0.9960 - val_loss: 0.7226 - val_acc: 0.8187\n", "Epoch 19/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.3773 - acc: 0.9960 - val_loss: 0.7148 - val_acc: 0.8187\n", "Epoch 20/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.3537 - acc: 1.0000 - val_loss: 0.7064 - val_acc: 0.8187\n", "Epoch 21/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.3154 - acc: 1.0000 - val_loss: 0.6969 - val_acc: 0.8162\n", "Epoch 22/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.2936 - acc: 1.0000 - val_loss: 0.6849 - val_acc: 0.8147\n", "Epoch 23/1024\n", "4/4 [==============================] - 5s 
1s/step - loss: 0.2744 - acc: 1.0000 - val_loss: 0.6781 - val_acc: 0.8197\n", "Epoch 24/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.2564 - acc: 1.0000 - val_loss: 0.6704 - val_acc: 0.8207\n", "Epoch 25/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.2442 - acc: 1.0000 - val_loss: 0.6647 - val_acc: 0.8187\n", "Epoch 26/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.2218 - acc: 1.0000 - val_loss: 0.6667 - val_acc: 0.8177\n", "Epoch 27/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.2138 - acc: 1.0000 - val_loss: 0.6546 - val_acc: 0.8212\n", "Epoch 28/1024\n", "4/4 [==============================] - 6s 1s/step - loss: 0.2000 - acc: 1.0000 - val_loss: 0.6521 - val_acc: 0.8236\n", "Epoch 29/1024\n", "4/4 [==============================] - 6s 1s/step - loss: 0.1819 - acc: 1.0000 - val_loss: 0.6467 - val_acc: 0.8167\n", "Epoch 30/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1757 - acc: 0.9960 - val_loss: 0.6344 - val_acc: 0.8202\n", "Epoch 31/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1791 - acc: 0.9920 - val_loss: 0.6349 - val_acc: 0.8187\n", "Epoch 32/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1593 - acc: 1.0000 - val_loss: 0.6287 - val_acc: 0.8172\n", "Epoch 33/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1563 - acc: 0.9960 - val_loss: 0.6207 - val_acc: 0.8266\n", "Epoch 34/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1498 - acc: 0.9960 - val_loss: 0.6221 - val_acc: 0.8222\n", "Epoch 35/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1391 - acc: 0.9960 - val_loss: 0.6202 - val_acc: 0.8266\n", "Epoch 36/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1330 - acc: 1.0000 - val_loss: 0.6186 - val_acc: 0.8296\n", "Epoch 37/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1202 - acc: 1.0000 - 
val_loss: 0.6260 - val_acc: 0.8227\n", "Epoch 38/1024\n", "4/4 [==============================] - 6s 1s/step - loss: 0.1172 - acc: 1.0000 - val_loss: 0.6193 - val_acc: 0.8251\n", "\n", "Epoch 00038: Reducing Max LR on Plateau: new max lr will be 0.005 (if not early_stopping).\n", "Epoch 39/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1163 - acc: 0.9960 - val_loss: 0.6133 - val_acc: 0.8266\n", "Epoch 40/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1114 - acc: 1.0000 - val_loss: 0.6221 - val_acc: 0.8236\n", "Epoch 41/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1093 - acc: 1.0000 - val_loss: 0.6210 - val_acc: 0.8222\n", "\n", "Epoch 00041: Reducing Max LR on Plateau: new max lr will be 0.0025 (if not early_stopping).\n", "Epoch 42/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1098 - acc: 1.0000 - val_loss: 0.6320 - val_acc: 0.8207\n", "Epoch 43/1024\n", "4/4 [==============================] - 6s 1s/step - loss: 0.1059 - acc: 1.0000 - val_loss: 0.6224 - val_acc: 0.8222\n", "\n", "Epoch 00043: Reducing Max LR on Plateau: new max lr will be 0.00125 (if not early_stopping).\n", "Epoch 44/1024\n", "4/4 [==============================] - 5s 1s/step - loss: 0.1035 - acc: 1.0000 - val_loss: 0.6291 - val_acc: 0.8197\n", "Restoring model weights from the end of the best epoch\n", "Epoch 00044: early stopping\n", "Weights from best epoch have been loaded into model.\n" ] }, { "data": { "text/plain": [ "" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "learner.autofit(0.01, checkpoint_folder='/tmp/saved_weights')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Evaluate\n", "\n", "#### Validate" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " precision recall f1-score support\n", "\n", " Case_Based 0.73 0.81 0.77 227\n", " 
Genetic_Algorithms 0.90 0.96 0.93 331\n", " Neural_Networks 0.83 0.86 0.84 592\n", " Probabilistic_Methods 0.87 0.83 0.85 314\n", "Reinforcement_Learning 0.80 0.75 0.77 170\n", " Rule_Learning 0.86 0.60 0.71 106\n", " Theory 0.73 0.70 0.72 273\n", "\n", " accuracy 0.82 2013\n", " macro avg 0.82 0.79 0.80 2013\n", " weighted avg 0.82 0.82 0.82 2013\n", "\n" ] }, { "data": { "text/plain": [ "array([[183, 5, 11, 4, 6, 2, 16],\n", " [ 1, 318, 10, 0, 1, 0, 1],\n", " [ 18, 5, 507, 25, 12, 1, 24],\n", " [ 5, 0, 31, 262, 5, 0, 11],\n", " [ 3, 16, 19, 1, 128, 0, 3],\n", " [ 21, 0, 4, 2, 0, 64, 15],\n", " [ 20, 9, 29, 8, 9, 7, 191]])" ] }, "execution_count": 8, "metadata": {}, "output_type": "execute_result" } ], "source": [ "learner.validate(class_names=preproc.get_classes())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Create a Predictor Object" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "p = ktrain.get_predictor(learner.model, preproc)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Transductive Inference: Making Predictions for Validation and Test Nodes in the Original Training Graph\n", "In transductive inference, we make predictions for unlabeled nodes whose features (but not labels) were visible to the model during training. Since the validation nodes remained in the training graph, predicting their labels is an example of transductive inference.\n", "\n", "Let's see how well our model predicts the label of the first validation example."
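, "\n", "\n", "Under the hood, the predictor produces a probability distribution over the seven paper topics, and the predicted label is simply the class with the highest probability. A minimal sketch of this mapping (the probability values below are illustrative, not actual model output):\n", "\n", "```python\n", "import numpy as np\n", "\n", "# Class names in the order used by the Cora classification report\n", "classes = ['Case_Based', 'Genetic_Algorithms', 'Neural_Networks',\n", "           'Probabilistic_Methods', 'Reinforcement_Learning',\n", "           'Rule_Learning', 'Theory']\n", "\n", "# Illustrative probability vector (not actual model output)\n", "proba = np.array([0.007, 0.008, 0.950, 0.010, 0.006, 0.008, 0.011])\n", "predicted_label = classes[int(np.argmax(proba))]\n", "print(predicted_label)  # Neural_Networks\n", "```"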
] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([[0.00738885, 0.00764509, 0.94959724, 0.00979447, 0.00634191,\n", " 0.00760743, 0.01162501]], dtype=float32)" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "p.predict_transductive(val_data.ids[0:1], return_proba=True)" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([0., 0., 1., 0., 0., 0., 0.])" ] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "val_data[0][1][0]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's make predictions for all **test** nodes in the holdout set, measure test accuracy, and visually compare some of them with ground truth." ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [], "source": [ "y_pred = p.predict_transductive(df_holdout.index, return_proba=False)" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [], "source": [ "y_true = df_holdout.target.values" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
" ], "text/plain": [ " Ground Truth Predicted\n", "0 Theory Theory\n", "1 Genetic_Algorithms Theory\n", "2 Neural_Networks Neural_Networks\n", "3 Neural_Networks Neural_Networks\n", "4 Reinforcement_Learning Reinforcement_Learning" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import pandas as pd\n", "pd.DataFrame(list(zip(y_true, y_pred)), columns=['Ground Truth', 'Predicted']).head()" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "0.8232931726907631" ] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import numpy as np\n", "# fraction of holdout nodes predicted correctly\n", "(y_true == np.array(y_pred)).mean()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Our final test accuracy for transductive inference on the holdout nodes is **82.33%**." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.8" } }, "nbformat": 4, "nbformat_minor": 2 }